Extent of MISRA C 2012 Directive 4.1: Runtime checks before pointer dereferencing in a library

MISRA C 2012 Directive 4.1 states that run-time failures shall be minimized and, regarding pointer dereferencing, that a pointer should be checked for NULL before it is dereferenced, unless it is already known to be non-NULL.

When writing a library performing simple operations, checking the input pointers for NULL in every library function produces larger and therefore less understandable code, even for simple operations.

For example, the library below computes the squared Euclidean norm of a vector and is used by a controller.

/*** Option 1: Checking pointers ***/

Library:

/* file vector.c */

bool_t bVectorNormSq(float32_t pf32Vec[], uint8_t u8Len, float32_t *pf32NormSq)
{
    uint8_t u8n;
    bool bRet = false;

    /* Check pointers */
    if( (pf32Vec != NULL) && (pf32NormSq != NULL) )
    {
        *pf32NormSq = 0.0f;

        for(u8n = 0U; u8n < u8Len; u8n++)
        {
            *pf32NormSq += (pf32Vec[u8n] * pf32Vec[u8n]);
        }
        bRet = true;
    }
    else
    {
        /* Do not alter pf32NormSq (unknown if valid pointer) */
        bRet = false;
    }

    return bRet;
}
/* EOF */

Consumer of library:

/* file controller.c */

/* ... */

bool_t bControllerStep(void)
{
    float32_t pf32MyVec[3] = { 0 };
    float32_t f32MyNorm = 0.0f;

    /* ... */

    /* MISRA C 2012 Rule 17.7: call will always be successful, thus return value not checked */
    (void)bVectorNormSq(pf32MyVec, 3U, &f32MyNorm);

    /* ... */

}

/* EOF */

/*** Option 2: Not checking pointer, responsibility to supply valid inputs placed on caller ***/

Library:

/* file vector.c */

/**
 * @note This library assumes that valid pointers will be supplied;
 *       pointers are NOT checked before they are used.
 */

/* Assert macro expands to "(void)(CONDITION)" when NDEBUG is defined */
#ifdef NDEBUG
  #define VECTOR_ASSERT( CONDITION ) (void)(CONDITION)
#else
  /* ... */
#endif /* NDEBUG */

float32_t f32VectorNormSq(float32_t pf32Vec[], uint8_t u8Len)
{
    float32_t f32NormSq = 0.0f;
    uint8_t u8n = 0U;

    VECTOR_ASSERT(pf32Vec != NULL);

    for(u8n = 0U; u8n < u8Len; u8n++)
    {
        f32NormSq += (pf32Vec[u8n] * pf32Vec[u8n]);
    }

    return f32NormSq;
}

/* EOF */
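The debug branch of the VECTOR_ASSERT macro was elided above. For reference, a minimal sketch of both branches could look like the following; the abort() reaction is an assumption for illustration, a real project would more likely route a failed assertion into a project-specific fault handler:

```c
#include <stdlib.h>

/* Sketch of a complete VECTOR_ASSERT (abort() is an assumed placeholder
 * for a project-specific fault handler):
 * - release (NDEBUG defined): the condition is evaluated and discarded,
 *   so the macro has no observable effect;
 * - debug: a failed condition stops execution immediately, catching an
 *   invalid pointer during development. */
#ifdef NDEBUG
  #define VECTOR_ASSERT( CONDITION )  (void)(CONDITION)
#else
  #define VECTOR_ASSERT( CONDITION )  \
      do { if (!(CONDITION)) { abort(); } } while (0)
#endif /* NDEBUG */
```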

Consumer of library:

/* file controller.c */

/* ... */

bool_t bControllerStep(void)
{
    float32_t pf32MyVec[3] = { 0 };
    float32_t f32MyNorm = 0.0f;

    /* ... */

    f32MyNorm = f32VectorNormSq(pf32MyVec, 3U);

    /* ... */

}

/* EOF */

For option 1, the caller bControllerStep() can reason that the library function bVectorNormSq() will always execute successfully (return true), because the array pf32MyVec is defined in the caller and thus cannot be NULL. The caller therefore chooses to ignore the return value.

The above reasoning is likely to apply to many uses of the library, so callers will frequently ignore return values in option 1. This could make a programmer complacent about ignoring return values even when they shouldn't be ignored, e.g. because a division by zero was detected by the library.

In contrast, option 2 assumes the caller supplies valid pointers, documents this in the library (/* @note ... */), and only uses an assert on NULL pointers to assist during development, later to be disabled for deployment. In this option, the presence of a Boolean return value highlights to the programmer that the library operation might fail for reasons other than invalid pointers or other trivially incorrect inputs, e.g. due to division by zero, and that care should be taken before using the computed values, avoiding the risk of complacency.

Therefore my question(s):

  • In situations like the example shown, is it compliant with MISRA C for a library to omit checking pointers or other trivial inputs?

  • Are there any additional conditions that have to be fulfilled besides documenting the omitted input checking in the source code, e.g. is a formal deviation needed?

Application background:

This is just an aerospace student with a focus on control systems, trying to understand how the actual coding of control systems is done properly, by writing his own control algorithms as if they were running on the real thing, i.e. by following standards like MISRA C.

Code written by me will NOT be going on a live system anytime soon, but for the purpose of your answers please consider this code as running on the actual system, where a failure of the function is classified as catastrophic, i.e. everybody dies.

I'm aware of the whole software and hardware engineering process (SAE ARP4754, SAE ARP4761, DO-178C, ...) around the actual implementation. This question is really just about the instructions executing on the hardware, not about the need for redundant & dissimilar hardware or requirements, reviews, testing, ...

EDIT: The library in question is low in the code stack, so sanitizing code for inputs from the outside (sensors, ...) can be expected to be present. I'm trying to avoid falling prey to cargo cult programming by blindly following the rule "Always check all inputs".

CodePudding user response:

Regarding Option 1: how would you write the documentation for that function? Does this make sense:

bVectorNormSq
Description of what this function does here.

pf32Vec    A pointer to an allocated array of [u8Len] that will-...
u8Len      The number of elements in the array [pf32Vec].
...

Returns:   true if successful, false in case pf32Vec was a null pointer.

It doesn't make sense. You already documented that pf32Vec must be a pointer to an allocated array, so why would the user not read that part, but read the part about the return status? How are we even supposed to use this function?

bool result = bVectorNormSq(some_fishy_pointer, ...); // checks for null internally

if(result == false)
{
  /* error: you passed a null pointer */
}

Why can't you instead write that very same code like this?

if(some_fishy_pointer == NULL)
{
 /* handle the error instead of calling the function in the first place */
}
else
{
  bVectorNormSq(some_fishy_pointer, ...); // does not check for null internally
}

It's the same amount of error checking and the same number of branches. Literally the only difference is the slower execution speed of the first version.


Also, your extra error checks add extra complexity which could in turn lead to extra bugs.

Potential bugs:

if( (pf32Vec = NULL) ||(pf32NormSq = NULL) )

or

if( (pf32Vec =! NULL) && (pf32NormSq != NULL) )

or (not just a potential bug):

bool_t ... bool bRet = false; ... return bRet;


Extra error checking is a good thing, but place it where it matters: sanitize data at the point where it gets assigned. Example:

const char* ptr = strstr(x,y);
if(ptr == NULL) // Correct! Check close to the location where the pointer is set
{}
...
some_function(ptr);
...
void some_function (const char* str)
{
  if(str == NULL) // Not correct. This function has nothing to do with the strstr call.
  {
  }
}
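The fragment above can be made self-contained. In this sketch the strings, the report_substring() wrapper, and print_match() are all invented for illustration; the point is that the strstr() result is checked once, where it is produced:

```c
#include <stdbool.h>
#include <stdio.h>
#include <string.h>

/* The callee trusts its argument: validity was established at the
 * call site, so no redundant NULL check here. */
static void print_match(const char* str)
{
    printf("found: %s\n", str);
}

/* Sanitize at the point of assignment: the strstr() result is checked
 * close to where the pointer is set, and only then passed on. */
static bool report_substring(const char* haystack, const char* needle)
{
    const char* ptr = strstr(haystack, needle);

    if (ptr == NULL) /* check where the pointer gets assigned */
    {
        return false;
    }

    print_match(ptr);
    return true;
}
```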