Silent Elimination of Bounds Checks
The C language supports pointer arithmetic where one operand is a pointer to an object and the other operand is an integer. When an expression that has an integer type is added to a pointer, the result has the type of the pointer operand.
Pointer arithmetic in C is primarily meant to support the manipulation of a pointer within an array of elements, as in the following example:
int a[2] = {1, 2};

int main(void) {
  int j = 0;
  for (int *pi = &a[0]; pi < &a[2]; ++pi)
    j += *pi;
  return j;
}
This same program can be rewritten in an equivalent form using array notation:
int a[2] = {1, 2};

int main(void) {
  int j = 0;
  for (int i = 0; i < 2; ++i)
    j += a[i];
  return j;
}
Array indexing is equivalent to pointer arithmetic, but it is pointer arithmetic performed by the compiler, not the coder. An expression such as a[i], for example, is translated to *(a+i).
During pointer arithmetic, provided the pointer operand and the result point to elements of the same array object, or to one past the last element of that array object, the evaluation is well defined; otherwise, the behavior is undefined.
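For example, the following sketch (my own illustration, not part of the programs above) shows the one-past-the-end rule:

#include <stdio.h>

int main(void) {
  int a[2] = {1, 2};
  int *end = &a[0] + 2;   // one past the last element: well defined to form and compare

  for (int *p = &a[0]; p != end; ++p)   // p never moves past 'end'
    printf("%d\n", *p);

  // int *bad = &a[0] + 3;   // undefined behavior: more than one past the end
  return 0;
}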
Undefined Behaviors
C programmers are not always entirely familiar with the implications of undefined behavior. The C Standard [ISO/IEC 2011] defines undefined behavior as behavior, upon use of a nonportable or erroneous program construct or of erroneous data, for which the standard imposes no requirements. Conforming implementations can deal with undefined behavior in a variety of fashions, such as:
- Ignoring the situation completely, with unpredictable results
- Translating or executing the program in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message)
- Terminating a translation or execution (with the issuance of a diagnostic message)
Undefined behaviors are extremely dangerous, both because they need not be diagnosed and because they can result in any behavior. Most of the security vulnerabilities described in my book, Secure Coding in C and C++, Second Edition, are the result of exploiting undefined behavior in code.
Another problem with undefined behavior is compiler optimizations. Because compilers are not obligated to generate code for undefined behavior, these behaviors are candidates for optimization. By assuming that undefined behavior will not occur, compilers can generate code with better performance characteristics.
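As an illustration of my own (not taken from the standard), consider how a compiler that assumes signed overflow cannot occur might treat the following comparison:

// If signed overflow is assumed never to occur, a compiler may
// fold this comparison to the constant 1 and drop the addition.
int always_greater(int x) {
  return x + 1 > x;
}

// Under that assumption, the function above may compile as if it were:
int always_greater_optimized(int x) {
  (void)x;
  return 1;
}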
Compiler Optimizations
The basic design of an optimizer for a C compiler is largely the same as that of an optimizer for any other procedural programming language. The fundamental principle of optimization is to replace a computation with a more efficient method that computes the same result.
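One classic example of such a behavior-preserving optimization is strength reduction; the sketch below (my own illustration) replaces a multiplication by a power of two with a shift that computes the same result for every input:

// Original computation
unsigned scale(unsigned x) {
  return x * 8;
}

// A behavior-preserving optimization: the same result for every input
unsigned scale_optimized(unsigned x) {
  return x << 3;
}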
However, some optimizations change behavior. In some cases, these optimizations are beneficial and eliminate undefined behaviors and vulnerabilities; in other cases, these optimizations may inadvertently introduce vulnerabilities.
Compiler developers may choose among several behavior models when translating code. However, compiler writers are not obligated to follow any particular model and are often opportunistic about which model to apply in any given situation.
The first of these is the hardware behavior model. Compilers that follow this model generate code as if the undefined behavior were defined to be whatever action the hardware instructions would take if the undefined behavior occurred. For example, if the hardware instruction wrapped around on signed integer overflow, the compiler would simply treat the wraparound as the defined behavior for integer overflow. For many years, this policy was nearly universal, so several generations of C and C++ programmers have assumed that all compilers behave this way.
Another model is super debug. Compiler vendors that follow this model provide an intensive debugging environment to trap (nearly) every undefined behavior. This policy severely degrades the application’s performance, so it is seldom used for building applications.
Finally, the total license model allows compiler writers to treat any possible undefined behavior as a “can’t-happen” condition, permitting aggressive optimizations.
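One way to picture the three models is to ask how each treats a programmer-written overflow check. The sketch below is my own illustration, and the mapping to GCC options is approximate:

// A programmer-written check that relies on wraparound:
int will_overflow(int x) {
  return x + 1 < x;   // signed overflow here is undefined behavior
}

// Hardware behavior model: the check works as intended on two's-complement
//   hardware (GCC approximates this model with -fwrapv).
// Super debug model: the overflow is trapped at run time (GCC approximates
//   this with -ftrapv or -fsanitize=signed-integer-overflow).
// Total license model: the compiler may assume the overflow cannot happen
//   and fold the function to return 0.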
Vulnerabilities Resulting from Undefined Behaviors
Increasingly, compiler writers are taking advantage of undefined behavior in the C programming language to improve optimizations. These optimizations frequently interfere with the ability of developers to perform cause-effect analysis on their source code, that is, to analyze the dependence of downstream results on prior results. Consequently, these optimizations eliminate causality in software and increase the probability of software faults, defects, and vulnerabilities. One such occurrence afflicted Plan 9's sprint() function and is further documented in CERT Vulnerability Note VU#162289, "C compilers may silently discard some wraparound checks."
In general, a programmer might code a bounds check such as
char *ptr;    // ptr to start of array
char *max;    // ptr to end of array
size_t len;

if (ptr + len > max)
  return EINVAL;
No matter what behavior model is used, there is a bug. If len is very large, it can cause ptr + len to overflow, which creates undefined behavior. Under the hardware behavior model, the result would typically wrap around—pointing to an address that is actually lower in memory than ptr.
In attempting to fix the bug, the experienced programmer (who has internalized the hardware behavior model of undefined behavior) might write a check like this:
if (ptr + len < ptr || ptr + len > max)
  return EINVAL;
However, compilers that follow the total license model may optimize out the first part of the check, leaving the whole bounds check defeated. This optimization is allowed because ptr plus (an unsigned) len can compare less than ptr only when an undefined behavior occurred during calculation of ptr + len. The compiler legitimately assumes that undefined behavior never happens; consequently, ptr + len < ptr is dead code and can be removed.
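In effect, a compiler following the total license model may compile the check as if the programmer had written only the second comparison:

// After the dead first comparison is removed, the check is effectively:
if (ptr + len > max)
  return EINVAL;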
Optimizations may be performed for comparisons between P + V1 and P + V2, where P is the same pointer and V1 and V2 are variables of some integer type. The total license model permits this expression to be reduced to a comparison between V1 and V2.
In our example,
if (ptr + len < ptr || ptr + len > max)
  return EINVAL;
this optimization translates as follows:
ptr + len < ptr
ptr + len < ptr + 0
len < 0          (impossible; len is unsigned)
However, if V1 or V2 are such that the sum with P overflows, then the comparison of V1 and V2 will not yield the same result as actually computing P + V1 and P + V2 and comparing the sums. Because of possible overflows, computer arithmetic does not always obey the algebraic identities of mathematics.
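The following sketch (my own illustration) uses unsigned arithmetic, where wraparound is well defined, to stand in for the pointer case and show that comparing the sums and comparing the addends can disagree:

#include <stdio.h>

int main(void) {
  unsigned int p  = 0xFFFFFFF0u;   // stands in for the pointer value
  unsigned int v1 = 0x20u;
  unsigned int v2 = 0x0u;

  // Comparing the sums: p + v1 wraps around to 0x10, which is less than p + v2.
  printf("%d\n", p + v1 < p + v2);   // prints 1 with a 32-bit unsigned int

  // Comparing the addends: v1 is not less than v2.
  printf("%d\n", v1 < v2);           // prints 0
  return 0;
}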
This problem is easy to remediate once it is called to the attention of the programmer. For example, if it is known that ptr is less than or equal to max, then the programmer could write
if (len > max - ptr)
  return EINVAL;
This conditional expression eliminates the possibility of undefined behavior.
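Put together, a corrected check might look like the following sketch; the helper name and the precondition that ptr <= max are my own assumptions:

#include <errno.h>
#include <stddef.h>

// Hypothetical helper: assumes ptr and max point into the same array
// and that ptr <= max, as stated above.
int check_bounds(const char *ptr, const char *max, size_t len) {
  if (len > (size_t)(max - ptr))
    return EINVAL;
  return 0;
}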
In the following example, the expression buf + n may wrap for large values of n, resulting in undefined behavior:
int f(char *buf, size_t n) {
  return buf + n < buf + 100;
}
When compiled using GCC 4.3.0 with the -O2 option, for example, the expression
buf + n < buf + 100
is optimized to n < 100, eliminating the possibility of wrapping. The optimization changes the observable behavior only when one of the sums would wrap but the other would not. The code example is still incorrect, however, because it is not safe to rely on compiler optimizations for security.
The undefined behavior can be eliminated by performing the optimization by hand:
int f(char *buf, size_t n) {
  return n < 100;
}
Recommendations
The pointer overflow behavior that caused the vulnerability in Plan 9's sprint() function changed as of GCC 4.2.4, GCC 4.3.1, and GCC 4.4.0; that is, the new behavior applies to versions 4.2.x where x >= 4, versions 4.3.y where y >= 1, and versions 4.z where z >= 4.
The optimization is performed by default at -O2 and above, including -Os, but it is not performed by default at -O1 or -O0. The optimization may be enabled for -O1 with the -fstrict-overflow option or disabled for -O2 and above with the -fno-strict-overflow option. Cases where the optimization occurs may be detected by using -Wstrict-overflow=N where N >= 3.
C and C++ programmers must be very careful to avoid undefined behaviors in their code. Compilers that implement the total license model are becoming increasingly common; such compilers have the latitude to change how they compile undefined behaviors at any time, and they frequently do so to further optimize code. Consequently, the results produced by code that depends on undefined behaviors can change arbitrarily without notice.
Programmers frequently overreact by disabling optimizations in their code. Optimizations are just as likely to eliminate vulnerabilities from code as they are to introduce them. Diagnostics such as -Wstrict-overflow can be very useful to identify cases where the compiler optimizes based on the assumption that signed overflow does not occur. Frequently these optimizations are harmless; sometimes they can be surprising.