Coding Conventions
I mentioned coding conventions only briefly, but their importance can't be stressed enough. The idea behind good coding conventions is to make correct code and incorrect code look different at a glance. A common example is writing (NULL == variable) instead of (variable == NULL) in comparisons: the accidental assignment (variable = NULL) looks almost identical to the conventional comparison but clearly different from (NULL == variable), and the corresponding typo (NULL = variable) won't even compile.
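For instance, here is a minimal C sketch of the idea (the function and variable names are illustrative, not from the original):

    #include <stddef.h>

    void process(int *ptr)
    {
        /* Typo: one '=' instead of '=='. This compiles in C and    */
        /* silently assigns NULL to ptr instead of comparing:       */
        /*     if (ptr = NULL) { ... }                               */

        /* The same typo with the constant on the left is caught    */
        /* by the compiler, because NULL is not assignable:         */
        /*     if (NULL = ptr) { ... }   -- compile error            */

        if (NULL == ptr) {      /* the convention: constant first */
            return;
        }
        *ptr = 42;
    }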
Indentation is another issue. Some coding conventions require braces to go on their own lines, which makes it easy to distinguish the start of a block from a wrapped continuation line. Conventions that require every if statement to be followed by a braced block exist to avoid the dangling-else problem, as the sketch below shows.
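A minimal C sketch of the dangling-else problem (the functions and messages here are illustrative assumptions):

    #include <stdio.h>

    /* Without mandatory braces, an else binds to the nearest       */
    /* unmatched if, not to the one the indentation suggests:       */
    void dangling(int a, int b)
    {
        if (a)
            if (b)
                printf("a and b\n");
        else                      /* indented as if paired with 'if (a)'... */
            printf("not a\n");    /* ...but it actually binds to 'if (b)'   */
    }

    /* With the braces-required convention, the intent is unambiguous: */
    void braced(int a, int b)
    {
        if (a) {
            if (b) {
                printf("a and b\n");
            }
        } else {
            printf("not a\n");
        }
    }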
One of the most misunderstood conventions is Hungarian notation. Joel Spolsky, in "Making Wrong Code Look Wrong," explained how this notation should be used: to encode information that your type system can't express, not to redundantly duplicate information that it can. Using different prefixes for column and row indexes, for example, means that you can easily tell when you have accidentally switched your coordinate orientation. Having different prefixes for feet and meters means that you can easily see which line will cause you to crash into Mars.
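A minimal C sketch of both examples, with hypothetical prefix schemes (row/col for orientation, ft/m for units; none of these identifiers come from the original):

    /* The compiler sees only 'int' and 'double'; the prefixes carry */
    /* the information the type system can't express.                */

    #define N_ROWS 480
    #define N_COLS 640

    int pixel_at(const int grid[N_ROWS][N_COLS], int rowY, int colX)
    {
        return grid[rowY][colX];     /* prefixes match: reads right */
        /* return grid[colX][rowY];     swapped: visibly wrong      */
    }

    /* Unit prefixes: 'ft' for feet, 'm' for meters. */
    double mFromFt(double ftValue)
    {
        return ftValue * 0.3048;     /* 1 foot = 0.3048 m exactly */
    }

    double total_altitude(double mSensor, double ftBackup)
    {
        /* Writing mSensor + ftBackup directly would look wrong at  */
        /* a glance; converting first keeps the prefixes consistent: */
        return mSensor + mFromFt(ftBackup);
    }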
Good coding conventions don't eliminate bugs, but they do make certain categories of bugs more obvious, which leaves you more time to spend looking for the really hard ones.