Living the "Least Privilege" Lifestyle, Part 4: Is Developing Secure Software as an Administrator an Impossible Dream?
So far in this series about living the least privilege lifestyle, I've covered least privilege from a user's perspective:
- Why running as a member of the Administrators group has facilitated many recent worms, viruses, etc.
- Why running as a member of the Users group is safer
- How to survive daily computer use as a mere user (including lots of tools and tricks)
If you've been following along, you should now have the confidence that you can survive running as a mere user rather than as a member of the Administrators group, and you're enjoying a substantially more secure computing environment. With this article, I'll change direction and focus on being a least privilege developer who writes software for other people. Here the situation gets a bit more complex, because you have to consider both your own security and the security of your application's users.
In part 1 of this series, I made the following assertion, which we'll call the Law of Admin Insecurity:
The fact that most people run with administrative rights on their local machine is the root cause of many or most of today's security problems.
It's time for the second assertion of this series, the Law of Least Privilege Development:
You cannot develop secure applications with admin rights on your development machine.
Gasp! This rule flies in the face of conventional wisdom about developing software. Don't you need admin rights to debug, to interact with servers such as IIS, to fiddle with protected portions of the registry and file system? No, you don't. Let's face it: What are you doing most of the time when you're writing software? Editing text files! And unless you have a strange preference for storing your code files in the System32 directory or some other protected location, you don't even need special permissions to write to the disk. Attaching a debugger to a process you own doesn't require any special permissions. And if you do development correctly, you'll rarely need to access protected resources.
Of course, you probably need a few permissions that mere users don't typically need, but those permissions fall far short of full admin privileges. If you're writing server-based software, such as ASP.NET web applications, you probably need permission to debug processes you don't own. And some applications have a legitimate need to read or write to protected resources such as HKEY_LOCAL_MACHINE in the registry. But these situations are rare or are easily handled by granting a very narrow, specific permission.
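For example, mere users can read most of HKEY_LOCAL_MACHINE; trouble starts only when code asks for more access than it needs. Here's a minimal Win32 sketch that requests only KEY_READ and so works fine under a least-privileged account; asking for KEY_ALL_ACCESS on the same key would be denied for a mere user.

```cpp
#include <windows.h>
#include <tchar.h>
#include <stdio.h>

int main()
{
    HKEY hKey;
    // Request only the access we need. KEY_READ succeeds for a mere
    // user on most of HKLM; KEY_ALL_ACCESS would be denied.
    LONG result = RegOpenKeyEx(HKEY_LOCAL_MACHINE,
        TEXT("SOFTWARE\\Microsoft\\Windows\\CurrentVersion"),
        0, KEY_READ, &hKey);
    if (result == ERROR_SUCCESS)
    {
        TCHAR value[MAX_PATH];
        DWORD size = sizeof(value);
        // Read a well-known value; type checking omitted for brevity.
        if (RegQueryValueEx(hKey, TEXT("ProgramFilesDir"), NULL, NULL,
                            (LPBYTE)value, &size) == ERROR_SUCCESS)
        {
            _tprintf(TEXT("Program Files: %s\n"), value);
        }
        RegCloseKey(hKey);
    }
    else
    {
        printf("RegOpenKeyEx failed: %ld\n", result);
    }
    return 0;
}
```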
The principle of least privilege doesn't say that no one can have permissions beyond those granted to members of the Users group. What it says is that a user should have exactly the permissions he or she needs to do the job—no more and no less. Developers may need a few more permissions, but only a few.
Common Software Problems
What's the problem with a developer having administrator privileges on her local machine? The problem is that developing software in that kind of environment opens the door to LUA bugs, which can arise in the earliest stages of a project and compound from there. LUA stands for least-privileged user account, an account that's usually a member of the Users group. Windows uses an access control list (ACL) to keep track of whether a specific user has the proper permissions to access an object; a typical LUA bug is an ACL issue in which software attempts to access a protected resource, such as a disk file or a registry key, for which it doesn't have permissions.
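To make the failure concrete, here's a minimal Win32 sketch of what a classic LUA bug looks like at the API level; the file name is hypothetical. Run as an administrator, the call succeeds and the bug stays hidden; run as a mere user, CreateFile fails with ERROR_ACCESS_DENIED because of the ACL on System32.

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    // Classic LUA bug: writing to a protected location. An administrator
    // never sees the failure; a mere user gets ERROR_ACCESS_DENIED.
    HANDLE hFile = CreateFile(TEXT("C:\\Windows\\System32\\myapp.cfg"),
                              GENERIC_WRITE, 0, NULL, CREATE_ALWAYS,
                              FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        DWORD err = GetLastError();
        if (err == ERROR_ACCESS_DENIED)
            printf("Access denied: the ACL on System32 blocks mere users.\n");
        else
            printf("CreateFile failed with error %lu\n", err);
        return 1;
    }
    // ... write the configuration data ...
    CloseHandle(hFile);
    return 0;
}
```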
Here's a simple but typical scenario in which these kinds of bugs arise. Early in a project, a developer running with administrative rights writes some low-level methods that write a configuration file to the System32 directory. (Okay, this is an absurd example. Everyone reading this should know that's a terrible place to put a configuration file, but plenty of other locations on disk are just as bad. Let's keep things simple here.) The developer tests the routines and everything works great (the first time, of course). Then she puts the library into source control and she and her team tackle the rest of the application.
During the project, the team builds many components that use those configuration routines. Other components use a property of the Configuration object to get the fully qualified path to the configuration files, perhaps to update them or read them to back them up. Over the 18-month project period, hundreds of routines grow to depend on the specific location of the files.
Finally, the scheduled ship date is just two weeks away. Everyone is working hard and it looks like the deadline is going to be met a few days early with no unresolved issues, a first for this team. Thoughts turn to the inevitable lush pizza and champagne celebration.
One pesky tester then decides that she had better log in as a mere user and try things out—not because she thinks she'll find anything, but because the annoying client requires testing under those conditions. To make a long, painful story short, she finds that the app crashes in the messiest way, with the event log filled to capacity with nasty exceptions from throughout the application. The app ships three months late, after the team hunts down and eradicates all of the problems caused by the ACL issue on the configuration file. The team finally celebrates shipping with three-day-old whiskey and grilled cheese sandwiches in the corner diner, on their way to the unemployment line.
Okay, this example is a bit contrived, but real problems arise just as easily. Think of all the protected resources that a typical application touches, and it's easy to see how these bugs accumulate and compound.
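And the fix, had the bug been caught on day one, would have been trivial: store the file somewhere the user actually owns. Here's a minimal sketch, with hypothetical company and application names, that asks Windows for the current user's Application Data folder instead of hard-coding a protected path.

```cpp
#include <windows.h>
#include <shlobj.h>   // SHGetFolderPath
#include <tchar.h>

int main()
{
    TCHAR path[MAX_PATH];
    // Ask Windows for the per-user Application Data folder instead
    // of hard-coding a protected location like System32.
    if (SUCCEEDED(SHGetFolderPath(NULL, CSIDL_APPDATA, NULL,
                                  SHGFP_TYPE_CURRENT, path)))
    {
        TCHAR cfgPath[MAX_PATH];
        // The company and application names here are hypothetical.
        wsprintf(cfgPath, TEXT("%s\\MyCompany\\MyApp.cfg"), path);
        _tprintf(TEXT("The configuration file belongs in: %s\n"), cfgPath);
        // Create the subdirectory and the file here; both operations
        // succeed for a mere user, because the user owns everything
        // under his or her own profile directory.
    }
    return 0;
}
```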
A common response I get when I present this material live is that the questioner's company can't possibly have this problem because they test all along, sometimes even as a mere user. The catch, of course, is that it's not enough to test as a mere user; you have to exercise every code path under a variety of user permissions. Do you test that thoroughly?
But you shouldn't rely on testing. Why not just catch the problem the first time you run the code you just wrote, and nip it in the bud? The earlier you catch a problem, the cheaper it is to fix.
A related LUA bug arises from attempting to open resources with too broad a permission request. Continuing the configuration file example, let's say that our developer writes code to open the file to read the contents. Because it's easier than stopping to think about what file access is needed for an operation, and because sometime in the future the method might be enhanced to also write to the file, the developer opens the file requesting full access. The problem here is that even though a mere user can read and execute files in System32, a LUA doesn't have full control or write access. So a mere user running the app gets an exception, even though the user can read files—which is all the method needs to do. Beware of asking for more access than you need.
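The fix is the same discipline as before: request exactly the access the operation needs, and no more. Here's a minimal sketch, using the hosts file as a convenient example of a file under System32 that mere users can read:

```cpp
#include <windows.h>
#include <stdio.h>

int main()
{
    // Request only GENERIC_READ. A mere user can read this file, so
    // the call succeeds; requesting GENERIC_READ | GENERIC_WRITE (or
    // GENERIC_ALL) "just in case" would fail with access denied.
    HANDLE hFile = CreateFile(TEXT("C:\\Windows\\System32\\drivers\\etc\\hosts"),
                              GENERIC_READ, FILE_SHARE_READ, NULL,
                              OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
    if (hFile == INVALID_HANDLE_VALUE)
    {
        printf("CreateFile failed with error %lu\n", GetLastError());
        return 1;
    }
    char buffer[256];
    DWORD bytesRead = 0;
    // Read-only access is all this operation ever needed.
    ReadFile(hFile, buffer, sizeof(buffer), &bytesRead, NULL);
    printf("Read %lu bytes.\n", bytesRead);
    CloseHandle(hFile);
    return 0;
}
```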
Another common problem is building on Local Security Authority (LSA) functions. LSA is an integral part of Windows and freely available to call, but it requires administrative privileges. It's easy to write a whole subsystem that interacts with LSA, and then have to throw it out because mere users have to be able to run the application.
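If part of your application genuinely must depend on an admin-only facility, at least detect the situation up front and degrade gracefully rather than failing deep inside a subsystem. This sketch uses the standard CheckTokenMembership pattern (not an LSA call itself) to find out whether the current user is a member of the local Administrators group:

```cpp
#include <windows.h>
#include <stdio.h>

// Returns TRUE if the current thread's token belongs to the local
// Administrators group, using the documented CheckTokenMembership pattern.
BOOL IsRunningAsAdmin()
{
    BOOL isAdmin = FALSE;
    PSID adminGroup = NULL;
    SID_IDENTIFIER_AUTHORITY ntAuthority = SECURITY_NT_AUTHORITY;

    if (AllocateAndInitializeSid(&ntAuthority, 2,
                                 SECURITY_BUILTIN_DOMAIN_RID,
                                 DOMAIN_ALIAS_RID_ADMINS,
                                 0, 0, 0, 0, 0, 0, &adminGroup))
    {
        if (!CheckTokenMembership(NULL, adminGroup, &isAdmin))
            isAdmin = FALSE;
        FreeSid(adminGroup);
    }
    return isAdmin;
}

int main()
{
    if (IsRunningAsAdmin())
        printf("Admin-only features (such as the LSA subsystem) are available.\n");
    else
        printf("Running as a mere user; disable the admin-only features.\n");
    return 0;
}
```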
And the list of potential LUA bugs goes on and on.