ASP.NET security holes: Endemic problem, or just lazy programmers?

Newfound holes in ASP.NET show how developers must take part of the responsibility for application security instead of just blaming Microsoft.


Microsoft has gotten so much bad press over security that I tend to chalk up most newfound problems to nothing more than snipers shooting at a big, wounded target. Consequently, I viewed the announcement by security specialist H.D. Moore that he's uncovered large security holes in the new ASP.NET platform as more of the same. After speaking with Moore about his rather serious findings, though, I'm convinced that we developers must share the responsibility with Microsoft when it comes to making our applications secure.

Three big problems
Moore presented his findings at the recent CanSecWest security conference (you can read his presentation, "Breaking ASP.NET," at digitaloffense.com). He described three problems he found in the platform and demonstrated an attack exploiting each one. Some of these attacks, particularly the "cookie-less session" exploit, are hit-yourself-over-the-head simple.

Cookie-less sessions get no cigar
The "cookie-less session" exploit involves a flaw in ASP.NET's cookie-less session management scheme, which includes a session identifier (id) in any URL. Specifically, a new session is created whenever a new session id is handed to the server, rather than making the server responsible for creating them. This flaw allows an attacker to hijack the session and masquerade as a legitimate user to gain unfettered access to the application.

The attack Moore outlines goes something like this:
  1. Come up with a bogus session id that the server will nonetheless accept as valid, which is apparently not difficult to do since session ids are not generated cryptographically.
  2. Embed the session id in a URL sent to the user via e-mail or some other method.
  3. Once the user presents his or her credentials and is granted access to the page, the session id remains valid, and the attacker may use it to access the system with all permissions of the user he or she has just spoofed.
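ASP.NET 1.x provides no supported way to issue a fresh session id mid-session, but a developer can at least refuse to let a pre-supplied id survive login. The following code-behind sketch, a hedged illustration rather than anything Moore prescribes, abandons whatever session the login request arrived with and redirects to an absolute URL carrying no embedded id, forcing the server to mint a new one; ValidateCredentials, the control names, and start.aspx are all hypothetical:

  // C# login handler: don't let an attacker-chosen session id survive.
  private void OnLogin(object sender, EventArgs e)
  {
      if (!ValidateCredentials(txtUser.Text, txtPass.Text)) // hypothetical helper
          return;

      // One-time ticket to carry the login across the session swap.
      string ticket = Guid.NewGuid().ToString("N");
      Cache.Insert(ticket, txtUser.Text);

      Session.Abandon(); // discard the possibly fixated session

      // An absolute URL has no (sessionid) segment, so ASP.NET issues
      // a brand-new id on the next request; start.aspx redeems the
      // ticket and marks the fresh session as authenticated.
      Response.Redirect("http://www.example.com/shop/start.aspx?t=" + ticket);
  }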

Information leakage from poor configuration control
Moore also discovered a method that a potential attacker can use to examine the structure and even, under certain circumstances, the actual source code of the application in question. The trouble here revolves around poor configuration control on the part of the application's developer, and a healthy dose of mistaken documentation.

ASP.NET is capable of displaying custom error messages to a client in the event of an application fault. This feature is often disabled while an application is being debugged, which causes the server to return stack traces and other debugging information to the client instead of a generic "an error occurred" message. If custom errors are disabled and debugging support is inadvertently left on in a production application (debugging is the default setting in Visual Studio .NET), file names, paths, and the source code surrounding any error will all be returned to the client.
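Both switches live in web.config, and locking them down for deployment takes two lines. A sketch of the safe production settings, again using the ASP.NET 1.x schema:

  <!-- web.config: settings for a production deployment -->
  <configuration>
    <system.web>
      <!-- RemoteOnly: remote clients see only the generic error page;
           detailed errors are shown only to a browser on the server itself -->
      <customErrors mode="RemoteOnly" />
      <!-- debug="true" is what leaks stack traces, file paths, and
           surrounding source; be certain it is off in production -->
      <compilation debug="false" />
    </system.web>
  </configuration>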

This information is embedded in comments in the HTML for the error page, says Moore, so you'd need to view the page's source to actually see it, but the fact that it's there at all constitutes a potentially serious problem. All of it could prove useful to an attacker performing reconnaissance on your application, or could even aid a competitor bent on reverse engineering it.

If that's not enough for you, consider that the ISAPI file extension filter, the feature that protects certain file types from being arbitrarily downloaded and viewed, does not automatically prevent files of type txt, csv, and xml from being accessed by authenticated users. If you can't see the problem here, take a look at Microsoft's recommended practice for user authentication using an xml file. At one time, that example placed the users.xml file in the root directory, where any authenticated user can download and view it, gaining access to the passwords of all the application's users.

Moore says that the documentation has since been updated to recommend placement of this file in a protected directory, so that door has been closed. However, it's possible that leftover project files (sln and suo extensions) and other questionable things might still be accessible. He recommends deleting these files before an application is placed into production, and also recommends that developers place nothing in the root directory that they wouldn't want someone else to see.
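For sensitive files that must remain inside the application, ASP.NET can also be told to refuse to serve them outright, using the same mechanism the platform already employs to block requests for config and source files. Here's a sketch using the built-in System.Web.HttpForbiddenHandler, with users.xml standing in for whatever you need to protect; the caveat is that this works only for files whose extensions IIS actually maps to the ASP.NET ISAPI filter:

  <!-- web.config: return "forbidden" for requests to users.xml.
       Caveat: IIS must map the extension to aspnet_isapi.dll,
       or the request never reaches this handler at all. -->
  <configuration>
    <system.web>
      <httpHandlers>
        <add verb="*" path="users.xml"
             type="System.Web.HttpForbiddenHandler" />
      </httpHandlers>
    </system.web>
  </configuration>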

A more traditional flaw
Moore also discovered that the aspnet_state session handler class is vulnerable to an overflow attack that can allow an attacker to execute arbitrary code. This is a fairly common flaw that's been detected in dozens of other applications, so, on the surface, it's not really surprising that he's found it here. What is surprising, stunning actually, is that the flaw exists at all: .NET's managed run-time environment should preclude the possibility of unprotected memory access in an application and prevent this sort of attack from working.

Security = somebody else's problem?
Moore says that developer ignorance and misinformation by Microsoft and various technical writers are partly to blame for the problems he's uncovered. He says, for example, that of the books, tutorials, and Microsoft documentation he's examined, none discussed in depth any aspect of validating user input, which would circumvent the overflow vulnerability he found.
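For what it's worth, the missing advice isn't complicated. Here's a minimal code-behind sketch of the sort of validation Moore has in mind, whitelisting characters and capping length before a value is trusted anywhere; the field name, pattern, and error page are hypothetical:

  // C#: validate a form field before storing or forwarding it.
  using System.Text.RegularExpressions;

  private void Page_Load(object sender, EventArgs e)
  {
      string userName = Request.Form["userName"]; // hypothetical field

      // Accept only 1 to 32 alphanumerics, underscores, or hyphens.
      if (userName == null ||
          !Regex.IsMatch(userName, @"^[A-Za-z0-9_-]{1,32}$"))
      {
          Response.Redirect("badinput.aspx"); // hypothetical error page
          return;
      }

      Session["UserName"] = userName; // now safe to keep
  }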

It's tempting to categorize this as just another case of Microsoft glossing over security issues or rushing an insecure product to market, but I'm not sure the answer isn't closer to home. It's my experience that, by and large, developers hold fast to the belief that security is "someone else's problem." We tend to place responsibility for security on network engineers or tool vendors instead of ourselves. We've also developed (no pun intended) the bad habit of depending on secondary sources, which are often inaccurate, for information about the technology we use, rather than digging into primary sources like the documentation. Don't believe me? Why, then, is technical book publishing a multi-million-dollar industry?

Of course, as is the case with the stupid users.xml trick, documentation can be wrong. That's just another reason that self-education and a healthy dose of common sense are so important. There's an object lesson here: No technology is absolutely bulletproof, and we shouldn't expect it to be, even if it bears the Microsoft trademark.

Let me know what you think
I want to know what you think about the state of application security today. Send me an e-mail, or post to our discussion, even if you just want to call me a Microsoft apologist.