Ruminations of J.net: idle rants and ramblings of a code monkey

Thoughts on Secure File Downloads

.NET Stuff | Security | Web (and ASP.NET) Stuff
Well, that’s kinda over-simplifying it a bit. It’s more about file downloads and protecting files from folks that shouldn’t see them, and it comes from some of the discussion last night at the OWASP User Group.

So … I was thinking that I’d put up a master file-download page for my file repository. The idea is that there would be an admin section where I could upload the files, a process that would also put them into the database with the relevant information (name, content type, etc.). This design addresses one of the vulnerabilities discussed last night … insecure direct object reference (OWASP #4). Rather than giving out filenames, etc., the page hands out a file identifier. That way, there is no direct object reference. That file id is handed off to a handler (ASHX) that actually sends the file to the client (just doing a redirect from the handler doesn’t solve the issue at all).

But I got to thinking … I might also want to limit access to some files to specific users/logins. So now we are getting into restricting URL access (OWASP #10). If I use the same handler as mentioned above, I can’t use ASP.NET to restrict access, leaving me vulnerable. Certainly, using GUIDs makes the ids harder to guess, but that won’t prevent UserA, who has access to FileA, from sending a link to UserB, who does not have access to FileA. Once UserB logged in, there would be nothing to prevent him/her from getting to the file … there is no additional protection above and beyond the indirect object reference, and I’m not adequately protecting URL access.

This highlights one of the discussion points from last night: vulnerabilities often travel in packs. We may look at things like the OWASP Top Ten and identify individual vulnerabilities, but that treats the issues in isolation. The reality is that you will often have a threat with multiple potential attack vectors from different vulnerabilities, or a vulnerability that is used to exploit another vulnerability (for example, a Cross-Site Scripting vulnerability used to exploit a Cross-Site Request Forgery vulnerability, and so on).

So … what do I do here? Well, I could just not worry about it … the damage potential and level of risk are pretty low. But that really just evades the question, and it’s much more fun to attack this head on and come up with something that mitigates the threat. One method is to have a different download page for each role and then protect access to those pages in the web.config file. That would work, but it’s not an ideal solution. When coming up with mitigation strategies, we should keep usability in mind and balance it against the mitigation itself. This may not be ideal to the purist, but the reality is that we do need to take things like usability and end-user experience into account. Of course, there’s also the additional maintenance that the “simple” method would entail … something I’m not really interested in. The ideal scenario is a single download page that displays the files available to the current user, whether that user is anonymous or authenticated. So … let’s go through how to implement this in a way that mitigates (note … not eliminates, but mitigates) the threats.

First, the database. Here’s a diagram:

[Diagram: the FileList table joined to the FileRoleXREF table on FileId]

We have the primary table (FileList) and then the FileRoleXREF table. The latter holds the file ids and the roles that are allowed to access each file.
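In place of the diagram, here’s a rough T-SQL sketch of the layout … only FileId and RoleName are taken from the code below; the other columns and their types are assumptions of mine:

    -- A sketch of the schema ... only FileId and RoleName appear in
    -- the code below; the other columns are assumed.
    CREATE TABLE FileList (
        FileId      UNIQUEIDENTIFIER NOT NULL PRIMARY KEY, -- the indirect reference handed to the handler
        FileName    NVARCHAR(260)    NOT NULL,             -- physical name ... never exposed to the client
        ContentType NVARCHAR(100)    NOT NULL
    );

    CREATE TABLE FileRoleXREF (
        FileId   UNIQUEIDENTIFIER NOT NULL REFERENCES FileList (FileId),
        RoleName NVARCHAR(256)    NOT NULL,
        PRIMARY KEY (FileId, RoleName)
    );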
A file that everyone is allowed to access will simply have no records in the XREF table. To display the list of files for the current user, we need to build the Sql statement dynamically, with a where clause based on the user’s roles. This, by the way, is one of the “excuses” I’ve heard for building Sql statements with string concatenation. It’s not a valid one; it just takes a little more work to do it right. And because the role values are passed as parameters rather than concatenated into the statement, we’ve also mitigated Sql injection … even though the risk here is low, since the list of roles comes from a trusted source. Still, it’s easy and it’s better to be safe. So … here’s the code.

    public static DataTable GetFilesForCurrentUser()
    {
        //We'll need this later.
        List<SqlParameter> paramList = new List<SqlParameter>();

        //Add the base Sql.
        //This includes the "Where" for files for anon users.
        StringBuilder sql = new StringBuilder(
            "SELECT * FROM FileList " +
            "WHERE (FileId NOT IN " +
            "(SELECT FileId FROM FileRoleXREF))");

        //Check the user ...
        IPrincipal crntUser = HttpContext.Current.User;
        if (crntUser.Identity.IsAuthenticated)
        {
            string[] paramNames = GetRoleParamsForUser(paramList, crntUser);
            //Now add to the Sql.
            sql.Append(" OR (FileId IN (SELECT FileId FROM " +
                "FileRoleXREF WHERE RoleName IN (");
            sql.Append(String.Join(",", paramNames));
            sql.Append(")))");
        }
        return GetDataTable(sql.ToString(), paramList);
    }

    private static string[] GetRoleParamsForUser(List<SqlParameter> paramList,
        IPrincipal crntUser)
    {
        //Now, add the select for the roles.
        string[] roleList = Roles.GetRolesForUser(crntUser.Identity.Name);

        //Create the parameters for the roles.
        string[] paramNames = new string[roleList.Length];
        for (int i = 0; i < roleList.Length; i++)
        {
            string role = roleList[i];
            //Each role is a parameter ...
            string paramName = "@role" + i.ToString();
            paramList.Add(new SqlParameter(paramName, role));
            paramNames[i] = paramName;
        }
        return paramNames;
    }

From there, creating the command and filling the DataTable is simple enough. I’ll leave that as an exercise for the reader.
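If you’d rather skip the homework, here’s a minimal sketch of what that helper could look like … the “FileRepository” connection string name is an assumption of mine, not part of the original code:

    //A minimal sketch, not the original helper ... assumes a
    //connection string named "FileRepository" in web.config.
    //using System.Configuration;
    //using System.Data;
    //using System.Data.SqlClient;
    public static DataTable GetDataTable(string sql,
        List<SqlParameter> paramList)
    {
        string connString = ConfigurationManager
            .ConnectionStrings["FileRepository"].ConnectionString;
        using (SqlConnection conn = new SqlConnection(connString))
        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddRange(paramList.ToArray());
            DataTable result = new DataTable();
            using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
            {
                //Fill opens and closes the connection for us.
                adapter.Fill(result);
            }
            return result;
        }
    }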
This still, however, doesn’t protect us from the failure to restrict URL access mentioned above. True, UserA only sees the files that he has access to and UserB only sees the files that she has access to. But that’s still not stopping UserA from sending UserB a link to a file that he can access and she can’t. To prevent this, we have to add some additional checking to the ASHX handler to validate access. It’d be easy enough to do with a couple of calls to Sql, but here’s how I do it with a single call …

    public static bool UserHasAccess(Guid fileId)
    {
        //We'll need this later.
        List<SqlParameter> paramList = new List<SqlParameter>();

        //Add the file id parameter.
        paramList.Add(new SqlParameter("@fileId", fileId));

        //Add the base Sql.
        //This includes the count of role entries for the file.
        StringBuilder sql = new StringBuilder(
            "SELECT A.RoleEntries, B.EntriesForRole " +
            "FROM (SELECT COUNT(*) AS RoleEntries " +
            "FROM FileRoleXREF X1 " +
            "WHERE (FileId = @fileId)) AS A CROSS JOIN ");

        //Check the user ...
        IPrincipal crntUser = HttpContext.Current.User;
        if (crntUser.Identity.IsAuthenticated)
        {
            sql.Append("(SELECT Count(*) AS EntriesForRole " +
                "FROM FileRoleXREF AS X2 " +
                "WHERE (FileId = @fileId) AND " +
                "RoleName IN (");
            string[] roleList = GetRoleParamsForUser(paramList, crntUser);
            sql.Append(String.Join(",", roleList));
            sql.Append(")) B");
        }
        else
        {
            sql.Append("(SELECT 0 AS EntriesForRole) B");
        }
        DataTable check = GetDataTable(sql.ToString(), paramList);
        if ((int)check.Rows[0]["RoleEntries"] == 0)
        {
            //No role entries at all ... anon access is allowed.
            return true;
        }
        else if ((int)check.Rows[0]["EntriesForRole"] > 0)
        {
            //The user is in at least one role with access.
            return true;
        }
        else
        {
            return false;
        }
    }

So, this little check before having the handler stream the file to the user makes sure that someone isn’t getting access via URL to something they shouldn’t see, and the same parameterization as before keeps Sql injection mitigated. Now, I’ve not gotten everything put together in a full-blown usable application, but I wanted to show some of the thought process around securing a relatively simple piece of functionality such as this. A bit of creativity in the process is also necessary … you have to think outside the use case and go off the “happy path” to identify the attack vectors and the threats they represent.
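To round things out, here’s a rough sketch of what the handler itself might look like … the DownloadHelper class (holding the static methods above), the “id” query string parameter, and the ~/FileStore folder are all placeholders of mine, not the actual implementation:

    //A sketch only ... assumes the static methods above live on a
    //(hypothetical) DownloadHelper class and that uploaded files sit
    //in a ~/FileStore folder that isn't directly requestable.
    //using System;
    //using System.Collections.Generic;
    //using System.Data;
    //using System.Data.SqlClient;
    //using System.Web;
    public class FileDownloadHandler : IHttpHandler
    {
        public void ProcessRequest(HttpContext context)
        {
            Guid fileId;
            try
            {
                fileId = new Guid(context.Request.QueryString["id"]);
            }
            catch (Exception)
            {
                //Missing or malformed id ... don't leak why.
                context.Response.StatusCode = 404;
                return;
            }

            //The check described above ... no access, no file.
            if (!DownloadHelper.UserHasAccess(fileId))
            {
                context.Response.StatusCode = 403;
                return;
            }

            //Look up the real name and content type via the indirect id.
            List<SqlParameter> paramList = new List<SqlParameter>();
            paramList.Add(new SqlParameter("@fileId", fileId));
            DataTable file = DownloadHelper.GetDataTable(
                "SELECT FileName, ContentType FROM FileList " +
                "WHERE FileId = @fileId", paramList);
            if (file.Rows.Count == 0)
            {
                context.Response.StatusCode = 404;
                return;
            }

            //Stream the file ... the physical name never hits the URL.
            context.Response.ContentType =
                (string)file.Rows[0]["ContentType"];
            context.Response.WriteFile(context.Server.MapPath(
                "~/FileStore/" + (string)file.Rows[0]["FileName"]));
        }

        public bool IsReusable
        {
            get { return true; }
        }
    }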

Fixing my instance of dasBlog

.NET Stuff | Web (and ASP.NET) Stuff
Well, I finally have the remaining issues with my dasBlog installation fixed - or at least mostly so. Of course, now I see that there is a new build out on CodePlex (mostly bug fixes) but I'm not going to bother with it at this point in time. Still, in digging around in dasBlog, I've found a couple of things that I stumbled over that are (I feel) issues. I'm not sure that I'd call them bugs per se, as they don't break the application, but I do think that they oughta be fixed. Don't get me wrong - dasBlog is a great blogging tool and pretty darn solid. I don't know if they've been fixed in the current release, but I'm gonna log them up there as well.

Exceptions for flow control: This is in newtelligence.dasBlog.Runtime.LoggingDataServiceXml, which appears to be using exception handling for flow control. Here's what's going on: when you load up the logs for a particular day, it goes to get those logs from the file system (the default location is under the Logs folder of the root web). First it checks for archived (zipped) files ... and it does this right, by checking for file existence. But when it looks for files in the Xml format, it doesn't check for file existence. It just tries to open the file, assuming that it's there. Well, it may not be ... if the date is in the future or there are no logs for it (i.e. you tried to get the logs for pre-dasBlog days), it throws a FileNotFoundException, which is then caught by the generic application exception handler. The simple solution is to wrap the using block for the new StreamReader in an if block that checks File.Exists(path) - there's a quick sketch of what I mean at the end of this post.

newtelligence.dasBlog.Runtime.LoggingDataServiceXml.ArchiveLogFiles: This uses ThreadPool.QueueUserWorkItem for multi-threading. That isn't really ideal in a web application ... you really should use the page async model. Though ... I'm scratching my head as to the correct way to implement that here. This is called by the WriteEvent method of the LoggingDataService ... so it's not something that you can really put into a PageAsyncTask. Doing so would, it seems, involve a change to the admin pages to call this on demand from the admin UI. That, however, may also violate the source independence of the archive provider.

These aren't, by any stretch of the imagination, show stoppers. The exception handling one, though, really gets under my skin. Exception handling for flow control - and I'm calling it that, though to do so is something of a stretch because it's only caught by the application's generic handler - is pretty bad mojo. And it's expensive ... much more expensive than a simple check. Just do an MSN Search for Exception Flow Control; it's all over and it's not language-specific. Fortunately, in this case, the fix is a simple one.

I do have a little beef with the reporting, though. Daily reports in your email are good - don't get me wrong - but it'd be nice to see aggregated reports across a week, a month or more. And yes, before you ask, I've also applied to join the team so that I can fix things and add new stuff rather than sitting here complaining about it.
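For illustration, here's roughly the shape of that File.Exists guard ... the method name and the line-by-line read are stand-ins of mine, not the actual dasBlog source:

    //A sketch of the guard, not the actual dasBlog code ... 'path' is
    //the computed log file path, and reading lines stands in for
    //however the runtime actually deserializes the entries.
    //using System.Collections.Generic;
    //using System.IO;
    private static List<string> LoadLogEntries(string path)
    {
        List<string> entries = new List<string>();
        //Check first rather than letting the StreamReader throw a
        //FileNotFoundException into the generic application handler.
        if (File.Exists(path))
        {
            using (StreamReader reader = new StreamReader(path))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    entries.Add(line);
                }
            }
        }
        //A missing file just means no entries for that day ... the
        //normal case, not an exceptional one.
        return entries;
    }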

More on Membership (from Zain's Presentation yesterday)

.NET Stuff | Web (and ASP.NET) Stuff
When I posted the notes yesterday from Zain's presentation, I posted something about adding users in code ... but I only mentioned the CreateUser() method. Well, I just got an email from the person that requested this little tip ... with a little curve thrown in. They also needed to add some profile information. So, I sat down and wrote a little sample of how to do this (liberally sprinkled with comments). For those that attended my performance session in Second Life, there's another little perf tip in here that I forgot to put into my presentation (it's going in now though). Here it is ...

    //Some assumptions ...
    //You have all the user information in a data reader.
    //It actually doesn't matter where it comes from,
    //but using a reader makes the sample easier. :)
    //The using statements that are important here:
    //using System.Web.Security;
    //using System.Web.Profile;
    public static void AddUsers(SqlDataReader rdr)
    {
        //Set the script timeout to 10 minutes (the value is in seconds) ...
        //Use a reasonable value.
        HttpContext.Current.Server.ScriptTimeout = 600;

        //And we will get the indexes for the fields that we want.
        //This is a good deal more efficient than rdr["FieldName"],
        //but ... you *must* do it before entering the loop.
        //It's also type safe as well. :-)
        int userId = rdr.GetOrdinal("UserID");
        int password = rdr.GetOrdinal("Password");
        int email = rdr.GetOrdinal("Email");
        int favColor = rdr.GetOrdinal("FavoriteColor");
        int height = rdr.GetOrdinal("Height");
        int street = rdr.GetOrdinal("Street");

        while (rdr.Read())
        {
            //Add the users ...
            //I'm using one of the overloads here ... there are several.
            //This returns the membership user that was created.
            MembershipUser usr = Membership.CreateUser(rdr.GetString(userId),
                rdr.GetString(password), rdr.GetString(email));

            //Now you will need to create the profile.
            ProfileBase prof = System.Web.Profile.ProfileBase.Create(usr.UserName);

            //And I'm making up some properties here.
            //These will need to be configured.
            prof.PropertyValues["FavoriteColor"].PropertyValue = rdr.GetString(favColor);
            prof.PropertyValues["Height"].PropertyValue = rdr.GetString(height);

            //And, if you're using groups
            //(a good idea if you have a lot of properties) ...
            ProfileGroupBase addrGroup = prof.GetProfileGroup("Address");
            addrGroup.SetPropertyValue("Street", rdr.GetString(street));

            //Make sure that the profile is saved.
            prof.Save();
        }
    }

And ... that's all folks!
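Since the profile properties in the sample are made up, they won't exist until they're configured. Here's roughly what that configuration looks like in web.config ... the property and group names here just match the made-up ones above:

    <!-- A sketch only ... the property and group names match the -->
    <!-- made-up ones in the sample above. -->
    <system.web>
      <profile>
        <properties>
          <add name="FavoriteColor" type="System.String" />
          <add name="Height" type="System.String" />
          <group name="Address">
            <add name="Street" type="System.String" />
          </group>
        </properties>
      </profile>
    </system.web>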

ASP.NET Perf Tips: Second Life

Events | Second Life | Web (and ASP.NET) Stuff
I just finished my presentation on ASP.NET Performance Tips for the Second Life .NET User Group. I must say ... it went very well! The feedback from the attendees was very positive and I think everyone learned something ... if not several things. In fact, I did have one attendee say "I didn't think I'd learn anything here, but I did!". That just makes me feel all warm and fuzzy inside. So ... some links ... First, Fritz Onion's blog with the perf results that I referred to (there are actually 2 entries):
http://www.pluralsight.com/blogs/fritz/archive/2004/10/19/2892.aspx
http://www.pluralsight.com/blogs/fritz/archive/2005/02/14/5861.aspx
And ... the PowerPoint deck and a sample of Async Tasks: http://downloads.microsoft-j.net/Performance_tips.zip. One note: this is the deck that I use for RL presentations. I trimmed some of the text from it so it would rez better in-world. But I think the additional notes will be helpful.
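If you can't grab the zip, here's a bare-bones sketch of the PageAsyncTask pattern the sample covers ... the class, URL, and method names are placeholders of mine, and the page needs Async="true" in its @ Page directive:

    //A sketch of the pattern, not the downloadable sample.
    //using System;
    //using System.Net;
    //using System.Web.UI;
    public partial class AsyncDemo : Page
    {
        private WebRequest _request;

        protected void Page_Load(object sender, EventArgs e)
        {
            //Register the task ... ASP.NET runs it after PreRender
            //without tying up the request thread while we wait.
            Page.RegisterAsyncTask(
                new PageAsyncTask(BeginWork, EndWork, TimeoutWork, null));
        }

        private IAsyncResult BeginWork(object sender, EventArgs e,
            AsyncCallback cb, object state)
        {
            //Placeholder work: an async call to some backend service.
            _request = WebRequest.Create("http://www.example.com/");
            return _request.BeginGetResponse(cb, state);
        }

        private void EndWork(IAsyncResult ar)
        {
            using (WebResponse response = _request.EndGetResponse(ar))
            {
                //... use the response to populate the page ...
            }
        }

        private void TimeoutWork(IAsyncResult ar)
        {
            //Runs if the AsyncTimeout on the @ Page directive expires.
        }
    }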

Welcome Rob Conery to Microsoft!

.NET Stuff | Web (and ASP.NET) Stuff | Open Source
I just got a little message from my old buddy Rob Conery ... he's been hired by Microsoft (the ASP.NET team in fact) to work on SubSonic ... a tool that builds an object-centric DAL for you. It's cool stuff ... I've played with it a bit and even though I'm a little leery of code generation tools (call me old fashioned), it's good stuff.  You can customize the templates if ya want and add to the generated DAL pretty easily.  Rob was also the instigator for the Commerce Starter Kit ... now dashCommerce and I've worked with him on that as well.  So ... welcome to Microsoft Rob!  It's good to have ya on board!