Ruminations of idle rants and ramblings of a code monkey

Thoughts on Secure File Downloads

.NET Stuff | Security | Web (and ASP.NET) Stuff

Well, that’s kinda over-simplifying it a bit. It’s really about file downloads and protecting files from folks who shouldn’t see them, and it comes from some of the discussion last night at the OWASP User Group. So … I was thinking that I’d put up a master file-download page for my file repository. The idea is that there would be an admin section where I could upload the files, a process that would also record them in the database with the relevant information (name, content type, etc.). This relates to one of the vulnerabilities discussed last night … insecure direct object reference (OWASP #4). Rather than giving out filenames, etc., the page would hand out a file identifier. That way, there is no direct object reference. That file id would be handed off to a handler (ASHX) that would actually send the file to the client (just doing a redirect from the handler doesn’t solve the issue at all).

But I got to thinking … I might also want to limit access to some files to specific users/logins. So now we are getting into failure to restrict URL access (OWASP #10). If I use the same handler as mentioned above, I can’t use ASP.NET’s authorization rules to restrict access, leaving me vulnerable. Certainly, using GUIDs as file ids makes them harder to guess, but it won’t prevent UserA, who has access to FileA, from sending a link to UserB, who does not have access to FileA. Once UserB logged in, there would be nothing to prevent him/her from getting to the file … there is no additional protection above and beyond the indirect object reference, and I’m not adequately protecting URL access.

This highlights one of the discussion points from last night – vulnerabilities often travel in packs. We may look at things like the OWASP Top Ten and identify individual vulnerabilities, but that treats the issues in isolation. The reality is that a single threat will often have multiple potential attack vectors arising from different vulnerabilities. Or one vulnerability may be used to exploit another (for example, a Cross-Site Scripting vulnerability that is used to exploit a Cross-Site Request Forgery vulnerability, and so on).

So … what do I do here? Well, I could just not worry about it … the damage potential and level of risk are pretty low. But that really just evades the question; it’s much more fun to attack this head on and come up with something that mitigates the threat. One method is to have a different download page for each role and then protect access to those pages in the web.config file. That would work, but it’s not an ideal solution. When coming up with mitigation strategies, we should keep usability in mind and balance it against the mitigation itself. That may not please the purist, but the reality is that we do need to take usability and the end-user experience into account. There’s also the additional maintenance that the “simple” method would entail, something I’m not really interested in. The ideal scenario is a single download page that displays the files available to the current user based on their identity, whether anonymous or authenticated.

So … let’s go through how to implement this in a way that mitigates (note … not eliminates but mitigates) the threats.

First, the database. Here’s a diagram:

[Database diagram: the FileList table and the FileRoleXREF cross-reference table linking file ids to roles]

We have the primary table (FileList) and then the FileRoleXREF cross-reference table. The latter holds the file ids and the roles that are allowed to access each file. A file that everyone is allowed to access has no records in this table.
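The full schema isn’t spelled out in the post, but a minimal sketch of the two tables might look something like this. FileId and RoleName are the columns the queries below actually use; every other column name and type here is an assumption for illustration:

```sql
-- Hypothetical schema based on the description above.
CREATE TABLE FileList (
    FileId      UNIQUEIDENTIFIER NOT NULL PRIMARY KEY,
    FileName    NVARCHAR(260)    NOT NULL,
    ContentType NVARCHAR(100)    NOT NULL,
    Description NVARCHAR(MAX)    NULL
);

-- One row per (file, role) pair; no rows means everyone may access the file.
CREATE TABLE FileRoleXREF (
    FileId   UNIQUEIDENTIFIER NOT NULL REFERENCES FileList (FileId),
    RoleName NVARCHAR(256)    NOT NULL,
    PRIMARY KEY (FileId, RoleName)
);
```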

To display this list of files for the current user, we need to build the Sql statement dynamically, with a where clause based on that user’s roles. Dynamic Sql, by the way, is one of the “excuses” I’ve heard for building Sql statements with string concatenation. It’s not a valid one; it just takes a little more work to do it safely. Because the role values are passed as parameters rather than concatenated into the statement, we’ve also mitigated Sql injection, even though the risk here is low since the list of roles comes from a trusted source. Still, it’s easy to do and it’s better to be safe. So … here’s the code.

public static DataTable GetFilesForCurrentUser()
{
    //We'll need this later.
    List<SqlParameter> paramList = new List<SqlParameter>();

    //Add the base Sql.
    //This includes the "Where" for files for anon users
    StringBuilder sql = new StringBuilder(
        "SELECT * FROM FileList " +
        "WHERE (FileId NOT IN " +
        "(SELECT FileId FROM FileRoleXREF))");

    //Check the user ...
    IPrincipal crntUser = HttpContext.Current.User;
    if (crntUser.Identity.IsAuthenticated)
    {
        string[] paramNames = GetRoleParamsForUser(paramList, crntUser);
        //Now add to the Sql
        sql.Append(" OR (FileId IN (SELECT FileId FROM " +
                   "FileRoleXREF WHERE RoleName IN (");
        sql.Append(String.Join(",", paramNames));
        //Close the IN list, the subquery and the OR clause.
        sql.Append(")))");
    }
    return GetDataTable(sql.ToString(), paramList);
}

private static string[] GetRoleParamsForUser(
    List<SqlParameter> paramList, IPrincipal crntUser)
{
    //Get the roles for the current user.
    string[] roleList = Roles.GetRolesForUser(crntUser.Identity.Name);

    //Create the parameters for the roles
    string[] paramNames = new string[roleList.Length];
    for (int i = 0; i < roleList.Length; i++)
    {
        string role = roleList[i];
        //Each role is a parameter ...
        string paramName = "@role" + i.ToString();
        paramList.Add(new SqlParameter(paramName, role));
        paramNames[i] = paramName;
    }
    return paramNames;
}
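To make the dynamic construction concrete, here is roughly what the finished statement would look like for an authenticated user in two roles. The @role0/@role1 values travel separately as SqlParameters; they are never concatenated into the text:

```sql
-- Hypothetical generated output for a user in two roles.
SELECT * FROM FileList
WHERE (FileId NOT IN (SELECT FileId FROM FileRoleXREF))
   OR (FileId IN (SELECT FileId FROM FileRoleXREF
                  WHERE RoleName IN (@role0, @role1)))
```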

From there, creating the command and filling the DataTable is simple enough. I’ll leave that as an exercise for the reader.
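That said, for readers who want the exercise filled in, a minimal sketch of GetDataTable might look like the following; the “FileRepository” connection string name is an assumption for this sketch:

```csharp
private static DataTable GetDataTable(string sql, List<SqlParameter> paramList)
{
    //Connection string name is an assumption; use whatever your app defines.
    string connStr = ConfigurationManager
        .ConnectionStrings["FileRepository"].ConnectionString;

    using (SqlConnection conn = new SqlConnection(connStr))
    using (SqlCommand cmd = new SqlCommand(sql, conn))
    {
        //Bind the parameters built up by the callers above.
        cmd.Parameters.AddRange(paramList.ToArray());

        DataTable table = new DataTable();
        using (SqlDataAdapter adapter = new SqlDataAdapter(cmd))
        {
            //Fill opens and closes the connection for us.
            adapter.Fill(table);
        }
        return table;
    }
}
```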

This still, however, doesn’t protect us from the failure to restrict URL access issue mentioned above. True, UserA only sees the files that he has access to and UserB only sees the files that she has access to. But that’s still not stopping UserA from sending UserB a link to a file that he can access, but she can’t. In order to prevent this, we have to add some additional checking into the ASHX file to validate access. It’d be easy enough to do it with a couple of calls to Sql, but here’s how I do it with a single call …

public static bool UserHasAccess(Guid FileId)
{
    //We'll need this later.
    List<SqlParameter> paramList = new List<SqlParameter>();

    //Add the file id parameter
    paramList.Add(new SqlParameter("@fileId", FileId));

    //Add the base Sql.
    //This includes the "Where" for files for anon users
    StringBuilder sql = new StringBuilder(
        "SELECT A.RoleEntries, B.EntriesForRole " +
        "FROM (SELECT COUNT(*) AS RoleEntries " +
        "FROM FileRoleXREF X1 " +
        "WHERE (FileId = @fileId)) AS A CROSS JOIN ");

    //Check the user ...
    IPrincipal crntUser = HttpContext.Current.User;
    if (crntUser.Identity.IsAuthenticated)
    {
        sql.Append("(SELECT Count(*) AS EntriesForRole " +
                   "FROM FileRoleXREF AS X2 " +
                   "WHERE (FileId = @fileId) AND " +
                   "RoleName IN (");
        string[] roleList = GetRoleParamsForUser(paramList, crntUser);
        sql.Append(String.Join(",", roleList));
        sql.Append(")) B");
    }
    else
    {
        sql.Append("(SELECT 0 AS EntriesForRole) B");
    }

    DataTable check = GetDataTable(sql.ToString(), paramList);
    if ((int)check.Rows[0]["RoleEntries"] == 0) //Anon Access
    {
        return true;
    }
    else if ((int)check.Rows[0]["EntriesForRole"] > 0)
    {
        return true;
    }
    else
    {
        return false;
    }
}

So, this little check, run before the handler streams the file to the client, makes sure that someone isn’t using a direct URL to get to something they shouldn’t have access to. And because the roles are parameterized here as well, we’ve again mitigated Sql injection.
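To show where that check sits, here’s a sketch of what the ASHX handler might look like. UserHasAccess is the method above; GetFileRow and the PhysicalPath column are hypothetical stand-ins for however the file details are actually stored:

```csharp
public class FileDownload : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        //The indirect object reference: a file id, not a filename or path.
        Guid fileId = new Guid(context.Request.QueryString["id"]);

        //The access check from above runs before any bytes go out.
        if (!UserHasAccess(fileId))
        {
            //Don't even confirm that the file exists.
            context.Response.StatusCode = 404;
            return;
        }

        //GetFileRow is a hypothetical helper that returns the
        //FileList row (name, content type, path) for this id.
        DataRow file = GetFileRow(fileId);
        context.Response.ContentType = (string)file["ContentType"];
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=" + (string)file["FileName"]);
        context.Response.TransmitFile((string)file["PhysicalPath"]);
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
```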

Now, I’ve not gotten everything put together into a “full blown usable application”. But … I wanted to show some of the thought process around securing a relatively simple piece of functionality such as this. A bit of creativity in the process is also necessary … you have to think outside the use case and go off the “happy path” to identify the attack vectors and the threats they represent.

Comments (2) -

Mark Kerzner 6/12/2008 8:18:54 PM #

Thank you, J, for teaching the good road to choose.

The attack I can suggest is XSS pretending to be the trusted user. But you will probably counter with using XSS-secure library.

J Sawyer 6/12/2008 11:51:44 PM #

Hello Mark,
   I'm not sure what the attack vector would be for an XSS attack. No user input is displayed to the browser, so that would be a hard one to exploit. Well, with the exception of the Description field in the database, but that's assuming that it allows HTML content.
   As far as validating the user, it's all done server-side and based on the ASP.NET authentication cookie. While it is possible to do a replay attack with that cookie, the cookie's lifetime is short (20 minutes), so the window of opportunity for exploiting it is quite small. It's also a session cookie, so it's not (supposed to be) persisted on the client. The validation and checking of access is all done server-side based on the auth ticket. The ticket itself is encrypted and the roles are not cached client-side.