
The risks of disclosing web vulnerabilities

May 3, 2006

This article was contributed by Jake Edge.

One would think that an organization would be grateful to someone who found a vulnerability in its web application and provided the information needed to fix it. A recent episode in which a security researcher was charged with breaching the security of an online database makes it clear, however, that such gratitude cannot be counted on. Eric McCarty found a flaw in the University of Southern California (USC) online application system that allowed a SQL injection attack to extract the contents of a database containing some 275,000 records of current students and applicants.

According to the original SecurityFocus article, McCarty discovered the flaw while using the system to apply to USC. The username and password text fields could be used to feed SQL commands to the database, allowing the entire contents to be read and/or modified. He then anonymously contacted SecurityFocus to disclose the flaw. Other than corresponding with SecurityFocus anonymously, McCarty did little, if anything, to cover his tracks, believing he was acting in good faith.
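The article does not show USC's actual code, but a flaw of this kind typically comes from splicing form input directly into the SQL text. The sketch below is a hypothetical illustration (the table, columns, and inputs are invented), contrasting a vulnerable login query with the parameterized form that would have prevented the attack:

```python
import sqlite3

# Hypothetical schema standing in for the application database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE applicants (username TEXT, password TEXT, ssn TEXT)")
conn.execute("INSERT INTO applicants VALUES ('alice', 's3cret', '123-45-6789')")

def login_vulnerable(username, password):
    # DANGEROUS: attacker-controlled input becomes part of the SQL itself.
    query = ("SELECT * FROM applicants WHERE username = '%s' "
             "AND password = '%s'" % (username, password))
    return conn.execute(query).fetchall()

def login_safe(username, password):
    # SAFE: placeholders keep the input as data, never as SQL.
    return conn.execute(
        "SELECT * FROM applicants WHERE username = ? AND password = ?",
        (username, password)).fetchall()

# An input like this makes the WHERE clause always true in the
# vulnerable version, returning every row in the table:
rows = login_vulnerable("' OR '1'='1", "' OR '1'='1")
print(len(rows))                                        # all records leak
print(len(login_safe("' OR '1'='1", "' OR '1'='1")))    # no match
```

With string concatenation, the injected `' OR '1'='1` turns the query into one that matches every record; with placeholders the same input is just an (unmatched) literal string.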

SecurityFocus contacted USC; the administrators of the web site claimed that only two records could be accessed via the SQL injection. When confronted with additional records, they admitted that the entire database was vulnerable and shut down the site for ten days in order to fix it. In addition, the administrators found the entries in the logfiles corresponding to the 'attack' and provided the IP address to law enforcement.

The IP address allowed the FBI to determine McCarty's identity and to execute a search warrant against him and his Gmail account. On his computer they evidently found seven records from the USC database, and his Gmail account provided copies of the emails he had sent to SecurityFocus describing the vulnerability. The charges do not claim that he did anything with the seven records, just that he possessed them and had obtained them via 'misuse'.

The affidavit filed in the case claims that McCarty caused $140,000 in damages by causing USC to shut down its system for ten days. It is somewhat difficult to see how telling someone about a flaw in their system makes one responsible for the time it takes them to fix it. It would seem that the original programmers of the system would be the ones who are culpable here.

Computer misuse statutes are typically written in such a way that any access, other than what is intended by the site owner, could be considered a crime. The intent of the 'perpetrator' rarely seems to be examined and this case is reminiscent of the conviction of a British security consultant last year. Daniel Cuthbert was concerned that he had been phished at a tsunami relief website and he did two simple tests to see if the site was for real. These tests set off alarms in an Intrusion Detection System and ultimately led to his conviction. In addition, his arrest caused him to lose his job as a security consultant.

It is very difficult to see how these kinds of prosecutions will lead to a safer internet and, in fact, they would seem likely to cause just the opposite. Even checking for the existence of a flaw is criminal (at least in some jurisdictions), and actually finding a flaw and disclosing it (not publicly, but privately to the affected organization) can lead to charges in other jurisdictions. Anyone who thinks they may have spotted a potential problem area in a web application would be risking a great deal by probing it further. In addition, administrators of these sites are unlikely to even look at a flaw unless one can show them an exploit. Even then, as the first USC response shows, they may be unwilling or unable to see the implications of the flaw. The sad fact is that the best response to the discovery of a web site vulnerability may be to keep it to oneself.

[Editor's note: anybody who informs LWN of a vulnerability in the LWN.net code will, assuming they have not exploited that vulnerability for their own gain, be thanked, publicly if desired.]



The risks of disclosing web vulnerabilities

Posted May 4, 2006 6:37 UTC (Thu) by error27 (subscriber, #8346) [Link]

It seems like the student council or the ACM club there should take a stand. Someone has done them a favour and the administration has acted like slime.

The risks of disclosing web vulnerabilities

Posted May 4, 2006 8:44 UTC (Thu) by dvrabel (guest, #9500) [Link]

Sounds like McCarty didn't just find a flaw but used it to obtain personal data to which he was neither authorized nor entitled, and then proceeded to store and distribute that data. Should my personal data be misused in such a way, I would expect and require investigation by the police or other appropriate authorities.

If we applaud such misuse of personal data then it becomes all too easy for wholesale misuse to occur under the guise of "security research".

The risks of disclosing web vulnerabilities

Posted May 4, 2006 9:27 UTC (Thu) by eskild (subscriber, #1556) [Link]

Well, yes, but... Reading the article, it appears the web site owners didn't acknowledge the problem until they were presented with data that proved their systems' failure.

In other words: They flat-out denied they had a problem -- until it was proven to them with their own data.

It is hard for me to see how an organization acting in denial of their own problems could be convinced of their web site deficiencies in another manner.

So he *had* to retrieve data, he *had* to distribute them. But, of course, he *didn't* have to retain them once they were sent.

I see this behaviour as a typical "we don't have any problems, but we'll sue you to pieces if we have" scare tactic. Utterly, utterly irresponsible. And pathetic, too.

An anecdote: A couple years back I found an SQL injection vulnerability in a major Danish site, and I simply gave them a call. After some shuffling around with my phone call, I got to one of the developers. She was shocked -- but thankful, and they fixed it rapidly. That's how these things should work.

The risks of disclosing web vulnerabilities

Posted May 14, 2006 19:48 UTC (Sun) by kasperd (guest, #11842) [Link]

I see this behaviour as a typical "we don't have any problems, but we'll sue you to pieces if we have" scare tactic. Utterly, utterly irresponsible. And pathetic, too.

I have experienced that as well with a Danish company. My experience with that particular company was a different reaction to each email I sent to them:

  1. Ignore it: I wrote an email to them, and it appeared to be ignored. I got no reply, and nothing was done about the problem.
  2. Try to talk their way out of it: I got a thankful answer, in which they stated that they would do something about the problem. But they didn't.
  3. Deny it: After my third email they tried to deny the existence of the problem, to which I responded that in that case it couldn't do any harm to publish my findings.
  4. Threaten: Their next reaction was to threaten me with a lawsuit in case anybody found out about the problem.

At that point I decided the best I could do was to report the company to the authorities for keeping personal data without the level of security required by law. At least I felt that was the best way to protect my own position in case of a lawsuit.

The company was given a very long time to respond to the problem. And just before their time ran out, they removed that particular symptom. However, there was no proof that the vulnerability was really fixed. And in other places there were still symptoms showing vulnerabilities, and other problems showing they just don't know what the hell they are doing.

A couple years back I found an SQL injection vulnerability in a major Danish site, and I simply gave them a call. After some shuffling around with my phone call, I got to one of the developers. She was shocked -- but thankful, and they fixed it rapidly.

Nice to hear that there are still companies handling such approaches reasonably. Unfortunately they are rare. I have reached the point where I don't know if it is worth the effort to tell sites about their security problems.

I think the next time I come across a security vulnerability in a Danish site I'm just going to report it straight to the authorities and then just publish the fact that this company has been reported.

The risks of disclosing web vulnerabilities

Posted May 4, 2006 8:48 UTC (Thu) by cate (subscriber, #1359) [Link]

[Editor's note: anybody who informs LWN of a vulnerability in the LWN.net code will, assuming they have not exploited that vulnerability for their own gain, be thanked, publicly if desired.]

Are the php errors in http://lwn.net/Gallery/ exploitable? ;-)

The risks of disclosing web vulnerabilities

Posted May 4, 2006 12:57 UTC (Thu) by csamuel (✭ supporter ✭, #2624) [Link]

The issue with Daniel Cuthbert was that the judge found that he lied to police initially and only later changed his story to what actually happened. This probably meant the difference between the fine he got and a conditional discharge.

Instant disclosure

Posted May 4, 2006 14:21 UTC (Thu) by jreiser (subscriber, #11027) [Link]

So if "responsible" disclosure is discouraged, then "No prisoners!" must be the reply. Such as: randomize the effective MAC address on a laptop, go to a free wireless cloud, post the zero-day exploit on IRC. Surely there are improvements and other strategies; let's hear them!

The risks of disclosing web vulnerabilities

Posted May 4, 2006 16:17 UTC (Thu) by copsewood (subscriber, #199) [Link]

I think there is a great difference between:
  • "researching" someone else's implementation of a program - which is used to store confidential data belonging to someone other than the security researcher, and
  • the security researcher implementing this program themselves, finding a vulnerability in their own implementation of it and giving the developer of this program appropriate time to fix it before publishing the exploit.
In the UK, as this article points out, this makes the difference between unauthorised and authorised access. Unless the system owner invites security reports of discovered vulnerabilities, effort should not be put into discovering these by an uninvited party. I may accidentally leave my door unlocked. If someone sees keys left in the outside door and rings the doorbell to tell the house owner, this is authorised access. If they go in through an unlocked door, or try to see how easy the lock is to pick, and wander around the house, this is trespassing -- as well as being a violation of privacy. Buying a particular make and model of door lock at a hardware shop, taking it home, working out how easy it is to break or pick, and telling others about this is generally considered fair use and fair comment.

In cases such as these it is instructive to compare actions in the virtual domain with similar actions in the physical domain, to see how the latter would be regarded both socially and in legal terms. This is also a useful acid test of computer related legislation. Based on these criteria the DMCA fails a test that the UK Computer Misuse Act passes.

The risks of disclosing web vulnerabilities

Posted May 4, 2006 18:32 UTC (Thu) by oak (guest, #2786) [Link]

So, for example, discussing on the public forums (of the corresponding system) whether anybody else had bumped into a "funny feature" of the system might be OK, as long as one doesn't try to use it oneself nor mention that it "might" be a security hole?

Could one even be outraged that the organization had "implemented" a feature for disclosing sensitive information?

The risks of disclosing web vulnerabilities

Posted May 12, 2006 12:34 UTC (Fri) by copsewood (subscriber, #199) [Link]

"So, for example, discussing on the public forums (of the corresponding system) whether anybody else had bumped into a "funny feature" of the system might be OK" -- OK in the UK and in common law. Might be illegal under some circumstances in the US.

"as long as one doesn't try to use it oneself" -- which I take to mean breaking and entering or trespassing in physical law, and a violation of the UK Computer Misuse Act. I think US state computer laws vary; I don't know whether this is covered by US federal law.

"nor mentions that it "might" be a security hole?" Legal AFAIK in the UK, illegal under the US DMCA which is in conflict with the US Constitution.

"Could one even be outraged that the organization had "implemented" a feature for disclosing sensitive information?" How you feel is your own business. What you say could breach the DMCA in the US but not the UK Computer Misuse Act as I understand it. In the US the DMCA discourages you from doing the responsible thing, which is telling the party with a known insecure system what's wrong so they can fix it.

The risks of disclosing web vulnerabilities

Posted May 5, 2006 3:07 UTC (Fri) by fozzy (guest, #7022) [Link]

Seems like USC is not a good place to study.

That's the message to get out: something that will affect their hip pocket. I'm not in the USA, so I'm unlikely to want to study at USC, but reading stories such as this, my first reaction is: what does this tell me about the culture of the institution?

The risks of disclosing web vulnerabilities

Posted May 5, 2006 16:27 UTC (Fri) by cdmiller (subscriber, #2813) [Link]

Well, if some organization is going to store data about me, I should have the right to test their system and make sure they are securing my personal info properly. Perhaps a lawsuit against USC by the accused is in order, for endangering his personal data.

The risks of disclosing web vulnerabilities

Posted May 6, 2006 12:37 UTC (Sat) by addw (guest, #1771) [Link]

Hear hear!

Copyright © 2006, Eklektix, Inc.
Comments and public postings are copyrighted by their creators.
Linux is a registered trademark of Linus Torvalds