Tim: Hi Casey. Could you introduce yourself to us please?
Casey: Certainly. I am Casey Henderson, one of the two co-executive directors of the USENIX Association.
Tim: Now, today we are talking about an interesting incident – I don't know if censorship is exactly the right word for it – that recently took place, to do with the publication of some security information. Can you talk about what happened, exactly?
Casey: Certainly. Three authors submitted a paper to the 22nd USENIX Security Symposium, which is scheduled to take place in Washington, DC, later this month. Their work was accepted by our program committee, but they ultimately received an injunction from a UK court ordering them not to publish the paper. So they withdrew it from the Security Symposium.
Tim: Now, at what level was that injunction issued – rather, in what jurisdiction does it actually have effect?
Casey: As far as I know, it does not actually have an effect in the US, because it is a UK court, but I think that question is better answered by Corynne McSherry, who is joining us from the EFF.
Tim: Okay, very good. Corynne, could you go ahead and introduce yourself as well?
Corynne: Sure. My name is Corynne McSherry, and I am the intellectual property director at the Electronic Frontier Foundation.
Tim: Okay. Now, both USENIX and the EFF have a strong interest not only in promoting computer security but also in open publication. So this is, I would say, not the first time that someone has tried to curtail publication of computer security information – is this somehow different from previous cases?
Corynne: It is not the first time. We have been involved in a number of fights involving disclosure of security information, and over and over we see the same thing: the company runs to court and says, if you disclose these flaws, it is going to cause all kinds of problems. But really the better answer would have been, as could have happened in this case, to actually address the security flaws in the months before they were going to be disclosed. So one part of the story is: why didn't that happen? They had nine months, according to the court's order, to address this flaw, and they didn't do it. They were in fact slow to even meet with the researchers. So part of the story isn't just the legal part – here are researchers who were acting really responsibly, who tried to follow the rules, using publicly available information in their research, and nonetheless they are having their speech shut down, which is really unfortunate and dangerous, I think.
Tim: Now, do you think the timing was intentional – is that part of a strategy? Is it a good way to basically curtail the appeal process, because it is so close to when the paper would be published?
Corynne: Well, again, the facts I know are based on looking at the court order, which leaves out some of the facts that the court knows. It appears that what happened is that the researchers reached out to the folks who actually developed the software that car manufacturers rely on. Volkswagen is the lead complaining party in the actual case, and it appears that they didn't find out about it until very, very late in the game. So I am not sure that there were any shenanigans on Volkswagen's part, any attempt to do something tricky here, but nonetheless that's how it happened.
Corynne: The court should have taken that into account, which it didn't.
Tim: Okay, can you talk about the flaws here – how serious are we talking? Is it something that, if it were public information and someone were to misuse it, we'd have chaos on the roadways? Is there a universal key that this reveals, or is it more of an edge case, a very small window of opportunity?
Corynne: I am not a technologist, I am just a humble lawyer, so I can only go with what's in the record, but it appears that one of the things the researchers pointed out in the case was that there are a lot easier ways to rip off a car than by using this security flaw. So no, it was not going to cause chaos on the roadways.
Tim: Casey, I'd like to ask you: why is USENIX so concerned about this? Is it for the same reasons as the EFF, which is famous for fighting online censorship?
Casey: Well, USENIX is committed to open access to research; that is a fundamental part of our mission. We are also committed to providing an unbiased forum for the dissemination of leading-edge research, so USENIX is right in the middle of this. For all intents and purposes, given that this paper was very closely reviewed by a program committee consisting of experts in the field, this paper has the right to stand and be published. So USENIX firmly supports the authors appearing at the conference despite the injunction, although the authors are planning to abide by the terms of the injunction and not reveal in person any of the information that the court deemed too sensitive.
Tim: So they are willing to, in a sense, elide the content but still take part?
Casey: That’s correct.
Tim: Okay, given that these are international researchers – that is, the team isn't all in one place – does the injunction apply equally to everyone?
Casey: Well, it is actually a pretty complicated situation, because two of the researchers are located in the Netherlands, and the third, Flavio Garcia, is located in the United Kingdom, where the court case took place. And USENIX and the conference are located in the United States. So there is a variety of tricky international issues at play here, and part of the reason we talked to Corynne about this is that we were trying to untangle all of the legal business – though even then, Corynne is a lawyer in the United States.
Tim: And even so, regardless of what they must do, the researchers may have their own reasons to want to abide by it – they have to live with whatever they do. And if it applies to Flavio, as one of the three authors, the others may also have him in mind.
Casey: Absolutely. Currently we have only one of the authors scheduled to appear at the conference: Roel Verdult, one of the authors based in the Netherlands. Flavio is currently not scheduled to appear. We will see if he is able to make it, but that is certainly a more serious decision for him, given that he lives in the United Kingdom, where the injunction was issued.
Tim: Have similar incidents marred previous USENIX events?
Casey: I wouldn't say marred – they've certainly enlivened previous USENIX events. Both USENIX and, as she mentioned, Corynne at the EFF have been involved in cases like this in the past, most notably the Felten case, which I believe was in the early 2000s. That predates my time at USENIX – I arrived just after it. But USENIX has a long history of standing firm against censorship of its authors, and has tried to partner closely with the EFF on issues such as this.
Tim: Is there any advice for researchers who have controversial research? There's almost always someone who would like to stop them from publishing.
Casey: Well, that’s certainly something that we are interested in engaging with our community members about.
Tim: There we go, alright. For authors going forward – security researchers are already in a field where someone is going to want to stop them from revealing certain facts that let someone break a system, break into a system, or defeat cryptography – what advice should authors follow in order to get their research out there?
Casey: Well, they definitely have to abide by their own conscience and decide how far they are willing to go for their research. USENIX as an organization is definitely committed to academic freedom, but there is a balance there when it comes to the law. I would say that some of the laws certainly don't fit the direction that computer science research is going in, and some of the laws having to do with computing even contradict other kinds of laws, which I think we saw most recently in the Bradley Manning case. In that ruling there were parallels – I would love it if Corynne would comment on them – where one law would apply in a certain situation, and the outcome is completely different when it comes to computing. So it seems like people can easily get tripped up when it comes to research, just trying to navigate technology-related legal issues.
Tim: Corynne, I think that is a good question. How would those compare?
Corynne: Well, I am not sure the comparison I would go to directly is the Bradley Manning case, although I do think we saw some very tricky business going on there, where the court was trying to suggest that there was something nefarious going on because Bradley Manning was using a computer, as opposed to using low-level technology. But I do think it is right that it is very hard to do a lot of computer research without getting tripped up by laws that really don't make a whole lot of sense.
That's why at the EFF we have something called the Coders' Rights Project, which is specifically dedicated to helping researchers, professionals and amateurs alike, expose security flaws and publish their research without getting themselves into trouble. Research like that benefits everybody overall: we need to know about security flaws so we can fix them, and you can follow responsible disclosure practices and so on. There is no problem with that.
But the problem is that we also have laws in the United States that unfortunately can be used to shut down people's research and prevent people from talking about the flaws they are discovering. In the United States, you have something called the Digital Millennium Copyright Act, which – as a shorthand – restricts people's ability to break encryption if they need to do that, even if the reason they are doing it is research.
Now, there are exceptions in the law for research, but they are very limited and they don't cover a lot of people. What's worse, the law also makes it illegal to distribute tools – which often just means sharing information about what you found out – if it again involves breaking a technological protection measure.
So it seems to me there is a real problem when you have laws that are designed to do things like protect trade secrets, or protect copyright – which is what the Digital Millennium Copyright Act was supposed to do – and instead they are being misused to shut down perfectly legitimate academic research. And I really think that's what happened here. Now, I don't practice UK law.
So I have had to glean what I can from looking at the court order, but it seems to me that, in essence, what the court said is that these guys accessed a program that had been available online since 2009, used it to derive an algorithm that was in that program, and then made a series of other inferences from that – which was perfectly fine to do, I guess.
This program has been available online for years. But the court said, well, you should have known that that program, called Tango Programmer, was probably based on stolen information. Now, the court just speculates that it is based on stolen information. No one has actually shown that it was, and no one has shown that the researchers knew it was, right?
So we've got a whole chain of inferences happening here, and based on that chain, the researchers get censored. Again, it is just speculation: they just should have known, somehow, that the program was illegitimate, and therefore they should have known not to use it.
Tim: That seems like a very high burden to put on people using software they didn’t write.
Corynne: Well, I think it sends a really terrible message to researchers: apparently you have to investigate the provenance of all the software you rely on in your research, which is ridiculous. I mean, if the vendors of the original technology were so worried, so concerned about the security of the technology, why weren't they going after the makers of Tango Programmer years ago, right? Presumably, if researchers could find it, the company could have found them too. And if they haven't, it seems to me completely legitimate for the researchers to say, okay, well, it is online – apparently this is okay.
Tim: Well, Casey, what if these researchers had come to USENIX under a false name, or as an anonymous submission in general? The fact that you submit as known people makes you traceable – it makes you a target of such laws, doesn't it?
Casey: It does, but at the same time, the USENIX Security Symposium, along with a number of our conferences, actually has double-blind submissions, so our program committee wasn't even aware of who the submitters were when the work went into our conference submission system and was reviewed. The review was done entirely anonymously, preserving the anonymity of the authors. Ultimately, of course, that anonymity is lifted in the conference program, and it is pretty difficult to avoid disclosing who the authors are at that point. But yeah, I am not even sure where to go from there, because we would be talking about someone standing in a corner presenting the paper.
Casey: Okay. Well, I've lost the direction your question was going in.
Tim: It is this: going one step further, there is obviously a lot of security research that is anonymous – everything from 2600 to some very serious papers; we don't really know who started Bitcoin, for that matter. So with research like this, what if the authors had simply released it anonymously on the internet? How would that change the situation?
Casey: Well, it would change it in a variety of ways. They were submitting it for formal academic publication, which is why it would not ultimately remain anonymous. These are folks who would like to get credit for their research, and that was part of their argument in trying to push back in the court proceedings: the injunction actually came down on the date of the Security Symposium's final paper submission deadline. These authors were trying very hard not only to get their work published but to have their names on it, and to have it out there at an extremely reputable and prestigious academic publishing venue, which is our symposium.
So yes, they could have just released it on the internet, and then I think the legal issues would have become much more entangled – I think Corynne can comment on that further. But the fact remains that it was the authors' attempt to publish and present their work at a respected venue that they are being prevented from carrying out; the injunction was specifically about this publication.
Corynne: There's something ironic about that, if you ask me. If they'd just gone ahead and published it online, it would be out there, right? But they tried to follow traditional academic procedures, and that created the opportunity for them to actually be censored, which is sad.
Tim: So the next time someone has a security finding – and that is practically all of them – that someone else wants to suppress, because it makes their system look bad or reveals holes, what should they do? If you are the security researcher, what is the best way to go about this?
Casey: I think, from USENIX's side, we are particularly interested in engaging our entire community – we have a wide spectrum of opinions on this matter, as you might imagine. USENIX supports a variety of different computer science researchers, and our security community in particular is extremely sensitive to this issue; many of them would have preferred that USENIX, for example, just release the paper submission even though it had been withdrawn from the conference. So that is one end of the spectrum.
The other end, though, is very much devoted to following standard academic protocols in terms of disclosing research to vendors in a timely manner, as these researchers certainly did. So it is a balance between those kinds of opinions. To that end, USENIX is hosting a discussion about this at our Hot Topics in Security workshop, which takes place the day before the Security Symposium, where we are going to try to answer that very question and hear from the community. We actually have a colleague of Corynne's from the EFF appearing there as well – Kurt. What is his title again, Corynne?
Corynne: He is a senior staff attorney.
Casey: Yes. And he is well versed in these issues. The co-leader of that discussion is Dan Wallach of Rice University, who also serves on the USENIX Board of Directors and is no stranger to these issues himself, having been the target of a very similar case.