I’m at the 23rd Security Protocols Workshop, whose theme this year is information security in fiction and in fact. Engineering is often inspired by fiction, and vice versa; what might we learn from this?
I will try to liveblog the talks in followups to this post.
An increasing number of countries implement Internet censorship at different levels and for a variety of reasons. Consequently, there is an ongoing arms race where censorship resistance schemes (CRS) seek to enable unfettered user access to Internet resources while censors come up with new ways to restrict access. In particular, the link between the censored client and entry point to the CRS has been a censorship flash point, and consequently the focus of circumvention tools. To foster interoperability and speed up development, Tor introduced Pluggable Transports — a framework to flexibly implement schemes that transform traffic flows between Tor client and the bridge such that a censor fails to block them. Dozens of tools and proposals for pluggable transports have emerged over the last few years, each addressing specific censorship scenarios. As a result, the area has become too complex to discern a big picture.
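The core contract of a pluggable transport is simple: a reversible transform applied to the byte stream between the Tor client and the bridge, so that the traffic on the wire no longer looks like Tor. The following is a toy sketch of that encode/decode contract only; the `ToyTransport` class and its XOR pad are illustrative inventions, and real transports such as obfs4 use proper cryptography and handshake obfuscation rather than anything this weak.

```python
import os

class ToyTransport:
    """Toy illustration of a pluggable transport's encode/decode
    contract: whatever the client-side transform emits, the bridge-side
    transform must invert. (An XOR pad is NOT a secure obfuscation.)"""

    def __init__(self, key: bytes):
        self.key = key

    def _xor(self, data: bytes) -> bytes:
        # Repeating-key XOR: its own inverse, so encode == decode.
        return bytes(b ^ self.key[i % len(self.key)] for i, b in enumerate(data))

    def encode(self, plaintext: bytes) -> bytes:   # run on the client side
        return self._xor(plaintext)

    def decode(self, wire: bytes) -> bytes:        # run on the bridge side
        return self._xor(wire)

transport = ToyTransport(key=os.urandom(16))
cell = b"Tor cell bytes"
assert transport.decode(transport.encode(cell)) == cell
```

The point of the Pluggable Transports framework is that anything satisfying this round-trip property, however it disguises the traffic, can be slotted in between client and bridge without changing Tor itself.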
Our recent report takes away some of this complexity by presenting a model of censor capabilities and an evaluation stack that offers a layered approach to evaluating pluggable transports. We survey 34 existing pluggable transports and highlight how their inflexible designs prevent features from being shared to provide broader defense coverage. This evaluation has led to a new design for Pluggable Transports – the Tweakable Transport: a tool for efficiently building and evaluating a wide range of Pluggable Transports so as to increase the difficulty and cost of reliably censoring the communication channel.
On the 5th of December I gave a talk at a journalists’ conference on what tradecraft means in the post-Snowden world. How can a journalist, or for that matter an MP or an academic, protect a whistleblower from being identified even when MI5 and GCHQ start trying to figure out who in Whitehall you’ve been talking to? The video of my talk is now online here. There is also a TV interview I did later, which can be found here, while the other conference talks are here.
The audio recording is available and some notes from the session are below.
The session started with brief presentations from each of the panel members. Ross spoke on the economics of surveillance and in particular network effects, the topic of his paper at WEIS 2014.
Bashar discussed the difficulties of requirements engineering, as eloquently described by Billy Connolly. These challenges are particularly acute when it comes to designing for privacy requirements, especially for wearable devices with their limited ability to communicate with users.
The European Court of Justice decision in the Google case will have implications way beyond search engines. Regular readers of this blog will recall stories of banks hounding innocent people for money following payment disputes, and a favourite trick is to blacklist people with credit reference agencies, even while disputes are still in progress (or even after the bank has actually lost a court case). In the past, the Information Commissioner refused to do anything about this abuse, claiming that it’s the bank which is the data controller, not the credit agency. The court now confirms that this view was quite wrong. I have therefore written to the Information Commissioner inviting him to acknowledge this and to withdraw the guidance issued to the credit reference agencies by his predecessor.
I wonder what other information intermediaries will now have to revise their business models?
We had a crypto festival in London in November at which a number of cryptographers and crypto policy folks got together with over 1000 mostly young attendees to talk about what might be done in response to the Snowden revelations.
Here is a video of the session in which I spoke. The first speaker was Annie Machon (at 02.35) talking of her experience of life on the run from MI5, and on what we might do to protect journalists’ sources in the future. I’m at 23.55 talking about what’s changed for governments, corporates, researchers and others. Nick Pickles of Big Brother Watch follows at 45.45 talking on what can be done in terms of practical politics; it turned out that only two of us in the auditorium had met our MPs over the Comms Data Bill. The final speaker, Smari McCarthy, comes on at 56.45, calling for lots more encryption. The audience discussion starts at 1:12:00.
The Internet is and has always been a space where participants battle for control. The two core protocols that define the Internet – TCP and IP – are both designed to allow separate networks to connect to each other easily, so that networks that differ not only in hardware implementation (wired vs. satellite vs. radio networks) but also in their politics of control (consumer vs. research vs. military networks) can interoperate easily. It is a feature of the Internet, not a bug, that China – with its extensive, explicit censorship infrastructure – can interact with the rest of the Internet.
Today we have released an open-access collection (also published as a special issue of IEEE Internet Computing), of five peer reviewed papers on the topic of Internet censorship and control, edited by Hal Roberts and myself (Steven Murdoch). The topics of the papers include a broad look at information controls, censorship of microblogs in China, new modes of online censorship, the balance of power in Internet governance, and control in the certificate authority model.
These papers make it clear that there is no global consensus on what mechanisms of control are best suited for managing conflicts on the Internet, just as there is none for other fields of human endeavour. That said, there is optimism that, with vigilance and continuing efforts to maintain transparency, the Internet can remain a force for increasing freedom rather than a tool for more efficient repression.
The 3rd USENIX Workshop on Free and Open Communications on the Internet (FOCI ’13) seeks to bring together researchers and practitioners from technology, law, and policy who are working on means to study, detect, or circumvent practices that inhibit free and open communications on the Internet. We invite two distinct tracks for papers: a technical track for technically-focused position papers or works-in-progress; and a social science track for papers focused on policy, law, regulation, economics or related fields of study.
FOCI will favor interesting and new ideas and early results that lead to well-founded position papers. We envision that work presented at FOCI will ultimately be published at relevant, high-quality conferences. Papers will be selected primarily based on originality, with additional consideration given to their potential to generate discussion at the workshop. Papers in the technical track will also be evaluated based on technical merit. As with other USENIX events, papers accepted for FOCI ’13 will be made freely available on the USENIX website.
Yesterday, banking security vendor Thales sent this DMCA takedown request to John Young who runs the excellent Cryptome archive. Thales want him to remove an equipment manual that has been online since 2003 and which was valuable raw material in research we did on API security.
Banks use hardware security modules (HSMs) to manage the cryptographic keys and PINs used to authenticate bank card transactions. These used to be thought to be secure. But their application programming interfaces (APIs) had become unmanageably complex, and in the early 2000s Mike Bond, Jolyon Clulow and I found that by sending sequences of commands to the machine that its designers hadn’t anticipated, it was often possible to break the device spectacularly. This became a thriving field of security research.
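The flavour of these API attacks is that each command is safe in isolation, but an unanticipated sequence of commands breaks the device. The sketch below is a deliberately simplified toy model, not any real HSM's API: the two-command interface, the XOR "encryption", and the key-type labels are all invented for illustration. Real attacks of this family (such as those on key-typing schemes in banking HSMs) work analogously but against real cryptography.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

class ToyHSM:
    """Toy model: keys live OUTSIDE the HSM as blobs, 'encrypted' under
    a per-type variant of the internal master key. (XOR stands in for
    real encryption purely to keep the sketch short.)"""

    def __init__(self):
        self.master = os.urandom(16)

    def _variant(self, ktype: str) -> bytes:
        # Bind the key's type into its wrapping key.
        return xor(self.master, ktype.encode().ljust(16, b"\0"))

    def wrap(self, key: bytes, ktype: str):
        return (xor(key, self._variant(ktype)), ktype)

    # Command 1: load a clear key (intended only for initial setup).
    def import_clear_key(self, clear_key: bytes, ktype: str):
        return self.wrap(clear_key, ktype)

    # Command 2: export a key wrapped under an EXPORT-type key.
    def export_key(self, target_blob, wrap_blob):
        wrap_ct, wrap_type = wrap_blob
        assert wrap_type == "EXPORT"          # the only policy check
        wrap_key = xor(wrap_ct, self._variant("EXPORT"))
        target_ct, target_type = target_blob
        target_key = xor(target_ct, self._variant(target_type))
        return xor(target_key, wrap_key)      # target, wrapped for transport

hsm = ToyHSM()
pin_key = os.urandom(16)
pin_blob = hsm.wrap(pin_key, "PIN")           # the key we must never learn

# The attack: import a key WE chose as an EXPORT key, have the HSM
# export the PIN key under it, then strip our key off outside the box.
known = os.urandom(16)
my_blob = hsm.import_clear_key(known, "EXPORT")
leaked = xor(hsm.export_key(pin_blob, my_blob), known)
assert leaked == pin_key
```

Neither command violates its own specification; the designers simply never anticipated an attacker chaining an initial-setup command with an export command. That gap between per-command and whole-API security is what made the field so fruitful.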
But while API security has been a goldmine for security researchers, it’s been an embarrassment for the industry, in which Thales is one of two dominant players. Hence the attempt to close down our mine. As you’d expect, the smaller firms in the industry, such as Utimaco, would prefer HSM APIs to be open (indeed, Utimaco sent two senior people to a Dagstuhl workshop on APIs that we held a couple of months ago). Even more ironically, Thales’s HSM business used to be the Cambridge startup nCipher, which helped our research by giving us samples of their competitors’ products to break.
If this case ever comes to court, the judge might perhaps consider the Lexmark case. Lexmark sued Static Control Components (SCC) for DMCA infringement in order to curtail competition. The court found this abusive and threw out the case. I am not a lawyer, and John Young must clearly take advice. However this particular case of internet censorship serves no public interest (as with previous attempts by the banking industry to censor security research).
Consumer Focus have recently published my expert report on the issues that arise when attempting to track down people who are using peer to peer (P2P) systems to share copyright material without appropriate permissions. They have submitted this report to Ofcom who have been consulting on how they should regulate this sort of tracking down when the Digital Economy Act 2010 (DEA) mechanisms that are intended to prevent unlawful file sharing finally start to be implemented, probably sometime in 2014.
The basic idea behind the DEA provisions is that the rights holders (or more usually specialist companies) will join the P2P systems and download files that are being shared unlawfully. Because the current generation of P2P systems fails to provide any real anonymity, the rights holders will learn the IP addresses of the wrongdoers. They will then consult public records at RIPE (and the other Regional Internet Registries) to learn which ISPs were allocated the IP addresses. Those ISPs will then be approached and will be obliged, by the DEA, to consult their records and tell the appropriate account holder that someone using their Internet connection has been misbehaving. There are further provisions for telling the rights holders about repeat offenders, and perhaps even for “technical measures” to disrupt file sharing traffic.
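The registry-lookup step in the chain above amounts to querying a whois database and reading off which ISP holds the address block. A minimal sketch of parsing a RIPE-style response follows; the response text is a made-up example using documentation addresses, and a real lookup would query `whois.ripe.net` on port 43 rather than use a canned string.

```python
# Hypothetical RIPE-style whois response for illustration only
# (198.51.100.0/24 is a reserved documentation range).
SAMPLE_RESPONSE = """\
inetnum:        198.51.100.0 - 198.51.100.255
netname:        EXAMPLE-ISP-NET
descr:          Example ISP Ltd
country:        GB
"""

def parse_whois(text: str) -> dict:
    """Collect 'attribute: value' lines into a dict (first value wins)."""
    result = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            result.setdefault(key.strip(), value.strip())
    return result

record = parse_whois(SAMPLE_RESPONSE)
assert record["netname"] == "EXAMPLE-ISP-NET"
```

It is exactly at hand-offs like this (address observed on the P2P network, registry record, ISP subscriber database) that the transcription and timing mistakes discussed below can creep in.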
From a technical point of view, the traceability part of the DEA process can (in principle) be made to work in a robust manner. However, there’s a lot of detail to get right in practice, both in recording the data generated by the P2P activity and within the ISPs’ systems — and history shows that mistakes are often made. I have some first-hand experience of this: my report describes how I helped the police track down a series of traceability mistakes that were made in a 2006 murder case! Hence I spend many pages in my report explaining what can go wrong, and I set out in considerable detail the sort of procedures that I believe Ofcom should insist upon to ensure that mistakes are rare and are rapidly detected.
My report also explains the difficulties (in many cases the insuperable difficulties) that the account holder will have in determining the individual who was responsible for the P2P activity. Consumer Focus takes the view that “this makes the proposed appeals process flawed and potentially unfair and we ask Government to rethink this process”. Sadly, there’s been no sign so far that this sort of criticism will derail the DEA juggernaut, although some commentators are starting to wonder if the rights holders will see the process as passing a cost/benefit test.