
Debate at Cambridge Festival of Ideas: Internet Freedom

On the evening of Thursday 27 October, I will be participating in a debate at the Cambridge Festival of Ideas on Internet Freedom. Other speakers include Jim Killock, executive director of the Open Rights Group; Herbert Snorrason, founder of Openleaks.org; and David Clemente of Chatham House. Further details can be found on the festival website.

Attendance is free, but booking is required.

DCMS illustrates the key issue about blocking

This morning the Department for Culture, Media and Sport (DCMS) have published a series of documents relating to the implementation of the Digital Economy Act 2010.

One of those documents, from OFCOM, describes how “Site Blocking” might be used to prevent access to websites that are involved in copyright infringement (ie: torrent sites, Newzbin, “cyberlockers” etc.).

The report appears, at a quick glance, to cover the ground pretty well, describing the various options available to ISPs to block access to websites (and sometimes to block access altogether — since much infringement is not “web” based).

The report also explains how each of the systems can be circumvented (and how easily) and makes it clear (in big bold type): “All techniques can be circumvented to some degree by users and site owners who are willing to make the additional effort.”

I entirely agree — and seem to recall a story from my childhood about the Emperor’s New Blocking System — and note that continuing to pursue this chimera will just mean that time and money are pointlessly wasted.

However, OFCOM duly trot out the standard line one hears so often from the rights holders: “Site blocking is likely to deter casual and unintentional infringers and by requiring some degree of active circumvention raise the threshold even for determined infringers.”

The problem for the believers in blocking is that this just isn’t true — pretty much all access to copyright infringing material involves the use of tools (to access the torrents, to process NZB files, or just to browse [one tends not to look at web pages in Notepad any more]). Although these tools need to be created by competent people, they are intended for mass use (point and click) and so copyright infringement by the masses will always be easy. They will not even know that the hurdles were there, because the tools will jump over them.

Fortuitously, the DCMS have provided an illustration of this in their publishing of the OFCOM report…

The start of the report says “The Department for Culture, Media and Sport has redacted some parts of this document where it refers to techniques that could be used to circumvent website blocks. There is a low risk of this information being useful to people wanting to bypass or undermine the Internet Watch Foundation’s blocks on child sexual abuse images. The text in these sections has been blocked out.”

What the DCMS have done (following in the footsteps of many other incompetents) is to black out the text they consider to be sensitive. Removing this blacking out is simple but tedious … you can get out a copy of Acrobat and change the text colour to white — or you can just cut and paste the black bits into Notepad and see the text.
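To show quite how low the bar is, here is a minimal sketch (assuming, as here, that the PDF merely draws black rectangles over the text; pypdf and the local filename are my choices for illustration, not anything the DCMS used):

    # Minimal sketch: text "redacted" by drawing black rectangles over it is
    # still present in the PDF content stream, so plain text extraction
    # recovers it. Assumes pypdf is installed; the filename is made up.
    from pypdf import PdfReader

    reader = PdfReader("ofcom-site-blocking.pdf")
    for number, page in enumerate(reader.pages, start=1):
        print(f"--- page {number} ---")
        print(page.extract_text())  # "blacked out" passages appear in the clear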

So I confidently expect that within a few hours, non-redacted (non-blocked!) versions of the PDF will be circulating (they may even become more popular than the original — everyone loves to see things that someone thought they should not). The people who look at these non-blocked versions will not be technically competent, they won’t know how to use Acrobat, but they will see the material.

So the DCMS have kindly made the point in the simplest of ways… the argument that small hurdles make any difference is just wishful thinking; sadly for Internet consumers in many countries (who will end up paying for complex blocking systems that make no practical difference) these wishes will cost them money.

PS: the DCMS do actually understand that blocking doesn’t work, or at least not at the moment. Their main document says “Following advice from Ofcom – which we are publishing today – we will not bring forward site blocking regulations under the DEA at this time.” Sadly, however, this recognition of reality is too late for the High Court.

Will Newzbin be blocked?

This morning the UK High Court granted a group of movie companies an injunction intended to force BT to block their Internet customers’ access to “Newzbin2”. The “Newzbin2” site provides an easy way to search for and download metadata files that can be used to automate the downloading of feature films (TV shows, albums, etc.) from Usenet servers, ie it’s all about trying to prevent people from obtaining content without paying for a legitimate copy (so-called “piracy”).

The judgment is long and spends a lot of time (naturally) on legal matters, but there is some technical discussion — which is correct so far as it goes (though describing redirection of traffic based on port number inspection as “DPI” seems to me to stretch the jargon).

But what does the injunction require of BT? According to the judgment BT must apply “IP address blocking in respect of each and every IP address [of newzbin.com]” and “DPI based blocking utilising at least summary analysis in respect of each and every URL available at the said website and its domains and sub domains”. BT is then told that the injunction is “complied with if the Respondent uses the system known as Cleanfeed”.

There is almost nothing about the design of Cleanfeed in the judgment, but I wrote a detailed account of how it works in a 2005 paper (a slightly extended version of which appears as Chapter 7 of my 2005 PhD thesis). Essentially it is a 2-stage system, the routing system redirects port 80 (HTTP) traffic for relevant IP addresses to a proxy machine — and that proxy prevents access to particular URLs.
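As a toy illustration of that two-stage design (my reading of it; the suspect IP list and URL blocklist contents are made up for the sketch):

    # Toy sketch of a Cleanfeed-style two-stage filter. The suspect IP list
    # and URL blocklist are illustrative, not BT's actual data.
    SUSPECT_IPS = {"85.112.165.75"}              # stage 1: coarse routing test
    BLOCKED_URLS = {"http://www.newzbin.com/"}   # stage 2: exact URL matching

    def stage1_route(dst_ip: str, dst_port: int) -> str:
        """Routing layer: only port-80 (HTTP) traffic to suspect IPs is diverted."""
        if dst_port == 80 and dst_ip in SUSPECT_IPS:
            return "proxy"    # hand the connection to the filtering web proxy
        return "direct"       # everything else flows through untouched

    def stage2_proxy(url: str) -> bool:
        """Proxy layer: block only the precise URLs on the blocklist."""
        return url in BLOCKED_URLS

    assert stage2_proxy("http://www.newzbin.com/")         # listed URL: blocked
    assert stage1_route("85.112.165.75", 443) == "direct"  # HTTPS never diverted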

So if BT just use Cleanfeed (as the injunction indicates) they will resolve newzbin.com (and www.newzbin.com), which are currently both on 85.112.165.75, and they will then filter access to http://www.newzbin.com/, http://newzbin.com and http://85.112.165.75. It will be interesting to experiment to determine how good their pattern matching is on the proxy (Cleanfeed is currently only used for child sexual abuse image websites, so experiments pose a significant risk of lawbreaking).

It will also be interesting to see whether BT actually use Cleanfeed or whether they just ‘blackhole’ all access to 85.112.165.75. The quickest way to determine this (once the block is rolled out) will be to see whether https://newzbin.com still works. If it does, then BT will have obeyed the injunction, but the block will be trivial to evade (add an “s” to the URL). If it does not, then BT will not be using Cleanfeed to do the blocking!
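A sketch of that test: a Cleanfeed-style system diverts only port 80 to the proxy, so a TCP connection to port 443 still succeeds, whereas a blackholed IP fails on both ports:

    # Sketch of the test described above. A Cleanfeed-style block diverts
    # only port 80 to the proxy, so port 443 stays reachable; a blackholed
    # IP address fails on both ports.
    import socket

    def port_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print("https reachable:", port_reachable("newzbin.com", 443))
    print("http  reachable:", port_reachable("newzbin.com", 80))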

BT users will still of course be able to access Newzbin (though perhaps not by using https), but depending on the exact mechanisms which BT roll out it may be a little less convenient. The simplest method (but not the cheapest) will be to purchase a VPN service — which will tunnel traffic via a remote site (and access from there won’t be blocked). Doubtless some enterprising vendors will be looking to bundle a VPN with a Newzbin subscription and an account on a Usenet server.

The use of VPNs seems to have been discussed in court, along with other evasion techniques (such as using web and SOCKS proxies), but the judgment says “It is common ground that, if the order were to be implemented by BT, it would be possible for BT subscribers to circumvent the blocking required by the order. Indeed, the evidence shows the operators of Newzbin2 have already made plans to assist users to circumvent such blocking. There are at least two, and possibly more, technical measures which users could adopt to achieve this. It is common ground that it is neither necessary nor appropriate for me to describe those measures in this judgment, and accordingly I shall not do so.”

There’s also a whole heap of things that Newzbin could do to disrupt the filtering or just to make their site too mobile to be effectively blocked. I describe some of the possibilities in my 2005 academic work, and there are doubtless many more. Too many people consider the Internet to be a static system which looks the same from everywhere to everyone — that’s just not the case, so blocking systems that take this as a given (“web sites have a single IP address that everyone uses”) will be ineffective.
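The “single IP address” assumption is easy to check empirically (a quick sketch; example.com is just a stand-in for any multi-homed site):

    # Quick illustration that "one site = one IP address" is false: many
    # hostnames resolve to several addresses, and the answers can differ by
    # resolver and over time ("example.com" is just a stand-in).
    import socket

    addresses = {info[4][0] for info in socket.getaddrinfo("example.com", 80)}
    print(sorted(addresses))   # often more than one A/AAAA record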

But this is all moot so far as the High Court is concerned. The bottom line within the judgment is that they don’t actually care if the blocking works or not! At paragraph #198 the judge writes “I agree with counsel for the Studios that the order would be justified even if it only prevented access to Newzbin2 by a minority of users”. Since this case was about preventing economic damage to the movie studios, I doubt that they will be so sanguine if it is widely understood how to evade the block — but the exact details of that will have to wait until BT have complied with their new obligations.

The PET Award: Nominations wanted for prestigious privacy award

The PET Award is presented annually to researchers who have made an outstanding contribution to the theory, design, implementation, or deployment of privacy enhancing technology. It is awarded at the annual Privacy Enhancing Technologies Symposium (PETS).

The PET Award carries a prize of 3000 USD thanks to the generous support of Microsoft. The crystal prize itself is offered by the Office of the Information and Privacy Commissioner of Ontario, Canada.

Any paper by any author written in the area of privacy enhancing technologies is eligible for nomination. However, the paper must have appeared in a refereed journal, conference, or workshop with proceedings published in the period from August 8, 2009 until April 15, 2011.

The complete award rules including eligibility requirements can be found under the award rules section of the PET Symposium website.

Anyone can nominate a paper by sending an email message containing the following to award-chair11@petsymposium.org.

  • Paper title
  • Author(s)
  • Author(s) contact information
  • Publication venue and full reference
  • Link to an available online version of the paper
  • A nomination statement of no more than 500 words.

All nominations must be submitted by April 15th, 2011. The Award Committee will select one or two winners among the nominations received. Winners must be present at the PET Symposium in order to receive the Award. This requirement can be waived only at the discretion of the PET Advisory board.

More information about the PET award (including past winners) is available at http://petsymposium.org/award/

More information about the 2011 PET Symposium is available at http://petsymposium.org/2011.

A Merry Christmas to all Bankers

The bankers’ trade association has written to Cambridge University asking for the MPhil thesis of one of our research students, Omar Choudary, to be taken offline. They complain it contains too much detail of our No-PIN attack on Chip-and-PIN and thus “breaches the boundary of responsible disclosure”; they also complain about Omar’s post on the subject to this blog.

Needless to say, we’re not very impressed by this, and I made this clear in my response to the bankers. (I am embarrassed to see I accidentally left Mike Bond off the list of authors of the No-PIN vulnerability. Sorry, Mike!) There is one piece of Christmas cheer, though: the No-PIN attack no longer works against Barclays’ cards at a Barclays merchant. So at least they’ve started to fix the bug – even if it’s taken them a year. We’ll check and report on other banks later.

The bankers also fret that “future research, which may potentially be more damaging, may also be published in this level of detail”. Indeed. Omar is one of my coauthors on a new Chip-and-PIN paper that’s been accepted for Financial Cryptography 2011. So here is our Christmas present to the bankers: it means you all have to come to this conference to hear what we have to say!

Wikileaks, security research and policy

A number of media organisations have been asking us about Wikileaks. Fifteen years ago we kicked off the study of censorship-resistant systems, which inspired the peer-to-peer movement; we help maintain Tor, which provides the anonymous communications infrastructure for Wikileaks; and we’ve a longstanding interest in information policy.

I have written before about governments’ love of building large databases of sensitive data to which hundreds of thousands of people need access to do their jobs – such as the NHS spine, which will give over 800,000 people access to our health records. The media are now making the link. Whether sensitive data are about health or about diplomacy, the only way forward is compartmentation. Medical records should be kept in the surgery or hospital where the care is given; and while an intelligence analyst dealing with Iraq might have access to cables on Iraq, Iran and Saudi Arabia, he should have no routine access to stuff on Korea or Brazil.
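In engineering terms the rule is access per compartment, never blanket access. A toy sketch (the compartment names are invented):

    # Toy sketch of compartmentation: an analyst is granted access to named
    # compartments, never to the whole database. All names are invented.
    ANALYST_COMPARTMENTS = {"iraq-desk": {"IQ", "IR", "SA"}}

    def may_read(analyst: str, cable_country: str) -> bool:
        return cable_country in ANALYST_COMPARTMENTS.get(analyst, set())

    assert may_read("iraq-desk", "IQ")        # Iraq cables: routine access
    assert not may_read("iraq-desk", "KR")    # Korea: no routine access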

So much for the security engineering; now to policy. No-one questions the US government’s right to try one of its soldiers for leaking the cables, or the right of the press to publish them now that they’re leaked. But why is Wikileaks treated as the leaker, rather than as a publisher?

This leads me to two related questions. First, does a next-generation censorship-resistant system need a more resilient technical platform, or more respectable institutions? And second, if technological change causes respectable old-media organisations such as the Guardian and the New York Times to go bust and be replaced by blogs, what happens to freedom of the press, and indeed to freedom of speech?

Resumption of the crypto wars?

The Telegraph and Guardian reported yesterday that the government plans to install deep packet inspection kit at ISPs, a move considered and then apparently rejected by the previous government (our Database State report last year found their Interception Modernisation Programme to be almost certainly illegal). An article in the New York Times on comparable FBI/NSA proposals makes you wonder whether policy is being coordinated between Britain and America.

In each case, the police and spooks argue that they used to have easy access to traffic data — records of who called whom and when — so now that people communicate using Facebook, Gmail and Second Life rather than by phone, they should be allowed to harvest data about who wrote on your wall, what emails appeared on your Gmail inbox page, and who stood next to you in Second Life. This data will be collected on everybody and will be available to investigators who want to map suspects’ social networks. A lot of people opposed this, including the Lib Dems, who promised to “end the storage of internet and email records without good reason” and wrote this into the Coalition Agreement. The Coalition seems set to reinterpret this now that the media are distracted by the spending review.

We were round this track before with the debate over key escrow in the 1990s. Back then, colleagues and I wrote of the risks and costs of insisting that communications services be wiretap-ready. One lesson from the period was that the agencies clung to their old business model rather than embracing all the new opportunities; they tried to remain Bletchley Park in the age of Google. Yet GCHQ people I’ve heard recently are still stuck in the pre-computer age, having learned nothing and forgotten nothing. As for the police, they can’t really cope with the forensics for the PCs, phones and other devices that fall into their hands anyway. This doesn’t bode well, either for civil liberties or for national security.

Digital Activism Decoded: The New Mechanics of Change

The book “Digital Activism Decoded: The New Mechanics of Change” is one of the first on the topic of digital activism. It discusses how digital technologies as diverse as the Internet, USB thumb-drives, and mobile phones are changing the nature of contemporary activism.

Each of the chapters offers a different perspective on the field. For example, Brannon Cullum investigates the use of mobile phones (e.g. SMS, voice and photo messaging) in activism, a technology often overlooked but increasingly important in countries with low ratios of personal computer ownership and poor Internet connectivity. Dave Karpf considers how to measure the success of digital activism campaigns, given the huge variety of (potentially misleading) metrics available, such as page impressions and numbers of followers on Twitter. The editor, Mary Joyce, then ties each of these threads together, identifying the common factors between the disparate techniques for digital activism, and discussing future directions.

My chapter “Destructive Activism: The Double-Edged Sword of Digital Tactics” shows how the positive activism techniques promoted throughout the rest of the book can also be used for harm. Just as digital tools can facilitate communication and create information, they can also be used to block and destroy. I give some examples of where this has occurred, and of how the technology to carry out these actions came to be created and deployed. Of course, activism is by its very nature controversial, and so is the question of where to draw the line between positive and negative actions. So my chapter concludes with a discussion of the ethical frameworks used when considering the merits of activism tactics.

Digital Activism Decoded, published by iDebate Press, is now available for download, and can be pre-ordered from Amazon UK or Amazon US (available June 30th).

Update (2010-06-17): Amazon now have the book in stock at both their UK and US stores.


Ineffective self-blocking by the National Enquirer

It used to be simple to explain how browsing works. You type an address into the browser, the browser asks a DNS server at your ISP to translate the human-friendly hostname into the IP address of the web server, and then the browser contacts that server with an HTTP request for the page you want to view.
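That simple story, as a sketch (www.example.com stands in for any site):

    # The "simple" two-step story above: resolve the hostname, then send an
    # HTTP request to the resulting address (www.example.com is a stand-in).
    import socket

    host = "www.example.com"
    ip = socket.gethostbyname(host)                  # step 1: DNS lookup
    with socket.create_connection((ip, 80)) as s:    # step 2: ask for the page
        request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
        s.sendall(request.encode())
        print(s.recv(200).decode(errors="replace"))  # first bytes of the reply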

It’s not quite that simple any more — which is rather bad news for the National Enquirer, the US tabloid which decided, three years or so ago, following a brush with the UK libel laws, that it would not publish a UK edition, or allow visits to its website from the UK. Unfortunately, the Enquirer’s blocking is no longer working as effectively as it used to.


Panorama looks at unlawful filesharing

Last night’s Panorama looked at the issue of unlawful filesharing and the proposals within the Digital Economy Bill that the UK Government thinks will deal with it.

The Open Rights Group has criticised the programme for spending too long examining the differences of opinion among music makers, and too little time talking about rights — perhaps that’s an inevitable side effect of fronting the programme with Jo Whiley, a Radio One DJ. This probably increased the audience amongst the under-30s, who do a great deal of the file sharing and for whom this may be the first time that they’ve had the bill’s proposals explained to them. So lose some, win some!

The programme had a number of stunts: they slowed down the broadband of a student household (not only was their MP3 going to take 13 weeks to download, they found they couldn’t effectively look at their email). They got a digital forensics expert to look at a family’s computers, finding copies of LimeWire (tricky stuff forensics!) and portraying this as a smoking gun for unlawfulness. The same expert camped outside the student house and piggybacked on their WiFi (apparently by using the default password on their broadband router to authorise himself to have access).

You can also see yours truly demonstrating an anonymity network (it was in fact Tor, but I’d done a little tweaking to ensure that its standard discouragement of file sharing activity didn’t have any impact), and showing that a BitTorrent tracker stopped recording me as being in Cambridge, but placed me at the Tor exit node in Germany instead.

I argued that once large numbers of people were getting into trouble for file sharing because they were traceable, they wouldn’t stop file sharing — they would stop being traceable.

All in all, within the limitations of a 30-minute prime-time main-channel show, I think the Panorama team provided a good introduction to a complex topic. You can judge for yourself (from within the UK) for the next 7 days on the BBC iPlayer, or in three parts on YouTube (I’m two minutes into part 3, at least until a web blocking injunction bars your access to what might well be an infringement of copyright).