Posts filed under 'Internet censorship'

Dec 31, '13

We had a crypto festival in London in November at which a number of cryptographers and crypto policy folks got together with over 1000 mostly young attendees to talk about what might be done in response to the Snowden revelations.

Here is a video of the session in which I spoke. The first speaker was Annie Machon (at 02.35) talking of her experience of life on the run from MI5, and on what we might do to protect journalists’ sources in the future. I’m at 23.55 talking about what’s changed for governments, corporates, researchers and others. Nick Pickles of Big Brother Watch follows at 45.45 talking on what can be done in terms of practical politics; it turned out that only two of us in the auditorium had met our MPs over the Comms Data Bill. The final speaker, Smari McCarthy, comes on at 56.45, calling for lots more encryption. The audience discussion starts at 1:12:00.

Jun 19, '13

The Internet is and has always been a space where participants battle for control. The two core protocols that define the Internet – TCP and IP – are both designed to let separate networks connect to each other, so that networks that differ not only in hardware implementation (wired vs. satellite vs. radio networks) but also in their politics of control (consumer vs. research vs. military networks) can interoperate easily. It is a feature of the Internet, not a bug, that China – with its extensive, explicit censorship infrastructure – can interact with the rest of the Internet.

Today we have released an open-access collection (also published as a special issue of IEEE Internet Computing) of five peer-reviewed papers on the topic of Internet censorship and control, edited by Hal Roberts and myself (Steven Murdoch). The topics of the papers include a broad look at information controls, censorship of microblogs in China, new modes of online censorship, the balance of power in Internet governance, and control in the certificate authority model.

These papers make it clear that there is no global consensus on what mechanisms of control are best suited for managing conflicts on the Internet, just as there is none for other fields of human endeavour. That said, there is optimism that, with vigilance and continuing efforts to maintain transparency, the Internet can remain a force for increasing freedom rather than a tool for more efficient repression.

Apr 8, '13

The 3rd USENIX Workshop on Free and Open Communications on the Internet (FOCI ‘13) seeks to bring together researchers and practitioners from technology, law, and policy who are working on means to study, detect, or circumvent practices that inhibit free and open communications on the Internet. We invite two distinct tracks for papers: a technical track for technically-focused position papers or works-in-progress; and a social science track for papers focused on policy, law, regulation, economics or related fields of study.

FOCI will favor interesting and new ideas and early results that lead to well-founded position papers. We envision that work presented at FOCI will ultimately be published at relevant, high-quality conferences. Papers will be selected primarily based on originality, with additional consideration given to their potential to generate discussion at the workshop. Papers in the technical track will also be evaluated based on technical merit. As with other USENIX events, papers accepted for FOCI ‘13 will be made freely available on the USENIX website.

For further details, see the call for papers (PDF version). The submission deadline is 6 May 2013.

Jan 12, '13

Yesterday, banking security vendor Thales sent this DMCA takedown request to John Young who runs the excellent Cryptome archive. Thales want him to remove an equipment manual that has been online since 2003 and which was valuable raw material in research we did on API security.

Banks use hardware security modules (HSMs) to manage the cryptographic keys and PINs used to authenticate bank card transactions. These used to be thought to be secure. But their application programming interfaces (APIs) had become unmanageably complex, and in the early 2000s Mike Bond, Jolyon Clulow and I found that by sending sequences of commands to the machine that its designers hadn’t anticipated, it was often possible to break the device spectacularly. This became a thriving field of security research.
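The flavour of these attacks is worth illustrating. One of the best known is the decimalisation-table attack: the PIN-verification command lets the caller supply the table that maps hex digits to decimal PIN digits, and a malicious table turns the yes/no verification result into an oracle for the PIN's digits. Below is a toy Python simulation of the idea; the key, account number and interface are invented for illustration and bear no resemblance to any real HSM command set.

```python
# Toy simulation of a decimalisation-table attack on an IBM-3624-style
# PIN verification API.  The key, PAN and interface are invented for
# illustration; real HSM command sets look quite different.
import hashlib

PIN_KEY = b"demo-pin-derivation-key"      # stands in for the secret key inside the HSM
STANDARD_DECTAB = "0123456789012345"      # maps hex digits 0..F to decimal PIN digits

def _natural_pin(pan: str, dectab: str) -> str:
    # Stand-in for "encrypt the PAN under the PIN key": a keyed hash is
    # enough for the simulation.  Take four hex digits and decimalise them.
    hexdigits = hashlib.sha256(PIN_KEY + pan.encode()).hexdigest()[:4]
    return "".join(dectab[int(h, 16)] for h in hexdigits)

def hsm_verify(pan: str, trial_pin: str, dectab: str) -> bool:
    """The only interface the attacker gets: a yes/no PIN check which,
    crucially, lets the caller supply the decimalisation table."""
    return _natural_pin(pan, dectab) == trial_pin

def digits_in_pin(pan: str) -> set:
    """Learn WHICH digits the PIN contains in about 10 oracle calls,
    instead of brute-forcing all 10,000 four-digit PINs."""
    present = set()
    for d in "0123456789":
        # A table mapping the hex values that normally give digit d to '1',
        # and everything else to '0'.
        probe = "".join("1" if c == d else "0" for c in STANDARD_DECTAB)
        if not hsm_verify(pan, "0000", probe):   # mismatch => d occurs in the PIN
            present.add(d)
    return present

if __name__ == "__main__":
    pan = "4929000000006"
    print("real PIN (unknown to the attacker):", _natural_pin(pan, STANDARD_DECTAB))
    print("digits recovered through the API:  ", sorted(digits_in_pin(pan)))
```

Ten calls reveal which digits the PIN contains, after which only a handful of candidates remain to be tried; the published attack goes on to recover the positions of the digits as well.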

But while API security has been a goldmine for security researchers, it’s been an embarrassment for the industry, in which Thales is one of two dominant players. Hence the attempt to close down our mine. As you’d expect, the smaller firms in the industry, such as Utimaco, would prefer HSM APIs to be open (indeed, Utimaco sent two senior people to a Dagstuhl workshop on APIs that we held a couple of months ago). Even more ironically, Thales’s HSM business used to be the Cambridge startup nCipher, which helped our research by giving us samples of their competitors’ products to break.

If this case ever comes to court, the judge might perhaps consider the Lexmark case. Lexmark sued Static Control Components (SCC) for DMCA infringement in order to curtail competition. The court found this abusive and threw out the case. I am not a lawyer, and John Young must clearly take advice. However this particular case of internet censorship serves no public interest (as with previous attempts by the banking industry to censor security research).

Jul 27, '12

Consumer Focus have recently published my expert report on the issues that arise when attempting to track down people who are using peer to peer (P2P) systems to share copyright material without appropriate permissions. They have submitted this report to Ofcom who have been consulting on how they should regulate this sort of tracking down when the Digital Economy Act 2010 (DEA) mechanisms that are intended to prevent unlawful file sharing finally start to be implemented, probably sometime in 2014.

The basic idea behind the DEA provisions is that the rights holders (or more usually specialist companies) will join the P2P systems and download files that are being shared unlawfully. Because the current generation of P2P systems fails to provide any real anonymity, the rights holders will learn the IP addresses of the wrongdoers. They will then consult public records at RIPE (and the other Regional Internet Registries) to learn which ISPs were allocated the IP addresses. Those ISPs will then be approached and will be obliged, by the DEA, to consult their records and tell the appropriate account holder that someone using their Internet connection has been misbehaving. There are further provisions for telling the rights holders about repeat offenders, and perhaps even for “technical measures” to disrupt file sharing traffic.
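The registry-lookup step is mundane: WHOIS (RFC 3912) is just a line of text sent over TCP port 43. A minimal sketch, assuming the address in question falls in RIPE-administered space (the example IP below is a documentation placeholder, not a real case):

```python
# Minimal sketch of the registry-lookup step: a raw WHOIS query (RFC 3912)
# to the RIPE database, which returns the inetnum, netname and contact
# details for the block containing the address.  Real deployments would
# need to pick the right regional registry and parse the reply carefully.
import socket

def whois(query, server="whois.ripe.net", port=43):
    with socket.create_connection((server, port), timeout=10) as s:
        s.sendall(query.encode("ascii") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

if __name__ == "__main__":
    # 192.0.2.1 is a documentation address, used here as a placeholder for
    # an IP observed sharing a file on a P2P network.
    print(whois("192.0.2.1"))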

From a technical point of view, the traceability part of the DEA process can (in principle) be made to work in a robust manner. However, there’s a lot of detail to get right in practice, both in recording the data generated by the P2P activity and within the ISPs’ systems — and history shows that mistakes are often made. I have some first-hand experience of this: my report refers to how I helped the police track down a series of traceability mistakes that were made in a 2006 murder case! Hence I spend many pages in my report explaining what can go wrong, and I set out in considerable detail the sort of procedures that I believe Ofcom should insist upon to ensure that mistakes are rare and are rapidly detected.

My report also explains the difficulties (in many cases the insuperable difficulties) that the account holder will have in determining the individual who was responsible for the P2P activity. Consumer Focus takes the view that “this makes the proposed appeals process flawed and potentially unfair and we ask Government to rethink this process”. Sadly, there’s been no sign so far that this sort of criticism will derail the DEA juggernaut, although some commentators are starting to wonder if the rights holders will see the process as passing a cost/benefit test.

Jul 25, '12

I am co-editing a special edition of IEEE Internet Computing on Internet Censorship and Control. We are looking for short (up to 5,000 words) articles on the technical, social, and political mechanisms and impacts of Internet censorship and control. We’re soliciting both technical and social science articles, and especially encourage those that combine the two. Appropriate topics include:

  • explorations of how the Internet’s technical, social, and political structures impact its censorship and control;
  • evaluations of how existing technologies and policies affect Internet censorship and control;
  • proposals for new technologies and policies;
  • discussions on how proposed technical, legal, or governance changes to the Internet will impact censorship and control;
  • analysis of techniques, methodologies, and results of monitoring Internet censorship and control; and
  • examinations of trade-offs between control and freedom, and how these sides can be balanced.

Please email the guest editors a brief description of the article you plan to submit by 15 August 2012. For further details, see the full CFP. Please distribute this CFP, and use this printable flyer if you wish.

Mar 24, '12

Nominations are invited for the 2012 PET Award by 31 March 2012.

The PET Award is presented annually to researchers who have made an outstanding contribution to the theory, design, implementation, or deployment of privacy enhancing technology. It is awarded at the annual Privacy Enhancing Technologies Symposium (PETS).

The PET Award carries a prize of 3000 USD thanks to the generous support of Microsoft. The crystal prize itself is offered by the Office of the Information and Privacy Commissioner of Ontario, Canada.

Any paper by any author written in the area of privacy enhancing technologies is eligible for nomination. However, the paper must have appeared in a refereed journal, conference, or workshop with proceedings published in the period from 1 June 2010 until 31 March 2012.

For eligibility requirements, refer to the award rules.

Anyone can nominate a paper by sending an email message containing the following to award-chairs12@petsymposium.org:

  • Paper title
  • Author(s)
  • Author(s) contact information
  • Publication venue and full reference
  • Link to an available online version of the paper
  • A nomination statement of no more than 500 words.

All nominations must be submitted by 31 March 2012. The Award Committee will select one or two winners among the nominations received. Winners must be present at the 2012 PET Symposium in order to receive the Award. This requirement can be waived only at the discretion of the PET Advisory board.

More information about the PET Award (including past winners) can be found on the award website.

Dec 25, '11

Every Christmas we give our friends in the banking industry a wee present. Sometimes it’s the responsible disclosure of a vulnerability, which we publish the following February: 2007’s was PED certification, 2008’s was CAP, while in 2009 we told the banking industry of the No-PIN attack. This year too we have some goodies in the hamper: watch our papers at Financial Crypto 2012.

In other years, we’ve had arguments with the bankers’ PR wallahs. In 2010, for example, their trade association tried to censor the thesis of one of our students. That saga also continues; Britain’s bankers tried once more to threaten us, so we told them once more to go away. We have other conversations in progress with bankers, most of them thankfully a bit more constructive.

This year’s Christmas present is different: it’s a tale with a happy ending. Eve Russell was a fraud victim whom Barclays initially blamed for her misfortune, as so often happens, and the Financial Ombudsman Service initially found for the bank as it routinely does. Yet this was clearly not right; after many lawyers’ letters, two hearings at the ombudsman, two articles in The Times and a TV appearance on Rip-off Britain, Eve won. This is the first complete case file since the ombudsman came under the Freedom of Information Act; by showing how the system works, it may be useful to fraud victims in the future.

For your Christmas entertainment, we offer the bank statement which told Eve of the fraud; the initial exchange of letters between Eve’s lawyers and the bank; the ombudsman’s routine initial ruling against Eve, and her protest; the correspondence between the ombudsman and Barclays; Eve’s appeal and expert opinion; the verdict; and the offer of settlement. And let’s not forget the Thunder. A Merry Christmas to all!

Dec 1, '11

In early November, a sophisticated fraud was shut down and a number of people arrested. Malware from a family called “DNSChanger” had been placed on around four million machines (Macs as well as Windows machines) over several years.

The compromised users had their DNS traffic redirected to criminally operated servers. The main aim of the criminals seems to have been to redirect search queries and thereby to make money from displaying adverts.

Part of the mitigation of DNSChanger involves ISC running DNS servers for a while (so that 4 million people whose DNS servers suddenly disappear don’t simultaneously ring their ISP helpdesks complaining that the Internet is broken).
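Telling whether a particular machine is one of the four million comes down to checking whether its configured resolvers fall inside the address blocks the rogue servers occupied. A rough sketch of that check follows; the ranges in it are placeholders, since the authoritative list was published in the advisories of the time:

```python
# Rough sketch of the clean-up check: is this machine's configured resolver
# inside one of the address blocks the rogue DNSChanger servers occupied?
# The ranges below are placeholders; the authoritative list was published
# in the FBI/DCWG advisories at the time.
import ipaddress

ROGUE_RANGES = [
    ipaddress.ip_network("198.51.100.0/24"),   # placeholder, not a real rogue block
    ipaddress.ip_network("203.0.113.0/24"),    # placeholder, not a real rogue block
]

def resolver_is_rogue(resolver_ip):
    addr = ipaddress.ip_address(resolver_ip)
    return any(addr in net for net in ROGUE_RANGES)

if __name__ == "__main__":
    # On a Unix machine the configured resolvers live in /etc/resolv.conf.
    with open("/etc/resolv.conf") as f:
        resolvers = [line.split()[1] for line in f
                     if line.startswith("nameserver") and len(line.split()) > 1]
    for ip in resolvers:
        print(ip, "ROGUE" if resolver_is_rogue(ip) else "not in the listed ranges")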

To prevent bad people running the DNS servers instead, the address blocks containing the IPs of the rogue DNS servers, which used to belong to the criminals (but are now pointed at ISC), have been “locked”.

This is easy for ARIN (the organisation who looks after North American address space) to acquiesce to, because they have US legal paperwork compelling their assistance. However, the Dutch police have generated some rather less compelling paperwork and served that on RIPE; so RIPE is now asking the Dutch court to clarify the position.

Further details of the issues with the legal paperwork can be found on (or linked from) the Internet Governance Project blog. The IGP is a group of mainly but not entirely US academics working on global Internet policy issues.

As the IGP rightly point out, this is going to be an important case because it is going to draw attention to the role of the RIRs — just at the time when that role is set to become even more important.

As we move to crypto-secured BGP routing, the RIRs (ARIN, RIPE etc) will be providing cryptographic assurance of the validity of address block ownership. Which means, in effect, that we are building a system where the courts in one country (five countries in all, for five RIRs) could remove ISPs and hosting providers from the Internet… and some ISPs [and their governments] (who are beginning to think ahead) are not entirely keen on this prospect.
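To make concrete what “cryptographic assurance of address block ownership” means in practice, here is a much-simplified sketch of RPKI-style route origin validation. Real validation walks certificate chains rooted at the RIRs; here the ROAs are plain records and all the data is invented:

```python
# Much-simplified sketch of RPKI-style route origin validation: the check
# that a BGP announcement's origin AS matches a ROA issued for the prefix.
# Real validation walks X.509 certificate chains rooted at the RIRs; here
# the ROAs are plain records and all the data is invented.
import ipaddress
from dataclasses import dataclass

@dataclass
class ROA:
    prefix: ipaddress.IPv4Network
    max_length: int
    origin_as: int

def validate(announced_prefix, origin_as, roas):
    net = ipaddress.ip_network(announced_prefix)
    covering = [r for r in roas if net.subnet_of(r.prefix)]
    if not covering:
        return "unknown"          # no ROA covers this prefix at all
    for r in covering:
        if r.origin_as == origin_as and net.prefixlen <= r.max_length:
            return "valid"
    return "invalid"              # covered, but wrong origin AS or too-specific prefix

if __name__ == "__main__":
    roas = [ROA(ipaddress.ip_network("203.0.113.0/24"), 24, 64500)]   # made-up ROA
    print(validate("203.0.113.0/24", 64500, roas))    # valid
    print(validate("203.0.113.0/24", 64511, roas))    # invalid: wrong origin AS
    print(validate("203.0.113.0/25", 64500, roas))    # invalid: more specific than maxLength
    print(validate("198.51.100.0/24", 64500, roas))   # unknown: no covering ROA
```

The point of the policy worry above is precisely this check: whoever controls the records that the validation trusts controls, in effect, whose announcements the rest of the Internet will believe.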

If, as one might expect, the Dutch courts eventually uphold the DNSChanger compulsion on RIPE (even if the Dutch police have to have a second go at making the paperwork valid) then maybe this will prove the impetus to abandon a pyramid structure for BGP security and move to a “sea of certificates” model (where one independently chooses from several overlapping roots of authority) — which more closely approximates the reality of a global system which touches a myriad set of local jurisdictions.

Oct 30, '11

Back in July I wrote a blog article “Will Newzbin be blocked?” which discussed the granting of an injunction to a group of movie companies to force BT to block access to “Newzbin2”.

The parties were back in court this last week to hammer out the exact details of the injunction.

The final wording of the injunction requires BT to block customer access to Newzbin2 by #1(1) rerouting traffic to relevant IPs and #1(2) applying “DPI based” URL blocking. The movie companies have to tell BT which IPs and which URLs are relevant.

#2 of the injunction says that BT can use its existing “Cleanfeed” system (which I wrote about here and at greater length in my PhD thesis here) to meet the requirements of #1, even though Cleanfeed isn’t believed to use DPI at all! (There’s a sketch of Cleanfeed’s two-stage design below.)

#3 and #4 of the injunction allow the parties to agree to suspend blocking and to come back to court in the future, and #5 relates to the costs of the court action.
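For readers unfamiliar with it: as described in the thesis linked above, Cleanfeed is a two-stage (“hybrid”) system in which border routers divert traffic destined for a shortlist of suspect IP addresses to a web proxy, and the proxy then blocks only those requests whose URL is on the blocklist, passing everything else through. A toy sketch of that logic (all addresses and URLs are invented placeholders):

```python
# Toy sketch of a two-stage "hybrid" blocking system in the spirit of the
# Cleanfeed description linked above.  Stage 1 diverts only traffic for
# suspect IPs to a proxy; stage 2 blocks only exact URL matches.  All
# addresses and URLs here are invented placeholders.
SUSPECT_IPS = {"203.0.113.10"}                          # IPs nominated by the studios
BLOCKED_URLS = {"http://www.example.com/blocked-page"}  # exact URLs to be blocked

def stage1_route(dst_ip):
    """Border routers: divert traffic for suspect IPs to the proxy;
    everything else takes the normal path and is never inspected."""
    return "proxy" if dst_ip in SUSPECT_IPS else "normal"

def stage2_filter(url):
    """The proxy: block only if the full requested URL is on the list."""
    return url in BLOCKED_URLS

def handle_request(dst_ip, url):
    if stage1_route(dst_ip) == "normal":
        return "forwarded without inspection"
    return "blocked" if stage2_filter(url) else "forwarded via proxy"

if __name__ == "__main__":
    print(handle_request("198.51.100.1", "http://unrelated.example.org/"))
    print(handle_request("203.0.113.10", "http://www.example.com/blocked-page"))
    print(handle_request("203.0.113.10", "http://www.example.com/other-page"))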

One of the (few) upsides of this injunction will be to permit lawful experimentation as to the effectiveness of the Cleanfeed system, assuming that it is used — if the studios ask for all URLs on a website to be blocked, I expect that null routing the website entirely will be simpler for BT than redirecting traffic to the Cleanfeed proxy.

Up until now, discovering a flaw in the technical implementation of Cleanfeed would result in successful access to a child sexual abuse image website. Anyone monitoring the remote end of the connection might then draw the conclusion that images had been viewed and a criminal offence committed. Although careful experimental design could avoid law-breaking, it might be some time into the investigation process before this was properly understood by the criminal justice system, and the intervening period would be somewhat stressful for the investigator.

There is no law that prevents viewing of the contents of Newzbin2, and so the block circumvention techniques proposed over the past few years (starting of course with just using “https”) can now start to be evaluated as to their actual effectiveness.

However, there is more to #1 of the injunction, in that it applies to:

[...] www.newzbin.com, its domains and sub-domains and including payments.newzbin.com and any other IP address or URL whose sole or predominant purpose is to enable or facilitate access to the Newzbin2 website.

I don’t expect that publishing circumvention experience here on LBT could be seen as the predominant purpose of this blog… so I don’t really expect these pages to suddenly become invisible to BT customers. But, since the whole process has an Alice in Wonderland feel to it (someone who believes that blocking websites is possible clearly had little else to do before breakfast), it cannot be entirely ruled out.

