Monthly Archives: December 2011

Bankers’ Christmas present

Every Christmas we give our friends in the banking industry a wee present. Sometimes it’s the responsible disclosure of a vulnerability, which we publish the following February: 2007’s was PED certification, 2008’s was CAP, while in 2009 we told the banking industry about the No-PIN attack. This year too we have some goodies in the hamper: watch our papers at Financial Crypto 2012.

In other years, we’ve had arguments with the bankers’ PR wallahs. In 2010, for example, their trade association tried to censor the thesis of one of our students. That saga also continues; Britain’s bankers tried once more to threaten us, so we told them once more to go away. We have other conversations in progress with bankers, most of them thankfully a bit more constructive.

This year’s Christmas present is different: it’s a tale with a happy ending. Eve Russell was a fraud victim whom Barclays initially blamed for her misfortune, as so often happens, and the Financial Ombudsman Service initially found for the bank as it routinely does. Yet this was clearly not right; after many lawyers’ letters, two hearings at the ombudsman, two articles in The Times and a TV appearance on Rip-off Britain, Eve won. This is the first complete case file since the ombudsman came under the Freedom of Information Act; by showing how the system works, it may be useful to fraud victims in the future.

(At Eve’s request, I removed the correspondence and case papers from my website on 5 Oct 2015. Eve was getting lots of calls and letters from other fraud victims and was finally getting weary. I have left just the article in the Times.)

Blood donation and privacy

The UK’s National Blood Service screens all donors for a variety of health and lifestyle risks prior to donation. Many of the questions are highly sensitive, particularly those about sexual history and drug use. So I found it disappointing that, after consulting with a nurse who took detailed notes about specific behaviours and when they occurred, I was expected to consent to this information being stored indefinitely. When I pressed as to why this data is retained, I was told it was necessary so that I can be contacted as soon as I’m eligible to donate again, and to prevent me from donating before then.

The first reason seems weak: contacting donors on an annual or semi-annual basis wouldn’t greatly decrease the level of donation, since most risk-factor restrictions last at least 12 months or are indefinite. The second reason is a security fantasy, as it would only detect donors who lie at a second visit after being honest initially. I doubt donor dishonesty is a major problem, and all blood is tested anyway. The purpose of lifestyle restrictions is to reduce the base rate of unsafe blood, because all tests have false negatives. Storing detailed donor history doesn’t even save much time: it has to be re-taken before each donation, since lifestyle risks can change.
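The base-rate point can be made concrete with a quick calculation. The figures below (infection prevalence, test false-negative rate, and the effect of screening) are hypothetical assumptions chosen for illustration, not NBS statistics:

```python
# Illustrative sketch: why lifestyle screening matters even though all
# donated blood is lab-tested. All figures are hypothetical, not NBS data.

def infected_units_slipping_through(prevalence, false_negative_rate,
                                    units=1_000_000):
    """Expected number of infected units that pass the lab test."""
    return units * prevalence * false_negative_rate

test_fnr = 0.001  # assume the lab test misses 1 infection in 1,000

# Without screening: assume 1 donor in 10,000 carries an infection.
unscreened = infected_units_slipping_through(1e-4, test_fnr)

# With screening: assume deferring high-risk donors cuts prevalence tenfold.
screened = infected_units_slipping_through(1e-5, test_fnr)

print(f"{unscreened:.3f}")  # roughly 0.100 infected units per million
print(f"{screened:.3f}")    # roughly 0.010: screening cuts residual risk 10x
```

Whatever the true numbers, the structure of the argument is the same: testing multiplies down the risk, but screening lowers the starting prevalence, and the two effects compound.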

I certainly don’t think the NBS is trying to stockpile data for nefarious reasons. I expect instead that the ever-lower technical cost of storing data speciously justifies its very minor secondary uses, so long as one ignores the risk of a massive compromise (the NBS sees about 2 M donors per year). I wonder whether the inherent hazard of data collection was considered in the NBS’ cost/benefit analysis when this privacy policy was adopted. Security engineers and privacy advocates would do well to advocate non-collection of sensitive data before fancier privacy-enhancing technology. The NBS provides a vital service, but it can’t do so without its donors, who are always in short supply. It would be a shame to discourage anybody from donating, or from being honest about their health history, by demanding to store their data forever.

Job ad: post-doctoral researcher in security, operating systems, computer architecture

We are pleased to announce a job opening at the University of Cambridge Computer Laboratory for a post-doctoral researcher working in the areas of security, operating systems, and computer architecture.

Research Associate
University of Cambridge – Faculty of Computer Science & Technology

Salary: £27,428 – £35,788 pa
The funds for this post are available for one year.

We are seeking a Post-doctoral Research Associate to join the CTSRD Project, which is investigating fundamental improvements to CPU architecture, operating system (OS), and programming language structure in support of computer security. The CTSRD Project is a collaboration between the University of Cambridge and SRI International, and part of the DARPA CRASH research programme on clean-slate computer system design.

This position will be an integral part of an international team of researchers spanning multiple institutions across academia and industry. The successful candidate will contribute to low-level aspects of system software: compilers, language run-times, and OS kernels. Responsibilities will include researching the application of novel dynamic techniques to C-language operating systems and applications, including adaptation of the FreeBSD kernel and LLVM compiler suite, and measurement of the resulting system.

An ideal candidate will hold (or be close to finishing) a PhD in Computer Science, Mathematics, or a similar field, with a strong background in low-level system software development, including at least one of: kernel development experience (FreeBSD preferred; Linux acceptable) or compiler internals experience (LLVM preferred; gcc acceptable). Strong experience with the C programming language is critical. Some background in computer security is also recommended.

Candidates must be able to provide evidence of relevant work demonstrated by a research publication track record or industrial experience. Good interpersonal and organisational skills and the ability to work in a team are also essential. This post is intended to be filled as soon as practically possible after the closing date.

Applications should include:

  • Curriculum Vitae
  • Brief statement of the particular contribution you would make to the project
  • A completed form CHRIS6

Completed applications should be sent by post to: Personnel-Admin, Computer Laboratory, William Gates Building, JJ Thomson Avenue, Cambridge, CB3 0FD, or by email to: personnel-admin@cl.cam.ac.uk

Quote Reference: NR10692
Closing Date: 10 January 2012

The University values diversity and is committed to equality of opportunity.

Privacy event on Wednesday

I will be talking in London on Wednesday at a workshop on Anonymity, Privacy, and Open Data about the difficulty of anonymising medical records properly. I’ll be on a panel with Kieron O’Hara who wrote a report on open data for the Cabinet Office earlier this year, and a spokesman from the ICO.

This will be the first public event on the technology and policy issues surrounding anonymisation since yesterday’s announcement that the government will give wide access to anonymous versions of our medical records. I’ve written extensively on the subject: for an overview, see my book chapter which explores the security of medical systems in general from p 282 and the particular problems of using “anonymous” records in research from p 298. For the full Monty, start here.

Anonymity is hard enough if the data controller is capable, and motivated to try hard. In the case of the NHS, anonymity has always been perfunctory; the default is to remove patient names and addresses but leave their postcodes and dates of birth. This makes it easy to re-identify about 99% of patients (the exceptions are mostly twins, soldiers, students and prisoners). And since I wrote that book chapter, the predicted problems have come to pass; for example the NHS lost a laptop containing over eight million patients’ records.
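The near-total re-identification rate follows from simple counting: a full UK postcode covers only a handful of households, so the pair (postcode, date of birth) is almost always unique. A back-of-the-envelope sketch, using rough population figures that are assumptions rather than official statistics:

```python
# Back-of-the-envelope estimate of how identifying (postcode, DOB) is.
# The population figures are rough assumptions for illustration only.

people_per_postcode = 40        # a full UK postcode covers ~15 households
days_of_birth = 100 * 365       # plausible dates of birth span ~100 years

# Probability that a given person shares both postcode and exact date of
# birth with at least one other resident (birthdays assumed uniform).
p_collision = 1 - (1 - 1 / days_of_birth) ** (people_per_postcode - 1)

print(f"{p_collision:.4f}")  # about 0.001: well over 99% are unique
```

Collisions cluster exactly where one would expect: twins share both fields by construction, and institutional populations (barracks, colleges, prisons) concentrate many people of similar age into one postcode.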

Here we go again

The Sunday media have been trailing a speech by David Cameron tomorrow about giving us online access to our medical records and our kids’ school records, and making anonymised versions of them widely available to researchers, companies and others. Here is coverage in the BBC, the Mail and the Telegraph; there’s also a Cabinet Office paper. The measures are supported by the CEO of Glaxo and opposed by many NGOs.

If the Government is going to “ensure all NHS patients can access their personal GP records online by the end of this Parliament”, they’ll have to compel the thousands of GPs who still keep patient records on their own machines to transfer them to centrally-hosted facilities. The systems are maintained by people who have to please the Secretary of State rather than GPs, and thus become progressively less useful. This won’t just waste doctors’ time but will have real consequences for patient safety and the quality of care.

We’ve seen this repeatedly over the lifetime of NPfIT and its predecessor the NHS IM&T strategy. Officials who can’t develop working systems become envious of systems created by doctors; they wrest control, and the deterioration starts.

It’s astounding that a Conservative prime minister could get the idea that nationalising something is the best way to make it work better. It’s also astonishing that a Government containing Liberals who believe in human rights, the rule of law and privacy should support the centralisation of medical records a mere two years after the Joseph Rowntree Reform Trust, a Liberal charity, produced the Database State report which explained how the centralisation of medical records (and for that matter children’s records) destroys privacy and contravenes human-rights law. The coming debate will no doubt be vigorous and will draw on many aspects of information security, from the dreadful security usability (and safety usability) of centrally-purchased NHS systems, through the real hazards of coerced access by vulnerable patients, to the fact that anonymisation doesn’t really work. There’s much more here.

Of course the new centralisation effort will probably fail, just like the last two; health informatics is a hard problem, and even Google gave up. But our privacy should not depend on the government being incompetent at wrongdoing. It should refrain from wrongdoing in the first place.

DNSChanger might change the BGPSEC landscape

In early November, a sophisticated fraud was shut down and a number of people arrested. Malware from a family called “DNSChanger” had been placed on around four million machines (Macs as well as Windows machines) over several years.

The compromised users had their DNS traffic redirected to criminally operated servers. The main aim of the criminals seems to have been to redirect search queries and thereby to make money from displaying adverts.

Part of the mitigation of DNSChanger involves ISC running DNS servers for a while (so that 4 million people whose DNS servers suddenly disappear don’t simultaneously ring their ISP helpdesks complaining that the Internet is broken).

To prevent bad people from running the DNS servers instead, the address blocks containing the IP addresses of the rogue DNS servers (which used to belong to the criminals but now point at ISC) have been “locked”.

This is easy for ARIN (the organisation that looks after North American address space) to acquiesce to, because they have US legal paperwork compelling their assistance. However, the Dutch police have generated some rather less compelling paperwork and served it on RIPE; so RIPE is now asking the Dutch court to clarify the position.

Further details of the issues with the legal paperwork can be found on (or linked from) the Internet Governance Project blog. The IGP is a group of mainly but not entirely US academics working on global Internet policy issues.

As the IGP rightly point out, this is going to be an important case because it is going to draw attention to the role of the RIRs — just at the time when that role is set to become even more important.

As we move to crypto-secured BGP routing, the RIRs (ARIN, RIPE etc) will be providing cryptographic assurance of the validity of address block ownership. This means, in effect, that we are building a system where the courts in one country (five countries in all, for the five RIRs) could remove ISPs and hosting providers from the Internet… and some ISPs [and their governments] (who are beginning to think ahead) are not entirely keen on this prospect.

If, as one might expect, the Dutch courts eventually uphold the DNSChanger compulsion on RIPE (even if the Dutch police have to have a second go at making the paperwork valid) then maybe this will prove the impetus to abandon a pyramid structure for BGP security and move to a “sea of certificates” model (where one independently chooses from several overlapping roots of authority) — which more closely approximates the reality of a global system which touches a myriad set of local jurisdictions.
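The difference between the two trust models can be sketched in a few lines: under a single pyramid, one authority’s word (and hence one jurisdiction’s court) is final; under a “sea of certificates”, a relying party accepts an origin so long as enough of its independently chosen anchors still vouch for it. A hypothetical illustration, in which the anchor names, data structures and quorum threshold are all invented for the sketch:

```python
# Hypothetical sketch of pyramid vs "sea of certificates" route validation.
# Anchor names, the set-of-tuples model and the quorum are invented here.

def pyramid_valid(prefix, origin_as, root):
    """Single-root model: one authority's view is final."""
    return (prefix, origin_as) in root

def sea_valid(prefix, origin_as, anchors, quorum=2):
    """Multi-anchor model: accept if enough independent anchors vouch."""
    votes = sum((prefix, origin_as) in anchor for anchor in anchors)
    return votes >= quorum

claim = ("192.0.2.0/24", 64500)   # documentation prefix, private ASN

ripe_view = {claim}               # one anchor still lists the block
court_view = set()                # another removed it under a court order
arin_view = {claim}

# Pyramid: a single court order suffices to knock the block off the net.
print(pyramid_valid(*claim, court_view))                       # False

# Sea: the relying party's other chosen anchors still vouch for it.
print(sea_valid(*claim, [ripe_view, court_view, arin_view]))   # True
```

The policy consequence is the point: in the quorum model, no single jurisdiction can unilaterally invalidate an address block, which is why it more closely matches a global routing system spanning many legal regimes.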