Category Archives: Legal issues

Security-related legislation, government initiatives, court cases

Privacy with technology: where do we go from here?

As part of the Royal Society Summer Science Exhibition 2014, I spoke at the panel session “Privacy with technology: where do we go from here?”, along with Ross Anderson and Bashar Nuseibeh, with Jon Crowcroft as chair.

The audio recording is available and some notes from the session are below.

The session started with brief presentations from each of the panel members. Ross spoke on the economics of surveillance and in particular network effects, the topic of his paper at WEIS 2014.

Bashar discussed the difficulties of requirements engineering, as eloquently described by Billy Connolly. These challenges are particularly acute when it comes to designing for privacy requirements, especially for wearable devices with their limited ability to communicate with users.

I described issues around surveillance on the Internet, whether by governments targeting human rights workers or advertisers targeting pregnant customers. I discussed how anonymous communication tools, such as Tor, can help defend against such surveillance.


EMV: Why Payment Systems Fail

In the latest edition of Communications of the ACM, Ross Anderson and I have an article in the Inside Risks column: “EMV: Why Payment Systems Fail” (DOI 10.1145/2602321).

Now that US banks are deploying credit and debit cards with chips supporting the EMV protocol, our article explores what lessons the US should learn from the UK experience of having chip cards since 2006. We address questions like whether EMV would have prevented the Target data breach (it wouldn’t have), whether Chip and PIN is safer for customers than Chip and Signature (it isn’t), whether EMV cards can be cloned (in some cases, they can) and whether EMV will protect against online fraud (it won’t).

While the EMV specification is the same across the world, the way each country uses it varies substantially. Even individual banks within a country may make different implementation choices which have an impact on security. The US will prove to be an especially interesting case study because some banks will be choosing Chip and PIN (as the UK has done) while others will choose Chip and Signature (as Singapore did). The US will therefore act as a natural experiment, addressing the question of whether Chip and PIN or Chip and Signature is better, and from whose perspective.

The US is also distinctive in that the major tussle over payment card security concerns the “interchange” fees paid by merchants to the banks which issue the cards. Interchange fees are about an order of magnitude higher than losses due to fraud, so while security is one consideration in choosing different sets of EMV features, the question of who pays how much in fees is a more important factor (even if the decision is later claimed to be justified by security). We’re already seeing the results of this fight in the courts and through legislation.

EMV is coming to the US, so it is important that banks, customers, merchants and regulators know the likely consequences and how to manage the risks, learning from the lessons of the UK and elsewhere. Discussion of these and further issues can be found in our article.

Don’t shoot the demonstrators

Jim Graves, Alessandro Acquisti and I are giving a paper today at WEIS on Experimental Measurement of Attitudes Regarding Cybercrime, which we hope might nudge courts towards more rational sentencing for cybercrime.

At present, sentencing can seem somewhere between random and vindictive. People who commit a fraud online can get off with a tenth of what they’d get if they’d swindled the same amount of money face-to-face; yet people who indulge in political activism – as the Anonymous crowd did – can get hammered with much harsher sentences than they’d get for a comparable protest on the street.

Is this just the behaviour of courts and prosecutors, or does it reflect public attitudes?

We did a number of surveys of US residents and found convincing evidence that it’s the former. Americans want fraudsters to be punished on two criteria: for the value of the damage they do, with steadily tougher punishments for more damage, and for their motivation, where they want people who hack for profit to be punished more harshly than people who hack for political protest.

So Americans, thankfully, are rational. Let’s hope that legislators and prosecutors start listening to their voters.

Small earthquake, not many dead (yet)

The European Court of Justice decision in the Google case will have implications way beyond search engines. Regular readers of this blog will recall stories of banks hounding innocent people for money following payment disputes, and a favourite trick is to blacklist people with credit reference agencies, even while disputes are still in progress (or even after the bank has actually lost a court case). In the past, the Information Commissioner refused to do anything about this abuse, claiming that it’s the bank which is the data controller, not the credit agency. The court now confirms that this view was quite wrong. I have therefore written to the Information Commissioner inviting him to acknowledge this and to withdraw the guidance issued to the credit reference agencies by his predecessor.

I wonder what other information intermediaries will now have to revise their business models?

Latest health privacy scandal

Today I gave a talk at the Open Data Institute on a catastrophic failure of anonymity in medical research. Here’s the audio and video, and here are the slides.

Three weeks ago we made a formal complaint to the ICO about the Department of Health supplying a large amount of data to PA Consulting, who uploaded it to the Google cloud in defiance of NHS regulations on sending data abroad. This follows several other scandals over NHS chiefs claiming that hospital episode statistics data are anonymous and selling them to third parties, when they are nothing of the kind.

Yesterday the Department of Health disclosed its Register of Approved Data Releases, which shows that many organisations in both the public and private sectors have been supplied with HES data over the past year. It’s amazing how many of them are marked “non sensitive”: even number 408, where Imperial College got data with the HESID (which includes postcode or NHS number), date of birth, home address, and GP practice. How officials can maintain that such data do not identify individuals is beyond me.
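To see why a record carrying postcode and date of birth is effectively identifying even without a name, here is a rough back-of-envelope sketch. The population figures are illustrative assumptions of mine, not numbers from the register or the article:

```python
# Back-of-envelope estimate of how identifying "non sensitive" records
# containing a full postcode and date of birth really are.
# Assumptions (illustrative, not from the data release register):
#   - a full UK postcode covers roughly 40 residents
#   - birth dates are spread over roughly 90 years
RESIDENTS_PER_POSTCODE = 40
POSSIBLE_BIRTH_DATES = 365 * 90

# Probability that no other resident of the same postcode shares a given
# person's date of birth, i.e. that (postcode, DOB) alone already pins
# them down uniquely.
p_unique = (1 - 1 / POSSIBLE_BIRTH_DATES) ** (RESIDENTS_PER_POSTCODE - 1)
print(f"P(unique on postcode + DOB alone): {p_unique:.4f}")  # about 0.999

# Adding home address and GP practice, as in release 408, removes even
# that sliver of ambiguity for anyone with the electoral roll to hand.
```

Under these assumptions the (postcode, date of birth) pair is unique for well over 99% of people, which is why calling such a release “non sensitive” is hard to defend.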

Health privacy: complaint to ICO

Three NGOs have lodged a formal complaint to the Information Commissioner about the fact that PA Consulting uploaded over a decade of UK hospital records to a US-based cloud service. This appears to have involved serious breaches of the UK Data Protection Act 1998 and of multiple NHS regulations about the security of personal health information. This has already caused a row in Parliament, and the Department of Health seems to be trying to wriggle off the hook by pretending that the data were pseudonymised. Other EU countries have banned such uploads. Regular LBT readers will know that the Department of Health has got itself in a complete mess over medical record privacy.

NHS opt out: not what it seems

On January 23rd we had a conference call with the NHS Information Centre and a couple of its software suppliers about anonymisation. LBT readers will have followed how your GP records are to be uploaded to the new central database care.data for resale unless you opt out. Any previous opt-outs from other central systems like SCR will be disregarded (even if you wrote saying you opted out of all central systems), along with opt-outs from regional systems.

We’d been told that if you opted out afresh, your data would be uploaded only in anonymised, aggregated form; after all, the Prime Minister had promised. But I persisted. How will the NHS work out doctors’ bonuses in respect of opted-out patients? Doctors get extra payments for meeting targets, such as ensuring that diabetic patients get eye tests; these used to be claimed by practice managers but are now to be worked out centrally. If the surgery just uploads “We have N patients opted out and their diagnostic codes are R1, R2, R3, …”, then officials might have to give doctors the benefit of the doubt in bonus calculations.

It turned out that officials were still dithering. The four PC software vendors met them on January 22nd and asked for the business logic so they could code up the extraction, but officials could not make up their minds whether to respect the Prime Minister’s promise (and human-rights law) or to support the bonus calculation. So here we had a major national programme being rolled out next month, and still without a stable specification!

Now the decision has been taken. If you opt out, all your clinical data will be uploaded as a single record, but with your name, date of birth and postcode removed. The government will simply pretend this is anonymous, even though they well know it is not. This is clearly unlawful. Our advice is to opt out anyway while we lobby ministers to get their officials under control, deliver on Cameron’s promise and obey the law.
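To make the contrast concrete, here is a small hypothetical sketch of the two options: an aggregate report of the kind the Prime Minister’s promise implied, versus the record-level upload that was actually chosen. The patient records, field names and codes are invented for illustration; this is not the care.data extraction specification.

```python
# Hypothetical illustration of the two extraction options discussed above.
# All records, field names and codes are made up for the example.

patients = [
    {"id": "P1", "opted_out": True,  "diagnosis": "diabetes", "eye_test": True},
    {"id": "P2", "opted_out": True,  "diagnosis": "diabetes", "eye_test": False},
    {"id": "P3", "opted_out": False, "diagnosis": "asthma",   "eye_test": False},
]

opted_out = [p for p in patients if p["opted_out"]]

# Option 1: aggregated upload -- counts only, enough to give doctors the
# benefit of the doubt in bonus calculations, and consistent with the
# promise that opted-out data would be anonymised and aggregated.
aggregate = {
    "opted_out_count": len(opted_out),
    "diabetics_with_eye_test": sum(
        p["diagnosis"] == "diabetes" and p["eye_test"] for p in opted_out
    ),
}
print("Aggregate report:", aggregate)

# Option 2: record-level upload with name, DOB and postcode stripped --
# the option eventually chosen. Each row is still one patient's full
# clinical history, and therefore linkable, not anonymous.
record_level = [
    {"diagnosis": p["diagnosis"], "eye_test": p["eye_test"]} for p in opted_out
]
print("Record-level extract:", record_level)
```

The aggregate report loses nothing that the bonus calculation needs, which is why the choice of the record-level option is so hard to square with either the promise or the law.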

Why dispute resolution is hard

Today we release a paper on security protocols and evidence which analyses why dispute resolution mechanisms in electronic systems often don’t work very well. On this blog we’ve noted many, many problems with EMV (Chip and PIN), as well as other systems from curfew tags to digital tachographs. Time and again we find that electronic systems are truly awful for courts to deal with. Why?

The main reason, we observed, is that their dispute resolution aspects were never properly designed, built and tested. The firms that delivered the main production systems assumed, or hoped, that because some audit data were available, lawyers would be able to use them somehow.

As you’d expect, all sorts of things go wrong. We derive some principles, and show how these are also violated by new systems ranging from phone banking through overlay payments to Bitcoin. We also propose some enhancements to the EMV protocol which would make it easier to resolve disputes over Chip and PIN transactions.

Update (2014-03-07): This post was mentioned on Bruce Schneier’s blog, and there is some good discussion there.

Update (2014-03-03): The slides for the presentation at Financial Cryptography are now online.

Untrue claims by NHS IT chief

If you listen to Radio 4 from 0810 on BBC iPlayer, you’ll hear a debate between Phil Booth of MedConfidential and Tim Kelsey of NHS England – the guy driving the latest NHS data grab.

Tim Kelsey made a number of misleading claims. He claimed for example that in 25 years there had never been a single case of patient confidentiality compromise because of the HES data kept centrally on all hospital treatments. This was untrue. A GP practice manager, Helen Wilkinson, was stigmatised as an alcoholic on HES because of a coding error. She had to get her MP to call a debate in Parliament to get this fixed (and even after the minister promised it had been fixed, it hadn’t been; that took months more pushing).

Second, when Tim pressed Phil for a single case where data had been compromised, Phil said “Gordon Brown”. Kelsey’s rebuttal was “That was criminal hacking.” Again, this was untrue; Gordon Brown’s information was accessed by Andrew Jamieson, a doctor in Dunfermline, who abused his authorised access to the system. He was not prosecuted because this was not in the public interest. Yeah, right. And now Kelsey is going to give your GP records not just to almost everyone in the NHS but to university researchers (I have been offered access though I’m not even a medic and despite the fact that academics have lost millions of records in the past), to drug firms like GlaxoSmithKline, and even to Silicon-Valley informatics companies such as 23andme.