The European Court of Justice decision in the Google case will have implications way beyond search engines. Regular readers of this blog will recall stories of banks hounding innocent people for money following payment disputes, and a favourite trick is to blacklist people with credit reference agencies, even while disputes are still in progress (or even after the bank has actually lost a court case). In the past, the Information Commissioner refused to do anything about this abuse, claiming that it’s the bank which is the data controller, not the credit agency. The court now confirms that this view was quite wrong. I have therefore written to the Information Commissioner inviting him to acknowledge this and to withdraw the guidance issued to the credit reference agencies by his predecessor.
I wonder what other information intermediaries will now have to revise their business models?
Today I gave a talk at the Open Data Institute on a catastrophic failure of anonymity in medical research. Here’s the audio and video, and here are the slides.
Three weeks ago we made a formal complaint to the ICO about the Department of Health supplying a large amount of data to PA Consulting, who uploaded it to the Google cloud in defiance of NHS regulations on sending data abroad. This follows several other scandals over NHS chiefs claiming that hospital episode statistics data are anonymous and selling it to third parties, when it is nothing of the kind.
Yesterday the Department of Health disclosed its Register of Approved Data Releases which shows that many organisations in both the public and private sectors have been supplied with HES data over the past year. It’s amazing how many of them are marked “non sensitive”: even number 408, where Imperial College got data with the HESID (which includes postcode or NHS number), date of birth, home address, and GP practice. How officials can maintain that such data does not identify individuals is beyond me.
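To see why, here is a toy sketch (using invented synthetic records, not NHS data) of how few quasi-identifiers it takes to single people out: even coarse date-of-birth plus postcode-district pairs make most records unique, and full postcodes or home addresses only make matters worse. The population size and number of districts are illustrative assumptions.

```python
# Toy uniqueness estimate: what fraction of synthetic "patients" are
# uniquely identified by date of birth plus postcode district alone?
import random
from collections import Counter

random.seed(1)

def synth_population(n):
    """Generate n synthetic (date-of-birth, postcode-district) pairs."""
    records = []
    for _ in range(n):
        dob = (random.randint(1940, 2005),   # year
               random.randint(1, 12),        # month
               random.randint(1, 28))        # day
        district = random.randint(1, 2000)   # ~2,000 UK postcode districts
        records.append((dob, district))
    return records

population = synth_population(100_000)
counts = Counter(population)
unique = sum(1 for r in population if counts[r] == 1)
print(f"{unique / len(population):.0%} unique on DOB + district")
```

With full postcodes (around 1.7 million in the UK) rather than districts, essentially every record becomes unique, which is why stripping names alone is not anonymisation.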
As part of another project, I needed to demonstrate how the various user-interface options for sending anonymous email through Mixmaster appeared to the email sender. This is very difficult to explain in words, so I recorded some screencasts. The tools I used were the Mixmaster command line tool, the Mutt email client with Mixmaster plugin, QuickSilver Lite, and finally a web-based interface.
The project is now over, but in case these screencasts are of wider interest, I’ve put them on YouTube.
Overall, the usability of Mixmaster is not great. All of the secure options are difficult to configure and use (QuickSilver Lite is probably the best), emails take a long time to be sent, recipients of anonymous email can’t send replies, and there is a high chance that the email will be dropped en-route.
Three NGOs have lodged a formal complaint to the Information Commissioner about the fact that PA Consulting uploaded over a decade of UK hospital records to a US-based cloud service. This appears to have involved serious breaches of the UK Data Protection Act 1998 and of multiple NHS regulations about the security of personal health information. This already caused a row in Parliament and the Department of Health seems to be trying to wriggle off the hook by pretending that the data were pseudonymised. Other EU countries have banned such uploads. Regular LBT readers will know that the Department of Health has got itself in a complete mess over medical record privacy.
I will be trying to liveblog Financial Cryptography 2014. I just gave a keynote talk entitled “EMV – Why Payment Systems Fail” summarising our last decade’s research on what goes wrong with Chip and PIN. There will be a paper on this out in a few months; meanwhile here’s the slides and here’s our page of papers on bank security.
The sessions of refereed papers will be blogged in comments to this post.
On January 23rd we had a conference call with the NHS Information Centre and a couple of its software suppliers about anonymisation. LBT readers will have followed how your GP records are to be uploaded to the new central database care.data for resale unless you opt out. Any previous opt-outs from other central systems like SCR will be disregarded (even if you wrote saying you opted out of all central systems), along with opt-outs from regional systems.
We’d been told that if you opted out afresh your data would be uploaded only in anonymised, aggregated form; after all the Prime Minister promised. But I persisted. How will the NHS work out doctors’ bonuses in respect of opted-out patients? Doctors get extra payments for meeting targets, such as ensuring that diabetic patients get eye tests; these used to be claimed by practice managers but are now to be worked out centrally. If the surgery just uploads “We have N patients opted out and their diagnostic codes are R1, R2, R3, …” then officials might have to give doctors the benefit of the doubt in bonus calculations.
It turned out that officials were still dithering. The four PC software vendors met them on January 22nd and asked for the business logic so they could code up the extraction, but officials could not make up their minds whether to respect the Prime Minister’s promise (and human-rights law) or to support the bonus calculation. So here we had a major national programme being rolled out next month, and still without a stable specification!
Now the decision has been taken. If you opt out, all your clinical data will be uploaded as a single record, but with your name, date of birth and postcode removed. The government will simply pretend this is anonymous, even though they well know it is not. This is clearly unlawful. Our advice is to opt out anyway while we lobby ministers to get their officials under control, deliver on Cameron’s promise and obey the law.
The Privacy Enhancing Technologies Symposium (PETS) aims to advance the state of the art and foster a world-wide community of researchers and practitioners to discuss innovation and new perspectives.
PETS seeks paper submissions for its 14th event (of which I am program chair), to be held in Amsterdam, Netherlands, July 16–18, 2014. Papers should present novel practical and/or theoretical research into the design, analysis, experimentation, or fielding of privacy-enhancing technologies. While PETS has traditionally been home to research on anonymity systems and privacy-oriented cryptography, we strongly encourage submissions in a number of both well-established and emerging privacy-related topics.
Abstracts should be submitted by 10 February 2014, with full papers submitted by 13 February 2014. For further details, see the call for papers.
We had a crypto festival in London in November at which a number of cryptographers and crypto policy folks got together with over 1000 mostly young attendees to talk about what might be done in response to the Snowden revelations.
Here is a video of the session in which I spoke. The first speaker was Annie Machon (at 02.35) talking of her experience of life on the run from MI5, and on what we might do to protect journalists’ sources in the future. I’m at 23.55 talking about what’s changed for governments, corporates, researchers and others. Nick Pickles of Big Brother Watch follows at 45.45 talking on what can be done in terms of practical politics; it turned out that only two of us in the auditorium had met our MPs over the Comms Data Bill. The final speaker, Smari McCarthy, comes on at 56.45, calling for lots more encryption. The audience discussion starts at 1:12:00.
Your medical records are now officially on sale. American drug companies now learn that MedRed BT Health Cloud will provide public access to 50 million de-identified patient records from the UK.
David Cameron announced in 2011 that every NHS patient would be a research patient, with their records opened up to private healthcare firms. He promised that our records would be anonymised and we’d have a right to opt out. I pointed out that anonymisation doesn’t work very well (as did the Royal Society) but the Information Commissioner predictably went along with the charade (and lobbyists are busy fixing up the new data protection regulation in Brussels to leave huge loopholes for health service management and research). The government duly started to compel the upload of GP data, to join the hospital data it already has. During the launch of a medical confidentiality campaign the health secretary promised to respect existing opt-outs but has now reneged on his promise.
The data being put online by BT appear to be the data it already manages from the Secondary Uses Service, which is mostly populated by records of finished consultant episodes from hospitals. These are pseudonymised by removing names and addresses but still have patient postcodes and dates of birth; patient views on this were ignored. I wonder if US purchasers will get these data items? I also wonder whether patients will be able to opt out of SUS? Campaigners have sent freedom of information requests to hundreds of hospitals to find out; so we should know soon enough.
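The trouble with keeping postcode and date of birth is that anyone holding another dataset pairing those fields with names can simply join the two. Here is a minimal sketch of such a linkage attack; all the records and field names are invented for illustration.

```python
# Illustrative linkage attack: "pseudonymised" hospital records that
# retain postcode and date of birth can be re-identified by joining
# against any name-bearing dataset with the same fields.

hospital = [  # pseudonymised: names removed, quasi-identifiers kept
    {"postcode": "CB3 0FD", "dob": "1970-01-02", "diagnosis": "E11"},
    {"postcode": "SW1A 1AA", "dob": "1948-11-14", "diagnosis": "I21"},
]

public = [  # e.g. marketing or electoral-roll style data with names
    {"name": "A. Patient", "postcode": "CB3 0FD", "dob": "1970-01-02"},
    {"name": "B. Patient", "postcode": "SW1A 1AA", "dob": "1948-11-14"},
]

# Index the public data by the shared quasi-identifiers, then join.
index = {(p["postcode"], p["dob"]): p["name"] for p in public}
for rec in hospital:
    name = index.get((rec["postcode"], rec["dob"]))
    if name:
        print(f"{name}: diagnosis {rec['diagnosis']}")
```

The join is a one-liner; the only protection pseudonymisation offers here is against attackers who hold no auxiliary data at all.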
Today we’re presenting a new side-channel attack in PIN Skimmer: Inferring PINs Through The Camera and Microphone at SPSM 2013. We found that software on your smartphone can work out what PIN you’re entering by watching your face through the camera and listening for the clicks as you type. Previous researchers had shown how to work out PINs using the gyro and accelerometer; we found that the camera works about as well. We watch how your face appears to move as you jiggle your phone by typing.
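To give a feel for why such side channels work (this is a toy model, not the paper’s actual pipeline): if each digit on the PIN pad induces a characteristic phone movement, even a crude nearest-centroid classifier on noisy two-dimensional “motion features” recovers digits far better than chance. The pad geometry and noise model below are invented assumptions.

```python
# Toy digit-inference sketch: each keypress nudges the phone toward the
# key's position on an idealised 3x4 pad; we classify noisy observations
# by nearest centroid.
import math
import random

random.seed(0)

# Idealised (x, y) pad positions for digits 0-9 (assumption).
centroids = {d: (d % 3, d // 3) for d in range(10)}

def observe(digit, noise=0.3):
    """Simulate a noisy motion-feature reading for one keypress."""
    cx, cy = centroids[digit]
    return (cx + random.gauss(0, noise), cy + random.gauss(0, noise))

def classify(sample):
    """Guess the digit whose centroid is closest to the sample."""
    return min(centroids, key=lambda d: math.dist(sample, centroids[d]))

trials = [(d, classify(observe(d))) for d in range(10) for _ in range(100)]
accuracy = sum(d == guess for d, guess in trials) / len(trials)
print(f"digit recovery accuracy: {accuracy:.0%} (random guessing: 10%)")
```

Even this crude model recovers most digits, and an attacker who narrows each keypress to a few candidates can enumerate the remaining PIN space quickly.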
There are implications for the design of electronic wallets using mechanisms such as Trustzone which enable some apps to run in a more secure sandbox. Such systems try to prevent sensitive data such as bank credentials being stolen by malware. Our work shows it’s not enough for your electronic wallet software to grab hold of the screen, the accelerometers and the gyro; you’d better lock down the video camera, and the still camera too while you’re at it. (Our attack can use the still camera in burst mode.)
We suggest ways in which mobile phone operating systems might mitigate the risks. Meanwhile, if you’re developing payment apps, you’d better be aware that these risks exist.