Posts filed under 'Security economics'

Apr 4, '14

Today I gave a talk at the Open Data Institute on a catastrophic failure of anonymity in medical research. Here’s the audio and video, and here are the slides.

Three weeks ago we made a formal complaint to the ICO about the Department of Health supplying a large amount of data to PA Consulting, who uploaded it to the Google cloud in defiance of NHS regulations on sending data abroad. This follows several other scandals over NHS chiefs claiming that hospital episode statistics data are anonymous and selling them to third parties, when they are nothing of the kind.

Yesterday the Department of Health disclosed its Register of Approved Data Releases, which shows that many organisations in both the public and private sectors have been supplied with HES data over the past year. It’s amazing how many of them are marked “non sensitive”: even number 408, where Imperial College got data with the HESID (which includes postcode or NHS number), date of birth, home address, and GP practice. How officials can maintain that such data does not identify individuals is beyond me.
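To make the re-identification risk concrete, here is a minimal sketch (in Python, with entirely made-up data and field names) of a classic linkage attack: joining a “pseudonymised” extract on quasi-identifiers such as date of birth and postcode against a small auxiliary dataset that carries names. It is illustrative only, not a description of any actual release.

```python
# Illustrative linkage attack on "pseudonymised" records (hypothetical data).
# Joining on quasi-identifiers (date of birth + postcode) is often enough
# to put a name back on a record, which is why such extracts are not anonymous.

hospital_extract = [  # "pseudonymised": no name, but rich quasi-identifiers
    {"hes_id": "A93F2", "dob": "1958-06-02", "postcode": "CB3 0FD", "diagnosis": "I21"},
    {"hes_id": "B11C7", "dob": "1990-11-23", "postcode": "YO1 7HH", "diagnosis": "F32"},
]

public_records = [  # e.g. an electoral roll or a scraped social-media profile
    {"name": "Alice Example", "dob": "1958-06-02", "postcode": "CB3 0FD"},
    {"name": "Bob Example", "dob": "1971-01-15", "postcode": "SW1A 1AA"},
]

def link(extract, aux):
    """Return (name, record) pairs where the quasi-identifiers match uniquely."""
    matches = []
    for rec in extract:
        candidates = [p for p in aux
                      if p["dob"] == rec["dob"] and p["postcode"] == rec["postcode"]]
        if len(candidates) == 1:  # a unique match re-identifies the record
            matches.append((candidates[0]["name"], rec))
    return matches

for name, rec in link(hospital_extract, public_records):
    print(f"{name} -> diagnosis {rec['diagnosis']} (HES id {rec['hes_id']})")
```

The point of the sketch is simply that once a record carries date of birth, postcode and similar fields, anyone holding a modest auxiliary dataset can name the patient.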

Mar 14, '14

Three NGOs have lodged a formal complaint to the Information Commissioner about the fact that PA Consulting uploaded over a decade of UK hospital records to a US-based cloud service. This appears to have involved serious breaches of the UK Data Protection Act 1998 and of multiple NHS regulations about the security of personal health information. It has already caused a row in Parliament, and the Department of Health seems to be trying to wriggle off the hook by pretending that the data were pseudonymised. Other EU countries have banned such uploads. Regular LBT readers will know that the Department of Health has got itself in a complete mess over medical record privacy.

Mar 5, '14

Bank names are so tricksy — they all have similar words in them… and so it’s common to see phishing feeds with slightly the wrong brand identified as being impersonated.

However, this story is about how something the other way around has happened: AnonGhost, a hacker group, believe that they’ve defaced “Yorkshire Bank, one of the largest United Kingdom bank”, and there’s some boasting about this to be found at http://www.p0ison.com/ybs-bank-got-hacked-by-team-anonghost/.

However, it rather looks to me as if they’ve hacked an imitation bank instead! A rather less glorious exploit from the point of view of potential admirers.

Mar 3, '14

I will be trying to liveblog Financial Cryptography 2014. I just gave a keynote talk entitled “EMV – Why Payment Systems Fail” summarising our last decade’s research on what goes wrong with Chip and PIN. There will be a paper on this out in a few months; meanwhile here’s the slides and here’s our page of papers on bank security.

The sessions of refereed papers will be blogged in comments to this post.

Feb 21, '14

Next year’s Workshop on the Economics of Information Security (WEIS 2014) will be at Penn State on June 23–24. Submissions are due a week from today, at the end of February. It will be fascinating to see what effects the Snowden revelations will have on the community’s thinking. Get writing!

Feb 5, '14

Today we release a paper on security protocols and evidence which analyses why dispute resolution mechanisms in electronic systems often don’t work very well. On this blog we’ve noted many, many problems with EMV (Chip and PIN), as well as with other systems from curfew tags to digital tachographs. Time and again we find that electronic systems are truly awful for courts to deal with. Why?

The main reason, we observed, is that their dispute resolution aspects were never properly designed, built and tested. The firms that delivered the main production systems assumed, or hoped, that because some audit data were available, lawyers would be able to use them somehow.

As you’d expect, all sorts of things go wrong. We derive some principles, and show how these are also violated by new systems ranging from phone banking through overlay payments to Bitcoin. We also propose some enhancements to the EMV protocol which would make it easier to resolve disputes over Chip and PIN transactions.
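By way of illustration only (this is not the enhancement proposed in the paper), here is a minimal Python sketch of the sort of property one wants when evidence is designed in rather than bolted on: a hash-chained transaction log, where each entry commits to its predecessor, so a party cannot quietly alter or drop records once a dispute has arisen. All record contents and field names are hypothetical.

```python
# Minimal sketch of a tamper-evident (hash-chained) transaction log.
# Each entry commits to the previous entry's hash, so retrospective edits
# or deletions break the chain and become detectable in a dispute.
import hashlib
import json

def entry_hash(record, prev):
    return hashlib.sha256(json.dumps({"record": record, "prev": prev},
                                     sort_keys=True).encode()).hexdigest()

def append(log, record):
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"record": record, "prev": prev, "hash": entry_hash(record, prev)}
    log.append(entry)
    return entry

def verify(log):
    prev = "0" * 64
    for e in log:
        if e["prev"] != prev or e["hash"] != entry_hash(e["record"], e["prev"]):
            return False
        prev = e["hash"]
    return True

log = []
append(log, {"amount": 42.50, "terminal": "T1001", "result": "approved"})
append(log, {"amount": 9.99, "terminal": "T1001", "result": "declined"})
print(verify(log))                    # True
log[0]["record"]["amount"] = 0.01     # retrospective tampering
print(verify(log))                    # False, the chain no longer verifies
```

The sketch only shows tamper-evidence; a real dispute resolution design also has to settle who keeps the log, who can check it, and what it proves to a court.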

Update (2014-03-07): This post was mentioned on Bruce Schneier’s blog, and there is some good discussion there.

Update (2014-03-03): The slides for the presentation at Financial Cryptography are now online.

Jan 8, '14

The next three weeks will see a leaflet drop on over 20 million households. NHS England plans to start uploading your GP records in March or April to a central system, from which they will be sold to a wide range of medical and other research organisations. European data-protection and human-rights laws demand that we be able to opt out of such things, so the Information Commissioner has told the NHS to inform you of your right to opt out.

Needless to say, their official leaflet is designed to cause as few people to opt out as possible. It should really have been drafted like this. (There’s a copy of the official leaflet at the MedConfidential.org website.) But even if it had been, the process still won’t meet the consent requirements of human-rights law as it won’t be sent to every patient. One of your housemates could throw it away as junk before you see it, and if you’ve opted out of junk mail you won’t get a leaflet at all.

Yet if you don’t opt out in the next few weeks your data will be uploaded to central systems and you will not be able to get it deleted, ever. If you don’t opt your kids out in the next few weeks the same will happen to their data, and they will not be able to get their data deleted even if they decide they prefer privacy once they come of age. If you opted out of the Summary Care Record in 2009, that doesn’t count; despite a ministerial assurance to the contrary, you now need to opt out all over again. For further information see the website of GP Neil Bhatia (who drafted our more truthful leaflet) and previous LBT posts on medical privacy.

Jan 3, '14

David Modic and I have just published a paper on The psychology of malware warnings. We’re constantly bombarded with warnings designed to cover someone else’s back, but what sort of text should we put in a warning if we actually want the user to pay attention to it?

To our surprise, social cues didn’t seem to work. What works best is to make the warning concrete; people ignore general warnings such as that a web page “might harm your computer” but do pay attention to a specific one such as that the page would “try to infect your computer with malware designed to steal your bank account and credit card details in order to defraud you”. There is also some effect from appeals to authority: people who trust their browser vendor will avoid a page “reported and confirmed by our security team to contain malware”.

We also analysed who turned off browser warnings, or would have if they’d known how: they were people who ignored warnings anyway, typically men who distrusted authority and either couldn’t understand the warnings or were IT experts.

Dec 31, '13

We had a crypto festival in London in November at which a number of cryptographers and crypto policy folks got together with over 1000 mostly young attendees to talk about what might be done in response to the Snowden revelations.

Here is a video of the session in which I spoke. The first speaker was Annie Machon (at 02.35) talking of her experience of life on the run from MI5, and on what we might do to protect journalists’ sources in the future. I’m at 23.55 talking about what’s changed for governments, corporates, researchers and others. Nick Pickles of Big Brother Watch follows at 45.45 talking on what can be done in terms of practical politics; it turned out that only two of us in the auditorium had met our MPs over the Comms Data Bill. The final speaker, Smari McCarthy, comes on at 56.45, calling for lots more encryption. The audience discussion starts at 1:12:00.

Dec 15, '13

Next year’s Workshop on the Economics of Information Security (WEIS 2014) will be at Penn State on June 23–24. The call for papers is now out; papers are due by the end of February.

It will be fascinating to see what effects the Snowden revelations will have on the community’s thinking about security standards and regulation, the incentives for information sharing and cooperation, and the economics of privacy, to mention only three of the workshop’s usual topics. Now that we’ve had six months to think things through, how do you think the security game has changed, and why? Do we need minor changes to our models, or major ones? Are we in the same policy world, or a quite different one? WEIS could be particularly interesting next year. Get writing!

