Category Archives: Legal issues

Security-related legislation, government initiatives, court cases

Report on the IP Bill

This morning at 0930 the Joint Committee on the IP Bill is launching its report. As one of the witnesses who appeared before it, I got an embargoed copy yesterday.

The report is deeply disappointing; even that of the Intelligence and Security Committee (whom we tended to dismiss as government catspaws) is more vigorous. The MPs and peers on the Joint Committee have given the spooks all they wanted, while recommending tweaks and polishes here and there to some of the more obvious hooks and sharp edges.

The committee supports comms data retention, despite acknowledging that multiple courts have found this contrary to EU and human-rights law, and despite the cases in the pipeline. It supports extending retention from big telcos offering a public service to private operators and even coffee shops. It supports greatly extending comms data to ICRs; although it does call for more clarity on the definition, it gives the Home Office lots of wriggle room by saying that a clear definition is hard if you want to catch all the things that bad people might do in the future. (Presumably a coffee shop served with an ICR order will have no choice but to install a government-approved black box, or just pipe everything to Cheltenham.) It welcomes the government decision to build and operate a request filter – essentially the comms database for which the Home Office has been trying to get parliamentary approval since the days of Jacqui Smith (and which Snowden told us they just built anyway). It comes up with the rather startling justification that this will help privacy, as the police may have access to less stuff (though of course the spooks, including our 5eyes partners and others, will have more). It wants end-to-end encrypted stuff to be made available unless it’s “not practicable to do so”, which presumably means that the Home Secretary can order Apple to add her public key quietly to your keyring to get at your Facetime video chats. That has been a key goal of the FBI in Crypto War 2; a Home Office witness openly acknowledged it.
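The “add her public key quietly” attack works because end-to-end messaging systems typically encrypt a message once under a random session key, then wrap that session key separately for each recipient; an extra, undisclosed recipient entry is invisible to the people chatting. Here is a toy sketch of that pattern, using only the standard library: XOR stands in for real public-key wrapping and ciphering, so this illustrates the structure, not usable cryptography, and all names are invented.

```python
# Toy model of multi-recipient end-to-end encryption.
# NOT real crypto: XOR stands in for key-wrapping and encryption
# so the sketch runs on the standard library alone.
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt_to_group(message: bytes, recipient_keys: dict) -> dict:
    session_key = secrets.token_bytes(len(message))
    return {
        "ciphertext": xor(message, session_key),
        # one wrapped copy of the session key per listed recipient
        "wrapped": {name: xor(session_key, k[:len(session_key)])
                    for name, k in recipient_keys.items()},
    }

def decrypt(envelope: dict, name: str, key: bytes) -> bytes:
    session_key = xor(envelope["wrapped"][name], key[:len(envelope["ciphertext"])])
    return xor(envelope["ciphertext"], session_key)

alice, bob, ghost = (secrets.token_bytes(16) for _ in range(3))
msg = b"hello, world 123"  # 16 bytes, to match the toy key length

# An honest call would list two recipients; a coerced vendor
# need only append one more entry, and neither party can tell.
env = encrypt_to_group(msg, {"alice": alice, "bob": bob, "ghost": ghost})

assert decrypt(env, "bob", bob) == msg
assert decrypt(env, "ghost", ghost) == msg  # the silent third reader
```

The point of the sketch is that the extra recipient changes nothing visible in the ciphertext or the recipients’ experience, which is exactly why such “exceptional access” demands are hard to detect or audit.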

The comparison with the USA is stark. There, all three branches of government realised they’d gone too far after Snowden. President Obama set up the NSA review group, and implemented most of its recommendations by executive order; the judiciary made changes to the procedures of the FISA Court; and Congress failed to renew the data retention provisions in the Patriot Act (aided by the judiciary). Yet here in Britain the response is just to take Henry VIII powers to legalise all the illegal things that GCHQ had been up to, and hope that the European courts won’t strike the law down yet again.

People concerned for freedom and privacy will just have to hope the contrary. The net effect of the minor amendments proposed by the joint committee will be to make it even harder to get any meaningful amendments as the Bill makes its way through Parliament, and we’ll end up having to rely on the European courts to trim it back.

For more, see Scrambling for Safety, a conference on the bill that we held last month in London (the video is now online), and last week’s Cambridge symposium for a more detailed analysis.

Snoopers’ Charter 2.0

This afternoon at 4.30 I have been invited to give evidence in Parliament to the Joint Select Committee on the Investigatory Powers Bill.

This follows evidence I gave on the technical aspects of the bill to the Science and Technology Committee on November 10th; see video and documents. Of particular interest may be comments by my Cambridge colleague Richard Clayton; an analysis by my UCL colleague George Danezis; the ORG wiki; and finally the text of the bill itself.

While the USA has reacted to the Snowden revelations by restraining the NSA in various ways, the UK reaction appears to be the opposite. Do we really want to follow countries like China, Russia and Kazakhstan, and take the risk that we’ll tip countries like Brazil and India into following our lead? If the Internet fragments into national islands, that will not only do grave harm to the world economy, but make life a lot harder for GCHQ too.

The emotional cost of cybercrime

We know more and more about the financial cost of cybercrime, but there has been very little work on its emotional cost. David Modic and I decided to investigate. We wanted to empirically test whether there are emotional repercussions to becoming a victim of fraud (Yes, there are). We wanted to compare emotional and financial impact across different categories of fraud and establish a ranking list (And we did). An interesting, although not surprising, finding was that in every tested category the victim’s perception of emotional impact outweighed the reported financial loss.

A victim may think that they will still be able to recover their money, if not their pride. That really depends on what type of fraud they facilitated. If it is auction fraud, then their chances of recovery are comparatively higher than in bank fraud – we found that 26% of our sample would attempt to recover funds lost in a fraudulent auction, and approximately half of them were reimbursed (see this presentation). There is considerable evidence that banks are not very likely to believe someone claiming to be a victim of, say, identity theft and by extension bank fraud. Thus, when someone ends up out of pocket, they will likely also go through a process of secondary victimisation: they will be told that they broke some small-print rule, like having the same PIN for two of their bank cards or not using the bank’s approved anti-virus software, and are thus not eligible for any refund – it is all their own fault, really.

You can find the article here or here. (It was published in IEEE Security & Privacy.)

This paper complements and extends our earlier work on the costs of cybercrime, where we show that the broader economic costs to society of cybercrime – such as loss of confidence in online shopping and banking – also greatly exceed the amounts that cybercriminals actually manage to steal.

Internet of Bad Things

A lot of people are starting to ask about the security and privacy implications of the “Internet of Things”. Once there’s software in everything, what will go wrong? We’ve seen a botnet recruiting CCTV cameras, and a former Director of GCHQ recently told a parliamentary committee that it might be convenient if a suspect’s car could be infected with malware that would cause it to continually report its GPS position. (The new Investigatory Powers Bill will give the police and the spooks the power to hack any device they want.)

So here is the video of a talk I gave on The Internet of Bad Things to the Virus Bulletin conference. As the devices around us become smarter they will become less loyal, and it’s not just about malware (whether written by cops or by crooks). We can expect all sorts of novel business models, many of them exploitative, as well as some downright dishonesty: the recent Volkswagen scandal won’t be the last.

But dealing with pervasive malware in everything will demand new approaches. Our approach to the Internet of Bad Things includes our new Cambridge Cybercrime Centre, which will let us monitor bad things online at the kind of scale that will be required.

Decepticon: interdisciplinary conference on deception research

I’m at Decepticon 2015 and will be liveblogging the talks in followups to this post. Up till now, research on deception has been spread around half a dozen different events, aimed at cognitive psychologists, forensic psychologists, law enforcement, cybercrime specialists and others. My colleague Sophie van der Zee decided to organise a single annual event to bring everyone together, and Decepticon is the result. With over 160 registrants for the first edition of the event (and late registrants turned away) it certainly seems to have hit a sweet spot.

Award-winning case history of the care.data health privacy scandal

Each year we divide our masters of public policy students into teams and get them to write case studies of public policy failures. The winning team this year wrote a case study of the care.data fiasco. The UK government collected personal health information on tens of millions of people who had had hospital treatment in England and then sold it off to researchers, drug companies and even marketing firms, with only a token gesture of anonymisation. In practice patients were easy to identify. The resulting scandal stalled plans to centralise GP data as well, at least for a while.
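Why were patients “easy to identify” despite the token anonymisation? Because pseudonymised records usually retain quasi-identifiers such as birth year, postcode district and sex, and linking those against any outside dataset that carries names often yields a unique match. A minimal sketch of that linkage attack, with entirely invented data and field names:

```python
# Hypothetical illustration of re-identification by linkage.
# All records, names and field names below are invented.
pseudonymised = [
    {"pid": "a9f3", "dob_year": 1954, "postcode_district": "CB3",
     "sex": "F", "diagnosis": "..."},
    {"pid": "77c1", "dob_year": 1988, "postcode_district": "CB1",
     "sex": "M", "diagnosis": "..."},
]
public_register = [  # e.g. an electoral roll or a press report
    {"name": "Jane Example", "dob_year": 1954,
     "postcode_district": "CB3", "sex": "F"},
]

def reidentify(records, register):
    """Link pseudonymised records to named people via quasi-identifiers."""
    keys = ("dob_year", "postcode_district", "sex")
    matches = []
    for r in records:
        candidates = [p for p in register
                      if all(p[k] == r[k] for k in keys)]
        if len(candidates) == 1:  # unique match => re-identified
            matches.append((r["pid"], candidates[0]["name"]))
    return matches

print(reidentify(pseudonymised, public_register))
# [('a9f3', 'Jane Example')]
```

With real data the combination of full date of birth, postcode and sex is unique for the large majority of the population, which is why stripping names alone is only a token gesture.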

Congratulations to Lizzie Presser, Maia Hruskova, Helen Rowbottom and Jesse Kancir, who tell the story of how mismanagement, conflicts and miscommunication led to a failure of patient privacy on an industrial scale, and discuss the lessons that might be learned. Their case study has just appeared today in Technology Science, a new open-access journal for people studying conflicts that arise between technology and society. LBT readers will recall several posts reporting the problem, but it’s great to have a proper, peer-reviewed case study that we can give to future generations of students. (Incidentally, the previous year’s winning case study was on a related topic, the failure of the NHS National Programme for IT.)

Four cool new jobs

We’re advertising for four people to join the security group from October.

The first three are for two software engineers to join our new cybercrime centre, to develop new ways of finding bad guys in the terabytes and (soon) petabytes of data we get on spam, phish and other bad stuff online; and a lawyer to explore and define the boundaries of how we share cybercrime data.

The fourth is in Security analysis of semiconductor memory. Could you help us come up with neat new ways of hacking chips? We’ve invented quite a few of these in the past, ranging from optical fault induction through semi-invasive attacks generally. What’s next?

FCA view on unauthorised transactions

Yesterday the Financial Conduct Authority (the UK bank regulator) issued a report on Fair treatment for consumers who suffer unauthorised transactions. This is an issue in which we have an interest, as fraud victims regularly come to us after being turned away by their bank and by the financial ombudsman service. Yet the FCA have found that everything is hunky dory, and conclude “we do not believe that further thematic work is required at this stage”.

One of the things the FCA asked their consultants was whether there’s any evidence that claims are rejected on the sole basis that a PIN was used. The consultants didn’t want to rely on existing work but instead surveyed a nationally representative sample of 948 people and found that 16% had had a transaction dispute in the last year. These were 37% MOTO, 22% cancelled future-dated payments, 15% ATM cash, 13% shop transactions and 13% lump sums from a bank account. Of customers who complained, 43% were offered their money back spontaneously; a further 41% asked; in the end a total of 68% got refunds after varying periods of time. In total 7% (15 victims) had their claim declined, most because the bank said the transaction was “authorised” or followed a “contract with merchant”, and two because chip and PIN was used (one of them an ATM transaction; the other admitted sharing their PIN). Twelve of these 15 considered the result unfair. These figures are entirely consistent with what we learn from the British Crime Survey and elsewhere: two million UK victims a year, and while most get their money back, many don’t; and a hard core of perhaps a few tens of thousands end up feeling that their bank has screwed them.

The case studies profiled in the consultants’ paper were of glowing happy people who got their money back; the 12 sad losers were not profiled, and the consultants concluded that “Customers might be being denied refunds on the sole basis that Chip and PIN were used … we found little evidence of this” (p 49) and went on to remark helpfully that some customers admitted sharing their PINs and felt OK lying about this. The FCA happily paraphrases this as “We also did not find any evidence of firms holding customers liable for unauthorised transactions solely on the basis that the PIN was used to make the transaction” (main report, p 13, 3.25).

According to recent news reports, the former head of the FCA, Martin Wheatley, was sacked by George Osborne for being too harsh on the banks.