At our security group meeting on 19 August, Sergei Skorobogatov demonstrated a NAND backup attack on an iPhone 5c. I typed in six wrong PINs and it locked; he removed the flash chip (which he’d desoldered and led out to a socket); he erased and restored the changed pages; he put it back in the phone; and I was able to enter a further six wrong PINs.
Sergei has today released a paper describing the attack.
During the recent fight between the FBI and Apple, FBI Director Jim Comey said this kind of attack wouldn’t work.
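The logic of the attack can be sketched in a few lines. This is purely illustrative, not Sergei’s actual tooling: the phone interface and function names (`backup_nand`, `restore_nand`, `try_pin`) are hypothetical placeholders standing in for the physical backup-and-restore of the flash chip.

```python
# Illustrative sketch of NAND mirroring: the retry counter lives in
# flash, so restoring a pristine flash image after each batch of
# guesses resets it, allowing an unlimited brute-force search.
# All phone methods here are hypothetical placeholders.

ATTEMPTS_BEFORE_LOCK = 6  # the iPhone 5c allowed 6 tries before locking


def brute_force(phone, pin_space):
    """Try every PIN in pin_space, mirroring NAND to defeat the lockout."""
    baseline = phone.backup_nand()          # copy the flash via the socket
    pins = iter(pin_space)
    while True:
        for _ in range(ATTEMPTS_BEFORE_LOCK):
            try:
                pin = next(pins)
            except StopIteration:
                return None                 # search space exhausted
            if phone.try_pin(pin):
                return pin                  # found the PIN
        phone.restore_nand(baseline)        # reset the retry counter
```

For a four-digit PIN this means at most 10,000 guesses and about 1,666 restore cycles; the real cost per cycle is dominated by the physical rewrite of the flash, which is why the attack takes hours rather than seconds.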
When Lying Feels the Right Thing to Do reports three studies we did on what made people more or less likely to submit fraudulent insurance claims. Our first study found that people were more likely to cheat when rejected; the other two showed that rejected claimants were just as likely to cheat when this didn’t lead to financial gain, but that they felt more strongly when there was no money involved.
Our research was conducted as part of a broader research programme to investigate the deterrence of deception; our goal was to understand how to design better websites. However, we can’t help wondering whether it might shine some light on the UK’s recent political turmoil. The Brexit campaigners were minorities in both main political parties and their anti-EU rhetoric had been rejected by the political mainstream for years; they had ideological rather than selfish motives. They ran a blatantly deceptive campaign, persisting in obvious untruths but abandoning them promptly after winning the vote. Rejection is not the only known factor in situational deception; it’s known, for example, that people with unmet goals are more likely to cheat than people who are simply doing their best, and that one bad apple can have a cascading effect. But it still makes you think.
The outcome and aftermath of the referendum have left many people feeling rejected, from remain voters through people who will lose financially to foreign residents of the UK. Our research shows that feelings of rejection can increase cheating by 15–30%; perhaps this might have measurable effects in some sectors. How one might disentangle this from the broader effects of diminished social solidarity, and from politicians simply setting a bad example, could be an interesting problem for social scientists.
The Royal Society has just published a report on cybersecurity research. I was a member of the steering group that tried to keep the policy team headed in the right direction. Its recommendation that governments preserve the robustness of encryption is welcome enough, given the new Russian law on access to crypto keys; it was nice to get, given the conservative nature of the Society. But I’m afraid the glass is only half full.
I was disappointed that the final report went along with the GCHQ line that security breaches should not be reported to affected data subjects, as in the USA, but to the agencies, as mandated in the EU’s NIS directive. Its call for an independent review of the UK’s cybersecurity needs may also achieve little. I was on John Beddington’s Blackett Review five years ago, and the outcome wasn’t published; it was mostly used to justify a budget increase for GCHQ. Its call for UK government work on standards is irrelevant post-Brexit; indeed standards made in Europe will probably be better without UK interference. Most of all, I cannot accept the report’s line that the government should help direct cybersecurity research. Most scientists agree that too much money already goes into directed programmes and not enough into responsive-mode and curiosity-driven research. In the case of security research there is a further factor: the stark conflict of interest between bona fide researchers, whose aim is that some of the people should enjoy some security and privacy some of the time, and agencies engaged in programmes such as Operation Bullrun whose goal is that this should not happen. GCHQ may want a “more responsive cybersecurity agenda”; but that’s the last thing people like me want them to have.
The report has in any case been overtaken by events. First, Brexit is already doing serious harm to research funding. Second, Brexit is also doing serious harm to the IT industry; we hear daily of listings postponed, investments reconsidered and firms planning to move development teams and data overseas. Third, the Investigatory Powers bill currently before the House of Lords highlights the fact that the surveillance debate in the West these days is more about access to data at rest and about whether the government can order firms to hack their customers.
While all three arms of the US government have drawn back on surveillance powers following the Snowden revelations, Theresa May has taken the hardest possible line. Her Investigatory Powers Bill will give her successors as Home Secretary sweeping powers to order firms in the UK to hand over data and help GCHQ hack their customers. Brexit will shield these powers from challenge in the European Court of Justice, making it much harder for a UK company to claim “adequacy” for its data protection arrangements in respect of EU data subjects. This will make it still less attractive for an IT company to keep in the UK either data that could be seized or engineering staff who could be coerced. I am seriously concerned that, together with Brexit, this will be the double whammy that persuades overseas firms not to invest in the UK, and that even causes some UK firms to leave. In the face of this massive self-harm, the measures suggested by the report are unlikely to help much.
I’m liveblogging the Workshop on Security and Human Behaviour which is being held in Harvard. The programme is here. For background, see the liveblogs for SHB 2008-15 which are linked here and here. Blog posts summarising the talks at the workshop sessions will appear as followups below.
We recently reported that the Commissioner of the Met, Sir Bernard Hogan-Howe, said that banks should not refund fraud victims as this would just make people careless with their passwords and antivirus. The banks’ desire to blame fraud victims if they can, to avoid refunding them, is rational enough, but for a police chief to support them was disgraceful. Thirty years ago, a chief constable might have said that rape victims had themselves to blame for wearing nice clothes; if he were to say that nowadays, he’d be sacked. Hogan-Howe’s view of bank fraud is just as uninformed, and just as offensive to victims.
Our spooky friends at Cheltenham have joined the party. The Register reports a story in the Financial Times (behind a paywall) which says GCHQ believes that “companies must do more to try and encourage their customers to improve their cyber security standards. Customers using outdated software – sometimes riddled with vulnerabilities that hackers can exploit – are a weak link in the UK’s cyber defences.” There is no mention of the banks’ own outdated technology, or of GCHQ’s role in keeping consumer software vulnerable.
The elegant scribblers at the Financial Times are under the impression that “At present, banks routinely cover the cost of fraud, regardless of blame.” So they clearly are not regular readers of Light Blue Touchpaper.
The spooks are slightly more cautious; according to the FT, GCHQ “has told the private sector it will not take responsibility for regulatory failings”. I’m sure the banks will heave a big sigh of relief that their cosy relationship with the police, the ombudsman and the FCA will not be disturbed.
We will have to change our security-economics teaching material so we don’t just talk about the case where “Alice guards a system and Bob pays the costs of failure”, but also this new case where “Alice guards a system, and bribes the government to compel Bob to pay the costs of failure.” Now we know how Hogan-Howe is paid off; the banks pay for his Dedicated Card and Payment Crime Unit. But how are they paying off GCHQ, and what else are they getting as part of the deal?
Commissioner Hogan-Howe of the Met said on Thursday that the banks should not refund fraud victims because it “rewards” them for being lax about internet security. This was too much to pass up, so I wrote a letter to the editor of the Times, which has just been published. As the Times is behind a paywall, here is the text.
Sir, Sir Bernard Hogan-Howe argues that banks should not refund online fraud victims as this would make people careless with their passwords and anti-virus software (p1, March 24, and letters Mar 25 & 26). This is called secondary victimisation. Thirty years ago, a chief constable might have said that rape victims had themselves to blame for wearing nice clothes; if he were to say that nowadays, he’d be sacked. Hogan-Howe’s view of bank fraud is just as uninformed, and just as offensive to victims.
About 5 percent of computers running Windows are infected with malware, and common bank fraud malware such as Zeus lets the fraudster redirect transactions. You think you’re paying £150 to your electricity company, while the malware is actually sending £9000 to Russia. The average person is helpless against this; everything seems normal, and antivirus products usually only detect it afterwards.
Much of the blame lies with the banks, who let the users of potentially infected computers make large payments instantly, rather than after a day or two, as used to be the case. They take this risk because regulators let them dump much of the cost of the resulting fraud on customers.
The elephant in the room is that the Met has been claiming for years that property crime is falling, when in fact it’s just going online like everything else. We’re now starting to get better crime figures; it’s time we got better policing, and better bank regulation too.
Ross Anderson FRS FREng
Professor of Security Engineering
University of Cambridge
There have been no arrests or charges for cybercrime events in the UK for almost two months. I do not believe that this apparent lack of law enforcement action is the result of any recent reduction in cybercrime. Instead, I predict that a multitude of coordinated arrests is being planned, to take place nationally over a short period of time.
My observations arise from the Cambridge Computer Crime Database (CCCD), which I have been maintaining for some time now. The database contains over 400 entries dating back to January 2010, detailing arrests, charges, and prosecutions for computer crime in the UK.
Since the beginning of 2016, there have been no arrests or charges for incidents that fit within the scope of the CCCD that I have picked up using various public source data collection methods. The last arrest was in mid-December, when a male was arrested on suspicion of offences under sections 1 and 2 of the Computer Misuse Act. Press coverage of this arrest linked it to the VTech data breach.
A coordinated ‘cyber crime strike week’ took place in early March 2015. In just one week, 57 suspects were arrested for a range of offences, including denial of service attacks, cyber-enabled fraud, network intrusion and data theft, and malware development.
Coordinated law enforcement action to address particular crime problems is not uncommon. A large number of arrests is ‘newsworthy’, capturing national headlines and sending the message that law enforcement takes these matters seriously and wrongdoers will be caught. What is less clear is whether one week of news coverage would have a greater effect than 52 weeks of a more sustained level of arrests.
Furthermore, many of the outcomes of the 2015 arrests are unknown (possibly indicating no further action has been taken), or pending. This indicates that large numbers of simultaneous arrests may place pressure on the rest of the criminal justice system, particularly for offences with complex evidentiary requirements.
This morning at 0930 the Joint Committee on the IP Bill is launching its report. As one of the witnesses who appeared before it, I got an embargoed copy yesterday.
The report is deeply disappointing; even that of the Intelligence and Security Committee (whom we tended to dismiss as government catspaws) is more vigorous. The MPs and peers on the Joint Committee have given the spooks all they wanted, while recommending tweaks and polishes here and there to some of the more obvious hooks and sharp edges.
The committee supports comms data retention, despite acknowledging that multiple courts have found this contrary to EU and human-rights law, and the fact that there are cases in the pipeline. It supports extending retention from big telcos offering a public service to private operators and even coffee shops. It supports greatly extending comms data to ICRs; although it does call for more clarity on the definition, it gives the Home Office lots of wriggle room by saying that a clear definition is hard if you want to catch all the things that bad people might do in the future. (Presumably a coffee shop served with an ICR order will have no choice but to install a government-approved black box, or just pipe everything to Cheltenham.) It welcomes the government decision to build and operate a request filter – essentially the comms database for which the Home Office has been trying to get parliamentary approval since the days of Jacqui Smith (and which Snowden told us they just built anyway). It comes up with the rather startling justification that this will help privacy as the police may have access to less stuff (though of course the spooks, including our 5eyes partners and others, will have more). It wants end-to-end encrypted stuff to be made available unless it’s “not practicable to do so”, which presumably means that the Home Secretary can order Apple to add her public key quietly to your keyring to get at your Facetime video chats. That has been a key goal of the FBI in Crypto War 2; a Home Office witness openly acknowledged it.
The comparison with the USA is stark. There, all three branches of government realised they’d gone too far after Snowden. President Obama set up the NSA review group, and implemented most of its recommendations by executive order; the judiciary made changes to the procedures of the FISA Court; and Congress failed to renew the data retention provisions in the Patriot Act (aided by the judiciary). Yet here in Britain the response is just to take Henry VIII powers to legalise all the illegal things that GCHQ had been up to, and hope that the European courts won’t strike the law down yet again.
People concerned for freedom and privacy will just have to hope the contrary. The net effect of the minor amendments proposed by the joint committee will be to make it even harder to get any meaningful amendments as the Bill makes its way through Parliament, and we’ll end up having to rely on the European courts to trim it back.
For more, see Scrambling for Safety, a conference we held last month in London on the bill and whose video is now online, and last week’s Cambridge symposium for a more detailed analysis.
I’m in a symposium at Churchill College on the Investigatory Powers Bill. It’s organised by John Naughton and I’ll be speaking later on equipment interference, a topic on which I wrote an expert report for the recent IP Tribunal case brought by Privacy International. Meanwhile I’ll try to liveblog the event in followups to this post.
Today I’m at the tenth Scrambling for Safety which is being held at King’s College London. Sorry, all the tickets are sold out, but there is a video feed available from the Open Rights Group website.