Monthly Archives: August 2017

Is the City force corrupt, or just clueless?

This week brought an announcement from a banking association that “identity fraud” is soaring to new levels, with 89,000 cases reported in the first six months of 2017 and 56% of all fraud reported by its members now classed as “identity fraud”.

So what is “identity fraud”? The announcement helpfully clarifies the concept:

“The vast majority of identity fraud happens when a fraudster pretends to be an innocent individual to buy a product or take out a loan in their name. Often victims do not even realise that they have been targeted until a bill arrives for something they did not buy or they experience problems with their credit rating. To carry out this kind of fraud successfully, fraudsters need access to their victim’s personal information such as name, date of birth, address, their bank and who they hold accounts with. Fraudsters get hold of this in a variety of ways, from stealing mail through to hacking; obtaining data on the ‘dark web’; exploiting personal information on social media, or through ‘social engineering’ where innocent parties are persuaded to give up personal information to someone pretending to be from their bank, the police or a trusted retailer.”

Now back when I worked in banking, if someone went to Barclays, pretended to be me, borrowed £10,000 and legged it, that was “impersonation”, and it was the bank’s money that had been stolen, not my identity. How did things change?

The members of this association are banks and credit card issuers. Their narrative treats the people impersonated as the targets, when the real targets are the banks on which the impersonation is practised. This framing is a precursor to refusing bank customers a “remedy” for “their loss” because “they failed to protect themselves.”

Now “dishonestly making a false representation” is an offence under s2 of the Fraud Act 2006. Yet what is the police response?

The Head of the City of London Police’s Economic Crime Directorate does not see the banks’ narrative as dishonest. Instead he goes along with it: “It has become normal for people to publish personal details about themselves on social media and on other online platforms which makes it easier than ever for a fraudster to steal someone’s identity.” He continues: “Be careful who you give your information to, always consider whether it is necessary to part with those details.” This is reinforced with a link to a police website with supposedly scary statistics: 55% of people use open public wifi and 40% of people don’t have antivirus software (like many security researchers, I’m guilty on both counts). This police website has a quote from the Head’s own boss, a Commander who is the National Police Coordinator for Economic Crime.

How are we to rate their conduct? Given that the costs of the City force’s Dedicated Card and Payment Crime Unit are borne by the banks, perhaps they feel obliged to sing from the banks’ hymn sheet. Just as the Macpherson report criticised the Met for being institutionally racist, we might perhaps describe the City force as institutionally corrupt. There is a wide literature on regulatory capture, and many other examples of regulators keen to do the banks’ bidding. And it’s not just the City force. There are disgraceful examples of the Metropolitan Police Commissioner and GCHQ endorsing the banks’ false narrative. However, people are starting to notice, including the National Audit Office.

Or perhaps the police are just clueless?

History of the Crypto Wars in Britain

Back in March I gave an invited talk to the Cambridge University Ethics in Mathematics Society on the Crypto Wars. They have just put the video online here.

We spent much of the 1990s pushing back against attempts by the intelligence agencies to seize control of cryptography. From the Clipper Chip through the regulation of trusted third parties to export control, the agencies tried one trick after another to make us all less secure online, claiming that thanks to cryptography the world of intelligence was “going dark”. Quite the opposite was true; with communications moving online, with people starting to carry mobile phones everywhere, and with our communications and traffic data mostly handled by big firms who respond to warrants, law enforcement has never had it so good. Twenty years ago it cost over a thousand pounds a day to follow a suspect around, and weeks of work to map his contacts; Ed Snowden told us how nowadays an officer can get your location history with one click and your address book with another. In fact, searches through the contact patterns of whole populations are now routine.

The checks and balances that we thought had been built into the RIP Act in 2000 after all our lobbying during the 1990s turned out to be ineffective. GCHQ simply broke the law and, after Snowden exposed them, Parliament passed the IP Act to declare that what they did was all right now. The Act allows the Home Secretary to give secret orders to tech companies to do anything they physically can to facilitate surveillance, thereby delighting our foreign competitors. And Brexit means the government thinks it can ignore the European Court of Justice, which has already ruled against some of the Act’s provisions. (Or perhaps Theresa May chose a hard Brexit because she doesn’t want the pesky court in the way.)

Yet we now see the Home Secretary, along with law enforcement officials on both sides of the Atlantic, repeating the old nonsense about decent people not needing privacy. Why doesn’t she just sign the technical capability notices she deems necessary and serve them?

In these fraught times it might be useful to recall how we got here. My talk to the Ethics in Mathematics Society was a personal memoir; there are many links on my web page to relevant documents.

Compartmentation is hard, but the Big Data playbook makes it harder still

A new study of Palantir’s systems and business methods makes sobering reading for people interested in what big data means for privacy.

Privacy scales badly. It’s OK for the twenty staff at a medical practice to have access to the records of the ten thousand patients registered there, but when you build a centralised system that lets every doctor and nurse in the country see every patient’s record, things go wrong. There are even sharper concerns in the world of intelligence, which agencies try to manage using compartmentation: really sensitive information is often put in a compartment that’s restricted to a handful of staff. But such systems are hard to build and maintain. Readers of my book chapter on the subject will recall that while US Naval Intelligence struggled to manage millions of compartments, the CIA let more of their staff see more stuff – whereupon Aldrich Ames betrayed their agents to the Russians.
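
In access-control terms, compartmentation adds a need-to-know check on top of ordinary clearance levels: a reader must hold every compartment label attached to a record. Here is a minimal, purely illustrative sketch (the labels and roles are invented, not any real agency’s system); the hard part in practice is not this check but creating, tracking and revoking millions of such labels across people and organisations.

```python
# A purely illustrative sketch, not any real agency's system: a record is
# labelled with the compartments it belongs to, and a reader must hold a
# clearance for every one of those compartments ("need to know").
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Record:
    title: str
    compartments: frozenset  # hypothetical labels, e.g. {"HUMINT-RU"}

@dataclass
class Officer:
    name: str
    clearances: set = field(default_factory=set)

def can_read(officer: Officer, record: Record) -> bool:
    # Access requires clearance for *all* compartments on the record.
    return record.compartments <= officer.clearances

agent_file = Record("source reporting", frozenset({"HUMINT-RU"}))
case_officer = Officer("case officer", {"HUMINT-RU"})
traffic_cop = Officer("traffic officer", set())

assert can_read(case_officer, agent_file)
assert not can_read(traffic_cop, agent_file)
```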

After 9/11, the intelligence community moved towards the CIA model, in the hope that with fewer compartments they’d be better able to prevent future attacks. We predicted trouble, and Snowden duly came along. As for civilian agencies such as Britain’s NHS and police, no serious effort was made to protect personal privacy by compartmentation, with multiple consequences.

Palantir’s systems were developed to help the intelligence community link, fuse and visualise data from multiple sources, and are now sold to police forces too. It should surprise no-one to learn that they do not compartment information properly, whether within a single force or even between forces. The organised crime squad’s secret informants can thus become visible to traffic cops, and even to cops in other forces, with tragically predictable consequences. Fixing this is hard, as Palantir’s market advantage comes from network effects and the resulting scale. The more police forces they sign up the more data they have, and the larger they grow the more third-party databases they integrate, leaving private-sector competitors even further behind.

This much we could have predicted from first principles, but the details of how Palantir operates, and what police forces dislike about it, are worth studying.

What might be the appropriate public-policy response? Well, the best analysis of competition policy in the presence of network effects is probably Lina Khan’s, and her analysis would suggest in this case that police intelligence should be a regulated utility. We should develop those capabilities that are actually needed, and the right place for them is the Police National Database. The public sector is better placed to commit the engineering effort to do compartmentation properly, both there and in other applications where it’s needed, such as the NHS. Good engineering is expensive – but as the Los Angeles Police Department found, engaging Palantir can be more expensive still.

Cambridge2Cambridge 2017

Following on from various other similar events we organised over the past few years, last week we hosted our largest ethical hacking competition yet, Cambridge2Cambridge 2017, with over 100 students from some of the best universities in the US and UK working together over three days. Cambridge2Cambridge was founded jointly by MIT CSAIL (in Cambridge Massachusetts) and the University of Cambridge Computer Laboratory (in the original Cambridge) and was first run at MIT in 2016 as a competition involving only students from these two universities. This year it was hosted in Cambridge UK and we broadened the participation to many more universities in the two countries. We hope in the future to broaden participation to more countries as well.

Cambridge 2 Cambridge 2017 from Frank Stajano Explains on Vimeo.

We assigned the competitors to teams that were mixed in terms of both provenance and experience. Each team had competitors from both the US and the UK, and no two people from the same university; each team also mixed experienced and less experienced players, based on the qualifier scores. We did this so that even those who only started learning about ethical hacking when they heard about this competition had an equal chance of being on the team that won gold, and so that, during these three days, students collaborated with people they didn’t already know.
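
For the curious, that kind of balancing can be approximated by a snake draft over the qualifier scores, with a check that no team takes two players from the same university. The sketch below is purely illustrative (the Player fields and the draft logic are a simplification, not the allocation process we actually ran); the same clash check could be extended to balance US and UK provenance.

```python
# Illustrative only -- not the code actually used for C2C. Players are dealt
# into teams in a "snake draft" by qualifier score, so every team gets a mix
# of stronger and less experienced players, while avoiding two players from
# the same university on one team where possible.
from dataclasses import dataclass

@dataclass
class Player:
    name: str
    university: str
    qualifier: float  # score from the qualifying round

def assign_teams(players, n_teams):
    ranked = sorted(players, key=lambda p: p.qualifier, reverse=True)
    teams = [[] for _ in range(n_teams)]
    for start in range(0, len(ranked), n_teams):
        batch = ranked[start:start + n_teams]     # one draft round
        order = list(range(n_teams))
        if (start // n_teams) % 2:                # reverse alternate rounds
            order.reverse()
        free = list(order)
        for p in batch:
            # First free team with no university clash, else first free team.
            t = next((t for t in free
                      if all(q.university != p.university for q in teams[t])),
                     free[0])
            free.remove(t)
            teams[t].append(p)
    return teams
```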

Despite their different backgrounds, what the attendees had in common was that they were all pretty smart and interested in cyber security. It’s a safe bet that, ten or twenty years from now, a number of them will be Security Specialists, Licensed Ethical Hackers, Chief Security Officers, National Security Advisors or other high-calibre security professionals. When their institution or country is under attack, they will be able to get in touch with the other smart people they met here in Cambridge in 2017, and they’ll be in a position to help each other. That’s why the defining feature of the event was collaboration, making new friends and having fun together. Unlike your standard one-day hacking contest, the ambitious three-day programme of C2C 2017 allowed for social activities including punting on the river Cam, pub crawling and a Harry Potter-style gala dinner in Trinity College.

In between competition sessions we had a lively and inspirational “women in cyber” panel, another panel on “securing the future digital society”, one on “real world pentesting” and a careers advice session. On the second day we hosted several groups of bright teenagers who had been finalists in the national CyberFirst Girls Competition. We hope to inspire many more women to take up a career path that has so far been very male-dominated. More broadly, we wish to inspire many young kids, girls or boys, to engage in the thrilling challenge of unravelling how computers work (and how they fail to work) in a high-stakes mental chess game of adversarial attack and defense.

Our platinum sponsors Leidos and NCC Group endowed the competition with over £20,000 of cash prizes, awarded to the best 3 teams and the best 3 individuals. Besides the main attack-defense CTF, fought on the Leidos CyberNEXS cyber range, our other sponsors offered additional competitions, the results of which were combined to generate the overall team and individual scores. Here is the leaderboard, showing how our contestants performed. Special congratulations to Bo Robert Xiao of Carnegie Mellon University who, besides winning first place in both the team and individual rankings, went on to win at DEF CON with team PPP a couple of days later.

We are grateful to our supporters, our sponsors, our panelists, our guests, our staff and, above all, our 110 competitors for making this event a success. It was particularly pleasing to see several students who had already taken part in some of our previous competitions (special mention for Luke Granger-Brown from Imperial who earned medals at every visit). Chase Lucas from Dakota State University, having passed the qualifier but not having been picked in the initial random selection, was on the reserve list in case we got funding to fly additional students; he then promptly offered to pay for his own airfare in order to be able to attend! Inter-ACE 2017 winner Io Swift Wolf from Southampton deserted her own graduation ceremony in order to participate in C2C (!), and then donated precious time during the competition to the CyberFirst girls, who listened to her rapturously. All that good karma could not go unrewarded, and indeed you can once again find her name in the leaderboard above. And I’ve only singled out a few, out of many amazing, dynamic and enthusiastic young people. Watch out for them: they are the ones who will defend the future digital society, including you and your family, from the cyber attacks we keep reading about in the media. We need many more like them, and we need to put them in touch with each other. The bad guys are organised, so we have to be organised too.

The event was covered by Sky News, ITV, BBC World Service and a variety of other media, which the official website and Twitter page will undoubtedly collect in due course.