5 thoughts on “FIPR 20th birthday”

  1. Fleur Fisher introduced the FIPR 20th birthday seminar, talking about how, in the mid-1990s as head of ethics at the BMA, she realised that the collection of medical data was starting to amount to a virtual naked patient. Merlin Erroll followed with a perspective on decades of debates in Parliament, and I gave an introductory talk on the history of FIPR, including the slides from our 10th birthday seminar a decade ago. What has really changed over the past ten or twenty years? Understanding that is today’s task.

    Duncan Campbell was next, showing and commenting on slides from the Snowden revelations. His first remark was on the code names given to the efforts by the NSA and GCHQ to undermine open security standards, namely Bullrun and Edgehill – both battles from each country’s civil war. He went on to discuss the systems and teams for penetrating target defences, cryptanalysis and supply-chain interdiction. His talk title was “100 years of data stealing”; it started with a wireless intercept station in Scarborough in 1912, a location that’s now a major centre for authorised hacking. The many serious issues include access to the Microsoft Cloud, which compromises the systems of the House of Commons, whose members were kept in the dark. Fundamental issues include the integrity of cryptosystems (the issue that launched FIPR), where we won, and mass surveillance, where they won.

    Ian Levy from GCHQ was next, with the right to reply. At present we have “angels” vs “demons”, two narratives of people who preserve privacy and people who undermine it. The world has changed a lot since Snowden, with a massive uptick in good encryption, provided free by service companies who have also invested massively in the security engineering of the software they sell or give away. The scale of data going over commodity systems is now also much greater than five or six years ago. Investigations are now different; instead of attaching crocodile clips to someone’s phone line, you have to deal with mobile phones. Crypto has become so mainstream that the chances of hiding a backdoor in crypto are pretty remote. Plenty of good universities have access to FIBs (focused ion beam workstations), and software reverse engineering tools are hugely better. Content isn’t going to be there all the time, so metadata matters more, as does CNE (computer network exploitation). The big change that follows inevitably is a step change in vulnerability markets. Intelligence agencies worldwide will pay huge money to whack your phone. In fact, NCSC is one of the top bug bounty collectors, and we’re trying to use export control laws to stop people selling vulnerabilities to oppressive states. We need more nuanced views of these issues. We should be concerned about other countries passing laws that change the weather. What the UK says to the big companies doesn’t matter; we’re only 60 million people, and nobody will change their business model for us. But what China and India say may very well change the weather, and in ways that might not be to our advantage. Oh, and do you think the big service firms can defend themselves against the intelligence services of countries like Russia? US law prohibits discrimination against US persons on the basis of their country of birth. Recall that China compromised Adobe code signing, and for a while you couldn’t tell what software was Adobe’s and what was Chinese. When talking about exceptional access, think hard about scale: if there were a magic key that would give access to every message forever, the UK government would not want that. We may never agree on everything but we need a public debate that’s informed by reality.

    Julian Huppert was involved as an MP in the 2010–15 coalition in surveillance matters including the comms data bill and the DRIP Act. Nick Clegg got the comms data bill spun out into a separate bill, much to the disgust of the Home Office, which feared that people might criticise it or ask for changes. The draft bill committee duly criticised it at length and in depth. The three things the Home Office was asking for were IP address matching, web logs, and the collection of traffic data from service firms such as Google and Facebook. That the committee which slammed the bill contained a former cabinet secretary is telling. The revised bill was finally killed off by Nick Clegg, which made Theresa May furious. The IP matching piece appeared in other legislation but hasn’t been implemented, as it’s too hard. Then there was Snowden: we now know how much was already available. The perplexing thing is why there was so little discussion of Snowden among the public, or for that matter in parliament; an example of the bizarre tone was the committee asking Alan Rusbridger whether he loved his country. In Britain people think of James Bond when intelligence comes up; in the USA they think of overreach and the constitution. As for the DRIP Act, he thought it was the best they could get. The IP Act is bad, but as it lists the powers explicitly we can target them for future amendment or repeal, and we can push for the Investigatory Powers Commission to go further. We have to understand the political dynamics of the ratchet, and figure out how to push back on that. The public unfortunately see privacy vs security as a binary choice, which we know is a false one; we’re nowhere near the boundary and can improve both. Understand the “Overton window” of things you can reasonably say, and make sure there are people arguing at our edge. And the agencies are actually better at assessing risk than other government departments. It was encouraging, though, that IPCO refused to hear GCHQ evidence in private.

    I didn’t liveblog the panel session as I was on it, but you should be able to find the video here.

  2. Onora O’Neill kicked off the afternoon with a philosophical approach. The data protection approach may no longer be an effective way of protecting personal data, as the boundary of the personal is becoming blurred. In the old days, a computerised medical record was clearly private, and an exaggerated debate on consent got going, which led to completely bogus ideas of consent, including the presumed and coerced varieties. However, sensitive data now get into the public domain by inference rather than by disclosure, whether malicious or merely inadvertent. So despite the huge effort that went into refurbishing data protection law, Onora thinks that within a decade we’ll need something different. It’s worth remembering that concepts like privacy, security and liberty don’t exist in that form in the Old Testament, and these concepts do not tackle the main ethical issues that arise in communication. We need to look at speech acts, not speech content: at fraud, at malice, at bullying and so on. In short, neither privacy nor security is sufficient; we should think in terms of honesty, and of the accessibility and intelligibility of data rather than just “transparency”. We have made some progress with the private harms caused by data abuse, but little with the public harms such as attempts to swing elections in other jurisdictions. Many of these appear tied up with anonymity; she leaves that as a provocation.

    Eileen Munro did some work for FIPR on the privacy of children and their parents. Clause 8 of the Children Act 2004 provided for a national database to track children’s progress and development; officials tried to have all the health, social care and other data in there, but parliament wouldn’t have it, so they settled for contact links instead. The system actually came from the Home Office, which wanted to pick up “future menaces to society”. Officials waited for a tragedy, and when Victoria Climbié died they used her death in a totally cynical way, despite the fact that the Climbié report did not show any lack of information. This led to the illogical “Every child matters” paper, which bundled together many problems in a way that misled many readers. FIPR wrote a report debunking the policy, and when the coalition government came in it was dropped. A broader question, which has not gone away, is the fascination with predictive analytics, which still don’t work. Starting with a very biased set of data, loaded with the current prejudices, will just embed them. New Zealand talked of building such a system and there’s a big debate in Victoria, Australia. The biggest causative factor for child protection issues is poverty, and we have good data on that. To Eileen’s eye it is completely wrong for governments to use expensive analytics to blame families living in extreme poverty rather than helping them. There’s a lot of money to be made out of analytics, and so the problem is still there; more subtle than the crudity of ContactPoint, but it has not gone away.
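
    Eileen’s point about biased data can be made concrete. Here is a minimal sketch in Python (all the data, rates and field names are invented for illustration): if historical referral decisions over-sampled poor families, even the simplest model trained on those labels reproduces the bias, although actual risk is identical across groups by construction.

```python
# Toy illustration: a "risk score" trained on biased referral history
# just learns to flag poverty. All data and rates below are invented.
import random

random.seed(1)

# Simulate historical case records. Actual risk is independent of
# poverty here, but past referrals happened far more often for poor
# families -- the prejudice is baked into the labels.
records = []
for _ in range(10_000):
    poor = random.random() < 0.3
    actual_risk = random.random() < 0.05                    # same base rate for all
    referred = random.random() < (0.40 if poor else 0.05)   # biased labels
    records.append((poor, actual_risk, referred))

# "Train" the simplest possible model: P(referred | poverty status).
def referral_rate(rows):
    return sum(r[2] for r in rows) / len(rows)

p_poor = referral_rate([r for r in records if r[0]])
p_not_poor = referral_rate([r for r in records if not r[0]])

print(f"model would flag poor families:     {p_poor:.0%}")
print(f"model would flag non-poor families: {p_not_poor:.0%}")
# Output is roughly 40% vs 5%: the model reproduces the old referral
# bias even though actual risk was identical across groups.
```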

    Jen Persson founded Defend Digital Me in 2015 to build on FIPR’s work and is currently campaigning on children’s privacy. In 2013 the National Pupil Database was opened up to third parties, but there was no information in the public domain; it was just made available to companies, journalists and others, with over 1,000 data releases since then and no ability to audit what’s happened to the data. For example, the Telegraph was given 9 million children’s data in 2013, and the department doesn’t know which children. The department refuses subject access requests by children, despite this being flagged as an issue by Patricia Hewitt as early as 1977. These data are being used to develop predictive tools, just as was feared for ContactPoint, and will be used to identify children at risk of becoming NEET, or being excluded from school, or becoming a gang member. This is now applied in 50 London schools with no oversight and no consideration of the likely effects. This is not academic work, but production use of data in real life. Of the 400 data items, only four are consensual (among them country and date of birth), and these are now used to create a “hostile environment” by the Home Office following a secret agreement in 2015. Following a boycott, the government now says that country of birth won’t be given to the Home Office, but schools haven’t been told. This causes direct harm: for example, schools asking nonwhite children to bring in their passports. The rest of the 400 data items are largely hearsay and opinion, touching things like mental health and whether the child is thought to be an offender. This then gets fed into predictive criminological models. The final problem is the wide adoption of “safeguarding” software from about 15 companies, some of which do keylogging that monitors not just the kids’ homework but their private use of their computers. They say they’re looking for suicide risk, self-harm risk, and radicalisation risk. Nobody knows which keywords are being used, but some 20,000 words (including words in other languages) raise alerts which teachers don’t know how to remove. For example, a kid who looked up black rhinos is now flagged as a gang member. What does the future look like? Well, some ministries are really keen on personalising stuff right down to the individual, and even use the national pupil number as an ID number past the age of 18 – giving it to outside firms. Yet the people promoting AI have the ear of government, are writing the textbooks, and are using government money to search for evidence supporting their business model.
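
    The black-rhino example shows just how crude the keyword matching is. Here is a minimal sketch of the failure mode (the word list, categories and code are invented; the real lists of some 20,000 terms are not public):

```python
# Sketch of naive keyword alerting of the kind described above. The
# watchlist and categories are invented; vendors' real lists are secret.
WATCHLIST = {
    "gang": "gang involvement",
    "rhino": "gang involvement",   # street slang collides with zoology
    "blade": "weapons",
}

def scan(text):
    """Return (keyword, alert category) pairs for any watched word."""
    return [(w, WATCHLIST[w]) for w in text.lower().split() if w in WATCHLIST]

homework = "facts about the black rhino and its conservation in Kenya"
print(scan(homework))
# [('rhino', 'gang involvement')] -- a wildlife essay raises a gang
# alert, and with no context check the flag sticks to the child's record.
```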

    Sam Smith of MedConfidential does informed consent: both the informed part and the consent part. He’d prepared some remarks in response to Ben Laurie of DeepMind, who couldn’t come. At a recent AI conference, people were discussing how not to get blamed for doing something stupid; telling upset people that you have a mathematical proof that you didn’t break the law by feeding your data to an AI is not a sensible approach. Weasel words from your PR consultant may work better. In short: if you just talk about maths and not about humans, you’re only doing security theatre. As for the GDPR, it’s largely a clean-up operation, and the NHS needs a lot of clean-up. They waited until the last minute to check with the ICO what MedConfidential had been telling them for eighteen months, namely that their practices were contrary to the GDPR, which led to a panic over the reuse of health data. Decision-making processes are anyway pathetic: the NHS issued guidance on messaging, but pulled it when they got annoyed at the number of WhatsApp lobbyists who pestered them. This all shows you what the government really cares about. The pointed question here may be “Do you believe in the rule of law?” Go read Tom Bingham’s The Rule of Law, which will explain to you very clearly why any AI you use to take government decisions must be able to explain what it’s doing. And when it comes to the re-identification of data, we should all have listened to Baroness O’Neill a lot more.
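
    Sam’s rule-of-law point has a simple technical expression: a decision process used by government should return its reasons along with its result. A minimal sketch (the rules, thresholds and types are invented, not anything the NHS or MedConfidential actually uses):

```python
# Sketch: a decision that carries its reasons, versus a bare score.
# Rule names and thresholds are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Decision:
    outcome: str
    reasons: list   # every rule that fired, so the decision can be
                    # explained and challenged under the rule of law

def assess(record):
    reasons = []
    if record.get("missed_appointments", 0) > 3:
        reasons.append("more than 3 missed appointments")
    if record.get("flagged_by_school"):
        reasons.append("flagged in school report")
    return Decision("review" if reasons else "no action", reasons)

print(assess({"missed_appointments": 5}))
# Decision(outcome='review', reasons=['more than 3 missed appointments'])
# An opaque model that emits only a number gives the citizen nothing
# to contest -- which is Bingham's point applied to software.
```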

    Andreas von Heydwolff was the last speaker of the health and social care session, talking of issues with IT in Austrian healthcare. He came across our work in the mid-1990s and realised that the abuses we documented here would eventually turn up in Germany and Austria, as indeed they have. Data collection started with A Lehmann in 1856, who started an address directory that’s now the Austrian phone book; it was a public-private partnership and his office was in the police department. Health IT started in the 1990s, with insurers boasting that they had their machines on the same premises as the Ministry of Finance. A study of smartcards described the scheme as “a solution searching for a problem”, but by 2005 it was up and running. Technically it works quite well, with 150m patient contacts a year; but in Germany a similar system has just been scrapped after years of work, at a cost of €1–2bn, as the future is now smartphones. Electronic health records have been “coming” for 10 years and the physicians’ chamber wants them to be searchable; he’ll have to use them from next year, which will cost him an extra hour a day. Other things that bother him include the government saying they want to open up databases to research (after protests they said the health minister, or an ethics committee, would decide what data releases to make); there’s no effective freedom of information law; there’s no will to change this; and the right-wing party is in bed with Putin. Finally, he has no idea what to do with his phone; his patients call him, and the phone company was bought by Hutchison, so maybe the data will end up in Hong Kong.

  3. Wendy Grossman started our final session, noting that the Irish vote on abortion caused her to recall an Irish politician saying that people were not illiberal, just complicated. This applies to privacy; it’s not that people don’t care, but that their attitudes are complicated because of the many issues involved. The Internet encapsulates cultural values, and while the version we have today was built by American companies, the Internet of Things is being built by Asian firms who have somewhat different values. We had hoped over the last 20 years that politicians would learn about the Internet and start passing sensible laws, but this has not happened; the social expectations of the 20-somethings are different from the boomers’. The Overton window of surveillance has shifted, and kids no longer think it odd to use fingerprints to buy school lunches. The predictions of 50 years ago are often hilarious, failing to predict social changes such as in the status of women. Innovations such as GPS and Google Maps in cars are already changing things; some people think that a home without Alexa is broken. What will society look like once our brain implants are networked? Her synopsis: it gets harder from here.

    Anthony Finkelstein is the Chief Scientific Adviser at the Cabinet Office. He funds research related to national security and works with colleagues on interdisciplinary problems. National security can’t all be done in the dark; some necessary things require broad political consent. There are nontrivial threats to national security, including attempts to undermine democracy, and in fighting such threats the most important factor is trust and consent. Security agencies have particular abilities to intrude on citizens and so need strict control; he thinks the IP Act has improved the situation and we’ve played a useful part in the debate around it. The key things are proportionality and necessity. The real problems include the fragmented and diverse nature of the datasets used; the agencies are in no better shape than anyone else when it comes to cruft and legacy. He believes there is a long-term tradeoff between more access to data and more intrusion into the affairs of individuals. Nonetheless he’s an optimist for technical solutions that will enable us to push the intrusive boundary further away.

    Bill Thompson notes that the three horsemen all start with IP: investigatory powers, the IP protocol, and intellectual property. The IP protocol was designed for a community of good actors; yet we’re now in an arms race of surveillance technologies, with all sorts of unpleasant side effects that nobody predicted. For example, advertising has trashed the press, and he helped this along by advising Alan Rusbridger that the Guardian should be free online. Evidence is scant; we heard from Sam Smith how our medical records can end up in all sorts of strange places, and the NHS can’t show the contrary. Perhaps we need to try to fix this further down the stack. Can we have the imagination to design a better Internet for tomorrow? It’s not just 20 years since FIPR but 20 years since Lessig wrote Code. Did we make some wrong choices earlier? He thinks that Tim Berners-Lee’s decision to make the web stateless was probably the worst, even though it was done for reasons that were good at the time. Perhaps he should have used the distributed system architecture we were working on in Cambridge at the time; then perhaps the protocol issues wouldn’t subvert trust. Organisations such as the BBC should help create a space between the market and the state where we can build community. Younger members of the audience may not remember “radio and television”, but they were once popular. In 1926 the BBC was not chartered to make programmes but to develop wireless technology in the national interest. Perhaps the future is not so much about making programmes as creating a new kind of platform or social framework.
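
    Bill’s point about statelessness is easy to make concrete. A minimal sketch (the handler and token scheme are invented, not any real framework): HTTP gives the server no memory between requests, so continuity had to be bolted on by having the client echo a token back each time; that token is the same hook that tracking later grew from.

```python
# Sketch: a stateless server forgets each client, so continuity must be
# bolted on via a token the client echoes back. All names are invented.
import uuid

SESSIONS = {}   # server-side state, keyed by a client-held token

def handle_request(cookie=None):
    """Serve one stateless request; return (cookie, response body)."""
    if cookie not in SESSIONS:
        cookie = str(uuid.uuid4())    # mint a token the client must echo
        SESSIONS[cookie] = 0
    SESSIONS[cookie] += 1
    return cookie, f"visit number {SESSIONS[cookie]}"

# Without echoing the token, every request looks like a stranger:
print(handle_request()[1])   # visit number 1
print(handle_request()[1])   # visit number 1 again

# Echoing it restores continuity -- and creates a tracking handle:
cookie, _ = handle_request()
print(handle_request(cookie)[1])   # visit number 2
```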

    Guy Herbert adds a fourth horseman: intended philanthropy. The road to hell is paved with good intentions! CS Lewis once said it is better to be ruled by robber barons than by omnipotent busybodies. It was good Whitehall practice to get the exemptions in before the GDPR came in; we have broad data sharing for many innocent and well-meaning purposes, but as we saw in the last session these can lead to malignant sharing of data; sharing doesn’t have to be malevolent to be malignant. If you don’t fit the bureaucratic categories, tough. The title of this session is “From personalised shopping to personalised warfare”; he’s not interested in the extremes as much as the stuff in the middle, which amounts to personalised law. There’s been a great expansion of targeted legal measures from ASBOs to public space protection orders; coupled with changes in the ways the authorities handle information, we find data becoming reified, data becoming fact, and technology enabling individualised bureaucracy. His phrase is “tyranny as an emergent phenomenon”; we’re monitored more and more because we can be, and this leads to the presumption that monitoring is a good thing. The outsourcing of surveillance has the additional pernicious effect of adding a profit motive and causing the service providers to overreach as a precaution. Pressure on Facebook to take down Daesh propaganda may not be effective but does increase censorship; know-your-customer rules deprive innocent people of access to banking. We cannot predict the effects because of chaotic interactions, and gaming on every side leads to institutionalised corruption. We run the risk of hemming ourselves in on all sides, from the best of intentions, but ultimately disrupting society. That is the challenge for the next ten years, or perhaps twenty, or perhaps 100.

    Erich Moechel was the last speaker of the day. He discussed information operations by Iran against Saudi Arabia, and by North Korea against various commercial and other targets. He argued that cyber-war would be a great leveller, with relatively weak states and substate actors being able to cause substantial mayhem. As I lent my laptop to display his slides, I could not blog his comments in detail.

    I chaired the discussion and so could not blog it. See the video!
