On Friday I went to a fascinating lobbying meeting on the new EU data protection regulation. Europe is by default the world’s privacy regulator, as America doesn’t care and no-one else is big enough to matter; so this is really important. Some 3000 amendments have been proposed and the regulation is in the final stages of the committee process; the rapporteurs of the various parties are negotiating compromise amendments which should be ready for a vote within weeks. So the pressure is really on.
Friday was extraordinary because all the lobbyists came together in one room to argue their cases. This is because the liberal shadow rapporteur Alexander Alvaro was injured in a car crash last month, so Sarah Ludford, a London MEP, took over at the last minute. Normally lobbyists see MEPs singly or in small groups, but as time was short Sarah called a mass meeting at Europa House in London. So we all got to hear what the others were pushing for. Campaigners for open government say we’d have better laws if more of the process were public; here’s an example where that happened (literally) by accident.
I am posting my notes of the meeting here, as it’s a good case history of how lobbying works, as well as of how our privacy is being lost. There were about 100 people present, of which only 5 were from civil society. Most were corporate lobbyists: good-looking, articulate and impressive, but pushing some jaw-dropping agendas. For example the lovely lady from the Association of British Insurers found it painful that the regulation might ban profiling that was unfair or discriminatory.
Sarah Ludford DPR meeting, 3–6 PM, 26/4/13, Europa House, Smith Square.
Sarah says the shadows’ meeting has not so far gone through the articles in order but cherry-picked; started with articles 2, 3, 81, 83. LIBE committee now realising they have to go back to fundamentals. Sympathises with Beith and McNally at Justice SC and ministry who wrote the brief for the Council and are also Lib Dems.
1. Risk and context-based approach
David Smith of the ICO wants to remove a lot of the prescription in the proposals about how many data protection officers a business should have and the documentation it should keep. Art 6 controller and processor obligations: should just say “appropriate”.
Emma Butler, Reed Elsevier: risk ties to many things in the proposal, so it should articulate what risk is (tangible harm, societal harm).
Mumbling Frenchman (Schwarzbert?) about virtues of profiling.
Razvan Antemir, EMOTA, ecommerce association in Brussels. Small businesses, excessive to have a DP officer for firms under 10 people. Members already pay for all sorts of trust marks.
Leo Baumann, Nokia and trade association. Location stuff burdensome as they don’t want to have to document millions of processing operations. Need to define what’s deemed to be risky.
Jeremy Wilson, BA and Assoc Eur Airlines and IATA. Risk-based vs context.
Yves Schwarzbert, Advertising Association: ditto; this is a blanket, binary approach which defines almost all data as personal data; need a more granular approach so businesses can enhance their approach to protection. Pseudonymisation?
James Nation, CBI, wants to maintain discretion and proportionality; worried about the Commission’s power to come in later by delegated acts.
2. Anonymisation. Recital that anonymised data out of scope. ICO guidance.
Ross Anderson: Anonymisation doesn’t work. Example of postcode and date of birth, and encrypted ANPR data. How to square increase or decrease in risk? Best ensure that liability remains.
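The postcode-and-date-of-birth point can be made concrete. Below is a minimal sketch with entirely invented names and records, showing how an “anonymised” dataset that retains those two fields can be re-identified by a simple join against any public register that also holds them:

```python
# Illustrative sketch: "anonymised" records that keep postcode and date of
# birth can often be re-identified by joining against a public register.
# All names and records here are invented for illustration.

anonymised_health = [
    {"postcode": "CB3 0FD", "dob": "1961-05-14", "diagnosis": "diabetes"},
    {"postcode": "SW1P 3HQ", "dob": "1978-11-02", "diagnosis": "asthma"},
]

public_register = [  # e.g. an electoral roll or a marketing list
    {"name": "A. Example", "postcode": "CB3 0FD", "dob": "1961-05-14"},
    {"name": "B. Sample", "postcode": "SW1P 3HQ", "dob": "1978-11-02"},
]

def reidentify(health, register):
    """Link records on the (postcode, dob) quasi-identifier pair."""
    index = {(p["postcode"], p["dob"]): p["name"] for p in register}
    return [
        {"name": index[(r["postcode"], r["dob"])], **r}
        for r in health
        if (r["postcode"], r["dob"]) in index
    ]

for match in reidentify(anonymised_health, public_register):
    print(match["name"], "->", match["diagnosis"])
```

The same linkage works at national scale, since postcode plus date of birth is unique or near-unique for most of the population; hence the argument that liability should remain with whoever releases the data.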
Anna Fielder, Privacy International: best to leave out pseudonymisation! As for definitions, best leave them broad and put any exceptions in articles further down. Even things like IP addresses have been used to single people out!
Brad Bryant, Aon. Need to process sensitive health data for a lot of things and it’s hard to pseudonymise it. But we need it for modelling. We want an incentive to pseudonymise.
Kimon Zorbas, IAB Europe: Harmonise definitions across Europe, such as IP addresses. Defining anonymous/pseudonymous data would provide clarity. Germany does, and its definition is stricter than other countries’. Be realistic about regulators; stop fights about what is or is not pseudonymisation.
Emma Ascroft, Yahoo: some data very personal such as name and dob, while other data let you single out a user. The use of pseudonymous data is privacy-enhancing for users but the incentives are not there. Need (a) definitions for everybody (b) legal basis – consent not
workable [you have to authenticate the user which means collecting personal data] (c) data subject rights such as access; all linked to article 20 on profiling which seems to make consent the only basis for profiling, which in turn makes many business models illegal
Mark Purvis from Weber Shandwick, representing MasterCard. Lighter regime for pseudonymised data. Right of access 15, rectification 16, be forgotten 17, portability 18. [Sarah Ludford: art 10 already says that if data don’t allow identification, you don’t have to collect more information just for compliance].
Ian Roy, Telefonica UK: incentivise PETs, Emma describes well. Wants definition to exclude pseudonymous data [Sarah Ludford: that is not going to happen!]
Beth Thompson, Wellcome: “singling out” is a big issue for medical research as in clinical trials you have line data with a line for each patient. Duty on controller / regulator to make anonymisation robust. Code makes it complex
Unknown speaker – “singled out” adds nothing.
David Smith: still doesn’t know what personal data is after 20y. Don’t tinker with the definition; it’s just the way we operate that has changed. Art 29 WP has had a view on what “identify” means. A cookie singles you out as it gives you different ads; that’s “identifiable information” and “personal data”. Are IP addresses pseudonyms? No. But do you apply the full data protection regime? Disproportionate. Pseudonymise? If the same person holds the key it may help security but not protection. It’s a big range, mindblowing, and struggling with
definitions won’t help. Apply a risk based approach and be proportionate.
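David Smith’s caution about pseudonyms has a simple technical basis. A sketch (with a made-up address) of why a plain hash is a weak pseudonym for data drawn from a small space such as IPv4 addresses: anyone who can enumerate the input space recovers the original by exhaustive search, since only about 2^32 IPv4 addresses exist.

```python
# Sketch: hashing an IP address is a weak pseudonym, because the input
# space is small enough to search exhaustively. The address used here is
# invented for illustration.
import hashlib

def pseudonymise(ip: str) -> str:
    """Unkeyed hash used as a naive pseudonym."""
    return hashlib.sha256(ip.encode()).hexdigest()

token = pseudonymise("192.168.1.77")   # the stored "pseudonym"

def brute_force(token):
    """An attacker who suspects the address lies in 192.168.1.0/24
    simply tries every candidate until the hashes match."""
    for last in range(256):
        candidate = f"192.168.1.{last}"
        if pseudonymise(candidate) == token:
            return candidate
    return None

print(brute_force(token))  # recovers the original address
```

This is why holding the key matters: a keyed construction such as an HMAC with a secret key defeats this search, but then whoever holds the key can still re-identify everyone, which helps security rather than protection.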
Emma Ascroft: important to define it to get incentives even if hard. Echoed by French guy.
James Leaton Grey, BBC: has many ways to get consent, and is told that their mechanisms aren’t enough for the Dutch. In any case you get only the computer’s consent.
Michael Smith, Experian: consent can’t be the be-all and end-all as the legitimate interest must also work. Credit reference essential for protecting people from excessive credit, and fraud, and free flow of credit. Many statutory provisions for consumers’ rights. Legitimate interest must not be curtailed
James Milligan, Direct Marketing Association. As Richard Thomas said
we’re turning ourselves into a nation of liars with opt-in boxes. Need multiple levels for the uninterested, the curious and the experts.
Anna Fielder, Privacy International: the proposals are only a small change in a landscape that almost everyone accepts isn’t working well. Consent technologies are evolving too.
Robbie Williams, BEERG, employee relations: HR will be made impossible by art 7(4) which rules out consent where there is a significant imbalance. Sarah Ludford: consent must be “freely given”
Fabrizio Ferraro, IG: art 17 and right to be forgotten – what’s workable solution for financial services?
Heather Wallet, Barclays: if no legitimate interest, then need carve-out for preventing and detecting fraud and crime.
Andrew, Equifax: important for legitimate interests of third parties
Zoltan Precsenyi, Symantec: can’t build an information society from the presumption that processing data “isn’t permitted unless”
Victoria Eva, Pearson: prescriptive stuff, burden of proof, means consent requirements hard to deal with in contexts such as education as parents have to consent to most of what kids do. This isn’t right.
Kim Smouter (?), Esomar, marketing association: some countries treat consent as written so we can’t do phone surveys. JURI said consent should be appropriate
Ed Simpson, Finance and Leasing Association: conflict between the draft and other regulations requiring we know our customer.
Juergen Baensch, ISFE, Eur videogame industry: parental consent in article 8; advocates retention of age 13 to be consistent with the US. But parental consent should not be applied where there are “legitimate interests”
Matilde Fiquet, FEDMA Eur direct marketing: need to extend legitimate interest to third parties.
Ross Anderson: be careful in two cases: legitimate interest of more distant third parties; OK for a third party who subcontracts for the data controller but not OK to sell my data to fifty behavioural advertising firms. Second problem is monopolies which are pervasive in information industries
Julien Fafleur, World federation of advertisers: lawful processing without prejudice to art 10 (not asking for more data for compliance)
Katy Thomas, GMC: the right to be forgotten should not apply to medical records, in relation to possible litigation; and doesn’t want consent given as a choice between two unattractive options to be seen as freely given.
Kathryn Whelan from Intel/McAfee: welcomes amendments around security as a legitimate interest. Can have security without privacy but cannot have privacy without security.
Helen Wallace: if genome data are retained, particularly by the state, then the proposed amendments on health data are hazardous as genetic data are biometric.
Rachel Merrett, BMA: supports that forgetting must not apply to medical records; and explicit consent applies already to med data so would prefer “explicit” to “meaningful”
James Nation, CBI again: article 5f duty to demonstrate compliance “for each processing operation” cannot stand.
Anna Hansell, Imperial College London: explicit consent hard for med research on records of millions of people, to whom we can’t go back for consent. Already tightly regulated under UK law (balance against GMC explicit consent)
Beth Williams, Wellcome: move to broad consent will allow use of consent more often but also need another legal basis.
Emma Ascroft about legitimate interest and pseudonyms again.
Kim Smouter, ESOMAR(?) – e-privacy directive still applies.
3. Data subject rights arts 11-19
Jennie Weaver, ABI, art 12: lack of proportionality compared to DPA around rigid timeframe to provide data, exempting third-party data.
Ed Simpson, Finance and Leasing: wants lenders to charge a small fee to deter vexatious inquiries by claims management companies.
Mita Mitra, BT: agrees on a nominal fee to curb excessive requests as things get more electronic. General worry about authentication and about portability (where the new right seems aimed at social networks)
Anna Fielder: her husband was the victim of ID theft last year and had to make subject access requests of fifteen companies. The total bill came to over £250. Some companies don’t charge for subject access, and don’t suffer. A fee is also an issue for pensioners and others.
Unknown: think of sensitive business data.
Unknown: currently the controller can ask the data subject for help to find the data, and this seems to have fallen by the wayside.
Gordon Nardell, General Council of the Bar: a large number of amendments have been tabled to articles 14 and 15 by legal professions elsewhere in Europe, which would be too broad as they remove all obligations on people under professional secrecy (so they would extend to doctors and priests as well as lawyers). If person A tells a lawyer to do something nasty to person B, what happens if B makes a subject access request?
Client-attorney confidentiality is protected anyway; the concern is that people park nasty stuff with lawyers as a loophole. The Bar Council takes the view that this should be done by national derogations under 21(5)f, which could be strengthened. Can we please do this in a compromise amendment?
Unknown: how do you square the right to be forgotten with cookies? Figuring out what information is linked to a cookie is often very hard. The user can exercise this right himself much more easily by deleting the cookie, so there should be a derogation from article 17
(right to be forgotten) where the user can do this themselves.
Helen Wallace: we need to beware that data protection applies to police and other state bodies as well, and we have to be careful about the crossover (e.g. if your genome is in your med record, can’t be deleted, and the police demand access).
Michael Smith, Experian: be careful with profiling restrictions in article 20 as this would catch a lot of long-established and beneficial uses such as credit scoring, prevention of ID theft, etc.
+1: a Frenchman.
Mark Purvis, Mastercard: Firms like theirs also do fraud profiling under regulatory obligations and following guidelines of ECB, which should count as lawful processing.
James Milligan, UK Direct Marketing Association: ad targeting and profiling should only be limited insofar as it does actual harm.
Barry Ryan, UK Market Research Society: the text just talks about “the automatic processing of data” so will apply (e.g.) to the use of statistics in research – particularly if the Albrecht amendments prevail
Helen Wallace: profiling is not always helpful, such as health data being used to profile customers to deny them insurance.
Unknown: pseudonymous data could work here, and for website optimisation, so long as there’s a rule not to link back to clear data
Unknown: we have so much more data / analytics nowadays, and it’s not really an optional extra any more. Most doesn’t go anywhere, but some ends up being applied to individuals where rights such as transparency and access become important.
Tim Pethick, Saga/AA: 18 million customers, personalised marketing important, and to insurance in particular. If we couldn’t do risk/fraud detection it would add 4% to premiums.
Victoria Eva, Pearson: worried about restrictions on profiling for education products.
Jennie Weaver, ABI: relationship between policyholder profile risk and expected claims, so the attempts to expand profiling controls (such as those which forbid “unfair and discriminatory” processing) would be a pain
Mark Purvis, Mastercard: art 24 on joint controllers. Making MC a JC with most banks they work with in Europe implies equal access to subject and equal amounts of data, and equal liability for breach. The controller with the most contact with the data subject should be the main point of contact. (Sarah: art 24 says that joint controllers shall determine their joint responsibilities.)
Zoltan Precsenyi, Symantec: security essential for privacy but wants a short para obliging people to take appropriate measures; Sarah’s amendments for powers of national authorities would be fine provided they don’t diverge significantly. He prefers Alvaro’s amendment on that.
Siada el Ramly, European Digital Media Assoc: controller-processor responsibilities
Vincent Feiner, Unisys, is worried about art 82, the employment contract. They assess the cost to European business at €3.2bn, falling on HR functions; this is consistent with figures Lord McNally put in the House of Lords and has not been rebutted by the Commission. Delete article 82, as Voss suggests; otherwise it will lead to 20-odd pieces of legislation in Member States.
Leo Baumann, Nokia: amendments to arts 22-3 define the accountability principle, saying that companies take responsibility for their privacy approach; this is better than detailed requirements for documentation etc. that make it harder to do the right thing.
Rosina Robson from FSB, fed of small biz: risk-based approach please.
Unknown: same as Unisys above.
Yves Schwarzbert, advertising association: privacy impact assessments don’t do anything. If they’re in, best include legal privilege as it will be the same lawyers.
Anna Fielder: if you talk about increased accountability you need strong possibilities for redress and enforcement. Welcomes art 73, which allows associations to take up complaints on behalf of many subjects. It stops short of the collective redress which BEUC etc. want and which would be a great way of getting redress. Collective ADR exists but the vanilla version is out (thanks to JURI – the Frenchman says this was because consumers’ associations said going to court was too hard).
Unknown: duplicating work between controllers and processors increases costs without providing benefits to customers, especially in cloud situations
Unknown: art 26 is unduly prescriptive, so leave it to contract between controller and processor. Too prescriptive to require that the processor “hand over all the results to the controller”. What does this mean? Also 26(1)f – obligation of processor to assist controller in compliance.
Mita Mitra, BT: breach notification, need flexibility on art 31, 32 (don’t want notification fatigue).
4. Medical data – arts 81 and 83, for which compromise amendments are already agreed (1730–1800, after most others left)
Sarah Ludford: It was challenging to get them right. May have to revisit after doing things the wrong way round. Got agreement to some wording on art 81 that I supplied, to the extent that these are “necessary and proportional and foreseeable by the data subject”. There is talk of making a one-time consent sufficient, with a selective opt-out afterwards (or reliance on article 7). Safeguards in art 83 involve consent in para 2. National derogation for research with high public interest (Albrecht had “extremely high”). Research data should be anonymised or pseudonymised to the highest technical standards. We’ll curtail the power to adopt delegated acts so as to require public consultation and EDP (this might go into a general provision); and notification.
Ross: many but not all of our concerns about privacy in the NHS have been dealt with by the Caldicott committee, thanks to campaigning by people like Helen here. However that leaves private medicine, industrial medicine, prison and armed forces medical services, and also colleagues in NL, Austria, etc. We cannot say that just because the UK has more or less adequate regulation we can slack off and exempt medical stuff from data protection.
Beth Thompson, Wellcome: art 83 has moved some way on from Albrecht but the wording there is such that all Section 251 HSCA uses of medical data will become illegal. [Ross Anderson disagrees.] Beth says the DoH agrees. [Anna Fielder: art 21 also has an override for exceptional public interest]
Rachel Merrett, BMA: supports right to opt out but wants complete audit trail in direct care. Supports section 251 and wants to use identifiable data for research when needed but doesn’t want the current measures around consent to be weakened and has concerns about seeking consent just once (as
commissioning, research and 100 other secondary uses are getting more complex).
Helen Wallace, Genewatch: who is a researcher? The Helsinki declaration requires transparency, and people want to know which companies have access. They might be happy for universities to be doing it but not Google or other private companies. There’s a big difference between care data and research, especially once you’ve got genome stuff. People need to be able to get some stuff deleted.
Ross Anderson: agreed, and this goes to the legitimate interests of third parties discussed earlier. If the NHS has my genetic data and I become a suspect in a crime then the police can get my data; if I then get acquitted I have a right to block their access. The UK has a bad record of getting the police to delete DNA data and I need to rely on European law for this.
Sarah Ludford on Art83: ethics committee taken out yesterday. “Translational and clinical” phraseology and public-health stuff will be kept entirely in Art 81. “Health” as a specific concern out of Art 83. Amendment to 83(1)b about identifying data being kept separate and using the highest technical standards to prevent unwarranted re-identification.
Unknown: need bridge between art 5 and art 83, see Art29WP opinion 3/2/2013
Sarah Ludford: does not like the wording that processing shall not result in data being processed by others such as banking companies; this sort of wording should not appear in legislation. Maybe we should instead have “Consent or member state or union law”
Ross Anderson: research can be an enormous loophole. Does medical research stretch to encompass market research, and drive coach and horses through the whole regulation?
Sarah Ludford: wants to stick to broad horizontal principles
Helen Wallace: all this is precisely why we need consent.
Sarah Ludford: her husband was chair of diabetes assoc, by way of declaration.