I’m at SHB 2010, which brings security engineers together with psychologists, behavioral economists and others interested in deception, fraud, fear, risk perception and how we make security systems more usable.
Today sees the publication of a report by Professor Trisha Greenhalgh into the Summary Care Record (SCR). There is a summary of the report in the BMJ, which also has two discussion pieces: one by Sir Mark Walport of the Wellcome Trust arguing that the future of medical records is digital, and one by me which agrees but argues that as the SCR is unsafe and unlawful, it should be abandoned.
Two weeks ago I reported here how the coalition government planned to retain the SCR, despite pre-election promises from both its constituent parties to do away with it. These promises followed our Database State report last year, which demonstrated that many of the central systems built by the previous government contravened human-rights law. The government’s U-turn provoked considerable anger among doctors, NGOs and backbench MPs, prompting health minister Simon Burns to promise a review.
Professor Greenhalgh’s review, which was in fact completed before the election, finds that the SCR fails to do what it was supposed to. It isn’t used much; it doesn’t fit in with how doctors and nurses actually work; far from making consultations shorter, it makes them longer; and the project was extremely badly managed. In fact, her report should be read by all serious students of software engineering; like the London Ambulance Service report almost twenty years ago, this document sets out in great detail what not to do.
Here is a liveblog of WEIS, which is being held today and tomorrow at Harvard. It has 125 attendees: 59% academic, 15% govt/NGO and 26% industry; the split of backgrounds is 47% CS, 35% econ/management and 18% policy/law. The paper acceptance rate was 24/72: 10 empirical papers, 8 theory and 6 on policy.
The workshop kicked off with a keynote talk from Tracey Vispoli of Chubb Insurance. In the early 2000s, the insurance industry thought cyber would be big. It isn’t yet, but it is starting to grow rapidly. There is still little actuarial data. But the industry can shape behaviour by occupying the gap between risk aversion and risk tolerance. Its technical standards can make a difference (as with buildings, highways, …). So far a big factor is the insurance response to notification requirements: notification costs of $50-60 per compromised record mean that a 47m-record compromise like TJX is a loss you want to insure! So she expects a healthy supply-and-demand model for cyberinsurance in the coming years. This will help to shape standards, best practices and culture.
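A quick back-of-the-envelope check of those figures (my own sketch, using the per-record costs and breach size quoted in the talk, not numbers she presented as totals):

```python
# Implied notification exposure for a TJX-scale breach,
# at the quoted $50-60 cost per compromised record.
records = 47_000_000          # roughly the TJX compromise
low_cost, high_cost = 50, 60  # dollars per record, per the talk

low_total = records * low_cost
high_total = records * high_cost
print(f"${low_total/1e9:.2f}bn - ${high_total/1e9:.2f}bn")
```

That works out to roughly $2.4-2.8bn in notification costs alone, which makes clear why firms would want to insure against it.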
Questions:

Are there enough data to model? So far no company has enough; ideally we should bring data together from across the industry at one central shared point. Government has a role, as with highways.

Standards? Client prequalification is currently a fast-moving target. Insurers’ competitive advantage is understanding the intersection between standards and pricing.

Reinsurance? Sure, where a single event could affect multiple policies.

There is tension between auditability and security in the power industry (NERC) – is there any role for insurance? Maybe, but legal penalties are in general uninsurable.

How do we get insurers to come to WEIS? It would help if we had more specificity in our research papers – if we did not just talk about “breaches” but “breaches resulting in X” (the industry is not interested in national security, corporate espionage and other things that do not result in claims).

Market evolution? She predicts the industry will follow its usual practice of lowballing a new market until losses mount, then cutting back coverage terms. (E.g. employment liability insurance grew rapidly over the last 20 years but became unprofitable because of class actions for discrimination etc – so the industry cut coverage, but that was OK as it helped shape best employment practice.)

Data sharing by the industry itself? Client confidentiality stops ad-hoc sharing, but it would be good to have a properly regulated central depository.

Who’s the Ralph Nader of this? Broad reform might come from the FTC; it’s surprising the SEC hasn’t done anything (HIPAA and GLB are too industry-specific).

Quantifiability of best practice? Not enough data.

How much of the business is cyber? At present it’s 5% of Chubb’s insurance business, but you can expect 8-9% in 2010-11 – rapid growth!
Future sessions will be covered in additional posts…
The coalition government plans to keep the Summary Care Record, despite pre-election pledges by both the Conservatives and the Liberal Democrats to rip up the system – which is not compliant with the I v Finland judgement of the European Court of Human Rights.
Last year colleagues and I wrote Database State, a report for the Joseph Rowntree Reform Trust, which studied 46 systems that keep information on all of us, or at least a significant minority of us. We concluded that eleven of them were almost certainly illegal under human-rights law, and most of the rest had problems. Our report was well received by both Conservatives and Lib Dems; many of its recommendations were adopted as policy.
Old-timers may recall that back in 1996-7, many of us geeks supported New Labour enthusiastically, as Blair promised not to introduce key escrow. It took him almost a year to renege on that promise; it has taken the coalition less than a month.
Blair’s U-turn on key escrow in 1998 led to the establishment of FIPR, and a two-year fight against what became the RIP Act (where at least we limited escrow to the powers in part 3). What’s the appropriate response now to Cameron and Clegg?
It’s inconceivable that assurances given to farmers, or to soldiers, or to teachers would be tossed aside so casually. Yet half a million of us earn our living in IT in Britain – there are a lot more of us than of any of them! And many people in other jobs care about privacy, copyright and other digital issues. So do those of us who care about digital policy have to become more militant? Or do we have to raise money and bribe the ruling parties? Or, now that all three major parties are compromised, should we downgrade our hopes for parliament and operate through the courts and through Europe instead?
The book “Digital Activism Decoded: The New Mechanics of Change” is one of the first on the topic of digital activism. It discusses how digital technologies as diverse as the Internet, USB thumb-drives and mobile phones are changing the nature of contemporary activism.
Each of the chapters offers a different perspective on the field. For example, Brannon Cullum investigates the use of mobile phones (e.g. SMS, voice and photo messaging) in activism – a technology often overlooked, but increasingly important in countries with low rates of personal computer ownership and poor Internet connectivity. Dave Karpf considers how to measure the success of digital activism campaigns, given the huge variety of (potentially misleading) metrics available, such as page impressions and number of followers on Twitter. The editor, Mary Joyce, then ties these threads together, identifying the common factors among the disparate techniques for digital activism and discussing future directions.
My chapter, “Destructive Activism: The Double-Edged Sword of Digital Tactics”, shows how the positive activism techniques promoted throughout the rest of the book can also be used for harm. Just as digital tools can facilitate communication and create information, they can also be used to block and destroy. I give some examples of where this has occurred, and describe how the technology to carry out these actions came to be created and deployed. Of course, activism is by its very nature controversial, and so is the question of where to draw the line between positive and negative actions. So my chapter concludes with a discussion of the ethical frameworks used when considering the merits of activism tactics.