A Study on The Value of Location Privacy

There is a Workshop on Privacy in the Electronic Society taking place at the beginning of November. We (George Danezis, Marek Kumpost, Vashek Matyas, and I) will present there the results of "A Study on the Value of Location Privacy", which we conducted half a year ago.

We questioned a sample of over 1,200 people from five EU countries, and used tools from experimental psychology and economics to elicit the value they attach to their location data. We compare this value across national groups, gender and technical awareness, and also examine the perceived difference between academic use and commercial exploitation. We provide some analysis of the self-selection bias of such a study, and look further at the valuation of location data over time using data from another experiment.

The countries we gathered the data from were Germany, Belgium, Greece, the Czech Republic, and the Slovak Republic. As some of the countries have their own currencies, we re-calculated the bid values using a "value of money" coefficient, computed as the ratio of average salaries to price levels in the particular countries; this data was taken from Eurostat statistics.
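As a rough sketch of this normalisation, the coefficient and the re-scaling of a bid might look like the following. The salary and price-level figures are made-up placeholders, not the actual Eurostat data used in the study, and the exact direction of the formula is my own assumption.

```python
# Hypothetical (average monthly salary in EUR, price level index) per
# country -- illustrative numbers only, not the study's Eurostat data.
stats = {
    "DE": (2500.0, 104.0),
    "GR": (1300.0, 88.0),
    "CZ": (700.0, 61.0),
}

def value_of_money(country):
    # "Value of money" coefficient: ratio of average salary to price level.
    salary, prices = stats[country]
    return salary / prices

def normalise_bid(bid_eur, country, reference="DE"):
    # Express a local bid in reference-country terms, so that bids from
    # countries with different purchasing power become comparable.
    return bid_eur * value_of_money(reference) / value_of_money(country)
```

For example, a 10 EUR bid from a Czech participant represents a larger sacrifice than 10 EUR from a German one, so under this sketch it scales up when expressed in German terms.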

We gathered bids for three auctions, or scenarios. The first and second bids were for one-month tracking: the former data were to be used for academic purposes only, and the latter for commercial purposes. The third bids were for a scenario in which participants agreed to year-long tracking, with the data free for commercial exploitation. Let us start with the first bids.

Differences among Countries

The distributions of the first bids are shown in the following plot. Although there are differences between all the nations, the Greek bids in particular were beyond our expectations.

Distributions of bids in the first auction round.


The real hustle on BBC3: watch it!

For UK residents: BBC Three is re-running their wonderful 10-episode series "The Real Hustle", in which three skilled con artists give, with hidden cameras, a revealing and entertaining guided tour of the most popular scams used to rip people off today. Some are computer-based (including keyloggers, bluejacking and bank card cloning); most are not.

This series should be required viewing for all security professionals and most definitely for all security students. The only way to understand security is by understanding what crooks actually do. It’s also great fun.

Each episode is re-broadcast several times, from prime time to middle-of-the-night, so you usually get several chances to set your digital video recorder if the programme overlaps with something else you want to watch or record. Check the EPG.

Hot or Not: Revealing Hidden Services by their Clock Skew

Next month I will be presenting my paper “Hot or Not: Revealing Hidden Services by their Clock Skew” at the 13th ACM Conference on Computer and Communications Security (CCS) held in Alexandria, Virginia.

It is well known that quartz crystals, as used for controlling system clocks of computers, change speed when their temperature is altered. The paper shows how to use this effect to attack anonymity systems. One such attack is to observe timestamps from a PC connected to the Internet and watch how the frequency of the system clock changes.

Absolute clock skew has previously been used to tell whether two apparently different machines are in fact running on the same hardware. My paper adds that because the skew depends on temperature, in principle a PC can be located by finding out when the day starts and how long it is, or just by observing that the pattern matches that of a computer in a known location.

However, the paper centres on hidden services, a feature of Tor which allows servers to be run without revealing the identity of the operator. These can be attacked by repeatedly connecting to the hidden service, causing its CPU load, and hence its temperature, to increase and so change the clock skew. The attacker then requests timestamps from all candidate servers and finds the one exhibiting the expected clock-skew pattern. I tested this on a private Tor network and it works surprisingly well.
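The measurement behind the attack can be sketched in a few lines: collect pairs of (local time, remote timestamp) and fit a line to the observed clock offset; the slope of that line is the skew. This is only an illustration of the principle, with a simple least-squares fit; the estimator used in the paper is more sophisticated, and the function name and sampling interval below are my own invention.

```python
# Sketch of clock-skew estimation from remote timestamp samples.
# A skew of s means the remote clock gains s seconds per second,
# conventionally quoted in parts per million (ppm).

def estimate_skew_ppm(samples):
    """samples: list of (local_time_s, remote_time_s) pairs."""
    n = len(samples)
    xs = [t for t, _ in samples]
    ys = [r - t for t, r in samples]   # observed clock offset per sample
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    return (cov / var) * 1e6           # slope of offset vs time, in ppm

# Simulated samples from a clock running 50 ppm fast, one per minute:
samples = [(t, t * (1 + 50e-6)) for t in range(0, 3600, 60)]
```

With such an estimator, the attacker looks for the induced load pattern appearing and disappearing in the skew measured from each candidate server.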

In the graph below, the temperature (orange circles) is modulated by either exercising the hidden service or not. This in turn alters the measured clock skew (blue triangles). The induced load pattern is clear in the clock skew and an attacker could use this to de-anonymise a hidden service. More details can be found in the paper (PDF 1.5M).

Clock skew graph

I happened upon this effect by lucky accident, while trying to improve upon the results of the paper "Remote physical device fingerprinting". A previous paper of mine, "Embedding Covert Channels into TCP/IP", showed how to extract high-precision timestamps from the Linux TCP initial sequence number generator. Using these timestamps did indeed improve the accuracy of clock-skew measurement, to the extent that I noticed an unusual peak at about the time cron caused the hard disk on my test machine to spin up. Eventually I realised the potential of this effect and ran the further experiments needed to write the paper.

After ID Cards…

The next government initiative to undermine privacy is the Children's Database project. I am one of the authors of a report on this, which was written for the Information Commissioner and will be published later in September. Press interest is starting to mount (see the Telegraph, The Times, the Evening Standard and the Daily Mail), and there will be a TV programme on the subject today (More 4 at half past seven). If you're in the UK and are interested in privacy or computer security, that might be worth watching.

The project aims at linking up all the government systems that keep information on kids. Your kids' schoolteachers will be able to see not just their school records but also their medical records, social work records, police records and probation records; see here for the background. I can't reveal the contents of the report prior to publication, but I am reminded of Brian Gladman's punchline in his talk at SFS8: 'You can have scale, or functionality, or security. If you're smart you can build a system with any two of these. But you can't have all three.'

As well as the technical objections there are legal objections – and strong practical objections from social workers who believe that the project is putting children in harm’s way.

With a single bound it was free!

My book on Security Engineering is now available online for free download here.

I have two main reasons. First, I want to reach the widest possible audience, especially among poor students. Second, I am a pragmatic libertarian on free culture and free software issues; I believe many publishers (especially of music and software) are too defensive of copyright. I don’t expect to lose money by making this book available for free: more people will read it, and those of you who find it useful will hopefully buy a copy. After all, a proper book is half the size and weight of 300-odd sheets of laser-printed paper in a ring binder.

I’d been discussing this with my publishers for a while. They have been persuaded by the experience of authors like David MacKay, who found that putting his excellent book on coding theory online actually helped its sales. So book publishers are now learning that freedom and profit are not really in conflict; how long will it take the music industry?

Protocol design is hard — Flaws in ScatterChat

At the recent HOPE conference, the "secure instant messaging (IM) client" ScatterChat was released in a blaze of publicity. It was designed by J. Salvatore Testa II to allow human rights and democracy activists to communicate securely while under surveillance. It uses cryptography to protect confidentiality and authenticity, integrates Tor to provide anonymity, and is bundled with an easy-to-use interface. Sadly, not everything is as good as it sounds.

When I first started supervising undergraduates at Cambridge, Richard Clayton explained that the real purpose of the security course was to teach students not to invent the following (in increasing order of importance): protocols, hash functions, block ciphers and modes of operation. The academic literature is littered with the bones of flawed proposals for all of these, despite their being designed by very capable and experienced cryptographers. Instead, wherever possible, implementors should use peer-reviewed building blocks: normally there is already a solution which can do the job, has withstood more analysis, and so is more likely to be secure.
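To make the point concrete, here is a generic illustration (unrelated to ScatterChat's actual code) of the difference between a homemade construction and a peer-reviewed one, using only Python's standard library: a naive MAC computed as SHA-256(key || message) falls to length-extension attacks, while the standard HMAC construction (RFC 2104) is a one-liner and does not.

```python
import hashlib
import hmac

key = b"secret-key"
msg = b"transfer 10 pounds"

# Homemade MAC: looks plausible, but an attacker who sees this tag can,
# via length extension, forge a valid tag for msg + padding + suffix
# without ever knowing the key.
naive_tag = hashlib.sha256(key + msg).hexdigest()

# Peer-reviewed alternative: HMAC, one line of stdlib code, immune to
# length extension and analysed for over a decade.
good_tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

# When verifying, use a constant-time comparison to avoid leaking the
# correct tag byte-by-byte through timing.
ok = hmac.compare_digest(good_tag, hmac.new(key, msg, hashlib.sha256).hexdigest())
```

The lesson is not that SHA-256 is weak, but that composing strong primitives in a homemade way can still yield a weak system.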

Unfortunately, ScatterChat uses both a custom protocol and a custom mode of operation, neither of which is as secure as hoped. While looking through the developer documentation I found a few problems and reported them to the author. As always, there is the question of whether such vulnerabilities should be disclosed. These problems would likely have been discovered eventually, so it is better for them to be caught early, allowing users to take precautions, than for attackers who independently find the weaknesses to exploit them with impunity. Also, I hope this will serve as a cautionary tale, reminding software designers that cryptography and protocol design are fraught with difficulties, and so are better managed through open peer review.

The most serious of the three vulnerabilities was published today in an advisory (technical version), assigned CVE-2006-4021, by the ScatterChat author, but I also found two lesser ones.

Anonymous data that isn't

AOL has recently been embarrassed after it released data on the searches performed by 658,000 subscribers. Their names had been replaced by numbers, but this was not enough to stop personal information leaking. The AOL folks just didn't understand that protecting data through de-identification is hard.

They are not alone. An NHS document obtained under the Freedom of Information Act describes how officials are building a "Secondary Uses Service" which will contain large amounts of personal health information harvested from hospital and other records. It's proposed that ever-larger numbers of people will have access to this information as it is progressively de-identified. It seems that officials are just beginning to realise how difficult it will be to protect patient privacy — especially as your de-identified medical record will generally have your postcode. There are only a few houses at each postcode; knowing that, plus a patient's age, usually tells you whose record it is. The NHS proposes to set up an "Information Governance Board" to think about the problem. Meanwhile, system development steams ahead.
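To see why postcode plus age is so identifying, consider a toy sketch that counts how many records share each (postcode, age) pair; any pair that occurs only once is effectively a name. The records below are fabricated for illustration.

```python
from collections import Counter

# Fabricated "de-identified" records: name removed, quasi-identifiers kept.
records = [
    {"postcode": "CB3 0FD", "age": 34, "diagnosis": "asthma"},
    {"postcode": "CB3 0FD", "age": 71, "diagnosis": "diabetes"},
    {"postcode": "CB1 2AB", "age": 34, "diagnosis": "flu"},
]

# Count how many records share each (postcode, age) combination.
counts = Counter((r["postcode"], r["age"]) for r in records)

# Records whose combination is unique are re-identifiable by anyone who
# knows a neighbour's postcode and approximate age.
unique = [r for r in records if counts[(r["postcode"], r["age"])] == 1]
```

With only a handful of households per UK postcode, most (postcode, age) pairs in a real dataset are unique, which is exactly the problem the NHS officials are beginning to confront.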

Clearly, the uses and limitations of anonymisation ought to be more widely understood. There’s more on the subject at the American Statistical Association website, on my web page and in chapter 8 of my book.

"Identity fraud" again

The National Consumer Council has published a report on “identity fraud” which is rather regrettable.

Identity fraud is not fraud, from the consumer’s viewpoint. If someone pretends to be me, borrows 10K from the Derbyshire Building Society and vanishes, it’s the building society that’s the victim, not me. If Experian then says I’m a loan defaulter when I’m not, that’s libel. Suing for libel may be expensive, but the Information Commissioner has announced his willingness to issue enforcement notices against the credit agencies in such circumstances. The NCC should have advertised this fact and encouraged people to go to him.

“Identity fraud” is an objectionable concept, an attempt by the banks to dump some liability. The Home Office egg them on because they think that rebadging credit-card fraud as “identity fraud” will help sell identity cards. But it’s a bad show when consumer organisations collude with an attempt to make consumers the victims of bankers’ and credit reference agencies’ negligence.

Security Theater at the Grand Coulee Dam

“Security theater” is the term that Bruce Schneier uses to describe systems that look very exciting and dramatic (and make people feel better) but entirely miss the point of delivering any actual real-world security. The world is full of systems like this, and since 9/11 they’ve been multiplying.

Bruce also recently ran a competition for a “movie plot” security threat — the winner described an operation to fly planes laden with explosives into Grand Coulee Dam.

As it happens, I was recently actually at Grand Coulee Dam as a tourist — one of the many places I visited as I filled in the time between the SRUTI and CEAS academic conferences. Because this is a Federal site, provision was made from the beginning for visitors to see how their tax dollars were spent, and you can go on tours of the “3rd Power House” (an extra part of the dam, added between 1966 and 1974, and housing six of the largest hydroelectric generators ever made).

Until 9/11 you could park on top of the dam itself and wander around on a self-guided tour. Now, since the site is of such immense economic significance, you have to park outside the site and go on guided tours of limited capacity. You walk in for about 800 yards (a big deal for Americans, I understand) and must then go through an airport-style metal detector. You are not allowed to take in backpacks or pointy things — you can, however, keep your shoes on. The tour is very interesting and I recommend it. You get to appreciate the huge scale of the place (the tiny-looking blue generators are 33 feet across!), and you go up close to one of the generators as it spins in front of you, powering most of the Northwest and a fair bit of California as well.

The security measures make some sense; although doubtless the place the bad guys would really like to damage is the control center and that isn’t on the tour. However….

… on the other side of the valley, a quarter of a mile from the dam itself, is a “visitor arrival center”. This contains a number of displays about the history of the dam and its construction, and if you have the time, there are films to watch as well. On summer nights they project a massive laser light show from there (a little tacky in places, but they run white water over the dam to project onto, which is deeply impressive). You don’t have to go through any security screening to get into the center. However (and here’s the security theater I promised), you cannot take in any camera bags, backpacks etc.!

No purses, backpacks, bags, fannypacks, camera cases or packages of any kind allowed in the visitor center.

What’s the threat here? I went to a dozen other visitor centers (in National Parks such as Yellowstone, Grand Teton, Glacier, Mt. Rainier and Crater Lake) that were generally far more busy than this one. Terrorists don’t usually blow up museums, and if, deity forbid, they blew up this one, it’s only the laser lights that would go out.