Monthly Archives: July 2007

Economics of Tor performance

Currently the performance of the Tor anonymity network is quite poor. This problem is frequently stated as a reason for people not using anonymizing proxies, so improving performance is a high priority for its developers. There are only about 1 000 Tor nodes, and many are on slow Internet connections, so in aggregate there is about 1 Gbit/s shared between 100 000 or so users. One way to improve the experience of Tor users is to increase the number of Tor nodes (especially high-bandwidth ones). Some means to achieve this goal are discussed in Challenges in Deploying Low-Latency Anonymity, but here I want to explore what will happen when Tor’s total bandwidth increases.

If Tor’s bandwidth doubled tomorrow, the naïve hypothesis is that users would experience twice the throughput. Unfortunately this is not true, because it assumes that the number of users does not vary with the bandwidth available. In fact, as the supply of the Tor network’s bandwidth increases, there will be a corresponding increase in the demand for bandwidth from Tor users. This applies just as well to other networks, but for the purposes of this post I’ll use Tor as an example. Simple economics shows that the performance of Tor is controlled by how the number of users scales with available bandwidth, which can be represented by a demand curve.
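To make the argument concrete, here is a minimal sketch of one such demand model, assuming (purely for illustration) that users keep joining until each one’s share of the bandwidth falls to the slowest speed they will tolerate. The 10 kbit/s figure below is an assumption chosen only so that the model roughly reproduces the figures above; it is not a measured property of Tor.

    # Purely illustrative demand model (not from the post): assume users keep
    # joining Tor until the per-user share of bandwidth falls to the slowest
    # speed they will tolerate. All figures are assumptions for illustration.

    def equilibrium_users(total_bandwidth_mbit, tolerable_kbit_per_user):
        """Number of users at which everyone's share equals the tolerable minimum."""
        return int(total_bandwidth_mbit * 1000 / tolerable_kbit_per_user)

    def per_user_kbit(total_bandwidth_mbit, users):
        """Per-user throughput in kbit/s, assuming equal sharing."""
        return total_bandwidth_mbit * 1000 / users

    TOLERABLE_KBIT = 10  # assumed tolerable minimum; picked so 1 Gbit/s gives ~100 000 users

    for bandwidth_mbit in (1000, 2000):  # roughly today's 1 Gbit/s, then doubled
        users = equilibrium_users(bandwidth_mbit, TOLERABLE_KBIT)
        print(f"{bandwidth_mbit} Mbit/s -> {users} users, "
              f"{per_user_kbit(bandwidth_mbit, users):.1f} kbit/s each")

Under this assumption, doubling the total bandwidth doubles the equilibrium number of users while per-user throughput stays pinned at the tolerable minimum, which is why the naïve “twice the throughput” prediction fails.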

I don’t claim this is a new insight; in fact between me starting this draft and now, Andreas Pfitzmann made a very similar observation while answering a question following the presentation of Performance Comparison of Low-Latency Anonymisation Services from a User Perspective at the PET Symposium. He said, as I recall, that the performance of the anonymity network is the slowest tolerable speed for people who care about their privacy. Despite this, I couldn’t find anyone who had written a succinct description anywhere, perhaps because it is too obvious. Equally, I have heard the naïve version stated occasionally, so I think it’s helpful to publish something people can point at. The rest of this post will discuss the consequences of modelling Tor user behaviour in this way, and the limitations of the technique.

Continue reading Economics of Tor performance

The role of software engineering in electronic elections

Many designs for trustworthy electronic elections use cryptography to assure participants that the result is accurate. However, it is a system’s software engineering that ensures a result is declared at all. Both good software engineering and cryptography are thus necessary, but so far cryptography has drawn more attention. In fact, the software engineering aspects could be just as challenging, because election systems have a number of properties which make them almost a pathological case for robust design, implementation, testing and deployment.

Currently deployed systems are lacking in both software robustness and cryptographic assurance — as evidenced by the English electronic election fiasco. Here, in some cases the result was late and in others the electronic count was abandoned due to system failures resulting from poor software engineering. However, even where a result was returned, the black-box nature of auditless electronic elections brought the accuracy of the count into doubt. In the few cases where cryptography was used it was poorly explained and didn’t help verify the result either.

End-to-end cryptographically assured elections have generated considerable research interest and the resulting systems, such as Punchscan and Prêt à Voter, allow voters to verify the result while maintaining their privacy (provided they understand the maths, that is — the rest of us will have to trust the cryptographers). These systems will permit an erroneous result to be detected after the election, whether caused by maliciousness or more mundane software flaws. However, should this occur, or should no result be returned at all, the election may need to fall back on paper backups or even be re-run — a highly disruptive and expensive failure.

Good software engineering is necessary but, in the case of voting systems, may be especially difficult to achieve. In fact, such systems have more in common with the software behind rocket launches than with conventional business productivity software. We should thus expect correspondingly high costs and accept that, despite all this extra effort, the occasional catastrophe will be inevitable. The remainder of this post will discuss why I think this is the case, and how manually-counted paper ballots circumvent many of these difficulties.

Continue reading The role of software engineering in electronic elections

Digital signatures hit the road

For about thirty years now, security researchers have been talking about using digital signatures in court. Thousands of academic papers have had punchlines like “the judge then raises X to the power Y, finds it’s equal to Z, and sends Bob to jail”. So far, this has been pleasant speculation.

Now the rubber starts to hit the road. Since 2006 trucks in Europe have been using digital tachographs. Tachographs record a vehicle’s speed history and help enforce restrictions on drivers’ working hours. For many years they have used circular waxed paper charts, which have been accepted in court as evidence just like any other paper record. However, paper charts are now being replaced with smartcards. Each driver has a card that records 28 days of infringement history, protected by digital signatures. So we’ve now got the first widely-deployed system in which digital signatures are routinely adduced in evidence. The signed records are being produced to support prosecutions for working too many hours, for speeding, for tachograph tampering, and sundry other villainy.

So do magistrates really raise X to the power Y, find it’s equal to Z, and send Eddie off to jail? Not according to enforcement folks I’ve spoken to. Apparently judges find digital signatures too “difficult” as they’re all in hex. The police, always eager to please, have resolved the problem by applying standard procedures for “securing” digital evidence. When they raid a dodgy trucking company, they image the PC’s disk drive and take copies on DVDs that are sealed in evidence bags. One gets given to the defence and one kept for appeal. The paper logs documenting the procedure are available for Their Worships to inspect. Everyone’s happy, and truckers duly get fined.

In fact the trucking companies are very happy. I understand that 20% of British trucks now use digital tachographs, well ahead of expectations. Perhaps this is not uncorrelated with the fact that digital tachographs keep much less detailed data than could be coaxed out of the old paper charts. Just remember, you read it here first.

Recent talks: Chip & PIN, traffic analysis, and voting

In the past couple of months, I’ve presented quite a few talks, and in the course of doing so, travelled a lot too (Belgium and Canada last month; America and Denmark still to come). I’ve now published my slides from these talks, which might also be of interest to Light Blue Touchpaper readers, so I’ll summarize the contents here.

Two of the talks were on Chip & PIN, the UK deployment of EMV. The first presentation — “Chip and Spin” — was for the Girton village Neighbourhood Watch meeting. Girton was hit by a spate of card-cloning, eventually traced back to a local garage, so they invited me to give a fairly non-technical overview of the problem. The slides served mainly as an introduction to a few video clips I showed, taken from TV programmes in which I participated. [slides (PDF 1.1M)]

The second Chip & PIN talk was to the COSIC research group at K.U. Leuven. Due to the different audience, this presentation — “EMV flaws and fixes: vulnerabilities in smart card payment systems” — was much more technical. I summarized the EMV protocol, described a number of weaknesses which leave EMV open to attack, along with corresponding defences. Finally, I discussed the more general problem with EMV — that customers are in a poor position to contest fraudulent transactions — and how this situation can be mitigated. [slides (PDF 1.4M)]

If you are interested in further details, much of the material from both of my Chip & PIN talks is discussed in papers from our group, such as “Chip and SPIN”, “The Man-in-the-Middle Defence” and “Keep Your Enemies Close: Distance bounding against smartcard relay attacks”.

Next I went to Ottawa for the PET Workshop (now renamed the PET Symposium). Here, I gave three talks. The first was for a panel session — “Ethics in Privacy Research”. Since this was a discussion, the slides aren’t particularly interesting, but the panel will hopefully be the subject of an upcoming paper.

Then I gave a short talk at WOTE, on my experiences as an election observer. I summarized the conclusions of the Open Rights Group report (released the day before my talk) and added a few personal observations. Richard Clayton discussed the report in the previous post. [slides (PDF 195K)]

Finally, I presented the paper written by Piotr Zieliński and me — “Sampled Traffic Analysis by Internet-Exchange-Level Adversaries”, which I mentioned in a recent post. In the talk I gave a graphical summary of the paper’s key points, which I hope will aid in understanding the motivation of the paper and the traffic analysis method we developed. [slides (PDF 2.9M)]