I’m in a symposium at Churchill College on the Investigatory Powers Bill. It’s organised by John Naughton and I’ll be speaking later on equipment interference, a topic on which I wrote an expert report for the recent IP Tribunal case brought by Privacy International. Meanwhile I’ll try to liveblog the event in followups to this post.
A very exciting Passwords 2015 is being hosted at the Computer Laboratory from 7 to 9 December. It is a unique conference that brings together the world’s top password hackers and academics. The participants are liveblogging it on Twitter under the hashtag #passwords15, and a live feed is available on the Passwords 2015 page.
I have just spent a long weekend at Emergent Quantum Mechanics (EmQM15). This workshop is organised every couple of years by Gerhard Groessing and is the go-to place if you’re interested in whether quantum mechanics dooms us to a universe (or multiverse) that can be causal or local but not both, or whether we might just make sense of it after all. It’s held in Austria – the home not just of the main experimentalists working to close loopholes in the Bell tests, such as Anton Zeilinger, but of many of the physicists still looking for an underlying classical model from which quantum phenomena might emerge. The relevance to the LBT audience is that the security proofs of quantum cryptography, and the prospects for quantum computing, turn on this obscure area of science.
The two themes that emerged from this year’s workshop are both relevant to these questions; they are weak measurement and emergent global correlation.
Weak measurement goes back to the 1980s and the thesis of Lev Vaidman. The idea is that you can probe the trajectory of a quantum-mechanical particle by making many measurements of a weakly coupled observable between preselection and postselection operations. This has profound theoretical implications, as it means that the Heisenberg uncertainty limit can be stretched in carefully chosen circumstances; Masanao Ozawa has come up with a more rigorous version of the Heisenberg bound, and in fact gave one of the keynote talks two years ago. Now all of a sudden there are dozens of papers on weak measurement, exploring all sorts of scientific puzzles. This leads naturally to the question of whether weak measurement is any good for breaking quantum cryptosystems. After some discussion with Lev I’m convinced the answer is almost certainly no; extracting information about quantum states takes an exponential amount of work and a lot of averaging, and works only in specific circumstances, so it’s easy for the designer to forestall. There is however a question around interdisciplinary proofs. Physicists have known about weak measurement since 1988 (even if few paid attention till a few years ago), yet no-one has rushed to tell the crypto community “Sorry, guys, when we said that nothing can break the Heisenberg bound, we kinda overlooked something.”
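For readers who want the formalism: the quantity measured in such experiments is the weak value of an observable \(\hat{A}\), as defined in the 1988 paper by Aharonov, Albert and Vaidman. With a preselected state \(|\psi_i\rangle\) and a postselected state \(|\psi_f\rangle\), it is

```latex
A_w = \frac{\langle \psi_f | \hat{A} | \psi_i \rangle}{\langle \psi_f | \psi_i \rangle}
```

When the overlap \(\langle \psi_f | \psi_i \rangle\) is small, \(A_w\) can lie far outside the eigenvalue spectrum of \(\hat{A}\), but each individual weak measurement is extremely noisy, which is why the averaging over many runs mentioned above is unavoidable.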
The second theme, emergent global correlation, may be of much more profound interest, to cryptographers and physicists alike.
On Monday May 4th, the Dutch medical privacy campaigner Guido van’t Noordende will visit us in Cambridge. OK, it’s a bank holiday, but that’s the only day he’ll be in town. His talk will be on The Dutch electronic patient record system and beyond – towards physician-controlled decentralized medical record exchange.
Four years ago, Guido blocked an attempt to legislate for a central hub for medical records that would have enabled doctor A to see the records of doctor B on a simple pull model; there would have been a hub at the ministry with read access to everything. Other countries have wrestled with this problem, with greater and lesser degrees of success; for example, Norway just passed a medical data-sharing law and are starting to figure out what to build. In Britain of course we had the care.data fiasco. And in the Netherlands, they’re revisiting the issue once more. This will become a live issue in one country after another.
The announcement for Guido’s talk is here.
Today at 5pm I’ll be giving the Bellwether Lecture at the Oxford Internet Institute. My topic is Big Conflicts: the ethics and economics of privacy in a world of Big Data.
I’ll be discussing a recent Nuffield Bioethics Council report of which I was one of the authors. In it, we asked what medical ethics should look like in a world of ‘Big Data’ and pervasive genomics. It will take the law some time to catch up with what’s going on, so how should researchers behave meanwhile so that the people whose data we use don’t get annoyed or surprised, and so that we can defend our actions if challenged? We came up with four principles, which I’ll discuss. I’ll also talk about how they might apply more generally, for example to my own field of security research.
Many people assume that quantum mechanics cannot emerge from classical phenomena, because no-one has so far been able to think of a classical model of light that is consistent with Maxwell’s equations and reproduces the Bell test results quantitatively.
Today Robert Brady and I unveil just such a model. It turns out that the solution was almost in plain sight, in James Clerk Maxwell’s 1861 paper On Physical Lines of Force, in which he derived Maxwell’s equations on the assumption that magnetic lines of force were vortices in a fluid. Updating this with modern knowledge of quantised magnetic flux, we show that if you model a flux tube as a phase vortex in an inviscid compressible fluid, then wavepackets sent down this vortex obey Maxwell’s equations to first order; that they can have linear or circular polarisation; and that the correlation measured between the polarisation of two cogenerated wavepackets is exactly the same as is predicted by quantum mechanics and measured in the Bell tests.
This follows work last year in which we explained Yves Couder’s beautiful bouncing-droplet experiments. There, a completely classical system is able to exhibit quantum-mechanical behaviour as the wavefunction ψ appears as a modulation on the driving oscillation, which provides coherence across the system. Similarly, in the phase vortex model, the magnetic field provides the long-range order and the photon is a modulation of it.
If our sums add up, the consequences could be profound. First, it will explain why quantum computers don’t work, and blow away the security ‘proofs’ for entanglement-based quantum cryptosystems (we already wrote about that here and here). Second, if the fundamental particles are just quasiparticles in a superfluid quantum vacuum, there is real hope that we can eventually work out where all the mysterious constants in the Standard Model come from. And third, there is no longer any reason to believe in multiple universes, or effects that propagate faster than light or backward in time – indeed the whole ‘spooky action at a distance’ to which Einstein took such exception. He believed that action in physics was local and causal, as most people do; our paper shows that the main empirical argument against classical models of reality is unsound.
Today Robert Brady and I will be giving a seminar in Cambridge where we will explain Yves Couder’s beautiful bouncing droplet experiments. Droplets bouncing on a vibrating fluid bath show many of the weird phenomena of quantum mechanics including tunneling, diffraction and quantized orbits.
We published a paper on this in January and blogged it at the time, but now we have more complete results. The two-dimensional model of electromagnetism that we see in bouncing droplets goes over to three dimensions too, giving us a better model of transverse sound in superfluids and a better explanation of the Bell test results. Here are the slides.
The talk will be at 4pm in the Centre for Mathematical Sciences.
In a seminar today, we will unveil Rendezvous, a search engine for code. Built by Wei-Ming Khoo, it will analyse an unknown binary, parse it into functions, index them, and compare them with a library of code harvested from open-source projects.
As time goes on, the programs we need to reverse engineer get ever larger, so we need better tools. Yet most code nowadays is not written from scratch, but cut and pasted. Programmers are not an order of magnitude more efficient than a generation ago; it’s just that we have more and better libraries to draw on nowadays, and a growing shared heritage of open software. So our idea is to reframe the decompilation problem as a search problem, and harness search-engine technology to the task.
As with a text search engine, Rendezvous uses a number of different techniques to index a target binary, some of which are described in this paper, along with the main engineering problems. As well as reverse engineering suspicious binaries, code search engines could be used for many other purposes such as monitoring GPL compliance, plagiarism detection, and quality control. On the dark side, code search can be used to find new instances of disclosed vulnerabilities. Every responsible software vendor or security auditor should build one. If you’re curious, here is the demo.
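To make the search-engine framing concrete, here is a minimal sketch in Python of one indexing technique of the kind the paper describes: building an inverted index of instruction-mnemonic n-grams and ranking candidate functions by shared n-grams. The function names and mnemonic sequences are invented for illustration, and this is a toy model, not the Rendezvous implementation (which uses several feature types and real disassembly).

```python
from collections import defaultdict

def ngrams(tokens, n=3):
    """Sliding windows of n consecutive tokens (e.g. disassembled mnemonics)."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

class CodeIndex:
    """Inverted index mapping mnemonic n-grams to the functions containing them."""
    def __init__(self, n=3):
        self.n = n
        self.postings = defaultdict(set)

    def add(self, name, tokens):
        """Index a known (e.g. open-source) function under its name."""
        for g in ngrams(tokens, self.n):
            self.postings[g].add(name)

    def query(self, tokens):
        """Rank indexed functions by how many n-grams they share with the query."""
        scores = defaultdict(int)
        for g in ngrams(tokens, self.n):
            for name in self.postings[g]:
                scores[name] += 1
        return sorted(scores.items(), key=lambda kv: -kv[1])

# Index two hypothetical library functions by mnemonic sequence,
# then query with mnemonics recovered from an "unknown" binary.
idx = CodeIndex()
idx.add("memcpy", ["push", "mov", "cmp", "jb", "mov", "inc", "jmp"])
idx.add("strlen", ["xor", "cmp", "je", "inc", "jmp", "ret"])
print(idx.query(["push", "mov", "cmp", "jb", "mov"]))  # memcpy ranks first
```

Real systems must of course normalise away register allocation and compiler idioms before tokenising, which is where most of the engineering effort goes.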
With some delay, here is the second and final part of our impressions of David Birch’s Tomorrow’s Transactions Forum (TTF13), which we attended thanks to Dave’s generosity (see the full agenda and PowerPoint presentations here). See part 1 here.
NOTE: Although written in first person, what follows results from a combination of Laurent Simon’s and my notes.
The theme of day 2 at TTF13 was social inclusion. The kick-off question was “How do we develop tools to help people deal with money?” – meaning people with no background in finance, served through a basic transactional account.
This was followed by presentations on “Comic Relief” (the day before ‘the big day’) and “Universal Credit”, and an expert panel on financial inclusion.