
UKRI Digital Security by Design: A £190M research programme around Arm’s Morello – an experimental ARMv8-A CPU, SoC, and board with CHERI support

PIs: Robert N. M. Watson (Cambridge), Simon W. Moore (Cambridge), Peter Sewell (Cambridge), and Peter G. Neumann (SRI)

Since 2010, SRI International and the University of Cambridge, supported by DARPA, have been developing CHERI: a capability-system extension to RISC Instruction-Set Architectures (ISAs) supporting fine-grained memory protection and scalable software compartmentalization, while retaining incremental deployability within current C and C++ software stacks. This ten-year research project has involved hardware-software-semantic co-design: FPGA prototyping, compiler development, operating-system development, and application adaptation, as well as formal modelling and proof. We have documented CHERI extensively in technical reports and research papers, iterating on the design as we evaluated and improved microarchitectural overheads, performance, software compatibility, and security.

As we know, mainstream computer systems are still chronically insecure. One of the main reasons is that conventional hardware architectures and C/C++ language abstractions, dating back to the 1970s, provide only coarse-grained memory protection. Without memory safety, many coding errors turn into exploitable security vulnerabilities. In our ASPLOS 2019 paper on CheriABI (best paper award), we demonstrated that a complete UNIX userspace and application suite could be protected by strong memory safety with minimal source-code disruption and acceptable performance overheads. Scalable software compartmentalization offers mitigation against future, as-yet-unknown classes of vulnerabilities by enabling greater use of design patterns such as software sandboxing. Our An Introduction to CHERI technical report introduces our approach, including the architecture, microarchitectural contributions, formal models, software protection model, and practical software adaptation. The CHERI ISA v7 specification is the authoritative reference for the architecture, including both the architecture-neutral protection model and its concrete mappings into the 64-bit MIPS and 32/64-bit RISC-V ISAs. Our Rigorous Engineering technical report describes our modelling and mechanised proof of key security properties.
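
As a concrete, deliberately simplified illustration of what fine-grained memory protection buys, consider the classic heap buffer overflow below, written in plain C. This is an illustrative sketch added for this post, not code from the CheriABI paper. On a conventional architecture the overflowing loop may silently corrupt whatever happens to sit next to the buffer; compiled as a pure-capability (CheriABI) program, malloc() returns a capability whose bounds cover only the requested 16 bytes, so the first out-of-bounds store raises a hardware capability exception instead of corrupting memory.

    /*
     * Illustrative sketch: an inter-object heap overflow.
     * Conventional build: the loop writes past the end of `buf` and may
     * silently corrupt the adjacent allocation.
     * Pure-capability (CheriABI) build: malloc() returns a capability
     * bounded to 16 bytes, so the store at i == 16 traps with a
     * capability exception rather than corrupting `secret`.
     */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void)
    {
        char *buf = malloc(16);
        char *secret = malloc(16);
        if (buf == NULL || secret == NULL)
            return 1;
        strcpy(secret, "confidential");

        for (size_t i = 0; i < 32; i++)   /* writes 32 bytes into a 16-byte buffer */
            buf[i] = 'A';

        printf("secret = %.16s\n", secret);  /* may print 'A's on a conventional build */
        free(buf);
        free(secret);
        return 0;
    }

The same mechanism generalises from this toy example to the full UNIX userspace evaluated in the CheriABI work: pointers throughout the program become bounded capabilities, so spatial memory-safety violations are caught at the faulting access rather than turning into silent corruption.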

Today, we are very excited to be able to talk about another long-running aspect of our DARPA-supported work: a collaboration since 2014 with engineers at Arm to create an experimental adaptation of CHERI to the ARMv8-A architecture. This widely used ISA is the foundation for the vast majority of mobile phones and tablets, including those running iOS and Android. The UKRI Digital Security by Design (DSbD) programme was announced in late September 2019 to explore potential applications of CHERI, with a £70M investment by UKRI and a further £117M from industry, including involvement by Arm, Microsoft, and Google. Today, UKRI and Arm announced that the Arm Morello board will become available from 2021: Morello is a prototype 7nm high-end multi-core superscalar ARMv8-A processor (based on Arm’s Neoverse N1), SoC, and board implementing experimental CHERI extensions. As part of this effort, the UK Engineering and Physical Sciences Research Council (EPSRC) has also announced a new £8M programme to fund UK academics to work with Morello. Arm will release their Morello adaptation of our CHERI Clang/LLVM toolchain, and we will release a full adaptation of our open-source CHERI reference software stack to Morello (including our CheriBSD operating system and application suite) as foundations for research and prototyping on Morello. For more information, watch the DSbD workshop talks from Robert Watson (Cambridge), Richard Grisenthwaite (Arm), and Manuel Costa (Microsoft) on CHERI and Morello, linked below.

This is an incredible opportunity to validate the CHERI approach, along with its accompanying systems software and formal verification, through an industrial-scale, industrial-quality hardware design, and to broaden the research community around CHERI to explore its potential impact. You can read the announcements about Morello on our CHERI DSbD web page (see below).

Recordings of several talks on CHERI and Morello are now available from the ISCF Digital Security by Design Challenge Collaborators’ Workshop (26 September 2019), including:

  • Robert Watson (Cambridge)’s talk on CHERI, and on our transition collaboration with Arm (video) (slides)
  • Richard Grisenthwaite (Arm)’s talk on the Morello board and CHERI transition (video) (slides)
  • Manuel Costa (Microsoft)’s talk on memory safety and potential opportunities arising with CHERI and Morello (video)

In addition, we are maintaining a CHERI DSbD web page with background information on CHERI, announcements regarding Morello, links to DSbD funding calls, and information regarding software artefacts, formal models, and so on. We will continue to update that page as the programme proceeds.

This has been possible through the contributions of the many members of the CHERI research team over the last ten years, including: Hesham Almatary, Jonathan Anderson, John Baldwin, Hadrien Barrel, Thomas Bauereiss, Ruslan Bukin, David Chisnall, James Clarke, Nirav Dave, Brooks Davis, Lawrence Esswood, Nathaniel W. Filardo, Khilan Gudka, Brett Gutstein, Alexandre Joannou, Robert Kovacsics, Ben Laurie, A. Theo Markettos, J. Edward Maste, Marno van der Maas, Alfredo Mazzinghi, Alan Mujumdar, Prashanth Mundkur, Steven J. Murdoch, Edward Napierala, Kyndylan Nienhuis, Robert Norton-Wright, Philip Paeps, Lucian Paul-Trifu, Alex Richardson, Michael Roe, Colin Rothwell, Peter Rugg, Hassen Saidi, Stacey Son, Domagoj Stolfa, Andrew Turner, Munraj Vadera, Jonathan Woodruff, Hongyan Xia, and Bjoern A. Zeeb.

Approved for public release; distribution is unlimited. This work was supported by the Defense Advanced Research Projects Agency (DARPA) and the Air Force Research Laboratory (AFRL), under contract FA8750-10-C-0237 (CTSRD), with additional support from FA8750-11-C-0249 (MRC2), HR0011-18-C-0016 (ECATS), and FA8650-18-C-7809 (CIFV) as part of the DARPA CRASH, MRC, and SSITH research programs. The views, opinions, and/or findings contained in this report are those of the authors and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government. We also acknowledge the EPSRC REMS Programme Grant (EP/K008528/1), the ERC ELVER Advanced Grant (789108), the Isaac Newton Trust, the UK Higher Education Innovation Fund (HEIF), Thales E-Security, Microsoft Research Cambridge, Arm Limited, Google, Google DeepMind, HP Enterprise, and the Gates Cambridge Trust.

End of privacy rights in the UK public sector?

There has already been serious controversy about the “Henry VIII” powers in the Brexit Bill, which will enable ministers to rewrite laws at their discretion as we leave the EU. Now Theresa May’s government has sneaked a new “Framework for data processing in government” into the Lords committee stage of the new Data Protection Bill (see pages 99–101, which are pp. 111–113 of the PDF). It will enable ministers to promulgate a Henry VIII privacy regulation with quite extraordinary properties.

It will cover all data held by any public body, including the NHS (175(1)); it will be outside the ICO’s jurisdiction (178(5)) and that of any tribunal (178(2)), including Judicial Review (175(4), 176(7)), wider human-rights law (178(2,3,4)), and international jurisdictions – although ministers are supposed to change it if they notice that it breaks any international treaty obligation (177(4)).

In fact it will be changeable on a whim by ministers (175(4)), have no effective Parliamentary oversight (175(6)), and apply retroactively (178(3)). It will also provide an automatic statutory defence for the data processing behind any government decision that is challenged in any tribunal or court (178(4)).

Ministers have had frequent fights in the past over personal data in the public sector, most often over medical records, which they have sold, again and again, to drug companies and others in defiance not just of UK law, EU law and human-rights law, but of the express wishes of patients, articulated by opting out of data “sharing”. In fact, we have to thank MedConfidential for being the first to notice the latest data grab. Their briefing gives more details and sets out the amendments we need to press for in Parliament. This is not the only awful thing about the bill by any means; its section 164 will be terrible news for journalists. This is one of those times when you need to write to your MP. Please do it now!

Inter-ACE national hacking competition today

Over 100 of the best students in cyber from the UK Academic Centres of Excellence in Cyber Security Research are gathered here at the University of Cambridge Computer Laboratory today for the second edition of our annual “Inter-ACE” hacking contest.

The competition is hosted on the CyberNEXS cyber-range of our sponsor Leidos, and involves earning points for hacking into each other’s machines while defending one’s own. It has grown substantially from last year’s edition: you can follow it live on Twitter (@InterACEcyber). At the time of writing, we still don’t know who is going to take home the trophy. Can you guess who will?

The event has been made possible thanks to generous support from the National Cyber Security Centre, the Cabinet Office, Leidos and NCC Group.

And the winners are…


The Inter-ACE Cyberchallenge on Saturday was fantastic. The event saw nearly twice as many competitors as the recent C2C competition in Boston, all engaged in solving the most artful challenges. It was great to see so many students interested in cyber security making the effort to travel from the four corners of the UK, a few from as far away as Belfast!

The competition was played out on a “Risk-style” world map, and competing teams had to fight each other for control of several countries, each protected by a fiendish puzzle. A number of universities had also submitted guest challenges, and it was great that so many teams got involved in this creative process too. To give one example, the Cambridge team had designed a challenge, protecting the country of Panama, based around a historically accurate Enigma machine: competitors had to brute-force the settings of the Enigma machine to decode a secret message. Other challenges were based around the core CTF subject areas of web application security, binary reverse engineering and exploitation, forensics, and crypto. Some novice teams may have struggled to compete, but they would have learned a lot, and hopefully developed an appetite for more competition. There were also plenty of teams present with advanced tool sets and a solid plan, and these preparations clearly paid off in the final scores.


Between the 10 teams, their coaches, the organisers and the reporters, the lab was bustling with excitement and that intense feeling of hackers “in the zone” for the whole afternoon.


I have nothing but praise for our partners Facebook, who worked hard on setting the challenges and making the CTF game run smoothly, as well as feeding the participants with pizza and providing prizes of hacking books and goodie bags.


The biggest thanks go to the ACE-CSRs, who enthusiastically supported this initiative despite the short notice. Forty students came to Cambridge to compete in the live event in teams of four, and another 40+ competed remotely in the individual event.

 

In retrospect we should have organised a “best T-shirt” competition. I especially liked Facebook’s t-shirts “Fix more, whine less” and “s/sleep/hack/g”, but the one I would have voted overall winner (despite not technically being a T-shirt) was Southampton’s Shakespearian Boolean logic.


It is with a mixture of pride and embarrassment that I announce the winners, as Cambridge won the gold in both the team and individual events.


Team event:

  • 1st place (Gold): University of Cambridge
    Stella Lau, Will Shackleton, Cheng Sun, Gábor Szarka
  • 2nd place (Silver): Imperial College London
    Matthieu Buffet, Jiarou Fan, Luke Granger-Brown, Antoine Vianey-Liaud
  • 3rd place (Bronze): University of Southampton
    Murray Colpman, Kier Davis, Yordan Ganchev, Mohit Gupta

 

Individual event:

  • 1st place (Gold): Dimitrije Erdeljan, University of Cambridge
  • 2nd place (Silver): Emma Espinosa, University of Oxford
  • 3rd place (Bronze): David Young, University of Southampton


I shall ignore allegations of having rigged the game, except to say that yes, we did train our students rather extensively in preparation for the previously mentioned Cambridge 2 Cambridge event with MIT. All of our winners are Cambridge undergraduates in computer science who had done well in the qualifiers for C2C. Two of them had actually been to Boston, where Gábor had been on the winning team overall and earned one gold and two silver medals, while Will (also a former UK Cyber Security Challenge winner) had earned one gold, one silver and two bronze medals. Well-deserved thanks also go to my modest but irreplaceable collaborator Graham Rymer, who designed and delivered an effective and up-to-date ethical hacking course to our volunteers. The Cambridge success in this weekend’s competition gives promising insights into the effectiveness of this training, which we are gearing up to offer to all our undergraduates and potentially to other interested audiences in the future.


We are once again grateful to everyone who took part. We are also grateful to the Cabinet Office, to EPSRC and to GCHQ for support that will allow us to keep the event running, and we hereby invite all the ACEs to sharpen their hacking tools for next year and come back to attempt to reconquer the trophy from us.

What do we mean by scale?

Online booking is now open for the annual Lovelace lecture, which I’m giving in London on March 15th. I’ll be using the event to try to develop a cross-cutting theme that arises from my research but is of wider interest. What do we mean by scale?

Back when computing was expensive, computer science started out worrying about how computations scaled – such as which sorting algorithms ran as n², n log n or even n. Once software development became the bottleneck, we started off by trying the same approach (measuring code entropy or control-graph complexity), but moved on to distinguishing which types of complexity could be dealt with by suitable tools. Mass-market computing and communications brought network effects, and scalability started to depend on context (this is where security economics came in). Now, as we move to “Big Data”, the dependency on people becomes more explicit. Few people have stopped to think of human factors in scaling terms. Do we make information about many people available to many, or to few? What about the complexity of the things that can be done with personal data? What about costs now versus in the future, and the elasticity of demand associated with such costs? Do you just count the data subjects, do you count the attackers too, or do you add the cops as well?
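
To make that first, purely computational sense of scale concrete, here is a small C sketch (mine, added for illustration rather than taken from the lecture) that counts element comparisons for a quadratic insertion sort against the C library’s qsort(): each doubling of n roughly quadruples the former but only slightly more than doubles the latter.

    /*
     * Illustrative sketch: comparison counts for an O(n^2) insertion sort
     * versus the C library's qsort() (typically about n log n) on random
     * input. Doubling n roughly quadruples the former but only a bit more
     * than doubles the latter.
     */
    #include <stdio.h>
    #include <stdlib.h>

    static long comparisons;

    static void insertion_sort(int *a, size_t n)
    {
        for (size_t i = 1; i < n; i++) {
            int key = a[i];
            size_t j = i;
            /* Each evaluation of a[j-1] > key is one comparison. */
            while (j > 0 && (comparisons++, a[j - 1] > key)) {
                a[j] = a[j - 1];
                j--;
            }
            a[j] = key;
        }
    }

    static int cmp_int(const void *p, const void *q)
    {
        comparisons++;
        int x = *(const int *)p, y = *(const int *)q;
        return (x > y) - (x < y);
    }

    int main(void)
    {
        for (size_t n = 1000; n <= 16000; n *= 2) {
            int *a = malloc(n * sizeof *a);
            int *b = malloc(n * sizeof *b);
            if (a == NULL || b == NULL)
                return 1;
            for (size_t i = 0; i < n; i++)
                a[i] = b[i] = rand();

            comparisons = 0;
            insertion_sort(a, n);
            long ins = comparisons;

            comparisons = 0;
            qsort(b, n, sizeof *b, cmp_int);
            long qs = comparisons;

            printf("n=%6zu  insertion sort=%11ld  qsort=%9ld\n", n, ins, qs);
            free(a);
            free(b);
        }
        return 0;
    }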

I’ve been quoted as saying “You can have security, functionality, scale – choose any two” or words to that effect. I’ll discuss this and try to sketch the likely boundaries, as well as future research directions. The discussion will cross over from science and engineering to economics and politics; recent proposed legislation in the UK, and court cases in the USA, would impose compliance burdens on people trying to scale systems up from one country to many.

The students we’re training to be the next generation of developers and entrepreneurs will need a broader understanding of what’s involved in scaling systems up, and in this talk I’ll try to explore what that means. Maybe I’m taking a risk with this talk, as I’m trying to assemble into a row a lot of facts that are usually found in different columns. But I do hope it will not be boring.

Commercialising academic research

At the 2014 annual conference of the Academic Centres of Excellence in Cyber-Security Research I was invited to give a talk on commercialising research from the viewpoint of an academic. I did that by distilling the wisdom and experience of five of my Cambridge colleagues who had started a company (or several). The talk was well received at the conference and may be instructive both for academics with entrepreneurial ambitions and for other universities that aspire to replicate the “Cambridge phenomenon” elsewhere.


A recording of the presentation, Commercialising research: the academic’s perspective, is available from Frank Stajano Explains on Vimeo.

Economics of Cybersecurity MOOC

Security economics is a thriving research discipline, kicked off in 2001 with Ross Anderson’s seminal paper. There has been an annual workshop since 2002. In recent years there has also been an effort to integrate some of the key concepts and findings into course curricula, including in the Part II Security course at Cambridge and my own course at SMU.

We are pleased to announce that on 20 January 2015 we will launch an online course on the Economics of Cybersecurity, as part of edX Professional Education. The course provides a thorough introduction to the field, delivered by leading researchers from Delft University of Technology, the University of Cambridge, the University of Münster and Southern Methodist University.

Health privacy: not fixed yet

I have written a letter to Stephen Dorrell, the chair of the Health Committee, to point out how officials appear to have misled his committee when they gave evidence there on Tuesday.

It is very welcome that the Health Secretary, Jeremy Hunt, announced he will change the law to ban the sale of our medical records collected via HES and care.data. He acted after it became clear that although officials told the Health Committee that our records collected via care.data could not legally be sold, records collected via a different system (HES) already had been. But that is not all.

Officials also said our records would not be sold abroad, and that only coded data would be extracted rather than free text entered by GPs during consultations. Yet our records were offered for sale in the USA; the Department signed a memorandum of understanding with the USA on data sharing; and CPRD (a system operated by MHRA, the regulator) has been supplying free text for mining.

I also sent Mr Dorrell a previously unpublished briefing I wrote for the European Commission last year on the potential harm that can follow if patients lose confidence in confidentiality. Evidence from the USA and elsewhere suggests strongly that tens of thousands of people would seek treatment late, or not at all.