
LDAP based UDP reflection attacks increase throughout 2017

There have been reports that UDP reflection DDoS attacks based on LDAP (aka CLDAP) have been increasing in recent months. Our network of UDP honeypots (described previously) confirms that this is the case. We estimate there are around 6000 attacks per day using this method. Our estimated number of attacks has risen fairly linearly from almost none at the beginning of 2017 to 5000-7000 per day at the beginning of 2018.
[Figure: number of attacks per day, rising roughly linearly from almost none at the beginning of 2017 to 5,000-7,000 per day at the beginning of 2018]

Over the 365 days up to 2017-11-01, the period in which Netlab observed 304,146 attacks, we observed 596,534. The difference may be because we detect smaller attacks, or because we count an attack on an IP prefix more than once.
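To illustrate the measurement technique, here is a minimal sketch (not the Centre's actual honeypot code; the threshold and window values are illustrative assumptions) of how a UDP honeypot can infer reflection attacks: because the queries carry a spoofed source address pointing at the victim, a burst of CLDAP queries apparently "from" one IP indicates an attack on that IP.

```python
import socket, time
from collections import defaultdict

THRESHOLD = 5   # queries from one source before we flag an attack (assumed)
WINDOW = 60     # sliding window in seconds (assumed)

def detect_attacks(packets, threshold=THRESHOLD, window=WINDOW):
    """packets: iterable of (timestamp, source_ip) tuples.
    Returns the set of presumed victim IPs: spoofed sources seen
    `threshold` or more times within `window` seconds."""
    seen = defaultdict(list)
    victims = set()
    for ts, src in packets:
        # keep only timestamps inside the sliding window
        seen[src] = [t for t in seen[src] if ts - t <= window] + [ts]
        if len(seen[src]) >= threshold:
            victims.add(src)
    return victims

def run_honeypot():
    """Bind UDP/389 (the CLDAP port) and log observed packets.
    The honeypot never replies, so it cannot itself be abused
    as a reflector."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 389))
    while True:
        _data, (src, _port) = sock.recvfrom(4096)
        print(time.time(), src)
```

A real deployment would additionally parse the LDAP searchRequest to distinguish reflection probes from scanner noise, but the flagging logic above captures the essential idea.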

The data behind this analysis is part of the Cambridge Cybercrime Centre’s catalogue of data available to academic researchers.

End of privacy rights in the UK public sector?

There has already been serious controversy about the “Henry VIII” powers in the Brexit Bill, which will enable ministers to rewrite laws at their discretion as we leave the EU. Now Theresa May’s government has sneaked a new “Framework for data processing in government” into the Lords committee stage of the new Data Protection Bill (see pages 99-101, which are pp 111-113 of the PDF). It will enable ministers to promulgate a Henry VIII privacy regulation with quite extraordinary properties.

It will cover all data held by any public body including the NHS (175(1)), be outside of the ICO’s jurisdiction (178(5)) and that of any tribunal (178(2)) including Judicial Review (175(4), 176(7)), wider human-rights law (178(2,3,4)), and international jurisdictions – although ministers are supposed to change it if they notice that it breaks any international treaty obligation (177(4)).

In fact it will be changeable on a whim by Ministers (175(4)), have no effective Parliamentary oversight (175(6)), and apply retroactively (178(3)). It will also provide an automatic statutory defence for any data processing in any Government decision taken to any tribunal or court (178(4)).

Ministers have had frequent fights in the past over personal data in the public sector, most often over medical records, which they have sold, again and again, to drug companies and others in defiance not just of UK law, EU law and human-rights law, but of the express wishes of patients, articulated by opting out of data “sharing”. In fact, we have to thank MedConfidential for being the first to notice the latest data grab. Their briefing gives more details and sets out the amendments we need to press for in Parliament. This is not the only awful thing about the bill by any means; its section 164 will be terrible news for journalists. This is one of those times when you need to write to your MP. Please do it now!

Inter-ACE national hacking competition today

Over 100 of the best students in cyber from the UK Academic Centres of Excellence in Cyber Security Research are gathered here at the University of Cambridge Computer Laboratory today for the second edition of our annual “Inter-ACE” hacking contest.

The competition is hosted on the CyberNEXS cyber-range of our sponsor Leidos, and involves earning points for hacking into each other’s machines while defending one’s own. The competition has grown substantially from last year’s: you can follow it live on Twitter (@InterACEcyber). At the time of writing, we still don’t know who is going to take home the trophy. Can you guess who will?

The event has been made possible thanks to generous support from the National Cyber Security Centre, the Cabinet Office, Leidos and NCC Group.

And the winners are…


The Inter-ACE Cyberchallenge on Saturday was fantastic. The event saw nearly twice as many competitors as attended the C2C competition in Boston recently, engaged in solving the most artful challenges. It was great to see so many students interested in cyber security making the effort to travel from the four corners of the UK, a few from as far away as Belfast!

The competition was played out on a “Risk-style” world map, and competing teams had to fight each other for control of several countries, each protected by a fiendish puzzle. A number of universities had also submitted guest challenges, and it was great that so many teams got involved in this creative process too. To give one example, the Cambridge team had designed a challenge based around a historically accurate Enigma machine, protecting the country of Panama: competitors had to brute-force the machine’s settings to decode a secret message. Other challenges were based around the core CTF subject areas of web application security, binary reverse engineering and exploitation, forensics, and crypto. Some novice teams may have struggled to compete, but they will have learned a lot, and hopefully developed an appetite for more competition. There were also plenty of teams present with advanced tool sets and a solid plan, and these preparations clearly paid off in the final scores.
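The Panama challenge was not published, but the brute-force approach it called for can be sketched. The toy model below is hypothetical: a simplified 3-rotor Enigma using the historical rotor I-III and reflector B wirings, with no plugboard or ring settings and simple odometer stepping rather than the real double-step. Because Enigma is self-inverse, trying all 26³ rotor start positions and checking for a known crib recovers the key.

```python
from itertools import product
from string import ascii_uppercase as A

# Historical wirings: rotors I, II, III and reflector B (no plugboard here).
ROTORS = ["EKMFLGDQVZNTOWYHXUSPAIBRCJ",
          "AJDKSIRUXBLHWTMCQGZNPYFVOE",
          "BDFHJLCPRTXVZNYEIWGAKMUSQO"]
REFLECTOR = "YRUHQSLDPXNGOKMIEBFZCWVJAT"

def enigma(text, key):
    """Encrypt/decrypt `text` (A-Z only) with rotor start positions `key`
    (a 3-tuple of 0-25). The machine is self-inverse, so the same call
    decrypts its own output."""
    pos = list(key)
    out = []
    for ch in text:
        pos[2] = (pos[2] + 1) % 26           # step fast rotor before each letter
        if pos[2] == 0:
            pos[1] = (pos[1] + 1) % 26       # simple carry (no double-step)
            if pos[1] == 0:
                pos[0] = (pos[0] + 1) % 26
        c = A.index(ch)
        for i in (2, 1, 0):                  # right to left through the rotors
            c = (A.index(ROTORS[i][(c + pos[i]) % 26]) - pos[i]) % 26
        c = A.index(REFLECTOR[c])
        for i in (0, 1, 2):                  # back through the rotors, inverted
            c = (ROTORS[i].index(A[(c + pos[i]) % 26]) - pos[i]) % 26
        out.append(A[c])
    return "".join(out)

def brute_force(ciphertext, crib):
    """Try all 26^3 rotor start positions; return the first key whose
    decryption contains the expected crib text."""
    for key in product(range(26), repeat=3):
        if crib in enigma(ciphertext, key):
            return key
    return None
```

With a plugboard the key space would be far too large for this naive search, which is why the real challenge (like the wartime codebreakers) hinged on cribs and the machine's structural weaknesses rather than raw enumeration.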


Between the 10 teams, their coaches, the organisers and the reporters, the lab was bustling with excitement and that intense feeling of hackers “in the zone” for the whole afternoon.


I have nothing but praise for our partners Facebook, who worked hard on setting the challenges and making the CTF game run smoothly, as well as feeding the participants with pizza and providing prizes of hacking books and goodie bags.


The biggest thanks go to the ACE-CSRs who enthusiastically supported this initiative despite the short notice. 40 students came to Cambridge to compete in the live event in teams of 4, and another 40+ competed remotely in the individual event.


In retrospect we should have organised a “best T-shirt” competition. I especially liked the Facebook T-shirts “Fix more, whine less” and “s/sleep/hack/g”, but the one I would have voted overall winner (despite not technically being a T-shirt) was Southampton’s Shakespearian boolean logic.


It is with a mixture of pride and embarrassment that I announce the winners, as Cambridge won the gold in both the team and individual events.


Team event:

  • 1st place (Gold): University of Cambridge
    Stella Lau, Will Shackleton, Cheng Sun, Gábor Szarka
  • 2nd place (Silver): Imperial College London
    Matthieu Buffet, Jiarou Fan, Luke Granger-Brown, Antoine Vianey-Liaud
  • 3rd place (Bronze): University of Southampton
    Murray Colpman, Kier Davis, Yordan Ganchev, Mohit Gupta


Individual event:

  • 1st place (Gold): Dimitrije Erdeljan, University of Cambridge
  • 2nd place (Silver): Emma Espinosa, University of Oxford
  • 3rd place (Bronze): David Young, University of Southampton


I shall ignore allegations of having rigged the game except to say that yes, we did train our students rather extensively in preparation for the previously-mentioned Cambridge 2 Cambridge event with MIT. All of our winners are Cambridge undergraduates in computer science who had done well in the qualifiers for C2C. Two of them had actually been to Boston, where Gábor had been on the winning team overall and earned one gold and two silver medals, while Will (also a former UK Cyber Security Challenge winner) had earned one gold, one silver and two bronze medals. Well deserved thanks also to my modest but irreplaceable collaborator Graham Rymer, who designed and delivered an effective and up-to-date ethical hacking course to our volunteers. The Cambridge success in this weekend’s competition gives promising insights into the effectiveness of this training, which we are gearing up to offer to all our undergraduates and potentially to other interested audiences in the future.


We are once again grateful to everyone who took part. We are also grateful to the Cabinet Office, to EPSRC and to GCHQ for support that will allow us to keep the event running and we hereby invite all the ACEs to sharpen their hacking tools for next year and come back to attempt to reconquer the trophy from us.

What do we mean by scale?

Online booking is now open for the annual Lovelace lecture, which I’m giving in London on March 15th. I’ll be using the event to try to develop a cross-cutting theme that arises from my research but is of wider interest. What do we mean by scale?

Back when computing was expensive, computer science started out worrying about how computations scaled – such as which sorting algorithms ran as n², n log n or even n. Once software development became the bottleneck, we started off by trying the same approach (measuring code entropy or control graph complexity), but moved on to distinguishing what types of complexity could be dealt with by suitable tools. Mass-market computing and communications brought network effects, and scalability started to depend on context (this is where security economics came in). Now, as we move to “Big Data”, the dependency on people becomes more explicit. Few people have stopped to think of human factors in scaling terms. Do we make information about many people available to many, or to few? What about the complexity of the things that can be done with personal data? What about costs now versus in the future, and the elasticity of demand associated with such costs? Do you just count the data subjects, do you count the attackers too, or do you add the cops as well?
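The classic computational side of that story is easy to demonstrate empirically. This small sketch (mine, not from the lecture) counts comparisons for an O(n²) insertion sort against an O(n log n) merge sort: doubling n roughly quadruples the former but little more than doubles the latter.

```python
import random

def insertion_sort_comparisons(xs):
    """Sort a copy of xs by insertion sort; return the comparison count."""
    xs, count = list(xs), 0
    for i in range(1, len(xs)):
        j = i
        while j > 0:
            count += 1
            if xs[j - 1] <= xs[j]:
                break
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return count

def merge_sort_comparisons(xs):
    """Return (sorted list, comparison count) using merge sort."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, cl = merge_sort_comparisons(xs[:mid])
    right, cr = merge_sort_comparisons(xs[mid:])
    merged, count, i, j = [], cl + cr, 0, 0
    while i < len(left) and j < len(right):
        count += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged += left[i:] + right[j:]
    return merged, count

random.seed(0)
for n in (500, 1000, 2000):
    data = [random.random() for _ in range(n)]
    print(n, insertion_sort_comparisons(data), merge_sort_comparisons(data)[1])
```

The point of the lecture is that once people, attackers and regulators enter the system, no such tidy growth curve exists – which is exactly why the human-factors questions above are harder.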

I’ve been quoted as saying “You can have security, functionality, scale – choose any two” or words to that effect. I’ll discuss this and try to sketch the likely boundaries, as well as future research directions. The discussion will cross over from science and engineering to economics and politics; recent proposed legislation in the UK, and court cases in the USA, would impose compliance burdens on people trying to scale systems up from one country to many.

The students we’re training to be the next generation of developers and entrepreneurs will need a broader understanding of what’s involved in scaling systems up, and in this talk I’ll try to explore what that means. Maybe I’m taking a risk with this talk, as I’m trying to assemble into a row a lot of facts that are usually found in different columns. But I do hope it will not be boring.

Commercialising academic research

At the 2014 annual conference of the Academic Centres of Excellence in Cyber-Security Research I was invited to give a talk on commercialising research from the viewpoint of an academic. I did that by distilling the wisdom and experience of five of my Cambridge colleagues who had started a company (or several). The talk was well received at the conference and may be instructive both for academics with entrepreneurial ambitions and for other universities that aspire to replicate the “Cambridge phenomenon” elsewhere.


A recording of the presentation, “Commercialising research: the academic’s perspective”, is available on Vimeo via the Frank Stajano Explains channel.

Economics of Cybersecurity MOOC

Security economics is a thriving research discipline, kicked off in 2001 with Ross Anderson’s seminal paper. There has been an annual workshop since 2002. In recent years there has also been an effort to integrate some of the key concepts and findings into course curricula, including in the Part II Security course at Cambridge and my own course at SMU.

We are pleased to announce that on 20 January 2015, we will launch an online course on the Economics of Cybersecurity, as part of edX Professional Education. The course provides a thorough introduction to the field, delivered by leading researchers from Delft University of Technology, University of Cambridge, University of Münster and Southern Methodist University.

Health privacy: not fixed yet

I have written a letter to Stephen Dorrell, the chair of the Health Committee, to point out how officials appear to have misled his committee when they gave evidence there on Tuesday.

It is very welcome that the Health Secretary, Jeremy Hunt, announced he will change the law to ban the sale of our medical records collected via HES and care.data. He acted after it became clear that although officials told the Health Committee that our records collected via care.data could not legally be sold, records collected via a different system (HES) already had been. But that is not all.

Officials also said our records would not be sold abroad, and that only coded data would be extracted rather than free text entered by GPs during consultations. Yet our records were offered for sale in the USA; the Department signed a memorandum of understanding with the USA on data sharing; and CPRD (a system operated by MHRA, the regulator) has been supplying free text for mining.

I also sent Mr Dorrell a previously unpublished briefing I wrote for the European Commission last year on the potential harm that can follow if patients lose confidence in confidentiality. Evidence from the USA and elsewhere suggests strongly that tens of thousands of people would seek treatment late, or not at all.