Welcome to the UK Honeynet Project

The UK Honeynet Project (a Chapter of The Honeynet Project) was founded in 2002 as a volunteer not-for-profit research organisation. Our aim is to provide information about the security threats and vulnerabilities active in the wild on UK networks today, to learn the tools, tactics, and motives of the blackhat community, and to share these lessons with the public and the wider IT community. The project contributes to a wider honeynet community of teams researching IT security around the globe.

UK Honeynet Project Chapter Annual Status Report For 2011/2012

19:41, December 4th, 2012 by admin

As part of membership requirements, each year, all chapters of the Honeynet Project must post annual reports that detail what their chapter members have been working on during that period. The reporting period got a bit mixed up recently, so this is the UK Chapter’s annual report for both 2011 and 2012. You can find the status reports for other Chapters on the main Honeynet Project website.


Current UK Chapter members are:

David Watson – Full member, Chapter Lead, Honeynet Project Chief Research Officer
Arthur Clune – Full member
Jamie Riden – Full member
Steve Mumford – Alumni member

As you may have noticed from the lack of recent updates to our UK Chapter blog, during this period our members have mostly either been involved in activities under the core Honeynet Project, rather than UK-specific chapter activities, or have been busy with their personal and professional lives and so have had limited time to contribute here. That has unfortunately reduced public-facing UK Chapter activity to its lowest point in many years.

We have had a number of membership inquiries during this period, and potentially could increase our chapter membership, but to be honest, we have avoided bringing in new UK Chapter members whilst UK activity levels were low and no-one had the time to adequately support new members. Hopefully that situation will improve in 2013 and we’ll see increased UK Chapter output once again.


During this period we have had a mix of honeynet technologies deployed. Some have been part of long term data collection efforts, whilst others have been shorter term deployments – often for testing of new tools.

Long term deployments:

1) [David] Our version 1 HonEeeBox pre-packaged (Nepenthes) low interaction sensor project was active at the start of this reporting period, but has since switched over to the version 2 HonEeeBox system. Although the version 1 system is no longer being maintained, two of the original HonEeeBox v1 sensors are still running, and for reference the total amount of data collected to date by the old system is:

Sensors: 43

Total Attacks: 2,401,582

Total Attacker IPs: 36,632

Total Victim IPs: 214

Total MD5sums: 4,665

Total malicious binary size: 559 Mbytes

2) [David] Like the v1 Nepenthes based HonEeeBoxes, the first releases of the Dionaea powered HonEeeBox v2 system initially submitted data to a submit_http backend, which was developed during GSoC 2011. We have run a cloud hosted instance of that old backend, plus a couple of sensors, for most of this period. The data has only been retained for historical purposes.

3) [David] Later v2 Dionaea based HonEeeBoxes were HPFeeds-enabled, and we have been submitting data to the Honeynet Project’s shared HPFeeds system from multiple physical and virtual sensors since it went live. These are a mix of Asus EeePC based physical HonEeeBoxes on domestic ADSL/FTTC lines and cloud provider hosted VM instances. Current rough volumes of Dionaea events captured through HPFeeds to date are:

Sensors: 44

Total Attacks: 14,552,708

Total Attacker IPs: 300,451

Total Victim IPs: 2,410

Total MD5sums: 7,865

Total malicious binary size: 2.6 Gbytes

Data and binary samples collected from each of the above systems were shared with the Shadowserver Foundation and VirusTotal, for automated AV and sandbox analysis, and hopefully eventual remediation of infected hosts. Enriched data has also been logged locally in an instance of the GSoC 2012 HonEeeBox backend project, which we hope to continue developing with the student Gyoergy in 2013. Longer term we hope to be able to expand the number of sensors to 100+ and release public visualizations of these attacks.

4) Jamie has recently deployed a couple of local HPFeeds-enabled Dionaea sensors too, which are also feeding the main Honeynet Project shared HPFeeds instance.

5) During the start of this period David was still running a legacy Global Distributed Honeynet (GDH2) high interaction sensor node on a domestic DSL connection (since disabled). That included a Honeywall plus a mix of low and high interaction honeypots, mostly on Linux.

6) At points during this period, David ran a mix of Capture-HPC high interaction client honeypots, HoneySpiderNetwork low/high interaction client honeypots, and PhoneyC and Thug low interaction client honeypots.

7) David has helped provide the infrastructure used by other Project members in various botnet related studies and takedown activities. More information about these activities will hopefully be made public eventually.
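The sensor summaries above (total attacks, unique attacker IPs, unique MD5sums) can be reproduced from raw event logs with standard shell tools. A minimal sketch, assuming a hypothetical CSV export with one attack event per line; the field layout here is illustrative only, not Dionaea's actual output format:

```shell
#!/bin/sh
# Derive summary totals from a hypothetical per-event CSV export:
# timestamp,sensor_id,attacker_ip,victim_ip,md5
EVENTS=/tmp/honeeebox_events.csv
cat > "$EVENTS" <<'EOF'
2012-11-01T00:00:01,sensor01,198.51.100.7,203.0.113.5,d41d8cd98f00b204e9800998ecf8427e
2012-11-01T00:00:09,sensor02,198.51.100.7,203.0.113.6,d41d8cd98f00b204e9800998ecf8427e
2012-11-01T00:01:30,sensor01,192.0.2.44,203.0.113.5,9e107d9d372bb6826bd81d3542a419d6
EOF

echo "Total Attacks:      $(wc -l < "$EVENTS" | tr -d ' ')"
echo "Total Attacker IPs: $(cut -d, -f3 "$EVENTS" | sort -u | wc -l | tr -d ' ')"
echo "Total MD5sums:      $(cut -d, -f5 "$EVENTS" | sort -u | wc -l | tr -d ' ')"
```

With the three sample events above this reports 3 attacks, 2 attacker IPs and 2 MD5sums; the published totals are the same aggregation run over millions of rows.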


During this period we built or worked on the following tools:

1) [David] HonEeeBox pre-packaged low interaction honeypot sensor system and associated back/front ends. We hope to continue this development work in 2013, increasing the number of sensors, adding low interaction SSH honeypot capabilities through Kippo, adding options for centralized monitoring and management, and perhaps including proxy/client honeypot elements too.

2) David and Arthur were mentors for GSoC 2011/2012 on HonEeeBox backend and front end development, which we also hope to continue in the future, eventually releasing a public Django/JS based user interface to replace the previous private ExtJS based HonEeeBox v1 prototype interface.

3) Minor support for our Honeysnap tool, when end user requests or bug reports were received.

4) We have tried to provide suggestions for improving some existing tools or adding new features to other projects, such as the excellent Cuckoo Sandbox or the aging Honeywall system.

For our current R&D activities:

1) David built a number of data visualization tools based on Processing.org, but didn’t get around to publicly releasing them. He very much hopes to rectify this failing in 2013 😉

2) Arthur is currently working on a pastebin scraping system, which will hopefully generate some interesting data for future analysis.

3) David has recently been working on spam pots with CERT.BR and the Shadowserver Foundation, which will become part of a larger scale distributed honeypot effort in 2013.

4) David has some ideas for next generation honeynet data capture systems and is currently exploring them. He will share concepts and prototypes with members, and then the public, at a suitable point.

5) Earlier in 2012 David ported the HonEeeBox system to the Raspberry Pi platform, to potentially provide another very low cost means of distributing low interaction honeypot sensor systems. He will attempt to blog this information and release a disk image here in the next few days. Apologies to anyone waiting to use it for the delay! 😉
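To give a flavour of what item 2's pastebin scraper involves, the post-processing half can be sketched in a few lines of shell. This is purely illustrative and not Arthur's actual implementation: it assumes pastes have already been fetched to local files (the fetch side depends on each site's API and rate limits), and the indicator patterns are examples only:

```shell
#!/bin/sh
# Illustrative triage pass: list already-fetched paste files that contain
# indicators worth keeping (credential dumps, private keys, MD5-like hashes).
PASTE_DIR=${1:-./pastes}
mkdir -p "$PASTE_DIR"
grep -liE 'password|BEGIN RSA PRIVATE KEY|[0-9a-f]{32}' "$PASTE_DIR"/* 2>/dev/null || true
```

Matching files would then be archived for later analysis rather than discarded with the noise.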

In general, we are still interested in large scale distributed honeynet sensor deployments and the tools necessary to store/manage/automate/visualize collected data. We would also like to see the ongoing development of high interaction honeypot technologies, or next generation alternatives for gathering such data. We’d like to continue to collaborate with anyone interested in the same goals, and to perhaps also run some more UK-focused future activities too.


Unfortunately there is nothing to be shared with the public at this time, except the observation that running public internet facing low interaction honeypots to detect network spreading malware generally only results in a lot of Conficker samples!


Since our last chapter status report, recent speaking engagements for David were:

September 2010 – Hands-on honeynet training classes for CNCERT/CC and FIRST TC, Beijing CN
September 2010 – Whats New In Honeynets presentation CNCERT/CC and FIRST TC, Beijing CN
November 2010 – GovCERT.NL security symposium, Rotterdam NL
December 2010 – 2 weeks of teaching hands-on honeynet classes, giving presentations, attending meetings, etc in Tokyo for NTT CERT, NTT, Hitachi, Nippon CSIRT Association (NCA), SECOND group, JP CERT, various national ISPs, etc.
January 2011 – BBC NewsNight Cyber Attacks
March 2011 – Honeynet Project annual workshop, Paris FR. Public R&D overview presentation, private P1 research, GSoC, HonEeeBox and Shadowserver presentations/sessions.
March 2011 – October 2011 – Organisational administrator for Honeynet Project Google Summer of Code 2011 and student project mentor.
June 2011 – CERT.EE Security Symposium, Tallinn, EE.
October 2011 –  Google Summer of Code Mentor’s Summit, Google CA.
February 2012 –  Shadowserver Foundation annual workshop, San Jose, CA. Presentation on Honeynet R&D and GSoC.
March 2012 – Honeynet Project annual workshop, Facebook, CA. Public hands-on honeynet training class, public R&D overview presentation, private P1 research, GSoC and HonEeeBox presentations/sessions.
March 2012 – October 2012 – Organisational administrator for Honeynet Project Google Summer of Code 2012 and student project mentor
June 2012 –   CERT.EE Security Symposium, Tallinn, EE. Presentation on recent honeynet R&D.
September 2012 – conference at Interpol, Lyon, FR.
October 2012 –  Google Summer of Code Mentor’s Summit, Google CA.

David will be teaching a 2-day hands-on honeynets class at the Honeynet Project’s next annual workshop in Dubai (which should be another great international event if you are interested in the cutting edge of honeynet R&D, so please check it out!), along with hopefully leading discussions again during private workshop events on honeynet R&D, GSoC and HonEeeBox, amongst others.

UK Chapter members have also attended UK-specific industry events such as Infosec UK and JANET meetings. Jamie presented at OWASP Birmingham in September and OWASP Edinburgh in November.

We continue to be active on both internal and external IRC and email, although UK-specific blogging activity has been poor. Chapter members have been involved in various Honeynet Project committee mailing lists, such as annual workshop organization, membership committee and infrastructure support. Members also individually participate in various other open or closed info-sec vetted communities too.


Since most activity by UK Chapter members was general Honeynet Project activity, we would like to continue to remain active members but also try to increase UK-specific activity.

We would like to see the recent GSoC work on HonEeeBox sensor back/front ends result in a public UI release.

We would like to release some interesting visualisations of existing data sets, then try and engage the wider infosec and data visualisation communities on how best to improve them. We may try and run a series of public Data Visualisation challenges in 2013.


Other activities that our Chapter members have been involved in during this period:

David was a Director of the Honeynet Project in 2011 and remains the Chief Research Officer (CRO). He was involved in various fund raising efforts and proposals (some under NDA), some of which resulted in additional financial support for the Honeynet Project’s annual workshops in 2012 and 2013, and some of which are ongoing.

David collaborated on an EPSRC network proposal with the Queen’s University Belfast Information Security Centre.


David was a GSoC student project mentor in 2011 and 2012 (and GSoC Org admin), Jamie was a student project mentor in 2010 and 2012. Arthur was a GSoC student project mentor in 2012 and helped with student selection in 2011.

GSoC 2012 project

13:07, November 12th, 2012 by arthur

As part of the Google Summer of Code, the UK Honeynet Project ran a project with Gyöergy Kohut from the University of Dortmund to produce a web front end for HonEeeBox. It went well: Gyöergy produced a Java backend which took events and stored them in a PostgreSQL database, plus a web front end based on Django and JavaScript.

The GSOC project has now finished, but we’re continuing to work on the project.

Returning to life

22:27, February 20th, 2011 by arthur

There’s been a long hiatus in blogging on this site. We’ve not stopped working, just blogging. We’ll aim to have content on here a little more regularly from now on but with a slight change of emphasis. Up till now we’ve only posted notes on things we were doing ourselves. Now we’ll broaden it out a little to include general commentary on the InfoSec world and current news.

Hopefully this will both make this site a more general resource and allow us to blog more frequently. Tools aren’t updated that often (and some that are, we can’t blog about), but the joy of InfoSec is that there is always something new happening.

Compiling Capture-HPC on VMWare Server 1.0.6

17:00, July 28th, 2008 by david

We often use Capture-HPC as a high interaction client honeypot for analyzing suspect URLs, but getting it up and running on a new platform can sometimes be a somewhat frustrating and time consuming process. I’ve recently had to repeat the build process on the latest version of VMWare Server (release 1.0.6 build-91891) running on Ubuntu Gutsy, so in case this saves anyone else some pain, this is what I had to do to make it work:

1) Download the latest sources (at the time of writing this was capture-server-2.1.0-300-src.zip)

2) Extract the latest sources

unzip capture-server-2.1.0-300-src.zip
cd capture-server-2.1.0-300-src

3) Ensure the necessary build dependencies are installed

sudo aptitude update ; sudo aptitude install ant ant-optional sun-java6-jdk sun-java6-bin sun-java6-jre
# plus VMware Server 1.0.6 (build-91891), installed separately via VMware's own installer

4) Set the correct environment variables

  JAVA_HOME=/usr/lib/jvm/java-6-sun ; export JAVA_HOME
  VIX_HOME=/usr/lib/vmware-vix/ ; export VIX_HOME
  VIX_INCLUDE=/usr/include/vmware-vix/ ; export VIX_INCLUDE
  VIX_LIB=/usr/lib/vmware-vix/ ; export VIX_LIB
  ANT_HOME=/usr/share/ant/ ; export ANT_HOME

5) Hack the revert compilation shell script so that it links against the copy of libvmware-vix.so that actually exists on disk (the commented-out line shows the original):

chmod +x compile_revert_linux.sh
cat compile_revert_linux.sh
#gcc -I $VIX_INCLUDE -o revert revert.c $VIX_LIB/libvmware-vix.so
gcc -I $VIX_INCLUDE -o revert revert.c /usr/lib/libvmware-vix.so

6) Remove any of the logic from build.xml that refers to the Windows OS branch:

vi build.xml
<?xml version="1.0"?>
<project name="CaptureServer" default="release" basedir=".">
    <!-- all stuff to get the jni wrapper compiled -->
    <taskdef resource="net/sf/antcontrib/antcontrib.properties"/>

    <condition property="os" value="unix">
        <os family="unix"/>
    </condition>

    <property environment="env"/>
    <property name="src" value="."/>
    <property name="build" value="build"/>
    <property name="release" value="release"/>

    <target name="init">
        <mkdir dir="${build}"/>
        <mkdir dir="${release}"/>
    </target>

    <target name="compile" depends="init">
        <!-- Compile the java code -->
        <javac srcdir="${src}" destdir="${build}" debug="true" debuglevel="lines,vars,source"/>

        <!-- Compile the revert code -->
        <exec command="sh" executable="./compile_revert_linux.sh"/>
    </target>

    <target name="jar" depends="compile">
        <mkdir dir="${build}/jar"/>
        <jar destfile="${build}/jar/CaptureServer.jar" basedir="${build}">
            <manifest>
                <attribute name="Main-Class" value="capture.Server"/>
            </manifest>
        </jar>
    </target>

    <target name="release" depends="clean,compile,jar">
        <copy file="${build}/jar/CaptureServer.jar" todir="${release}"/>
        <copy file="./COPYING" todir="${release}"/>
        <copy file="./Readme.txt" todir="${release}"/>
        <copy file="./input_urls_example.txt" todir="${release}"/>
        <copy file="./config.xsd" todir="${release}"/>
        <copy file="./config.xml" todir="${release}"/>

        <exec executable="cp">
            <arg value="./revert"/>
            <arg value="${release}"/>
        </exec>

        <zip destfile="./CaptureServer-Release.zip" basedir="release"/>
    </target>

    <target name="clean">
        <delete dir="${build}"/>
        <delete dir="${release}"/>
        <delete>
            <fileset dir="." includes="revert.exe"/>
            <fileset dir="." includes="revert"/>
            <fileset dir="." includes="CaptureServer-Release.zip"/>
        </delete>
    </target>
</project>

7) Compile the Capture Server by running ant:

Buildfile: build.xml
  [taskdef] Could not load definitions from resource net/sf/antcontrib/antcontrib.properties. It could not be found.

   [delete] Deleting directory /home/david/client_honeypots/capture-server-2.1.0-300-src/build
   [delete] Deleting directory /home/david/client_honeypots/capture-server-2.1.0-300-src/release

    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/build
    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/release

    [javac] Compiling 32 source files to /home/david/client_honeypots/capture-server-2.1.0-300-src/build
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:9: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac] import sun.misc.BASE64Decoder;
    [javac]                ^
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:42: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac]                             BASE64Decoder base64 = new BASE64Decoder();
    [javac]                             ^
    [javac] /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/ClientFileReceiver.java:42: warning: sun.misc.BASE64Decoder is Sun proprietary API and may be removed in a future release
    [javac]                             BASE64Decoder base64 = new BASE64Decoder();
    [javac]                                                        ^
    [javac] Note: /home/david/client_honeypots/capture-server-2.1.0-300-src/capture/MockClient.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 3 warnings
     [exec] The command attribute is deprecated.
     [exec] Please use the executable attribute and nested arg elements.
     [exec] /usr/include/vmware-vix/
     [exec] revert.c:232:2: warning: no newline at end of file

    [mkdir] Created dir: /home/david/client_honeypots/capture-server-2.1.0-300-src/build/jar
      [jar] Building jar: /home/david/client_honeypots/capture-server-2.1.0-300-src/build/jar/CaptureServer.jar

     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
     [copy] Copying 1 file to /home/david/client_honeypots/capture-server-2.1.0-300-src/release
      [zip] Building zip: /home/david/client_honeypots/capture-server-2.1.0-300-src/CaptureServer-Release.zip

Total time: 2 seconds

8) Extract the newly made CaptureServer-Release.zip file into a suitable location (such as a newly made capture-server-2.1.0-300 directory).

9) Configure config.xml and run as normal, such as via:

cd capture-server-2.1.0-300
vi config.xml
/usr/lib/jvm/java-6-sun/bin/java -Djava.net.preferIPv4Stack=true -jar CaptureServer.jar -s your_ip:7070 -f input_urls_example.txt

Hopefully Capture-HPC will then work cleanly.

NOTE: If you experience problems running Capture and receive this error when attempting to run the server:

VIX Error on connect in connect: One of the parameters was invalid

check that your VMware Server installation was clean by removing VMware Server (vmware-uninstall.pl), finding any vmware related files in /usr, deleting them, and then reinstalling VMware. I found that one of my VMware Server upgrades had left a number of vmware-vix shared libraries on disk, and these seem to cause the newly compiled Capture Server to fail to connect on revert.
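The leftover-file check described above can be scripted. A minimal sketch (nothing here is specific to a particular VMware release, and the list should be reviewed by hand before deleting anything):

```shell
#!/bin/sh
# After vmware-uninstall.pl, list any vmware-vix files still on disk under the
# given root (defaults to /usr); leftovers can cause revert to fail to connect.
ROOT=${1:-/usr}
find "$ROOT" -name '*vmware-vix*' 2>/dev/null || true
```

Anything this still prints after an uninstall is a candidate leftover from an earlier install.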

For more troubleshooting details, see the public Capture-HPC mailing list archives.


The Sad State of IT Security

17:29, July 14th, 2008 by arthur

On Friday I found out that my credit card had been used, by nefarious persons unknown, to buy £500 worth of goods online. Bad enough, but this is the second time this has happened in four years.

At this point I can hear the reader’s thoughts: stupid bugger, he’s been p0wned, got malware on his machine. Well, it’s possible. Like nearly everyone out there, my machine might have been 0wn3d by someone really good. Unless your name is H.D. Moore, there’s always someone out there better than you. But it’s unlikely. I know exactly what should be running on my machine, I know what programs can talk to the outside world, I look at tcpdumps and use a browser + OS combination that’s not currently targeted in the wild. I think I can be reasonably confident that the only malware on my machine is the stuff that’s put there by me so I can study it.

So if my machine is clean (with high probability), I haven’t lost my card (100% certain as I have it with me now) and I shred all my bank statements, bills and till receipts (yup), how come I’ve still been defrauded?

I use my card online a lot. I don’t gamble online, buy porn, dodgy pills, email my card details around or send my details to nice gentlemen in Nigeria but I do buy stuff from a range of shops, small and big.

So my best guess is that my card has been taken from a merchant. What could I do to stop this happening?

Two options:

1) Never spend money online. Very limiting and not going to happen. Even if I was willing to live with the inconvenience, it doesn’t give 100% protection anyway: my card could still be stolen if I use it at a bricks and mortar store (e.g. anyone who shopped at a store in the TJX group had their card placed at risk after card details were stolen). I’m certainly not going to stop using my card totally.

2) Only ever spend money with the biggest online shops: ones that are big enough to have their own security teams, do code audits etc etc. Stick with amazon.co.uk and tesco.com. Not foolproof, but a reasonable reduction in risk. The problem with this is that a lot of stuff I want to buy online is only available from smaller shops. Worse, it’s only available from mid-sized retailers. Ones that are too big to just use Paypal, big enough to have their own in house ASP or PHP developers, but not big enough to do it right.

You might think I’ve missed an option there: ‘3) Only buy from trusted retailers’. The trouble is that as a consumer, even one much more knowledgeable about security than most, there is no way I can make any valid judgement about a retailer’s security or lack thereof. I don’t have access to any information that would let me evaluate a retailer’s security, and without that information being available, there’s also no competitive pressure on stores. Instead we have to rely on the banking groups dragging standards upwards via things like the PCI DSS standards. These are good, but it’s a long slow grind.

Conclusions? My card has been stolen, it’s quite possible it’ll happen again, and there’s nothing I can do about it except never use my card. Worse, because online crime is now a low priority for UK Police, I don’t even get to report this to the police, only to my bank, and I can be pretty confident that no-one will ever be charged for this (they weren’t last time, even though I did report that incident to the police as it predated the new reporting arrangements).

This is not a happy state of affairs. If the definition of distributed computing is that a machine whose existence you don’t know about can break something you are doing, then this is the security version: being compromised by systems you don’t know about and can’t influence.


Phishers branch out in their targeting

10:03, July 8th, 2008 by arthur

Phishers have been branching out recently, moving on to new targets away from the traditional bank account scam. As users become more aware, and more banks roll out two factor authentication and other mitigations, scammers are having to move on to softer targets.

In the past few months we’ve seen two new targets, with different motivations. Both of these targets show trends in attacks as some targets become hardened.

First, many UK Universities have been hit with targeted phishing scams, usually claiming to come from “IT Support”. Any compromised accounts are then used to send out more spam. It’s a nice example of accounts being useful not so much for the information in them, but for the access they provide to other resources: bandwidth and credible email addresses.

Second, as mentioned by Dancho Danchev in May in ZDNet and in June on his blog, job sites are coming under attack. Dancho posted about the selling of tools that scrape information from CVs posted to online sites. Now we are seeing more direct attacks, with phishing emails aimed at getting login details of users of Monster.com and other job sites. Clearly gaining access to the information held on a job site is very useful to a scammer: it makes all sorts of nastiness easier.

It’s an arms race out there. Banks are now very quick at taking down phishing sites (see the recent blog from Ross Anderson’s group at Cambridge with links to stats on takedown), but other types of scams currently last much longer. If you’re one of the bad guys, it makes sense to go for the low hanging fruit. Why bother to steal someone’s online banking details when you can get more money for less work by stealing their identity? And why bother to go to lots of work to get their details when they have helpfully posted them on the web for you, all ready to use?


Global Browser Vulnerability Survey

12:40, July 4th, 2008 by david

A lot of current computer security threat research activity today occurs in the client space, with honeyclients such as Capture-HPC and PhoneyC regularly being used to study attacks against web browsers. Often these attacks occur through malicious obfuscated javascript and exploitation of vulnerable plugins or media extensions to allow fully automated ‘drive by download’ infections. The Honeynet Project have published a number of Know Your Enemy whitepapers in this area over the past year, and continue to actively research in this area. We have also previously blogged about some of the ideas the UK Honeynet Project have been experimenting with in this area.

One of the biggest challenges with client based threats is assessing the real world scale of the potential problem. For traditional server based threats, it was fairly simple to survey the entire IPv4 space and determine what versions of a particular application or operating system were in active use at a particular time. However, for client threats, you need a client application to come to you and interact with a service before any assessment of potential client vulnerabilities can be performed. This is a significant challenge for both attackers and researchers (hence the continued use of indiscriminate spamming and malicious advert serving at the same time as more targeted attacks are being developed).

As the world’s most popular search engine, Google records the user agent version data from the billions of web searches made by an estimated 75% of Internet users, and is therefore one of the organisations most likely to be able to provide an assessment of the current state of web browser security (Microsoft’s MSRT also has excellent data, but only for the ~450 million users regularly running Windows Automatic Updates). However, for obvious privacy reasons, this data has not been made available to the public.

An interesting survey was released yesterday by Google Switzerland, IBM ISS and the Computer Engineering and Networks Laboratory of the University of Zurich, which provides the first systematic study of the browser data from around 1.4 billion Google users during the first half of 2008. They analysed Google’s client version data and correlated this with vulnerability data from sources such as Secunia’s PSI, in an attempt to assess how many vulnerable browsers were in circulation at a particular time.

The results are very interesting, with Internet Explorer taking 78% (1.1 billion) of the browser share and Firefox getting 16% (227 million). Drilling down deeper into the IE market share shows roughly half of IE users have now moved to IE7, whilst most FF users run the latest release. More worryingly, less than 50% of IE users had the most secure version of their browser (rising to 83% in FF). For the month of June 2008, the authors suggest that over 45% of web surfers (roughly some 637 million people) accessed Google with a browser that contained unpatched security vulnerabilities. There is also some interesting analysis of the exposure to plug-in as well as built-in vulnerabilities, plus some good recommendations for potential improvements to web browser security. In particular, the concept of web sites checking a browser’s user agent string and displaying a highly visible “expiry date” warning on every page (in an attempt to enforce a maximum shelf life) is worth further investigation.

The very welcome paper is definitely worth a read, but is unlikely to cause too much immediate worry to the cyber criminals who are actively targeting web users through the thousands of mass compromised web servers, phishing emails and instant message spam we encounter each day.
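The “expiry date” concept is simple enough to sketch: extract the major version from a visiting browser’s user agent string and compare it against a site-maintained latest-version table. Everything below is illustrative (the UA string and version numbers are made up for the example), and a real implementation would run server-side across all browser families:

```shell
#!/bin/sh
# Toy "expiry date" check: parse the MSIE major version from a User-Agent
# string and warn if it is behind a (hypothetical) latest known version.
UA='Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)'
LATEST_IE=7
ver=$(printf '%s\n' "$UA" | sed -n 's/.*MSIE \([0-9][0-9]*\)\..*/\1/p')
if [ -n "$ver" ] && [ "$ver" -lt "$LATEST_IE" ]; then
  echo "WARNING: IE $ver is past its shelf life (latest: IE $LATEST_IE)"
fi
```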

FIRST 2008

14:15, July 1st, 2008 by david

The Honeynet Project were asked to present at the 20th FIRST conference in Vancouver last week, as part of their Network Monitoring Special Interest Group on Fast Flux Service Networks. We set up a two hour session broken down into three equal sections:

  1. An introduction to the basic mechanics of fast flux (David Watson, UKHP)
  2. Current ATLAS fast flux statistics (Jose Nazario, Arbor)
  3. Detection and mitigation (Christian Gorecki, University of Mannheim)

The NM-SG session was open to FIRST members only, so the slides are not publicly available, but we hope to have a public release of similar material shortly. We had a number of questions, and feedback from the attendees seems to have been positive.

There were three additional short demos:

  1. Florian Weimer of RUS-CERT showed some new passive DNS tracking information
  2. Tillmann Werner from the German Giraffe Honeynet Project Chapter demonstrated how Honeytrap, LibEmu and Nebula can be used to analyze unknown attacks, which is looking very promising as a long term replacement for Nepenthes
  3. Piotr Kijewski of the Polish CERT/NASK gave a brief demonstration of their still under development HoneySpider web interface, which shares many of the features of the client honeypot systems we are currently working on, but uses Java and Rhino instead of Python and SpiderMonkey

Overall it was an interesting event, with some good talks and lots of opportunities to meet up with a different group of people very active in the security operations and incident response fields. Quite a few Honeynet Project members were also present, which always encourages a little extra R&D discussion. Hopefully we’ll see some spin off activity in the coming weeks.

Many thanks to Carol Overes from GovCERT in Holland for the invite.