SmartOx Challenge – Cognicity on Steroids!

By: Chris Cooper, KnowNow Information

participants at Smart Oxford challenge event

KnowNow was at the SmartOx Challenge (one of 13 companies out of the 43 that applied to be there), held at the very lovely Jam Factory in the centre of Oxford on 18th September.  The day was sponsored by Nominet R&D, with the support of experts covering everything from product design, marketing and technology through to finance, and mentors advising closely, bringing their own experience to bear.  A brilliant day, and a big thank you to Alexandra for a well-run event.

From a KnowNow perspective, Mark Braggins (bearded open data guru in the pic above) and I were presenting our award-winning Flood Event Model.  Although slightly biased, I think we also had the best mentor in Ben Ward, founder of the Oxford Flood Network (see the banner next to ours in the picture above), who definitely won the sweepstake for most name drops during the day. Thank you Sir!

What is Smart about Oxford?

Lots.  Oxford has plenty of problems, typical of every urban conurbation, but in some respects, due to the city's history, geography and demographics, these problems are exacerbated.  However, the city leaders have seen the innovation opportunity and actively supported the Smart Oxford Challenge, which is a very smart thing to do indeed!

Why is KnowNow on the challenge?

Firstly, the big draw for KnowNow was the data.  The Flood Event Model eats information: the more we get, the more accurate the prediction.  With access to the Oxford Flood Network data, the Flood Event Model can turn an ongoing data stream into a set of really useful outcomes.  The graphic below is an example of the Flood Event Model output.  Darker shading indicates increased flood risk, with green markers identifying incidents and events due to flooding.
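As a rough illustration of the idea only (the model's internals are not public, so the weighting, thresholds and function names below are invented for demonstration), a per-cell risk score might combine a gauge reading with a locally known flood threshold, and the score then picks the shading band on the map:

```python
# Illustrative sketch only: the real Flood Event Model's internals are not
# public, so this scoring scheme is invented for demonstration.

def risk_score(level_m, flood_threshold_m, rising_rate_m_per_h):
    """Combine a gauge reading with its local flood threshold into a 0-1 risk."""
    proximity = min(level_m / flood_threshold_m, 1.0)  # how close to flooding
    trend = max(rising_rate_m_per_h, 0.0) * 0.5        # rising water adds risk
    return min(proximity + trend, 1.0)

def shade(risk):
    """Map a risk score to a shading band: darker means higher flood risk."""
    if risk < 0.25:
        return "light"
    if risk < 0.5:
        return "medium"
    if risk < 0.75:
        return "dark"
    return "darkest"

# A cell with water at 60% of its flood threshold and rising slowly:
print(shade(risk_score(1.2, 2.0, 0.1)))  # -> dark
```

The real model folds in far more inputs (historical patterns, incident reports, Environment Agency feeds), but the shape of the output is the same: a score per area, rendered as shading.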

Oxfordshire map

Secondly, the other draw was seeing how this one-day start-up innovation accelerator compared to the Cognicity Challenge, which KnowNow was on at the start of the year.  As the headline suggests, it was “Cognicity on Steroids”: more intense, as 12 weeks seemed to be compressed into 10 hours!  However, if it had not been for the Cognicity experience, I do not think we would have done the day justice.

The Point of the exercise is…

The end game of the day, after taking onboard the experts' advice, with some nudging from Ben (especially on using the Kawasaki pitching framework) and then comments after the practice session, was to pitch the Flood Event Model to the VIPs.  In three minutes, with no slides.

Our Pitch

After introducing myself, I first outlined the challenge from an Oxford perspective.  I was able to use a quote from Bob Price, the leader of the council, to position how the Flood Event Model could help Oxford.  Oxford suffered over £50m worth of negative business impact as a result of flooding.  The Flood Event Model would help mitigate that impact if it were deployed.  Additionally, the model could aid the business case for the proposed new Western River Relief (see pic below, courtesy of Oxfordshire County Council).

Oxfordshire countryside

The Flood Event Model has fantastic pedigree.  Although a concept from KnowNow, the model was delivered with the assistance of the STFC Hartree Centre.  The model is proven to be 85% accurate at predicting where a flood event will occur, which gives emergency and flood responders an opportunity to be pro-active in their response to a pending flood.

Business Value

The value of being pro-active is really the point of having the Flood Event Model.  Simply knowing that something will happen in a particular place is pointless and not very sustainable unless an alternative, pro-active and less costly action is undertaken.  

These actions could involve just-in-time evacuation, precise deployment of flood defences or even timely warnings to local residents.  Moreover, due to the model's use of Open Data and its dependency on historical data to generate a prediction, the Flood Event Model is also a time machine.

Next Steps

KnowNow also hopes to build on our exposure to the Oxford Flood Network.  A potential collaboration, using their real-time data to add to the model's data analytics capability, is a natural next step.  Watch this space!

KnowNow is looking for other places in the UK and beyond to grow the model's coverage.  If you would like to know more, then please do get in touch –

Thanks for reading, Chris

This article was first published on LinkedIn in September 2015.

Oxford: ‘we have the UK’s only licensed driverless car – there’s only one’

By: Paul Myles, TU Automotive

Connectivity is being heralded as the only way forward for the autonomous vehicle, yet Paul Myles found out that a group of Oxford scientists are proving their car can navigate without the Internet’s help

It’s probably of little surprise that the scientists behind the Mars Rover approached designing an autonomous vehicle without the imperative of having it communicate with the Internet.

artist's impression of driverless pod

And so boffins from the ‘dreaming spires’ of Oxford in the UK have devised a system that can allow any vehicle to navigate its way through the tiny, higgledy-piggledy streets and lanes of the ancient seat of learning completely free from the shackles of connectivity.

While some sectors of the auto and software industries might throw their hands up in horror at the prospect of a vehicle whose intelligence comes from within itself, severing the lucrative data stream to commercially interested third parties, the system could, in practice, accelerate the early adoption of autonomy in cars.

TU-Automotive went to find out more from Dr Graeme Smith, chief executive of Oxbotica, the spin-out of Oxford University’s Mobile Robotics Group responsible for commercialising the group’s advanced work in autonomous vehicles.

Smith took up the story of how the group devised unique mathematical algorithms which allow three-dimensional map building using simple two-dimensional sensors.

He said: ‘Oxford Mobile Robotics Group has been pioneering innovative research in the area of mobile autonomy for a number of years now. They have conducted more than 100 man-years of research in this area and have developed a huge portfolio of intellectual property (IP) around the autonomous vehicle.

‘The IP is very practical; for example, downstairs in the garage we have the UK’s only licensed driverless car. There’s only one.

‘It’s based on a Nissan Leaf; clearly the robotics group didn’t build the car, but what they did do is develop the autonomous control system and integrate the sensors that allow it to drive autonomously.

‘One of Oxbotica’s roles is to commercialise that IP and take it to market,’ Smith explained. ‘We have negotiated exclusive worldwide rights for most of the group’s intellectual property and our remit is to find channels to market with this.

‘We have only been in business since September 2014 and we see ourselves not as a product company but as a licensing and IP company, working with customers to help them to integrate this IP into their products. This means we work with them on engineering support, technology transfer and on licensing.

‘We are all about supporting customers and helping them to market, whether it be in a car, a warehouse, or round a closed test track, in any form of autonomy. What we are also finding is that there is a huge spin-out of opportunities from a lot of this technology. For example, NASA went to the Moon decades ago and we are still deriving the benefits of that investment in technology all these years later.

‘There is a lot of first-class research taking place around the autonomous car and there are spin-out technologies that are very attractive to other companies. One example: our ability to quickly survey a city in 3D is of extreme interest to surveying companies, whether it be civil engineering street surveys, road surveys or internal surveys, simply from being able to do a 3D survey very, very quickly.

‘It has been possible to do 3D surveys using static equipment but it takes a long time, and the mobile survey platforms such companies use typically cost about £300,000. We are probably just 5% of that cost. We may not perform at exactly the same level of resolution as one of those high-end systems but, for 5-10% of the price, if you can get somewhere near achieving say 80-90% of the functionality, then that enables a lot more business models that, perhaps, didn’t seem worthwhile at the higher cost point.

‘We have had a lot of interest from companies that thought it would not be possible for them to afford this technology but now they are developing their own ideas around products that benefit from cheap, quick surveys.’

Smith said the ‘magic’ lies within the mathematical algorithms which allow simple, cheaply sourced sensors to build up a sophisticated three-dimensional map to rival those of more costly systems currently being employed by big organisations.

He explained: ‘We are using commercially available off-the-shelf sensors, and a lot of our IP is around how you synchronise those sensors together and keep them synchronised. We use a technique called visual odometry, which helps you understand how you move in space, a bit like a pair of eyes, and then we use 2D lasers, rather than expensive 3D ones, which help us map what is either side of us. As we move through the space, we are able to create the 3D model.
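The accumulation step Smith describes can be sketched very simply, assuming idealised inputs: vehicle poses already estimated (for example by visual odometry) and each 2D laser return given as a range, bearing and elevation triple. Oxbotica's actual pipeline is far more sophisticated; this only shows how 2D returns become a 3D cloud once you know where the vehicle was:

```python
import math

def scan_to_world(pose, scan):
    """Transform one laser scan into world coordinates using the vehicle pose.
    pose = (x, y, heading_rad); each return is (range_m, bearing_rad,
    elevation_rad) in the sensor frame, so it also carries a z component."""
    x0, y0, heading = pose
    points = []
    for rng, bearing, elevation in scan:
        # Project the return into the horizontal plane, then lift by elevation.
        horiz = rng * math.cos(elevation)
        z = rng * math.sin(elevation)
        x = x0 + horiz * math.cos(heading + bearing)
        y = y0 + horiz * math.sin(heading + bearing)
        points.append((x, y, z))
    return points

def build_map(poses, scans):
    """Accumulate scans taken along a trajectory into one 3D point cloud."""
    cloud = []
    for pose, scan in zip(poses, scans):
        cloud.extend(scan_to_world(pose, scan))
    return cloud

# Two poses a metre apart, each seeing a wall return 2 m to the left:
poses = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
scans = [[(2.0, math.pi / 2, 0.0)], [(2.0, math.pi / 2, 0.0)]]
print(build_map(poses, scans))  # both returns land on the wall at y ≈ 2
```

The hard part in practice is what the sketch assumes away: keeping the pose estimate accurate as the vehicle moves, which is exactly where the group's synchronisation and odometry IP sits.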

‘Oxbotica’s approach has been radically different from other approaches in the market that are a lot more complex but a lot more expensive as well. The core of our offering is the algorithms, the software and the thought behind them.’

And, beyond its Oxford mule, Oxbotica is employing this technology in trials of autonomous ‘pods’ in the UK’s Midlands.

Smith said: ‘We are working as part of the government’s driverless car challenge in Milton Keynes and Coventry that is called the UK Autodrive. As part of that we are working on the control systems that will be going into the 40 pods. These are made by a company in Coventry called the RDM Group and we help them integrate the sensors and then the entire control system which has been designed to cope with the same sort of environment that the pods will work in.’

These pods use 3D maps of their environment recorded throughout the seasonal changes they will have to handle.

He said: ‘The way the pods will work is that we have mapped the cycle ways where these pods will work multiple times: in rain, in snow, in sun, in summer and in winter. From this we have created 3D models of the pathways in all these different conditions, and the pods will be able to work within this map, using their sensors to localise themselves within the map to know where they are to within a few centimetres. They then navigate along the routes that we have pre-mapped.
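That localisation step can be sketched as a search for the pose correction that best aligns a live scan with the pre-recorded map. The brute-force candidate search below is illustrative only; production systems use far more efficient scan matching:

```python
import math

def nearest_dist(point, map_points):
    """Distance from a scan point to the closest point in the stored map."""
    return min(math.dist(point, m) for m in map_points)

def localise(scan, map_points, candidates):
    """Pick the candidate (x, y) offset whose shifted scan best matches the
    pre-recorded map: the essence of localising a pod within its map."""
    def score(offset):
        ox, oy = offset
        return sum(nearest_dist((px + ox, py + oy), map_points)
                   for px, py in scan)
    return min(candidates, key=score)

# Map of a wall at y = 2; the live scan sees the wall at y = 1.5, so the
# pod must actually be 0.5 m further forward than odometry suggested.
wall = [(x * 0.5, 2.0) for x in range(10)]
scan = [(x * 0.5, 1.5) for x in range(10)]
offsets = [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)]
print(localise(scan, wall, offsets))  # -> (0.0, 0.5)
```

The reported few-centimetre accuracy comes from doing this kind of matching continuously, in 3D, against maps recorded under many conditions.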

‘These pods will also have the ability to continually map as they go, so we can upgrade the map if we want to. For other applications it’s possible that, if you are thinking ahead to 20 years’ time when a car is autonomous, one school of thought is that the car could come equipped with 3D maps of the world. Just in the same way you can buy CDs of maps at the moment, you’ll be able to buy 3D map CDs.’

Smith said an alternative approach would be a process of ‘teaching’ the car to recognise an individual motorist’s often-used journey routes.

‘It could be that your car is creating a map itself as it drives,’ he said. ‘Possibly, when you buy the car in the first place the map is empty but when you drive a route to work it starts to learn about that route. Perhaps on day two or three a little light on the dashboard will tell you ‘I remember this bit and am OK to drive this bit if you like?’ In this way, over a period of time, you effectively create your own database.’

One of the key strengths of the system is the way the technology can use 3D point clouds to distinguish between fixed obstacles and those which are temporary.

He said: ‘When we are navigating we want to make sure we are localising from things in the environment that are static. We have a lot of software, for example, that will start to remove things from the scans, such as parked cars, pedestrians and anything that we determine is transient. If we scan the same route multiple times, we start to learn about which things are there all the time and which are not. So, again, by being able to overlay one scan on the next we are able to build the database and eventually get back to something that is very static, which is the best thing to navigate from.’
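A toy version of that overlay idea, assuming scans are simple 2D point lists bucketed onto a coarse grid: occupied cells that appear in most passes are kept as static structure, while one-off observations such as a parked car are discarded.

```python
from collections import Counter

def persistent_points(scans, min_fraction=0.8, cell=0.5):
    """Keep only grid cells observed in most scans of the same route.
    Each scan is a list of (x, y) points; bucketing onto a coarse grid means
    repeated observations of the same wall land in the same cell."""
    counts = Counter()
    for scan in scans:
        cells = {(round(x / cell), round(y / cell)) for x, y in scan}
        counts.update(cells)
    needed = min_fraction * len(scans)
    return {c for c, n in counts.items() if n >= needed}

building = [(0.0, 5.0), (1.0, 5.0)]   # static structure, seen on every pass
parked_car = [(0.5, 2.0)]             # only present on the first pass
scans = [building + parked_car, building, building, building, building]
print(sorted(persistent_points(scans)))  # -> [(0, 10), (2, 10)]
```

The surviving cells are exactly the ‘very static’ backdrop Smith describes as the best thing to navigate from.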

This, effectively, liberates the vehicle from the need to be hooked up to the Internet, a strength that makes autonomy possible in areas of poor GPS coverage, whether that be in remote rural areas or in city skyscraper canyons.

Smith said: ‘The approach that we have is completely GPS-free and we don’t use any sort of infrastructure beacons to locate the vehicles. Those things may come along, but everything that we have done has assumed we are infrastructure-free, which means it is equally applicable to a car or a mining robot or one inside a shopping centre, or even a train. In this way the technology is completely cross-platform.’

However, Smith had some good news for those commercial interests whose business models are dependent on an autonomous vehicle being connected to the Internet.

He added: ‘This doesn’t rule out cars communicating with each other, sharing information and databases, and certainly doesn’t rule out some centralised infrastructure to download local sections of the map.

‘We don’t know right now, so what we are doing is working on a level below that: we are able to build our own maps in real time, to build them ahead of time, and to navigate with them.’

Smith believes the autonomous vehicle is unlikely to burst suddenly onto the scene in the older cities of Europe but rather to embark on a more gradual process of integration with existing transport infrastructure.

He said: ‘We think we will see autonomy adopted in stages. We already know that, in new cities being built in the Far East, they are thinking about autonomous transportation and may well design a city around it. In that environment of a purpose-built city with a purpose-built autonomous transport system, we can see elements coming in much more quickly. Certainly, in closed environments like warehouses there are already autonomous solutions, and we can see this creeping into many different dimensions.

‘However, there will always be challenges in introducing new technology in a mixed environment – introducing autonomous driving into an environment that wasn’t designed for it, alongside other forms of transport, might be more problematic. I think it might take some time until the industry is able to achieve the critical mass to make this successful.

‘To start with, we may have an autonomous lane on a motorway or even a dedicated autonomous motorway.

‘It’s easy to predict how one autonomous car would interact with another autonomous car but it’s more complex to think how it would react to a human driver or cyclist. It’s this interaction that would be the main reason for a slow implementation of the technology.’

This article first appeared in TU Automotive in July 2015 with the title Smart car for the dumb city.

How does the Flood Network operate?

By: Ben Ward, Flood Network

The Oxford Flood Network is our demonstration network which uses the power of crowdsourcing to collect flood information at a much higher resolution than was previously economical. 

Spot the flood sensor

This kind of information is useful to local authorities who have to deploy demountable flood barriers, sandbags and staff around the city during times of flooding and need to disseminate information about road closures to the public via the media. Hydrologists use this live and historical data to improve their models which they use to calculate future flood risk.

By deploying sensors in the community: under bridges, in back gardens and even under floorboards, we’re able to monitor water levels in more detail and provide real-time updates of levels around the city. So how does it work?

1) Sensors!

Our tiny, low-power sensors monitor water levels by sending ultrasonic pings to the water surface below. We mount them under bridges or on overhangs to track water levels. We get Floodwatchers to adopt a sensor and connect it through their broadband to the Internet.

L-R: Sensors have developed over the past few years into something robust and compact.

The hardware has developed from a quick proof-of-concept with a glue gun, to a Sugru-encased mount, to an IP55 junction box, through to the v4 sensor device we use today, which is IP67-rated (dust- and immersion-proof), with custom PCBs for simple assembly and diagnostics capability.
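The underlying measurement is a simple time-of-flight calculation. A sketch, assuming the standard speed of sound in air (the real device will also compensate for temperature and filter noisy echoes):

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at ~20 °C; varies with temperature

def water_level(mount_height_m, echo_time_s):
    """Water depth below the sensor's datum, from an ultrasonic reading.
    The sensor pings downward; the echo travels to the surface and back,
    so the air gap is half the round-trip distance."""
    air_gap_m = SPEED_OF_SOUND_M_PER_S * echo_time_s / 2
    return mount_height_m - air_gap_m

# Sensor mounted 3 m above the stream bed, echo returns after ~11.66 ms:
print(round(water_level(3.0, 0.01166), 2))  # -> 1.0 (metres of water)
```

A rising water level shortens the air gap, so the echo returns sooner: the sensor reports a shorter round trip and the computed level goes up.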

2) Gateways

Flood Network v4 sensor next to gateway

The sensor device, left, monitors water levels from above the water surface and transmits the readings wirelessly to the gateway, which is attached to a Floodwatcher’s broadband connection and relays them back to the internet.

3) Water

Typical installation over a minor stream

Typical installations are over minor streams and ditches. These are not classified by the Environment Agency, but often lead to flooding of streets, gardens and properties.

4) Apps and Maps

The web app runs on mobile to allow troubleshooting during setup.

A mobile-friendly web app is used to manage the sensors and gateways, making sure they’re all running and checking in at the correct times. Nominet R&D developed the system around the Oxford Flood Network, and we’ve spent many hours standing with cold hands in ditches, waiting for missed telemetry messages and developing ways to deploy the hardware more easily.

The map showing some potential flooding in the Oxford area.

Finally, the data appears on a community Flood Map. (This is in closed beta at the moment but should soon be opened up.) The river segments are highlighted in different colours according to current conditions versus local knowledge of likely flood levels, and the historical data can be viewed and zoomed. It also incorporates live data from the Environment Agency’s feeds to further improve the picture.
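The colouring logic can be sketched as a comparison of the live reading against locally known levels at which flooding becomes likely. The threshold names and values here are illustrative, not Flood Network's actual configuration:

```python
def segment_status(level_m, thresholds):
    """Colour a river segment by comparing the live reading against local
    knowledge of likely flood levels (names and values are illustrative)."""
    if level_m >= thresholds["flooding"]:
        return "red"
    if level_m >= thresholds["warning"]:
        return "amber"
    return "green"

local_knowledge = {"warning": 1.2, "flooding": 1.8}
print(segment_status(0.9, local_knowledge))  # -> green
print(segment_status(1.5, local_knowledge))  # -> amber
```

Because the thresholds are per-segment, the same absolute level can be routine on one stretch of river and a warning on another, which is exactly what local knowledge adds over a single national scale.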

Better Prediction

Feeding the data to existing flood models will improve their accuracy and allow authorities to make better decisions based on live data, meaning faster response times.

Working together we can make communities more resilient and reduce the damage and inconvenience of flooding.

Libelium Waspmote sensor platform

As we develop a commercial product we recognise that the community deployment model isn’t always suitable. We’re working with some off-the-shelf hardware to gather our data and make trial deployments. Many wireless options are available and together with Love Hz we’re researching technologies such as LoRa, GPRS, TV Whitespace and SIGFOX for different situations.

If you have a specific location or project in mind then get in touch with us by email or on Twitter.


This article was first published in the Oxford Network blog in September 2015.

Crowdsourced data: Smart Objects and the Power of the Crowd

By: Adrian Segens, RedBite Solutions

According to the Oxford English Dictionary (OED) “crowdsourced” is a relatively new word, its first recorded use being as recent as 2005. So, it is ironic that the compilation of the OED is itself one of the most impressive crowdsourcing projects in history.

In 1857, the British Philological Society appealed to the public to send in the earliest quotations that they could find of a word’s usage on small slips of paper. By 1884, James Murray, the OED’s editor, had to have a large shed built in his garden, with storage space for the ever-increasing number of slips being sent to him from all over the world. Anything addressed to ‘Mr Murray, Oxford’ would always find its way to him, and he sent out so much post that the Post Office erected a special post box for him outside his house, which can still be seen at 78 Banbury Road. Such was the volume and quality of the data provided to Murray by the public that the first OED was not fully complete until 1928, which, sadly, was 13 years after Murray’s death.

78 Banbury Road Oxford
“78 Banbury Road Oxford 20060715” by Kaihsu Tai – own work. Licensed under CC BY-SA 3.0 via Commons.

Nearly 90 years later, we now have technologies that Sir James Murray could not have dreamed of but the crowdsourcing technique that he pioneered is still very much part of our lives. What’s more, Oxford could be one of the cities to benefit from it most.

It is a little-known fact that Oxford already has a network of 100,000 highly sophisticated sensors that autonomously navigate their way around the city, 24/7, sharing data with each other and uploading straight to the cloud through 4G and Wi-Fi. Every faulty street lamp, pot hole and overflowing waste bin in the city is detected by these autonomous sensors almost as soon as faults occur, but these are not drones from an IoT future: they are the 100,000 smartphones in the hands of Oxford’s active citizens. The question is: how should we ensure that we gather this wealth of sensory data in a way that engages the citizen and gives the city actionable data that will improve the lives of those citizens?

For RedBite, our whole concept of the IoT and Smart Objects rests on the simple fact that it enables people to interact with objects. The humble QR code, linking an object directly to the cloud, immediately makes that object Smart through its ability to communicate with people. However, if we are to draw upon all that accumulated knowledge, we must power the imaginations of people – just as James Murray did – in order to get them to share.

From our experience of making objects Smart, we can see four rules that will make people want to share what they see and hear about the objects around them. To engage people, the solution must be:

1. Simple and Familiar

Smartphones, tablets, QR codes, NFC tags and social media apps are robust, low-cost, highly scalable technologies that are already a key part of people’s lives. This means that community engagement is likely to build momentum sooner, thanks to the convenience and ubiquity of such technologies.

If city assets (anything from streets to library books) bore a QR code, we could link every one of those assets to its own unique URL. In effect, literally everything could have a social media profile page onto which anyone with a smart device can add comments and share data, including photographs and attachments.
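A minimal sketch of that idea (the domain, class and structure below are made up for illustration; this is not RedBite's actual implementation): each registered asset gets a unique URL, which is what the printed QR code would encode, plus a comment feed that plays the role of the profile page:

```python
import uuid

class AssetRegistry:
    """Toy registry: each city asset gets a unique URL and a comment feed,
    the 'social media profile page for objects' idea in miniature.
    The base domain is invented for illustration."""
    BASE_URL = "https://assets.example-city.gov.uk/"

    def __init__(self):
        self.assets = {}

    def register(self, description):
        asset_id = uuid.uuid4().hex[:8]
        self.assets[asset_id] = {"description": description, "comments": []}
        return self.BASE_URL + asset_id  # this is what the QR code encodes

    def comment(self, asset_id, text):
        self.assets[asset_id]["comments"].append(text)

registry = AssetRegistry()
url = registry.register("Lamp post, corner of Broad St and Turl St")
print(url)  # unique URL to print as a QR code on the asset
```

Scanning the code takes a citizen straight to that URL, where their comment or photo lands against exactly the right asset with no form-filling.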

Although this provides citizens with a convenient method of contributing data, it does not give them a reason why they should.

2. Useful and Informative

At the University of Cambridge’s Institute for Manufacturing (IfM), much of the equipment that the staff and students use on a daily basis has been made Smart and interactive by the use of RedBite’s solution, RedStore.

Everything from complex equipment like robots and 3-D printers to first aid boxes now has a discreet QR code which, when scanned, takes the user to a page where they can read instructions, ask for help and see comments left by previous users. So, the staff and students use the system because it gives them access to useful information as well as the ability to contribute information. RedStore gives the IfM’s management and maintenance teams knowledge of when faults occur and when services should be scheduled.

3. Fun!

Like crowdsourcing, “gamification” is a newcomer to the language. According to the OED, it is “the application of typical elements of game playing to other areas of activity, typically as an online technique to encourage engagement with a product or service”. In other words, it is possible to get people to engage with something that they perhaps normally wouldn’t, if you create a sense of fun around the activity.

This has already been applied in a smart city context through the Hello Lamp Post project, again using the power of the mobile device, and the results were very impressive. In the summer of 2013, people all over Bristol could be seen exchanging SMS messages with everyday city objects like postboxes. Why? Because the postboxes and lampposts answered them with questions and stories about the other Bristolians they had talked to earlier!

4. Delivering Value

As well as encouraging citizens to engage initially with Smart Objects, an emphasis must be placed on continued engagement. There is a risk that the novelty appeal alone may not be enough to ensure continued interaction with Smart Objects. To overcome this, cities must ensure that active citizens feel a sense of engagement in the process of running the city on a daily basis. Perhaps citizens could be thanked for their contributions, kept updated on progress, or made aware of the value that reports such as theirs have to the city in financial and social terms.

Above all, community engagement must be deemed worthwhile by citizens in order for them to interact with these objects. Results deemed worthwhile are likely to differ from person to person. However, seeing how local authorities are using this data to provide improved services is likely to be appreciated by all active citizens, thus encouraging sustained engagement.

The benefit for the city itself can also be huge. RedStore can be used to maximise the efficiency of city maintenance teams by turning the crowdsourced data into action plans for city workers. Every day, they will see on their smartphones a list of assets to be worked upon, ordered chronologically and by proximity. By utilising crowdsourced data, we have the power to revolutionise the data we have access to, the way we capture it and how we can use it to benefit citizens.
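A toy version of that prioritisation, assuming reports carry a day stamp and a coordinate (the field names are invented for illustration; a real system would do proper route planning rather than straight-line distance from a depot):

```python
import math

def daily_work_list(reports, depot):
    """Order crowdsourced fault reports for a maintenance crew: oldest
    reports first, ties broken by distance from the depot."""
    def key(report):
        dist = math.dist(depot, report["location"])
        return (report["reported_day"], dist)
    return sorted(reports, key=key)

reports = [
    {"asset": "bin B7", "reported_day": 3, "location": (2.0, 1.0)},
    {"asset": "lamp L2", "reported_day": 1, "location": (5.0, 5.0)},
    {"asset": "pothole P9", "reported_day": 1, "location": (0.5, 0.5)},
]
for r in daily_work_list(reports, depot=(0.0, 0.0)):
    print(r["asset"])  # -> pothole P9, lamp L2, bin B7
```

The two oldest reports come first, with the nearer of the two ahead, so crews clear the backlog in age order without criss-crossing the city.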

Perhaps if James Murray had had access to the types of technologies we have today, the OED might have been published far more rapidly and without the need for him to build a new shed to store all of the collected data in!

Every object tells a story. Let them speak.

This article was first published in the RedBite Solutions blog in September 2015.