On the 1st and 2nd of December, Purple Frog descended on London Olympia for two days of AI, ML and Big Data fun at the AI & Big Data Expo. The AI & Big Data Expo is a leading Artificial Intelligence & Big Data conference and exhibition, showcasing next-generation enterprise technologies and strategies and offering a chance to explore practical, successful implementations of AI and Big Data that can drive your business forward. This was the first time the Frogs have attended the event, so I thought it would be good to share what we saw and what we thought of it.

Day 1 – Enterprise AI

The first day concentrated primarily on the use of AI in industry and kicked off with a ‘fireside chat’ on where to start with adopting AI in the enterprise. This was hosted by Tim Ensor of Cambridge Consultants, with guest star Tarv Nijar, Global Senior Director of Customer Experience (Data Analytics & AI) at McDonald’s. It was an extremely insightful half hour of conversation, especially for understanding how big companies like McDonald’s approach bringing AI projects into their business, manage those projects, and then look to build on the benefits they are seeing! There was a lot we took away from this about adjusting how we approach bringing AI (and ML, for that matter) to our clients and how we manage the projects we deliver for them.

The fireside chat was followed by several general sessions, a few of which really caught my interest, so I’ll go into those in detail. The first was a talk by Tom Farrand of CausaLens, who are leaders in Causal AI, so it is no mystery that his talk was on how Causal AI is shaping decision making. The main takeaway was an understanding of what Causal AI actually is, as I’d never heard of it before. In essence, the difference is that the modern AI systems we see today learn via ‘associational relationships’ (two-way relationships, essentially what you’d understand as correlation: two variables that move together), while Causal AI is interested in causal relationships, which are one-way: a change in one variable causes a change in another. The bottom line of this difference is that Causal AI has an understanding of the true context of the problem being solved, because it does not contain spurious relationships that mislead predictions. The whole concept of Causal AI and its advantages is much better explained in a short blog Tom has written as an introduction to Causal AI (here).

Another session of interest was a panel talk on ‘Streamlining and Simplifying with Intelligent Automation’, hosted by Tanya Suarez (IoT Tribe) with guests Jack Warren (Climate Partner) and Leo Scott Smith (Tended). While the topic of the panel itself was insightful, I was most impressed by the innovative applications of AI at the guests’ two companies. Climate Partner work with their customers to calculate and reduce carbon emissions and to offset unabated emissions via carbon reduction projects. During the talk Jack discussed how, for tree re-planting projects, they need to know how many trees have been planted and how much they have grown, as this affects the carbon credits their customers receive. They now use drones with AI functionality to assess tree growth in an area; this is much quicker and also much more accurate than the methods traditionally applied, such as estimating from aerial photographs or sending people out on the ground to assess. Tended are a deep-tech startup combining technology and behavioural science to transform the health and safety landscape; their whole business model revolves around applying AI in innovative ways. One application Leo discussed was the use of sensors on lone workers in remote areas. Previously, the safety protocol for these workers was to call a supervisor to check in at set time intervals. This had two major faults: people can forget to check in and so trigger false alarms, and the time between check-ins could be too long if an incident had occurred, potentially meaning the difference between life and death. The system implemented by Tended can identify with high accuracy whether an accident has occurred and then contact the emergency services and the worker’s employer with the worker’s location and condition, allowing for a much quicker alert process.


Day 2 – Applied Data and Analytics

The second day was much shorter in terms of sessions to attend, but the great talks packed into the time still carried the same impact! The topic of the day was applied data and analytics, so we had moved away from AI and into the realm of Machine Learning! Once again there was too much to talk about, so I am going to zero in on a couple of sessions I found particularly interesting.

The first of these was from Gurpreet Muctor of Westminster City Council, on using data to power smart cities; in particular, Westminster’s modelling of airborne pollutants around the borough at different times of day, and how different modes of transport affect your exposure to them (travelling by moped exposes you to less than a car, travelling faster means less exposure, and so on). All of this is done via modelling to understand the complexities and hot spots of pollution, one such hot spot being schools around drop-off and pick-up times. Off the back of this, they provided sensors to local schools as a trial, so that the schools and children could see the pollutant levels around their schools at these times, with the children then being encouraged to get to school via less polluting modes of transport. They have seen success with this: the children, made aware of the pollutant levels, went on to influence the behaviour of their parents, all of which stemmed from some great ML modelling that gave the council an informed view from which to act and make an impact.

Finally, another great talk was from Andrew Rimell and Steluta Lordache of Rolls-Royce, on using big data analytics for effective aircraft route analysis. The problem at the root of this is pollutants in the air that planes travel through, and the effects that different levels of these pollutants have on the engines Rolls-Royce produce. If engines are consistently exposed to higher than expected levels of pollutants, they break or need servicing out of cycle, which costs airlines money through cancelled flights and administration. To combat this, Rolls-Royce have created a model which can map the most effective route through airspace from point A to point B. They take data on atmospheric conditions (everything which may affect an engine) from the Copernicus Atmosphere Monitoring Service, part of the European Union’s Earth observation programme. They then break the sky up into blocks, break the atmospheric data down into those blocks, and feed it into their model, which works out the fastest route from A to B, navigating from block to block. So far they have been pretty happy with the routes produced, which weigh up the conflicting interests of flight speed against pollutant exposure, and the model can change the routes on a day-to-day basis based on the data fed into it.
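As a rough illustration of the block-to-block idea (my own sketch, not Rolls-Royce’s actual model), you can treat the sky as a grid of blocks and run a shortest-path search such as Dijkstra’s algorithm, where each step costs flight time plus a weighted penalty for the pollutant level of the block entered:

```python
import heapq

def best_route(grid, alpha=1.0):
    """Cheapest path from the top-left block to the bottom-right block
    of a grid of pollutant levels.

    Each step costs 1 unit of flight time plus `alpha` times the
    pollutant level of the block entered, so `alpha` trades off speed
    against engine exposure (a toy stand-in for the real trade-off).
    """
    rows, cols = len(grid), len(grid[0])
    start, goal = (0, 0), (rows - 1, cols - 1)
    # Priority queue of (accumulated cost, block, path so far).
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, block, path = heapq.heappop(frontier)
        if block == goal:
            return cost, path
        if block in seen:
            continue
        seen.add(block)
        r, c = block
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                step = 1.0 + alpha * grid[nr][nc]
                heapq.heappush(frontier, (cost + step, (nr, nc), path + [(nr, nc)]))
    return None

# A small sky: higher numbers are more polluted blocks.
sky = [
    [0.0, 0.9, 0.1],
    [0.1, 0.8, 0.1],
    [0.1, 0.1, 0.0],
]
cost, route = best_route(sky, alpha=2.0)
print(route)  # skirts around the polluted middle column
```

Raising `alpha` makes the route detour further around dirty air; lowering it towards zero makes it collapse back to the geometrically shortest path, which mirrors the speed-versus-exposure balancing act described in the talk.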


The expo was a great experience where, bottom line, beyond the exposure to new techniques and technology, you really get inspired listening to what other people are achieving and how they’re doing it. Talking to exhibitors was also a brilliant way to get to the soul of new technology and businesses within the AI and ML space! Can’t wait to go again next year and hopefully build on what we hope to achieve off the back of the inspiration and momentum gained from this expo.
