Data scientist, machine learning engineer & data strategist.
I help businesses discover, understand and execute on emerging opportunities in data science and machine learning.
Working from idea to implementation, I specialise in designing and delivering complete machine learning systems, products and features. I help teams adopt and integrate machine learning into their daily routines.
If you're looking for a data scientist, whether you have a project in mind or just want to explore ideas, feel free to get in touch at email@example.com
Discover, understand and prioritise emerging opportunities; find new sources of value and defensible strategic advantages.
Deliverables include a presentation tailored to your organisation and sector, advisory sessions with your executive team, and a prioritised list of high impact data science projects.
Plan the technical execution of a project; understand the data, technical, team and wider product requirements for success.
Deliverables include architecture plans for a complete data product and pipeline, a flexible roadmap showing the time and cost of various feature combinations, team recommendations, and advisory sessions with your executive and technical teams.
Hands-on data science and machine learning work: finding insights and making predictions from data, and designing and building data products and features.
Presentations, visualisations and interactive data exploration tools to communicate the value, insight and opportunity in your data and strategy to your investors, board, team and customers.
One-to-one and small-group sessions with executives, team leads and individual contributors.
Tapdaq's mediation product allows app publishers to show ads from any ad network. I designed and built a machine learning system that selects the best paying network for each ad shown.
The system uses factors like ad network performance history, app type, device and user details to predict which network will give the best payout. It makes a new decision for each ad impression in each app, so publishers earn the most revenue possible from their portfolios.
Huge volumes of ad impressions and events occur across all of Tapdaq's apps, all of which are captured and used to make predictions. I designed and built a collection of microservices, using technologies including Apache Kafka and Google BigQuery, to provide large scale machine learning with guaranteed response times.
The nature of the prediction problem required a novel combination of machine learning techniques, including reinforcement learning and supervised learning.
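As a rough illustration of how reinforcement learning and supervised learning can combine for this kind of per-impression decision, here is a minimal epsilon-greedy bandit sketch. All names and the simple mean-payout estimator are hypothetical stand-ins; the production system described above used richer supervised models over app, device and user features.

```python
import random

class NetworkSelector:
    """Hypothetical sketch: epsilon-greedy selection over ad networks,
    with a per-network payout estimate standing in for a supervised model."""

    def __init__(self, networks, epsilon=0.1):
        self.epsilon = epsilon
        self.totals = {n: 0.0 for n in networks}
        self.counts = {n: 0 for n in networks}

    def estimate(self, network):
        # Mean observed payout. A real system would predict this from
        # features such as app type, device and user details.
        if self.counts[network] == 0:
            return float("inf")  # prioritise networks we haven't tried
        return self.totals[network] / self.counts[network]

    def choose(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.totals))  # explore
        return max(self.totals, key=self.estimate)   # exploit best estimate

    def record(self, network, payout):
        # Feed the observed payout back in, reinforcement-learning style.
        self.totals[network] += payout
        self.counts[network] += 1
```

With `epsilon=0` the selector always exploits its current best estimate; raising epsilon trades some revenue for continued exploration of the other networks.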
"We were delighted to have Ollie join our team to help us understand a critical part of our new product.
His careful planning, clear communication and technical ability surpassed all of our expectations. Not only that, he gelled with the team right away, and was able to work side-by-side with our engineers to integrate his efforts into our existing infrastructure.
Ollie's work has had a radical impact on our business, and I would certainly recommend him to anyone who wants to shift the needle in terms of business value."
Nick Reffitt, CTO
I designed and built a machine learning system to rate image quality and factor it into search results, giving Picfair a market-leading photography search engine.
The machine learning system is deployed into production as a scalable microservice, so as Picfair grows they can just add more servers to handle incoming images. It's seamlessly integrated into the existing tech stack, with testing, server monitoring, a clean API and interfaces for the Rails and Elasticsearch codebase.
I designed and built an interactive visualisation of London's property market, a system that finds stories in rental trends each month, and data-driven pages for Rentify's marketing team.
All of these tools are powered by a central data product. Combining Rentify's exclusive lettings data with third-party sources, including the London Datastore and Land Registry records, it creates a rolling, proprietary analysis of the London property and rental market.
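To illustrate the kind of join at the heart of such a data product, here is a minimal pandas sketch. The column names and figures are invented for the example; the real schemas are internal to Rentify and the Land Registry.

```python
import pandas as pd

# Hypothetical lettings data (private) and sale-price data (public).
rentify = pd.DataFrame({
    "postcode": ["E1", "E1", "N1"],
    "month": ["2015-01", "2015-02", "2015-01"],
    "rent": [1400, 1450, 1600],
})
land_registry = pd.DataFrame({
    "postcode": ["E1", "N1"],
    "median_sale_price": [450000, 520000],
})

# Join the exclusive lettings data onto the public records,
# then derive a simple rent-to-price indicator per postcode.
market = rentify.merge(land_registry, on="postcode")
market["annual_yield"] = market["rent"] * 12 / market["median_sale_price"]
summary = market.groupby("postcode")["annual_yield"].mean()
```

A rolling market analysis would extend this with time-windowed aggregates per month, but the merge-then-derive pattern is the same.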
The Promise was a powerful historical drama set in Israel and Palestine. Channel 4 expected strong and wide-ranging reactions to the programme, and wanted a way to show this conversation while maintaining editorial guidelines and balance.
I designed a topic modelling system to group the conversation into themes in a way that reflected Channel 4's editorial policy, built an interactive visualisation to let viewers see and explore the conversation, and designed a server architecture to make it work at scale.
Behind the scenes, a data service collects tweets matching the hashtag #c4thepromise. Natural language processing techniques clean the text content, then store the words and sentences in a graph data structure. The topic modeller processes the graph and exports an optimised data format to a separate web app, which serves the visualisation and client data requests at scale.
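The clean-then-graph-then-group pipeline can be sketched in a few lines of plain Python. This is a toy illustration, not the production system: the tokeniser, stopword list and the use of connected components as "themes" are all simplified stand-ins for the real NLP and topic modelling steps.

```python
from collections import defaultdict
from itertools import combinations

STOPWORDS = {"the", "a", "and", "to", "of", "in", "is"}

def tokenize(tweet):
    # Minimal cleaning stand-in for the NLP step.
    words = [w.strip("#.,!?").lower() for w in tweet.split()]
    return [w for w in words if w and w not in STOPWORDS]

def cooccurrence_graph(tweets):
    # Add an edge between every pair of words in the same tweet.
    graph = defaultdict(set)
    for tweet in tweets:
        for a, b in combinations(set(tokenize(tweet)), 2):
            graph[a].add(b)
            graph[b].add(a)
    return graph

def themes(graph):
    # Connected components as a crude stand-in for topic groups.
    seen, groups = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, group = [node], set()
        while stack:
            word = stack.pop()
            if word in group:
                continue
            group.add(word)
            stack.extend(graph[word])
        seen |= group
        groups.append(group)
    return groups
```

Words that are discussed together end up in the same group, so tweets about distinct strands of the conversation fall into distinct themes.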
I use Python for data science, making extensive use of the Jupyter Notebook with NumPy, SciPy, pandas, scikit-learn, Keras and specialised libraries for exploratory work, model building and simulations.
I've worked extensively with Postgres, as well as AWS RDS and Redshift, Google BigQuery, Apache Kafka, Airflow, MySQL, MongoDB, Elasticsearch, Redis and Neo4j. I've handled data from APIs, web scrapers, camera phones, sensors and countless CSV files.
Read Courtney Boyd Myers' interview with me about marketing, artificial intelligence, creative technology and just about everything else.
I write an occasional newsletter about data science and machine learning in business, and interesting developments in technology. I also post when I'm giving talks and running workshops.