I use computational methods to gather, analyse and visualise data for the purposes of journalism.
More specifically, these are the things I do - click or scroll to find out more:
And please feel free to say hello.
I have recently joined the Telegraph as a Data Journalist. More on that soon...
Prior to joining the Telegraph, I was a data journalist on Trinity Mirror's Data Unit between July 2014 and October 2016. After graduating from City University with an MA in Interactive Journalism in the summer of 2014, I moved to Cardiff to take up the job. In October 2015 I moved back to London, where I continued to work on the data unit.
The Data Unit is an in-house data journalism news wire sending stories out across Trinity Mirror's regional titles as well as to the Mirror itself. My stories appeared across the country, whether examining cheap Michelin-approved restaurants in Wales, assessing flood risk on Teesside, or identifying wanted criminals in Manchester.
Around two dozen of my stories made front-page leads in Trinity Mirror's regional titles, as well as one for the short-lived New Day newspaper. My greatest hit was a scrape of the police's unidentified bodies database, a story that went on to be used on five front pages.
Although it's always nice to appear prominently in print, most of my efforts while at Trinity Mirror went towards creating digital journalism. I collaborated on numerous occasions with Carlos Nóvoa and Dmitri Thompson who are, respectively, a developer and designer on the data unit.
Together we made some really fun online tools such as an interactive that tells you how fast your broadband is based on Ofcom data and a pizza tracker which tells you where the cheapest pizzas can be found near you.
We were shortlisted in the open data category of the 2016 Data Journalism Awards for the work we did on The 1939 Register in association with FindMyPast. Dubbed 'The Wartime Domesday Book', The 1939 Register was taken on September 29, 1939, a few weeks after Britain declared war on Germany. In just one day, 65,000 enumerators were employed to visit every house in England and Wales to take stock of the 41-million-strong civilian population.
I've been learning and using R for just over a year. It started with me wanting to analyse data that was too big for spreadsheet software, but my use of it has since evolved to include creating visualisations, web scraping and getting a feel for different machine learning algorithms.
These are some of the R packages I use frequently:
As well as allowing you to work with large datasets, R lets you write scripts that make your analyses reproducible. A good example of this is the bankruptcies scraper I wrote with rvest, which can be run again and again at the touch of a button.
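To give a flavour of what that kind of reproducible scrape looks like, here is a minimal rvest sketch. The URL and CSS selectors are purely illustrative assumptions, not the ones used in the actual bankruptcies scraper.

```r
library(rvest)

# Scrape a hypothetical listings page into a data frame.
# The selectors (".notice-name", ".notice-date") are made up for illustration.
scrape_notices <- function(url) {
  page <- read_html(url)
  data.frame(
    name = page %>% html_nodes(".notice-name") %>% html_text(trim = TRUE),
    date = page %>% html_nodes(".notice-date") %>% html_text(trim = TRUE),
    stringsAsFactors = FALSE
  )
}

# Re-running the same script pulls a fresh copy of the data each time:
# notices <- scrape_notices("https://example.com/bankruptcy-notices")
```

Because everything from fetch to data frame lives in one function, the whole analysis can be repeated whenever the source page updates.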
R is also a powerful tool for creating visualisations, with the ggplot2 package being brilliant in this regard. This is a chart I made with ggplot2, which was subsequently jazzed up by Dmitri for the launch of the sixth series of Game of Thrones.
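As a rough sketch of how a chart like that starts life in ggplot2, here is a minimal bar chart. The data frame and its figures are invented for illustration only; they are not the numbers behind the published chart.

```r
library(ggplot2)

# Illustrative, made-up data: not real Game of Thrones death counts.
deaths <- data.frame(
  season = factor(1:5),
  named_deaths = c(9, 14, 17, 13, 21)
)

# A basic ggplot2 bar chart: one geom, some labels.
p <- ggplot(deaths, aes(x = season, y = named_deaths)) +
  geom_col(fill = "firebrick") +
  labs(
    title = "Named-character deaths by season (illustrative data)",
    x = "Season",
    y = "Deaths"
  )

# ggsave("deaths.png", p, width = 6, height = 4)
```

From a base like this, a designer can restyle fonts, colours and annotations to fit the publication's look.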
I'm not a statistician (my degree was in English), but I'm trying to supplement my A-level maths with some further study online. For instance, I'm currently working through Brett Lantz's book Machine Learning with R, which covers multiple machine learning algorithms along with some statistics.
A recent example of me putting all this into practice was a story on bike thefts across Britain. The following image is of an interactive that allows you to see how prevalent the crime is in your area of choice. The header image was designed by Dmitri Thompson, and Carlos Nóvoa helped with making it work in Trinity Mirror's CMS.
My coding journey is still very much in progress, and I'm constantly looking for new technologies to help me with my journalism. As well as just generally enjoying learning about this stuff, I recognise that data journalism is a fast-evolving field and one that it pays to keep up with.