You might think the site above is just another sensationalist website.
It’s certainly sensationalist. But MediaMass is not just another website!
First off, all the news reported on MediaMass is completely fake – indeed, it’s an exercise in proving that you shouldn’t believe everything you read. But more importantly for our purposes, none of it has been written by human journalists (who presumably would find writing absolute rubbish all day a less than fulfilling way to earn a living).
Rather, all the content is created by algorithms programmed to invent stories using a set of pre-defined variables. And before you scoff, enough people want to read these algorithmic musings that MediaMass generates five-figure revenues each month from Google advertising!
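The mechanics are simple enough to sketch. Below is a minimal, purely illustrative Python example of how stories might be invented from pre-defined variables; the subjects, templates and rumours are made up for the example and have nothing to do with MediaMass’s actual code.

```python
import random

# Hypothetical pre-defined variables a MediaMass-style generator might draw on.
SUBJECTS = ["a famous singer", "a veteran actor", "a star striker"]
TEMPLATES = [
    "{subject} is reportedly the latest celebrity to fall victim to a death hoax.",
    "{subject} has topped a poll of the most influential people in the world.",
    "{subject} is rumoured to be working on a secret new project.",
]

def invent_story() -> str:
    """Fill a randomly chosen template with a randomly chosen 'fact'."""
    template = random.choice(TEMPLATES)
    return template.format(subject=random.choice(SUBJECTS)).capitalize()

if __name__ == "__main__":
    print(invent_story())
```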
“Artificial Intelligence” (currently Weak AI) is beginning to affect more legitimate and worthy news reporting, too. It’s a rudimentary start, but exponential advances in pattern recognition and natural language generation already mean that computers can produce credible journalistic copy on subjects that are reasonably formulaic, like sports reports and business news.
Automated news reporting first hit the headlines (no sense of irony there for the poor hack involved) in 2010, when Bloomberg asked: ‘Are sportswriters really necessary?’ The following paragraph was entirely machine-generated using scoreboard data:
“Michigan held off Iowa for a 7-5 win on Saturday. The Hawkeyes (16-21) were unable to overcome a four-run sixth inning deficit. The Hawkeyes clawed back in the eighth inning, putting up one run.”
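The mechanics behind a paragraph like that are not mysterious: structured scoreboard data is mapped onto sentence templates, with the data deciding which words to use. The sketch below shows one way such data-to-text generation could work; the field names, thresholds and wording are assumptions for illustration, not the actual system that produced the recap above.

```python
# A toy data-to-text generator: turn structured scoreboard data into a recap.
# The field names, thresholds and wording are invented for illustration only.
game = {
    "winner": "Michigan",
    "loser": "Iowa",
    "winner_runs": 7,
    "loser_runs": 5,
    "loser_record": "16-21",
    "day": "Saturday",
}

def recap(g: dict) -> str:
    margin = g["winner_runs"] - g["loser_runs"]
    verb = "held off" if margin <= 2 else "cruised past"  # let the data pick the wording
    return (
        f"{g['winner']} {verb} {g['loser']} for a "
        f"{g['winner_runs']}-{g['loser_runs']} win on {g['day']}. "
        f"{g['loser']} ({g['loser_record']}) could not close the gap."
    )

print(recap(game))
```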
At the leading edge of automated news gathering and reporting is the US company Narrative Science, whose flagship product is called ‘Quill’. Through a process of identifying facts from data, then “producing content which meets your communication goals, business rules and overarching stylistic preferences, such as tone, style and formatting”, they claim to be able to create “1:1 personalised communications in a consistent, brand-aware tone-of-voice”.
“Imagine as the CEO of a major company you go off and spend £100m on gathering data,” says Kris Hammond, Narrative Science’s founder. “In theory, you can get an idea of what is going on in every single aspect of your company. But when you have got it, what do you do? You ask a guy who knows about spreadsheets and PowerPoints and tell him to make sense of it. It’s like: did you forget you spent all this money? We are that guy. We have built a system that looks at the data, figures out where the story lies in it, pulls that data out, analyses it in the right way and converts it into language the CEO will understand.”
What makes Hammond’s statement interesting is that he is not focusing exclusively on “the death of journalism”. Rather, his point is that the best way to communicate with individuals is to speak on their terms: maybe journalism isn’t dead, but about to become infinitely customisable.
What happens when systems like ‘Quill’ can make sense of ‘dark data’ too – the unstructured data of emails, voicemails and Word documents that every business collects? Not only could these systems make us better informed, they may even help us make better decisions.
Of course, there is also the sense that when we put our faith in algorithms, we are simply endorsing their judgement. Instead of making better decisions, we may be subtly steered towards amplifying worse ones.
I’d like to leave you with a thought. If the future sees machines generating news, there will be other machines acting upon it. In 2012, 60% of trading in US futures markets happened through high-frequency trading (HFT) – the use of sophisticated algorithms to trade securities at great speed. We’ve already seen the ‘Flash Crash’ of 2010, in which major stock indexes collapsed and rebounded within 36 minutes. What might happen when the algorithms making trading decisions start communicating with other algorithms creating the news on which those decisions are based? Without careful checks and balances, it could be very bleak indeed!
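To make that feedback loop concrete, here is a toy simulation – entirely invented, and nothing like a real trading or news system – in which one algorithm writes headlines from price moves and another trades on those headlines, moving the price again.

```python
import random

# Toy simulation of the feedback loop described above: one algorithm writes
# headlines from price moves, another trades on those headlines, and its
# trades move the price again. Every detail here is invented for illustration.

def write_headline(change: float) -> str:
    """The 'news' algorithm: turn the last price move into a headline."""
    if change > 0:
        return "Shares surge as buyers pile in"
    if change < 0:
        return "Shares slide amid heavy selling"
    return "Shares flat in quiet trading"

def trade_on(headline: str) -> int:
    """The 'trading' algorithm: buy on positive headlines, sell on negative ones."""
    if "surge" in headline:
        return +1
    if "slide" in headline:
        return -1
    return 0

price = 100.0
change = random.choice([-0.5, 0.5])      # one essentially random initial move
for step in range(10):
    headline = write_headline(change)
    order = trade_on(headline)
    change = order * 0.5                 # the trade itself moves the price...
    price += change                      # ...which becomes the next headline
    print(f"step {step}: {headline:<35} price = {price:.2f}")

# A single random move gets locked into a one-way run: nothing outside the
# loop ever questions it, which is the 'checks and balances' point above.
```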