Last week I was talking with friends and the conversation turned to AI and how we feel its impact. Someone had heard the rumour that Facebook is listening to our conversations and showing us ads based on the content. Now, this particular rumour is not true. Besides questions of legality and creepiness, it is simply infeasible, even for a company as large as Facebook, to analyse the speech of all its users in real time. The necessary computational cost far exceeds any potential revenue that could be made from the resulting ads. This does not mean that the rumour is easy to kill, because there is always someone whose friend was once talking about visiting Australia and, lo and behold, they started seeing ads about Australia.

Then we moved on to examples where AI actually does influence our lives, but I struggled to find good ones. I wasn’t looking for examples where AI is used; there are plenty of those: Alexa, Siri, Cortana, Facebook’s news feed, Netflix recommendations and Google Translate. I was looking for examples where AI has changed our interaction with a system beyond making the system more convenient to use. This was surprisingly difficult, and I wondered why.

I now think this is in part because our everyday understanding is based on stories, while AI works at the level of statistics. We tell each other stories, we remember stories, we make sense of the world via stories. Stories in which we ourselves participate we call experience. The most useful stories are those that are at once particular and universal: when I was twelve, my front wheel got caught in a tram rail while I was cycling, and I fell and broke my wrist. A simple story detailing a particular event. But beyond this one event, there is also a universal side: tram rails can be dangerous for cyclists (apparently particularly so in Edinburgh). It is certainly a lesson I learned. The specifics of a story hook it in our memory, while the universal aspects add to our understanding of the world.

With AI such stories are much more difficult to find. We can tell particular stories: an ad for cruise vacations placed next to a video of the sinking cruise ship Costa Concordia. But is the story universal? Does the story tell us why this happened and under what circumstances it will happen again? Does it only happen with Google ads, or also with Facebook ads? We learn that algorithmic advertising sometimes leads to inappropriate ad placements. With access to data we could quantify how often this happens and tell a statistical story. But we rarely have this data, as it is jealously guarded and kept out of sight by the tech companies.

Our troubles with storytelling don’t end here: algorithms change, making stories outdated. For a while Google Translate would translate the English sentence “She is a doctor” into Turkish as “O bir doktor”, but when translated back into English it would become “He is a doctor”. Thus this story served as an example of gender bias in Google’s systems. Until, in 2018, Google addressed the issue; it now provides both English translations: “He is…” and “She is…”. This does not mean that Google Translate is now free of gender bias. In fact, we don’t know whether it is or not. It is probably less biased than it was before, but ultimately we don’t know. The particular is gone, the fate of the universal is unknown, and our ability to comprehend it is reduced.

The book Algorithms of Oppression by Safiya Noble grapples with the same problem. In it the author uses particular stories about Google search results to make wider points about the impact of algorithmic decisions. While the universal points remain valid, the particulars of some stories became outdated even before the book was published. Searching for “jews” no longer returns antisemitic websites as top search results, and searching for “black girls” no longer returns links to porn websites. But this does not mean Google has magically solved its problems of bias and misrepresentation. It means that we need to find new stories to represent the bias. An image search for “CEO” still returns mostly male CEOs.

When looking for stories to explain what AI does, we quickly run into structural obstacles. As Safiya Noble put it: “It’s impossible to know the specifics of what influences the design of proprietary algorithms, other than that human beings are designing them, that profit models are driving them, and that they are not up for public discussion.”

We can see the same in other areas: climate change is so difficult to comprehend because it started its life as a statistic. On average, the surface temperature is rising. We should expect more extreme weather events. No single hot day, wildfire or hurricane is due to climate change, but the statistical aggregate is. Although, with every heatwave and new heat record being set, climate change is now a story as well as a statistic. 38°C in London is hard to ignore.

At this point, after presenting the problem, I should suggest what to do about it. Well, I don’t know. No amount of artificial intelligence will change human nature, and so we will remain in need of stories. Stories can be powerful. Maybe we just need to find the right ones.

Everything’s got a story in it. Change the story, change the world. — Terry Pratchett