The short film Slaughterbots depicts a near future in which swarms of micro drones assassinate thousands of people for their political views. Released in November 2017 by academics and activists warning of the dangers of advanced artificial intelligence (AI), it quickly went viral, attracting over 3m views to date. It helped spark a public debate on the future of autonomous weapons and put pressure on diplomats meeting at the United Nations Convention on Conventional Weapons.

But this kind of speculative science fiction storytelling isn't just useful for attracting attention. The people who design and build advanced technology can use stories to consider the consequences of their work and ensure it's used for good. And we think this kind of "science fiction prototyping" or "design fiction" could help prevent human biases from working their way into new technology, further entrenching society's prejudices and injustices.

A bias can lead to the arbitrary preference of some categories (of results, people, or ideas) over others. For example, some people may be biased against hiring women for executive jobs, whether they are conscious of it or not.

Technology built around data that records such bias can end up replicating the problem. For example, recruitment software designed to select the best CVs for a particular job might be programmed to look for characteristics that reflect an unconscious bias towards men. In that case, the algorithm will end up favouring men's CVs. And this isn't theoretical – it actually happened to Amazon.
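The mechanism is easy to see with a toy sketch. All of the data below is fabricated for illustration – a handful of made-up CV snippets and past hiring decisions – and the "model" is just word counting, but it shows how a scorer trained on biased decisions learns to penalise words associated with women rather than anything about skill:

```python
from collections import Counter

# Hypothetical past hiring decisions: 1 = hired, 0 = rejected.
# The decisions themselves are biased against women.
history = [
    ("led engineering team", 1),
    ("led engineering team", 1),
    ("captain of women's chess club", 0),
    ("led women's engineering network", 0),
]

def train(history):
    """Weight each word by how often it appears in hired vs rejected CVs."""
    hired, rejected = Counter(), Counter()
    for text, label in history:
        (hired if label else rejected).update(text.split())
    return {w: hired[w] - rejected[w] for w in hired | rejected}

def score(weights, cv):
    return sum(weights.get(w, 0) for w in cv.split())

weights = train(history)
# Two near-identical CVs; the model has learned only that "women's"
# co-occurred with rejection in the biased training data.
print(score(weights, "led engineering team"))         # 4
print(score(weights, "led women's engineering team")) # 2
```

Nothing in the code mentions gender explicitly; the bias arrives entirely through the historical labels, which is exactly what makes it hard to spot in real systems.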

Designing algorithms without considering possible negative implications has been compared to doctors "writing about the benefits of a given treatment and completely ignoring the side effects, no matter how serious they are".

Some tech firms and researchers are trying to tackle the issue. For example, Google drew up a set of ethical principles to guide its development of AI. And UK academics have launched an initiative called Not-Equal that aims to encourage greater fairness and justice in the design and use of technology.

The problem is that, publicly, companies tend to deliver only a positive vision of the potential consequences of near-future technologies. For example, driverless cars are often portrayed as solving all our transport issues from cost to safety, ignoring the increased dangers of cyberattacks or the fact that they could encourage people to walk or cycle less.

The difficulty in understanding how digital technologies work, especially those that are heavily driven by obscure algorithms, also makes it harder for people to have a complex and comprehensive view of the issues. This situation produces a tension between a reassuring positive narrative and the vague suspicion that biases are embedded to some degree in the technologies around us. This is where we think storytelling through design fiction can come in.

Stories are a natural way of thinking about possibilities and complex situations, and we have been hearing them all our lives. Science fiction can help us speculate on the impact of near-future technologies on society, as Slaughterbots does. This can even include issues of social justice, like the way certain groups, such as refugees and migrants, can be excluded from digital innovations.

Revealing the (possible) future

Design fiction stories provide a novel way for designers, engineers and futurists (among others) to think about the impact of technology from a human perspective and link this to possible future needs. With a mixture of logic and imagination, design fiction can reveal aspects of how technology may be adopted and used, starting conversations about its future ramifications.

For example, the short story "Crime-sourcing" explores what might happen if AI were to use crowdsourced information and a criminal database to predict who might commit a murder. The researchers found that because the database was full of people in minority ethnic groups who, for social reasons, were statistically more likely to reoffend, the "crime-sourcing" model was more likely to wrongly suspect minorities than white people.
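The distortion in that story doesn't require any sophisticated AI. A back-of-the-envelope sketch (all numbers fabricated for illustration) shows how a database that records one group's offences more thoroughly makes that group look riskier even when the two groups behave identically:

```python
# Two hypothetical groups with the SAME true offending rate,
# but the "minority" group is policed more heavily, so its
# offences are recorded in the database twice as often.
population     = {"majority": 10_000, "minority": 2_000}
true_rate      = {"majority": 0.05,   "minority": 0.05}
recording_rate = {"majority": 0.5,    "minority": 1.0}

# The database only ever sees recorded offences.
recorded = {g: population[g] * true_rate[g] * recording_rate[g]
            for g in population}

# A naive predictor scores each person by their group's recorded rate.
suspicion = {g: recorded[g] / population[g] for g in population}
print(suspicion)
# {'majority': 0.025, 'minority': 0.05} -- the minority group looks
# twice as "risky" despite identical underlying behaviour.
```

Any model trained on such a database inherits the skew before a single line of prediction code is written, which is the problem design fiction can surface early.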

You don't have to be a talented writer or make a slick film to produce design fiction. Brainstorming activities involving cards and storyboards have been used to develop design fiction and help develop the storytelling process. Making workshops that used these kinds of tools more common would enable more engineers, entrepreneurs and policymakers to use this method of assessment. And making the resulting work publicly available would help to expose potential biases in technologies before they affect society.

Encouraging designers to create and share more stories in this way would ensure the narrative that underpins new technology doesn't just present a positive picture, nor an excessively negative or dystopian one. Instead, people will be able to appreciate both aspects of what is happening around us.

This article was originally published at theconversation.com