You are walking back home. Suddenly the ground seems to open and a security drone
emerges, blocking your path to verify your identity. This might sound far-fetched, but it is based on an existing technology – a drone system made by the AI company Sunflower Labs.

As part of a global project on the impact of AI on cities, we recently “broke ground” on a brand new field of research called AI urbanism. This is different from the concept of a “smart city”. Smart cities gather information from technology, such as sensor systems, and use it to administer operations and run services more easily.

AI urbanism represents a brand new way of shaping and governing cities through the use of artificial intelligence (AI). It departs substantially from contemporary models of urban development and management. While it is vital that we closely monitor this emerging area, we should also be asking whether we should involve AI so closely in the running of cities in the first place.

The development of AI is intrinsically connected to the development of cities. Everything that city dwellers do teaches AI something valuable about our world. The way you drive your car or ride your bike helps train the AI behind an autonomous vehicle in how urban transport systems function.

What you eat and what you buy tells AI systems about your preferences. Multiply these individual records by the billions of people who live in cities, and you get a sense of how much data AI can harvest from urban settings.

Sunflower Labs has made a home security drone designed to verify the identity of visitors.

Predictive policing

Under the traditional concept of smart cities, technologies such as the Internet of Things use connected sensors to monitor and quantify what is going on. For example, smart buildings can calculate how much energy we consume, and real-time technology can quantify how many people are using a subway at any one time. AI urbanism doesn’t simply quantify, it tells stories, explaining why and how certain events happen.

We are not talking about complex narratives, but even a basic story can have substantial repercussions. Take the AI system developed by US company Palantir, which is already employed in several cities to predict where crimes will happen and who will be involved.

These predictions may be acted on by police officers when deciding where to assign resources. Predictive policing exemplifies one of the most controversial powers that artificial intelligences are gaining under AI urbanism: the capacity to determine what is right or wrong, and who is “good” or “bad” in a city.

This is a problem because, as the recent example of ChatGPT has made clear, AI can produce a detailed account without grasping its meaning. It is an amoral intelligence, in the sense that it is indifferent to questions of right and wrong.

And yet this is precisely the kind of question that we are increasingly delegating to AI in urban governance. This might save our city managers some time, given AI’s extraordinary speed in analysing large volumes of data, but the price we are paying in terms of social justice is enormous.

A human problem

Recent studies indicate that AI-made decisions are penalising racial minorities in the fields of housing and real estate. There is also a substantial environmental cost to keep in mind, since AI technology is energy intensive. It is projected to contribute significantly to carbon emissions from the tech sector in coming decades, and the infrastructure needed to maintain it consumes critical raw materials. AI seems to promise a lot in terms of sustainability, but when we look at its actual costs and applications in cities, the negatives can easily outweigh the positives.

It isn’t that AI is getting out of control, as we see in sci-fi movies and read in novels. Quite the opposite: we humans are consciously making political decisions that place AI in the position to make decisions about the governance of cities. We are willingly ceding some of our decision-making responsibilities to machines and, in several parts of the world, we can already see the genesis of new cities supposed to be operated entirely by AI.

The NEOM project in Saudi Arabia will include a linear city called The Line.
Corona Borealis Studio / Shutterstock

This trend is exemplified by Neom, a colossal project of regional development currently under construction in Saudi Arabia. Neom will feature new urban spaces, including a linear city called The Line, managed by a multitude of AI systems, and it is supposed to become a paragon of urban sustainability. These cities of the future will feature self-driving vehicles transporting people, robots cooking and serving food, and algorithms predicting your behaviour to anticipate your needs.

These visions resonate with the concept of the autonomous city, which refers to urban spaces where AI autonomously performs social and managerial functions, with humans out of the loop.

We have to remember that autonomy is a zero-sum game. As the autonomy of AI grows, ours decreases, and the rise of autonomous cities risks severely undermining our role in urban governance. A city run not by humans but by AIs would challenge the autonomy of human stakeholders, and it would also challenge many people’s wellbeing.

Are you going to qualify for a home mortgage and be able to buy a property to raise a family? Will you be able to secure life insurance? Is your name on a list of suspects that the police are going to target? Today the answers to these questions are already influenced by AI. In the future, should the autonomous city become the dominant reality, AI could become the sole arbiter.

AI needs cities to keep devouring our data. As residents, it is now time to carefully question the spectre of the autonomous city as part of an expanded public debate, and to ask one very simple question: do we really need AI to make our cities sustainable?

This article was originally published at theconversation.com