5 real AI threats that make The Terminator look like Kindergarten Cop
The real future looks way scarier than science fiction
The Facebooks
If we’re going to prioritize a list of existential threats to the human race, we should probably start with the worst of them all: social media.
Facebook’s very existence is a danger to humanity. It represents a business entity with more power than the governing body of the nation in which it’s incorporated.
The US government has taken no meaningful steps to regulate Facebook’s use of AI. And, for that reason, billions of humans across the planet are exposed to demonstrably harmful recommendation algorithms every day.
Facebook’s AI has more influence over humankind than any other force in history. The social network has more monthly active users than Christianity has adherents.
It would be shortsighted to think decades of exposure to social networks, despite hundreds of thousands of studies warning us about the real harms, won’t have a major impact on our species.
Whether in 10, 20, or 50 years, the evidence seems to indicate we’ll live to regret turning our attention spans over to a mathematical entity that’s dumber than a snail.
The Amazons
The next threat on our tour-de-AI-horrors is the fascinating world of anti-privacy technology and the nightmare dystopia we’re headed for as a species.
Amazon’s Ring is the perfect reminder that, for whatever reason, humankind is deeply invested in shooting itself in the foot at every possible opportunity.
If there’s one thing almost every free nation on the planet agrees on, it’s that human beings deserve a modicum of privacy.
Ring doorbell cameras destroy that privacy and effectively give both the government and a trillion-dollar corporation a neighbor’s-eye view of everything that’s happening in every neighborhood around the country.
The only thing stopping Amazon or the US government from exploiting the data in the buckets where all that Ring video footage is stored is their word.
If it ever becomes lucrative to use or sell our data, or a political shift gives the US government powers to invade our privacy that it didn’t previously have, our data is no longer safe.
But it’s not just Amazon. Our cars will soon be equipped with cloud-connected cameras that watch drivers, purportedly for safety reasons. We already have active microphones listening in all of our smart devices.
And we’re on the very cusp of mainstreaming brain-computer interfaces. The path to wearables that send data directly from your brain to big tech’s servers is paved with good intentions and horrible AI.
The next generation of surveillance tech, wearables, and AI companions might eradicate the idea of personal privacy altogether.
The Googles
The difference between being the first result of a Google search and ending up at the bottom of the page can cost businesses millions of dollars. Search engines and social media feed aggregators can kill a business or sink a news story.
And nobody voted to give Google or any other company’s search algorithms that kind of power; it just happened.
Now, Google’s bias is our bias. Amazon’s bias determines which products we buy. Microsoft’s and Apple’s biases determine what news we read.
Our doctors, politicians, judges, and teachers use Google, Apple, and Microsoft search engines to conduct personal and professional business. And the inherent biases of each product dictate what they do and do not see.
Social media feeds often determine not just which news articles we read, but which news publishers we’re exposed to. Almost every facet of modern life is somehow filtered through algorithmic bias.
In another 20 years, information could become so stratified that “alternative facts” no longer refer to those that diverge from reality, but those that don’t reflect the collective truth our algorithms have decided on for us.
Blaming the algorithms
AI doesn’t have to actually do anything to harm humans. All it has to do is exist and continue to be confusing to the mainstream. As long as developers can get away with passing off black box AI as a way to automate human decision-making, bigotry and discrimination will have a home in which to thrive.
There are certain situations where we don’t need AI to explain itself. But when an AI is tasked with making a subjective decision, especially one that affects humans, it’s important that we be able to know why it makes the choices it does.
It’s a big problem when, for example, YouTube’s algorithm surfaces adult content to children’s accounts because the developers responsible for creating and maintaining those algorithms have no clue why it happens.
But what if there’s no better option than black box AI? We’ve painted ourselves into a corner: almost every public-facing big tech enterprise is powered by black box AI, and almost all of it is harmful. But getting rid of it may prove even harder than extricating humanity from its dependence on fossil fuels, and for the same reasons.
In the next 20 years, we can expect the lack of explainability intrinsic to black box AI to lie at the center of any number of potential catastrophes involving artificial intelligence and loss of human life.
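The explainability gap described above can be sketched in a few lines of code. This is a hypothetical loan-approval example invented purely for illustration; the function names, weights, and policy threshold are all assumptions, not any real company's system. The point is the asymmetry: a transparent rule can cite the exact policy behind a decision, while a learned model can only report a score.

```python
# A minimal sketch (hypothetical loan-approval example) contrasting a
# transparent rule with an opaque learned model.

def transparent_decision(income, debt):
    """Every branch can be cited when a decision is questioned."""
    if debt > 0.5 * income:
        return False, "rejected: debt exceeds 50% of income"
    return True, "approved: debt within policy limit"

# "Black box": weights learned elsewhere, with no human-readable meaning.
WEIGHTS = [0.37, -1.42, 0.08]  # hypothetical learned parameters

def black_box_decision(income, debt):
    score = WEIGHTS[0] * income + WEIGHTS[1] * debt + WEIGHTS[2]
    # The only "explanation" available is the raw score itself.
    return score > 0, f"score={score:.2f} (no further rationale available)"

approved, reason = transparent_decision(40_000, 25_000)
print(approved, reason)   # the rejection cites the exact rule

approved, reason = black_box_decision(40_000, 25_000)
print(approved, reason)   # the decision cannot be traced to any policy
```

When either system denies someone a loan, only the first can answer "why?" in terms a regulator, or the applicant, can audit.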
Assassinations
The final and perhaps least dangerous (but most obvious) threat to our species as a whole is that of killer drones. Note, that’s not the same thing as killer robots.
There’s a reason why even the US military, with its vast budget, doesn’t have killer robots: they’re pointless when you can just automate a tank or mount a rifle on a drone.
The real killer-robot threat is that of terrorists gaining access to simple algorithms, simple drones, simple guns, and advanced drone-swarm control technology.
Perhaps the best perspective comes from Lee who, in a recent interview with Andy Serwer, said:
Story byTristan Greene
Tristan is a futurist covering human-centric artificial intelligence advances, quantum computing, STEM, physics, and space stuff. Pronouns: He/him