In the opening scene of the original Blade Runner film, Leon, a Nexus-6 replicant, is given a “Voight-Kampff Test” to determine whether or not he is human.

The test is designed to provoke an emotional response. Emotions are read by scanning the iris, the colored part of your eye. The pattern of your iris is like your fingerprint: it’s unique to you, and nobody else in the world has the exact same one.

As the questions go on, Leon becomes increasingly agitated. When he is asked to “describe in single words only the good things that come into your mind about your mother,” he’s had enough. “My mother?” Leon says. “Let me tell you about my mother.” And he pulls out a gun and shoots his tormentor.

Replicants have a termination date because, if they live too long, they begin to develop emotions, and the fear is that they will no longer be distinguishable from humans. Leon and a few other advanced replicants are on a mission to confront their creator, Dr. Eldon Tyrell, and find a way to extend their lives.

Philip K. Dick, author of Do Androids Dream of Electric Sheep?, the novel on which the movie is based, would flip out at what’s happening today. Not because it’s what he foretold, but because it’s the exact opposite.

It isn’t AI that needs to prove it isn’t human. It’s humans that need to prove they aren’t AI.

I warned about this in Digital ID and Our Obsession with “Identity”.

It is nearly impossible to escape the Vast Machine that is absorbing us. It insists that we prove who we are, over and over, and the more we do, the less satisfied it seems to be.

The more ways we must prove our identity, the more ways AI will find to fake it. The more information we give AI, the more that information can be used against us.

As an example, Amazon uses surveillance to tally the seconds of each worker’s bathroom break and to time each step of their work. In fact, workers are being trained to do this to themselves, with Fitbit devices counting their daily steps. In some workplaces, AI listens in on every conversation, cataloging every word, who said it and how, and then scoring each agent.

“In low wage work we’re seeing a lot more decisions that were made by a middle manager being outsourced to an algorithm,” says Aiha Nguyen of the research organization Data & Society.

More and more companies are gathering data to boost production and to train machines to mimic humans. In the U.S., cameras have been installed over workers’ heads on assembly lines as they put together car parts or electronics.

The result is that humans are being required to behave more like robots, with no spontaneity of thought or action and no excuse for mistakes, while machines are learning to behave more like humans.