Gulzia, an ethnic Kazakh woman who didn’t want her last name used out of fear of retribution, said that cameras were being installed everywhere, even in cemeteries, in late 2017. Now living across the border in Kazakhstan, she told The Associated Press by phone on Monday that she had been confined to house arrest in China and taken to a police station, where they photographed her face and eyes and collected samples of her voice and fingerprints.
“This can be used instead of your ID card to identify you in the future,” she said they told her. “Even if you get into an accident abroad, we’ll recognize you.”
This reads like dystopian science fiction.
I'm honestly surprised that the facial recognition works well enough to be useful. I would assume there'd be issues with angles, resolution, lighting, doppelgangers, and general classification confidence that would make it too prone to error to be useful.
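To put rough numbers on that worry, here's a back-of-envelope sketch (every figure below is my own assumption for illustration, not anything reported about the actual system): even a matcher with seemingly strong accuracy, pointed at a large population to find a small watchlist, produces mostly false alarms.

    # Illustrative base-rate arithmetic only; none of these numbers are real system figures.
    population = 20_000_000      # assumed number of faces scanned
    on_watchlist = 10_000        # assumed number of people actually on a watchlist
    tpr = 0.99                   # assumed chance a watchlisted person is flagged
    fpr = 0.001                  # assumed chance an innocent person is flagged

    true_hits = on_watchlist * tpr
    false_hits = (population - on_watchlist) * fpr
    precision = true_hits / (true_hits + false_hits)

    print(f"true hits:  {true_hits:,.0f}")
    print(f"false hits: {false_hits:,.0f}")
    print(f"share of flags that are correct: {precision:.1%}")
    # With these assumptions roughly two out of three flags are wrong,
    # and that's before angles, lighting, and resolution degrade accuracy further.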
Like all "big data policing" projects, the objective is the amplification of already existing repression, disguised as a neutral, objective system to catch "criminals" [as defined by whom?]....
Like all "big data policing" projects, the objective is the amplification of already existing repression, disguised as a neutral, objective system to catch "criminals" [as defined by whom?].
Consider the NSA's Skynet program, which tried to identify terrorists with a "big data" model trained on only six examples of confirmed terrorists and tested on exactly one. That model was then used to suggest people to be killed via drone strike.
You can't even train a human on six data points, never mind a computer.
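To make that concrete, here's a minimal synthetic sketch of the "train on six, test on one" setup (the feature count, population size, and use of scikit-learn are my own assumptions; this is not the actual SKYNET pipeline): fit a classifier on pure noise with six labelled positives, hold out one at a time, and the held-out scores tell you nothing, yet a lucky run can look like it works.

    # Entirely synthetic illustration: random "metadata" features with no real
    # signal, 6 positives vs ~100,000 negatives. Numbers and features are
    # assumptions for illustration, not NSA data.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100_006, 20))   # 20 meaningless noise features
    y = np.zeros(100_006, dtype=int)
    y[:6] = 1                            # six "confirmed" positives

    # Leave-one-out over the positives: train on 5, score the held-out 1,
    # mirroring a test group of exactly one.
    scores = []
    for held_out in range(6):
        train = np.ones(len(y), dtype=bool)
        train[held_out] = False
        model = LogisticRegression(max_iter=1000, class_weight="balanced")
        model.fit(X[train], y[train])
        scores.append(model.predict_proba(X[held_out:held_out + 1])[0, 1])

    print([f"{s:.2f}" for s in scores])
    # The features are pure noise, so the held-out probabilities carry no real
    # information; with only six positives there is no way to distinguish a
    # lucky run from a detector that actually works.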