Once again noting that historically, facial recognition programs are horrible at identifying Black people and not great with any other POC either. In part because camera software is also bad at handling Black skin tones, and in part because they rarely train it sufficiently to make distinctions. I'd really like to know how many of those 1 in 40 aren't white.
And of those, for how many does it happen consistently, every time? It effectively means barring many people from the stores they rely on for basic human needs.
This is all but guaranteed to ruin people's lives. And, to add insult to injury, the lives of already heavily discriminated-against minorities. How that's an acceptable cost, I honestly can't see.
Is it a lack of training data, or is it just the way light works? Security camera video usually looks pretty poor to begin with, and skin tones that reflect less light are going to have less distinguishable features or contrast.
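For what it's worth, the contrast half of that claim is easy to see with a toy model. Here's a rough sketch in Python (hypothetical gamma curve and made-up luminance values, not any particular camera) of how the same relative contrast between facial features lands on far fewer 8-bit code values in the shadows:

```python
# Toy model: an 8-bit sensor pipeline maps scene luminance (0.0-1.0)
# to 256 code values through a gamma curve. The same *relative*
# contrast between two facial features spans far fewer distinct code
# values at low base luminance, so sensor noise erases it sooner.

def code_value(luminance: float, gamma: float = 2.2) -> int:
    """Map scene luminance to an 8-bit code value (sRGB-ish gamma)."""
    return round(255 * luminance ** (1 / gamma))

def feature_span(base: float, contrast: float = 0.20) -> int:
    """Code values separating a feature 20% brighter than its surround."""
    return code_value(base * (1 + contrast)) - code_value(base)

for base in (0.02, 0.05, 0.20, 0.50):
    print(f"base luminance {base:4.2f}: feature spans {feature_span(base):3d} code values")
```

Same 20% relative contrast every time, but it spans only about 4 code values in deep shadow versus 16 in the midtones, so a noisy feed erases it first for darker subjects.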
It's demonstrated bias. Facial recognition is worst on Black women, and if it were just "how light works" there probably wouldn't be a gender difference.
But let's say it was "just how light works" - it would clearly be super unethical to use something that can't correctly identify major portions of the population. You'd think they'd figure that out during the training and testing process if they were being thorough and inclusive. And if cameras also don't work on dark skin tones (Black models look amazing when lit and shot correctly), that's another point of bias in the tech we're using.
But no, consistently, from NZ to the US to the UK, the data shows a bias, not just against Black skin tones but against Maori, Asian, and Latino folks too.
Biased Tech by the ACLU
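Studies show that facial recognition technology is biased. The error rate for light-skinned men is 0.8%, compared to 34.7% for darker-skinned women, according to a 2018 study titled “Gender Shades” by Joy Buolamwini and Timnit Gebru, published by MIT Media Lab. A 2019 test by the federal government concluded the technology works best on middle-age white men. The accuracy rates weren’t impressive for people of color, women, children, and elderly individuals.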
0.8% is still unacceptably terrible -- that's like, 8 dudes in 1000 getting pulled out and publicly shamed for absolutely no reason.
And a 34.7% error rate should send this technology straight to the trash heap.
The biggest problem is that dumb humans think tech is 100% correct. If a human heard from another human "hey, that's a shoplifter", there would probably be some doubt being processed, and they'd approach the individual with politeness and caution. Somehow, when a human is told by a flashing screen, the humans themselves turn into automatons who "apply the company procedure". I don't blame the individuals, I blame the company procedures. Imagine being a loss prevention staffer pulled in and questioned about why you let someone through when the flashing idiot box said not to.
Oh for sure it's not the retail employees, not even loss prevention, though they may have the same bias. It's decision makers and policy enforcers and everyone buying into AI.
We have this issue at the university: someone higher up in administration is having AI sweet nothings whispered in their ear, and I'm at my team meetings being like "so, a chatbot for a disordered eating organization gave out weight loss advice like, day 1...."
There's a really fantastic episode of a workplace comedy called Better Off Ted from 2009 that makes it very clear that even if the technology you're implementing is incapable of racism (and that "if" is still in question for AI) and is simply limited in its capability, the choice to implement that technology anyway is racist. It's about automatic lights, not facial recognition, but it's otherwise a striking mirror of this situation.
Given that the tech is known to be racially biased to a significant degree, I would have thought there is even some basis for legal action here.
Imagine if you were to roll a D20 every time a Black person walked into a shop, and then were to turn them away if it landed on a 1. This would obviously be illegal: why doesn't the same argument apply here? (In fact, looking at numbers elsewhere in this thread, the rate seems higher than 1 in 20.)
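Putting rough numbers on the D20 analogy, using only figures already quoted in this thread (so treat these as illustrative, not authoritative):

```python
# Compare the D20 analogy (turned away on a natural 1) with the
# error rates quoted elsewhere in this thread.
rates = {
    "D20 analogy (1 in 20)":              1 / 20,  # 5.0%
    "article's overall rate (1 in 40)":   1 / 40,  # 2.5%
    "Gender Shades, darker-skinned women": 0.347,
}

for name, p in rates.items():
    # Chance a regular shopper gets falsely flagged at least once
    # over 20 independent visits at this per-visit rate.
    at_least_once = 1 - (1 - p) ** 20
    print(f"{name}: {p:.1%} per visit, {at_least_once:.0%} over 20 visits")
```

Even at the "better than a D20" overall rate, a weekly shopper has roughly a 40% chance of being falsely flagged at least once within five months, and at the Gender Shades figure it's a near certainty.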
From the article:
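The company declined to comment on Sara's case to the BBC, but did say its technology helped to prevent crime and protect frontline workers. Home Bargains, too, declined to comment.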
I think anyone would be hard pressed to find a large retailer, maybe even many smaller retailers, that tells its employees to interface with potential shoplifters. Every company I have ever worked for, or heard the policy of, tells employees that they are discouraged from stopping theft, and some even go as far as to say they will fire employees who try, since it exposes the company to lawsuits.
In what world does this technology "protect frontline workers" if the result is employees being told by their employers to confront presumed criminals? Why would this software even change how most companies view the risk involved in compelling employees to confront shoplifters? I'd presume that the very issue at the center of this piece opens companies that misidentify shoplifters to lawsuits, on top of the risk of harm or threat that already exists.
I guess this all presumes that the company is referring to this software and not some other AI that does something different, and that they're just speaking generically about their IP.
I worked at Kroger a few years back when I was in college -- we were told not to stop proper theft (like if we got held at gunpoint and asked to give cash) but when I worked self-checkout they said to stop anyone doing obviously suspicious shoplifting shit like having a huge cart full of drinks and trying to just wheel it through. But we also had a security guard there by the door and it's hard to say whether that was company policy or just my manager.
When I did my retail stints, it was explicit for any theft. You were not permitted to do more than just say 'excuse me, I need to check your receipt.' You were under strict orders to never follow them or try to physically stop them.
One employee workers' comp claim, for, say, standing in front of somebody wheeling out a cart and getting a tailbone fracture from being slammed to the concrete, costs the company way more money than, say, 20 PS5s.
It easily costs a hundred bucks to see a doctor to get antibiotics with shitty insurance, let alone get an X-ray and a cast, or a bullet removed.
Oh yeah no, when I said "stop" I meant more "stop them and ask them to show their receipt", nothing more than that. You definitely weren't expected to physically prevent them from leaving.
I work for them currently, and either that was specific to the division you worked in, or it has changed since then. My division has fired people simply for going and capturing images of the license plates of shoplifters, and their official policy is to do nothing besides alert store management. Even asking to check a receipt is against policy now.
My division also only places security guards at the union stores; nonunion ones (I work at one, sadly) do not have that. We have kids stealing cases of beer constantly, and all corporate did was turn the door on the deli/cafe side into an emergency exit. Months later they still haven't installed any form of alarm on it, so the kids just walk out that door anyway. And this is at a store that has often been the single busiest in all of Kroger nationally, not just in my division. They have crap-ass cameras that throw theft alerts all the time if they so much as detect a purse swung off a shoulder at the wrong moment on the SCO registers, but do nothing to enforce things. It's an asinine waste of money.
I'm not complaining about not being required to intervene, but Kroger corporate clearly doesn't care that much about theft that isn't internal. God forbid you sample a produce item for a customer and eat the other portion yourself, though; I had someone on my team written up for theft for that one.
To be fair the Kroger I worked at had other weird policies, like not accepting passports as ID for liquor sales, so it may well have been mismanaged or miscommunicated. In practice the only thing I ever got scolded on was my scanning speed.
So, a person who has shoplifted is banned not from that shop, but from shops plural? Sounds horrible. But at least it screws up and hits the wrong people.
I wonder if this can be challenged in my country (if it ever happens here).
Because here they're not even allowed to hang cameras in the server room, because people work there.
I think the stricter a country's privacy protections are, the harder it is for this type of technology to get implemented. There's no way they'd allow this here in Germany, for instance. They don't even allow dashcams here, and we only recently got Google Street View due to privacy concerns.
How popular are dashcams there despite their lack of legality? Are there map apps that make a phone into a dual-purpose satnav and camera?
I don't really know how popular having them illegally is -- neither I nor my friends here own cars -- but I haven't noticed them in taxis or in cars parked around me. Most of the reasonable uses of dashcams (i.e., as a "cover your ass" mechanism) aren't particularly useful when the video evidence isn't admissible in any legal proceedings and you could get in trouble just for having it. I think the rule is against making certain recordings of what could be private people/property generally, rather than against specific devices, so I doubt the phone thing would work. I think there might be some workaround if the dashcam doesn't store the video at all unless you manually intervene? But I've not dug into it.
With the rise of automated surveillance everywhere, from red light cameras to store cameras, I have been thinking about what it would mean to have a technical means to opt out. The general shape of my idea is a series of bright IR LEDs that strobe in some randomized pattern to wash out the auto exposure on camera sensors. Then put them in glasses, or the brim of a hat, or a license plate frame. For wearables, there is the battery situation to consider, too.
To be clear, I am not in favor of shoplifting or speeding or trying to cheat on pay parking or tolls. But if it's a real problem, then put human resources on dealing with it. And if that's too expensive, then maybe it's not a big enough problem.
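As a rough illustration of the strobe idea above: on the wearable/microcontroller side it could be as simple as the sketch below. This is purely hypothetical (MicroPython on something like a Pi Pico, with an IR LED driven through a transistor on GPIO 15 -- all placeholder choices), and whether it actually defeats a given camera's auto exposure is an open question:

```python
# Hypothetical randomized IR strobe (MicroPython). Hardware choices
# here are placeholders: an IR LED on GPIO 15, driven via a transistor.
import random
import time
from machine import Pin

ir_led = Pin(15, Pin.OUT)

while True:
    ir_led.on()
    # Hold the flash for 1-8 ms: long enough to smear across a rolling
    # shutter readout, short enough to go easy on the battery.
    time.sleep_ms(1 + random.getrandbits(3))
    ir_led.off()
    # Random 5-39 ms gap so the pattern never settles into something
    # the camera's auto exposure can average away.
    time.sleep_ms(5 + random.getrandbits(5) + random.getrandbits(2))
```

One big caveat: plenty of cameras sit behind IR-cut filters during the day, so IR flooding is hit-or-miss at best; it mostly has a shot against night-mode sensors that deliberately let IR through.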
I imagine you'll start to see more resistance to "opting out". For private stores this'll probably mean being escorted out, and for civil matters like red light cameras you may receive fines, similar to trying to obstruct your license plate.
Best I can tell, in the PA/NY/NJ/ME/DE area, nobody has ever been cited for having nearly-opaque license plate covers which are impossible to see through.
Often paired with one of those 'friends/family of the police' stickers.
You wouldn’t get away with that in Australia.
One of those things that's only tolerated until it isn't. You get pulled over by the wrong cop on a bad day and you're looking at a hefty fine.
You're right of course, but that's all the more reason it's kinda terrible. The end result of that tends to be systemic racism; see also getting caught with drug paraphernalia.
There is anti-facial-recognition makeup; I don't know how well it works on modern tech, though. It depends on how much it disrupts the parts of the face they track.