Essex police pause facial recognition camera use after study finds racial bias

34 points · posted 7 hours ago
by Brajeshwar

32 Comments

bsenftner

5 hours ago

Former author of one of the top 5 facial recognition servers in the world for multiple years running, here's what's going on: the industry has solved this issue, but the potential clients are seeking the lowest bidder and picking the newer companies (the nepotistically created, not really players but well connected), and those companies have terrible implementations. This is not a case of the technology not being there yet; we solved all these racial bias issues 10 years ago. But new companies with new training sets and new ML engineers who do not know any of the industry's history are now landing contracts with terrible-quality models but well-connected sales channels.

griffzhowl

an hour ago

This study finds a higher rate of correct identification for black people than for other ethnic groups, whereas a few years ago the problem seemed to be that the software was less effective at identifying black people.

Do you have some insight about why this reversal might have occurred?

raw_anon_1111

3 hours ago

How recently? We had a home security camera and every time our (Black) son walked up to the door, the camera would classify him as an “animal”. This was as recently as 2022

fallinghawks

15 minutes ago

In the other direction, my camera regularly identifies cats, crows, and shadows as people. I think recognition in security cameras has a very long way to go.

graemep

3 hours ago

So just like the rest of government IT then.

OJFord

5 hours ago

This is actually more (socially/ethically/philosophically) interesting than one might assume from the headline: it's not false positives, it's that it's more effective (correctly identifies someone is on a watch-list) for one group than another within a protected characteristic.

So essentially they're pausing the use of it because it works too well for group A / not well enough for group B, potentially leading to disproportionate (albeit correct) arrests of group A.

metalman

5 hours ago

Absolutely impossible to condone further structural bias against a minority and just ignore the free "white pass" built into the software, and it is especially troubling that it passes white women the most. The only possible action is to reject and disable any system with a racial bias, and investigate how such a thing happened, with a very pointed look for intent on the part of the vendors, who would then qualify for being housed in one of His Majesty's facilities for persons such as these.

edgyquant

5 hours ago

If it’s not falsely identifying people, I don’t see a problem at all. If it’s identifying criminals, every criminal should be caught.

almostjazz

3 hours ago

If you start with hypothetical demographic groups A and B that are for all intents and purposes exactly identical, but you implement a system such that if A commits a crime they have a 10% chance of being caught and if B commits a crime they have a 50% chance of being caught, you will achieve the following:

1. More short-term crime prevention than a system catching 10% of A's crimes and 10% of B's crimes (good!)

2. Enforce a societal belief that A is intrinsically better than B (bad!)

3. Disproportionately burden children, families, and communities in B more than in A, causing them to indeed perform worse in everything than those in A (bad!)

4. As a result of 2 & 3, it is not a stretch to say this simply causes B to commit more actual crime (potentially negating point 1 entirely)

If you believe that crime enforcement is not for the sake of vengeance but instead something done to improve the well-being, safety, and happiness of citizens, you may see that inequality = bad just as crime = bad. How best to solve this trolley problem is complicated, but it's important that people are aware that it is complicated before firing off an answer.
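The mechanism in point 2 can be sketched with a toy simulation using the made-up numbers from the comment above (10% vs. 50% catch rates; the group sizes and crime counts are invented for illustration):

```python
import random

random.seed(0)

def caught_count(n_crimes: int, p_catch: float) -> int:
    """Simulate n_crimes independent offences, each caught with probability p_catch."""
    return sum(random.random() < p_catch for _ in range(n_crimes))

N = 10_000  # crimes committed by each group (identical behaviour by assumption)
caught_a = caught_count(N, 0.10)  # group A: 10% chance of being caught
caught_b = caught_count(N, 0.50)  # group B: 50% chance of being caught

# Equal underlying behaviour, unequal enforcement: the arrest statistics
# alone make B look roughly five times "more criminal" than A.
print(f"A arrested: {caught_a}, B arrested: {caught_b}")
print(f"apparent ratio B/A: {caught_b / caught_a:.1f}")
```

Anyone reading only the arrest figures, without knowing the catch rates differed, would conclude B is far more crime-prone, which is exactly the societal belief point 2 warns about.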

Joker_vD

5 hours ago

See, what you've said is precisely "structural bias against a minority", or "systemic injustice". Then again, the elites are, technically, also a minority, and we all know how well letting their crimes slide works out for the rest of society.

metalman

4 hours ago

It is FALSELY unidentifying people, which makes the hardware, software, sales, and implementation of the whole system a criminal enterprise, which it is. Kudos to the police for rejecting this racist, bigoted, unjust, criminal software implementation.

blitzar

5 hours ago

> the system was more likely to correctly identify men than women and it was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”

Technology has moved on a lot, no doubt; however, studies were finding the opposite (and with order-of-magnitude errors) as recently as 2020, per a lazy Google literature search:

> these algorithms were found to be between 10 and 100 times more likely to misidentify a Black or East Asian face than a white face

https://jolt.law.harvard.edu/digest/why-racial-bias-is-preva...

krisoft

5 hours ago

Given that these are machine learning algorithms, their performance will very much depend on the training dataset. So it is probably not (just) that "technology has moved on a lot", but that the engineers working on it curated new training sets. It is not entirely unreasonable to think that they too read the paper you are talking about and took measures in an attempt to correct for the effect.

stuaxo

3 hours ago

Maybe, or there might be qualities around, say, contrast and the physical cameras themselves that build this in.

griffzhowl

an hour ago

But it doesn't seem like these physical factors would have reversed the effect between 2020 and now. Unless you can think of some?

blitzar

an hour ago

Cameras are much better now than they were before; all the same, every time you get a "have you seen this person" it's a grainy, pixelated, flat image from CCTV that could easily be one of thousands of people.

blitzar

4 hours ago

Then, in theory, the dataset can be changed to make model error rates "fair" for all intersections of race, gender, age, etc.
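As a sketch of what checking "fairness for all intersections" could look like, here is a minimal per-group evaluation; the group labels, records, and rates are all invented for illustration:

```python
from collections import defaultdict

# Hypothetical evaluation records: (group, was_on_watchlist, was_identified)
results = [
    ("black_male", True, True), ("black_male", True, True),
    ("white_male", True, True), ("white_male", True, False),
    ("white_female", True, False), ("white_female", True, True),
    ("asian_female", True, False), ("asian_female", True, True),
]

hits = defaultdict(int)
totals = defaultdict(int)
for group, on_list, identified in results:
    if on_list:                      # only watch-list members count toward recall
        totals[group] += 1
        hits[group] += identified

# Per-group true-positive rate; a "fair" model would make these comparable,
# e.g. by rebalancing the training set until the spread across groups is small.
rates = {g: hits[g] / totals[g] for g in totals}
print(rates)  # {'black_male': 1.0, 'white_male': 0.5, 'white_female': 0.5, 'asian_female': 0.5}
```

This is only the measurement side; actually closing the gaps (by dataset curation, reweighting, or retraining) is the hard part the thread is debating.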

ap99

6 hours ago

> more likely to correctly identify men than women.

> more likely to correctly identify black participants than participants from other ethnic groups.

> AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.

I wonder if they're more worried about putting too many men in prison or too many black people.

graemep

3 hours ago

They are concerned about a higher rate of false positives (therefore a higher rate of incorrect arrests etc.) of white people (and probably Asians etc.) and women. This is also discriminatory.

People forget equality law runs both ways: it is illegal to discriminate against men, whites, or heterosexuals just as it is to discriminate against women, non-whites, or gays.

xenocratus

5 hours ago

Neither, they're worried about bad rep.

ghusto

5 hours ago

> the system was more likely to correctly identify men than women and it was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

I am genuinely unsure what's going on.

My understanding of the article is that the system is problematic because it is more likely to correctly identify black people than "other ethnic groups". Is that right?

defrost

5 hours ago

It's problematic for use in Essex as it works best for a small minority of the Essex population and has a much higher error rate for a typical sample of the Essex community.

Addendum: Essex ethnicity breakdown: 85.1% White British · 5.2% Other White · 3.7% Asian · 2.5% Black · 2.4% Mixed · 1.1% Other (2021).

from: https://en.wikipedia.org/wiki/Essex

i.e. most accurate (however accurate that is) for the men of the 2.5% of the region's population.

Not so accurate for the remaining 98.75% of the region's population.

OJFord

5 hours ago

Essentially (with made up numbers): 100 men on a high street, 4 of which are on a watch-list; 2 of which are black. Both black guys get identified, only one of the others does.

Ditto men vs. women, mutatis mutandis.

edgyquant

5 hours ago

So it should be improved, but it sounds like it's just catching criminals who need to be caught, no?

bondarchuk

5 hours ago

ap99

4 hours ago

Having lived in large urban areas my entire adult life and watching how different cultures behave, there are in fact differences.

Ignoring the color of someone's skin: do you think the person who routinely litters, breaks small rules, breaks large rules, ignores customs, flouts laws, is not deferential to authority, etc., will be more or less likely to end up in prison?

graemep

3 hours ago

The problem is that a likely outcome is that they will arrest two white men who are not the ones on the watch list. That is discriminatory, at least if it keeps happening, so that you get a higher rate of wrongful arrests of one group.

pingou

6 hours ago

If the suspect is Black, the software should automatically return zero matches in 30% of cases. Problem solved.

t23414321

2 hours ago

antimemetics (look one more time)

moi2388

2 hours ago

“statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Great. Wasn’t the problem before always that it couldn’t correctly identify non-white people? It does that accurately now. That is somehow also a problem? Should it make more mistakes?

bloqs

5 hours ago

Correlation does not indicate causation

gib444

6 hours ago

Alternative headlines:

Essex police, well aware of all the issues before using it, pause use until expected bad publicity dies down

Or

Essex police chosen as the force to take some flak for the issues while other forces steam ahead