This *looks* like a pro-social use of AI, but it's not https://www.thetimes.com/uk/politics/article/ai-data-children-predict-criminals-fwclzh323
1. Sociology has a pretty good idea of what factors contribute to crime, such as poverty. This "solution" is a surveillance-based intervention that targets individuals rather than structural harms.
2. It fails to notice that predictive risk systems themselves victimize the vulnerable; they amplify bias and create feedback loops.
3. This will necessarily treat children as pre-criminals.
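The feedback loop in point 2 can be sketched with a deterministic toy model (all numbers invented for illustration): two neighborhoods with the same true offence rate, but patrols allocated in proportion to past recorded incidents, so whichever area starts with more records keeps generating more of them and the initial disparity never corrects itself.

```python
# Toy feedback-loop model: both neighborhoods have the SAME true
# offence rate, but patrols follow past *recorded* incidents.
# All numbers are invented for illustration.
TRUE_RATE = 0.1            # identical in both neighborhoods
PATROLS_PER_DAY = 100
records = [20.0, 10.0]     # A starts with twice B's recorded incidents

for day in range(500):
    total = sum(records)
    for i in range(2):
        # patrols are allocated proportionally to recorded incidents...
        patrols = PATROLS_PER_DAY * records[i] / total
        # ...and more patrols produce more records, at the same true rate
        records[i] += patrols * TRUE_RATE

share_a = records[0] / sum(records)
print(f"A's share of records after 500 days: {share_a:.3f}")  # stays 0.667
```

Even in this generous proportional version, the model keeps "confirming" that A is twice as criminal forever, purely because of where it started. Under a winner-take-all allocation (patrol only the higher-scoring area), the disparity doesn't just persist, it runs away entirely.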
-
A few book recommendations to better understand how predictive risk algorithms further victimize the vulnerable:
- Automating Inequality by Virginia Eubanks https://app.thestorygraph.com/books/638496ba-6ce4-4d31-ab42-34ae18ceef1e
- Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin https://app.thestorygraph.com/books/f7d95c0d-c131-4b42-9bb4-a6b2d139a762
- Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil https://app.thestorygraph.com/books/4d181a56-a300-49d2-968e-94647ac3c48d
-
@Ashedryden wish I could read it, but your summary is on point.
There are ways to use AI for sociology, I think, with data that already exists... from what you wrote, this smells like an excuse to collect extraneous data for surveillance.
Which is pretty much what everyone seems to be doing...
-
@Ashedryden muuuuum! fashy's playing with the phrenology machine again!
-
@Ashedryden sounds like Minority Report irl! They even made that point in the film - a person covered up a murder by finding a loophole in the "crime prediction" system.
-
@Ashedryden It's exasperating how many cops, technocrats, and marketing dweebs seem to think "Minority Report" was an instructional video.
-
@Ashedryden the criminals will be the people against AI
-
Also important to mention that, according to labeling theory (https://en.wikipedia.org/wiki/Labeling_theory), being labeled pre-criminals will have an outsized impact on the life trajectories of these kids. It comes with a stigma they will eventually internalize: "if everyone keeps treating me like a criminal, I guess I am one", a self-fulfilling prophecy. Now you have manufactured a deviant.
-
@Ashedryden oh for fuck's sake. this time it's PKD going "do not create the torment nexus" - and the UK government is going "look at the shiny new torment nexus we are building! aren't we clever? don't you want to vote for us? wouldn't it be scary if Reform got their hands on this torment nexus?"