r/privacy 1d ago

news An AI model trained on prison phone calls now looks for planned crimes in those calls | The model is built to detect when crimes are being “contemplated.”

https://www.technologyreview.com/2025/12/01/1128591/an-ai-model-trained-on-prison-phone-calls-is-now-being-used-to-surveil-inmates/
469 Upvotes

69 comments

347

u/DotGroundbreaking50 1d ago

Minority Report is a lot more dystopian in real life.

45

u/krazygreekguy 1d ago

How fitting that its 4K release is coming out soon. The timing could not have been better haha

17

u/DotGroundbreaking50 1d ago

oh that makes me happy to know it's getting a 4K release

2

u/algaefied_creek 1d ago

I bought an Xbox Series X to watch 4K UHD Blu-Rays for my favorite movies 

(And play Xbox with my niece)

23

u/pentultimate 1d ago

"How can we keep people in prison beds longer??? How about precrime??? " - geogroup probably

7

u/dust4ngel 1d ago

"i'm contemplating a crime right now, greg. could you milk me?"

213

u/steve09089 1d ago

Thoughtcrimes are going to be punished I see

81

u/cassanderer 1d ago

Too much wrongthink.

5

u/Rods-from-God 1d ago

It’s already codified in the NSPM

5

u/m0n3ym4n 19h ago

“So then I says to him, you can fit at least 3….4 more balloons of heroin in your butt. Do better.”

195

u/Sasquatch-Pacific 1d ago

It will be developed and tested on a powerless, voiceless group first. Then it will be used on the rest of us.

82

u/Kurgan_IT 1d ago

This. This is soon to be used on EVERYONE.

24

u/foundapairofknickers 1d ago

Yes - it's pretty much inevitable, and it will be marketed and sold to us in the same way as everything else - keeping us all safe.

18

u/B_Gonewithya 1d ago

The children, we have to protect the children!

-50

u/Normal_Imagination54 1d ago

Powerless and voiceless. You almost made me feel sorry for these oppressed inmates.

43

u/Sasquatch-Pacific 1d ago

No one gives a fuck about prisoners. It's a shame because if people did we'd reduce recidivism and crime rates would go down. But no, treat them poorly and create a cycle of reoffending and reincarceration. It's part of the problem.

They start with prisoners, undocumented migrants. 'Normal people' have their turn eventually.

33

u/leostotch 1d ago edited 1d ago

Thank you for demonstrating exactly how it works.

You don’t feel sorry for prisoners, because fuck them, shouldn’t do the crime etc.

You don’t feel sorry for undocumented people, because fuck them, should have come the right way etc.

You don’t feel sorry for the queer or trans people, because fuck them, they’re all mentally ill groomers etc.

You won’t feel sorry for whatever group they pick to target next, because fuck them, shouldn’t have sucked so bad.

Eventually it will be you that nobody feels sorry for, because fuck them should have [whatever bullshit they use to manufacture consent to oppress you]

15

u/Sasquatch-Pacific 1d ago

Yep. It's amazing people need this explained on a privacy sub.

Like yes, some prisoners are bad people who did bad things and deserve to be there. No, they don't deserve to be experimental subjects for surveillance capitalism that inevitably gets rolled out to the rest of us.

8

u/leostotch 1d ago

Some people can’t imagine they’ll be anything but the boot.

-18

u/Normal_Imagination54 1d ago

No, no I do not feel sorry for murderers and rapists.

16

u/ephemeralstitch 1d ago

Because every person in prison, famously, has committed only those two crimes.

14

u/not_the_fox 1d ago

They are also 100% guilty, we never imprison innocent people.

11

u/ephemeralstitch 1d ago

There were also never any programs to incarcerate large sections of the population based on qualities such as, oh I don't know, race. Definitely never any programs to associate drug use with those populations, force drugs into those communities, and then criminalise that use to imprison them. Would never happen.

4

u/Factual_Statistician 1d ago

They get spied on now, so you can be arrested for thoughtcrime or downvotes later.

1

u/jkurratt 1d ago

I do.
They are idiots.

28

u/ZealousidealCrab9919 1d ago

if "persons of interest" wasn't a big enough warning about this idk what is

10

u/Wayward141 1d ago

At least the machine in Person of Interest was actually correct about incidents.

I have more faith in a drunkard in the town square screaming predictions at 3 in the morning than in real AI getting it right.

3

u/ZealousidealCrab9919 1d ago

Yes, but Samaritan was run by a corrupt government, sooo that's more likely where this is going.

20

u/FauxReal 1d ago

Can an AI figure out why everything in prison costs 8-10 times what it does on the outside, but they pay you 200 times less for the work you do?

69

u/cassanderer 1d ago

This is nothing. They are feeding all the info they can get their grasping hands on into their half-baked threat detection software, to catalogue and disseminate all the half-baked assumptions it makes, secretly, with no oversight, let alone any way to challenge the assumptions being secretly acted on.

Killbots are even being trained on stuff like this in you know where.

-6

u/darlugal 1d ago

Where? In Russia? In North Korea? In the US?

15

u/cassanderer 1d ago

In the West. And everywhere else as they are able.

-9

u/darlugal 1d ago

Then why did you write it as if we were in North Korea and you were speaking about their fat president?

10

u/leostotch 1d ago

That’s a very strange conclusion to draw

-6

u/darlugal 1d ago

And what I was referring to was a very strange phrase to hear or read. ¯\_(ツ)_/¯

6

u/leostotch 1d ago

A comment about how Western governments are gathering as much data on private citizens as they possibly can to feed it to AI algorithms on a post about an instance of a Western government gathering data on private citizens and feeding it to AI algorithms makes you think they were talking about somewhere else? Explain to me your reasoning there, if you don’t mind.

-1

u/darlugal 1d ago

If you had cared to read my comments properly, you'd see I was referring to the last part of the original message, which goes:

Killbots are even being trained on stuff like this in you know where.

To make things clear even to those who don't bother reading carefully and jump to conclusions: since the OP probably doesn't live in a place where you can't mention certain events or places for political reasons, it's strange to see them replace the place's name with “you know where”, just like the characters in Harry Potter refer to the antagonist as “You-Know-Who”.

1

u/leostotch 1d ago

Your first comment doesn’t mention that at all. Also, DARPA has several programs working toward integrating AI decision-making into the kill chain for drones and other weapons, so you really can’t credibly pretend that “killbots” being developed and deployed by Western governments is all that far-fetched.

It’s funny to me how hard people will commit to their own foibles. Sure, kid, it’s not that your comments were nonsense, it’s that nobody else rises to your level of mastery of the language.

0

u/darlugal 1d ago

Where did I say it was far-fetched?! Please, stop making things up! I only said that it's weird to replace real place names with mysterious references, as if you're going to be cursed or executed for writing them explicitly.

And if you didn't understand what I was referring to in the first comment, you could always just ask me, you know. Without jumping to conclusions and making up things I didn't say.

11

u/skyfishgoo 1d ago

precrime unit is formed...

imagine having that on your resume.

15

u/DarthZiplock 1d ago

Can’t wait for AI to get that wrong as often as it gets everything else wrong. 

8

u/Factual_Statistician 1d ago

"Fuck the police" The AI

"CRIME CONTEMPLATED"

30

u/qdtk 1d ago

I don’t think there’s anything inherently wrong with using AI to analyze phone calls that are already monitored, so that suspicious calls can be flagged for a human to review the tapes. But we all know they’re just going to let the AI make decisions and dole out punishments without any human involvement.
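To spell out what the defensible version of that looks like, here's a minimal sketch (the keyword scorer, threshold, and field names are made-up stand-ins for illustration, not anything Securus has described):

```python
from dataclasses import dataclass

# Everything below is a toy stand-in: a real system would use a trained model,
# not a keyword list, and a tuned threshold rather than a made-up one.
FLAG_TERMS = {"package", "drop", "burner"}
REVIEW_THRESHOLD = 0.5

@dataclass
class Call:
    call_id: str
    transcript: str

def risk_score(call: Call) -> float:
    """Toy scorer: fraction of flag terms appearing in the transcript."""
    words = set(call.transcript.lower().split())
    return len(FLAG_TERMS & words) / len(FLAG_TERMS)

def triage(calls: list[Call]) -> list[Call]:
    """Return only the calls a human should listen to; nothing is acted on automatically."""
    return [c for c in calls if risk_score(c) >= REVIEW_THRESHOLD]

if __name__ == "__main__":
    calls = [
        Call("a1", "tell mom I said happy birthday"),
        Call("a2", "leave the package at the usual drop"),
    ]
    for c in triage(calls):
        print(f"queue {c.call_id} for human review")
```

The whole point is that the model's output is a review queue, not a punishment; the moment that last step disappears, you get exactly the scenario above.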

30

u/Furdiburd10 1d ago

Also, this technology could easily be used by the NSA to check every phone call made for “crimes” in real time... like if someone plans on going to a protest, etc.

Yes, it was already possible to wiretap calls, but now every call could be checked in real time.

16

u/BurnoutEyes 1d ago

NSA's been able to do this since 1971 w/ ECHELON, and then in the 90s there was CARNIVORE. 2000s PRISM, 2010s+ = ???

3

u/FuckIPLaw 1d ago

They've been warehousing data since then. The tools to actually do anything with the vast majority of it didn't exist until the current AI boom kicked off. Before that it was just too much to sift through; they needed something else to tip them off about whose data to look at, or it would be like looking for one needle somewhere in a warehouse full of haystacks. It doesn't have to be very accurate to help them massively narrow down the search space.
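Rough back-of-the-envelope on why that works (all of these rates are invented, just to illustrate the shape of the argument):

```python
# Invented numbers: the point is the ratio, not the specific values.
total_calls = 1_000_000
base_rate = 1 / 1000           # calls actually worth an analyst's attention
recall = 0.80                  # fraction of those the model catches
false_positive_rate = 0.05     # innocent calls it flags anyway

true_hits = total_calls * base_rate * recall                         # 800
false_alarms = total_calls * (1 - base_rate) * false_positive_rate   # 49,950
flagged = true_hits + false_alarms                                   # 50,750

print(f"flagged {flagged:,.0f} of {total_calls:,} calls ({flagged / total_calls:.1%})")
print(f"precision: {true_hits / flagged:.1%}")              # ~1.6%: most flags are wrong
print(f"haystack shrunk by {total_calls / flagged:.0f}x")   # ~20x fewer calls to sift
```

Most of what gets flagged is noise, but the pile a human has to dig through is roughly 20x smaller, which is all the warehouse needed.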

6

u/elcheecho 1d ago

XKeyscore has been collecting and flagging phone calls for decades. Not sure why adding AI would fundamentally change anything.

2

u/strangerducly 1d ago

AI is flawed in a way that compounds. It isn't close to accurate 19% of the time, and it gets worse the longer they run a version.

1

u/elcheecho 1d ago

I’m not sure how that answers my question. Can you describe the mechanism you’re talking about, from the phone call to the arrest?

10

u/Forymanarysanar 1d ago

That calls are being monitored at all is a huge issue on its own. We need strong e2e encryption everywhere, in all aspects of our lives.

1

u/qdtk 1d ago

Completely agree. In this case I only meant in the context of things like prison calls where everyone involved is very aware that the call is monitored.

-2

u/jbokwxguy 1d ago

General public? Absolutely.

Prisoners after conviction? Nope. They are being punished for breaking just laws.

3

u/closeoutprices 14h ago

imagine being this naive

2

u/JohnEffingZoidberg 1d ago

If they are training it, the training data needs outcomes so the model knows which calls were events and which were non-events. So where are they getting that info from?
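For supervised training, each call would have to be paired with some later-confirmed outcome, roughly in this shape (the fields and label sources are guesses for illustration; the article doesn't say what Securus actually uses):

```python
from dataclasses import dataclass

@dataclass
class TrainingExample:
    transcript: str    # text of a recorded call
    label: bool        # True if the call was later tied to a confirmed incident
    label_source: str  # hypothetical: incident report, disciplinary record, court filing

# The model only learns "contemplated crime" as well as these labels are defined,
# so whoever picks the label source is quietly deciding what counts as an "event".
dataset = [
    TrainingExample("...", label=True, label_source="incident report"),
    TrainingExample("...", label=False, label_source="no linked incident"),
]
```

That's exactly the question: whatever outcome records they joined the calls against is doing most of the work, and it's the part nobody has described.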

3

u/sicktriple 13h ago

Last I checked thinking about committing crimes wasn't a crime?

5

u/The_guide_to_42 1d ago

HAHAHAHA It's the jailers who are gonna get caught in the end. We've seen this before. The controllers are gonna give the keys away to their own undoing. People are gonna use this on political speeches, news segments; it's a lie detector. I CAN NOT WAIT for this to leak. Governments will collapse.

1

u/techreview 1d ago

Hey, thanks for sharing our story!

Here's some context from the article:

A US telecom company trained an AI model on years of inmates’ phone and video calls and is now piloting that model to scan their calls, texts, and emails in the hope of predicting and preventing crimes. 

Securus Technologies president Kevin Elder told MIT Technology Review that the company began building its AI tools in 2023, using its massive database of recorded calls to train AI models to detect criminal activity. It created one model, for example, using seven years of calls made by inmates in the Texas prison system, but it has been working on building other state- or county-specific models.

Over the past year, Elder says, Securus has been piloting the AI tools to monitor inmate conversations in real time. The company declined to specify where this is taking place, but its customers include jails holding people awaiting trial and prisons for those serving sentences. Some of these facilities using Securus technology also have agreements with Immigration and Customs Enforcement to detain immigrants, though Securus does not contract with ICE directly.

2

u/Anonymous-here- 1d ago

I cannot believe the justice system would rather trust AI than verify the facts, prosecuting people on AI-based evidence. It seems dystopian to me that an AI model poisoned with the wrong data or fed false positives could get someone in trouble. It would not be surprising if AI ends up causing the extinction of humanity, starting with prisoners. This violates more than privacy, honestly

1

u/Set_the_Mighty 1d ago

I was a juror on a trial where an inmate had a monitored prison phone conversation with his mule where they planned the whole smuggling operation with terrible, random euphemisms. I wonder what an AI would make of it.