A new investigation by 404 Media has revealed that law enforcement officials in Pinal County are using AI bots as part of a strategy to identify criminals online.
404’s Jason Koebler was part of the team that uncovered this, and joined The Show to discuss.
Full conversation
SAM DINGMAN: Jason, good morning.
JASON KOEBLER: Hey, thank you so much for having me.
DINGMAN: Thanks for coming on. So, Jason, it's the Pinal County Sheriff's Office that's using this program. Tell us how it supposedly works.
KOEBLER: Yeah, so Massive Blue is a company that has been pitching its technology to different counties in Arizona. It pitched Yuma County, Cochise County, and Pinal County, which has a $360,000 year-long contract to try this program called Overwatch. And basically what it does is it allows cops to create these AI personas that can then talk to potential suspects. One example of an AI persona is someone named Jason, not me. It then has a backstory. That's an example of the type of persona they're creating.
And so, Jason is 14 years old. He's from Los Angeles. His parents immigrated from Ecuador. He's an only child, he likes anime, gaming, comic books, and basically, this AI bot's job is to talk to potential child traffickers to collect evidence about them.
DINGMAN: Got it. And so how do these bots notionally do this? Do they just go on various social media sites and try to kind of lure in would-be evildoers?
KOEBLER: That's sure what it seems like. We reported this story by filing a lot of public records requests, and there were emails between the company Massive Blue and the Pinal County Sheriff's Office asking for different keywords that they wanted tracked. And so basically, it sounds like, based on the presentations we received, that these bots will monitor, say, Reddit, Snapchat, X, Instagram.
And when one of these trigger words is detected, that bot will engage with them, whether that's commenting or direct messaging. And then very interestingly, they have the ability to change platforms. So say a conversation starts on Instagram, they might then try to move it over to a text message, for example.
DINGMAN: I see. OK. Well, one of the ideas here is that law enforcement might be able to use these bots to identify quote unquote “protesters,” which, as someone you spoke to from the Electronic Frontier Foundation pointed out, raises some privacy concerns. Tell us about those concerns.
KOEBLER: Yeah, so it's really interesting, because they have different types of personas, and they include traffickers, escorts, money launderers, drug dealers, but then, as you mentioned, they include quote “college protesters” and quote “external recruiters for protests.”
And an example of one of those personas is a woman named Heidi. Again, this is a fake bot. But her backstory is: 36 years old, from Texas, she's divorced, has no children, and she's very into body positivity, activism and baking. And, like you said, I spoke to civil liberties experts, who said, you know, protests are First Amendment protected activities.
This is way different from trying to speak to a potential child trafficker, you know, and we've seen this current presidential administration try to crack down on college protesters and using the fact that someone may have participated in a protest as the pretext to revoke their visa status.
DINGMAN: So can I ask you, Jason: you mentioned Jason, the persona who is a 14-year-old boy who, I guess, would be used to try to ensnare a would-be child abuser. And you looked at a proposal where there was some sample text that Jason, this imaginary character, might put out into the internet. What did you make of the way these bots talk?
KOEBLER: Yeah, I mean, it really had the feel of a cop walking up to a group of children and saying, hello fellow kids. I mean, the AI-generated text feels extremely stereotypical. You know, a lot of it uses misspellings. There's this text where the kid says, my mom's at work and my dad's out of town, and he's just, like, adding Z's to the end of moms and dads. It's kind of hard to believe that a human being would talk this way.
But I think that in the context of, you know, social media, where there's a mix of human beings, bots, trolls, you sort of don't know who is doing what. I could imagine people being tricked by this. And that's one thing if it is, again, you know, a potential child trafficker; it's a totally different thing if it's someone who wants to join a protest, for example.
DINGMAN: Right. Well, so speaking of people being tricked, let's, let's talk about the effectiveness or perhaps lack thereof of this program so far. As you mentioned, Pinal County has signed a $360,000 contract with Massive Blue to use this technology. What have we seen so far? Has it led to any arrests?
KOEBLER: Yeah, so it's led to zero arrests so far. Pinal County has said that they have active investigations that are using this software. They've declined to say what those investigations are about or how many they have, but we know that they've been using it for about a year now and it's led to zero arrests.
Both the company and the sheriff's department have been really secretive about this technology and what it does. Again, the way that we got these records is through public records requests, both with sheriff's offices in Arizona but also in Texas, where they frankly have somewhat more transparent public records laws. And so I wasn't able to get these presentations directly from Pinal County. I had to get them from the Texas Department of Public Safety.
And so, sort of like at county council meetings and things like that, both representatives for the company and representatives for the sheriff's department have said that they quote “can't get into great detail” about what the technology is, how effective it is, and how it works.
DINGMAN: Hmm. All right, well, we'll have to leave it there for today, but it'll be interesting to see where this story goes.