Cellebrite Dumps AI Into Its Cell Phone-Scraping Tool So Cops Can Hallucinate Evidence

www.techdirt.com/2025/03/18/cellebrite-dumps-ai…

I honestly don’t understand this compunction to break things that are already working fine. Axon makes body cameras (and Tasers!), but it simply wasn’t enough to equip cops with cameras…


3 Comments

I honestly don’t understand this compunction to break things that are already working fine.

Simple market capitalist mindset. If you're not expanding, you're dying.


Equipping cops with cameras gets cops in prison, not the people cops are terrorizing, that’s the difference.


The headline seems a little misleading, since there's no reason to think AI summaries themselves are going to be what gets presented to a court. Sounds like the actual concern is violating someone's rights by going through all of their phone data beyond the scope of a warrant:

Allowing software to just go blundering around in the scraped contents of a seized phone is quite another, as ACLU lawyer Jennifer Granick stated to 404 Media:

“The Fourth Amendment does not permit law enforcement to rummage through data, but only to review information for which there is probable cause. To use an example from the press release, if you have some porch robberies, but no reason to suspect that they are part of a criminal ring, you are not allowed to fish through the data on a hunch, in the hopes of finding something, or ‘just in case.’“


Comments from other communities

Ha ha ha, good quality headline from The Onion!

sees source

.......well fuck.

A month or two ago, there were a few articles involving Jim Jordan that were peak Not The Onion material.



Sounds like a great way to get evidence thrown out of court.

In a just judicial system, yes. But that's not what we have in the US.


Mission is going according to plan



Even RAG is terrible at accuracy and avoiding hallucinations.


Fabricating evidence is the one thing cops are good at.


The Fourth Amendment implications are on point here, but this tool isn't "hallucinating" evidence. It's a shitty LLM that lazy investigators can use mostly to find links between different device artifacts.

Cellebrite is dumping money into this because it's the industry buzz right now. They just want more of that sweet government contract money. Its usefulness (and even invasiveness in some cases) is pretty overstated.

While it being junk is all well and good, how do you convince a judge or a jury that their "evidence" is garbage?


Even less shitty LLMs tend to hallucinate.



Ironically, the number of inaccuracies and half truths this article contains makes me think it was written by AI.

Got a lot of people to click on it while raging, though, so it served its purpose.

In case anyone's interested in the source material, here's the press release it's going on about. The AI is about searching and analyzing evidence, it isn't fabricating anything that'll actually be used in court.

I'm not holding my breath.


The problem is that those of us not in digital forensics believe this BS. It fuels anti-law enforcement sentiment unjustly. Hate LE if you want, just make sure it's based on truth, not shite like this.




This article is written with some wild speculation by both the author and the source they're quoting. When cell phones are cracked for evidence, examiners have to use write blockers when they copy the phone. They do the analysis on the copy. The original is then re-copied for court to show what was found. This way the original is never tampered with and made inadmissible, and whatever analysis bullshit you did isn't mixed in with your courtroom copy. What this also means is that your AI can hallucinate all it wants and make up any evidence you can imagine all day long, but when you get into the courtroom and have to point to where the conclusions came from and you can't, you'll be standing there with a dick on your forehead and a case being tossed out.
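The integrity check described above usually comes down to cryptographic hashing: if the working copy hashes identically to the original acquisition, the copy is provably untampered, and anything an analysis tool invents can't silently leak into the evidentiary image. A minimal sketch of that verification step (the function names and file paths here are illustrative, not any vendor's actual API):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 in 1 MiB chunks so large
    disk images never need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def copy_is_intact(original_image: str, working_copy: str) -> bool:
    """The working copy only stands in for the original if the
    hashes match bit-for-bit; any alteration flips the digest."""
    return sha256_of(original_image) == sha256_of(working_copy)
```

Any AI-generated "finding" that can't be traced back to bytes present in a hash-verified image is, by construction, not in the evidence.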


Post-truth society sucks.

