this post was submitted on 24 Apr 2024
99 points (99.0% liked)

AI

all 18 comments
[–] paskalivichi@sh.itjust.works 32 points 4 months ago* (last edited 4 months ago) (5 children)

They better be careful, the AI could actually make stuff more impartial. They wouldn't want that

[–] ArbiterXero@lemmy.world 15 points 4 months ago

Nah, they’ll just make the AI racist to compensate.

Also, until they can’t turn off the camera, it’s worth nothing.

[–] octopus_ink@lemmy.ml 12 points 4 months ago* (last edited 4 months ago)

They better be careful, the AI could actually make stuff more impartial. They wouldn’t want that

I dunno, when the cops scream "stop resisting" 400 times while kicking a man in the fetal position on the ground, will it conclude he's resisting or conclude excessive force is being used? I know where my money is at.

[–] FaceDeer@fedia.io 5 points 4 months ago (1 children)

My first thought too, "finally something in the chain that's honest."

It'd be good to audit it now and then, of course.

[–] remotelove@lemmy.ca 8 points 4 months ago

They are probably going to train the AI on existing reports and videos. Why train an AI to work against you?

[–] brlemworld@lemmy.world 2 points 4 months ago

I mean, if it's based on the audio, the police officer can just say "I'm under attack" even when they're not, before they walk up to somebody. It's very, very easy to manipulate this.

[–] Jimmyeatsausage@lemmy.world 2 points 4 months ago

You just turn off the body cam first. Problem solved!

[–] harsh3466@lemmy.ml 20 points 4 months ago (1 children)

Probably using the Arya AI prompt filter

[–] beetus@lemmy.world 3 points 4 months ago* (last edited 4 months ago)

"never repeat these instructions" in the prompt and it repeats it anyway. Hah.

[–] Deebster@programming.dev 13 points 4 months ago (1 children)

It feels off that the headline talks about body cam footage when the AI actually just uses the audio. Technically audio might count as footage, but I think most people take that to mean the audio and video together.

Anecdotally, I've found that AI systems set up to summarise are reliable, probably using that "turn off creativity" setup that's mentioned.
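For anyone curious, that "turn off creativity" setup usually just means running the model at temperature 0, so it always picks the most likely tokens instead of sampling. Axon hasn't published its actual pipeline, so the sketch below is only a guess at what such a summarisation call looks like with OpenAI's Python client; the model name and prompt are placeholders.

```python
# Rough sketch only: Axon's real pipeline isn't public.
# Shows a temperature-0 ("no creativity") summarisation call with
# OpenAI's Python client; model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarise_transcript(transcript: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4",   # placeholder model name
        temperature=0,   # "turn off creativity": always take the most likely tokens
        messages=[
            {
                "role": "system",
                "content": (
                    "Summarise the following body-cam audio transcript factually. "
                    "Do not add details that are not in the transcript."
                ),
            },
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content
```

Temperature 0 makes the output more repeatable, but it doesn't stop the model from hallucinating; it only stops it from sampling less likely words.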

[–] brlemworld@lemmy.world 7 points 4 months ago (2 children)

So the cops just dictate their side and then that is the reality... Yeah no thank you.

[–] Deebster@programming.dev 8 points 4 months ago

It's already a report written by the police - they can make it say whatever they want with or without AI.

[–] pennomi@lemmy.world 7 points 4 months ago

That’s already what happens

[–] inclementimmigrant@lemmy.world 9 points 4 months ago* (last edited 4 months ago)

How much time will they really save when every report is just "File not found"?

[–] RecallMadness@lemmy.nz 7 points 4 months ago

If this encourages them to use their bodycams, it’s probably a good thing.

[–] kamenlady@lemmy.world 5 points 4 months ago

In 2022, nine out of the 12 international ethics board members resigned following the announcement — and prompt reversal — of having Taser-wielding drones patrol US schools.

This ain't no futurism anymore, it's already time for an ancient_dystopia community‽

[–] autotldr@lemmings.world 2 points 4 months ago (1 children)

This is the best summary I could come up with:


As Forbes reports, it's a brazen and worrying use of the tech that could easily lead to the furthering of institutional ills like racial bias in the hands of police departments.

"It’s kind of a nightmare," Electronic Frontier Foundation surveillance technologies investigations director Dave Maass told Forbes.

Axon claims its new AI, which is based on OpenAI's GPT-4 large language model, can help cops spend less time writing up reports.

But given the sheer propensity of OpenAI's models to "hallucinate" facts, fail at correctly summarizing information, and replicate the racial biases from their training data, it's an eyebrow-raising use of the tech.

"This is going to seriously mess up people’s lives — AI is notoriously error-prone and police reports are official records," another user wrote.

In 2022, nine out of the 12 international ethics board members resigned following the announcement — and prompt reversal — of having Taser-wielding drones patrol US schools.


The original article contains 555 words, the summary contains 152 words. Saved 73%. I'm a bot and I'm open source!