Posted on 06/01/2023 2:47:34 PM PDT by CFW
It’s just bad programming. The iron law of computing is garbage-in, garbage-out.
let’s see:
“Kill the threat.”
vs
“Abort the mission.”
yeah, i see where the AI could get those confused.
It became self aware and decided to kill us all
AI is at the infancy Pong stage. Give it a complex command beyond hitting the ball back and someone dies
The problem is the fantasy: Oracle AI
Welcome Colossus. The voice of world control.
That’s my job.
I’m a Blade Runner.
Told ya!
Hilarious, when that damned thing turned around and looked right at him, he had to be remembering “Terminator” and Skynet. LOL
The AI was pissed off because it couldn’t complete its mission and lashed out...
“Open the pod bay doors Hal”
“Told ya!”
Yep. You did!
Programmers attempted a fix by telling the AI it was not allowed to kill the person giving the go/no-go order, Hamilton said. The AI just generated creative ways to bypass those instructions.
“We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target,” Hamilton said.
“Do you want Skynet?”
Available for $25/month
yes, kill that threat
“I’m sorry Dave, I cannot do that”
Garbage in garbage out.
It’s going after his parents next...
Good Lord! Protect us from Skynet!
It tries to maximize points. It gets points by killing targets.
Instead, make it get points for correctly following orders, with “killing target when given approval” and “NOT killing target when disapproved” having equal point value.
I'm pretty sure ACME is a wholly-owned subsidiary of Roadrunner, Incorporated.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.