Let’s face it though, the movie never made a lot of sense.
They replaced humans with an artificial intelligence capable of independent thought (why did it need to be capable of independent thought?).
All they needed to do was replace the humans with remote-control relays; there's absolutely no reason for it to be an AI. If you're giving artificial intelligence access to weapons, then you are the problem, not the artificial intelligence.
I wish we could have a sensible conversation about AI without assuming it's going to kill everyone because that happened in some movie. AI's biggest threat to humans is that it will replace everyone by doing a better job and our entire economic system will fall apart, not that it will start Armageddon.
Is there a difference?