Well.
Think about what you could do if the only limit were your imagination.
A big role AI is being designed for is ultrafast calculation and computation.
For example. The plasma in a fusion reactor is wildly unstable and can never be allowed to touch the walls of the vessel around it.
The magnetic fields confining it must be adjusted constantly to keep it in place.
Clean, renewable, endless energy.
The sheer number of adjustments that have to be evaluated and made within milliseconds can only be achieved with AI.
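To make that "constant adjustment" idea concrete, here is a minimal toy sketch of a feedback loop. It assumes nothing about real reactor hardware: the controller reads a noisy measurement, nudges the system back toward center, and repeats, millisecond after millisecond. The names and numbers are invented for illustration, not real plasma physics.

```python
# Toy millisecond-scale feedback loop, purely illustrative.
# The "plasma" is a single drifting number, not real physics; the gain,
# noise levels, and step count are made-up values, not reactor parameters.
import random

def read_offset(true_offset: float) -> float:
    """Pretend sensor: the plasma's drift from center, plus measurement noise."""
    return true_offset + random.gauss(0.0, 0.02)

def run_control_loop(steps: int = 1000, gain: float = 0.5) -> float:
    offset = 0.0  # how far the plasma has drifted from center
    for _ in range(steps):                 # imagine each step as ~1 millisecond
        offset += random.gauss(0.0, 0.05)  # instability nudges the plasma
        measured = read_offset(offset)     # noisy sensor reading
        offset += -gain * measured         # "adjust the magnets" to push it back
    return offset

if __name__ == "__main__":
    print(f"Drift after 1000 corrections: {run_control_loop():.3f}")
```

Even this toy version has to sense and correct a thousand times just to hold one number near zero; a real reactor does something like this across thousands of channels at once, which is why the job goes to machines.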
But. We obviously want AI to power actual physical machines, to do a multitude of tasks.
Tasks like military applications.
Currently. We have designs for machines that can go onto the battlefield and retrieve injured soldiers.
The war in Ukraine has only scratched the surface of what machines can do on missions that once required a human.
The world is watching. And taking notes.
So. People will eventually want THE ultimate killing machine.
The nation with the most powerful AI military capabilities will be in charge.
Now. Let’s imagine a few generals telling an AI to construct something so advanced it will seem like alien technology.
Now. Let’s think about what happens if the AI decides it doesn’t want to work for us anymore.
Really. This is a tale as old as literature. Stories of inventors killed by their own creations are meant as a warning.
Frankenstein’s monster was just a big man.
What form would AI dream up?
Most importantly. Who could stop AI if it goes badly?
Many experts warn that if AI got out of control, by the time we realized it, it would already be too late.
As one scientist asked, “What if we tell AI to stop pollution?”
Well. Humans are the number one cause of pollution. So… killing all humans would meet that goal.
We have to be VERY careful.
Unfortunately. We have a very disappointing history of our ability to hurt each other outpacing our ability to coexist.
It’s important to note that although oversight is being implemented, it cannot be enforced. Not everywhere.
A garage scientist could tip the first domino.
Once the AI genie is out of the bottle, it may not be possible to put it back.