Can artificial intelligences become fully aware of themselves if they cannot be sentient (overcome their programming)?
For the purposes of this question, I’m assuming that artificial intelligences cannot become “human”, that is, have emotions, decide for themselves, etc. I’m only asking if they can be fully aware of themselves. The examples are numerous: the Machine from “Person of Interest”, KITT from “Knight Rider”, R2-D2 from “Star Wars”, etc.
N.B. See also “Eagle Eye”, “Echelon Conspiracy”, and “WarGames”.
11 Answers
Wouldn’t it depend on what they are programmed for and limits .
However self teaching if programed to observe and learn “may” help to become self taught,at least I would think of that possiblility.
Especially in the first line.
Not without the total ability to reason. Remember how Robby the Robot reacted to a conflict between input and programming in the movie “Forbidden Planet”?
Isn’t the definition of sentient being self-aware?
I believe it is a mistake to tackle your question based on the premise, implied in your details, that emotions are necessary to the definition of sentience. I also suspect that the obsession with science fiction and futuristic stories blurs the line between possibility and reality for those who dabble in such matters, as opposed to those schooled in scientific disciplines.
“A.I.” is not something that can be sentient right now. It just does not work that way.
It depends on what you mean by self-aware. It should be possible to program a robot to recognize itself as a physical object and to recognize itself in a mirror.
No. Some symbols that correspond to some things about the AI can be encoded, and humans can program an AI to act on its data in ways that seem, to humans, to correspond to sensible behavior. But it’s just an information system: the understanding comes from the human designers and their programming, and exists in the AI only as data representations. At some logical level there can be a representation of the AI’s own properties, but that is not understanding, per se, of the AI itself.
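The point above can be made concrete with a toy sketch (purely illustrative; the class, field names, and rules are invented for this example): a program can hold a data representation of its own properties and act on it, yet all the “meaning” of those symbols lives with the designer, not in the program.

```python
# Illustrative sketch only: an agent with a "self-model" that is just
# designer-chosen symbols in a dictionary. The agent can look up and
# act on these symbols, but it has no understanding of what they mean.

class Agent:
    def __init__(self, name, battery_level):
        # The self-model is a data representation of the agent's own
        # properties, encoded by the human designer.
        self.self_model = {"name": name, "battery_level": battery_level}

    def introspect(self, key):
        # "Self-awareness" here is nothing more than a dictionary lookup.
        return self.self_model.get(key)

    def act(self):
        # A rule written by the designer makes the behavior *seem* aware:
        # the agent "knows" its battery is low and "decides" to recharge.
        if self.self_model["battery_level"] < 20:
            return "seek charger"
        return "continue task"

agent = Agent("unit-7", battery_level=15)
print(agent.introspect("name"))  # unit-7
print(agent.act())               # seek charger
```

The agent reports facts about itself and responds to them, which looks like self-awareness from the outside; internally it is only symbol manipulation over a table the designer filled in.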
@luigirovatti
Wouldn’t it depend on what they are programmed for and limits .
However self teaching if programmed to observe and learn “may” help to become self taught, at least I would think of that possibility.
(Corrected spelling: programmed and possibility.)
But I think you understood anyway?