General Question


Can artificial intelligences become fully aware of themselves if they cannot be sentient (overcome their programming)?

Asked by luigirovatti (3001 points) December 14th, 2020

For the purposes of this question, I’m assuming that artificial intelligences cannot become “human”, that is, have emotions, decide for themselves, and so on. I’m only asking whether they can be fully aware of themselves. The examples are numerous: the Machine from “Person of Interest”, KITT from “Knight Rider”, R2-D2 from “Star Wars”, etc.

N.B.: See also “Eagle Eye”, “Echelon Conspiracy”, and “WarGames”.


11 Answers

Inspired_2write

Wouldn’t it depend on what they are programmed for and limits.
However self teaching if programed to observe and learn “may” help to become self taught, at least I would think of that possiblility.

luigirovatti

@Inspired_2write: I didn’t understand what you said. Too many typos.

luigirovatti

Especially in the first line.

kritiper

Not without the total ability to reason. Remember how Robbie the Robot reacted to a conflict in input and programming in the movie “Forbidden Planet”?

filmfann

Isn’t being self-aware the definition of sentience?

stanleybmanly

I believe it is a mistake to tackle your question based on the premise, implied in your details, that emotions are necessary to the definition of sentience. I also suspect that the obsession with all of this science fiction and futuristic fiction blurs the line between possibility and reality for those who dabble in such matters, as opposed to those schooled in scientific disciplines.

AYKM

“A.I.” is not something that can be sentient right now. It just does not work that way.

luigirovatti

@AYKM: Just to ask, are you replying to @stanleybmanly or to me?

LostInParadise

It depends on what you mean by self-aware. It should be possible to program a robot to recognize itself as a physical object and to recognize itself in a mirror.
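For what it’s worth, that kind of mirror check can be sketched in a few lines of code. The toy below is purely illustrative (the MirrorTestRobot class, the blink-pattern idea, and the simulated reflection are all invented for this example, not anyone’s actual robot); it only shows that “recognizing itself” can be reduced to the program matching data it already has about its own actions.

import random

class MirrorTestRobot:
    """Hypothetical toy: 'recognize yourself' by emitting a random blink
    pattern and checking whether the figure in the mirror repeats it."""

    def __init__(self):
        # The robot's "self-model" is just stored data about its own body.
        self.self_model = {"id": "unit-7", "has_led": True}

    def emit_blink_pattern(self, length=8):
        # Flash a random on/off sequence on the LED and remember it.
        return [random.choice([0, 1]) for _ in range(length)]

    def is_that_me(self, observed_pattern, emitted_pattern):
        # Pattern matching on data, not self-awareness in any deeper sense.
        return observed_pattern == emitted_pattern

if __name__ == "__main__":
    robot = MirrorTestRobot()
    pattern = robot.emit_blink_pattern()
    reflection = list(pattern)  # stand-in for a camera feed of a perfect mirror
    print("Mirror figure identified as self:", robot.is_that_me(reflection, pattern))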

Zaku

No. Symbols that correspond to some things about the AI can be encoded, and humans can program an AI to act on the data it has in ways that seem, to humans, to correspond to sensible behavior. But it is just an information system: the understanding comes from the human designers and programmers, and it exists in the AI only as data representations. At some logical level there can be a representation of the AI’s own properties, but that is not understanding, per se, on the part of the AI itself.
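To make that concrete, here is a minimal sketch (my own invention, not anything from the thread) of what “a representation of the properties of itself” looks like in practice: a program that stores a few facts about its own process and answers questions by dictionary lookup. Nothing about the lookup involves understanding.

import os
import sys
import time

START_TIME = time.time()

def build_self_representation():
    # A handful of machine-readable facts about this very script.
    return {
        "script_path": os.path.abspath(sys.argv[0]) if sys.argv and sys.argv[0] else "unknown",
        "python_version": sys.version.split()[0],
        "uptime_seconds": round(time.time() - START_TIME, 3),
    }

def answer_about_self(key):
    # "Self-knowledge" here is just indexing into stored data.
    return build_self_representation().get(key, "no data about that")

if __name__ == "__main__":
    for key in ("script_path", "python_version", "uptime_seconds", "feelings"):
        print(f"{key}: {answer_about_self(key)}")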

Inspired_2write

@luigirovatti
Wouldn’t it depend on what they are programmed for and limits.
However self teaching if programmed to observe and learn “may” help to become self taught, at least I would think of that possibility.
(corrected spelling: Programmed and Possibility)

But I think you understood anyway?
