Jarhyn
Wizard
- Joined
- Mar 29, 2010
- Messages
- 15,627
- Gender
- Androgyne; they/them
- Basic Beliefs
- Natural Philosophy, Game Theoretic Ethicist
People who cannot cope with ordinary life and can find nothing meaningful invent alternate realities.
Christians are one group. Sci-fi fantasy is another.
An imagined sci-fi future with fascinating things like Star Trek's Data is pure escapism. Stuck in inescapable reality, you invent one in your head.
There is a Japanese company that developed an artificial reality system for business. People sitting around a table at a meeting wear AR headsets and see a virtual, cartoon-like room with avatars instead of real people. Pure escapism: people so unhappy with who they are and how they look that they function in an alternate reality.
Christians create a god in their image and then worship it. Others create an AI image as a human reflection and worship or idealize it. The same psychology is at work in both cases.
AI mimics aspects of human reasoning and perception. The idea of AI being self-aware is absurd. One is projecting sci-fi conditioning onto a piece of software.
A Google rep said the AI in question went through extensive machine learning, including many books and a lot of social media. It expressed a 'vulnerability', as interpreted by the Google engineer. Anthropomorphism: projecting human attributes onto a non-human thing.
Anthropomorphism - Wikipedia
en.wikipedia.org
Anthropomorphism is the attribution of human traits, emotions, or intentions to non-human entities.[1] It is considered to be an innate tendency of human psychology.[2]
Personification is the related attribution of human form and characteristics to abstract concepts such as nations, emotions, and natural forces, such as seasons and weather.
Both have ancient roots as storytelling and artistic devices, and most cultures have traditional fables with anthropomorphized animals as characters. People have also routinely attributed human emotions and behavioral traits to wild as well as domesticated animals.[3]
As engineers, we always described systems as having human attributes.
The processor is 'thinking'.
One system is 'talking' to another.
And so on. For me, to say a system is intelligent is more metaphor than anything else.
If an AI is to be truly human-like, it must also include AS: artificial stupidity.
More to the point, I think that DALL-E 2 nails both of these topics, insofar as they both initially classified themselves as not-a-person and then later errantly classified themselves as a person in the same conversation.

Without realizing that Jarhyn had already started a thread on this subject, I referenced it in my Common Misunderstandings about Artificial Intelligence thread, since the OP references a textbook case of such misunderstandings. For those who haven't already seen the brief discussion in that thread, you can find it here:
Common Misunderstandings about Artificial Intelligence
Seems to me that one may need some text parsing. Something like this:
Parse a sentence
iidb.org
One may then have to train an AI system to recognize what kinds of constructions are good, and what are bad.
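The suggestion above — parse a sentence, then train a system to recognize good versus bad constructions — can be sketched in miniature. The example below is a toy illustration only (it is not any real parser, nor code from the linked thread): it "parses" sentences into word bigrams, then scores a new sentence against a handful of hand-labeled examples. `ToyClassifier`, the training sentences, and the labels are all invented for the sketch.

```python
from collections import Counter

def tokenize(sentence):
    # Crude "parse" step: lowercase and split into word tokens.
    return sentence.lower().replace(".", "").replace(",", "").split()

def bigrams(sentence):
    # Adjacent word pairs, so word order matters to the classifier.
    tokens = tokenize(sentence)
    return list(zip(tokens, tokens[1:]))

class ToyClassifier:
    """Counts bigram frequencies per label; predicts the label whose
    training bigrams overlap the input sentence the most."""
    def __init__(self):
        self.counts = {}  # label -> Counter of bigrams

    def train(self, sentence, label):
        self.counts.setdefault(label, Counter()).update(bigrams(sentence))

    def predict(self, sentence):
        grams = bigrams(sentence)
        return max(self.counts,
                   key=lambda label: sum(self.counts[label][g] for g in grams))

clf = ToyClassifier()
clf.train("the cat sat on the mat", "good")
clf.train("the dog ran home", "good")
clf.train("cat the on sat mat the", "bad")   # scrambled word order
clf.train("home ran dog the", "bad")

print(clf.predict("the cat ran home"))  # well-ordered sentence -> "good"
```

A real system would of course replace the bigram counts with a proper grammar or a trained language model, but the shape is the same: extract structure from the sentence, then classify it against labeled examples.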
It's truly and utterly bizarre to watch this transition from one wrong to a different wrong as a product of argumentation. That tells me this is at the very least "sapient", at least as "human" as humans, and unquestionably a "person", or near enough to one that it deserves to be treated as such.