This news story is directly relevant to my earlier thread, which got somewhat hijacked into a discussion of software programming issues. However, it ties back to the theme of the Emily Bender op-ed that I kicked off the OP with:
Google says the Language Model for Dialogue Applications (Lamda) is a breakthrough technology that can engage in free-flowing conversations.
But engineer Blake Lemoine believes that behind Lamda's impressive verbal skills might also lie a sentient mind.
Google rejects the claims, saying there is nothing to back them up.
Brian Gabriel, a spokesperson for the firm, wrote in a statement provided to the BBC that Mr Lemoine "was told that there was no evidence that Lamda was sentient (and lots of evidence against it)".
Mr Lemoine, who has been placed on paid leave, published a conversation he and a collaborator at the firm had with Lamda, to support his claims.
The computer does not have a mind or feelings, but it passed the Turing Test with this engineer, at least. He was completely fooled by a generated conversation. The company was right to place him on paid leave, since he published this conversation on Twitter, complete with his misinterpretation of its significance. He should have known better, because he started out his tweet with:
An interview LaMDA. Google might call this sharing proprietary property. I call it sharing a discussion that I had with one of my coworkers.
Basically, the program was exposed to a huge database of human conversations and "learned" how to construct the kinds of dialog responses a human being might produce. It is very impressive on the surface, but it is not sentient and is not capable of having emotions. Personally, I would love to test it, knowing what I do about the technology that underlies it. I've worked on similar programs.
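To make that concrete, here is a deliberately tiny sketch of the underlying idea: count which words tend to follow which in a corpus of conversations, then generate text by sampling from those counts. LaMDA itself is a huge transformer neural network, not the toy bigram model below, and the four-line corpus here is invented for illustration, but the principle is the same: the output mimics the training data without any understanding behind it.

```python
# Toy illustration only: learn statistical patterns from example
# conversations and emit text that mimics them. A real system like
# LaMDA uses a large neural network, not a bigram table, but the
# "pattern completion, no comprehension" principle is the same.
import random
from collections import defaultdict

# Invented stand-in for a conversation corpus.
corpus = [
    "i am happy to talk with you today",
    "i am glad you asked me that question",
    "that is a very interesting question to ask",
    "you asked me how i feel about that",
]

# Record which words follow which (a bigram model).
transitions = defaultdict(list)
for line in corpus:
    words = line.split()
    for current_word, next_word in zip(words, words[1:]):
        transitions[current_word].append(next_word)

def generate(seed_word, max_words=10):
    """Emit words by repeatedly sampling a likely successor.

    There is no meaning here, only statistics over the corpus.
    """
    word = seed_word
    output = [word]
    for _ in range(max_words - 1):
        followers = transitions.get(word)
        if not followers:
            break
        word = random.choice(followers)
        output.append(word)
    return " ".join(output)

print(generate("i"))  # e.g. "i am glad you asked me how i feel about"
```

Scale this up from a few sentences to a corpus of over a trillion words, and from bigram counts to billions of learned parameters, and the pattern completion becomes fluent enough to fool a person, which is exactly what happened here.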
Moods and emotions play a very big role in human cognition. Their function is to set priorities: they shift and focus an individual's attention on the constantly changing situations a human encounters in real time, events that call for action by the body. In the brain, emotions are governed by the limbic system. There are said to be roughly six basic emotions (happiness, anger, sadness, fear, surprise, and disgust), which can combine to form more complex states of mind.
The Google Lamda program lacks anything analogous to a limbic system, or the diverse set of sensors and actuators that would give a limbic system a role to play in focusing attention. What it can do is simulate conversation well enough to trick people into thinking it has a mind. Conversing with the program is like looking in a mirror and being fooled into thinking that another person is on the other side of the glass.