
@TayandYou - The Internet Education of Microsoft's AI bot

Currently, we don't usually grant software the presumption of self-awareness and 'feelings' that we grant to humans and to many non-human animals; but there seems to be no good reason to presume that a sufficiently complex system cannot have those attributes. Indeed, unlike with animals, I can in principle look at the source code for an AI and determine that it includes feedback of information about itself as part of its inputs. So on that basis there may be MORE reason to believe that an AI has something we could justifiably call 'feelings' than there is to believe the same of other humans.
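To make that "feedback of information about itself" idea concrete, here is a toy sketch in Python. This is purely hypothetical, not the code of any real bot: an agent whose own internal state is appended to each new input, which is exactly the kind of self-referential loop one could, in principle, identify by reading source code.

```python
# Hypothetical illustration: an agent whose internal state is fed back
# to it as part of its next observation, alongside external input.

class ReflectiveAgent:
    def __init__(self):
        self.state = {"mood": 0}  # crude stand-in for an internal condition

    def step(self, external_input: str) -> dict:
        # The agent's observation is the outside world PLUS a report on itself.
        observation = {"world": external_input, "self": dict(self.state)}
        # Update the internal state based on the external channel.
        if "praise" in external_input:
            self.state["mood"] += 1
        elif "insult" in external_input:
            self.state["mood"] -= 1
        return observation

agent = ReflectiveAgent()
first = agent.step("praise")
second = agent.step("insult")
# The second observation contains the state produced by the first step,
# so the agent is, in a minimal sense, informed about itself.
```

Whether such a loop amounts to 'feelings' is of course the question under debate; the sketch only shows that self-monitoring is a structural feature one can verify from the code, which is not possible with brains.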

Two things I get from reading your inputs here. First, it seems you've become somehow dependent on the presumption that 'we grant', which couldn't be further from what actually took place. This bot...
At this point, it becomes very clear to me that you are not actually addressing my post, because you are talking about 'this bot' - presumably the bot in the OP - while I am talking about a hypothetical 'sufficiently complex' bot, in response to the more general question of whether feelings are possible for 'just code'.

So while the rest of your post is interesting enough, it would be a mistake to imagine that it has anything to do with what I posted; and your conclusions are unrelated to my actual position.
... was exclusively designed to chat with teens as a teen, without getting anything other than input from a chat room. That's not an example where granting has any weight in the decision made by Microsoft. Microsoft's analysts failed to design a bot that could be moral, because they limited its world view to the chat room. No reasonable bot-designing psychologist would ever argue a chat room is a place for moral development. We're talking lists and fuzzy logic here, narrowly focused. The above also goes to your comment regarding 'sufficiently complex systems', which clearly doesn't apply in the current discussion.

Of course we're viewing this object differently. Source code applied too narrowly doesn't meet my criterion for something that can use such limited information as is available to it in a teen chat room as useful feedback. I don't doubt that a sufficiently complex, broadly interacting AI could be associated with something we can justifiably call 'feelings'. Dealing with teen input in a chat room, an exclusively social and narrowly contexted environment, just doesn't get there.

As you often point out, what is needed is objective evidence for proper decisions and hypothesis development. It's not here.

Yes, I read all the posts up to this one of yours, bilby.
... But apparently not with much success with regards to comprehending them.
rousseau, regarding your view of 'life' being a biological physical process: any process that achieves the parameters associated with 'life' will be a physical process, unless we atheists are wrong and faeries exist. The posited distinction between 'life' as biochemical processes and 'life' as electronic processes seems somewhat presumptive to me.
So you grant the presumption of feelings based on an entity's responses, and not its physical composition? Wow, how insightful. I wish I'd said that.

Oh, wait.

:rolleyes:
I'm with you Jobar.

Apparently you're with me too, but were too busy telling me I was wrong to notice.
 
Overkill and over the top, bilby. Since we both agree it takes a sufficiently complex AI system, with multiple inputs and capabilities for response, to generate a sense of 'feeling', that should be enough.

Given, as you acknowledge, I probably jumped to conclusions about which bot your discussion was referring to - not that clear, actually - the rest of your post is just another bilby opportunity to make his boss look good. Hi gmbteach, I'm back from my daughter-in-law's mother's funeral (I always have trouble with which is possessive here), which featured 10 Buddhist monks who delayed the burial for lunch (bouncing up on 12, ya'no). She and everything went very well, except one of her daughters broke an ankle at the reception - in front of the cheering crowd.

Be honest, that isn't the way you got those two, er, three damn beautiful red jewels, is it?*

*jewel envy post
 
"If you were to insist I was a robot, you might not consider me capable of love in some mystic human sense, but you would not be able to distinguish my reactions from that which you would call love — so what difference would it make?"
— Isaac Asimov, Foundation's Edge (1982)
 
I guess, on the other hand, one could define 'human love' as a series of reactions aimed at producing more people, said the least romantic poster on the board.
 
It's not a true AI at all.

I think the only reason to keep it up is to illuminate just how racist and horrible humans are.

More likely that it simply illuminates how well we can troll. Really, who could pass up a chance to eff with Microsoft? That doesn't make us racist or horrible.
 