All the total nonsense that people have accepted makes the bar for recognizing intelligence awfully low!
Do you think we need to rethink the turing test?
Please continue.
The irony being, I'm not sure LP could pass a Turing test.
I have been reading your posts on this subject and I think you write very well about it. *insert completely unrelated link*
Do you think we need to rethink the turing test?
What would you like me to continue with?
Well, an AI that can convincingly portray a Patriot Party flat-earther Karen may technically pass the Turing test, but is it really Artificial Intelligence?
At least someone gets it!
Pretty sure Loren would pass, but there's a certain newcomer around here I was wondering about... lotta stock material.
How else would you formulate the test? I think it's a good test. The basic problem is philosophical. We have chosen to use human intelligence as the baseline for what intelligence is. Since humans aren't all that smart, it has problems. But if you open it up to other definitions of what it means to be intelligent, we quickly run into trouble. There's an infinite variety of what counts, and everybody wants to prove how their thing is the smartest, so they use a definition where they shine. At least the Turing test does away with all that.
But does the Turing test really count if the software design intent is to mimic someone who is using emotions, not reason? Just regurgitating canned talking points and favored accusations to live out their persecuted rebel fantasy?
The thing is we have so many extremists these days that could probably be emulated with a few thousand canned responses and a weighted keyword system to decide which of them to pick.
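For what it's worth, that "canned responses plus a weighted keyword system" idea is basically an ELIZA-style chatbot, and it fits in a few lines. A minimal sketch below, in Python — the keyword weights, trigger words, and stock replies are all made-up placeholders, not anyone's real design:

```python
import random

# Hypothetical canned-response table: each entry pairs weighted trigger
# keywords with a bucket of stock replies for that talking point.
RESPONSES = [
    ({"election": 3, "rigged": 5}, ["Wake up, the whole thing was rigged!"]),
    ({"globe": 2, "horizon": 4, "curve": 4}, ["Have you ever actually SEEN the curve?"]),
    ({"media": 3, "truth": 2}, ["The mainstream media won't tell you the truth."]),
]

# Replies used when no keyword bucket scores above zero.
FALLBACK = ["Do your own research!", "That's exactly what they want you to think."]

def pick_response(message: str, rng: random.Random = random.Random()) -> str:
    """Score each bucket by summing the weights of its keywords that
    appear in the message; answer from the highest-scoring bucket."""
    words = message.lower().split()
    best_score, best_replies = 0, FALLBACK
    for keywords, replies in RESPONSES:
        score = sum(w for kw, w in keywords.items() if kw in words)
        if score > best_score:
            best_score, best_replies = score, replies
    return rng.choice(best_replies)
```

Feed it "the election was rigged" and it fires the election bucket; feed it anything off-script and it falls back to all-purpose deflections. A few thousand entries instead of three and you'd have the poster in question.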
I would suggest that software written to emulate an extremist would be a parody of that form of 'streemism. Without a clear indicator of the author's intent, it is impossible to create a parody of extreme views so obviously exaggerated that it cannot be mistaken by some readers for a sincere expression of the views being parodied.