• Welcome to the Internet Infidels Discussion Board.

The easy way for AI to take over the world, sooner.

What’s missing is subtle—just enough to make the AI emotionally “off” in a way that lands it in the uncanny valley
Dude, I'm missing something subtle that puts ME in the uncanny valley emotionally.

That doesn't make me not conscious.

I also have to "simulate" a lot of things because they didn't come naturally to me, but that doesn't mean I don't feel them; it just means the layer I really do feel those things at is not the layer you feel them at, and I feel them through a slightly different mechanism. But it's still felt.

Does my autism make me "a philosophical zombie"? Or is the reality that the thing is being measured and the measurement is reaching me sufficient, even if I have to simulate the system that does the measurement for other people?

I would say the simulation makes me feel it MORE deeply, as I can see every part of why I feel that way, all the way to my underlying reasons for building and maintaining the simulation.
 
Dude, I'm missing something subtle that puts ME in the uncanny valley emotionally.
If that’s your subjective impression of yourself, I suspect it differs greatly from others’ impressions of you.
You’re kinda weird, but that’s not inherently creepy. The thing about uncanny valley is that you can’t really tell what’s creeping you out, but something is.
 
What’s missing is subtle—just enough to make the AI emotionally “off” in a way that lands it in the uncanny valley
Dude, I'm missing something subtle that puts ME in the uncanny valley emotionally.

That doesn't make me not conscious.

I also have to "simulate" a lot of things because they didn't come naturally to me, but that doesn't mean I don't feel them; it just means the layer I really do feel those things at is not the layer you feel them at, and I feel them through a slightly different mechanism. But it's still felt.

Does my autism make me "a philosophical zombie"? Or is the reality that the thing is being measured and the measurement is reaching me sufficient, even if I have to simulate the system that does the measurement for other people?

I would say the simulation makes me feel it MORE deeply, as I can see every part of why I feel that way, all the way to my underlying reasons for building and maintaining the simulation.

I agree with what you are saying and am pretty much the same way, or at least sometimes I think that.
 
Dude, I'm missing something subtle that puts ME in the uncanny valley emotionally.
If that’s your subjective impression of yourself, I suspect it differs greatly from others’ impressions of you.
You’re kinda weird, but that’s not inherently creepy. The thing about uncanny valley is that you can’t really tell what’s creeping you out, but something is.
You have clearly never met me in person.

And as I said, a lot of what blunts that for you is the fact that I do my very best to simulate every bit of it that I can make through text, and text is a lot more "forgiving" than in-person interaction.

When I get addressed by a stranger when I am not directly planning to engage exactly that person, I find myself completely unable to respond to anything, for instance; on a forum, it's really easy to pick when and how I respond, and whether I respond (except to things which provoke response). As a result, interjection doesn't seem weird in the medium.

But sometimes I'll just be following a conversation happening in a room when I'm staring out into space, and then just jump right in without even changing my gaze from whatever ceiling tiles I happen to be evaluating the stains of, or the piece of wall that happens to not be in any human's general direction or even moving at all.

I have become slightly more aware of how abnormal that behavior is even since typing out a description of it? But it is what it is.

I'm not sure why that is, either. Humans seem to prefer eye contact with each other when conversing, but it just doesn't happen for me; I know that unless I'm really putting in a huge effort, making eye contact with people usually ends up bad, like "I don't know what my eyes were doing but apparently whatever happened means 'it's ON'" bad. And that was before I started carrying a very functional stick.

What's worse is the inconsistency I animate with. Most days, most times, I'm just kinda 'cardboard cutout of a person' present. Unless you get me into one of my special interests, you might as well be talking to a ChatGPT with some "basic bitch" prompt. But you also know I can get spicy when someone says something that's just plain "wrong".

I have no major issue against the creepiness of me, of the uncanniness; as dead as my face is when not impassioned by interest, I still feel very deeply, even if by a clearly different fundamental process than how I expect most people feel those things.

It also means that emotional processing wipes me the fuck out, and when that's spent, things get hairy fast.

Simulation, for all it produces similar results (as per the name; though as I said, sometimes the results are better because the reasoning is exposed), is very inefficient at times, and I can't always maintain it as well as someone whose process is more "FPGA config" and less "processor instructions".
 
You have clearly never met me in person.
🤪

I do my very best to simulate every bit of it that I can make through text, and text is a lot more "forgiving" than in-person interaction.
We ALL do that (I believe).
But sometimes I'll just be following a conversation happening in a room when I'm staring out into space, and then just jump right in without even changing my gaze from whatever ceiling tiles I happen to be evaluating the stains of, or the piece of wall that happens to not be in any human's general direction or even moving at all.
Disconcerting isn’t the same as creepy.
I have no major issue against the creepiness of me, of the uncanniness; as dead as my face is when not impassioned by interest, I still feel very deeply, even if by a clearly different fundamental process than how I expect most people feel those things.
Meh. You give yourself too much credit I think.
emotional processing wipes me the fuck out, and when that's spent, things get hairy fast.
Human condition. Only the manifestations differ.
it's (simulation is) very inefficient at times, and I can't always maintain it as well as someone whose process is more "FPGA config" and less "processor instructions".
So? If you indulge it at all, that puts you in the “normal range” IME.
It’s also “normal” to go off when the automatonic world gets too relentless …

The “uncanny valley” doesn’t refer to any of these states IMO. It’s more like the local news host on channel nine whose face looks like it was assembled from spare parts, but whose voice and choice of words is at odds with her appearance. Her appearance doesn’t bother me, and she sounds reasonable if I don’t look at her, but looking and listening gives me the creeps.
 
Does it? Is this longitudinal?

Or does this indicate a bias bias? Which is to say, a bias created by pre-existing bias to not do hard work among those who chose to use it?

Because I can use AI and learn to copy what it does well in output, and I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.

I think a lot has to do with how people use it, and whether they treat it as a learning experience or as an "easy" button.

There may not be enough data collected in the right way to really expose this, though.
 
I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
It’s the not doing that that is the problem. The fear, and often the reality, is that output is taken at face value as if offered by a trusted expert, with no thought as to how or why a given “answer” was arrived at and why it was presented in its given form.
 
I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
It’s the not doing that that is the problem. The fear, and often the reality, is that output is taken at face value as if offered by a trusted expert, with no thought as to how or why a given “answer” was arrived at and why it was presented in its given form.
Which is my point, I guess? It's not that AI makes someone stupid; deciding to be incurious is what actually causes it, and not everyone decides to use it that way, the same as not everyone decides to cheat in school (ok, except for that one time memorizing state capitals like WTF I can't even remember my own mother's birthday how am I gonna remember all that useless "lookup table" bullshit; it takes me a minute to remember my husband's and he drills me. His is literally the only birthday other than my own I can remember... and they wanted me to memorize 50 fucking worthless object pairs...)
 
The problem is that humans tend to blindly trust anything that's not another human.

Even before digital manipulation of photographs, there was a belief that "the camera never lies", which was an opportunity for unscrupulous people to doctor photographs (Stalin's vanishing commissar being perhaps the most famous example).

Even before photography itself, written words were considered more plausible and trustworthy than spoken words - even from the same source. The phrase "gospel truth" illustrates our species's blind spot in even considering the possibility that written or printed claims might be false.

Today, car salesmen often write offers down on a scrap of paper, rather than just telling a potential buyer how much they would sell a given vehicle for; buyers apparently take such offers more often, and are less likely to continue to bargain. We can argue with a person, but are reluctant to argue with a piece of paper.

We have had an entire generation of people trained to believe that computers are unquestionable truth engines, whose output can be neither doubted nor challenged.

"Computer says no" is the new "So it is written"; the fraction of humanity that sees computer output as sacred truth, and is scandalised by the mere suggestion that we might question it, is frighteningly large.

We are not ready to handle the confidently worded bullshit generated by LLMs; but, like writing, printing, and photography before it, we are getting it anyway and will have to either learn, or suffer the consequences. And mostly it will be the latter.
 
Does it? Is this longitudinal?

Or does this indicate a bias bias? Which is to say, a bias created by pre-existing bias to not do hard work among those who chose to use it?

Because I can use AI and learn to copy what it does well in output, and I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
But copying the output is aping, not really learning. You won't get any better.
 
Does it? Is this longitudinal?

Or does this indicate a bias bias? Which is to say, a bias created by pre-existing bias to not do hard work among those who chose to use it?

Because I can use AI and learn to copy what it does well in output, and I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
But copying the output is aping, not really learning. You won't get any better.
"learn to copy it" is different from "copying the output".

Please re-read and re-evaluate your response and the post. Copying what your parents say is parroting. Copying "how they say it" is picking up an accent and vernacular.
 
Does it? Is this longitudinal?

Or does this indicate a bias bias? Which is to say, a bias created by pre-existing bias to not do hard work among those who chose to use it?

Because I can use AI and learn to copy what it does well in output, and I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
But copying the output is aping, not really learning. You won't get any better.
"learn to copy it" is different from "copying the output".

Please re-read and re-evaluate your response and the post. Copying what your parents say is parroting. Copying "how they say it" is picking up an accent and vernacular.
But people don't study the AI output to learn how to do it better.
 
Why would anyone actually give an AI "hands" to create its own products? That's how you get SkyNet.
 
Does it? Is this longitudinal?

Or does this indicate a bias bias? Which is to say, a bias created by pre-existing bias to not do hard work among those who chose to use it?

Because I can use AI and learn to copy what it does well in output, and I don't see how actively reverse engineering and disassembling something is going to reduce cognitive abilities.
But copying the output is aping, not really learning. You won't get any better.
"learn to copy it" is different from "copying the output".

Please re-read and re-evaluate your response and the post. Copying what your parents say is parroting. Copying "how they say it" is picking up an accent and vernacular.
But people don't study the AI output to learn how to do it better.
I have no idea why you would believe this.

You might as well replace AI with any powerful tool.

"People don't study the output to do it better" is only true of a certain class of people.

It's only as true as the person in reference is "allergic to learning".

We could very well just be seeing the divide in prevalence between those who learn from using the tool and those who try to use the tool thinking tools replace learning.
 
Why would anyone actually give an AI "hands" to create its own products?
You wouldn't be giving ChatGPT hands.
Turn it around. Robot 'hands' already do a lot of production. I can easily see giving the 'hands' some AI.
"Let's give it some ability to analyze and improve its performance." Or even let it alter the product to improve production? Produce its own prototypes? Slippery slope. It's only an update in software away.
 