Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Unlocking the "Mystery" of Consciousness

So the solution to the ontological aspect of the explanatory gap is that all feelings are both uniquely personal via their connection to life, and they have a unique neurobiology through the special features. And the two in combination create the unique feature of conscious feelings. Note this explanation requires no supernatural intervention nor any new “fundamental property” or principle of physics.

This in no way explains how something that has feelings is generated or how feelings are generated.

This answers nothing about how neural tissues create the things experienced or the thing that can experience.

We conclude that the “mystery” of consciousness and Levine’s “explanatory gap” and the “hard problem” can be naturally and scientifically explained.

This is pulled straight from their asses.

They don't explain anything about how a subjective mind is generated or how it functions.

Of course it arrived via evolution, but that is not an explanation for how anything is generated.

The "mystery" of the mind is how neural tissue generates one with all its richness, and with its stability over years.

Knowing that the mind arrived via evolution and is a tool for survival is no mystery.
 
I'm not really seeing anything substantive in how they deal with the ontological problem of conscious experience. Basically they say consciousness has neurobiological roots, and these evolved for purposes we can make educated guesses about. But the entire point of the ontological problem is that it seems like subjective experience is superfluous; that is, we could imagine animals having evolved the same neurobiology for the same purposes, without the accompanying first-person sensations. The authors offer no plausible account for why this is the case, other than to say "consciousness is unique and personal". A lot of articles get clicks by having provocative titles like this, and turn out to lack what was promised on the tin.
 
I'm not really seeing anything substantive in how they deal with the ontological problem of conscious experience. Basically they say consciousness has neurobiological roots, and these evolved for purposes we can make educated guesses about. But the entire point of the ontological problem is that it seems like subjective experience is superfluous; that is, we could imagine animals having evolved the same neurobiology for the same purposes, without the accompanying first-person sensations. The authors offer no plausible account for why this is the case, other than to say "consciousness is unique and personal". A lot of articles get clicks by having provocative titles like this, and turn out to lack what was promised on the tin.

Yeah.

Except that I can't actually imagine animals having evolved the same neurobiology for the same purposes without the accompanying first-person sensations.

My own mini-theory about non-superfluousness is that things might get laid down in memory better if they appear in consciousness.
 
I'm not really seeing anything substantive in how they deal with the ontological problem of conscious experience. Basically they say consciousness has neurobiological roots, and these evolved for purposes we can make educated guesses about. But the entire point of the ontological problem is that it seems like subjective experience is superfluous; that is, we could imagine animals having evolved the same neurobiology for the same purposes, without the accompanying first-person sensations. The authors offer no plausible account for why this is the case, other than to say "consciousness is unique and personal". A lot of articles get clicks by having provocative titles like this, and turn out to lack what was promised on the tin.

Not only that, but a subjective experience of the lion without the ability to do anything about it would be totally useless.

So the subjective perspective also leads to the idea of autonomy, and of control of the body by the mind that experiences the lion.
 
I'm not really seeing anything substantive in how they deal with the ontological problem of conscious experience. Basically they say consciousness has neurobiological roots, and these evolved for purposes we can make educated guesses about. But the entire point of the ontological problem is that it seems like subjective experience is superfluous; that is, we could imagine animals having evolved the same neurobiology for the same purposes, without the accompanying first-person sensations. The authors offer no plausible account for why this is the case, other than to say "consciousness is unique and personal". A lot of articles get clicks by having provocative titles like this, and turn out to lack what was promised on the tin.

I agree. It basically says there's something unique about animal brains, so naturally they produce something like consciousness that is "indeed unique in all of nature". They also seem to be classifying all qualia as feelings. I see feelings as a type of qualia, but the perception of colors and sounds is not a feeling in and of itself.
 
I'm not really seeing anything substantive in how they deal with the ontological problem of conscious experience. Basically they say consciousness has neurobiological roots, and these evolved for purposes we can make educated guesses about. But the entire point of the ontological problem is that it seems like subjective experience is superfluous; that is, we could imagine animals having evolved the same neurobiology for the same purposes, without the accompanying first-person sensations. The authors offer no plausible account for why this is the case, other than to say "consciousness is unique and personal". A lot of articles get clicks by having provocative titles like this, and turn out to lack what was promised on the tin.

Yeah.

Except that I can't actually imagine animals having evolved the same neurobiology for the same purposes without the accompanying first-person sensations.

Really? Well, maybe it's not an imagination exercise but an explanatory one. So you could just think of explaining everything there is to explain about the inner workings of the human brain, a la Mary's Room, and running out of anything further to say before ever getting into first-person sensations. Either way, it seems like they are tacked on after the fact, but it also seems like nature shouldn't work that way.

My own mini-theory about non-superfluousness is that things might get laid down in memory better if they appear in consciousness.

I'm getting into functionalism lately as a theory of mind. If we take the idea seriously, then being laid down in memory couldn't depend on whether or not something has an accompanying first-person sensation, for the same reason I explained above. In the distant future, when memory pathways in the brain have been fully laid out and all of the inputs and outputs have been defined--i.e. everything is fully explained about memory on a physical level--we wouldn't expect this explanation to include anything about ineffable first-person sensations, because they aren't the kind of things that can be isolated and quantified. So, if one school of thought says "you need first-person sensation to enable better memory" and another says "you don't, all that's needed is X Y and Z in the neural connections etc." there would be no way to prove who is right. You couldn't run an experiment with first-person experience as the independent variable, with all other physical factors being kept constant.

That's what makes people start to think that the first-person perspective is something more fundamental, with its own logic and parameters that don't follow what's happening on the outside. My favorite description of it is to think of it as another dimension, not spatial or temporal, but the subjective dimension, where experiences unfold in their peculiar way. Just as you could never describe to someone in Flatland what height is like, you'd also come up short trying to explain what subjective experience is like, and that suggests we're dealing with something properly understood as a dimension and not a substance.
 