
Definitions of Consciousness: The Poll

Which one of the eight definitions below fits your view of consciousness?

  • The mind or the mental faculties, characterized by thought, feelings, and volition. (Votes: 1, 4.3%)
  • Consciousness is an arousal state, awareness, motivated to treat self and environmental events. Arou… (Votes: 2, 8.7%)
  • Consciousness is known subjectively to the conscious organism. The task of science is to discover th… (Votes: 1, 4.3%)
  • Consciousness is a misinterpretation of the relationship of the self to the external world in order… (Votes: 1, 4.3%)
  • Consciousness is a notoriously ineffable and ethereal stuff that can’t even be rigorously defined, l… (Votes: 0, 0.0%)
  • Consciousness is the brain’s model of the world and self, made of sense awareness, memory, feelings… (Votes: 5, 21.7%)
  • The mental activity of which a person is aware. (Votes: 1, 4.3%)
  • Consciousness is knowledge of specific neuronal processes in the human brain. (Votes: 2, 8.7%)
  • None of the above (Votes: 7, 30.4%)
  • Don’t know (Votes: 3, 13.0%)

  Total voters: 23
Bad poll. I did not respond. The issue is that the "choices" are not mutually exclusive... or consistent.

What is better:

1
A
Red
Spots
Vanilla


CHOOSE!

That's just what happens with qualitative research - sure, patterns emerge out of the data, but the sort of coherence you are after is something I'd take as a sign of malpractice.
 
Science certainly does give the most detailed definition ...
<SNIP>
... As I said before, this is no more mysterious than water being wet.

I agree with all of this.

Maybe I could try to find something to nitpick but broadly I can say it's good with me.

it's an emergent property.

I don't think we actually know it is.

But I agree that's usually what we believe, sure.

That's one side.

Go on, what's the other side?
EB
 
If it's all right with you, I'd definitely prefer the minimal definition.

If there's overlap..... there's overlap. It's arguably a question of when to stop including stuff that, as you say, 'may be' relevant. But given that it may not be relevant (recalled memories, for example) to a core definition, I'd rather just run with the one I have suggested.

Yeah, Ok.

I have to point out that the definition I proposed is really all the first line, i.e. "Consciousness is awareness".

The rest is just eliciting the possible contents of awareness: Consciousness may include...

And the last sentence is pure comment. It's in effect about science, not about consciousness.

Your choice.
EB
 
That's one side.

Go on, what's the other side?
EB

Oh language and narrative and all the stuff Dennett always goes on about. My position is that the biology gives us a what it is like while the narrative brings it all together and polishes the user illusion from a vague experience to a unified self. The biological mind is private and the stories are public, but as I said earlier the two are intimately integrated to make a functioning self. One only has to look at the varieties of autism to see how this trick can fail to happen. The ironic thing is that I would be a Pat Churchland style eliminativist about all the folk psychology that is carried on language except that I think that this is the idiom in which we came to see ourselves as ourselves and exploded from being a very smart tribal ape to being the apex predator.
 
Given the overabundance of "None of the above" in the vote (30.43%), I want to urge again those who voted for this option to try to express their own view of consciousness here.

Please, give it a try.

Thank you to all seven of you.
EB
 
Bad poll. I did not respond. The issue is that the "choices" are not mutually exclusive... or consistent.

What is better:

1
A
Red
Spots
Vanilla


CHOOSE!

I have to come back on this because it shows you don't even understand the question!

Which one of the eight definitions below fits your view of consciousness?

See? It's not: "What is better?"

If you can understand the difference.

And obviously it's not a list of unrelated items but a list of definitions, and all definitions of the same thing. The definitions have been drafted by different people according to their view of consciousness, so they clearly are related because of that. And then voters are asked to choose the one that fits their own view. And if none fits, they can choose to vote "None of the above" or "Don't know".

I think you've missed a good opportunity to shut up. :D
EB


Oh language and narrative and all the stuff Dennett always goes on about. <SNIP>

Alright, I guess we don't need to go into that, not yet.

I hope you were able to vote?
EB
 
I will also include ruby sparks' definition that consciousness is awareness.

And I'll wait a little longer for FDI and DBT to tell me what it is they want to do, which may include keeping both their definitions as they are (or possibly modified), merging them, or just dropping one.

Don't take too long, though.
EB

Apart from ''none of the above'' it looks like my definition has the most votes. Not to say that such a small sample means much.
 
Apart from ''none of the above'' it looks like my definition has the most votes. Not to say that such a small sample means much.

Sure, but I'd like to be able to identify the main conceptions and yours I think is close enough to those of FDI and ruby sparks.

And we already know how many votes each definition got and which got the most.

Also, votes in a possible second round would still be split between several similar definitions, so no single definition would get more votes than "None of the above"!

I think you could try to reach a consensus around the key word 'awareness', possibly starting from the definition I suggested.

And I hope everybody understands the distinction I emphasised in the definition I provided between the definition proper, given by the first line, and the additional comments and considerations in the second and third paragraphs, which strictly speaking are not definitional. This also shows your three definitions are in fact identical. Keeping just one would avoid needlessly splitting votes.

Let's see it again:
Consciousness is awareness.

Consciousness may include awareness of perception, feelings, sensations, ideas and thoughts, recalled memories, decisions, or willed realisations or actions etc. as well as partially conscious activity such as dreams, delirium etc.

It is the task of science to discover the physical processes allowing awareness and how these processes provide an operational model of the physical world allowing the conscious organism to survive and prosper in it.
EB
 
Let's see it again:
Consciousness is awareness.

Consciousness may include awareness of perception, feelings, sensations, ideas and thoughts, recalled memories, decisions, or willed realisations or actions etc. as well as partially conscious activity such as dreams, delirium etc.

It is the task of science to discover the physical processes allowing awareness and how these processes provide an operational model of the physical world allowing the conscious organism to survive and prosper in it.
EB

I think there's a clear distinction between 'consciousness is awareness' and many of the other definitions here. Oldman came closest to it when he posted a video about trees talking. I'm not saying trees are conscious (though they might be). But I think that too many of the views and definitions offered here, even when they're not explicitly or implicitly about human consciousness, are loaded up with human-esque criteria, to the point of being arguably anthropocentric. It's a bit like defining 'movement' as 'going in different directions, at different speeds, in a vehicle that has 4 wheels and an engine' or something like that. Way too much.

The other definition of consciousness which is already in use and is, imo, also very good, is that an entity is conscious if it feels like something to be that entity. This is arguably identical, for all practical purposes, to 'consciousness is awareness'. It's a paraphrase of Thomas Nagel's definition. Nagel's definition ("A creature is conscious if there is 'something that it is like' to be this creature") introduces an extra ingredient, the concept of 'creature' (as does my paraphrased version, using the word 'entity'). Entity is much better, imo, but both of these words/concepts (creature and/or entity) could be argued to be superfluous, though it might be splitting hairs to exclude the latter.

Another good word that's already out there is Sentience, which, at bottom, is nothing more than the capacity to feel. Even there, we might have to add the qualifier 'unadorned' sentience, because it's very easy for humans to let their own perspective creep in, or otherwise add bells and whistles, and that's what's happening here, I think.
 

Nagel is at pains to make it clear that it is having the body that matters. I'm pretty certain that this is an explicit rejection of functionalist accounts of mind. This becomes harder to see the further we get from the GOFAI idea that if you can model the input output relations of a person you have captured everything worth capturing. Nagel was one of the first to reject this idea from the point of view of mental states, rather than from the point of view of computation.

Nowadays the only functionalism that is remotely mainstream (and just as wrong IMHO) is microfunctionalism - applying the same idea to nerves or cords of nerves. This can be problematic because Nagel really did see functionalism as the big error to be avoided and that just isn't really a problem any more. The idea that getting the biology and microcomputation right is critical is mainstream. Just bear in mind that it didn't use to be.
 

Thanks. I would, obviously, get out of my depth very quickly if we were to move on to discussing the various isms and their nuances. Not that I'm averse. I'm fascinated. At this point though I'm following the lead of the OP and restricting myself to trying to come up with the best string-of-words definition. A starting position only. For one thing, it's easier. :)

ETA: well, arguably easier. I doubt we'll achieve agreement on even a basic definition.
 
Nagel is at pains to make it clear that it is having the body that matters.

Were I to try to comment though.......

I wouldn't readily accept this. What do we mean by 'body' though? If it has to be a 'creature's body'.....what's a 'creature'?

If Functionalism involves multiple realizability....I'm good with that. That we appear to need a substrate for consciousness to manifest. Hence 'entity'. Otherwise, we'd be getting into substance dualism, surely?

You could say.....that the definition 'consciousness is awareness' at least avoids us having to make a call on that. In that sense, it could be argued to be a better definition for being more widely applicable, covering more, dare I say all, eventualities. Doesn't that make it a better starting definition? I think so.
 
In that case, Nagel's position probably isn't for you. You are quite right to pick me up on 'creature'. I phrased it sloppily. Nagel is of the opinion that while there is almost certainly a direct relationship between the neural structure (other processing structures may be available...) and the experience, no amount of observation from the physical will allow a theorist to get to grips with what is going on in the mental, even if they are the same thing. As such, the idea of multiple realisation of mental states literally cannot make sense. How would a theorist know that you had the same mental states if, for each of the multiples, they couldn't get to grips with what is going on? In each of the cases you can imagine what it is like for you to be in that relation, but not what it is like to be in that relation (if anything). He just uses bats as an example because bats use echolocation and that is suitably odd to draw out the problem.

For Nagel, having the same substrate is at least some evidence that you can assume that you have the same mental states. Personally I'm not even sure you can do that. What we do agree on though is that input output relations just ain't enough and so functionalism (at least when applied to the mental) fails. You can certainly realise input output relationships in any way you fancy, right up to fantasy cheese cognition or ant hills. What you can't do is assume that these relations are mediated by internal mental states unless you are having them. That's why functionalists almost inevitably end up being behaviourists, either implicitly or explicitly.

Oh and he uses a lovely image to elucidate the problem - imagine a neurobiologist isolating the precise cord of neurons carrying the information about the taste of chocolate the subject is eating and then licking them in an attempt to taste the chocolate... It's funny but it also rather nicely highlights the problem of being in precisely the right relationship to the information.
 
This seems to get us into.....interesting things....which might follow on after we have, so to speak, set out our definitional stall.

(By the way, sorry, I added to my post while you were writing, totally bad habit of mine).

In what way, though, does the idea of multiply-realizable mental states not make sense?

I mean, there's a barrier to us knowing if this or that arrangement produces consciousness, sure, but that's only a problem with knowing. That's out the window. Not on the menu, surely? You can't get it from any amount of outside observation.
 
I might add... I'm with Dennett, not Block, when it comes to the China Brain thought experiment, which I believe Block intended as a criticism of functionalism.

Ditto the Chinese Room thought experiment. Imo, functionalism survives both.
 

Sure, both are just a failure to get to grips with emergence or the difference between intrinsic and derived intentionality, meaning and so on. Searle has wasted a lot of people's time. I have more time for Block, but not the time to run that argument as my position there is complicated and unintuitive. Mind you, my response to Hofstadter's Ant Hillary is largely the same and we've done that to death elsewhere.


I've no problem with multiple realisation of states, just not first person mental states. Remember, I think that the mental states are the physical states from a different perspective. Functionalism fails to preserve the physical states, it just preserves the input output relations. Thus, while I feel I have good reason for thinking that someone with similar neurobiology and biology to me is likely to have a mental life, I see few reasons to think that similar input output relations (or behaviour if you prefer) imply a conscious mental life. Indeed, Wittgenstein gives me very good reasons to think that a system can master the grammar of mental talk without having a mental life. I'd even go so far as accepting Dennett's position that a system can mistake its heterophenomenology for a real mental life ... at least from the outside. Meanwhile the fact that I feel like something remains the non negotiable fact of any credible theory of mind. I can't see how functionalism of multiple realisation preserves this from the 'inside' even as I accept it can preserve it from the 'outside'. (Read 'inside' and 'outside' as first and third person or mental and physical as you wish.)
 
Oh and I meant to mention. There are two Nagels - Ernest and Thomas. They are not related. Ernest is an arch functionalist, behaviourist and positivist. Thomas isn't, isn't and isn't. They both often get quoted as Nagel. It's caught everyone out at one point or another, including me.
 
Let's see it again:
Consciousness is awareness.

Consciousness may include awareness of perception, feelings, sensations, ideas and thoughts, recalled memories, decisions, or willed realisations or actions etc. as well as partially conscious activity such as dreams, delirium etc.

It is the task of science to discover the physical processes allowing awareness and how these processes provide an operational model of the physical world allowing the conscious organism to survive and prosper in it.
EB


Two comments if I may. First, I’d want to argue that it should be physical and logical processes. While all logical processes supervene on the physical, it’s quite clear that not all logical processes lead to the same result as near equivalent physical processes.

Second, I am unclear quite what the formulation ‘consciousness is awareness’ tells us. Perhaps I’m missing something, but it seems to me that the two words are synonyms that play precisely the same role. As such, how is this formulation different from saying consciousness is consciousness? Or to put it another way, what do we learn from using this definition?
 
I think that the mental states are the physical states from a different perspective. Functionalism fails to preserve the physical states, it just preserves the input output relations. Thus, while I feel I have good reason for thinking that someone with similar neurobiology and biology to me is likely to have a mental life, I see few reasons to think that similar input output relations (or behaviour if you prefer) imply a conscious mental life. Indeed, Wittgenstein gives me very good reasons to think that a system can master the grammar of mental talk without having a mental life. I'd even go so far as accepting Dennett's position that a system can mistake its heterophenomenology for a real mental life ... at least from the outside. Meanwhile the fact that I feel like something remains the non negotiable fact of any credible theory of mind. I can't see how functionalism of multiple realisation preserves this from the 'inside' even as I accept it can preserve it from the 'outside'. (Read 'inside' and 'outside' as first and third person or mental and physical as you wish.)

Well, this is where I start to get confused (read that as 'out of my depth') but I have no problem going there. I might just want to add as a caveat (and a safety net) that as far as I'm concerned it's slightly different from the mere definitions issue, so unless you can say something about the minimal one I've become fond of, we can move on to things I'm confused about.....

Right. I thought, wrongly perhaps, that Functionalists did not necessarily say that phenomena (in this case consciousness) were the processes. If they do, while I'm not necessarily 'out', I'm not 'on board'.

To me Multiple Realisability (MR) would only show that the processes are transferrable to a different substrate, not that they can do without one.

My only caveat, and it involves speculation, is....the possibility that we generally see the universe the 'wrong way' in that we see processes (or indeed information) as (secondary) features of substrates, whereas.....what's to say that the universe doesn't ultimately consist of processes (or information) and substrates (matter) are a secondary manifestation of that?

Also, I'm not as strong a physicalist as I used to be.

Finally, I'm not sure why functionalism of multiple realisability fails to preserve the subjective fact of personal awareness?
 