
Definition of Consciousness: 2nd Poll

Which one of the four definitions below best fits your view of consciousness?


So I will disagree with most of the philosophical community. Who are these people again?

And I fail to see how I could possibly be wrong whenever I think that I think. There's no sense of self involved as the "I" here does not point at a self, as would normally be the case. Descartes saying "I think" is really his thought thinking "I think". You're at liberty to dismiss this as vacuous but, personally, I couldn't.


In my example, I couldn't possibly have articulated any elaborate idea in the moment.
Do you not think that a self is an elaborate idea?
And, as I already pointed out, I had no sense of self during the episode. The narrative where I use the "I" came after the event, from memory, once I had woken up. It is also similar to saying "I had a dream". Although the "I" in this case is normally taken to mean the person, it is still the case that the person is only able to attribute the dream to herself on the basis that people take the memories they can recall as being about themselves.

You can see, I'm sure, that the story, as you describe it, is not dissimilar to those told by people who have had near-death experiences, 'night terror' alien encounters and so on. I'm surprised that you'd want to give a memory of something that occurred in the middle of the night while passed out from pain the same status as everyday mental states.
I just reported my experience as I can recall it. And I happen to think there's no difficulty whatsoever.

More to the point, you really don't seem to be responding to the repeated pointing out of the paradoxical nature of claims like:
'I had no notion of my own identity'?
Do you not see the problem? Who had no notion of whose identity?

I think I understood your point alright. And I believe I have replied.

At the time, having no notion of self, I could not have thought in terms of "I". However, once I had recovered, I was able to remember what had gone through my mind during the episode. And I usually take the memories I have to be mine. So, I can only see the thing having this minimal thinking process during the episode as being me, even if my memory of this process doesn't feature any self data.

I really don't see an issue here.
EB

Cool, I get that. We disagree, but I don't think there's much mileage in carrying on disagreeing. If you don't see a problem with being able to remember what had gone through your mind when you didn't have a sense of self, then I think we are done on that particular problem.


As such, I'll stop trying to pimp a bicameral thesis and have a go at selling my old mate Dan Dennett's theory of consciousness. Apparently this is explaining consciousness away. Fortunately, as an atheist, I think it should be really easy to convince you that his deflationary theory of consciousness explains everything that needs explaining. That will probably be more fun, don't you think?

So why (or how) is The Daniel explaining consciousness away?
 
I am currently skimming this paper, which uses terms like 'ego dissolution' and 'self-binding' and unbinding, in relation to psychedelic experiences and having thoughts with 'no feeling of ownership':

https://academic.oup.com/nc/article/2017/1/nix016/3916730

Yes, that looks like exactly the sort of thing I was talking about. In fact it looks like a bloody good paper. Nice catch, Ruby. Now, let me read it and I'll comment. It's an area I'm always interested in, mostly because, as a nipper, I took an unfeasibly large amount of mushrooms, LSD and so on, and while I spouted a ridiculous amount of bullshit and spent a lot of time sat in fields full of hippies tripping my tits off, there was never any doubt who I was or, indeed, how to roll a fat one.
 

I think part of my fascination is just jealousy that I never got to try that shit.

Plus, about the article, I was a bit surprised to see that Alan Watts got a brief passing mention, since I wasn't aware anyone took him seriously.
 

Jobar does, but I suppose it's because he took a lot of drugs. (and yes, that's deliberately ambiguous!)


Either way, reading on, they agree with me:

In brief, our conclusion is that even in florid psychedelic experience the self-model is never entirely destroyed. Rather, as the coherence it normally imposes degrades, we become aware that our normal experience of unity depends on a modelling process. Just as disorders of feature binding help disclose the nature of object representation, ego dissolution discloses the nature of self-awareness.

Either way, it's a very impressive paper.

In fact, here they are making the point I just made in PM:

On these views unbound information cannot be the object of conscious awareness. (iv) is a Hebbian theory that proposes that binding consists in the frontally regulated construction and maintenance of transient activity patterns in frontal–posterior (especially parietal) circuitry, stabilizing and integrating perceptual representations, which then become available to executive functioning (Ballard et al. 1983; van Essen et al. 1992).
In other words, no conscious experience without binding.
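
Just to make the binding talk a bit more concrete, here's a toy sketch of my own in Python (nothing from the paper, and no claim about real frontal-posterior circuitry): plain Hebbian learning strengthens connections between co-active units, so two feature patterns presented together are "bound" and can later retrieve one another.

```python
# A toy of my own (nothing from the paper, and no claim about frontal-posterior
# circuitry): Hebbian learning strengthens connections between co-active units,
# so two feature patterns presented together are "bound" and can retrieve each other.
import numpy as np

colour = np.array([1.0, 0.0, 1.0, 0.0])   # made-up "colour" feature pattern
shape  = np.array([0.0, 1.0, 1.0, 1.0])   # made-up "shape" feature pattern

W = np.outer(shape, colour)               # Hebbian outer-product weights: co-active units wire together

recalled = W @ colour                     # cueing with the colour pattern...
print(np.allclose(recalled / recalled.max(), shape))  # ...reactivates the shape it was bound to: True
```

Obviously that's association, not consciousness; the point is only that 'binding' has a perfectly mechanical reading.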

In fact, having made it clear that narrative alone doesn't cut the mustard, they make it equally clear that:

The narrative ‘I’ is Dennett’s (1991) narrative self or Damasio’s (2010) autobiographical self, which allows us to experience high-level (attempted) control of the integrated functioning of the egocentric and salience systems.
We think that Dennett is correct that the narrative self is a model which abstracts from the integrated functioning of a hierarchy of egocentric and salience systems to posit a simple unified entity of which those integrated features are attributes. This view is consistent with the idea that as functioning disintegrates we will no longer feel like unified entities.

Which is pretty well my position in a nutshell.

As is so often the case, the smoking gun is hidden away in an acronym...

https://en.wikipedia.org/wiki/Default_mode_network
 
Yes, it would appear that neither psychedelics nor anaesthetically-induced comas (I did some googling) are going to be the places to go to find the lower limit of consciousness (the 'bare' experience of...anything) in humans, because, as that paper says, the disintegration under psychedelics is not complete (and in the case of anaesthesia there is the difficulty of having to guess what the patient is experiencing, if anything).

So, although we have a spectrum, with a more robust sense of self appearing to depend on levels of co-ordinated integration of a plethora of experienced stimuli (and, if I understand it, something to do with attention), we can't get to the lower end. I'm afraid I'm assuming there is a lower end, 'bare' unbound sensations, because I still can't accept that there aren't creatures out there or in our ancestry that can only feel pain unaccompanied by any sense of self. Even when it comes to us, I find it hard to say that our experiences need a sense of self to go along with them, because surely at some point when, say, going into a coma, we don't cross a hard line where self and, say, pain click off simultaneously?

Anyhows, I was wondering.....could it be that with enough psychedelics we do cross into complete loss of self but merely lack the ability to report or recall it, even to 'ourselves'? I know that's almost as speculative as asking if a bat has a sense of self.

This part (of the paper) also confused me, but then, it would do:

"However, it does not follow from the existence of a robust, causally efficacious, and multi-layered self-model that the entity represented by this model exists. We agree with Metzinger that it does not, and reject Dennett’s identification of the self with narrative self-models as well as Hohwy and Michael’s identification of the self with hierarchical predictive self-models. This is because the self-model does not posit the existence of a narrative or a model; it posits the existence of a substance, a prime mover, a persistent entity that underlies, owns, and initiates thoughts, experiences, and actions."

They seem to be at pains, there and in the conclusions, to disagree with Dennett about something and I wish I could understand what it was.
 
Complete loss of self happens with complete loss of memory function. Only incomprehensible sensation remains, unrecognized objects, shapes and colours and sounds that have no meaning or significance.
 
I have now looked at the other definitions and I have come to the conclusion that the first definition in this survey

Consciousness is an arousal state, awareness, motivated to treat self and environmental events. Arousal is the overall level of responsiveness to stimuli.

has the most operable parts, and those parts that are not specifically operable are referenced to others that are. The terms arousal, awareness, motivation, and environment are all well researched and fairly well understood. It seems all the other definitions can, without loss of meaning, be reduced to the above definition. All of the definitions have problems with consciousness and self. They seem no more than terms of art, without real substance beyond our belief that they exist and that we, as humans, use them.

If we take information, control, filter, and process substrates as they have been worked out over the past 80 years or so, and apply the ideas that neurons have a universal associative component and an underlying edge-filtering component, and that twitches and squirts are well bedded in physical theory, I believe we can move forward using this definition. I don't see how the other definitions, all depending on many 'self-evident' components without physical bridges, can progress beyond where they are, except to make the Greek philosophical origins of will and consciousness more coherent in the present discussion.
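
To give the 'edge filtering' part a concrete, if cartoonish, reading, here is a toy sketch of my own (not a model of real neurons): a difference-of-neighbours kernel responds only where the input changes, which is the crudest version of the filtering I have in mind.

```python
# A toy reading of the "edge filtering component" (my own cartoon, not a model
# of real neurons): a difference-of-neighbours kernel responds only where the
# input changes, i.e. at edges.
import numpy as np

signal = np.array([0, 0, 0, 1, 1, 1, 0, 0], dtype=float)
edge_kernel = np.array([1.0, -1.0])                      # crude lateral-inhibition-style filter

edges = np.convolve(signal, edge_kernel, mode="same")
print(edges)   # non-zero only at the two step edges of the signal
```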

Yes, such things as sense, percept, and memory are substantiated. However, they are all part of the machine that is aroused to a state of general awareness of what is being processed; thus they are included in the simpler definition above.

I've been as impressed by the differences between species in aroused and aware capacity as I am with the progress of the squirt portion of behaviour subserved by the autonomic NS. Other than the obvious, verbal language, the sense of continuity seems to be the most obvious capacity that has changed over the course of vertebrate evolution, giving those animals the internal capacity to compute, on the fly, the present and a little way into the future, making continuous motion, and the observation of it, appear continuous as well. There are others of course, but we need to get the rudiments laid down before we try to determine or integrate them into the overall plan for describing consciousness and self.

Unless and until we gain an understanding of what the mechanisms of consciousness and will consist of, we shan't be able to make progress.
 
Complete loss of self happens with complete loss of memory function. Only incomprehensible sensation remains, unrecognized objects, shapes and colours and sounds that have no meaning or significance.

We only surmise this because those who appear to have those conditions are unable to report. As I age I find bits and pieces of my ability to communicate remembered information being lost to me, only to discover there are other ways to get at what is missing. At some point my ability to report will die, as it does in all of us, usually before physical death.
 
If you don't see a problem with being able to remember what had gone through your mind when you didn't have a sense of self, then I think we are done on that particular problem.

I can only report what I remember and take it at face value.

If you think there's a problem, it's up to you to explain and that's something you haven't done yet.

As such, I'll stop trying to pimp a bicameral thesis and have a go at selling my old mate Dan Dennett's theory of consciousness. Apparently this is explaining consciousness away. Fortunately, as an atheist, I think it should be really easy to convince you that his deflationary theory of consciousness explains everything that needs explaining. That will probably be more fun, don't you think?

I'm sceptical. Maybe we don't see the same things as in need of an explanation?

So why (or how) is The Daniel explaining consciousness away?

It was a long time ago when I read his book and so I'm not going to try to go into the specifics. Me, I make a distinction between the computational processes apparently taking place in the brain and subjective experience, that is, the phenomenology of consciousness as I certainly experience it myself. I'm reasonably optimistic that the former, although already a difficult task, could nonetheless be explained using scientific investigation. However, the latter seems for now beyond the scientific paradigm. Maybe things could change in this respect but I won't be holding my breath.

Does anybody explain subjective experience in scientific terms, do you know?
EB
 
Complete loss of self happens with complete loss of memory function. Only incomprehensible sensation remains, unrecognized objects, shapes and colours and sounds that have no meaning or significance.

We only surmise this because those who appear to have those conditions are unable to report. As I age I find bits and pieces of my ability to communicate remembered information being lost to me, only to discover there are other ways to get at what is missing. At some point my ability to report will die, as it does in all of us, usually before physical death.

Sure, but the problem lies not in the parts of memory integration that still function but in the areas where it doesn't function; there lies the unrecognized experience. A tiny glimpse into the condition is when we can't remember where we put something, or can't recall a name: we draw a blank. Where there should be a memory there is nothing; try as you might, the information is not recalled, sometimes slipping away even as we try to recall it.

As the pathology of permanent memory loss progresses, this experience of blank spots, of not remembering, of not recalling a name, of a moment of non-recognition, temporary in a normal brain, extends ever further into consciousness, culminating in the inability to recall one's own name, recognize close family, know where one lives, or do simple household chores.


Of course we can only surmise or imagine what this process may feel like to a patient as it progresses, but not entirely without a personal glimpse into what the inability to recall feels like, if only temporary and trivial (thankfully).
 
Complete loss of self happens with complete loss of memory function. Only incomprehensible sensation remains, unrecognized objects, shapes and colours and sounds that have no meaning or significance.

More precisely, to get the same result, it's good enough to have a transient loss of your ability to recall memories. It doesn't seem necessary to have lost your ability to memorise, so you will still be able to recall the episode afterward.

That's my experience, anyway.
EB
 
Yes, it would appear that neither psychedelics nor anaesthetically-induced comas (I did some googling) are going to be the places to go to find the lower limit of consciousness (the 'bare' experience of...anything) in humans, because, as that paper says, the disintegration under psychedelics is not complete (and in the case of anaesthesia there is the difficulty of having to guess what the patient is experiencing, if anything).

Ruby, I do not say this very often, but I strongly recommend that you don't think too hard about general anaesthesia. Suffice to say that the two real breakthroughs were induced paralysis and the ability to stop memory traces from being laid down. Nowadays, when practical, a local anaesthetic will be used as well, because for some reason, people do suffer varying degrees of shock...
So, although we have a spectrum, with a more robust sense of self appearing to depend on levels of co-ordinated integration of a plethora of experienced stimuli (and, if I understand it, something to do with attention), we can't get to the lower end. I'm afraid I'm assuming there is a lower end, 'bare' unbound sensations, because I still can't accept that there aren't creatures out there or in our ancestry that can only feel pain unaccompanied by any sense of self. Even when it comes to us, I find it hard to say that our experiences need a sense of self to go along with them, because surely at some point when, say, going into a coma, we don't cross a hard line where self and, say, pain click off simultaneously?

I don't think anyone is saying that. Look, by way of analogy, imagine, well, you don't have to imagine: there are drugs that stop memories from being laid down. Now think of someone who has taken a drug of that nature. Have they lost consciousness? No. Will they remember their experience of a pain, say? No. In the same way, if the incoming pain fails to bind, then how would the rest of the brain become aware of it? And if the rest of the brain doesn't become aware of it, is it likely to become a mental event outside of itself (if it ever becomes one in the first place)? There are probably many ways that we lose consciousness. Losing the mechanisms that bind everything, and that also lead to a sense of self, is just one more way.
Anyhows, I was wondering.....could it be that with enough psychedelics we do cross into complete loss of self but merely lack the ability to report or recall it, even to 'ourselves'? I know that's almost as speculative as asking if a bat has a sense of self.

I don't think so, Holist and I once took about ten times the normal dose of fresh Psilocybe semilanceata out of curiosity and while it was pretty intense, we were definitely still there to tell the tale.

This part (of the paper) also confused me, but then, it would do:

"However, it does not follow from the existence of a robust, causally efficacious, and multi-layered self-model that the entity represented by this model exists. We agree with Metzinger that it does not, and reject Dennett’s identification of the self with narrative self-models as well as Hohwy and Michael’s identification of the self with hierarchical predictive self-models. This is because the self-model does not posit the existence of a narrative or a model; it posits the existence of a substance, a prime mover, a persistent entity that underlies, owns, and initiates thoughts, experiences, and actions."

They seem to be at pains, there and in the conclusions, to disagree with Dennett about something and I wish I could understand what it was.

That's precisely what I was talking about in my PM, or are you making a point?
 
Complete loss of memory is what consciousness is not.

It tells us little about what consciousness is.

Consciousness is the ability to agree or disagree with some idea. The ability to choose. The ability to create a unique criterion for choosing.

Ask those who say consciousness cannot choose why they believe what they do.

The squirming and rationalizations and invented stories are amusing.

Because the truth is they believe it only because they choose to believe it.

One can choose to follow reason, or follow evidence, or follow invented stories.

But what you follow is always a choice.
 
Complete loss of memory is what consciousness is not.

Are you saying that if you don't remember a mental event it didn't happen?

It tells us little about what consciousness is.

I agree; I was using it as an analogy for a more complete binding failure.

Consciousness is the ability to agree or disagree with some idea. The ability to choose. The ability to create a unique criterion for choosing.

I'm not sure we are in the same ballpark. I can imagine all of the things you describe above happening without it feeling like anything for them to happen.

Ask those who say consciousness cannot choose why they believe what they do.

I'm sorry, I can't parse this.
The squirming and rationalizations and invented stories are amusing.

Because the truth is they believe it only because they choose to believe it.

If you have a counter-argument for whoever you are disagreeing with, I'm all ears, but my experience is that emotive verbs like 'squirm' have little place in rational argument.

One can choose to follow reason, or follow evidence, or follow invented stories.

I assume you are commenting on Speakpigeon here, as he's the only one offering an experience. I really don't see what grounds you have for saying his description of an experience is invented. Certainly it is anecdotal, but first, anecdote is usually the way that science gets an idea as to where to do research, and second, when talking about personal experience, all anyone has is anecdote. That's the problem we spoke about in the previous thread. I might not agree with him, but I don't see what he's said or done that isn't reasonable. If you can offer non-anecdotal evidence about first-person mental events, you are the first. Ever.
 
Why do you believe what you believe?

I am trying to show what a consciousness can do.

It can make decisions.

Freely.

I am introducing something, not commenting much on the things people have said.
 
Why do you believe what you believe?

I am trying to show what a consciousness can do.

It can make decisions.

Freely.

I am introducing something, not commenting much on the things people have said.

Are you sure you are not talking about free will? It's absolutely feasible to design, in software, a belief-based deciding system that acts on beliefs to bring about desires. I don't think it is conscious. Being able to discriminate and decide isn't really the problem we are trying to address here, or at least that's the easy question. The problem, as I understand it, is how those discriminations and decisions feel like something to have. Perhaps it would be helpful if you explained precisely what you think consciousness is.
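
For what it's worth, here's a minimal sketch of the kind of belief-based deciding system I mean; it's a toy of my own invention (the beliefs, the rules and the kettle scenario are all made up for illustration), and nothing about it looks conscious.

```python
# A toy belief-based deciding system (my own sketch; the beliefs, rules and the
# kettle scenario are all made up). It acts on its beliefs to serve a desire,
# and nothing about it is conscious.

beliefs = {"kettle_full": True, "kettle_on": False}
desire = "tea"   # the state the rules below are written to bring about

# Each rule pairs preconditions (required beliefs) with an action taken to serve the desire.
rules = [
    ({"kettle_full": False}, "fill_kettle"),
    ({"kettle_full": True, "kettle_on": False}, "switch_kettle_on"),
    ({"kettle_on": True}, "wait_for_boil"),
]

def decide(beliefs, rules):
    """Return the first action whose preconditions all match the current beliefs."""
    for preconditions, action in rules:
        if all(beliefs.get(k) == v for k, v in preconditions.items()):
            return action
    return "do_nothing"

print(decide(beliefs, rules))  # -> switch_kettle_on
```

It discriminates and decides, but there's no reason to think it feels like anything to be it, which is exactly the gap I'm pointing at.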

I'm glad that you are not commenting much on the things people have said; argue about them by all means, but demeaning them with boo words like 'squirm' probably isn't helpful for anyone.
 
If there is free will what is it besides the consciousness that is free?

The brain is not free.

The only kind of "programming" you can talk about is human designed programming. Programming where the objective exists first.

Evolution has no preexisting objectives. So if "programming" exists in the human it cannot be anything like human designed programming.

And talking about human designed programming is a distraction and meaningless.

The computer scientists have to prove their programming applies to biological systems.

That is not assumed. The opposite is assumed.
 
If there is free will what is it besides the consciousness that is free?

The assumption there is that you can't have agency without awareness. Why can't a system that has evolved to make decisions make them freely? The idea that the ship needs a captain has a rather uneven history. Sometimes the ship can just control itself.

The brain is not free.

Why not? I'd say it's a necessary condition of the agent and the mind being free that the brain is.

The only kind of "programming" you can talk about is human designed programming. Programming where the objective exists first.

That's just empirically false. Google 'genetic algorithms' or just 'self-programming AI'.
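
For example, here's a minimal genetic-algorithm sketch (my own toy, with a made-up fitness test): the selection criterion is specified up front, but nobody writes the solution itself; it emerges from variation and selection.

```python
# A minimal genetic-algorithm sketch (a toy of my own, not anyone's real code):
# candidate bit-strings are varied and selected, and the solution emerges.
# The selection test is specified up front, but the program that passes it is not.
import random

TARGET = [1] * 20                     # stand-in task: evolve an all-ones string

def fitness(genome):
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(50)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)   # selection pressure
    if fitness(population[0]) == len(TARGET):
        break
    parents = population[:10]
    population = [mutate(random.choice(parents)) for _ in range(50)]  # variation

best = max(population, key=fitness)
print(generation, fitness(best), "/", len(TARGET))
```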

Evolution has no preexisting objectives. So if "programming" exists in the human it cannot be anything like human designed programming.

Really? I'm giving myself instructions all the time. More to the point, your premise is, as I just demonstrated, false.
And talking about human designed programming is a distraction and meaningless.

Cool, I was merely picking up your point about beliefs. Perhaps you can tell me where they come from?

The computer scientists have to prove their programming applies to biological systems.

Not when they are simulating a biological connectome and working on the processes that animate it - I'd suggest you might want to take a peek at the convergence of neurobiology and neurocomputation.
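
As a very rough illustration of what 'simulating a connectome' can mean in practice, here's a toy sketch of my own: a handful of leaky integrate-and-fire-style units wired by a fixed, randomly generated connection matrix. The wiring and constants are invented and nothing here resembles a real nervous system; it only shows the shape of the exercise.

```python
# A toy "connectome plus dynamics" sketch (my own; the wiring and constants are
# invented and nothing here resembles a real nervous system): leaky
# integrate-and-fire-style units driven through a fixed random connection matrix.
import numpy as np

rng = np.random.default_rng(0)
n = 50
# Sparse random "connectome": roughly 10% of possible connections, with random weights.
W = rng.normal(0, 0.5, (n, n)) * (rng.random((n, n)) < 0.1)
v = np.zeros(n)                 # membrane potentials
threshold, leak = 1.0, 0.9

for step in range(100):
    spikes = (v > threshold).astype(float)       # which units fire this step
    v = leak * (v - spikes * v)                  # decay, and reset the units that fired
    v += W @ spikes + rng.random(n) * 0.2        # recurrent input plus a little external drive
    if step % 20 == 0:
        print(step, int(spikes.sum()), "units spiked")
```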

That is not assumed. The opposite is assumed.

I think neither case can be assumed; each needs supporting evidence.
 
..I'd say it's a necessary condition of the agent and the mind being free that the brain is.

Anyone up for 'partially free' or 'has some agential freedoms'? :)

I'm not even entirely against 'elbow room' but just don't ask me to use the term 'free will' too readily.

Aren't we in the wrong thread though?
 