Togo said:
There is absolutely no evidence whatsoever supporting your interpretation of consciousness. If you want consciousness as an emergent property to be accepted, and any other interpretation to be rejected, you need, at the very least, some kind of reason.
I... don't think you understand what you're saying here.
Hm... I do; you may not.
What I'm saying is that there is no evidence (your criterion) separating your view from that of a dualist.
You do realize that emergence refers to the process whereby larger entities or patterns arise through interactions among simpler entities that do not on their own exhibit such properties, right?
Yes, although it really doesn't matter for the point I was making, which is that the difference between the two views isn't a matter of evidence.
Literally nobody in either philosophy (except some of those who are of the theistic persuasion) or science posits anything other than the notion that consciousness is an emergent property. So... what the hell are you even talking about?
Chalmers, and other dualists. You appear to think they don't exist.
When you claim there's no evidence supporting my interpretation of consciousness, you're demonstrating that at best you simply don't know what the term 'emergent property' means and at worst that you're actively suggesting a supernatural explanation for consciousness. I prefer the middle road though, where either or both of us is simply misinterpreting what the other's argument is.
Yes, you seem to think I'm somehow proposing that properties don't emerge. I'm not. I'm saying that what divides your opinion (consciousness is an emergent property) from rival opinions (consciousness is not an emergent property) is not evidence.
And no reason to hypothesise that it can.
Nonsense. We have lots of reasons to hypothesize exactly that. By accepting that we live in a materialistic universe, we find ourselves forced to conclude
'Because I'm a materialist' isn't really a reason, any more than 'because I'm a Christian' is a reason. I'm not saying it's an unreasonable position to hold, I'm saying that it is a position you have chosen to hold.
that it is plausible that any process within it can be replicated since any such processes will be subject to certain basic natural laws and are not fucking magic.
No, but then if you're arguing with a dualist, then it isn't fucking magic to them either. You really can't argue that dualism fails because it's not materialism - that's totally missing the point.
Which is why the first hurdle is to define what we're trying to prove. Traditionally, attempts to form scientific hypotheses about consciousness have foundered on one of two rocks: either the 'we can't measure this' rock, or the 'we've found something to measure, but no one really thinks it's consciousness' rock. This is why it's called the 'hard problem' of consciousness. There are lots of easy problems to solve, just by redefining conscious experience as something that's simple to measure.
Except this is not actually the issue at all if we're talking about creating artificial consciousness. You don't need to define something in order to create it; nor do you need to understand or measure it explicitly first. ... Artificial consciousness would still be conscious regardless of whether or not we can recognize it.
No, of course not: you can create something first and then argue about whether it is conscious. But given that this is an internet discussion, unless you believe the patterns of our lengthy posts will suddenly awaken and become sentient, the first step we can reasonably accomplish here is to work out what the frag we're talking about.
In order for science to be useful here, we need something we can measure. Or we need to prove that there is no possible difference between A and B. What we can't do is declare that we're only interested in measurable things, say that the difference is not measurable, and then claim that because it's not measurable it somehow doesn't exist.
Which is where the simulation comes in. Since we know human consciousness to be a product of the brain (we don't need to understand in exacting detail how consciousness functions to know this, just as you don't need to understand the physical processes by which fire produces smoke to make the connection between the two), we can reasonably conclude that a simulation of said brain, at a high enough resolution, is in fact conscious when it behaves similarly to a real brain.
Only if we make certain materialist assumptions a priori. Again, I'm not arguing they're unreasonable assumptions, merely that making them isn't really a refutation of dualism.
It wasn't programmed, after all, to pretend to be conscious; its consciousness is the result of a simulated version of the exact same processes that appear to produce our own consciousness. So, at that point we can start to actually understand consciousness experimentally, in ways that are not possible at present, by altering bits and pieces of the simulation in order to see what changes.
Yes, this is something I've actually done. Of course, unless you have a definition of consciousness, you have to guess at what to measure (or measure everything, and then publish whatever is fashionable at the time).
It's in effect claiming that any mechanism that is sufficiently complicated to duplicate the behaviour of a conscious person, develops consciousness.
...no, it's really not.
Hm.. Fair point.
that a 'cognitive zombie' would be logically impossible. If it's not logically impossible, then there's still no way of measuring whether consciousness is present or not.
This is not really a good argument, since this kind of logic invalidates any and all measurements, period. It leads to solipsism.
No, it leads to some topics not being resolvable through scientific inquiry. It would only be solipsism if it were claimed that it's impossible to measure anything, rather than only some things.
It is not logically impossible that you are actually a brain in a vat and that everything you've ever experienced is a lie;
No, you're confusing two different problems. The brain-in-the-vat thought experiment is about what we can be sure about. The measurement problem is about what we can empirically control. Not all things are measurable. We can't measure Watford's potential to win the cup, the desirability of life insurance, or whether it's better to open eggs from the big end or the little end. That's not because of solipsism.
The basic problem is that consciousness, as ordinarily conceived, is not conducive to measurement. If we're going to use science to solve the hard problem, we need to measure something. What should we measure?