The Great Contradiction

I'm wondering why the quotes around "punishment." Is it that you don't think sentencing someone to prison, to a reformatory or penitentiary, really counts as punishment?

Not that. Only that it didn't seem correct to include rehabilitation in the term punishment. That's all. Possibly a bit pedantic of me, but I think a useful distinction.

The reasons for punishment: rehabilitation, isolation, deterrence, and vengeance. That last one, vengeance, isn't a legitimate function of government.

With the arguably pedantic point about rehabilitation not of itself being punishment, yes, I think I agree.

If you kill your rapist because you are angry, that may be acceptable, but government has no business hurting people based on irrational emotions. (Note: I'm not saying that killing a rapist would necessarily be irrational. One might hurt someone in anger (seeking vengeance) while also intending to keep that person from repeating an offense (rehabilitation or isolation). But here I'm trying to separate out the motives for punishment. It would be wrong to hurt someone for no benefit. It is irrational to hurt somebody out of anger (vengeance) without also intending rehabilitation (teaching the person not to repeat a hurtful behavior), isolation (separating the person from the situation in which he is likely to repeat the hurtful behavior), or deterrence (punishing one person to discourage others from repeating the hurtful behavior).)

Prison and jail sentences are generally held to be forms of punishment. The primary purpose of incarceration is often held to be rehabilitation.

But perhaps your concern is that you don't believe incarceration actually accomplishes rehabilitation? An argument can be made for that position. Unfortunately, equally good arguments can be made against the effectiveness of isolation and deterrence. Which, if we entertain these arguments, leaves us with no justification for ever punishing anyone.

Since it seems implausible that society could function without punishment, we withhold credence from the claims that rehabilitation, isolation, and deterrence don't work.

Yes.

(My concern was not that incarceration doesn't actually accomplish rehabilitation).

We have an advocate of retribution in this thread. It makes no sense to me.

Suppose Sara accidentally kills Joe's daughter in a car wreck. And now suppose Joe wants to kill Sara's daughter retributively, because it's "fitting," because it "fits the crime," because it's "just."

Is Joe's motive rehabilitation? No, because he already knows Sara won't do such a thing again. Sara is distraught over what she did; she's given up driving; and, in fact, Joe, her next door neighbor, has taken up driving her wherever she needs to go.

Is Joe's motive isolation? No, because killing Sara's daughter has no tendency to keep Sara from being able to have car accidents.

Is Joe's motive deterrence? No, it hasn't occurred to Joe that other people may drive more safely if he kills Sara's daughter.

Is Joe's motive vengeance? No, he's not acting in anger. He's acting only in the belief that retribution is somehow good.

Does Joe have any rational motive at all? I'd say no. Joe isn't hoping to accomplish any good thing. No deterrence, no isolation, no rehabilitation, nor any other benefit. He just thinks symmetry is fitting and proper. He thinks retribution is good for nothing.

That is, he thinks retribution is good in spite of the fact that it has no benefits.

Joe wants to do a great harm, but he has no offsetting benefit as a justifying goal.

I'm not on Joe's side. I think he's irrational. I do not favor retribution.

Me neither. That said, it's emotion-based. I'm sure I'd harbour retributive urges if it was, say, my daughter that was raped. I'm only human, as they say. The question of whether emotions are irrational might be an interesting one. We tend to think of emoting as totally distinct from reasoning, but maybe it isn't. It seems to me there's at least overlap and mingling. Is it possible for a human to do 'pure reasoning'? I'm not sure.

Also, while, as you say, revenge is no business of the state, laws may in some ways reflect what people expect or desire or see as appropriate and deserved, so it may be - arguably improperly, as you say - that the state, via its laws, does include a retributive component, perhaps only implicitly.

To me the main point about retribution - or perhaps better, the interesting thing about it as it pertains to what this thread has become (a discussion on free will) - is that some studies suggest that weaker beliefs in free will are correlated with weaker retributive urges. In other words, there may be a downside to believing in free will.
 
That is, he thinks retribution is good in spite of the fact that it has no benefits.

Joe wants to do a great harm, but he has no offsetting benefit as a justifying goal.

I'm not on Joe's side. I think he's irrational. I do not favor retribution.

As implied in the OP, the benefit is that it "feels good" for Joe in some way. I would liken such behavior to misusing a drug for its high, and not its medical benefit. Hence the behavior.

Is there some level of anosognosia in Joe's behavior? Is there some level in the behavior of a believer when that person says that the existence of creators does not need to be explained? I would say absolutely. I don't think there is any other way to explain it. It's no different from the paralyzed stroke patient telling the doctor he chooses not to move his legs but that he could if he wished to. In both cases one is holding a belief - making a claim - that cannot be rationally defended, only irrationally - emotionally - accounted for.

The only rational way to explain irrational behavior is to ascribe it to brain structure and function. And structure certainly comes first.

Impulsive behavior is controlled in the prefrontal cortex, and enough work has been done to identify this part of the brain with irrational impulsive behavior. That's not to say that impulsive behavior is always negative or undesirable, only that it's understandable neuroscientifically.
 
Wiploc said:
Does Joe have any rational motive at all? I'd say no. Joe isn't hoping to accomplish any good thing. No deterrence, no isolation, no rehabilitation, nor any other benefit. He just thinks symmetry is fitting and proper. He thinks retribution is good for nothing.

That is, he thinks retribution is good in spite of the fact that it has no benefits.

Joe wants to do a great harm, but he has no offsetting benefit as a justifying goal.

I'm not on Joe's side. I think he's irrational. I do not favor retribution.
Justice is not a means to an end - at least, not primarily. It is an end. The goal of exacting just retribution is that the perpetrator gets what he deserves. It is a good, just thing that he gets what he deserves. That is the benefit, not a further one - well, it may also have other benefits, like promoting social peace, but that is not the main reason.
 
Justice is not a means to an end - at least, not primarily. It is an end. The goal of exacting just retribution is that the perpetrator gets what he deserves. It is a good, just thing that he gets what he deserves. That is the benefit, not a further one - well, it may also have other benefits, like promoting social peace, but that is not the main reason.

I do not at all understand that position. It seems to me pointless and perverse.

Did you point out that we've discussed this before at length? Somebody did, but I can't find the post. I don't want to rehash something that we already worked through, so I'm happy to agree to disagree.
 
Justice is not a means to an end - at least, not primarily. It is an end. The goal of exacting just retribution is that the perpetrator gets what he deserves. It is a good, just thing that he gets what he deserves. That is the benefit, not a further one - well, it may also have other benefits, like promoting social peace, but that is not the main reason.

I do not at all understand that position. It seems to me pointless and perverse.

Did you point out that we've discussed this before at length? Somebody did, but I can't find the post. I don't want to rehash something that we already worked through, so I'm happy to agree to disagree.
We discussed the matter before, but not at length. I posted the link earlier.

Why would it seem perverse or pointless to want to do justice?

Take a look at normal human behavior. For example, suppose a person is the victim of kidnapping, torture, rape and/or murder, or similar heinous crimes.

It is common that the victim's loved ones - and/or the victim herself, if the crime was not murder - want justice. When the judicial system fails to punish the perpetrators, they tend to complain, saying there has been no justice, and so on. Now, my suggestion would be to read what the victims, loved ones, etc., say in those cases. It is clear that in the vast majority of them, the victims/loved ones are not asking (not primarily, at least) that the perpetrators who committed all of those heinous crimes be rehabilitated. They also are not seeking only to deter others, or to protect others from that particular perpetrator. Those might be secondary goals, but usually - and normally in humans, I'd say, barring ideologies getting in the way - what they seek is justice, that is, retribution. What they want is that the perpetrators suffer for their crime as they deserve.

Or take a look at how criminal justice systems operate. Even when they are very flawed - and some no doubt are really bad - they revolve around concepts such as just deserts. That is why there exist excuses such as insanity. That is why those who are insane in a way that they have lost contact with reality - when recognized as such - may be the subjects of isolation (to protect others) or treatment (attempted rehabilitation), but they are not punished, whereas those who are evil but not insane - e.g., serial killers - are in fact punished.

That is a central feature of morality - which is to say, human morality. This is how human psychology operates. Ideology can damage that to some extent, but it cannot destroy it - not in a large percentage of people - as it is too central to human psychology.
 
Why would it seem perverse or pointless to want to do justice?
I can't speak for Wiploc but the way you describe "just retribution" looks like a form of sadism to me - the infliction of discomfort for no other reason than to satisfy the desires of the one seeking 'justice'. The fact that the desire for retribution is very popular doesn't automatically mean we should accept it uncritically.
 
To me, nearly all justice systems are essentially and mainly retributive. Even being sent to prison is 'paying a price for what you did'. It's your just deserts, and it's also meant as a deterrent to other potential wrongdoers.

Of course it may also at the same time protect other people from you while you're in there, and/or give you the opportunity to change for the better, etc.

So in that sense I'd say that proportionate, reasonable retribution can be part of a rational justice system.

I find it hard to imagine a society without retribution (or if you like revenge) as quite a core feature. 'Evening things up' feels like an important component of group dynamics.

Now I'm not sure what I mean when I say retribution. I'm confused. :(
 
How about if we imagine a hypothetical rational robot or AI entity that is in charge of our justice system? The AI machine's algorithms are extremely sophisticated, and it can learn and develop over time by number crunching and other deep analyses of the results of its actions - to the extent that, if its human makers had given it any biases, at least some of these would be modified by its learning for itself.

It just isn't capable of having feelings, emotions, or even consciousness.

Would such a 'rational machine' still send people to prison, partly for reasons to do with just deserts and deterrence, because society would work better that way?

I like the way of looking at this from the point of view of saying that the AI Justice robot would be deciding what to do with 'machines that malfunction', which I believe is what humans could be described as. So the Justice robot doesn't even need to decide, or be initially programmed to consider, whether the malfunctioning machines it's dealing with have free will or not. That would be irrelevant to the justice robot.

Controversially perhaps, such a robot might decide, in cases where malfunctioning machines can't be fixed and storing them in long-term prison is too expensive in terms of available resources, to destroy (and/or possibly recycle) the unfixable machines, but it would do it for 'rational', non-emotional reasons. A bit like what we all might do when replacing a home computer.
 
How about if we imagine a hypothetical rational robot or AI entity that is in charge of our justice system? The AI machine's algorithms are extremely sophisticated, and it can learn and develop over time by number crunching and other deep analyses of the results of its actions - to the extent that, if its human makers had given it any biases, at least some of these would be modified by its learning for itself.

It just isn't capable of having feelings, emotions, or even consciousness.

Would such a 'rational machine' still send people to prison, partly for reasons to do with just deserts and deterrence, because society would work better that way?

I like the way of looking at this from the point of view of saying that the AI Justice robot would be deciding what to do with 'machines that malfunction', which I believe is what humans could be described as. So the Justice robot doesn't even need to decide, or be initially programmed to consider, whether the malfunctioning machines it's dealing with have free will or not. That would be irrelevant to the justice robot.

Controversially perhaps, such a robot might decide, in cases where malfunctioning machines can't be fixed and storing them in long-term prison is too expensive in terms of available resources, to destroy (and/or possibly recycle) the unfixable machines, but it would do it for 'rational', non-emotional reasons. A bit like what we all might do when replacing a home computer.

GORT was the robot in "The Day the Earth Stood Still." Good flick. It's a common theme in human culture: we invent an entity that metes out perfect, dispassionate justice.

Justice can be defined in a lot of ways. Does your idea of justice include fairness? People don't make their brains, which make them what they are, so is it just to punish a person for inheriting a brain that is different? Was this fair to that person? Wouldn't it be more just to try to fix that brain so the person can enjoy the same life as you? Your notion of justice is simply to kill that person, that brain. You're not advocating just shutting the machine off but destroying it completely just because it has a defective part. Are you really okay with that?

Of course, this is why people latch onto free will. Free will is their hammer.
 
GORT was the robot in "The Day the Earth Stood Still." Good flick. It's a common theme in human culture: we invent an entity that metes out perfect, dispassionate justice.

Justice can be defined in a lot of ways. Does your idea of justice include fairness? People don't make their brains, which make them what they are, so is it just to punish a person for inheriting a brain that is different? Was this fair to that person? Wouldn't it be more just to try to fix that brain so the person can enjoy the same life as you? Your notion of justice is simply to kill that person, that brain. You're not advocating just shutting the machine off but destroying it completely just because it has a defective part. Are you really okay with that?

Interesting questions. I don't know that film. I wonder how much my personal answers would illuminate things? My views would be very subjective. What I might offer is that I would guess that many of those 'human machines' executed could in fact have been fixed*. As to the ones who couldn't, I don't know. It's tricky. Ideally, I'd merely quarantine them safely, but in the real world, spending resources on that might more or less automatically mean deciding not to spend those resources elsewhere, alleviating the sufferings of non-criminals for instance.

I suppose one potentially scary thing about inventing a machine to dispense dispassionate justice (assuming such a thing could even be designed into its program by a non-dispassionate human designer) is that the justice would in fact actually be dispassionate. If you were so defective that you could not be fixed and it was too resource-expensive to quarantine you, you might be killed.

I think it was Isaac Asimov who suggested that the first and overriding rule when programming or making robots should be the rule that the robots 'do no harm to a human'. I guess that's a good idea. But who's to say if it's fair and reasonable and not just self-serving of human interests? I mean of individual humans, because there might be times when the interests of an individual human are greatly outweighed by the interests of the many humans, or even other species, or even the planet. Such a robot might not have tried to kill Hitler, to give an example, and to prove Godwin's Law.

The full Asimov rule number 1 is, "A robot may not injure a human being or, through inaction, allow a human being to come to harm," so I'm not sure how that might have affected a robot who encountered Hitler. Maybe smoke would have started coming out of its central processor. I guess the robot could just incapacitate Hitler. Duh. Silly me.

Of course, this is why people latch onto free will. Free will is their hammer.

Yes. Ideas of free will and ideas of justice are heavily intertwined. So much so that some arguments for free will, if I recall correctly, are essentially made on the grounds that dispensing justice would (supposedly) not be possible if there was no free will or even if we agreed there was no free will. Which to me, while important, is obviously putting the cart before the horse.





* Especially if we start to consider focusing more on preventative measures (whatever they may be) rather than after-the-fact punishments. This paradigm, with its slightly different emphasis, is also said to be another potential benign outcome (along with a reduction in retributive urges) of weakened beliefs in free will. To reuse the machine terminology, not only do we not blame our computer if it malfunctions (well, we arguably shouldn't, at least, but we might nonetheless), but this tendency to blame would be reduced even further if the computer had been permitted to be in a situation where malfunctioning was more likely (being repeatedly dropped onto a hard surface, for example). Perhaps if the human machines really did treat each other better (more fairly), there'd be fewer malfunctions. Sort of like what Jesus said, at least sometimes, allegedly, albeit Jesus was probably a strong believer in free will. Is there an equivalent to Godwin's Law for bringing up Jesus?
 
* Especially if we start to consider focusing more on preventative measures (whatever they may be) rather than after-the-fact punishments. This paradigm, with its slightly different emphasis, is also said to be another potential benign outcome (along with a reduction in retributive urges) of weakened beliefs in free will. To reuse the machine terminology, not only do we not blame our computer if it malfunctions (well, we arguably shouldn't, at least, but we might nonetheless), but this tendency to blame would be reduced even further if the computer had been permitted to be in a situation where malfunctioning was more likely (being repeatedly dropped onto a hard surface, for example). Perhaps if the human machines really did treat each other better (more fairly), there'd be fewer malfunctions. Sort of like what Jesus said, at least sometimes, allegedly, albeit Jesus was probably a strong believer in free will. Is there an equivalent to Godwin's Law for bringing up Jesus?

Justice is all about prevention as I see it. Maybe one day the technology will exist to constrain a person with little cost to the society. Maybe we can download their DNA sequence as a form of capital punishment instead of incarcerating them at great expense to the society. Of course we actually terminate them but we can bring them back because we have the technology to create that living organism again based on its DNA. In any case, knowing that there are no perfect humans we need to be careful about who we punish. Ideally we fix each other, not punish each other. We have to take sworn oaths to be more rational than emotional.
 
I was rummaging through some of my Psych 1 notes from 1959. Back then psychologists were beginning to experiment with psychotropic drugs like LSD. Claims were being made about such things as time and sense distortion resulting from transmitter substance imbalances and the like. Jump forward 20 years and I was using dopamine and ketamine in studies of addictive behavior in rats. Over that period we went from a very harsh notion of justice to a forgiving one, owing to external influences.

Hardly 10 years later, government, led by liberal society, fought crime - crushing civil rights with harsh punishment for one form of cocaine use, through punishment for drug trafficking in gang (read: black and brown) communities, whilst calling line parties among the white entertainment elite youthful exuberance. At the same time Michigan, Idaho, Montana, and Texas became centers for Posse Comitatus organizations.

Justice is what they are all about. All of them.

What's all this diversity around fair justice? Could it be we are talking about how one part of society controls another? Isn't this a political discussion having little to do with what I listed in my first sentence, psychology?
 
Why would it seem perverse or pointless to want to do justice?
I can't speak for Wiploc but the way you describe "just retribution" looks like a form of sadism to me - the infliction of discomfort for no other reason than to satisfy the desires of the one seeking 'justice'. The fact that the desire for retribution is very popular doesn't automatically mean we should accept it uncritically.

The reason is that a person who deserves to be punished is punished as deserved. The concept of what an agent deserves is a universal human moral concept. Attached to it we have the concept of just retribution - when a person is given what they deserve, and because they deserve it.

Now, if people who deserve to be punished are not so, then the world is less just than it otherwise would be. But apart from that, the problem is more direct: people who deserve to be punished are getting away with it.

The idea that doing justice by punishing people as they deserve seems perverse or pointless (as both Wiploc and you say) suggests to me a partial moral error theory. Is that what you propose?

More specifically, it suggests an error theory about desert, so that all judgements like 'Agent A deserves X' are not true. It does not require a full moral error theory, since there are other concepts - like that of acting immorally - that need not be ruled out. But it seems like a partial one. Is that what you propose?

Alternatively, perhaps what you and/or Wiploc propose is that the only judgements of the form 'Agent A deserves X' that are true are some of the judgements in which X is something good. Is that what you propose?

If it is neither, then do you agree that some people deserve to be punished? Or do you have another theory?
 
ruby sparks said:
To me, nearly all justice systems are essentially and mainly retributive. Even being sent to prison is 'paying a price for what you did'. It's your just deserts, and it's also meant as a deterrent to other potential wrongdoers.
Yes, though the deterrent factor is secondary, at least if the system aims to be just. Otherwise, a person could be punished purely for deterrence as long as most people believed he was guilty - whether or not he actually was - which would serve the deterrence purpose. But deterrence without justice would be, well, unjust. :)
ruby sparks said:
So in that sense I'd say that proportionate, reasonable retribution can be part of a rational justice system.
Yes, punishing people as they deserve.

ruby sparks said:
I find it hard to imagine a society without retribution (or if you like revenge) as quite a core feature. 'Evening things up' feels like an important component of group dynamics.

Now I'm not sure what I mean when I say retribution. I'm confused. :(
The meaning of words is given by usage. The concepts of just deserts and just retribution are not defined by stipulation, but by usage. You may not have a theory about the meaning of those words, but you understand them (that's the case for most words, by the way).

ruby sparks said:
Would such a 'rational machine' still send people to prison, partly for reasons to do with just deserts and deterrence, because society would work better that way?
That depends on the objectives of the rational machine. If the machine wants to make the world more just, then it would do it for just deserts. If it wants to deter, then it would do it for deterrence. If it wants both, for both. If it wants to make society work better, if that 'better' is a moral one, sure, for just deserts.

OTOH, if the machine is merely interested in keeping humans tamed, then it would rationally do whatever works for that purpose, in its superhuman assessment.

If the machine has some other goal, then it depends on the goals.

The motivation to do justice is not built into rationality, but into the (properly functioning) human mind. A rational alien from another planet might not even have morality, even if they might have an analogue (or not, e.g., if the alien is a non-social AI).


ruby sparks said:
I like the way of looking at this from the point of view of saying that the AI Justice robot would be deciding what to do with 'machines that malfunction', which I believe is what humans could be described as. So the Justice robot doesn't even need to decide, or be initially programmed to consider, whether the malfunctioning machines it's dealing with have free will or not. That would be irrelevant to the justice robot.
That would be relevant if the robot wants to do justice, because it would be unjust to punish someone who did not act of his own accord even to a minimum degree (i.e., not free at all), and it would be unjust to punish someone who did not act of his own accord (e.g., threatened with a gun; a little freedom remains, though not much) to the same extent as someone who did (no threats).


ruby sparks said:
Controversially perhaps, such a robot might decide, in cases where malfunctioning machines can't be fixed, and storing them in long term prison is too expensive of available resources, to destroy (and/or possibly recycle) the unfixable machines, but it would do it for 'rational', non-emotional reasons. A bit like the way we all might do when replacing a home computer.
Sure. It all depends on the robot's goals. The hypothesis that it is rational does not say whether it has moral goals.
 
then do you agree that some people deserve to be punished?
No.

Or do you have another theory?
I think we should do the minimum necessary (where appropriate) to achieve deterrence, rehabilitation, societal protection and restoration. These are all considered, by most, to be forms of punishment but they're not (in my view) "deserved" - they're pragmatic responses to wrongdoing (they're consequentialist reasons for punishment).
 
GORT was the robot in "The Day the Earth Stood Still." Good flick. It's a common theme in human culture: we invent an entity that metes out perfect, dispassionate justice.

Justice can be defined in a lot of ways. Does your idea of justice include fairness? People don't make their brains, which make them what they are, so is it just to punish a person for inheriting a brain that is different? Was this fair to that person? Wouldn't it be more just to try to fix that brain so the person can enjoy the same life as you? Your notion of justice is simply to kill that person, that brain. You're not advocating just shutting the machine off but destroying it completely just because it has a defective part. Are you really okay with that?

Interesting questions. I don't know that film. I wonder how much my personal answers would illuminate things?

GORT is a big robot, definitely the strong silent type. The way GORT dealt with dangerous humans was not to kill or harm them but to render them defenseless, to take away their weapons of destruction. It was communicated to the Earthlings that unless Earth changed its ways and became more peaceful it would be perceived as a threat to galactic peace. Its destruction was certainly implied. I guess GORT could have roamed Earth zapping everyone's tanks, guns, missiles, nukes, etc. That would have obviously made rational sense but would not have been as emotionally engaging as the gospel according to GORT.
 
* Especially if we start to consider focusing more on preventative measures (whatever they may be) rather than after-the-fact punishments. This paradigm, with its slightly different emphasis, is also said to be another potential benign outcome (along with a reduction in retributive urges) of weakened beliefs in free will. To reuse the machine terminology, not only do we not blame our computer if it malfunctions (well, we arguably shouldn't, at least, but we might nonetheless), but this tendency to blame would be reduced even further if the computer had been permitted to be in a situation where malfunctioning was more likely (being repeatedly dropped onto a hard surface, for example). Perhaps if the human machines really did treat each other better (more fairly), there'd be fewer malfunctions. Sort of like what Jesus said, at least sometimes, allegedly, albeit Jesus was probably a strong believer in free will. Is there an equivalent to Godwin's Law for bringing up Jesus?

Justice is all about prevention as I see it. Maybe one day the technology will exist to constrain a person with little cost to the society. Maybe we can download their DNA sequence as a form of capital punishment instead of incarcerating them at great expense to the society. Of course we actually terminate them but we can bring them back because we have the technology to create that living organism again based on its DNA. In any case, knowing that there are no perfect humans we need to be careful about who we punish. Ideally we fix each other, not punish each other. We have to take sworn oaths to be more rational than emotional.

Well, maybe justice SHOULD be all about prevention, and it’s good that you see it that way, but I don’t think it actually is. I think it’s still more about retribution, even though it avoids expressing it in those terms (it’s not... seemly, to do so).

That said, justice has... softened. There are more ‘mitigating circumstances’ allowed these days, and I think that trend is set to continue.

In a way, the weakening of strong beliefs in free will has already been happening for quite a while.
 
then do you agree that some people deserve to be punished?
No.

Or do you have another theory?
I think we should do the minimum necessary (where appropriate) to achieve deterrence, rehabilitation, societal protection and restoration. These are all considered, by most, to be forms of punishment but they're not (in my view) "deserved" - they're pragmatic responses to wrongdoing (they're consequentialist reasons for punishment).

Okay, so we disagree on the matter, and it seems you support a partial moral error theory. On the other hand, you're clearly not suggesting a full moral error theory, since you talk about wrongdoing.

Do you think some people deserve something else, like a reward, or that all statements of the form 'X deserves A' are untrue?

At any rate, why do you think moral talk went so wrong when it comes to desert, but not wrongdoing? (i.e., why do you think some behaviors are morally wrong, but no one deserves to be punished for them?).
 
* Especially if we start to consider focusing more on preventative measures (whatever they may be) rather than after-the-fact punishments. This paradigm, with its slightly different emphasis, is also said to be another potential benign outcome (along with a reduction in retributive urges) of weakened beliefs in free will. To reuse the machine terminology, not only do we not blame our computer if it malfunctions (well, we arguably shouldn't, at least, but we might nonetheless), but this tendency to blame would be reduced even further if the computer had been permitted to be in a situation where malfunctioning was more likely (being repeatedly dropped onto a hard surface, for example). Perhaps if the human machines really did treat each other better (more fairly), there'd be fewer malfunctions. Sort of like what Jesus said, at least sometimes, allegedly, albeit Jesus was probably a strong believer in free will. Is there an equivalent to Godwin's Law for bringing up Jesus?

Justice is all about prevention as I see it. Maybe one day the technology will exist to constrain a person with little cost to the society. Maybe we can download their DNA sequence as a form of capital punishment instead of incarcerating them at great expense to the society. Of course we actually terminate them but we can bring them back because we have the technology to create that living organism again based on its DNA. In any case, knowing that there are no perfect humans we need to be careful about who we punish. Ideally we fix each other, not punish each other. We have to take sworn oaths to be more rational than emotional.

Well, maybe justice SHOULD be all about prevention, and it’s good that you see it that way, but I don’t think it actually is. I think it’s still more about retribution, even though it avoids expressing it in those terms (it’s not... seemly, to do so).

That said, justice has... softened. There are more ‘mitigating circumstances’ allowed these days, and I think that trend is set to continue.

In a way, the weakening of strong beliefs in free will has already been happening for quite a while.

I think we're coming closer to appreciating that we shouldn't reward people for being lucky, nor should we punish people for being unlucky. Our history has been to do just that, but we seem to be getting better.

I remember reading about observations of monkeys, how when they are more closely confined in a cage they change their behavior to being more tolerant and accepting, much less willing to engage in violent behavior, at least so long as their basic needs are met. Maybe that's us too.
 
Okay, so we disagree on the matter, and it seems you support a partial moral error theory. On the other hand, you're clearly not suggesting a full moral error theory, since you talk about wrongdoing.

Do you think some people deserve something else, like a reward, or that all statements of the form 'X deserves A' are untrue?

At any rate, why do you think moral talk went so wrong when it comes to desert, but not wrongdoing? (i.e., why do you think some behaviors are morally wrong, but no one deserves to be punished for them?).
You've subtly changed the subject.

I originally took issue with your use of the term "just retribution" (post #346). In response you asked me if I thought that "some people deserve to be punished". I assumed you were still talking about retribution, so of course I disagreed. It seems now that you're talking about punishment in its widest sense (not just retribution).

I'm content to accept "deserve" in the sense that some wrongdoers deserve to suffer the imposition of deterrence, incapacitation and/or restoration where appropriate - i.e. where it is believed future reoffending can be reduced and/or restitution made.

In short, I accept the need for consequentialist punishment (in some cases) but reject retributive punishment.
 