
Is censorship moral?

I do not think Trump-like speech is any less murky than anything else. Like libel and defamation, it has to be proven that Trump had intent to cause harm.
Proven to who, exactly?

If a private company decides to delete one of Trump's posts from their media platform is that censorship? What if said private company decides to delete his account and stops letting him use their company's facilities at all?

If a judge decides that a public statement qualifies as a threat and puts Trump in jail for the duration of the trial is that censorship?
Tom
The OP defines censorship as:

“the effort to silence speech, written or oral, and to hide away images that are seen to be offensive.”
 
I take censorship to generally mean suppression of speech and expression in public areas. Moderation is limiting speech on a platform or medium.

Proven to who? A judge and jury.

There are specific laws against threatening violence against govt officials and inciting violence.

I think the justice department is picking and choosing that which they think they can successfully prosecute.

Trump has lost lawsuits against him for defamation.
 
How about we talk about censorship issues less emotional and murky than child porno.
That's fine as long as it's about censorship.
Ex-president Trump is online threatening violence against the judiciary, on privately owned and operated social media.
I've thought of this example too. It might appear to serve as a counter example to my statement in the OP: "The censors are invariably more powerful than those whom they censor, and so censorship results essentially from whomever can apply the greater force." But although Trump while president was arguably the most powerful person on earth, in the context of Twitter he had no power to resist his being banned from that service.
If whatever Twitter is calling itself these days scrubbed his posts on their platform would that be immoral?
Would putting the ex-president in jail for expressing his violent opinions on social media be immoral?
Would any of that qualify as censorship?
Statements or images meant to incite violence might well constitute materials that should be censored. However, under that criterion the Declaration of Independence and Roosevelt's declaration of war on Japan should have been censored. So who decides what violence-inciting speech should be censored? As always, the censors are not those with the better morals but those with the power to silence what and whom they object to.
I don't think so. Feel free to differ.
So what should the rule be? Silence anybody who calls for violence? As we all know that's a rule that will be very unevenly enforced.
 
This is why “it depends” was and still is the best answer to your question.
 
I've thought of this example too. It might appear to serve as a counter example to my statement in the OP: "The censors are invariably more powerful than those whom they censor, and so censorship results essentially from whomever can apply the greater force." But although Trump while president was arguably the most powerful person on earth, in the context of Twitter he had no power to resist his being banned from that service.

I don't see it as a counter example so much as yet another layer of complexity.

On Jan 7th, 2021, Twitter decided not to keep hosting Trump and his tweets. But they can't really censor anyone. Trump was free to continue saying anything he wanted; he just couldn't keep using one particular private concern to broadcast his views. His views didn't match their standards, so they chose not to continue doing business with him. That's their right as private citizens.
Tom
 
Exactly. The right to free speech doesn't imply the right to freely use someone else's soapbox.
 
Remember, you're in a different country.
Remember, your country really isn't that different from mine. Remember also that we have no information about which country this alleged event took place in, if it even happened at all. So you too may be in a different country. Or maybe we both are.
There are substantiated incidents of problems with innocent images in the US.
And in every other OECD country. So perhaps we could discuss one of those cases. Feel free to post the details of one, along with its substantiation. Until somebody does, we have no incidents to discuss.
You don't recall this one?


This was a telehealth photo, not even just a baby in the bath pic. I used it because I recalled enough to get the all-seeing eye to cough it up as hit #1.
 
It doesn't prevent it. It does make it very hard to produce it for commercial reasons, though. This means little is produced simply from commercial motives.
So your argument is to prevent the harm involved in the production and supply of graphic materials, we should cut out the demand for them. Is that correct?
I'm not saying right or wrong, but rather simply why the law is the way it is. Whether it is a net benefit to society or not I do not know.

What you are showing is the harm caused by witch-hunting about child pornography. In the quest to stamp it out they sweep up lots of things like the images you refer to which are not pornography.
You can dismiss it as "witch-hunting about child pornography," but the effort to destroy images deemed pornographic of children is part of such censorship, and it threatened an innocent mother. As I see it, the laws against child pornography are the product of prudes and politicians who can't even define what they outlaw.
You miss my point--I'm calling it witch-hunting because they're going after stuff that isn't pornography.
 
You don't recall this one?


This was a telehealth photo, not even just a baby in the bath pic. I used it because I recalled enough to get the all-seeing eye to cough it up as hit #1.
I don't recall stuff I have never seen before, no. :rolleyesa:

That's not really a story about censorship though; again, Google withdrew permission to use their platforms (a soapbox that they own), and no laws were broken by the person accused of being a pornographer.

The real issue in this story is the increasing use of automated tools to detect wrongdoing, and the absence of simple avenues of appeal in response to the inevitable mistakes.

This should be commercially suicidal, but organisations like Google and Facebook are effective monopolies, so despite their bad behaviour, they maintain the vast majority of their user base. It's an area that desperately needs regulation. But it's not actually censorship. Just stupidity.
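The automated tools in question generally work by perceptual hashing: reduce an image to a compact fingerprint that survives resizing and re-encoding, then flag anything whose fingerprint sits within some distance of a known-bad one. Real systems (PhotoDNA, pHash) are far more sophisticated; this is just a toy sketch of the idea, but it shows where the inevitable mistakes come from:

```python
def average_hash(pixels):
    # Toy perceptual hash: one bit per pixel, set when the pixel is
    # brighter than the image's mean brightness. Robust to small noise,
    # which is the whole point -- and also the source of false positives.
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    # Number of differing bits between two fingerprints.
    return sum(x != y for x, y in zip(a, b))

# A tiny grayscale "image" and a slightly noisy re-encode of it hash
# almost identically, so a distance threshold catches copies...
img = [10, 200, 30, 220, 15, 210, 25, 205]
reencoded = [12, 198, 33, 219, 14, 213, 27, 202]
assert hamming(average_hash(img), average_hash(reencoded)) <= 1
```

...and any innocent image that happens to land near a flagged fingerprint gets swept up too, which is exactly why the missing avenue of appeal matters.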
 
You miss my point--I'm calling it witch-hunting because they're going after stuff that isn't pornography.
So, a few things here. One is that child porn must not exist, especially realistic porn, because there is a vested interest in preventing the exposure of people in sexually compromised ways against their consent.

Much as the release of revenge porn is a violation, child porn is an even worse one, and this violation almost universally happens at a time when consent cannot be given.

I will advocate till I'm blue in the face to draw, or have an AI draw whatever you want, especially if you have the AI model hash and seed to prove it came from a computer...

As soon as we start talking about things like images of actual children or images of unknown and questionable provenance, all that has to be torn down. Ideally the very economic factors that encourage it should be torn down too, as such images are violations.

I will note I have hedged some statements here. It is technically possible for a kid to snap some pictures of themselves, hold onto them in private until they are no longer a kid, and then make a decision on whether they feel violated by them, but the rarity of that event moots the point in that for every molecule of ethical material there is a practical moon of evil material, and to find that molecule you have to accept the existence of the other.

As a result the only ethical material to keep, display, etc of that type is stuff that is clearly "pure fantasy": drawn, CG, and AI art, posted in places restricted to adults.

It's absolutely the case that witch hunts happen. I can't tell you the number of awful events where someone is doing everything right, sticking to adult websites and sticking to ethically produced images, yet still gets stormed by 17-21 year olds on Twitter for being "pedos", especially artists, when the whole point of all that work is to NOT have that issue.

The problem here is that purity is seen as a virtue, when purity is not in any way virtuous, a clear failing of the paradigm of virtue ethics in general.

The overall result of such crusades is popular art sites not just saying "have an account, be 18+, and explicitly opt in to seeing that content, and tag it for what it is" but wholesale banning it and anyone who draws it from their site.

Even IF it is pornography, even if the things in it look like children, if we can validate that those events and children do not exist, and if we can keep children from accessing those images, and if we can make access to new images completely free of any need to violate the sexual privacy of anyone (through manual or generative art), I just don't see the problem with it.

To that end, I am even more interested than the average party in seeing actual CSAM completely stricken from the internet, and in seeing violators of tagging policies dealt with harshly and swiftly.

There is a whole community of age players built specifically so people could both have their interests and engage in them ethically, and censorship is important to that, in keeping us protected from the intrusion of predators and creeps.

TL;DR: censorship protects people who ethically consume pornographic drawn and generative art of themselves, of aspects of themselves, and of other consenting participants in fantasy situations from exposure to evil acts and from people who knowingly have contact with, or who are themselves, child molesters, child abductors, child rapists, and child murderers.
 
You miss my point--I'm calling it witch-hunting because they're going after stuff that isn't pornography.
So, a few things here. One is that child porn must not exist, especially realistic porn, because there is a vested interest in preventing the exposure of people in sexually compromised ways against their consent.
Which doesn't address my point that we are witch-hunting non-porn. Specifically, non-sexual nudity. Baby-in-the-bath pics, nudists etc.

I also think the current laws are major overkill when applied to teens sexting.

I will advocate till I'm blue in the face to draw, or have an AI draw whatever you want, especially if you have the AI model hash and seed to prove it came from a computer...
That could be a hard burden given the variety of models out there. I'd like to see the reverse--AIs sign their work--but I'm not seeing how to keep the keys secret with things you can run on your own system like Stable Diffusion.
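For what it's worth, the hash-and-seed idea above can be sketched in a few lines of stdlib Python (illustrative only, not any real tool). The verify step shows exactly why it proves reproducibility but not authorship: there is no secret key anywhere, which is the signing problem for locally run models.

```python
import hashlib
import json

def provenance_record(model_bytes: bytes, seed: int, prompt: str) -> dict:
    """Build a provenance record: anyone holding the same model weights
    can regenerate the image from (seed, prompt) and recompute these
    hashes to confirm the picture came from a computer."""
    model_hash = hashlib.sha256(model_bytes).hexdigest()
    record = {"model_sha256": model_hash, "seed": seed, "prompt": prompt}
    canonical = json.dumps(
        {k: record[k] for k in ("model_sha256", "seed", "prompt")},
        sort_keys=True,
    ).encode()
    record["record_sha256"] = hashlib.sha256(canonical).hexdigest()
    return record

def verify(record: dict, model_bytes: bytes) -> bool:
    """True means the record is self-consistent and matches these
    weights. It does NOT prove who made it -- that would need a
    digital signature, and hence a secret key the tool can keep."""
    if hashlib.sha256(model_bytes).hexdigest() != record["model_sha256"]:
        return False
    canonical = json.dumps(
        {k: record[k] for k in ("model_sha256", "seed", "prompt")},
        sort_keys=True,
    ).encode()
    return hashlib.sha256(canonical).hexdigest() == record["record_sha256"]
```

Anyone can fabricate such a record for their own model, so it answers "is this reproducible from a computer?" but not "who generated it?" -- hence the key-secrecy problem with anything you can run yourself.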

As soon as we start talking about things like images of actual children or images of unknown and questionable provenance, all that has to be torn down. Ideally the very economic factors that encourage it should be torn down too, as such images are violations.
Yup, anything that's not provably innocent needs to be considered guilty.

I will note I have hedged some statements here. It is technically possible for a kid to snap some pictures of themselves, hold onto them in private until they are no longer a kid, and then make a decision on whether they feel violated by them, but the rarity of that event moots the point in that for every molecule of ethical material there is a practical moon of evil material, and to find that molecule you have to accept the existence of the other.
Yeah, while technically this should be acceptable there are big issues with coercion and the like.
 
Which doesn't address my point that we are witch-hunting non-porn. Specifically, non-sexual nudity. Baby-in-the-bath pics, nudists etc.
The only surefire way to prevent that is to not censor child porn at all, and I hope that you, unlike Unknown Soldier, would agree that’s a bad idea.

Every policy or regulation can be abused. But that’s not a reason to not have them.
 
You're assuming nudity = sex.
 
It doesn't prevent it. It does make it very hard to produce it for commercial reasons, though. This means little is produced simply from commercial motives.
So your argument is to prevent the harm involved in the production and supply of graphic materials, we should cut out the demand for them. Is that correct?
I'm not saying right or wrong, but rather simply why the law is the way it is. Whether it is a net benefit to society or not I do not know.
Educated guesses are welcome. I believe that cutting demand to cut supply is the same rationale used to criminalize possession of some drugs as well as possession of kiddie porn. Simple possession of heroin, for example, may seem harmless, but if the authorities can destroy the demand for heroin, then the suppliers of heroin will go broke. Or so they believe.
What you are showing is the harm caused by witch-hunting about child pornography. In the quest to stamp it out they sweep up lots of things like the images you refer to which are not pornography.
You can dismiss it as "witch-hunting about child pornography," but the effort to destroy images deemed pornographic of children is part of such censorship, and it threatened an innocent mother. As I see it, the laws against child pornography are the product of prudes and politicians who can't even define what they outlaw.
You miss my point--I'm calling it witch-hunting because they're going after stuff that isn't pornography.
I see. You are being critical of the efforts against child porn. I'm glad I'm not the only one to see how dumb those laws are.

By the way, I do not look at child pornography. My decision not to do so is based on aesthetics. Child pornography is ugly to me.

Finally, I see my latest threads are going well. My decision not to engage the troublemakers has apparently made that difference. It's an effective alternative to censoring them which apparently wasn't going to happen no matter how much they attacked me.
 
You're assuming nudity = sex.
No, I’m not. Where in what I said would you assume that? I have stated how I defined the term in this thread, and I am arguing that there is a line to be drawn. Unknown Soldier is arguing that all censorship is immoral, thus no line should be drawn. And you seem to be arguing that no line should be drawn because we can’t delineate a perfect line.
 

Finally, I see my latest threads are going well. My decision not to engage the troublemakers has apparently made that difference. It's an effective alternative to censoring them which apparently wasn't going to happen no matter how much they attacked me.

How precious. Censorship does you good, it seems. Now you have your own safe space, courtesy of the censorship you claim to oppose.
 
I will note I have hedged some statements here. It is technically possible for a kid to snap some pictures of themselves, hold onto them in private until they are no longer a kid, and then make a decision on whether they feel violated by them, but the rarity of that event moots the point in that for every molecule of ethical material there is a practical moon of evil material, and to find that molecule you have to accept the existence of the other.
Yeah, while technically this should be acceptable there are big issues with coercion and the like.
Yeah, I am not really talking about the "innocent" stuff. I am not about to share, even behind spoilers, any of the art I have made, even if all of it is ostensibly innocent though.

Arguably, any pictures of actual children are in some ways a violation of those children. More than once the thought has crossed my mind that the pictures of me as a young child in a sailor's outfit are fundamentally violations of my privacy and consent, for example, *and there is nothing explicitly sexual about them*...

That said, people are psychopaths over non-pornographic images of children. I think some people would as soon ban 100% of all images of anything appearing childlike from the entirety of society.

I am myself no stranger to the phenomenon of such witch hunting, though. I'm a friggin' ABDL for fuck's sake: there's not a month that goes by without some new drama over Twitter teens mobbing someone I know for commissioning a piece of *hand drawn art* depicting *themselves* in a sexual situation as a child, and someone losing their whole public identity to teenage Purity Paladins over it.
 