• Welcome to the new Internet Infidels Discussion Board, formerly Talk Freethought.

Republicans, I hope you like rape.

So, I know this is going to be something folks would disagree with, but I don't think we should allow kids on the internet unsupervised at all.

I think that the Internet should be like liquor: you show your ID to get an account, and access should be strictly controlled to the same extent we expect control of liquor cabinets. We should also have class time dedicated to teaching people how to use the internet safely and what the dangers are, along with a discussion of how porn is illusory, inaccurate, and misleading, and how any adult trying to distribute it to the likes of them has ulterior motives.

I will go further and wonder if the whole 'World Wide Web' is a net negative.

It started as a wonderful dream; and in many ways the first decade or so of the 'Web was a paradise compared with what we have now.

Porn is the least of my worries. What about shaming teenagers, identity theft and other frauds? Search engines used to find brilliant little websites. Now they're just tuned to point to paywalls and other revenue generation. (I searched for an ordinary but slightly uncommon uncapitalized English word yesterday and the first page of results had NOTHING but links to a company using that as a brand-name.)

I thought Wikipedia was wonderful when it first appeared: Much larger and easier to edit than the Encyclopaedia Britannica. IMHO it's now a dismal caricature of its past promise ... and yet is STILL one of the best sites around! I've been searching the Internet off and on for a while, and the changes for the worse are palpable. Dissatisfied with a Wikipedia answer and want to explore a topic further? Most of the "relevant" Google hits will be pages which simply copied the Wiki article word-for-word, mistakes and all.

And let's not even get started on "Social Media."

Call me a Luddite; call me a reactionary; but I think when The Decline and Fall of Western Civilization is written (sooner than you think), the 'Net (especially social media) will get much blame.
Rather, I think a lot of it has to do with a rather naive approach to the internet in general. The fact is that if parents treated the internet even a little bit like they treat alcohol, we all would be far better off.

Everyone's internet access can and should be authenticated at the basic level. I do have my doubts as to whether we could really prevent all underage access any more than we can prevent underage access to liquor or firearms or cigarettes, but our culture does a passable job of at least some of those things through a late enough age that it hopefully does some good.

My expectation would be for at least some privacy.

Adult websites that take protecting kids seriously use an RTA label that makes it easy for browser- and ISP-based systems to keep their content from being accessed by minors.
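For context, the RTA ("Restricted To Adults") label is, as I understand it, just a fixed string a site places in a meta tag in its HTML, so a filter only has to scan a page for that string. A minimal sketch in Python of how such a check might work (the function name is mine, not from any real filtering product):

```python
# Minimal sketch: check a page's HTML for the RTA ("Restricted To Adults") label.
# Sites that opt in place a fixed string in a <meta> tag, e.g.:
#   <meta name="rating" content="RTA-5042-1996-1400-1577-RTA">
# A browser- or ISP-side filter only needs to scan the HTML for that string.

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"

def has_rta_label(html: str) -> bool:
    """Return True if the page's HTML carries the RTA label."""
    return RTA_LABEL in html

# Example: a labeled page is flagged, an unlabeled one is not.
labeled = '<head><meta name="rating" content="RTA-5042-1996-1400-1577-RTA"></head>'
unlabeled = "<head><title>Ian's Shoelace Site</title></head>"
print(has_rta_label(labeled))    # True
print(has_rta_label(unlabeled))  # False
```

The whole point of the design is that the label is trivial for filters to detect and costs the site nothing, which is why opt-in labeling works at all.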

I would go so far as to say, you should probably have to type your login to your ISP every time you connect, or at least you should if you have kids in your home. If we could manage even that tiny little thing, the virtual equivalent of "auto-locks on the liquor cabinet", we would be miles ahead of where we are now.

I don't even think letting pictures of real kids on the internet was a good idea.

I also think you're mostly right about social media. To be fair, forums are a form of social media, too. It's almost like "cities" and "villages". People live in cities. It's a big trope, even, as a big "no shit" to political maps that show how much red land happens to be occupied by what turns out to be shockingly few people. Online is no different, in that people tend to want to surf and post in large, loosely interconnected hubs.

But if you want what is probably the absolute peak of internet content, I would point you over to https://www.fieggen.com/ better known as Ian's Shoelace Site.

It's probably the best site I've ever found on knot tying and shoe laces.
 
We have three possible cases:

1) Porn lowers rape.
2) Rape declines for unrelated reasons, causing availability of porn. Huh, this one makes no sense.
3) Something else causes both increased access to porn and lower rates of rape. What?!
You left off the obvious one, where correlation does not equal causation.

4) Some things impacted the porn availability, other things impacted rape.

The world has changed hugely in the last 20-25 years. Internet made porn easy and free.

Rape response has also become more effective, in large part due to DNA technology. Also general attitudes towards rape have hardened.
But they are not the same thing.

Maybe guys who might have been inclined to commit a rape are now more likely to "take the edge off" on a porn site. By itself, that sure sounds like a good thing to me.
Tom
 
And along the same lines, and going a step further...

Saying this is admittedly skeevy and creepy and will probably get me crucified here, but now that AI-generated images and videos have gotten pretty darn realistic (and will only get better), maybe we ought to consider allowing AI-generated child porn to those with such tendencies. No real kids would be involved in the images, and if the pedophiles can get their sexual gratification from these images and not real-life photos (or kids themselves!), then society is better off, no?
 
I'd have to think about that. I'd like to see a scientific test of that but I don't see how it could be done in an ethical manner.

My guess is some pedos already are doing it.

Are AI generated child porn images illegal?
 
Yeah, scientific tests would be in order for sure before embarking on something like that. If the child porn given during the tests is AI, the child does not exist in real life. So, no harm in that respect regarding ethics. I assume the test subject (a volunteer) would be OK with it.

I think the big problem is much of the public couldn't get past the disturbing nature of it, and would ignore the (possible) greater good.

Good question about the AI images being illegal. We are in some untested waters here I think.
 
You would still need a control group which would have to involve real children.
 
Maybe. Not sure.
 
I really wonder if all those people who pushed for the liberalisation of porn laws 50-60 years ago, saying it does no harm to anyone else, how could it hurt others, etc., ever look back and realise how stupid they were.
 
We have three possible cases:

1) Porn lowers rape.
2) Rape declines for unrelated reasons, causing availability of porn. Huh, this one makes no sense.
3) Something else causes both increased access to porn and lower rates of rape. What?!
You left off the obvious one, where correlation does not equal causation.

4) Some things impacted the porn availability, other things impacted rape.

The world has changed hugely in the last 20-25 years. Internet made porn easy and free.

Rape response has also become more effective, in large part due to DNA technology. Also general attitudes towards rape have hardened.
But they are not the same thing.

Maybe guys who might have been inclined to commit a rape are now more likely to "take the edge off" on a porn site. By itself, that sure sounds like a good thing to me.
Tom
But the effect applies across many countries. And it predates DNA in places like the US.
 
Saying this is admittedly skeevy and creepy and will probably get me crucified here, but now that AI-generated images and videos have gotten pretty darn realistic (and will only get better), maybe we ought to consider allowing AI-generated child porn to those with such tendencies. No real kids would be involved in the images, and if the pedophiles can get their sexual gratification from these images and not real-life photos (or kids themselves!), then society is better off, no?
I'd have to think about that. I'd like to see a scientific test of that but I don't see how it could be done in an ethical manner.

My guess is some pedos already are doing it.

Are AI generated child porn images illegal?
When I went to look up that 85% number I also found research that showed the availability of "child" porn had a positive effect. I did not dig into it, though, and I think they were talking about places that had lower ages of consent, not truly children.

And I think it could be done in an ethical manner. Compare the recidivism rate for kiddie porn convicts who do or don't get access to AI kiddie porn.
 
Yeah, scientific tests would be in order for sure before embarking on something like that. If the child porn given during the tests is AI, the child does not exist in real life. So, no harm in that respect regarding ethics. I assume the test subject (a volunteer) would be OK with it.
You would still need a control group which would have to involve real children.
Why? Your control group simply isn't given access to the AI porn, the endpoint is arrests for sexual offenses against children.
 
But the effect applies across many countries. And it predates DNA in places like the US.
Seriously? The effect applies in countries without Internet access?
I don't know of one.

DNA identification has been improving for a long time, as has the hardening of attitudes towards coercive sex. I remember when "Boys will be boys" was all too common and "look at what she was wearing" was a defense.
Kiss that shit goodbye.
Tom
 
Yeah, scientific tests would be in order for sure before embarking on something like that. If the child porn given during the tests is AI, the child does not exist in real life. So, no harm in that respect regarding ethics. I assume the test subject (a volunteer) would be OK with it.
You would still need a control group which would have to involve real children.
Why? Your control group simply isn't given access to the AI porn, the endpoint is arrests for sexual offenses against children.
So it would still put children at risk.
 
An alternative experiment would be to interview them over a time span both with and without the child porn, and check whether their desires toward real children had waned when access to the AI porn was allowed.
 
As the subject has been broached already:

Some people generate porn with AI. Some of the porn people generate with AI could be mistaken for containing human children.

I would like to compare two scenarios:

In one scenario, a gay 24 year old who "looks 8*" decides to participate in a porno. It's school themed and he puts on a school uniform his boyfriend had from when he was young and they shoot a scene with desks in a school setting, culminating in a very racy ruler-spanking scene with moaning and all.

Now, in the second... Exactly as the first except everything is generated by AI, and let's just say the AI model of the smaller-bodied participant is rather similar to the small-bodied 24 year old.

A court case has already happened pertaining to the rights of adults to participate in pornographic films that depict such scenes, in which the defense was the actress of the film testifying that she was of age, vindicating the people accused of having illegal pornography.

If you would argue that a 24 year old lacks the same legal right as any of their peers to appear in a porno that most any other 24 year old would be allowed to participate in, you're wrong. Filming only becomes wrong because of the involvement of non-consenting parties.

So, I would argue that there is NO legal basis to suddenly declare the second film, the film that involved ZERO human actors, illegal.

Second, I've at least got some experience observing various people who were into some gross porn (drawn/digital art). There are two pretty clearly different groups: the group that likes gross porn because they really wish they could "get away with something" and mourn the fact they won't get an opportunity, and the group who doesn't want to get away with anything.

I would liken this to the difference between the people who play violent videogames while pouting that they can't be violent IRL, and the people who play such games because they have violent urges and say "well, gotta do something with them on occasion; what's the least violent violent-enough thing, so I can just be done thinking about it?"

For the latter group there was never anything you could consider "desire" to have sexual contact with a minor. For most in the ABDL community, for instance (a group which arguably "simulates" certain situations), "mommy" and "daddy" roles are hard to locate and arrange because given the choice, that's just not the side of the equation most people are into.

These are very different use cases and I've never seen someone pushed into one group from the other as a result of porn... The ones seeking to escalate it were always creeps seeking to escalate and explore such feelings in inappropriate ways.

As of now, I'm aware of a few cases on the subject in the legal system, mostly because of the ArtistHate community on reddit: one case wherein someone trained a model using illegal abuse images, one in which someone used a pre-existing model to transform a picture of their neighbor's kid into "deep fake" porn of that kid, one in which someone distributed AI porn to minors, and one in which there is an obscenity law being applied (Floriduh).

I would say in most of these cases, the AI has nothing to do with the actual crime: training on CSAM requires CSAM thus CSAM charges are appropriate; making deep-fake porn of a non-consenting person is wrong no matter who the person is, and especially so if they are a kid; distributing ANY kind of porn to minors is grooming and distributing porn to a minor; the last one is troubling because AFAIK that particular charge begins and ends with a victimless thought crime ("obscenity").

There are grounds to consider something along the lines of my initial example, however: that of the 24 year old small-bodied porn actor. If it's possible for an adult to allow their image in a legal porn, it's fully possible to inadvertently depict a "doppelganger" of a real person, potentially creating an illegal porn.

There are certainly many questions that DO need answers. However, I think that in any sane and ethical application of law, there needs to be room for those who act to eliminate possible harm arising from their condition to seek that pathway.

*I have a close friend who matches this description, though they do not participate in pornography filming AFAIK.
 
I really wonder if all those people who pushed for the liberalisation of porn laws 50-60 years ago, saying it does no harm to anyone else, how could it hurt others, etc., ever look back and realise how stupid they were.
What do you think is the harm?
Earlier in the thread you were talking to theBeave about AI images and ethics.
The desensitisation that occurs among those who view porn, i.e. what was acceptable a few years ago is no longer enough.
The fact that it just gets worse and worse, i.e. child porn, strangulation scenes, etc.
It gives too many men an unrealistic view of sex, performance, consent, etc. Dealing with the real, messy world does not match the world of porn.
Bandwidth is wasted.
It makes a few very depraved individuals very rich.
 
AI generated images and videos have gotten pretty darn realistic ...
If you're talking about some of the super-beautiful women showing up on Google Images, I prefer flesh-and-blood women.

I have a real-life GF named Vicki; she let me take her photo and I'll post it here. She's not a supermodel or a beauty pageant contestant; and men don't stare at her when we're walking down the street, but I think Vicki looks OK!
She asked me to take her photo after she put on one of the costumes she uses for our tantric training sessions. Her looks may not be much, but she's very smart and creative. Sometimes I feel like a lucky guy to have her for my girl-friend.
aig008.jpg
 
I really wonder if all those people who pushed for the liberalisation of porn laws 50-60 years ago, saying it does no harm to anyone else, how could it hurt others, etc., ever look back and realise how stupid they were.
What do you think is the harm?
Earlier in the thread you were talking to theBeave about AI images and ethics.
The desensitisation that occurs among those who view porn, i.e. what was acceptable a few years ago is no longer enough.
The fact that it just gets worse and worse, i.e. child porn, strangulation scenes, etc.
It gives too many men an unrealistic view of sex, performance, consent, etc. Dealing with the real, messy world does not match the world of porn.
Bandwidth is wasted.
It makes a few very depraved individuals very rich.

Stay out of people's bedrooms.
 