In this case there are several crimes, but in the other case mentioned, about a Korean man, there is nothing but possession of generated content, with the argument that it is highly realistic (someone could say the same even of a sketch). Imprisoning people for acts that have no victims and cause no harm, directly or indirectly, is more aberrant than possessing that.
PS: I'm just talking about legality and rights. I know it's controversial and I'm sure someone has something to argue against it, but if you're going to accuse me of being a pedo, just get lost, you moron.
Careful, any time I point this out, the fascists come out of the woodwork to call me a pedo.
Criminalizing the creation, possession, or viewing of entirely artificial artwork is beyond unethical; it's extraordinarily evil. I don't care if you find someone's artwork gross, troubling, distasteful, immoral, etc... that's art. Victimizing real people is not "art" or "speech" or "expression"... so as long as that isn't happening, there are no ethical grounds whatsoever for restricting a person's exercise of expression, especially in private.
Social consequences for creating, sharing, or viewing certain artwork are one thing... but the government or the law punishing someone for it is a different thing entirely.
That said, this specific case is different in that the doctor DID in fact victimize real children by using secret photos and recordings of them to create the images. That crosses way over the line I laid out above. Additionally, he possessed actual CSAM (which he may have made himself), and so he is absolutely guilty of sexually victimizing real children. That guy deserves everything he gets in prison.
People are getting way overexcited about AI at the moment. If a crime, or perceived crime, is even remotely related to AI, it becomes the main focus.
Take the person who was hit by a self-driving car: the case was really about a hit-and-run driver who hit the pedestrian first and threw them into the path of the self-driving car. Had the self-driving car not been there and a human driver been in its place, pretty much the same thing would have happened, but everyone focused on the AI aspect.
If I used an AI to commit fraud, it was me who committed the fraud, not the AI, but you can be damn sure people would get hung up on that aspect of the case and not on the me-committing-a-crime bit.
It's the same as when Ford introduced the Transit van (I have no idea what the equivalent in the US market was). It was faster than most cars at the time, could carry heavier loads, and was physically larger. Inevitably it got used in a lot of bank robberies, because the police literally couldn't keep up with it. People started talking about maybe having a performance limit on vehicles, when really the actual solution was that everyone else just needed better cars. If they had actually implemented a performance limit, it would have held us back.
I thought it was obvious, but OK, I'll explain it to you. The story isn't really about AI. It involves an AI, but that's got absolutely nothing to do with the crime that was happening, so why are we obsessing over it?
The guy committed a crime. And also as a separate event he used AI.
The AI did not enable him to commit the crime, the AI did not make the crime worse, the AI did not make the crime possible, and he did not use the AI to plan the crime. The use of the AI was entirely incidental to the crime.
This piece is controversial, but it's evocative and thought-provoking, and it says something about an innocent time in our youth and the change of demeanor that awareness of sexuality brings.
People may not like this, but if you can set sexuality aside and understand that we were once "innocent" - meaning sex wasn't something we knew about; we just had these bodies we were told to hide in clothes - the painting takes on a whole new meaning.
I'm not advocating for fake cheese pizza photos, fuck those sickos, but art can appear to be one thing at first glance and then take on a new meaning as we study and understand more.
Your first paragraph about criminalizing art is 100% correct and 100% irrelevant. You cannot call porn art. Porn with adults, children, dogs, pumpkins - all that stuff is made for people to get off, not to enjoy the emotions that real art provokes. Therefore we cannot compare criminalizing porn with criminalizing art.
There are edge cases, of course, where art might be provocative and considered immoral, maybe even illegal sometimes. But those would be edge cases, and highly debated ones.
Maybe no one would need to point out your pedophilia if you stopped conveniently ignoring that it's not possible to generate child porn "AI art" without having child porn first.
Don't they often train the program on adult porn, and then the AI just puts a child's face onto bodies generated from that training data? I imagine these AI companies are scraping data from popular porn sites, or just paying for the data, and those porn sites work hard not to have CP on them. The result is a child's face on a body too mature for it. Remember that some actual adult actresses have body proportions many would consider underdeveloped, and someone generating these pictures could regenerate until the AI uses those proportions.
The point is that you don't need CP to train an AI to make CP. I am not justifying any moral position here, just pointing out a fact about the technology.
Uh, no, an adult porn photo with a child's face edited in is just an adult porn photo with a child's face edited in. I don't know anyone insane enough to claim it's child porn, and pedophiles don't like physically mature bodies, so everyone loses.
In this case the guy did have real images, but you don't need them. AI is intelligent in a hard-to-define way; it picks up on stuff.
It picked up that people like younger individuals in pornography, so it took that to the logical extreme. AI is weird because it's intelligence without any actual thought. But it can totally generate variations on things it's already seen, and a kid is just a variation on a young adult.
Ah yes, I don't know what I'm talking about; it's just that this guy happened to have real images, like they do every time, because it's impossible to get your garbage model to produce CP otherwise.
Criminalizing the creation, possession, or viewing of entirely artificial artwork is beyond unethical; it's extraordinarily evil.
No it isn't.
I don't care if you find someone's artwork gross, troubling, distasteful, immoral, etc... that's art.
No, it's child porn.
Careful, any time I point this out, the fascists come out of the woodwork to call me a pedo.
Can't imagine why.
You realise the AI is being trained on pictures of real children, right?
So it's wrong for it to be based on one child, but according to you the AI "art" (as you keep calling it) is okay as long as there are thousands of victims instead?
So you're cool with images of 6 year olds being penetrated by a 40 year old as long as "tHe Ai DrEw iT sO nObOdY gOt HuRt"? I guess you could just set it as your desktop and phone wallpaper and everything would be fine. Let me know how that works out for you.
That's some stunning mental gymnastics right there.
You realise the AI is being trained on pictures of real children, right?
Disingenuous and misleading statement. No readily available AI is trained on CP.
So it’s wrong for it to be based on one child, but according to you the AI “art” (as you keep calling it) is okay as long as there are thousands of victims instead?
Disingenuous and misleading statement. I’m guessing you don’t understand how AI works. As for AI output, a randomly generated nonexistent person is nonexistent. Simple as that.
Sidenote: I disapprove of nonconsensual Photoshop and AI illustrations of real people, except for fair use cases such as satire. AI is just another illustrative tool, and the choice of tool is beside the point.
So you’re cool with images of 6 year olds being penetrated by a 40 year old as long as “tHe Ai DrEw iT sO nObOdY gOt HuRt”?
No, I am not. And that is still utterly unimportant. It doesn’t matter how I feel about someone’s fictitious illustrations, sculptures, writings, or anything else created by a person or AI that is wholly fictitious.
That’s literally the whole point I am making: It doesn’t matter how I feel about it, it doesn’t matter how YOU feel about it. It’s not real. Neither you nor I nor anyone else has the right to judge someone else’s art.
It does matter how I and wider society view disgusting content. It matters a lot. And society absolutely has a say in its acceptance, or otherwise, of such content. Saying otherwise is absurd.
In the same way, I can't and shouldn't write something incredibly racist and pretend it's 'art'. Even if an AI made it.
Attempting to give AI child porn a pass, as you are doing for some baffling reason, will absolutely create further harm down the line.
I'd say it's because the person you're replying to rightfully sees it as a slippery slope. If you say this fake image that didn't directly harm anyone is illegal, what's to stop you from saying some other fake image that's much more in line with social tastes is also illegal? E.g., an artwork made of human shit. Most people would be repulsed by that, but it doesn't change the fact that it could be art. As long as it doesn't concretely harm someone, it's hard to equate it to such harm.
Oh, that's very simple: it's irrelevant whether child porn generated by AI is legal or not, because the reality is that you need to possess actual child porn to train the AI on before it can generate more.
You already incriminate yourself before generating the """art""".
I am a software engineer, and you are misrepresenting the technology. All the articles I can find state it was a web-based AI generator, but not which one. Please find me a company that makes this tech public and is somehow not in trouble but should be, or is in trouble.
"That same year, Tatum surreptitiously recorded one of his New York patients during an outpatient visit, five days after the youth turned 18."
"Two of the images Tatum used AI to modify were from a school dance"
The above quotes indicate it may have been used on an older child, which could easily be done with legal training data. Please find any evidence that any public AI image generator is stupid enough to use CP when they are risking millions of dollars and would have to keep a lot of employees quiet about it.
I know you know this, but you are not crazy. I'm astonished you are being downvoted so hard. The pedo apology is so strong it's making me not want to use Lemmy. This thread is worse than Reddit.
I left when the API price changes kicked in. At first Lemmy was alright, but then the extremists turned up, and the echo chamber here is so ridiculous that there just isn't much point in being here anymore.
It's not just the pedo apologists (the next step will be AI CP actually being posted here, with people defending it as "art"). Seeing, literally every single day, that YouTube is somehow evil for trying to stop freeloaders leeching from it, how evil cars are, and how Linux is the next coming of Jesus (and I say that as a 20+ year Linux user) is incredibly tedious.
Sure, this existed on Reddit as well, but at least there was other content to dilute it, and for the most part people were reasonable instead of displaying the rabid extremism I'm seeing every day here. There is no way in hell I would have seen an up/downvote ratio like the one in this pedo-apologist conversation on Reddit.
You realise the AI is being trained on pictures of real children, right?
Can you share a source? Just as people use the internet to distribute CP, there are undoubtedly circles where people are using ML for CP. However, my understanding is that, by and large, popular models are not intentionally trained on any.
The pedophiles who are smart enough not to get caught, who use technology like Tor, encrypt everything, and can figure out how to use Stable Diffusion, will be the pedophiles who have custom models trained on real children.
And if you and I can consider the possibility in a casual conversation online, they have also considered it, heavily researched it, and implemented it if it's at all possible. And they know how not to get caught.
But it's okay, it's "art" after all and we can't ban art because that's evil.... Right... Right?
....okay, seeing as you haven't actually done any research, yet arrived at a conclusion, a conversation about this is going to be difficult.
Let's get more specific so we can have an actual conversation. When you say "the AI", what do you mean? DALL-E, Midjourney, or some guy training and using his own model on a local computer?
Are you familiar with large models being able to compose concepts they've seen, to produce something not found in its training data?
You used a technical assertion in your argument. Out of curiosity, I wanted to learn more and asked you for sources.
You can neither prove nor are you capable of discussing said technical assertion. I am now going to leave the conversation. Seeing as you can't prove or even discuss it, I'd hope you avoid using it in the future, or at least learn more about it.
However, I think psychologists might not be fans of giving them access to that material. I think the reason is that they would end up looking for more and more extreme material, and they could end up offending as a result.
Afaik we're still yet to find out whether viewing AI-generated material makes an individual look for real-life child abuse imagery.
I believe viewing the latter allows many to keep real-life urges under control (I might re-check the materials on that), but it obviously comes with its own issues. If we can make AI-generated child pornography, and if it doesn't make people go looking for the "real stuff", we might actually make a very positive impact on child safety.
According to the few studies we have from the nineties and aughts, most people who are sexually attracted to kids are aware that acting on those attractions can be harmful and will turn to alternative ways to explore them (when they can't be suppressed or redirected). So now that we have victimless ways to produce porn, the objections are to the fetishes themselves, not to any resulting violent behavior.
That said, people commonly and openly express their distaste for such people, more so than for domestic-violence offenders who assault their kids non-sexually. Yet the general disdain for child-sex-attracted adults does not translate into action to feed children or protect them from abusive households.
That said, whenever we've worried about fiction driving people to act it out in reality, history has proven it wrong every single time. Women are not driven to scandal and harlot behavior by trashy romance. Teens are not driven to violence by violent movies or video games. We can expect that porn featuring children is not going to compel someone to actually sexually assault kids.