Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • If a pedophile creates a hospital/clinic room setting and photographs a naked kid, will it be okay? Do you understand that these problems are impossible to solve just like that? Parents also take photos of their kids, and they do not take photos the way a doctor would; they take photos in more casual settings than a clinic. Would parents be considered pedophiles? According to the way you propose to judge, yes.

    You are basically implying that social defamation is what matters here, and that the trauma caused to victims of such fictional media is the problem. However, this is exactly what anti-AI people like me were trying to warn against. And since these models are open source and in public hands, the cat is out of the bag. Stable Diffusion models run on potato computers and take at most 2-5 minutes to generate a photo, and 4chan has entire guides for uncensored models. This problem will be 100x worse in a couple of years, 1000x worse in the next 5 years, and infinitely worse in a decade. Nothing can be done about it. This is what the AI revolution is. Future generations of kids are fucked thanks to AI.

    The best thing one can do is protect their privacy and keep their photos from getting out there. Nobody can win this battle, and even in the most dystopian hellhole with maximum surveillance, there will be gaps.

  • Have you considered the problem of doctors, married parents and other legitimate people being labelled as CSAM users and pedophiles? This is not a hypothetical scenario; it has already happened in the real world, it has caused real damage to people, and they are not obligated to take the brunt of the misjudgements of tools developed to detect such media.

    The argument about planted CSAM is not incoherent; it has already played out for many people. It is one of the favourite tools of elites and ruling politicians. I am less worried about it because, thankfully, no law yet exists that would brutally misjudge the masses over fictional media.

  • An image is not merely an arrangement of pixels in a jpeg,

    I am not one of those "it's just pixels on a screen" people. But if it was not recorded in the real world with a camera, it cannot be real.

    Who will be the judge? If some automated AI is created, who will be the one creating it? Will it be perfect? No. We will end up in the situation Google caused for its users, with doctors, married parents and other legitimate people being labelled as pedophiles or CSAM users. It has already happened to me in this thread, and you said it yourself. The only accurate way to judge such media would be a very large team of forensic experts in image/video media, which is not feasible for the amount of data social media generates.

    not every law needs to have a perfectly defined line

    And this is where the abuse by elites, politicians and the establishment starts. Activists and dissidents can easily be jailed by planting CSAM, which in this case would be as simple as AI pictures arriving as temporary drive-by downloads onto a target's device.

  • Glad that it will always remain a hot take.

    The problem with your argument is that no scale or spectrum can be developed to judge where the fake stops and the real starts for drawings or AI-generated media. And since they were not recorded with a camera in the real world, they cannot be real, no matter what your emotional response to such a deplorable act of defamation may be. It is libel of an extreme order.

    Cuties was shot with a camera in the real world. Do you see the difference between AI-generated media and what Cuties was?

  • I think I have been attacked far too much here already. These nasty people are labelling me a pedophilia supporter. I would suggest capital punishment for pedophiles, and at least a non-bailable offence law for defamation actors like the one in the post's article and these internet creatures that go around falsely labelling people.

  • Since you are also labelling me a pedophile/loli fan, I would prefer you provide evidence of the same. Failing to do so will require me to take moderator action.

    Justifying your absurdity using hivemind baiting tactics may work on Reddit, but this is Lemmy.

    Edit: I have learned my lesson. I will never be this tolerant again. Disgusting people. Leniency just makes you a doormat.

  • I can't believe that the person defending sex crimes of this magnitude is a fucking mod

    Can you describe where I did exactly this? You might be violating rules by falsely accusing people of things they did not do.

    Asking this as a mod. I would prefer an answer in the next 24 hours, given the reputation damage you are attempting to do to me. I will not hesitate to take action against such a deplorable act. Lemmy mods are not like this.

    And considering I have not deleted any of my comments, I am being fairly transparent here, simply trying to discuss.

    Edit: since you are active right now, you must provide an answer within two hours.

  • It was not a threat, but a hypothetical example to gauge the reaction of that reactionary baiter.

    The problem with classifying AI-generated art as CSAM is that there is no possible way to create an objective definition of what "level" of realism is real and what is not. A drawing or imaginary creation is best left undefined as real in any capacity whatsoever. If it is drawn or digitally created, it is not real, period. Those people thinking of good uses of AI were too optimistic and failed to account for the extremely bad use cases that will spiral out of control across human society.

    Even though China is incredibly advanced and proactive in trying to control this AI deepfake issue, I do not trust any entity in any capacity with a problem impossible to solve at a national or international scale.

    I just had a déjà vu moment typing this comment, and I have no idea why.

  • You are projecting your own need for professional therapy. You are so blinded by anger that you are a complete and utter failure at maintaining composure and thinking things through with a calm, composed mind.

    I do not need to clarify my position; it is clear if you have a clear mind, which you do not. Just to repeat: AI drawings are not a CSAM problem, but a lethal weapon for defamation and libel.

  • Deduplication tool
  • The largest-footprint file type is videos. Use the Video Duplicate Finder tool on GitHub, then use Czkawka to deduplicate the more general file types. Both are available on Linux.

    This will solve at least 97% of your problems.
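    The core idea behind both tools — flagging files whose contents are byte-identical — can be sketched in a few lines of Python. This is a minimal, hash-based illustration only; the real tools do far more (Video Duplicate Finder, for instance, uses perceptual matching to catch re-encoded videos, which a plain content hash cannot):

```python
import hashlib
from collections import defaultdict
from pathlib import Path

def file_hash(path, chunk_size=1 << 20):
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Group every regular file under `root` by content hash and
    return only the groups with more than one member (exact duplicates)."""
    groups = defaultdict(list)
    for p in Path(root).rglob("*"):
        if p.is_file():
            groups[file_hash(p)].append(p)
    return {h: ps for h, ps in groups.items() if len(ps) > 1}
```

    Hashing by content rather than comparing names or sizes is what lets these tools find duplicates scattered across differently named folders.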

  • Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • Thanks for telling the class that you're a lolicon pedo.

    I knew you were one of those reactionary false accusers. I highly doubt anyone takes you seriously in real life, because of this bratty behaviour. You are making kids less safe by turning this serious issue into reactionary bait, and that is more damaging than the drawings you are posturing to be after.

  • There is not yet an AI that can do this. Also, is there real-world harm happening? This is a problem of defamation and libel, not "CSAM". Reducing problems to absurdity is lethal to the liberty of citizens.

    All those who wanted AI so much: you will have the whole cake now. Fuck AI empowerment. I knew this would happen, but the people glazing AI would not stop. Enjoy this brainrot, and soon a flood of Sora-generated 720p deepfake porn/gore/murder videos.

    TheAnonymouseJoker @lemmy.ml