• 0 Posts
  • 57 Comments
Joined 2 years ago
Cake day: March 24th, 2022




  • This is actually a good take. Kids aren’t miniature adults, they’re kids. They’re not helpless or useless, but neither are they fully morally and emotionally developed. They need guidance. Plenty of adults can’t responsibly handle internet access. I survived early online porn and gore and social media, but it’s not like any of it benefited me in a meaningful way.

    Some folks have an attitude that’s like “I touched hot stoves and I learned better”, but that’s far from ideal.




  • Do I approve of sex work?

    So, yes, sorta, mostly, but I don’t think it’s straightforward.

    For one, sex work is a very broad category that ranges from selling feet pics to having sex with strangers you wouldn’t otherwise consent to. So under that large umbrella of “jobs wherein you assist someone with getting their rocks off in exchange for money” there’s a lot of variation and differing considerations for the impacts on the workers and the clients.

    So I guess I approve of sex work in the general sense that I approve of any service industry labor that doesn’t intrinsically harm the worker or the consumer. But on the other hand, sex work, particularly having sex, and even stuff short of having sex, bears some higher risk than your average behind-the-counter job. There’s risk of violence, disease, and emotional or psychological harm, some of which is higher because of illegality or stigma, but some of which is higher simply because of the intrinsically intimate nature of sex. And sure, there is something kinda squicky about commodifying human intimacy.

    But on the other hand, the demand is there (not like I don’t consume porn), so the supply will always follow to meet it. So the best you can do is ensure that whatever labor sex workers do is as safe as possible, and that the people who do the labor do so freely (to the degree possible in a society that’s still capitalist).








  • the AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.

    First of all, not every image of a naked child is CSAM. This has actually been kind of a problem with automated CSAM detection systems triggering false positives on non-sexual images and getting innocent people into trouble.

    But also, AI systems can blend multiple elements together. They don’t need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.