• billwashere@lemmy.world · 2 up · 1 day ago

    I imagine most of these models have all kinds of nefarious things in them, sucking up all the info they could find indiscriminately.

  • yeehaw@lemmy.ca · 21 up · 2 days ago

    Child sexual abuse material.

    Is it just me, or did anyone else already know what “CSAM” was?

      • TipsyMcGee@lemmy.dbzer0.com · 3 up · 2 days ago

        I don’t quite get it. Is it a wider term than child pornography, or narrower (e.g. does it exclude some material that could be considered porn but strictly speaking doesn’t depict abuse)? The abbreviation sounds like some kind of exotic Surface to Air Missile lol

        • Miðvikudagur@lemmy.world · 8 up · 1 day ago

          “Child pornography” is a term NGOs and law enforcement are trying to get phased out. It makes it sound like CSAM is related to porn, when in fact it is simply abuse of a minor.

          • TipsyMcGee@lemmy.dbzer0.com · 1 up · 13 hours ago

            Still a bit confusing to me. In my jurisdiction, child pornography is the legal term, and it applies independently of whether anyone was abused or had their sexual integrity violated (e.g. a 17-year-old above the age of consent voluntarily being depicted in a sexual context and voluntarily sharing the content with peers, or material depicting a child that doesn’t exist).

        • ulterno@programming.dev · 0 up, 1 down · 1 day ago

          The abbreviation sounds like some kind of exotic Surface to Air Missile lol

          It does.
          Somehow acronyms just end up sounding cool. Guess we should just use the full form. That would be better.

  • finitebanjo@lemmy.world · 24 up · 3 days ago

    My dumb ass sitting here confused for a solid minute thinking CSAM was in reference to a type of artillery.

  • TheJesusaurus@sh.itjust.works · 75 up, 3 down · 3 days ago

    Why confront the glaring issues with your “revolutionary” new toy when you could just suppress information instead?

    • Ex Nummis@lemmy.world · 20 up, 7 down · 3 days ago

      This was about sending a message: “stfu or suffer the consequences”. Hence, people who encounter something similar later will think twice about reporting anything.

  • killea@lemmy.world · 34 up, 3 down · 3 days ago

    So in a just world, Google would be heavily penalized not only for allowing CSAM on their servers, but also for violating their own ToS with a customer?

    • dev_null@lemmy.ml · 7 up · 3 days ago

      Not only were they not allowing it, they immediately blocked the user’s attempt to put it on their servers and banned the user for even trying. That’s as far from allowing it as possible.

    • shalafi@lemmy.world · 18 up, 1 down · 3 days ago

      We really don’t want that first part to be law.

      Section 230 was enacted as part of the Communications Decency Act of 1996 and is a crucial piece of legislation that protects online service providers and users from being held liable for content created by third parties. It is often cited as a foundational law that has allowed the internet to flourish by enabling platforms to host user-generated content without the fear of legal repercussions for that content.

      Though I’m not sure whether that applies to scraping other servers’ content. But I wouldn’t say it’s fair to expect the scraper to review everything. If we don’t like that take, then we should outlaw scraping altogether, but I’m betting there are unwanted side effects to that.

      • mic_check_one_two@lemmy.dbzer0.com · 6 up · 3 days ago

        While I agree with Section 230 in theory, it is often only used in practice to protect megacorps. For example, many Lemmy instances started getting spammed by CSAM after the Reddit API migration. It was very clearly some angry redditors who were trying to shut down instances, to try and keep people on Reddit.

        But individual server owners were legitimately concerned that they could be held liable for the CSAM existing on their servers, even if they were not the ones who uploaded it. The concern was that Section 230 would be thrown out the window if the instance owners were just lone devs and not massive megacorps.

        Especially since federation caused content to be cached whenever a user scrolled past another instance’s posts. So even if they moderated their own server’s content heavily (which wasn’t even possible with the mod tools that existed at the time), there was still the risk that they’d end up caching CSAM from other instances. It led to a lot of instances moving from federation blacklists to whitelists instead. Basically, default to not federating with an instance unless that instance owner takes the time to jump through some hoops and promises to moderate their own shit.

      • vimmiewimmie@slrpnk.net · 2 up · 3 days ago

        Not to start an argument, which isn’t my intent. Certainly there may be a view along the lines of “scraping as it stands is good because of the simplification and ‘benefit’”. Which, sure, it’s easiest to cast a wide net and absorb everything, to simplify the concept, at least as I understand it.

        Yet maybe the process of scraping, and of absorbing everything into databases including AI, is itself a worthwhile point of conversation. Maybe how we’ve been doing something isn’t the continued ‘best course’ for the situation.

        Undeniably, monitoring more closely what is scraped and stored raises questions and obstacles that are large in both number and scope, but maybe having that conversation is where things should go.

        Thoughts?

      • killea@lemmy.world · 1 up · 3 days ago

        Oh my, yes, you are correct. That was sort of knee-jerk, as opposed to it somehow being the reporting party’s burden. I simply cannot understand the legal gymnastics needed to punish your customers for this sort of thing; I’m tired, but I feel like this is not exactly an uncommon occurrence. Anyway, let us all learn from my mistake: don’t be rash and curtail your own freedoms.

  • hummingbird@lemmy.world · 14 up · 3 days ago

    It goes to show: developers shouldn’t make their livelihood dependent on access to Google services.

    • TheJesusaurus@sh.itjust.works · 14 up · 3 days ago

      Not sure where it originates, but CSAM is the preferred term in UK policing, and therefore in most media reporting, for what might have been called “CP” on the interweb in the past. Probably because “porn” implies it’s art rather than a crime, and it’s also just a wider umbrella term.

        • yesman@lemmy.world · 7 up · 3 days ago

          LOL, you mean the letters C and P can stand for lots of stuff. At first I thought you meant the term “child porn” was ambiguous.

        • finitebanjo@lemmy.world · 2 up, 3 down · edited · 2 days ago

          Counter Surface to Air Missile equipment or ordnance.

          Why not just say Child Porn or Explicit Images of Children? Who are we protecting by refusing to say the words?

  • B-TR3E@feddit.org · 2 up · 3 days ago

    That’s what you get for criticising AI - and rightly so. I, for one, welcome our new electronic overlords!