• conciselyverbose@sh.itjust.works · ↑26 ↓1 · 7 months ago

    I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.

  • paraphrand@lemmy.world · ↑24 ↓2 · 7 months ago (edited)

    “People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”

    • Ogmios@sh.itjust.works · ↑22 ↓1 · 7 months ago

      I’d posit that the people who don’t want their files scanned, and the people suing Apple are not the same people.

      • chemical_cutthroat@lemmy.world · ↑10 ↓3 · 7 months ago

        If I’ve learned one thing from my time on earth, it’s that all humans are the same, and all of the opinions of one are shared by the majority.

  • lurklurk@lemmy.world · ↑11 · 7 months ago

    Is iCloud a file-sharing service or a social network in some way? If it isn’t, comparing it with such services makes no sense.

  • lepinkainen@lemmy.world · ↑15 ↓11 · 7 months ago (edited)

    The irony is that Apple’s CSAM detection system was about as good as such a system could be at the time, with multiple steps to protect people from accidental positives.

    But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.
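
    The key protective step was a match threshold: a single hash hit did nothing, and nothing left the device until many independent hits had accumulated. Here’s a rough sketch of just that gating logic, in Python (the hash, the blocklist contents, and the threshold are made-up stand-ins; the real design used a perceptual hash, NeuralHash, plus private set intersection rather than a plain lookup):

    ```python
    # Illustrative sketch of threshold-gated matching, NOT Apple's actual protocol.
    # The real design used NeuralHash (a perceptual hash) plus private set
    # intersection, so nobody could even count matches below the threshold.
    import hashlib

    THRESHOLD = 30  # assumption; reports put Apple's planned threshold around 30
    BLOCKLIST = {"<digest of a known image>"}  # placeholder entries

    def digest(image_bytes: bytes) -> str:
        # Stand-in only: SHA-256 is a cryptographic hash; a perceptual hash
        # would also match resized or re-encoded copies of the same image.
        return hashlib.sha256(image_bytes).hexdigest()

    def needs_human_review(library: list[bytes]) -> bool:
        """Nothing is flagged until the whole library crosses the threshold."""
        hits = sum(1 for img in library if digest(img) in BLOCKLIST)
        return hits >= THRESHOLD
    ```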

    • lurklurk@lemmy.world · ↑20 ↓3 · 7 months ago

      You should have though. This type of scanning is the thin end of the wedge to complete surveillance. If it’s added, next year it’s extended to cover terrorism. Then to look for missing people. Then “illegal content” in general.

      The reason most people seem to disagree with you in this case is that you’re wrong.

      • lepinkainen@lemmy.world · ↑4 ↓4 · 7 months ago

        We could’ve burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could’ve done proper E2E “we don’t have the keys, officer, we can’t unlock it” encryption for iCloud.

        Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time, unless you specifically upload only locally-encrypted content, which 99.9999% of people will never bother to do.
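
        For the rare few who do bother, the idea is to encrypt before uploading, so the provider only ever stores ciphertext it can’t scan. A minimal sketch using the third-party cryptography package (the filenames are hypothetical, and safe key storage is the hard part this leaves out):

        ```python
        # Encrypt locally before upload: the cloud stores only ciphertext.
        # Requires: pip install cryptography. Filenames are hypothetical.
        from cryptography.fernet import Fernet

        key = Fernet.generate_key()  # keep this local; lose the key, lose the data
        box = Fernet(key)

        with open("photo.jpg", "rb") as f:
            ciphertext = box.encrypt(f.read())

        with open("photo.jpg.enc", "wb") as f:
            f.write(ciphertext)  # upload this file; the provider can't read it
        ```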

    • Petter1@lemm.ee · ↑0 · 7 months ago

      😆 Yea, especially after I learned that most cloud services (Amazon, Google, Dropbox) were already doing CSAM scans on their servers 🤭

      • lepinkainen@lemmy.world · ↑0 ↓1 · 7 months ago

        Yep, it’s a legal “think of the children” requirement. They’ve been doing CSAM scanning for decades already and nobody cared.

        When Apple built a system that required MULTIPLE HUMAN-VERIFIED matches of actual CP before even a hint would be sent to the authorities, it was somehow the slippery slope to a surveillance state.

        The stupidest ones were the ones who went “a-ha! I can create a false match with this utter gibberish image!”. Yes, you can do that. Now you’ve inconvenienced a human checker for 3 seconds, after the threshold of local matches has been reached. Nobody would EVER have gotten swatted by your false matches.
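
        To put rough numbers on that threshold (all three figures below are illustrative assumptions, not Apple’s published parameters; Apple’s own stated target was on the order of a one-in-a-trillion chance per account per year):

        ```python
        # Back-of-the-envelope odds of an innocent library crossing the threshold.
        # Every number here is an assumption for illustration only.
        from math import comb

        p = 1e-6        # assumed per-image false-match probability
        n = 10_000      # assumed photo-library size
        threshold = 30  # assumed match threshold before any human review

        # Binomial tail P(matches >= threshold); terms past the first few are negligible.
        p_flagged = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                        for k in range(threshold, threshold + 20))
        print(f"{p_flagged:.2e}")  # vanishingly small, and human review still follows
        ```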

        Can people say the same for Google stuff? People get accounts taken down by “AI” or “Machine learning” crap with zero recourse, and that’s not a surveillance state?

        • Petter1@lemm.ee · ↑0 ↓1 · 7 months ago

          😅why do we get downvoted?

          I guess somebody doesn’t like reality 💁🏻

  • Lutra@lemmy.world · ↑1 · 7 months ago

    I just read up on this, and I didn’t realize it’s not so much about stopping new images as about restitution for continued damages.

    The plaintiffs are “victims of the Misty Series and Jessica of the Jessica Series” (be careful with your googling): https://www.casemine.com/judgement/us/5914e81dadd7b0493491c7d7

    Correct me if I’m wrong, please. The plaintiffs’ logic is: “The existence of these files is damaging to us. Anyone ever found in possession of one of these files is required by law to pay damages. Any company that stores files for others must search every file for one of these 100 files, and report that file’s owner to the court.”

    I thought it was more about protecting the innocent, present and future, but it seems to be more about compensating those already hurt.

    Am I missing something?