Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, raising concerns about how the company will handle the growing volume of such material associated with artificial intelligence.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accuses Apple of vastly undercounting how often child sexual abuse material (CSAM) appears in its products. In a single year, child predators used Apple’s iCloud, iMessage and FaceTime to store and exchange CSAM in more cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children’s charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales. In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), which is in stark contrast to its big tech peers, with Google reporting more than 1.47m and Meta reporting more than 30.6m, per NCMEC’s annual report.

  • Docus@lemmy.world · 5 months ago

    There is no ‘good’ way of monitoring these platforms without a massive intrusion of privacy, just as there is no good way of monitoring what people store on their hard disks and memory sticks, or burn onto DVD/CD and send through the mail.

    • wewbull@feddit.uk · 5 months ago
      The trouble is that Apple (after a lot of protest against their proposed “solution”) didn’t implement measures that other platforms did. For once I think they did the right thing, but it’s a difficult position to defend against the NSPCC.

      We need to value the privacy of people more as a society IMHO.
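
For context, the “measures that other platforms did” are typically server-side matching of uploads against lists of fingerprints of already-known abuse images (PhotoDNA-style perceptual hashes); Apple’s withdrawn 2021 proposal would have done similar matching on-device with its NeuralHash algorithm, which is what drew the protest the comment refers to. Below is a minimal sketch of the general idea only, using a plain SHA-256 digest in place of a perceptual hash and a hypothetical, empty blocklist:

```swift
import Foundation
import CryptoKit

// Simplified sketch of hash-list matching: flag an upload if its digest
// appears in a set of known-bad fingerprints. Real systems use perceptual
// hashes (e.g. PhotoDNA, NeuralHash) so re-encoded or resized copies still
// match; an exact SHA-256 comparison like this misses any altered copy.
func shouldFlag(_ fileData: Data, against blocklist: Set<String>) -> Bool {
    let digest = SHA256.hash(data: fileData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return blocklist.contains(hex)
}

// Hypothetical usage: a real blocklist would come from NCMEC-style hash lists.
let knownBadHashes: Set<String> = []
let upload = Data("example file contents".utf8)
print(shouldFlag(upload, against: knownBadHashes))   // false: empty blocklist
```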

    • Flying Squid@lemmy.world · 5 months ago
      You think you have privacy on these platforms? They’ve been using everything in their cloud storage to train their AIs.

      • Docus@lemmy.world · 5 months ago
        Any evidence that Apple use everything I store in iCloud for training their AI? Please share your evidence. There is a long list of things I don’t like about Apple, but their views on privacy are not on that list.

    • xmunk@sh.itjust.works · 5 months ago

      If Apple is unable to properly monitor their service then they should stop offering it.

      I am a developer who works with file classification and I’m quite aware of how difficult this monitoring is… and that’s not a fucking excuse. Apple chose to become a monopoly company with immense power, and if they can’t properly run it then maybe we shouldn’t have such gigantic platforms.

      • Docus@lemmy.world · 5 months ago
        You are missing the point here that from a privacy point of view, Apple should not have the ability to see what I store on my phone, or by extension on iCloud. Just like the company that made my TV has zero business knowing what I watch on that TV.

        • xmunk@sh.itjust.works · 5 months ago
          Oh, I’m perfectly happy to exclude local phone contents from searching. People have all sorts of private images on their phones and it’s creepy as fuck to try and dig into that.

          Apple is trying to offer this service and it’s unreasonable and extremely fallible - they should stop that.

          • Docus@lemmy.world · 5 months ago
            I’m not sure what you are getting at. Apple offer storage and offer to encrypt that storage. You think that should be illegal? What about Apple offering storage and I encrypt stuff myself before storing it? What about a self storage company where I hire a container and put my own padlock on it? Or the self storage company has a duplicate key, but then I store a locked safe in it? And even if you could get Apple to change their ways: what about Amazon cloud storage - a lot of companies and agencies would be very unhappy if Amazon could scan their data. CSAM is a problem. But abandoning all privacy and security is not the solution.
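
The “encrypt stuff myself before storing it” option is what client-side (end-to-end) encryption amounts to: the provider only ever holds ciphertext it cannot read or scan. A minimal sketch of that idea using Apple’s CryptoKit; the function names and usage here are illustrative, not any real iCloud API:

```swift
import Foundation
import CryptoKit

// Client-side encryption sketch: seal a file with a key that never leaves the
// device, then hand only the ciphertext to the storage provider. Without the
// key, the provider cannot read, scan, or index the contents.
func sealForUpload(_ plaintext: Data, with key: SymmetricKey) throws -> Data {
    let box = try AES.GCM.seal(plaintext, using: key)
    return box.combined!   // nonce + ciphertext + auth tag in one blob
}

func openAfterDownload(_ blob: Data, with key: SymmetricKey) throws -> Data {
    let box = try AES.GCM.SealedBox(combined: blob)
    return try AES.GCM.open(box, using: key)
}

// Hypothetical usage: in practice the key would live in the Keychain rather
// than a local variable.
let key = SymmetricKey(size: .bits256)
let original = Data("private note".utf8)
let blob = try sealForUpload(original, with: key)
let roundTrip = try openAfterDownload(blob, with: key)
print(roundTrip == original)   // true
```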

        • xmunk@sh.itjust.works · 5 months ago
          I don’t offer such a service, so I don’t host users’ images.

          This is a hard problem to solve.