cross-posted from: https://lemmy.world/post/28530807

Following last year’s announcement, Google Messages is rolling out Sensitive Content Warnings that blur nude images on Android…

  • f4f4f4f4f4f4f4f4@sopuli.xyz · 8 hours ago

    This image classification, which does not currently apply to videos, works on-device and is powered by Android System SafetyCore, which “doesn’t send identifiable data or any of the classified content or results to Google servers.”

    It’s opt-in for adults but automatically activated for children’s devices.
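
    SafetyCore’s on-device classifier API isn’t publicly documented, so here is a minimal hypothetical Kotlin sketch of the flow described above. Every class and function name is illustrative, not the real API:

    ```kotlin
    import android.graphics.Bitmap

    // Hypothetical stand-in for the on-device classifier; the real
    // SafetyCore interface is not public. No network access is implied.
    interface SensitiveContentClassifier {
        fun classify(image: Bitmap): Verdict
    }

    enum class Verdict { SAFE, SENSITIVE }

    // Mirrors the policy described above: forced on for child devices,
    // opt-in for adults, with the verdict computed entirely locally.
    fun shouldBlur(
        image: Bitmap,
        classifier: SensitiveContentClassifier,
        isChildDevice: Boolean,
        adultOptedIn: Boolean
    ): Boolean {
        val enabled = isChildDevice || adultOptedIn
        return enabled && classifier.classify(image) == Verdict.SENSITIVE
    }
    ```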

    • felsiq@lemmy.zip · 18 hours ago

      If this is done locally on-device with no reporting back to Google, it could be a really good feature. The way Apple does it isn’t censorship; it just blurs the picture to give you a heads-up: “hey, this is nudity, you wanna see this right now?” You can tap through to see the original whenever you want, and it’s a nice layer of protection to make sure you actively wanted to see whatever it was (and specifically right now). I hope Google’s implementing it the same way, but I don’t trust them enough to bet on it and I couldn’t be bothered to read the article lol
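
      For what it’s worth, the tap-to-reveal blur itself is straightforward on Android 12+ using the real RenderEffect API; here’s a minimal sketch (only the blurUntilConfirmed name is made up):

      ```kotlin
      import android.graphics.RenderEffect
      import android.graphics.Shader
      import android.os.Build
      import android.widget.ImageView

      // Blur the image until the user taps it: the Apple-style
      // "you wanna see this right now?" gate described above.
      fun ImageView.blurUntilConfirmed(radius: Float = 60f) {
          if (Build.VERSION.SDK_INT < Build.VERSION_CODES.S) return // RenderEffect needs API 31+
          setRenderEffect(RenderEffect.createBlurEffect(radius, radius, Shader.TileMode.CLAMP))
          setOnClickListener {
              setRenderEffect(null)    // user explicitly opted to view
              setOnClickListener(null) // restore normal tap behavior
          }
      }
      ```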

      • Kommeavsted@lemmy.dbzer0.com · 2 hours ago

        Silently installed device scanning software is spyware whether it sends data or not.

        The only reason it wouldn’t report is to avoid legal liability. Protections like this are thin and hinge upon the legal system determining whether the applet’s knowledge is an extension of Google’s.

      • 𞋴𝛂𝛋𝛆@lemmy.world · 3 hours ago

        It is not about the initial application. There is enormous power in the ability to control such an app and roll changes in over time.