  • MKBandit@lemmy.world · ↑21 · 1 year ago

    I suggest everyone file an official comment with the FTC. This is beyond ridiculous. If anything needs to change, the ESRB needs to actually properly market the ratings, because all the problems come from parents who just buy games without any research and then complain, having never read what the M rating on the game means.

  • FangedWyvern42@lemmy.world · ↑16 · 1 year ago

    I feel like this could be bypassed by just photographing a parent and submitting their image. Not to mention how fucking horrifying the whole “database of minors’ faces” thing is, even with the claim that they will delete all the images.

  • 47 Alpha Tango@lemmy.zip · ↑13 · 1 year ago

    The process steps of verification include (roughly sketched in code at the end of this comment):

    The user takes a photo of someone old enough to play the game

    Tells the game “yup, that’s me”

    The system then checks if there’s a live human face in the frame

    The image is then uploaded to Yoti’s backend server for estimation

    Orrrrr

    They’re building the largest database of pictures of minors on the planet.
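
    For what it’s worth, the four steps boil down to something like the sketch below. Every name in it is made up for illustration (the endpoint, field names, and age threshold are not Yoti’s actual API), but it shows how little the flow ties the face in the photo to the person holding the controller:

    ```python
    # Rough sketch of the described flow; the endpoint and field names
    # are placeholders, not Yoti's real API.
    import requests

    ESTIMATION_URL = "https://example.invalid/v1/age-estimation"  # placeholder

    def verify_age(image_path: str) -> bool:
        """Steps 1-2: 'take a photo of someone old enough' and tell the
        game 'yup, that's me'. Nothing stops the image from being a photo
        of literally anyone."""
        with open(image_path, "rb") as f:
            image_bytes = f.read()

        # Step 3 ('is there a live human face in the frame?') happens on the
        # client; see the liveness discussion further down the thread.

        # Step 4: upload to the vendor's backend for an age estimate, then
        # trust them to delete the image afterwards.
        resp = requests.post(ESTIMATION_URL, files={"image": image_bytes})
        resp.raise_for_status()
        return resp.json().get("estimated_age", 0) >= 17

    # verify_age("photo_of_my_dad.jpg")  # passes, presumably
    ```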

    • Poggervania@kbin.social · ↑5 · 1 year ago

      Oh Lordy, can you imagine if they made a combo system of having your console do a facial scan whilst drinking a verification can?

  • SatanicNotMessianic@lemmy.ml · ↑6 · 1 year ago

    Apple had to go through a tremendous amount of engineering effort, custom hardware development, and testing to get FaceID to work reliably. This sounds like a software-based system that uses a user’s camera. I’m also not exactly certain what these steps mean without making a lot of assumptions.

    The user takes a photo of themselves

    Okay? MS developed an age estimator a decade or so ago. They put up a guess-my-age website that you could upload photographs to. It was relatively accurate (at least in my testing) in that it scored consistently and was usually within a few years of the right answer. I’m sure they’re better now.

    The system then checks if there’s a live human face in the frame

    So it’s looking at a live video? What’s the picture for, then? Does it confirm the picture is the same person as in the video, in which case why would someone have to upload a picture?

    The image is then uploaded to Yoti’s backend server for estimation

    Again, fine. Privacy concerns aside, that’s what we’d expect.
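
    If I had to guess at how the photo and the “live human face” check fit together, it would be something like the snippet below: grab a short burst of frames, require a detected face plus some frame-to-frame motion, then treat one of those frames as “the photo” that gets uploaded. That’s pure speculation on my part, and the snippet is only a crude stand-in (a Haar cascade plus a motion threshold is not a real liveness check):

    ```python
    # Speculative reading of "take a photo" + "check for a live human face".
    # Not based on Yoti documentation; all thresholds are arbitrary.
    import cv2          # opencv-python
    import numpy as np

    def capture_burst(n_frames: int = 10, camera_index: int = 0):
        """Grab a short burst of frames instead of a single still."""
        cam = cv2.VideoCapture(camera_index)
        frames = []
        for _ in range(n_frames):
            ok, frame = cam.read()
            if ok:
                frames.append(frame)
        cam.release()
        return frames

    def looks_live(frames, min_motion: float = 1.0) -> bool:
        """Require a face in every frame plus *some* frame-to-frame change,
        so a printed photo held perfectly still would (in theory) fail."""
        detector = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
        if not all(len(detector.detectMultiScale(g)) > 0 for g in grays):
            return False
        motion = np.mean([np.abs(a.astype(float) - b.astype(float)).mean()
                          for a, b in zip(grays, grays[1:])])
        return motion > min_motion

    # "The photo" would then just be one frame from the burst, e.g.
    # frames[len(frames) // 2], which is what gets uploaded for estimation.
    ```

    Which would also mean any decent phone filter, or a video of someone else played at the camera, sails right through, which gets to the spoofing point below.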

    But how many gaming devices actually have a video camera? Obviously phones and tablets do, as do most laptops, but neither my Switch nor my Steam Deck has one, nor does my recently retired gaming PC. Am I supposed to use my phone even if I’m playing on another device? Am I going to have to periodically re-authenticate?

    I’m not even talking about spoofing here, which given the ubiquity of filters for phone cameras would be trivial.

    This strikes me as someone’s project that was sold to management as a good idea and which now needs to find an application.

    I’m also going to make the very safe assumption that, despite their claims, their real-world performance across ethnicities is not going to match the confident statements in their application. That’s been a pretty constant issue with this kind of system; vendors make the same accuracy claims about police facial recognition databases despite being repeatedly proven wrong.
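
    The only way I’d take that kind of claim seriously is per-group error numbers on an independent, labeled test set. Something like the toy check below (the file and column names are invented for the example):

    ```python
    # Toy check of an "accuracy across ethnicities" claim: per-group
    # mean absolute error on an independent, labeled evaluation set.
    # The file name and column names here are made up.
    import pandas as pd

    df = pd.read_csv("independent_eval_set.csv")  # needs true_age, estimated_age, group

    df["abs_error"] = (df["estimated_age"] - df["true_age"]).abs()
    per_group = df.groupby("group")["abs_error"].agg(["mean", "count"])
    print(per_group)

    # If the mean error (and the rate of landing on the wrong side of a 17/18
    # cutoff) isn't roughly uniform across groups, the claim doesn't hold up.
    ```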

    The proposal also said, “To the extent that there is any risk, it is easily outweighed by the benefits to consumers and businesses of using this [Facial Age Estimation] method.”

    I really, really want to know what the actual harm being prevented is supposed to be such that it outweighs any other concerns. I don’t mean that some ten year old might play Cyberpunk. I mean actual research showing a quantified harm associated with it, along with harm reduction realized by implementing a parental-permission based age verification system.

    • CosmicSploogeDrizzle@lemmy.world (OP) · ↑4 · 1 year ago

      Time to put these cameras in movie theaters. Can’t be watching R rated films if under 17. Time for cameras on E readers, can’t be reading inappropriate material without regulatory consent. Time to take a photo of yourself, the new Travis Scott album just dropped, but headphones only there buddy. Can’t have anyone else listening to it unless they authenticate.

    • Grangle1@lemm.ee · ↑2 ↓1 · 1 year ago

      The “harm”? Litigation $$$ paid to parents of 10-year-olds playing Cyberpunk; that’s all it is, really. The ESRB is covering its butt because it knows the rating system is, and always has been, as useless as the parental advisory stickers on CD cases. Parents don’t know it exists, retailers don’t care as long as they get their money, and devs/publishers only care about it as something to point to in order to avoid censorship, all while they blatantly market games like Cyberpunk to 10-year-olds. They know parents will buy the game if the kid nags them enough, or that the kids will buy it themselves from some teenager at GameStop who’s paid too little to care, or just lie about their age on Steam and use their parents’ credit card. This is just about the worst way to cover their butts, though.

  • Wakachaka@lemmy.world · ↑3 ↓1 · 1 year ago

    I can’t speak to this one specifically, but these face recognition systems have a history of bias in favor of white people, tending to perform noticeably worse on everyone else.

  • Metal Zealot @lemmy.world · ↑2 ↓1 · 1 year ago

    Do you want masks to stay around forever?? Cuz that’s how you get masks to stay around forever