Ⓐ☮☭

  • 0 Posts
  • 310 Comments
Joined 2 years ago
Cake day: July 20th, 2023

  • We need to split clients from providers.

    Invidious and FreeTube could diversify to accept multiple alternative sources besides YouTube content.

    If the content exists on multiple platforms, the user could set a preference order and backup providers (a rough sketch of that fallback is at the end of this comment).

    As creators switch to smaller platforms, users of these clients are unaffected. It works similarly to our fediverse: a community can just change instances and everyone can still access it the same way.

    Creators could test migration by posting to multiple providers themselves. Those reliant on YouTube money specifically could premiere on YouTube and reupload elsewhere after some time.

    Those that don't rely on YouTube money can do the reverse, where the later YouTube upload serves more as an ad for their alternative main channel.
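
    A minimal sketch of what that client-side preference order with fallback could look like. The provider names, the lookupX functions and the shared content ID are hypothetical; nothing here reflects an existing FreeTube or Invidious API.

    ```typescript
    // Hypothetical sketch: try providers in the user's preferred order,
    // fall back to the next one if the content isn't there.
    interface Provider {
      name: string;
      // Resolve a cross-platform content ID to a stream URL, or null if not hosted here.
      resolve(contentId: string): Promise<string | null>;
    }

    // User-configured preference order.
    const providerPreference: Provider[] = [
      { name: "peertube.example", resolve: async (id) => lookupPeerTube(id) },
      { name: "odysee",           resolve: async (id) => lookupOdysee(id) },
      { name: "youtube",          resolve: async (id) => lookupYouTube(id) },
    ];

    async function resolveContent(contentId: string): Promise<string> {
      for (const provider of providerPreference) {
        try {
          const url = await provider.resolve(contentId);
          if (url) return url; // first provider that actually has the content wins
        } catch {
          // provider unreachable: fall through to the next one
        }
      }
      throw new Error(`No configured provider has content ${contentId}`);
    }

    // Stubbed lookups so the sketch is self-contained.
    async function lookupPeerTube(id: string): Promise<string | null> { return null; }
    async function lookupOdysee(id: string): Promise<string | null> { return null; }
    async function lookupYouTube(id: string): Promise<string | null> { return `https://youtube.example/watch?v=${id}`; }
    ```

    The hard part a real client would still need is mapping the same piece of content across platforms (some shared or creator-published ID), which the sketch just assumes exists.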





  • That must be some other system indeed.

    They don’t really provide much information on how the images were actually shared though.

    Maybe there is a machine learning algorithm that is trained to detect specific features in a random photo, but I can’t imagine it being accurate without frequent false positives.

    Could it be that once you have a certain number of “plausible” hits, a Google employee has to review them manually, and they quickly judged it wrongly? (Something like that threshold-then-review flow is sketched below.)

    Though that technically implies your private medical picture is now seen and possibly covertly copied by a (rogue) employee.
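
    Purely speculative sketch of what such a pipeline might look like: a classifier score, a cutoff for “plausible” hits, and escalation to a human once enough of them pile up. None of the names or numbers come from any real Google system.

    ```typescript
    // Speculative sketch: classifier score + threshold + human-review queue.
    interface ScanResult {
      imageId: string;
      score: number; // hypothetical classifier confidence, 0..1
    }

    const FLAG_THRESHOLD = 0.9; // assumed cutoff for a "plausible" hit
    const REVIEW_TRIGGER = 3;   // assumed number of hits before a human reviews

    function hitsNeedingReview(results: ScanResult[]): ScanResult[] {
      const plausible = results.filter((r) => r.score >= FLAG_THRESHOLD);
      // Only escalate to a manual reviewer once enough plausible hits accumulate.
      return plausible.length >= REVIEW_TRIGGER ? plausible : [];
    }

    // Example: three high-scoring images would all land in front of a reviewer.
    console.log(hitsNeedingReview([
      { imageId: "a", score: 0.95 },
      { imageId: "b", score: 0.92 },
      { imageId: "c", score: 0.97 },
    ]));
    ```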


  • They’re not even checking for CSAM.

    That would be near impossible considering the tech. Even on a normal portrait it is hard to judge age, let alone photos with more complex perspectives and only some body parts visible.

    What they are doing is using hashes of specific real pictures that the police know are commonly shared (a toy sketch of that kind of matching is at the end of this comment).

    Theoretically it could catch some careless content-consuming offenders. The worst offenders, those that produce new material, are beyond its scope.

    But also, obviously what Google gets is just the hashes and not the actual pics. If the police gave Google a hash targeting pics of Vance’s bald head or (trans-positive) memes, who would know?
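
    A toy sketch of that kind of hash-list matching, assuming plain SHA-256 for simplicity; deployed systems reportedly use perceptual hashes (e.g. PhotoDNA) so re-encoded or slightly edited copies still match, which is not shown here. The list entry and the sample bytes are placeholders.

    ```typescript
    // Toy sketch of hash-list matching against a list supplied by an authority.
    // The matcher never sees the source images, only their hashes.
    import { createHash } from "node:crypto";

    const knownHashes = new Set<string>([
      "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08", // placeholder entry
    ]);

    function sha256(data: Buffer): string {
      return createHash("sha256").update(data).digest("hex");
    }

    function isKnownImage(fileBytes: Buffer): boolean {
      // A match only says the file equals an entry on the list;
      // it says nothing about what that entry actually depicts.
      return knownHashes.has(sha256(fileBytes));
    }

    // Example with in-memory placeholder bytes instead of a real upload.
    console.log(isKnownImage(Buffer.from("not a real image")));
    ```

    That last comment is the whole point: whoever controls the list controls what gets flagged, and the operator matching the hashes cannot tell the difference.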