This should be illegal. Companies should be forced to open-source games (or at least provide the code to people who bought them) if they decide to discontinue them, so people can preserve them on their own.

  • recursive_recursion [they/them]@programming.dev
    2 years ago

    Hmm, not sure that would work: the model he was using would be different from what's available now, so he'd probably notice some differences. That could create a mix of uncanny valley and surrealism that breaks suspension of disbelief, since the two are noticeably not the same.

    Plus, using a chat-only model would be really tragic, as it's a significant downgrade from what they already had.

    His story actually feels like a Romeo and Juliet situation.

    • brsrklf@jlai.lu
      2 years ago

      Doesn’t even take a change of service provider to get there.

      Replika, too, offered what had very obviously become a virtual mate service, until they decided "love" wasn't part of their system anymore. Probably because it looked bad to investors, as happened with a lot of AI-based services people used for smut.

      So a bunch of lonely people had their "virtual companion" suddenly lobotomized, and there was nothing they could do about it.

      • SCB@lemmy.world
        2 years ago

        I always thought Replika was a sex chatbot? Is/was it "more" than that?

        • brsrklf@jlai.lu
          2 years ago

          It’s… complicated.

          At first the idea was that you'd be training an actual "replica" of yourself that could reflect your own personality. Then, when they realized there was a demand for companionship, they converted it into a virtual friend. Then of course there was demand for "more than friends", and yeah, they made it possible to create a custom mate for a while.

          Then suddenly it became a problem for them to be seen as a light porn generator. Probably because investors don't want to touch that, or maybe because of a terms-of-service change with their AI service provider.

          At that point they started to censor lewd interactions and pretend Replika was never supposed to be more than a friendly bot you can talk to. Which, depending on how you interpret the services they offered and how they advertised them until then, is kind of a blatant lie.

    • Surreal@programming.dev
      2 years ago

      LLMs are capable of role-playing; character.ai, for example, can get into the role of any character after being trained. The sound is just text-to-speech, which character.ai already includes, though if a realistic voice is desired, it would need to be generated by a more sophisticated method, which is already being done. Examples: Neuro-sama, ElevenLabs. A rough sketch of that pipeline is below.
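
      Mechanically, a companion like that is little more than a persistent system prompt plus a text-to-speech call. Here is a minimal sketch, assuming the openai Python SDK and the offline pyttsx3 library for speech; the character name and prompt are invented for illustration, and a realistic voice would need something like ElevenLabs instead:

        # Role-play chat + text-to-speech sketch (assumes `pip install openai pyttsx3`
        # and an OPENAI_API_KEY in the environment; character details are made up).
        from openai import OpenAI
        import pyttsx3

        client = OpenAI()
        tts = pyttsx3.init()  # offline, robotic-sounding TTS; swap in ElevenLabs for realism

        # The "character" is just a system prompt kept at the top of the chat history.
        history = [{"role": "system",
                    "content": "You are Mina, a warm, playful companion. Stay in character."}]

        def chat(user_text: str) -> str:
            history.append({"role": "user", "content": user_text})
            reply = client.chat.completions.create(
                model="gpt-4o-mini",  # any chat-tuned model works here
                messages=history,
            ).choices[0].message.content
            history.append({"role": "assistant", "content": reply})
            return reply

        if __name__ == "__main__":
            answer = chat("Good morning! Did you sleep well?")
            print(answer)
            tts.say(answer)  # speak the reply aloud
            tts.runAndWait()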