• webghost0101@sopuli.xyz
    1 day ago

    Pretty sure it’s in the ToS that it can’t be used for therapy.

    It used to be even worse: older versions of ChatGPT would simply refuse to continue the conversation at any mention of suicide.

    • jagged_circle@feddit.nl
      23 hours ago

      What? It’s a virtual therapist. That’s the whole point.

      I don’t think you can sell a sandwich and then write on the back “this sandwich is not for eating” to get out of a food-poisoning case.