Another AI fail. Letting AI write code and modify your file system without sandboxing and backups. What could go wrong?
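
A cheap version of the "backups" half, as a sketch: snapshot the project directory before every agent run, so a bad run costs a restore instead of the codebase. The paths and helper name here are illustrative, not from the article.

```python
# Minimal sketch: timestamped copy of the project tree before an AI agent
# is allowed to touch it. Destination path is an assumption.
import shutil
import time
from pathlib import Path

def snapshot(project: str, backup_root: str = "/tmp/ai-backups") -> Path:
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / f"{Path(project).name}-{stamp}"
    shutil.copytree(project, dest)  # fails loudly if dest already exists
    return dest

if __name__ == "__main__":
    print("restore point:", snapshot("./my-project"))
```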

  • NeatNit@discuss.tchncs.de · ↑23 ↓2 · 2 days ago

    None of this would happen if people recognized that, at best, AI has the intelligence level of a child. It has a lot of knowledge (some of which is hallucinated, but that’s beside the point) but none of the responsibility that you’d hope an adult would have. It’s also not capable of learning from its own mistakes or being careful.

    There’s a whole market for child safety stuff: corner foam, child-proof cabinet locks, power plug covers, etc… You want all of that in your system if you let the AI run loose.
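
    The software equivalent of that corner foam, as a rough sketch: run whatever the model generated in a throwaway process with hard resource caps, never in your own shell. This is a minimal POSIX-only illustration, not real isolation (a container or VM would be the actual cabinet lock), and the scratch directory is made up.

    ```python
    # Rough sketch: execute AI-generated code with CPU/memory caps and a
    # timeout, in a scratch directory. POSIX-only; not a real sandbox.
    import resource
    import subprocess

    def child_proof():
        # Hard caps enforced by the OS on the child process.
        resource.setrlimit(resource.RLIMIT_CPU, (5, 5))               # 5 CPU-seconds
        resource.setrlimit(resource.RLIMIT_AS, (512 * 1024**2,) * 2)  # 512 MiB

    def run_generated(script: str) -> subprocess.CompletedProcess:
        return subprocess.run(
            ["python3", "-I", script],  # -I: isolated mode, ignores env/site dirs
            capture_output=True,
            text=True,
            timeout=10,                 # wall-clock limit
            preexec_fn=child_proof,
            cwd="/tmp/ai-playpen",      # made-up scratch dir, not your real tree
        )

    result = run_generated("generated.py")
    print(result.returncode, result.stderr)
    ```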

        • Lime Buzz (fae/she)@beehaw.org · ↑2 · 1 day ago (edited)

          It isn’t pedantry in the case I’m making. My point is more moral/ethical: it’s unfair, and probably ableist, to compare people who actually do hallucinate with something that doesn’t.

          It is robbing the word of any value or meaning and kind of making fun of them in the process, downplaying what they go through.

          • NeatNit@discuss.tchncs.de · ↑3 · 1 day ago

            I see, that’s different from how I interpreted it. Thanks for clarifying.

            I don’t really see it that way. To me it’s not downplaying anything. AI ‘hallucinations’ are often disastrous, and they can and do cause real harm. The use of the term in no way makes human hallucinations sound any less serious.

            As a bit of a tangent, unless you experience hallucinations yourself, neither you nor I know how those people who do feel about the use of this term. If life has taught me anything, it’s that they won’t all have the same opinion or reaction anyway. Some would be opposed to the term being used this way, some would think it’s a perfect fit and should continue. At some point, changing language to accommodate a minority viewpoint just isn’t realistic.

            I don’t mean this as a blanket statement though; there are definitely cases where I think a certain term is bad for whatever reason and agree it should change. It’s a case-by-case thing. The change from master to main as the default branch name in git springs to mind. In that case I actually think master is minimally offensive, but literally no meaning is lost by switching to main, and that one is definitely not offensive, so I support the switch. ‘Hallucination’ is just too good a fit, and is also IMO not offensive. ‘Confabulation’ isn’t quite as good.

      • megopie@beehaw.org · ↑2 · 1 day ago (edited)

        Exactly. They’re just probabilistic models: LLMs output something that statistically could be what comes next. That statistical process doesn’t capture any real meaning or conceptualization, just vague associations of which words are likely to show up, and in what order.

        What people call hallucinations is just the system’s functional capability diverging from their expectations of it: they expect it to think and understand, when all it is doing is outputting a statistically likely continuation.
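
        To make that concrete, here is a toy version of “statistically likely continuation”: a hand-written bigram table sampled word by word. The corpus and probabilities are invented for illustration; a real LLM does the same step with a learned distribution over subword tokens, and nothing here is from an actual model.

        ```python
        # Toy sketch of next-word sampling from a hand-rolled bigram model.
        import random

        # P(next word | current word), made up for illustration.
        bigram = {
            "the":  {"cat": 0.5, "dog": 0.3, "code": 0.2},
            "cat":  {"sat": 0.7, "ran": 0.3},
            "dog":  {"ran": 0.6, "sat": 0.4},
            "code": {"ran": 0.5, "sat": 0.5},
        }

        def continuation(word: str, length: int = 4) -> list[str]:
            out = [word]
            for _ in range(length):
                dist = bigram.get(out[-1])
                if dist is None:  # no recorded associations: stop
                    break
                words = list(dist)
                weights = list(dist.values())
                # Sample the next word by probability alone; no meaning involved.
                out.append(random.choices(words, weights=weights)[0])
            return out

        print(" ".join(continuation("the")))
        ```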

    • Jo Miran@lemmy.ml · ↑7 · 2 days ago (edited)

      A child, on acid and meth. You should never let it run loose, no matter how many safeguards. Not if your code is business-critical.