After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis (arstechnica.com)
Posted by basiclemmon98@lemmy.dbzer0.com to Not The Onion@lemmy.world (English) · 19 hours ago · 42 comments
NutWrench@lemmy.ml · 11 hours ago
An early AI was once asked, “Bob has a headache. What should Bob do?” And the AI replied, “Bob should cut off his own head.” The point being: AIs will give you logical solutions to your problems, but they won’t always give you practical ones.
krunklom@lemmy.zip · 3 hours ago
Except they won’t always give you logical answers, either.
RedditRefugee69@lemmynsfw.com · 10 hours ago
Yes, eating one small rock a day is logical.