RobotToaster@mander.xyz to Not The Onion@lemmy.world · English · 2 months ago
OpenAI says dead teen violated TOS when he used ChatGPT to plan suicide (arstechnica.com)
Cross-posted to: fuck_ai@lemmy.world, technology@lemmy.world, nottheonion@sh.itjust.works, nottheonion@lemmy.ml, aboringdystopia@mander.xyz
Joe@lemmy.world · English · 2 months ago
It certainly should be designed for those types of queries though. At the least, it should avoid discussing them.
Wouldn’t OpenAI be liable if someone planned a terror attack with ChatGPT?