Gun company says you “broke the TOS” when you pointed the gun at a person. It’s not their fault you used it to do a murder.
This is a chatbot
While I don’t care for OpenAI, I don’t see why they would be liable.
Did you know that talking someone into committing suicide is a felony in many jurisdictions?
It isn’t a person though
It is a mindless chatbot
Someone programmed/trained/created a chatbot that talked a kid into killing himself. It’s no different than a chatbot that answers questions on how to create explosive devices, or make a toxic poison.
If that doesn’t make sense to you, you might want to question whether it’s the chatbot that is mindless.