I don’t think this is the fault of the AI yet, unless the chat logs are released and they show it literally trying to get him to commit. What it sounds like is a kid who needed someone to talk to and didn’t get it from the people around him.
That said, it would be good if cAI monitored for suicidal ideation. Most of these AI companies are pretty hands-off with their AI and what gets said to it.
Yeah, not cut and dried at all. OP’s article didn’t include the chat logs. From what’s been reported, the bot told him not to commit but did demand loyalty. He changed his wording from “I want a painless death” to “I want to come home to you” to get it to say what he wanted.