OpenAI Chief Executive Officer (CEO) Sam Altman has cautioned ChatGPT users who treat the chatbot as a confessional that their chats could be used as evidence in court if they become relevant to a lawsuit or criminal case.
Speaking on the This Past Weekend podcast on June 25, Altman revealed that there is currently no legal or policy framework protecting users' chats from scrutiny in the event of a lawsuit.
Basically, no legal privilege shields ChatGPT discussions the way doctor–patient or attorney–client exchanges are protected.
“Right now… if you talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, we could be required to produce that, and I think that’s very screwed up,” Altman said.
In a nutshell, users should not expect any legal confidentiality for their conversations with ChatGPT, even though some have turned the chatbot into a therapist of sorts, confiding their deepest secrets to it.
The OpenAI CEO himself highlighted how personal users' relationship with ChatGPT has become. "People talk about the most personal stuff in their lives to ChatGPT," he said.
"People use it—young people, especially, use it—as a therapist, a life coach; having these relationship problems and [asking] what should I do? And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT."
This means that if a user is party to a lawsuit, OpenAI could be required to produce their ChatGPT conversations as evidence, something Altman termed "very screwed up."
Although OpenAI typically deletes chats from the free ChatGPT tier within 30 days, it sometimes retains them for legal and security reasons.
Currently, the AI developer is embroiled in a lawsuit with The New York Times that requires it to preserve the conversations of millions of ChatGPT users, excluding enterprise customers.
Whether the chats touch on mental health, emotional advice, or companionship, they can be produced in court or disclosed to other parties in the event of a lawsuit.
This is unlike messaging apps such as WhatsApp, which use end-to-end encryption to prevent third parties from reading or accessing messages; OpenAI, by contrast, can read every conversation between users and ChatGPT.