
ChatGPT Privacy Concerns: Your Words Could Be Used in Court

  • Writer: Sammy Salmela
  • 3 days ago
  • 2 min read
[Illustration: a person chatting with an AI assistant while being watched through a magnifying glass, symbolising the privacy risks of AI.]

Article with AI Analysis

Date: 29 July 2025

Source: Cointelegraph, Martin Young


Introduction

What you tell ChatGPT might not stay private. In a recent podcast, OpenAI CEO Sam Altman admitted that conversations with AI don’t have the same privacy protections as those with your doctor, lawyer, or therapist. This raises a serious question: can what you share with a chatbot be used against you in a legal case?


No Legal Privilege for ChatGPT Conversations

Sam Altman described the situation as a “huge issue.” When you speak to a doctor or therapist, laws protect those conversations. But right now, if you open up to ChatGPT, those same protections do not exist. In a lawsuit, your chat history could potentially be subpoenaed. “We could be required to produce that,” Altman said. This vulnerability stands in sharp contrast to the growing trust many users place in AI as a source of advice, whether emotional, medical, or financial.


A Call for AI Privacy Reform

Altman believes that AI should be treated like any other confidential relationship. He said, “We should have the same concept of privacy for your conversations with AI that we do with a therapist.” However, there is currently no legal framework in place to guarantee this. Policymakers seem to agree that urgent action is needed, but change is still pending. Until then, many users may unknowingly expose sensitive personal or business information.


The Trade-Off Between Privacy and Surveillance

While Altman acknowledged the need for some surveillance in a world of increasing AI use, to prevent misuse such as terrorism, he also warned of a slippery slope. “History is that the government takes that way too far,” he said. This means that while collective safety is important, overreach by authorities could compromise individual freedoms. For now, users are caught in a privacy grey zone.


AI-Powered Sentiment Analysis

Our AI analysis of this article revealed:


  • Sentiment Score: 0.21. Slightly concerned but neutral in tone overall.

  • Financial Sentiment: 0.02. Minimal financial relevance, but important for tech investors.

  • Polarity Score: 0.11. Mildly leaning towards negative due to privacy risks.

  • Subjectivity Score: 0.47. Moderate subjectivity, reflecting quotes and opinions from Altman.


These scores suggest that this article raises awareness about an overlooked risk, but does so in a calm and informative tone. It highlights a serious gap in user protections without resorting to alarmism.
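For readers curious how scores like these can be produced, below is a minimal, self-contained sketch of lexicon-based scoring. The word lists and weights are illustrative assumptions for this example only; they are not the model or vocabulary our analysis pipeline actually uses.

```python
# Illustrative lexicon-based sentiment scoring.
# POLARITY maps words to sentiment weights; SUBJECTIVE flags opinion markers.
# Both lists are hypothetical examples, not a production vocabulary.

POLARITY = {"serious": -0.3, "risk": -0.4, "protect": 0.3, "trust": 0.4, "concern": -0.3}
SUBJECTIVE = {"believes", "should", "may", "seem", "important"}

def score(text: str) -> tuple[float, float]:
    """Return (polarity, subjectivity) for a piece of text.

    Polarity averages the weights of matched sentiment words;
    subjectivity is the fraction of words that are opinion markers.
    """
    words = [w.strip(".,!?\u201c\u201d").lower() for w in text.split()]
    hits = [POLARITY[w] for w in words if w in POLARITY]
    polarity = sum(hits) / len(hits) if hits else 0.0
    subjectivity = sum(1 for w in words if w in SUBJECTIVE) / max(len(words), 1)
    return round(polarity, 2), round(subjectivity, 2)
```

For example, `score("This raises a serious risk")` averages the weights of "serious" and "risk" to give a negative polarity, while a sentence with no opinion markers scores zero subjectivity. Real analysis tools are far more sophisticated, but the core idea of mapping words to weighted scores is the same.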


Read More

Explore more articles on the Hikarinova Blog.


We’re Getting Ready to Launch Our Test Pilot Program in the U.S. and Asia

We’re about to open up early access for a small group of test pilots in the U.S. and Asia.

If you’re curious about where automated trading is headed and want to be part of building something from the inside, this is your chance. As a test pilot, you’ll get hands-on access, early features, and a direct line to the team. Your feedback will help shape the product before full release.

More details will be shared soon. Stay tuned.



Disclaimer

This article was generated using AI and reviewed for accuracy. The information presented is for educational purposes only and should not be construed as financial advice. Always consult with a professional before making investment decisions.


