According to IBM, 16% of data breaches now involve AI tools. You paste client information into ChatGPT, proprietary code into Claude, financial data into Gemini, and you may not know which of those platforms uses your input to train their next model. Here's the honest breakdown of every major AI tool's privacy policy as of April 2026.
- ChatGPT Free: Trains on your data by default. Opt-out available in settings.
- ChatGPT Plus: Trains on your data by default. Same opt-out available.
- Claude Free: Does NOT train on your conversations.
- Claude Pro: Does NOT train on your conversations.
- Gemini Free: Trains on your data. Human reviewers may read conversations.
- Gemini Advanced: Same training policy. Opt-out available.
- Perplexity: Does not use queries to train models.
- All tools: Temporarily process your data on their servers — nothing is truly "local" unless you run local models.
- Last verified: April 2026
The Privacy Matrix
ChatGPT (OpenAI): Your conversations are used to train future models by default. You can opt out: Settings → Data Controls → "Improve the model for everyone" → toggle off. But this disables conversation history. OpenAI's rollout of ads in February 2026 raised further data-collection concerns: ad targeting typically requires behavioral analysis of user interactions.
Claude (Anthropic): Anthropic does not train on your conversations across any tier. Their privacy policy explicitly states that user conversations are not used for model training. This is the cleanest privacy stance among the major providers.
Gemini (Google): Conversations with Gemini may be reviewed by human annotators and used to improve Google's AI models. Given Google's advertising business model, the data collection infrastructure is more extensive than most competitors. Opt-out is available but buried in settings.
Perplexity: Does not use your search queries to train models. However, queries are temporarily stored for service delivery. The company's business model is subscription-based, not advertising-based, which reduces the structural incentive for aggressive data collection.
Copilot (Microsoft): Enterprise and business tiers have strong privacy guarantees — data is not used for training. Consumer tier is less clear. Microsoft's enterprise agreements are among the most privacy-protective in the industry.
What You Should Never Paste Into Any AI Tool
Regardless of privacy policies, treat every AI interaction as potentially non-private. Never paste:
- Passwords, API keys, or authentication tokens
- Social Security numbers, financial account numbers, or personal medical records
- Confidential client documents or communications
- Proprietary source code that constitutes trade secrets
- Private communications from other people without their consent
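One practical safeguard is to scan text for obvious sensitive patterns before pasting it anywhere. Here is a minimal sketch in Python; the patterns and function names are illustrative assumptions (real secret scanners such as those built into CI tools use far larger rule sets), and a clean result does not guarantee the text is safe:

```python
import re

# Illustrative patterns only -- not an exhaustive or authoritative rule set.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US Social Security number
    "openai_key": re.compile(r"\bsk-[A-Za-z0-9]{20,}\b"),  # common API-key shape
    "aws_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),        # AWS access key ID shape
    "card_number": re.compile(r"\b(?:\d[ -]?){13,16}\b"),  # rough payment-card shape
}

def find_sensitive(text: str) -> list[str]:
    """Return the names of any sensitive patterns detected in text."""
    return [name for name, pat in PATTERNS.items() if pat.search(text)]

def safe_to_paste(text: str) -> bool:
    """True only if no known sensitive pattern was detected."""
    return not find_sensitive(text)
```

A check like this catches accidental pastes of structured secrets, but it cannot detect confidential prose (client names, trade-secret logic), so it complements rather than replaces the checklist above.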
The Privacy-Conscious AI Stack
If privacy is a priority, here's the stack that minimizes exposure:
- Claude Pro ($20/month) for most tasks: no training on conversations, no ads, cleanest privacy policy
- Perplexity for research: no query training, subscription-based business model
- Local models via Ollama for highly sensitive work: nothing leaves your device
Total cost: $20/month plus a computer capable of running local models.
For a full comparison of all AI tools including privacy policies, see our State of AI Models page.
The Bottom Line
Your AI tool's privacy policy matters because you're feeding it your thinking: your strategies, your client data, your competitive intelligence. The difference between a tool that trains on your data and one that doesn't is the difference between a contractor who photocopies everything before returning your files and one who hands them back untouched.
Choose tools whose business model doesn't depend on monetizing your input.
This is what we do every week. One deep dive on AI tools, workflows, and honest takes — no hype, no filler. Join us →
Disclosure: Some links in this article are affiliate links. We only recommend tools we've personally tested and use regularly. See our full disclosure policy.