Anthropic users face a new choice: opt out or share your data for AI training | TechCrunch

News Source: TechCrunch
News Summary
- Anthropic is requiring all Claude users to decide by September 28 whether they want their conversations used to train AI models.
- Previously, users of Anthropic’s consumer products were told that their prompts and conversation outputs would be automatically deleted.
- The new policies apply to Claude Free, Pro, and Max users, including those using Claude Code.
- Business customers using Claude Gov, Claude for Work, or Claude for Education will be unaffected, mirroring how OpenAI shields its enterprise customers from data-training policies.
Anthropic is making some big changes to how it handles user data, requiring all Claude users to decide by September 28 whether they want their conversations used to train AI models.