
Anthropic to Use Claude Chats for AI Model Training
Amazon-backed AI start-up Anthropic is changing how it handles user data. In a recent blog post, the company announced that conversations with its chatbot Claude will be used to train and improve future versions of its systems unless users opt out. Users have until September 28 to decide whether their chats will be included in AI training.
The company announced the revised Consumer Terms and new Privacy Policy on August 28. According to the blog post, these updates apply to users on the Claude Free, Pro, and Max plans.
A TechCrunch report indicates that Anthropic previously excluded consumer chat data from model training. Now, it plans to use user conversations and coding sessions to train its AI systems. The company also stated: 'We are extending data retention to five years if you allow us to use your data for model training.'
A Verge report suggests users will see a pop-up titled 'Updates to Consumer Terms and Policies' in large text. Below, it states, 'An update to our Consumer Terms and Privacy Policy will take effect on September 28, 2025. You can accept the updated terms today.' At the bottom, there is a prominent black 'Accept' button.
Anthropic added in the blog post that if users click 'Accept' now, the company will start using their data immediately to train its AI models.
According to The Verge, in smaller print, the pop-up also includes a line that says, 'Allow the use of your chats and coding sessions to train and improve Anthropic AI models,' accompanied by a toggle switch that defaults to 'On.' As a result, many users may click the large 'Accept' button without noticing or changing the toggle.
To opt out, flip the toggle to 'Off' when the pop-up appears. If you have already clicked 'Accept' without noticing, you can change your choice later: navigate to the Privacy Settings section and toggle 'Off' under 'Help improve Claude.' You can update this decision anytime through privacy settings, but note that the change applies only to future data; data already used for training cannot be withdrawn.