Anthropic changes its data policy
🕓 Read time: ~2 min
🚨 Quick AI Alert: Anthropic, the company behind Claude, has just changed its data policy and it’s a big shift you need to know about.
What changed:
Until now, Anthropic stood out because chats were not used for training by default. Data was retained for only 30 days, and you had to explicitly opt in if you wanted your chats used to improve their models.
As of this new policy, that has flipped. Your chats will now be used to train Claude UNLESS you actively opt out. In addition, chat data will now be stored for up to 5 years instead of just 30 days.
For many coaches and consultants, this is a 180-degree turn. From 'Active Opt-In' to 'Active Opt-Out' (= the OpenAI way).
You should have seen a pop-up advising you of Anthropic's updated T&Cs when you last logged into your Claude account. If you accidentally accepted them without toggling off the training-data setting, here's what to do (by 28 September):
- Log into your Anthropic account
- Go to Settings > Privacy
- Toggle OFF "Help improve Claude" if you prefer to keep chats private and limit data retention to 30 days
✅ Key Takeaway: AI tools change fast. What was "safe yesterday" can shift overnight. Staying alert and reviewing your settings is the simplest way to keep control.
Til next time, stay smart and intentional,
Elena