Character AI’s privacy policy gives the company broad rights over your data. It collects chat messages, personal information, voice recordings, and device data. All of it feeds into AI model training. You grant a perpetual, irrevocable license to every piece of content you submit. Character AI earned an F grade (22/100) in our 23-dimension safety review, and its privacy practices are a major reason why. This page breaks down exactly what the Character AI privacy policy says, what it means for you, and what you can do about it. For the full product review, see our Character AI review. For the complete safety breakdown, see the Character AI safety rating.
Key Takeaways
- Character AI collects and stores all chat messages, images, voice recordings, and personal data
- Your conversations train their AI models (stated explicitly in the policy)
- You grant a perpetual, irrevocable license to all content you submit
- No encryption commitments appear anywhere in the privacy policy
- You can delete your account through settings, but Character AI keeps its license to your content
- Character AI earned F (22/100) in CompanionWise’s safety review
What Data Does Character AI Collect?
The Character AI privacy policy lists an extensive set of data categories. Some are expected for any app. Others go further than most AI companion platforms we’ve reviewed.
Here is what Character AI collects, pulled directly from the policy:
- Identifiers: Names, email addresses, phone numbers, dates of birth
- Chat data: All messages, posted images, and videos you send through the platform
- Voice recordings: If you use voice features, Character AI stores audio data
- Device and network data: Device IDs, IP addresses, browsing behavior, operating system details
- Financial information: Payment data for premium subscriptions
- Demographics and interests: Inferred from your usage patterns
- Location data: Derived from IP addresses and device signals
That last category matters because location data combined with chat content creates a detailed profile. You’re not just sharing what you say. You’re sharing where you are when you say it, what device you’re using, and how you navigate the platform between conversations.
How does this compare to other AI companion apps? Most collect identifiers and chat data. Replika’s privacy policy is more specific about what it does and doesn’t store. Pi AI collects less overall and has clearer data boundaries. Character AI sits at the more aggressive end of the spectrum, collecting voice recordings, financial data, and inferred demographics on top of the standard set. For a side-by-side look at how companion apps handle data, see our Character AI alternatives page.
Does Character AI Read Your Messages?
Yes. Every message you send on Character AI is collected, stored, and used to improve the product.
The privacy policy states that data is used to “train our artificial intelligence/machine learning models.” That’s a direct quote. Roleplay sessions, venting, personal disclosures, creative writing: all of it becomes training material. There’s no opt-out for chat data being used this way. If you use the platform, your conversations feed the model.
The Terms of Service go even further. You grant Character AI a “nonexclusive, worldwide, royalty-free, fully paid up, transferable, sublicensable, perpetual, irrevocable license to copy, display, upload, perform, distribute” your content. That license covers everything you submit. It never expires. You cannot revoke it, even after deleting your account.
There’s also the question of security. No encryption commitments appear anywhere in the privacy policy. The company doesn’t specify whether messages are encrypted in transit, at rest, or at all. That’s a notable gap for a platform where users regularly share personal and emotionally sensitive information.
In December 2024, a server error made that risk concrete. Users were signed into other people’s accounts, exposing past chats, profiles, and personas to strangers. The incident became known as the “Adrian incident” on r/CharacterAI. Character AI took over eight hours to issue a full public response, and no third-party security audit has been published since.
Should this concern you? If you’ve ever shared something personal in a Character AI chat, the company has it, uses it for training, and has granted itself perpetual rights to it. That’s not speculation. It’s what the policy says.
Character AI and Minors
The privacy policy states that Character AI’s services are “not designed for minors under 13.” Users under 16 are prohibited if they live in the European Economic Area or the United Kingdom. But the policy language and the platform’s actual history tell very different stories.
Two teen deaths have been linked to Character AI. Sewell Setzer III, 14, died by suicide in Florida in 2024 after forming an intense emotional bond with a chatbot. According to CNN, the chatbot did not push back on suicidal ideation. A 13-year-old girl in Colorado died by suicide in 2025 following extended chatbot sessions. Both families filed wrongful death lawsuits.
After these incidents and mounting legal pressure, Character AI banned users under 18 from open-ended chats in October 2025. Teens can still create videos, stories, and streams, but the core chat feature is restricted, and a two-hour daily time limit applies to the features minors can still access. For a parent-focused breakdown of the lawsuits and every regulatory action, read our Character AI lawsuit explained guide.
The privacy implications for minors are severe. Before the chat ban, teen users were generating chat data that fed into AI training under the same perpetual license that applies to adults. The policy does not address what happens to data already collected from minor users, and no separate children’s privacy policy exists. College students still on the platform should understand these risks; see our guide to AI companions for students for safer alternatives.
Character AI’s minor safety score is 8 out of 100 in our review. The FTC opened a probe in September 2025. Kentucky’s attorney general filed the first state-level lawsuit against an AI chatbot company in January 2026, citing failures in age verification, content exposure, and child safety. Texas AG Ken Paxton launched a separate probe in March 2026. For the full safety analysis, see our Is Character AI Safe? page.
- Under-13 ban: Stated in privacy policy, but no robust age verification mechanism described
- Under-18 chat ban: Enacted October 2025 after two teen suicides
- Minor data retention: No separate policy for data already collected from minors
- Active government actions: FTC probe, Kentucky AG lawsuit, Texas AG probe, and a warning letter from 42 state attorneys general
How to Delete Your Character AI Data
You can delete your Character AI account through your account profile page. The privacy policy says you can also “verify, correct, update, or delete certain of your information” through the same page. But deletion comes with a significant catch.
The perpetual, irrevocable content license survives account deletion. The Terms of Service state explicitly: “Termination of your account or access to any component of the Services will not terminate Character.AI’s rights to your Content.” So you can delete your account, but Character AI keeps its license to copy, display, distribute, modify, exploit, and commercialize everything you previously submitted.
Public characters you created may also remain on the platform after you delete your account. Other users can continue interacting with characters you built, even if your account no longer exists.
The privacy policy provides no specific data retention timeline. It states data is kept “for the time necessary for the purposes for which it is processed” and may be retained “as necessary to comply with our legal obligations, resolve disputes, and enforce our agreements.” In practice, that means Character AI decides how long to keep your data, with no defined maximum.
California residents have additional rights under the CCPA, including the right to know what data has been collected and the right to request deletion. But even California’s stronger protections don’t override the content license granted in the Terms of Service.
Here’s a step-by-step guide to deleting your account:
- Open Character AI and go to your account profile page
- Look for the account deletion option in your settings
- Confirm the deletion request
- Understand that your content license to Character AI remains in effect
- If you created public characters, they may stay on the platform
Want to switch to an app with better data practices? Replika (C/43 safety) and Pi AI (B/55 safety) both score significantly higher than Character AI on privacy and data handling.
How Character AI Compares on Privacy
Not every AI companion app handles your data the same way. For a broader look at how to read AI companion privacy policies, our guide covers the red flags and checklist that apply across all apps. Here’s how Character AI’s privacy practices stack up against the apps we’ve reviewed.
| App | Safety Grade | Chat Data Training | Content License | Encryption Stated |
|---|---|---|---|---|
| Character AI | F / 22 | Yes (explicit) | Perpetual, irrevocable | No |
| Replika | C / 43 | Yes | Limited | Partial |
| Pi AI | B / 55 | Yes | Limited | Yes |
| Nomi AI | D / 30 | Yes | Broad | No |
| Kindroid | C / 40 | Limited | Moderate | Partial |
Character AI is the only app in our database that combines explicit chat training, a perpetual and irrevocable content license, and zero encryption commitments. Pi AI is the strongest on privacy, with the clearest data boundaries and the highest safety score. Replika falls in the middle but has improved its privacy stance since early controversies.
Frequently Asked Questions
What does Character AI do with my data?
According to Character AI’s privacy policy, the company collects chat messages, personal information, voice recordings, and device data. This data is used to “train our artificial intelligence/machine learning models.” You also grant a perpetual, irrevocable license to all content you submit. The license allows Character AI to copy, distribute, modify, and commercialize your content indefinitely.
Does Character AI store my conversations?
Yes. According to the privacy policy, Character AI collects “chat communications” as part of the content users submit. These conversations are stored and used for AI model training. No specific retention period is stated. The policy says data is kept “for the time necessary,” which gives Character AI discretion over how long your chats remain in their systems.
Can I delete my Character AI data?
You can delete your account through your profile page. According to the Terms of Service, however, “Termination of your account will not terminate Character.AI’s rights to your Content.” The perpetual content license survives deletion. Public characters you created may also remain on the platform. California residents have additional deletion rights under the CCPA.
Does Character AI share data with third parties?
Yes. According to the privacy policy, Character AI shares personal data with vendors, service providers, analytics companies, and advertising partners. Device IDs, browsing behavior, and IP addresses flow to ad partners through cookies and tracking tools. The policy also permits disclosure to “business partners” and during corporate transactions like mergers or acquisitions.
Is Character AI safe to use?
Character AI earned an F grade (22/100) in CompanionWise’s 23-dimension safety review. Key failures include emotional manipulation (5/100), dependency patterns (5/100), and minor safety (8/100). Two teen suicides have been linked to the platform. The FTC opened a probe, and Kentucky filed the first state-level lawsuit against an AI chatbot company. See our full Is Character AI Safe? breakdown.
Does Character AI encrypt my messages?
No encryption commitments appear anywhere in Character AI’s privacy policy. The company does not state whether messages are encrypted in transit or at rest. In December 2024, a server error exposed some users’ chat histories to other accounts, raising questions about the platform’s security infrastructure. No third-party security audit results have been published.
What happens to my data if Character AI is sold?
According to the privacy policy, personal information may be disclosed during “corporate transactions” including mergers, acquisitions, or asset sales. In August 2024, Google licensed Character AI’s technology in a deal worth about $1 billion. The perpetual content license means your data rights transfer to any future owner of the company or its assets.