How AI Companion Apps Use Your Data

Every AI companion app you download starts collecting data the moment you create an account. Conversation logs, photos, voice recordings, device identifiers, location pings, usage patterns. Most users never read the privacy policy. We did. Over the past three months, we analyzed the privacy policies, terms of service, and data practices of 11 major AI companion apps as part of our 23-dimension Safety Index. The findings weren’t reassuring. Only one app earned above a C for data safety. Eight earned a D or F. This guide breaks down exactly what these apps collect, where that data goes, who can see it, and what you can do about it.

Key Takeaways

  • Every app logs your full conversation history. All 11 apps we reviewed store complete chat transcripts on their servers. None offer true local-only storage.
  • Most apps collect far more than just messages. Device data, IP addresses, usage patterns, and (in some cases) photos, voice recordings, and location are all fair game.
  • Only Pi AI earned above a C for safety with a B/55 score. Replika (C/43) and Kindroid (C/40) follow. Eva AI scored the lowest at F/10.
  • Several apps use your conversations for AI training with no opt-out mechanism, or bury the opt-out in settings most users never find.
  • GDPR and CCPA give you real rights, but exercising them ranges from straightforward (Pi AI) to nearly impossible (Eva AI, Romantic AI).
  • Check your privacy settings before your next conversation. We list the specific steps for each app below.

What Data Do AI Companion Apps Actually Collect?

The short answer: more than you’d expect from a chat app. We categorized data collection into eight types during our safety reviews. Some apps collect far more than others: Muah AI collects financial information, ID documents, and address verification details, and a 2024 data breach exposed 1.9 million accounts. Here is what we found across all 11 apps.

Conversation logs. Every app stores your complete message history on remote servers. This includes text messages, roleplay scenarios, emotional conversations, and anything you type into the chat window. Replika’s privacy policy states it collects “the content of all messages you send and receive” through the platform. Character.AI’s policy uses similar language. There is no mainstream AI companion app that processes conversations entirely on your device.

Photos and media. Apps that support image sharing (Candy AI, Kindroid, Nomi AI, Replika) retain copies of every photo you send or receive. Candy AI’s AI-generated images are also stored server-side and linked to your account. If you share personal photos during conversations, those photos live on the company’s servers alongside your chat logs.

Voice recordings. Apps with voice chat features (Replika, Character.AI, Talkie AI, Pi AI) collect and store audio data. Replika’s policy specifically mentions “voice and audio information” as a data category. Whether these recordings are used for model training varies by app.

Device and technical data. All 11 apps collect standard device information: operating system, device model, IP address, browser type, and advertising identifiers. Some go further. According to Candy AI’s privacy policy, the app collects “device event information such as crashes, system activity, hardware settings.” Eva AI collects similar technical telemetry.

Usage patterns. Session length, time of day, features used, buttons tapped, screens visited. This behavioral data helps apps optimize engagement, but it also creates a detailed profile of how you interact with your AI companion. According to Character.AI’s privacy policy, they collect “information about your use of the Services, including how frequently you use the Services.”

Location data. Some apps request location permissions or derive approximate location from IP addresses. Romantic AI’s privacy policy mentions collecting “approximate location based on IP address.” Eva AI’s policy includes geographic location in its data collection categories.

Contact and address book data. A handful of apps request access to your contacts list, ostensibly for social features or referral programs. This is less common in companion apps than in social media, but it appeared in our analysis of Eva AI and Anima AI’s requested permissions.

Payment information. Any app with a subscription tier processes payment data. Most use third-party processors (Stripe, Google Play Billing, Apple In-App Purchase), meaning the app company itself may not store your full card number. However, they retain purchase history, subscription status, and billing email addresses.

How Is Your Data Stored and Protected?

Storage practices vary dramatically across the 11 apps we reviewed. The gap between the best and worst is significant.

Encryption. All 11 apps use HTTPS (TLS encryption in transit), which is the bare minimum. Fewer disclose whether data is encrypted at rest on their servers. Pi AI explicitly states it uses encryption at rest. Replika mentions “industry standard security measures” without specifying encryption type. Several apps, including Romantic AI and Eva AI, provide no technical details about server-side encryption in their privacy policies.

Server locations. Where your data physically lives determines which laws protect it. Pi AI (Inflection AI) operates from the United States. Character.AI is US-based. Candy AI’s privacy policy references servers in multiple jurisdictions without specifying which ones. For EU users, this matters because data transfers outside the European Economic Area require specific legal safeguards under GDPR.

Data retention. How long do apps keep your data after you stop using them? Pi AI states it retains data “for as long as your account is active” and will delete it upon request. Character.AI retains data “as long as we need it to provide you the Services.” Candy AI’s policy is vague, stating data is kept “for as long as necessary.” Eva AI and Romantic AI provide minimal retention timelines in their policies. PepHop AI goes further in the wrong direction: its privacy policy lists a fictional video game location as its governing jurisdiction, making its data retention commitments legally unenforceable. Mello AI is worse still: its privacy policy is a generic template copied from an educational institution, referencing “students” and “enrollment details” instead of anything related to the companion app.

Here’s how we scored data protection across the apps we reviewed:

| App | Safety Grade | Safety Score | Encryption Details | Retention Policy |
| --- | --- | --- | --- | --- |
| Pi AI | B | 55/100 | TLS + at-rest disclosed | Clear, deletion on request |
| Replika | C | 43/100 | TLS, “industry standard” | Active account + request |
| Kindroid | C | 40/100 | TLS, limited detail | Moderate clarity |
| Candy AI | D | 32/100 | TLS, multi-jurisdiction | Vague |
| Nomi AI | D | 30/100 | TLS, minimal detail | Vague |
| Talkie AI | D | 30/100 | TLS, minimal detail | Vague |
| Anima AI | D | 25/100 | TLS, no detail | Minimal |
| Character.AI | F | 22/100 | TLS, limited detail | “As long as needed” |
| Chai AI | F | 18/100 | TLS, no detail | Minimal |
| Romantic AI | F | 13/100 | TLS, no detail | Unclear |
| Eva AI | F | 10/100 | TLS, no detail | Unclear |

Who Can Access Your Conversations?

When you send a message to an AI companion, that message doesn’t stay between you and the bot. Multiple parties may have access to your conversation data, depending on the app.

Company employees and contractors. Most privacy policies reserve the right for internal staff to access user data for “service improvement,” “safety monitoring,” or “content moderation.” Character.AI’s policy allows employees to review conversations flagged by automated systems. Replika’s policy permits access for “providing, maintaining, and improving our services.” The practical question is how many people and under what conditions.

AI training pipelines. Your conversations may feed directly into the models that power these apps. We cover this in detail in the next section, but the key point here: when an employee accesses your data for “model improvement,” they may be reading your actual messages to label training data.

Third-party service providers. Cloud hosting companies (AWS, Google Cloud), analytics providers, payment processors, and customer support platforms all potentially touch your data. According to Replika’s privacy policy, it shares data with “service providers who perform services on our behalf.” Pi AI’s policy similarly references third-party service providers. The chain of data access extends beyond the app company itself.

Law enforcement. Every app we reviewed includes language allowing data disclosure in response to legal requests. This is standard and legally required. The difference is in transparency: Pi AI publishes general principles about law enforcement requests. Most other apps simply state they’ll comply with “valid legal process” without further detail.

Advertising and analytics partners. Apps that run ads or use tracking SDKs share behavioral data with advertising networks. Our Exodus Privacy analysis found that Eva AI and Anima AI include more advertising-related trackers than the other apps we reviewed. Candy AI’s policy mentions sharing data with “advertising partners.” Pi AI, by contrast, does not run ads and has fewer third-party tracking integrations. Sakura FM also runs ad-free with only one tracker SDK (Sentry for crash reporting), though its privacy policy acknowledges data sharing that may qualify as a CCPA “sale.”

Watch: NBC News investigates AI companion chatbots and the privacy concerns raised by advocates as millions share intimate thoughts with these apps.

Do These Apps Use Your Conversations for AI Training?

This is the question most users care about, and the answers range from transparent to deliberately opaque.

Pi AI is the most transparent. Its privacy policy acknowledges using conversation data to improve its models, but the company provides clearer documentation about its data practices than any other app in our review. Pi’s parent company, Inflection AI (now part of Microsoft’s AI division), publishes more detailed information about how training data is handled.

Character.AI uses conversation data for model training. According to its privacy policy: “We may use information we collect to develop and improve our Services,” which includes training AI models. The company faced scrutiny and multiple lawsuits in 2025 related to its data practices and safety features, particularly regarding minors.

Replika states it uses conversation data for “improving our products and services, including training and improving our AI.” Users can delete individual messages, but whether deleted messages are excluded from training datasets already created is unclear.

Nomi AI positions itself as more privacy-friendly than competitors, with marketing language about respecting user data. Its actual privacy policy, however, includes standard language about using data for service improvement. The gap between marketing claims and policy language is something we flagged in our safety rating.

Candy AI collects conversation data and uses it broadly. Its privacy policy grants the company wide latitude to use data for “research and development,” “marketing and advertising,” and “product improvement.” Opt-out mechanisms are not prominently featured.

Romantic AI, Eva AI, Chai AI, and Anima AI provide the least clarity. Their privacy policies use broad, vague language about data usage. None offer transparent opt-out mechanisms for training data specifically. When a company won’t tell you clearly whether your conversations train its AI, the safe assumption is that they do.

Can you opt out? In most cases, the only reliable way to prevent your data from being used for training is to not use the app at all. Replika allows message deletion, but that doesn’t guarantee removal from existing training datasets. Pi AI offers more controls than most. For the D- and F-rated apps, opt-out is either nonexistent or so buried in settings that it’s effectively inaccessible.

Your Rights Under GDPR, CCPA, and Other Privacy Laws

Privacy regulations give you specific rights over your personal data. Whether AI companion apps make those rights easy to exercise is a different question entirely.

If you’re in the EU or UK (GDPR):

  • Right of access. You can request a copy of all personal data the company holds about you. This includes conversation logs, device data, and any profiles created from your usage.
  • Right to erasure (“right to be forgotten”). You can request deletion of your personal data. The company must comply unless it has a legal basis to retain the data.
  • Right to data portability. You can request your data in a machine-readable format and transfer it to another service.
  • Right to object to processing. You can object to your data being used for specific purposes, including AI training and profiling.

If you’re in California (CCPA/CPRA):

  • Right to know. You can request disclosure of what personal information is collected, used, shared, or sold.
  • Right to delete. Similar to GDPR erasure, you can request deletion of personal information.
  • Right to opt out of sale. If the company sells your personal information, you can opt out. Several AI companion apps share data with advertising partners in ways that may constitute a “sale” under CCPA definitions.
  • Right to non-discrimination. Companies cannot penalize you for exercising your privacy rights.

How do the apps actually handle these requests?

Pi AI provides a straightforward data deletion process through its account settings. We reviewed it and found the process to be functional. Replika offers account deletion that includes conversation data, accessible through the app settings. Kindroid responds to email-based data requests within a reasonable timeframe.

At the other end, Eva AI and Romantic AI make the process significantly harder. Their privacy policies list the rights but provide no clear mechanism for exercising them beyond a generic contact email. Chai AI’s process is similarly opaque. Character.AI improved its data rights processes after facing legal pressure in 2025, but the experience still isn’t seamless.

If you want to submit a data request, document everything. Screenshot the request, save the confirmation email, note the date. Under GDPR, companies have one month to respond (roughly 30 days, extendable for complex requests). Under CCPA, they have 45 days.
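If you track your own requests, a few lines of code can compute the follow-up date. This is a minimal sketch of our own; the helper name is ours, and the 30-day figure approximates GDPR’s “one month” window, which can vary slightly by calendar month.

```python
from datetime import date, timedelta

def response_deadline(request_date: date, regulation: str) -> date:
    """Return the statutory response deadline for a data request.

    GDPR: one month (approximated here as 30 days).
    CCPA: 45 days.
    """
    if regulation == "GDPR":
        return request_date + timedelta(days=30)
    if regulation == "CCPA":
        return request_date + timedelta(days=45)
    raise ValueError(f"unknown regulation: {regulation}")

# Example: a request submitted March 1, 2025
print(response_deadline(date(2025, 3, 1), "GDPR"))  # 2025-03-31
print(response_deadline(date(2025, 3, 1), "CCPA"))  # 2025-04-15
```

If the deadline passes with no response, that dated record is exactly what you need when escalating to a supervisory authority (EU/UK) or the California Attorney General.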

How We Score Data Practices in Our Safety Index

Data practices are one of six dimensions in the CompanionWise Safety Index. The Data and Privacy dimension evaluates four sub-areas:

  • Data collection scope. What types of data does the app collect? Is collection proportional to the service provided?
  • Data sharing and third parties. Who receives your data? How many third-party integrations exist?
  • User controls. Can you manage, export, or delete your data? Are the controls accessible?
  • Policy transparency. Is the privacy policy clear, specific, and accessible? Does it match the app’s actual behavior?

Each sub-area is scored on a 0-100 scale, weighted, and combined into an overall safety score across all six dimensions. The Data and Privacy dimension is one of the most heavily weighted because it directly affects user trust and legal compliance.
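To make the mechanics concrete, here is a sketch of how weighted sub-area scores roll up into a grade. The weights and grade cutoffs below are illustrative guesses inferred from the published scores and grades in this guide, not CompanionWise’s actual formula.

```python
# Hypothetical weights: CompanionWise does not publish its exact weighting,
# so these values are placeholders for illustration only.
SUB_AREA_WEIGHTS = {
    "collection_scope": 0.30,
    "third_party_sharing": 0.30,
    "user_controls": 0.25,
    "policy_transparency": 0.15,
}

def dimension_score(sub_scores: dict) -> float:
    """Combine 0-100 sub-area scores into a weighted dimension score."""
    return sum(sub_scores[area] * weight
               for area, weight in SUB_AREA_WEIGHTS.items())

def letter_grade(score: float) -> str:
    """Map a 0-100 score to a letter grade.

    Cutoffs inferred from the grades in the table above
    (55 -> B, 40-43 -> C, 25-32 -> D, below 25 -> F).
    """
    for cutoff, grade in [(80, "A"), (55, "B"), (40, "C"), (25, "D")]:
        if score >= cutoff:
            return grade
    return "F"
```

For example, an app scoring 50 across every sub-area lands at 50/100 overall, which maps to a C under these cutoffs, consistent with where Kindroid (40) and Replika (43) sit in the table.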

You can read the full methodology on our How We Rate page, including how we source evidence, weight dimensions, and calculate letter grades.

Watch: CNBC examines how AI apps are being used to create explicit images of real people without consent, exposing how personal data shared with these platforms can be weaponized.

Practical Steps to Protect Your Data

You don’t have to quit AI companion apps entirely. But you should take these steps to limit your exposure.

  1. Read the privacy policy before downloading. We know nobody does this. At minimum, search the policy for “third party,” “training,” and “retention.” Those three terms tell you the most about how your data is handled. Or check our safety ratings instead.
  2. Use a separate email address. Create a dedicated email for AI companion apps. Don’t use the same email you use for banking, work, or primary social accounts. A free Gmail or ProtonMail address takes two minutes to set up.
  3. Don’t share personal photos. Treat every image you send to a companion app as permanent and potentially accessible by the company’s employees. Avoid sending selfies, photos of friends/family, or anything you wouldn’t want stored on a third-party server indefinitely.
  4. Disable location permissions. Go to your phone settings and deny location access for any companion app. On iOS: Settings, then Privacy & Security, then Location Services. On Android: Settings, then Apps, then the specific app, then Permissions.
  5. Review app permissions regularly. Check what permissions each companion app has. Camera, microphone, contacts, and storage access should only be granted if you actively use those features. Revoke anything unnecessary.
  6. Avoid sharing identifying information in conversations. Don’t tell your AI companion your real full name, home address, workplace, school name, or financial details. The AI doesn’t need this information to function, and it ends up in a database you can’t fully control.
  7. Use a VPN for additional privacy. A VPN masks your IP address from the app’s servers, making it harder to link your usage to your geographic location and internet service provider.
  8. Delete conversations periodically. If the app allows message deletion (Replika does, for example), use it. Clearing old conversations reduces the amount of personal data stored on the company’s servers, even if previously created training datasets may still contain it.
  9. Exercise your data rights. If you stop using an app, submit a formal data deletion request rather than simply uninstalling. Uninstalling removes the app from your phone but does nothing to the data stored on the company’s servers.
  10. Choose safer apps when possible. Our Safety Index exists for this reason. If privacy matters to you, the difference between a B-rated app and an F-rated app is real and measurable. See our full rankings.

Frequently Asked Questions

Do AI companion apps read my other messages or access my phone’s texts?

No. AI companion apps only access messages sent within their own platform. They cannot read your SMS texts, WhatsApp messages, or other app conversations unless you explicitly grant such permissions, which none of the 11 apps we reviewed request. The data collected is limited to interactions within the companion app itself, though device-level permissions like contacts and storage may be requested separately.

Can AI companion apps see my personal photos?

Only if you share them directly in the chat or grant camera/gallery access. Apps like Candy AI and Replika support image sharing, meaning any photo you send is stored on their servers. According to Replika’s privacy policy, shared media is part of “content you create through the Services.” Don’t share anything you wouldn’t want stored permanently on a company server.

What happens to my data if an AI companion app shuts down?

Most privacy policies do not specify what happens to user data if the company ceases operations. Under GDPR, data must still be handled according to the original privacy policy, and users should be notified. In practice, enforcement during company shutdowns is inconsistent. The safest approach is to submit a deletion request before any app you use goes offline, rather than waiting and hoping the data gets properly disposed of.

Are AI companion apps safe for teenagers?

Most AI companion apps require users to be 13 or older (some require 18+), but age verification is minimal across the industry. Character.AI faced legal action in 2025 partly over concerns about minors accessing the platform. From a data perspective, teenagers are especially vulnerable because they’re less likely to read privacy policies, more likely to share personal information in conversations, and protected by additional regulations like COPPA in the US and specific GDPR provisions for minors in the EU.

Can I use an AI companion app without giving personal data?

Not meaningfully. All apps require account creation (typically email and sometimes phone number). Even without sharing personal details in conversations, the app still collects device data, IP address, usage patterns, and conversation content. You can limit exposure by using a separate email, avoiding personal details in chats, and denying optional permissions. But truly anonymous use of these apps is not possible with any of the 11 platforms we reviewed.

Which AI companion app is best for privacy?

Pi AI earns the highest safety score in our index at B/55, with the strongest data practices of any app we reviewed. Replika (C/43) and Kindroid (C/40) are the next best options. All three provide clearer privacy policies and more functional data controls than the competition. At the bottom, Eva AI (F/10) and Romantic AI (F/13) have the weakest data protections. See our full app rankings for the complete breakdown.
