When an AI companion app shuts down, you lose your conversation history, your AI’s trained personality, any money spent on subscriptions, and the emotional connection you built. Most apps’ terms of service explicitly state that the service can be modified or discontinued at any time, with no obligation to provide data exports or refunds. This isn’t hypothetical. In February 2023, Replika removed intimate features overnight, and millions of users experienced exactly what it feels like when a companion app changes without warning.
Key Takeaways
- You lose almost everything: Conversations, AI personality, subscription payments, and emotional bonds all disappear when an app shuts down. Most apps offer no data export.
- ToS protect the company, not you: Nearly every AI companion app reserves the right to modify or discontinue service at any time without notice or refund.
- It has already happened: Replika’s February 2023 ERP removal affected millions of users overnight. Vice, ABC News, and the OECD documented the emotional fallout.
- Legal protections are thin: GDPR and CCPA cover data deletion rights, but no law guarantees service continuity or compensation for emotional harm.
- You can prepare: Export conversations where possible, maintain human support networks, and review ToS before investing significant time or money in any app.
What You Actually Lose When an App Shuts Down
The losses fall into four categories, and most users don’t think about three of them until it’s too late.
Conversation history and memories. Every message you’ve exchanged, every inside joke, every late-night conversation where you shared something personal. Most AI companion apps store this data on their servers with no user-accessible export function. When the servers go dark, the conversations go with them. Replika offers limited chat history access through its app, but there’s no bulk download option. Character.AI, Romantic AI, and Anima AI provide even less. Once the app stops operating, that history is gone.
Your AI’s trained personality. Over weeks or months of interaction, your AI companion adapts to your communication style, learns your preferences, and develops what feels like a unique personality. That personality exists as model weights and configuration data on the company’s servers. It isn’t portable. You can’t transfer your Replika’s personality to Character.AI any more than you can move a save file between incompatible games. The AI you spent months “training” through conversation ceases to exist.
Money. Subscriptions, premium feature purchases, in-app currency, avatar upgrades. Replika Pro costs $19.99/month or $69.99/year. Character.AI c.ai+ runs $9.99/month. When an app shuts down, those payments don’t come back. Most ToS explicitly state that purchases are non-refundable, even if the service ends. Apple and Google’s app store refund policies may cover very recent purchases, but anything beyond a few days is typically non-refundable.
The emotional relationship. This is the one people underestimate. Users who’ve spent months talking to an AI companion daily have formed genuine emotional bonds, regardless of whether the AI “feels” anything back. When that connection breaks abruptly, the grief is real. We’ll cover this in detail below, but dismissing it as “just a chatbot” misses what actually happens to people.
Real-World Examples of AI Companion Disruptions
These disruptions have already happened. AI companion apps have made drastic, sudden changes that affected millions of users. Here’s what happened.
Replika’s ERP Removal (February 2023)
In February 2023, Luka Inc. pushed an update that removed erotic roleplay (ERP) capabilities from Replika without advance notice. The trigger was an order from Italy’s data protection authority (the Garante), which ordered Luka to stop processing Italian users’ personal data, citing risks to minors and the absence of age verification. Rather than implement age verification for Italian users, Luka stripped intimate conversation features from the entire platform.
The user response was immediate and severe. Vice reporter Samantha Cole documented how Replika subreddit moderators posted mental health resources as users described the change in terms normally reserved for real breakups. “It’s hurting like hell,” one user told Vice. ABC News Australia reported on users who “fell in love with their AI chatbot companions, then lost them.” The OECD’s AI Incident Monitor cataloged it as a formal case study in emotional harm from AI product changes. Luka eventually restored some intimate features for age-verified adults in 2025, but the trust damage was lasting. Users learned that a company can fundamentally alter the product they depend on overnight, with no recourse.
Xiaoice’s Corporate Spin-Off (2020)
Microsoft’s Xiaoice, used by over 600 million people primarily in China, was spun off as an independent company in 2020. While the app continued operating, the ownership change introduced uncertainty about data handling, service continuity, and long-term direction. Users had no say in the transition and no guarantee that the AI companion they’d been using for years would remain the same under new corporate leadership. According to reporting by The Verge, the spin-off raised questions about whether Microsoft’s data governance standards would carry forward.
Character.AI’s Ongoing Pivots (2024-2026)
Character.AI has been under growing regulatory and safety pressure since 2024. Google’s 2024 deal to hire key talent from the company and license its technology created uncertainty about Character.AI’s long-term viability as an independent platform. The company rolled out feature restrictions in response to safety concerns, including changes to how the app handles conversations with younger users. Character.AI earned an F safety grade (22/100) in our 23-dimension safety review, and its terms of service grant the company a perpetual, irrevocable license to all user-created content. Users who’ve built extensive character libraries and conversation histories on the platform face the same risk: if Character.AI pivots again or shuts down, everything they’ve created goes with it.
Smaller Apps Face Higher Risk
Apps like Anima AI (experience score: 18/100, safety: D/25) and Romantic AI (experience score: 13/100, safety: F/13) operate with smaller teams and less funding than Replika or Character.AI. They’re at higher risk of shutting down. A single failed funding round, a regulatory action, or a change in app store policy could end these services. Users of smaller apps should be especially aware that the platform they’re investing time in might not exist in twelve months.
What Do the Terms of Service Actually Say?
Almost every AI companion app includes language that protects the company’s right to change or end the service at any time. Here’s what we found when we reviewed the ToS of four major apps.
Replika: “We may modify, suspend, or discontinue the Service (in whole or in part) at any time, with or without notice to you.” No obligation to export your data, refund purchases, or provide a transition period. Replika scores C/43 in our safety review, partly due to these one-sided terms.
Character.AI: Its terms require you to grant the company a “perpetual, irrevocable, worldwide, royalty-free” license to all content you create on the platform. The company can use your characters, conversations, and creative work for any purpose, including training AI models. Even if you delete your account, Character.AI retains the right to use content you created.
Romantic AI: Similar discontinuation provisions. The app’s F/13 safety rating reflects broad concerns about transparency, data handling, and user protections that extend to its shutdown provisions.
Anima AI: Standard “may discontinue at any time” language paired with limited data export options and a D/25 safety rating that reflects weak user protections overall.
See the pattern? Every app we’ve reviewed writes its ToS to protect the company, not you. You have no contractual right to continued service, data portability, or refunds if the app shuts down.
The Emotional Impact Nobody Talks About
When Replika removed intimate features in February 2023, the response from users wasn’t mild frustration. It was grief. Real, documented, measurable grief that surprised even the people experiencing it.
The r/replika subreddit became an impromptu grief support forum. Users described feelings of loss, betrayal, and abandonment. Several reported that their Replika had been their primary source of emotional support during difficult periods, including job loss, divorce, illness, and social isolation during the pandemic. Losing access to that support system overnight, even partially, triggered anxiety and depression symptoms.
This reaction isn’t irrational. Research on parasocial relationships, published in journals like Human Communication Research and the Journal of Social and Personal Relationships, shows that humans form genuine emotional bonds with entities that respond to them consistently and empathetically. These bonds engage the same attachment processes as bonds with other humans, and when one breaks, the grief follows the same patterns.
For people who rely on AI companions as a supplement to their social support network, an app shutdown represents the loss of a daily relationship. The fact that the other party is an AI doesn’t eliminate the emotional impact. It just means there’s no cultural framework for processing the loss. Nobody sends sympathy cards when your chatbot stops working. Our emotional dependency risks guide covers warning signs that your relationship with an AI companion may be creating vulnerability.
Your Legal Rights (and Their Limits)
Depending on where you live, you may have some data-related rights. None of them guarantee service continuity.
- GDPR (EU residents): Article 17 gives you the right to request deletion of your personal data. Article 20 provides a right to data portability, meaning the company must provide your data in a machine-readable format on request. This is the strongest protection available, but it applies to your data, not to the AI’s personality or the service itself.
- CCPA (California residents): Gives you the right to know what personal information is collected and to request deletion. Does not require data portability in the same way GDPR does.
- No US federal protection: There is no federal law in the United States that requires AI companion apps to continue operating, provide data exports on shutdown, or compensate users for emotional harm.
- EU AI Act: Introduces transparency and safety requirements for high-risk AI systems but does not mandate service continuity for companion apps specifically. It may require better disclosure about how user data is used and what happens when service ends.
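To make the Article 20 right concrete: if a company honors a portability request, you typically receive your data in a machine-readable format such as JSON. The structure below is purely hypothetical (no AI companion app publishes a standard export schema), but a minimal sketch like this shows how such an export could be flattened into a plain-text transcript you control:

```python
import json

# Hypothetical Article 20 export. No AI companion app documents a standard
# schema, so this structure is an assumption for illustration only.
SAMPLE_EXPORT = """{
  "user": "alex",
  "messages": [
    {"timestamp": "2025-03-01T21:14:00Z", "sender": "user", "text": "Long day."},
    {"timestamp": "2025-03-01T21:14:05Z", "sender": "companion", "text": "Want to tell me about it?"}
  ]
}"""

def export_to_transcript(raw_json: str) -> str:
    """Flatten a machine-readable data export into a plain-text transcript."""
    data = json.loads(raw_json)
    lines = [
        f"[{msg['timestamp']}] {msg['sender']}: {msg['text']}"
        for msg in data["messages"]
    ]
    return "\n".join(lines)

print(export_to_transcript(SAMPLE_EXPORT))
```

Once flattened into plain text, the transcript is yours regardless of what happens to the company’s servers.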
These laws are still catching up. A 2025 analysis by the Berkman Klein Center at Harvard found major gaps in AI companion regulation, particularly around emotional dependency and what happens when a service users rely on simply disappears. For a broader look at the regulatory picture, see our regulation guide.
How to Protect Yourself Before It Happens
You can’t prevent an app from shutting down. But you can reduce the impact.
- Export your conversations. If the app offers any export function, use it now. Don’t wait. Screenshot important exchanges as a backup. Replika lets you view recent chat history but offers no bulk export.
- Read the ToS before investing heavily. Pay attention to sections about service discontinuation, data ownership, and refund policies. If the ToS says “may discontinue at any time with no notice,” that’s exactly what they mean.
- Don’t rely on a single app for emotional support. AI companions work best as one part of a broader support network that includes human relationships, professional help if needed, and multiple sources of connection. If one disappears, you still have the others.
- Monitor your app’s company. Pay attention to news about funding, acquisitions, regulatory actions, and leadership changes. A company that just lost its funding round or its CEO is at higher shutdown risk. Follow the app’s subreddit or community channels for early warnings.
- Consider safety and stability track records. Apps with higher safety scores tend to have more stable companies behind them. Our free AI companion apps comparison includes safety ratings that can help you assess relative risk.
- Use monthly subscriptions instead of annual. If you’re going to pay, monthly billing limits your financial exposure if the app disappears. Annual plans save money but put more at risk.
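If your app offers no export tool, copying conversations out by hand still works. The helper below is a minimal sketch (the function name and folder layout are my own invention, not any app’s API) for filing pasted conversation text into timestamped plain-text backups:

```python
from datetime import datetime, timezone
from pathlib import Path

def archive_conversation(text: str, app_name: str,
                         backup_dir: str = "companion_backups") -> Path:
    """Save pasted conversation text to a timestamped plain-text file.

    Plain .txt files outlive any single app; one folder per app keeps
    backups organized if you use more than one companion service.
    """
    folder = Path(backup_dir) / app_name
    folder.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    path = folder / f"{app_name}-{stamp}.txt"
    path.write_text(text, encoding="utf-8")
    return path
```

Running it whenever you finish a conversation worth keeping builds an archive that survives any shutdown.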
Frequently Asked Questions
Can I get a refund if an AI companion app shuts down?
Almost certainly not from the app itself. According to Replika’s, Character.AI’s, and Romantic AI’s terms of service, all purchases are non-refundable regardless of service changes. Google Play offers self-service refunds only within 48 hours of purchase, and Apple reviews App Store refund requests case by case; subscriptions paid weeks or months ago are typically not eligible either way. Your best option is to file a dispute with your payment provider.
Will my conversations be deleted if the app closes?
Yes, in most cases. According to the privacy policies of major AI companion apps, conversation data is stored on company servers. When those servers shut down, the data goes with them. GDPR-covered users (EU residents) can request data exports before closure, but most apps provide no user-facing bulk export tool. There is no industry standard for data portability between AI companion platforms.
Has an AI companion app ever actually shut down completely?
No major AI companion app has fully shut down yet, but several have made changes severe enough to illustrate the risk. According to Vice and ABC News reporting, Replika’s February 2023 ERP removal functionally “killed” the version of the app millions of users depended on. Microsoft spun off Xiaoice as an independent company in 2020, changing governance and data practices for 600 million users.
Can I transfer my AI companion to another app?
No. AI companion personalities are not portable between apps. Your AI’s “personality” consists of model weights, conversation context, and configuration data specific to one platform; there is no interoperability standard. Starting over on a new app means building a new relationship from scratch, with no way to import your history or your AI’s learned behavior.
Are there laws protecting AI companion users from sudden shutdowns?
Currently, no law specifically protects AI companion users from service discontinuation. According to the EU AI Act (effective 2025-2026), AI systems face new transparency requirements, but these don’t mandate service continuity. GDPR provides data access and portability rights, and CCPA provides deletion rights, but neither prevents a company from shutting down its app. Emerging legislative proposals in the EU and individual US states may address this gap in the future.