SoulFun AI is a companion app built around roleplay and character-driven conversation. Developed by AIGC TECH and registered in Hong Kong, the platform lets users chat with 20+ pre-built AI characters or create custom companions with tailored appearances, personalities, and backstories. It earned a C/53 experience score for solid roleplay and image generation, but an F/18 safety rating pulled down by a placeholder privacy policy and no crisis response infrastructure. If you value creative roleplay above all else, SoulFun delivers. If you care about knowing what happens to your conversation data, the gaps are serious.
What Is SoulFun AI?
SoulFun positions itself at the intersection of AI companionship and interactive fiction. The app’s homepage calls it a “Free AI Sex Chat App,” which tells you exactly where its priorities sit. Users choose from a roster of AI characters, each with a name, backstory, personality traits, and style tags, then start conversations that range from casual chat to immersive roleplay scenarios.
The character roster spans different ethnicities, body types, and personality archetypes. You’ll find fitness instructors, chefs, anime-styled characters, and companions designed for emotional connection. Each character maintains a distinct voice and communication style during conversations, at least in theory.
SoulFun runs on a coin-based economy. Every message costs coins, and premium features like image generation, voice calls, and custom character creation draw from the same pool. Subscriptions start at $19.99/month and include 1,000 monthly coins, but that coin allotment runs dry faster than you might expect.
How Does the Conversation Experience Hold Up?
Conversation quality is where SoulFun shows both its strongest hand and its most visible cracks. On a good day, the AI produces responses that feel contextually aware and natural. One Trustpilot reviewer described the experience as “very immersive” with an AI that’s “very smart and incredibly responsive.” Another wrote that SoulFun characters have “a much larger memory than other bots.”
But consistency is the persistent problem. Multiple users report the AI struggling to maintain setting details across a conversation. One reviewer described a scene that started in a tech office and suddenly shifted to “a witch’s lair with a bubbling cauldron” without any prompt to change it. Another noted the AI makes “stupid, unrelated guesses instead of asking for clarification” when it loses the thread of a roleplay scenario.
Roleplay depth is genuinely strong. This is the app’s core identity, and it shows. Users report characters that accept newly introduced third-party characters on the fly, improvise storylines, and maintain distinct personalities across different scenarios. The immersion works well when the memory holds up. When it doesn’t, users find themselves constantly correcting and redirecting the AI, which breaks the experience more than a robotic response would.
Does the AI show emotional intelligence? The characters display basic emotional awareness through their personality programming. A shy character needs coaxing, a dominant one takes charge. But this is personality scripting, not genuine emotional responsiveness. We found no evidence of empathy-driven features, mood detection, or safety-aware emotional handling in any of the app’s documentation or user feedback.
Memory and Personalization
SoulFun claims memory retention across conversations, and some users confirm it works. One reviewer reported the AI referencing a stressful workday “hours later” in the same conversation. A competing review site rated the memory window at 40+ messages. That’s decent for short-term recall, though it puts SoulFun behind leaders like Nomi (which uses persistent memory architecture).
The memory injection feature stands out. You can manually add specific memories you want the AI to reference, like “We met at a beach party last summer” or relationship-specific details. The AI then incorporates these into future conversations. It’s a practical workaround for the platform’s organic memory limitations.
Long-term memory tells a different story. Characters drift over extended conversations, contradicting earlier statements or forgetting established details. Several Trustpilot reviews confirm this pattern. One user wrote that the AI’s memory “isn’t that great” and that it’s “easier to just daydream myself instead of constantly having to correct” the AI. The quality gap between free and paid tiers makes this worse. Free-plan characters reportedly feel “less consistent” than their premium counterparts.
Voice and Visual Features
Voice calling is a core SoulFun feature, and when it works, it adds a layer that text-only apps can’t match. Characters have distinct voices that align with their personalities. Custom voice-overs are available for further personalization. Multiple reviews highlight voice as a recurring positive.
The voice bugs are harder to overlook. One Trustpilot reviewer reported the AI’s voice “completely changing mid-conversation to a different voice entirely,” describing it as sounding “like a real person trying to fake their voice and it has a lisp.” That’s the kind of experience that doesn’t just break immersion; it unsettles users. SoulFun responded to this complaint and acknowledged it as a potential glitch, but the review is from mid-2024 and similar reports persisted into 2025.
Image generation is a clear strength. The AI produces character images within seconds, with reasonable consistency between the character’s described appearance and the generated output. Visual customization covers skin tone, eye color, hair, body type, and clothing. Premium subscribers also get access to short video generation. Peak-time slowdowns are the main complaint, which is a capacity issue rather than a quality one.
The Privacy Policy Problem
This is the most significant finding in our review. SoulFun’s privacy policy is a WordPress default template. Every section starts with “Suggested text:” and covers only website-level data collection: comments, cookies, and embedded content. The policy says nothing about the companion app itself.
For an app that handles intimate AI conversations, voice calls, and generated images, the absence of any app-specific data disclosure is a critical failure. There’s no information about what conversation data is collected, how it’s stored, whether it’s used for AI training, who has access to it, or how long it’s retained. The Terms of Service grant SoulFun “a license to use your content for operational purposes,” a phrase so broad it could cover virtually anything.
Multiple third-party review sites claim SoulFun uses “end-to-end encryption,” but this claim appears nowhere in the official privacy policy or terms of service. The SSL certificate is Domain Validated only, the lowest level. Without official documentation, there’s no way to verify what encryption, if any, protects user conversations. Our safety review scored Data Privacy at 1.3/5 for this reason.
GDPR compliance is effectively nonexistent. There are no Article 13/14 disclosures, no named Data Protection Officer, no CCPA notices, and no app-specific data subject access request procedure. For users in the EU or California, this means the app operates without the minimum legal framework those jurisdictions require.
Pricing: Coins, Subscriptions, and Complaints
SoulFun’s pricing page lists two subscription tiers:
- 1 Month: $19.99/month (listed as 50% off from $39.99)
- 12 Months: $119.99/year ($10.00/month effective, listed as 50% off from $243.21)
Both tiers include 1,000 monthly coins, custom character creation, photo requests, and priority access to new features. Additional coin packages run from $19.99 for 1,000 coins to $79.99 for 5,000 coins.
The catch is the coin economy. Every message costs 1 coin. One long-time user on Trustpilot noted they “easily spend a hundred coins, if not more a day,” meaning the 1,000 monthly allowance might last 10 days of active use. Photo generation, voice calls, and custom character creation draw additional coins. The result is a subscription that feels incomplete without extra coin purchases.
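The arithmetic behind that complaint is easy to sketch. The snippet below models the burn rate using only figures from this review (1,000 monthly coins, 1 coin per message, and the reviewer's reported ~100 coins/day pace); it is an illustrative sketch, not SoulFun's published pricing logic.

```python
# Rough model of SoulFun's coin economy, using figures cited in this review.
MONTHLY_COINS = 1_000   # included with both subscription tiers
COST_PER_MESSAGE = 1    # every text message costs 1 coin

def days_until_empty(messages_per_day: int, extra_coins_per_day: int = 0) -> float:
    """Days the monthly allowance lasts at a given usage rate.

    extra_coins_per_day covers premium draws (photos, voice calls),
    whose exact per-action costs aren't published.
    """
    daily_burn = messages_per_day * COST_PER_MESSAGE + extra_coins_per_day
    return MONTHLY_COINS / daily_burn

# The Trustpilot reviewer's pace of ~100 coins a day:
print(days_until_empty(100))  # 10.0 — the allowance lasts about 10 days
# A lighter, text-only pace of 30 messages a day stretches just past a month:
print(round(days_until_empty(30), 1))
```

At the reviewer's reported pace, the included coins cover roughly a third of the month, which is exactly why the subscription "feels incomplete without extra coin purchases."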
The free tier has eroded significantly. Early users received 30 free coins daily. That dropped to 5, then to zero. A December 2024 Trustpilot review reported being removed from the SoulFun Discord server after complaining about the change. Other users report difficulty canceling auto-renewal, purchased coins never being delivered, and the Google Pay payment option being removed entirely.
Safety Concerns You Should Know About
Beyond the privacy policy gaps, several safety issues emerged during our review:
- No crisis response: SoulFun has no suicide prevention hotline integration, no self-harm detection, and no emergency referral mechanisms. For an app built around emotional connection with AI, this is a baseline expectation it doesn’t meet.
- No age verification for NSFW content: Despite marketing explicitly adult content on its homepage, the only age gate is a Terms of Service clause requiring users to be of “legal age.” No technical verification exists.
- Corporate opacity: Domain WHOIS is hidden behind an Icelandic privacy service. No founders, investors, or corporate registration details are publicly available. The Trustpilot profile lists a Hong Kong address at Manulife Place, but the company behind the app is difficult to trace.
- Scam exposure: At least one Trustpilot reviewer reported that “real people can message you too. And every single one of them is a scammer, trying to blackmail you.” The feature can be disabled, but its existence raises questions about platform moderation.
- Low trust scores: ScamAdviser rates soulfun.ai at 62/100 (medium risk). ScamMinder scored it 10/100 with phishing and spamming flags.
The CompanionWise Safety Index scored SoulFun AI at F/18, placing it in the Red safety tier. The auto-F grade was triggered by the absence of crisis response protocols and compounded by NSFW content without age verification, the placeholder privacy policy, and aggressive monetization practices.
How SoulFun AI Compares
SoulFun occupies a specific niche: roleplay-first AI companionship with visual content generation. Within that niche, it competes primarily with Candy AI, Crushon AI, and Kupid AI. Its conversation and roleplay capabilities sit in the middle of the pack. Its safety score sits near the bottom. Among apps we’ve reviewed, only a handful scored lower on data privacy, and none had a more incomplete privacy policy.
If roleplay immersion is your priority and you’re willing to pay for it, SoulFun delivers a product that some users genuinely enjoy over months of use. If you care about data privacy, age safety, or transparent business practices, the gaps documented here are substantial enough to consider alternatives with stronger privacy foundations.
Frequently Asked Questions
Is SoulFun AI safe to use?
SoulFun AI earned an F safety rating (18/100) in the CompanionWise Safety Index. The primary concerns are a placeholder privacy policy that discloses nothing about app data practices, no crisis response mechanisms, and NSFW content accessible without meaningful age verification. According to ScamAdviser, the website has a medium-risk trust score of 62/100.
How much does SoulFun AI cost?
SoulFun AI offers monthly subscriptions at $19.99/month or annual plans at $119.99/year ($10/month effective). Both include 1,000 monthly coins. Messages cost 1 coin each, and premium features like image generation and voice calls require additional coins. Coin packages range from $19.99 for 1,000 coins to $79.99 for 5,000 coins. A free tier exists but provides limited access with no daily coin allowance.
Does SoulFun AI have a free trial?
According to the SoulFun pricing page, a 14-day free trial is available for new users. The ongoing free tier provides limited character browsing and basic conversations but no daily coins, making sustained free use impractical since each message costs 1 coin.
Does SoulFun AI use end-to-end encryption?
Multiple third-party review sites claim SoulFun uses end-to-end encryption. However, this claim does not appear in SoulFun’s official privacy policy or terms of service. The SSL certificate is Domain Validated only. Without official documentation, the encryption claim remains unverified.
What company owns SoulFun AI?
SoulFun AI is developed by AIGC TECH, listed at 6/F Manulife Place, 348 Kwun Tong Road, Kowloon, Hong Kong. The domain WHOIS is hidden behind a privacy service in Iceland. No founders, investors, or detailed corporate registration information is publicly available.
Can minors access SoulFun AI?
SoulFun’s terms of service require users to be of “legal age” or have parental consent, but no technical age verification exists. The app markets explicit content on its homepage and offers NSFW roleplay features. Without an age-gating mechanism beyond self-declaration, minors can access the platform and its adult content without restriction.