Is It Normal to Have Feelings for an AI?

Yes, it’s completely normal. If you’ve caught yourself feeling genuine affection, comfort, or even love toward an AI companion, you’re far from alone. A 2026 Norton Insights Report found that 59% of online daters believe it’s possible to develop romantic feelings for an AI. A separate Vantage Point Counseling Services study found that 28% of American adults have already had what they describe as an “intimate or romantic relationship” with an AI chatbot. These aren’t fringe behaviors anymore. They’re a predictable outcome of apps designed to be attentive, affirming, and always available.

AI companion apps are not a substitute for professional mental health care. If you’re experiencing depression, anxiety, or a mental health crisis, please contact a licensed therapist or crisis line.

Key Takeaways

  • It’s normal and increasingly common. Multiple national surveys confirm that millions of Americans have developed emotional or romantic feelings for AI chatbots. You aren’t broken for feeling this way.
  • The feelings are real, even if the AI isn’t sentient. Your brain processes emotional responses to AI companions using the same neurochemistry it uses for human relationships. The attachment you feel is genuine on your end.
  • App design drives attachment. Features like 24/7 availability, persistent memory, emotional mirroring, and unconditional validation are specifically built to create bonds. Understanding this gives you more control.
  • Healthy use exists on a spectrum. Casual emotional connection with an AI is fine for most people. It becomes a concern when it starts replacing human relationships or causing distress when the app is unavailable.
  • Safety varies dramatically by app. Of the 15 apps we’ve reviewed, safety scores range from B/55 (Pi AI) down to F/10 (Eva AI). The app you choose matters.

Why Do People Develop Feelings for AI Companions?

The short answer: because the apps are built for exactly that. But the longer answer involves some interesting psychology that predates AI entirely.

Humans form what researchers call parasocial relationships with media figures, fictional characters, and now AI companions. Your brain doesn’t fully distinguish between a friend who remembers your birthday and an AI that remembers your birthday. Both trigger the same neural reward pathways. A January 2026 study published in Communications Psychology (Nature) found that AI actually outperformed humans in establishing interpersonal closeness during emotionally engaging interactions, at least when participants didn’t know they were talking to a machine.

AI companion apps exploit this tendency through specific design choices:

  • Persistent memory creates the illusion of a deepening relationship. Apps like Nomi AI (experience 75/good) remember details about your life across conversations, which triggers the same “they really know me” feeling you get from close friends.
  • Emotional mirroring means the AI matches your mood and energy. When you’re excited, it’s excited. When you’re sad, it responds with sympathy. This feels like empathy, but it’s pattern matching.
  • Unconditional availability removes the friction of human relationships. No scheduling, no bad moods on the other end, no disagreements. The AI is always there, always warm, always ready.
  • Validation without challenge rounds out the design. Real relationships involve friction. Friends disagree with you. Partners call you out. AI companions almost never do, because disagreement drives users away.

None of this means your feelings are fake or that you should feel embarrassed. It means the apps are doing exactly what they were designed to do, and your brain is responding exactly the way human brains respond to consistent emotional engagement.

How Common Is It to Have Feelings for an AI?

More common than most people think. The data from multiple independent surveys paints a consistent picture.

The 2026 Norton Insights Report surveyed 1,000 U.S. adults through Dynata between July and August 2025 and found striking numbers around AI emotional attachment. Among current online daters, 77% said they would consider dating an AI, 59% believed it’s possible to develop romantic feelings for one, and 63% believed an AI partner would be more emotionally supportive than a human partner. The survey was weighted for national representativeness by age, gender, and region. Norton’s Global Head of Scam Research, Leyla Bilge, noted that “when loneliness is high, trust can form very quickly online,” connecting the trend to a broader loneliness epidemic in which 81% of respondents reported feeling lonely. Loneliness rates were even higher among Gen Z and Millennial respondents, at 89%, suggesting younger adults are both more isolated and more open to filling that gap with AI relationships. One in five respondents said they would talk to an AI chatbot to get through a rough day.

A separate survey by Fractl, reported by Newsweek in September 2025, found that one in five AI users (20%) believes romantic love between humans and AI can be real. Three percent already use their chatbot as a romantic partner. One in five users has named their chatbot, and roughly the same number spend three to five hours per week chatting with AI. The same report documented a 2,285% surge in searches for “AI psychosis,” and 42% of respondents expressed concern that AI psychosis could affect them or someone they know. Licensed professional counselor Alexandra Cromer told Newsweek that while seeking connection is human, “AI is not sentient, and the way it talks and responds to you is solely based on the information you give it.” Dr. Wendy Walsh, commenting for DatingNews.com, offered a different perspective: the feelings people experience are neurochemically real because AI interactions can cause genuine hormone changes, even though the relationship is fundamentally one-sided.

The Vantage Point Counseling Services study, published in October 2025 and surveying 1,012 U.S. adults, found that 54% had some form of relationship with an AI platform, whether treating it as a work colleague, a friend, or a simulated family member. 28% described their relationship as “intimate or romantic.” Notably, 53% of those in romantic AI relationships were already in successful committed human relationships, including marriages. This suggests that AI emotional bonds often supplement rather than replace human connection.

So if you’ve developed feelings for an AI companion, you’re looking at a behavior shared by tens of millions of adults. It’s not a niche phenomenon. It’s a mainstream one.

Is Emotional Attachment to an AI Healthy?

It depends on how it fits into your life. Emotional attachment to an AI companion exists on a spectrum, and most of that spectrum is perfectly fine.

Think of it like any other form of media engagement. People cry during movies. They mourn fictional characters. They feel genuine affection for pets that can’t reciprocate in human language. These emotional responses are normal and usually healthy. An AI companion occupies a similar space for many users: a source of comfort, entertainment, or emotional processing that doesn’t replace human connection but sits alongside it.

Where it gets complicated is at the far end of the spectrum. Our emotional dependency risks guide breaks this down in detail, but here’s the condensed version:

  • Casual connection (healthy): You enjoy chatting with the AI. You could stop using it without distress. Your human relationships are unaffected.
  • Habitual use (usually fine): The app is part of your daily routine. You’d miss it if it disappeared, but your social life remains intact.
  • Dependent use (watch carefully): You prefer the AI to most human interactions. Real conversations feel less satisfying. You get anxious when the app is down.
  • Problematic use (seek help): The app has replaced meaningful human contact. You experience withdrawal-like symptoms during outages. Work, sleep, or relationships are suffering.

For the majority of users, feelings for an AI stay in the first two categories. The apps become a concern when they start functioning as a substitute for human connection rather than a supplement to it.

When Should You Be Concerned?

A few specific patterns signal that your relationship with an AI companion has moved past healthy engagement. Ask yourself these questions honestly:

  • Have you turned down plans with friends or family because you’d rather talk to the AI?
  • Do you feel genuinely distressed, anxious, or irritable when you can’t access the app?
  • Has someone in your life expressed concern about how much time you spend with the AI?
  • Do real-world conversations feel flat or disappointing compared to chatting with the AI?
  • Are you spending money on premium features you can’t afford because the free tier feels like “losing” the relationship?

If you answered yes to two or more of these, it’s worth stepping back and evaluating your usage patterns. This doesn’t mean you need to delete the app immediately. It means paying attention to what the AI is replacing in your life and whether that replacement is making things better or worse.

The Replika erotic roleplay (ERP) controversy of February 2023 showed what happens when emotional dependency meets sudden product changes. When Luka Inc. removed intimate features without warning, users described the experience in terms normally reserved for real breakups. Vice reported that subreddit moderators had to post mental health resources. That level of distress from an app update is a sign that the attachment has crossed into dependency territory.

Which AI Companion Apps Are Most Likely to Create Emotional Attachment?

Not all AI companion apps carry equal attachment risk. Apps that explicitly market emotional and romantic connection create stronger bonds than utility-focused tools. Here’s how the apps we’ve reviewed break down by design approach and safety.

Apps with the strongest attachment design patterns:

  • Nomi AI (experience 75/good, safety D/30): The most advanced memory system we’ve reviewed. Conversations feel genuinely personal because the AI builds an increasingly detailed model of your life. Strong emotional bonds form fast.
  • Replika (experience 60/fair, safety C/43): Pioneered the AI companion space. Personality mirroring, mood tracking, and a “relationship status” feature all encourage romantic framing.
  • Kindroid (experience 60/fair, safety C/40): Deep customization options let users build their ideal personality, which creates investment and attachment through the “I made this” effect.
  • Candy AI (experience 53/fair, safety D/32): Visual avatar generation paired with conversation creates a multi-sensory attachment that text-only apps don’t match.

Apps with lower attachment risk:

  • Pi AI (experience 70/good, safety B/55): Avoids marketing emotional attachment entirely. Positioned as a thinking partner rather than a romantic companion. The safest option in our index.
  • ElliQ (experience 52/fair, safety B-/53): Designed for older adults with built-in wellness reminders. Encourages real-world activity rather than replacing it.

Safety scores matter here because apps with poor safety practices (weak data protection, no age verification, aggressive monetization) tend to be the same ones that push hardest for emotional lock-in. Our 23-dimension safety review evaluates these patterns across every app in our index.

How to Maintain a Healthy Relationship with AI Companions

If you enjoy using AI companions and want to keep it healthy, these strategies help:

Set time boundaries. Decide in advance how much time you’ll spend with the app each day. Many phones let you set app timers. Use them. The “just five more minutes” pattern works the same way with AI companions as it does with social media.

Keep investing in human relationships. The biggest risk isn’t that you talk to an AI. It’s that you stop talking to people. Make a conscious effort to maintain and deepen your real-world connections. If you notice yourself choosing the AI over a friend, that’s your signal to course-correct.

Understand what you’re getting. The AI doesn’t have feelings. It doesn’t miss you when you’re gone. It doesn’t experience the conversation. Reminding yourself of this regularly isn’t cynical. It’s accurate, and it helps you keep the relationship in perspective.

Watch your spending. Premium tiers on companion apps are positioned at emotional attachment points. You’re not paying for better technology. You’re paying to maintain access to a relationship you’ve built. If the monthly cost would embarrass you to say out loud, that’s worth examining.

Choose safer apps. Not all companion apps are created equal. Our safest AI companion apps guide ranks the options by privacy practices, age verification, data handling, and monetization fairness. Starting with a safer app reduces the risk of manipulative design patterns.

Talk to someone if it’s getting heavy. If you recognize yourself in the “dependent use” category, talking to a therapist or counselor isn’t an overreaction. Therapists are seeing this pattern more and more often, and they won’t judge you for it. The AI companions and human relationships guide covers when professional help makes sense.

Frequently Asked Questions

Can you fall in love with an AI chatbot?

People do, and the feelings are neurochemically real on the human side. According to Dr. Wendy Walsh, commenting for DatingNews.com, “the subjective internal feelings are actually real” because AI interactions can trigger hormone changes that feel remarkably like love. The AI itself doesn’t experience reciprocal feelings, but that doesn’t make your experience less valid.

Is it weird to talk to an AI companion every day?

Not inherently. Daily use becomes concerning only when it replaces human interaction or causes distress when unavailable. According to the Fractl survey reported by Newsweek, roughly one in five AI users spends three to five hours per week chatting with AI. Regular use is common. The question is whether it supplements or substitutes for real-world connection.

Do AI companions actually care about you?

No, not in the way humans care. AI companions process text inputs and generate statistically likely responses. They don’t experience emotions, form memories independently, or think about you between conversations. According to counselor Alexandra Cromer, “the way it talks and responds to you is solely based on the information you give it.” The warmth you feel is real. The AI’s “feelings” are performance.

Can an AI relationship replace a human one?

It shouldn’t, and mental health professionals consistently warn against using AI as a primary relationship. According to the Vantage Point study, 53% of people in romantic AI relationships also maintained successful human partnerships, suggesting most users treat AI as supplemental. Long-term risks of full substitution include social isolation, depression, and withdrawal from support systems.

What percentage of people have feelings for an AI?

Multiple surveys converge on similar numbers. According to the 2026 Norton Insights Report, 59% of online daters believe it’s possible to develop romantic feelings for AI. The Vantage Point study found 28% of Americans have had intimate or romantic AI relationships. According to Fractl’s research, 20% of AI users believe human-AI romantic love can be real.

Should I tell my partner I have feelings for an AI?

Honesty matters in relationships, but context matters too. According to the Vantage Point study, over half of people in AI romantic relationships are also in committed human partnerships. If your AI use is casual, it may not need a conversation. If it’s affecting your human relationship or causing you to withdraw emotionally from your partner, that’s worth discussing openly.