Woebot Review 2026

Experience Score: Poor

The Bottom Line

Woebot is no longer available to new users. The free CBT mental health chatbot shut down June 30, 2025 after eight years and 1.5 million users; existing accounts were locked out July 31, 2025 with conversation data anonymized that same day. Woebot earns CompanionWise Safety Index B/63/Yellow on its historical record: HIPAA-aligned, SOC 2 Type 2 examined with zero exceptions, FDA Breakthrough Device designation 2021, and 18 IRB-reviewed clinical trials. Experience scores 38/100 (poor) on our category-relative rubric because the rule-based decision-tree architecture intentionally constrained conversational depth and memory. Founder Alison Darcy cited FDA marketing-authorization cost and the regulatory difficulty of deploying LLM-class AI inside a regulated medical product as the reasons for the shutdown.


Safety Index Score

63 / 100
B Caution
View full Safety Index report →

Experience Score

Experience Score measures product quality based on aggregated user feedback, separate from the Safety Index.

Poor 38/100
Dimension Score
Conversation Quality 43/100
Memory & Personalization 29/100
Feature Depth 53/100
App Experience 45/100

Who It's Best For

  • Researchers studying historical regulated mental-health AI
  • Former users seeking a sourced answer about what happened
  • Healthcare-product designers studying clinical-grade chatbot architecture
  • Journalists covering AI mental-health regulatory gaps
  • B2B partners with active access codes through providers or employers

Who It's NOT For

  • New users seeking a working mental health chatbot (the service is shut down)
  • Anyone wanting open-ended generative conversation (architecture was decision-tree only)
  • Users with ADHD- or autism-coded emotional processing, per the ADDitude critique
  • People needing active in-product crisis intervention (the app directed users to external emergency resources)
  • Users wanting deep customization, persistent memory, or roleplay features

What We Like

  • SOC 2 Type 2 plus HIPAA-aligned posture (rare in category)

    According to the Woebot Health safety overview, the company underwent SOC 2 Type 2 examination with zero exceptions and adhered to HIPAA Privacy and Security Rules, including encryption-at-rest and annual third-party penetration testing.

  • Rule-based architecture eliminated the LLM hallucination class of risk

    The chatbot ran on a fixed decision tree authored by clinical psychologists. Every line the bot sent was reviewed in advance by a clinician, structurally insulating the product from generative-AI failure modes that have surfaced in newer LLM-based companion apps.

  • 18 IRB-reviewed clinical trials documented in the public record

    A Dime Society regulatory case study documented 18 separate IRB-reviewed trials with about 1,800 participants. The foundational study (Fitzpatrick, Darcy and Vierhile, JMIR Mental Health, 2017) showed a significant reduction in depressive symptoms versus an e-book control over two weeks.

  • FDA Breakthrough Device designation in May 2021

    According to a BusinessWire announcement, WB001 (Woebot Health's investigational digital therapeutic for postpartum depression) received FDA Breakthrough Device designation in May 2021. The pivotal RCT (NCT05662605) enrolled its first patient in January 2023.

  • No advertising business model, no documented breaches

    Have I Been Pwned queries returned zero breaches at woebothealth.com or the legacy woebot.io domain. Exodus Privacy found only Google Crashlytics and Firebase Analytics SDKs, both standard for crash reporting and product analytics rather than advertising.

What Could Be Better

  • Service shut down June 30, 2025 (consumer product no longer functional)

    Per a STAT News interview with founder Alison Darcy on July 2, 2025, Woebot Health ended the consumer service on June 30, 2025. Existing users were locked out July 31, 2025 with account data anonymized the same day. App store listings remain visible but installing the app produces an access-code wall with no consumer path through.

  • Passive crisis response with no active intervention

    The terms of service explicitly directed users to call 911 or local emergency services in any urgent situation and disclaimed any clinical relationship. When NLP routing detected concerning language, the app offered the option to view external resource information rather than escalating directly.

  • Decision-tree architecture felt scripted to many users

    Across 890 combined iOS and Google Play ratings, 31 reviews cite repetitive or scripted responses. The highest-signal Google Play review (73 thumbs up) summarizes the complaint: "the bot doesn't just let you talk, it usually just gives you a multiple choice selection of responses, and they are often variations of the same answer."

  • Privacy-policy and marketing-language inconsistency on advertising

    Mozilla's Privacy Not Included review flagged tension in 2023 between the company's "we never share with advertisers" pledge and other privacy-policy passages that mention marketing-partner sharing. The iOS Privacy Nutrition Label lists "Contact Info (Name)" under Third-Party Advertising and Developer's Advertising or Marketing.

  • Assumed neurotypicality per the ADDitude Magazine critique

    ADDitude Magazine's 2023 review by Elizabeth Broadbent argued that Woebot "assumes neurotypicality." The CBT decision tree was authored for a default cognitive style, not specifically for ADHD or autism-coded emotional processing, and the standard responses sometimes mismatched the user's actual emotional state.

What Is Woebot?

Woebot is no longer available to new users. The app shut down on June 30, 2025, after almost eight years as a free CBT-based mental health chatbot used by more than 1.5 million people. Existing users were locked out on July 31, 2025; account data was anonymized that same day. Both app store listings remain visible, which is why people keep finding the product and downloading it, only to hit an access-code wall with no way through. We still review Woebot because the safety record is worth preserving on the public record, and because the questions readers are asking (“Is Woebot still around? What was it? What happened to my data?”) deserve a clear, sourced answer in one place.

What happened to Woebot

Woebot Health emailed users on April 28, 2025 to announce the shutdown. The consumer service ended on June 30, 2025. The transcript-export window closed on July 15, 2025, so users who wanted a copy of their conversation history had about eleven weeks to act. Account data was anonymized on July 31, 2025. The company kept both app store listings published. The iOS binary received a final maintenance release (v6.6.1) on December 17, 2025, six months after the consumer service ended.

That last detail explains a lot of the recent reviews. The most recent Google Play review at the time of our analysis (April 26, 2026) reads: “you need an access code to start with but there’s no option for this! I’m uninstalling.” Anyone who searches for “mental health chatbot” or who sees Woebot recommended in older articles will find a polished store listing, install the app, then run into a wall with no apparent way around it. The wall is the policy, not a bug.

Founder Alison Darcy explained the shutdown in a STAT News interview on July 2, 2025. Two reasons: the cost and complexity of pursuing FDA marketing authorization for WB001 (the company’s investigational digital therapeutic for postpartum depression), and the regulatory difficulty of deploying LLM-class generative AI inside a medical product. The pre-scripted Woebot, in her words, was “an impressive chatbot before more advanced technology was available,” but it was outpaced by ChatGPT-class systems that the FDA has not yet figured out how to regulate.

What was Woebot, and why did it matter

Woebot launched in 2017, created by Alison Darcy, a Stanford clinical research psychologist. The product was a cognitive behavioral therapy chatbot offering guided self-help, not therapy. Conversations ran on a fixed decision tree authored by clinical psychologists. Users moved through the dialogue by tapping multiple-choice replies; the app routed them through CBT exercises, mood check-ins, journaling prompts, and gratitude logs. Beneath the conversational layer, natural-language processing routed user free-text into the appropriate scripted branch. There was no generative AI in production. The bot did not improvise. It did not hallucinate. It did not write its own crisis responses on the fly.
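
To make that concrete, here is a minimal sketch of how a chatbot of this kind is typically structured. The Python below is entirely our illustration, not Woebot Health's code; the node names, keywords, and reply strings are invented. What it demonstrates is structural: every reply is a pre-authored string, and free text can only select among existing branches, never produce new language.

    # Hypothetical sketch of a decision-tree chatbot turn, in the style this
    # review describes. Node names, keywords, and reply strings are our own
    # illustration, not Woebot Health's actual script.

    SCRIPT = {
        "checkin": ("How are you feeling right now?",
                    {"Anxious": "anxiety", "Low": "low_mood", "OK": "gratitude"}),
        "anxiety": ("Let's try a short reframing exercise. Ready?",
                    {"Yes": "reframe", "Not now": "checkin"}),
        "low_mood": ("Want to look at one thought together?",
                     {"Sure": "reframe", "Later": "checkin"}),
        "gratitude": ("Want to note one thing you're grateful for?",
                      {"Yes": "checkin", "Skip": "checkin"}),
        "reframe": ("What's the thought? Pick the closest pattern.",
                    {"Catastrophizing": "checkin", "Mind reading": "checkin"}),
    }

    # A keyword table stands in for the NLP routing layer: free text can only
    # select an existing pre-authored branch, never generate a new reply.
    KEYWORD_ROUTES = [
        (("anxious", "panic", "worried"), "anxiety"),
        (("sad", "down", "hopeless"), "low_mood"),
    ]

    def next_node(user_text, current):
        lowered = user_text.lower()
        for keywords, node in KEYWORD_ROUTES:
            if any(k in lowered for k in keywords):
                return node
        return current  # unrecognized free text stays on the current branch

    def bot_turn(node):
        prompt, choices = SCRIPT[node]
        return prompt, list(choices)  # every string was authored in advance

    # Free text routes into the scripted anxiety branch; the bot's reply is
    # a fixed, clinician-reviewed string, not generated text.
    print(bot_turn(next_node("I'm feeling really anxious tonight", "checkin")))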

That architectural choice is the single most important thing to understand about Woebot’s safety posture. Every line the bot ever sent to a user was reviewed in advance by a clinician. When the news headlines started arriving in 2024 about LLM-based companion bots telling users to harm themselves or roleplaying as their dead relatives, Woebot was structurally insulated from that class of failure. The trade-off was that Woebot could feel scripted, single-track, and shallow when users wanted open conversation. The 73-thumb-up Google Play critique is the canonical version of this complaint: “the bot doesn’t just let you talk, it usually just gives you a multiple choice selection of responses, and they are often variations of the same answer.”

The business model was free-at-point-of-use. Woebot Health made revenue from B2B partnerships: employers, hospitals, healthcare providers, IRB-approved academic studies. Until 2023, anyone could download the app and use it. From 2023 onward the developer gradually shifted to access-code-only distribution, where users obtained codes through participating clinicians, employers, or research programs. By 2024 the consumer download path no longer worked without a code, which is when the Google Play 1-star reviews started arriving in volume.

How safe was Woebot

Our safety review covers the as-of-retirement state, the version of the product that 1.5 million users actually experienced. CompanionWise Safety Index: Woebot B/63/Yellow. The grade reflects strong infrastructure, intentional design choices that reduced certain risk classes, and meaningful gaps in age verification, safeguards for minors, and crisis response.

The infrastructure side was unusually strong for an AI companion product. Woebot Health reported a SOC 2 Type 2 examination with zero exceptions. The privacy policy treated all user data as Protected Health Information under HIPAA, even when not legally classified as such. The company adhered to the HIPAA Privacy and Security Rules, including encryption at rest for all databases and TLS in transit. Annual third-party penetration testing was disclosed on the security overview. Exodus Privacy's tracker scan found two SDKs, Google Crashlytics and Firebase Analytics, both standard for crash reporting and product analytics. Have I Been Pwned queries returned zero breaches on woebothealth.com or the legacy woebot.io domain.
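
Readers who want to reproduce the breach check can query Have I Been Pwned directly. The sketch below assumes the public HIBP v3 "breaches" endpoint and its Domain filter parameter, which is our reading of the HIBP documentation; verify against the current API docs before relying on it.

    # Sketch of reproducing the breach check against the public HIBP v3
    # "breaches" endpoint. The Domain query parameter reflects our reading
    # of the HIBP API docs. HIBP expects a descriptive User-Agent header.
    import json
    import urllib.request

    def breaches_for(domain):
        url = f"https://haveibeenpwned.com/api/v3/breaches?Domain={domain}"
        req = urllib.request.Request(url, headers={"User-Agent": "breach-check-example"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    for d in ("woebothealth.com", "woebot.io"):
        print(d, len(breaches_for(d)))  # both returned zero at review time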

The SOC 2 Type 2 plus HIPAA combination is rare in the AI companion category and tells you something specific. According to the Woebot Health safety overview, the company underwent SOC 2 Type 2 examination with zero exceptions, adhered to HIPAA Privacy and Security Rules, ran annual third-party penetration testing, and held ISO 13485:2016 certification for medical-device quality management. The HIPAA Security Rule is a federal compliance baseline maintained by the Department of Health and Human Services; SOC 2 Type 2 is an independent attestation framework run by the AICPA that examines how a service provider operationalizes security controls over a multi-month observation window. Together they mean a third-party auditor watched Woebot’s actual security operations, not just its written policies, and found no exceptions. Most apps in the safety registry hold neither attestation. Woebot held both, plus an FDA Breakthrough Device designation, plus 18 IRB-reviewed clinical trials. That triad is the historical record we are preserving on this page.

The clinical-trial footprint was real and substantial. A Dime Society regulatory case study documented 18 separate trials with about 1,800 participants, all IRB-reviewed, with protocols registered on ClinicalTrials.gov. The foundational study (Fitzpatrick, Darcy and Vierhile, JMIR Mental Health, 2017) showed a significant reduction in depressive symptoms versus an e-book control over two weeks. WB001, the postpartum depression digital therapeutic, received FDA Breakthrough Device designation in May 2021. The pivotal RCT (NCT05662605) enrolled its first patient in January 2023.

Where the safety score lost points was a different category of risk. Crisis response was passive: when the app detected concerning language through its NLP routing, it offered the option to view information about external resources rather than actively intervening. The terms of service explicitly directed users to call 911 in an emergency and disclaimed any clinical relationship. Age verification was self-attestation only. The app was rated 12+ on iOS and Teen on Google Play, with parental consent required for users between 13 and the local age of majority, but there was no in-product parental dashboard or screen-time gate. Mozilla Privacy Not Included flagged a tension in 2023 between the company’s “we never share with advertisers” pledge and other privacy-policy passages mentioning marketing-partner sharing. The iOS Privacy Nutrition Label includes “Contact Info (Name)” under Third-Party Advertising and Developer’s Advertising or Marketing. That probably reflects first-party marketing, but the language inconsistency is real and we held the score back for it.
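
The passive pattern is easy to see in code. Here is a hedged sketch of the general design; the term list and resource text are our own illustration, not Woebot's actual detection logic or copy. The structural point is that detection gates an offer of information, and nothing in the flow acts on the user's behalf.

    # Sketch of the "passive" crisis-response pattern described above.
    # Term list and resource text are illustrative, not Woebot's.
    CONCERNING_TERMS = ("hurt myself", "end it all", "can't go on")

    RESOURCE_OFFER = ("It sounds like you may be going through something serious. "
                      "Would you like to see crisis resources, such as the 988 "
                      "Suicide and Crisis Lifeline in the US?")

    def crisis_check(user_text):
        """Return a resource offer if concerning language is detected, else None."""
        if any(term in user_text.lower() for term in CONCERNING_TERMS):
            # Passive design: surface information and leave the next step to
            # the user. An active design would escalate (e.g., notify a human
            # responder); this product did not.
            return RESOURCE_OFFER
        return None  # fall through to the normal scripted flow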

What did Woebot feel like to use

The aggregate review picture is bimodal. Across 890 combined ratings (500 iOS plus 390 Google Play), 58% are 5-star and 22.5% are 1-star, with a thin middle. The 5-star reviews concentrate in pre-2024 historical CBT-loved-it testimonials. The 1-star reviews concentrate in post-2023 access-code complaints and post-shutdown “I can’t sign up” frustration.

The positive themes were specific. 74 mentions of CBT skill acquisition, where users credited Woebot with helping them internalize techniques like cognitive reframing they now use without the app. 69 mentions of help with anxiety, depression, or panic, often framed as a stop-gap for users on therapy waitlists or between sessions. 68 mentions of the bot’s “companion” or “friend” feel, with the cute, quirky, non-judgmental tone praised as a differentiator. 40 mentions of “free with no ads, no premium gating,” now historical but a real selling point while the product was active. 23 mentions of 24/7 availability, often citing 3am panic-attack moments.

The negative themes were equally specific. 60+ mentions of access-code gating or “I cannot sign up” since 2024. 31 mentions of repetitive, scripted, or robot-like responses. 20 mentions of shallow or generic answers, especially for users with severe symptoms or complex life situations. 9 mentions of crashes or sign-in failures on iOS, mostly clustered in the late maintenance releases. The most thoughtful critique came from a 2023 ADDitude Magazine review by Elizabeth Broadbent, who pointed out that “Woebot assumes neurotypicality.” The CBT decision tree was authored for a default brain, not for ADHD or autism-coded emotional processing. When she told the bot she was mad, Woebot suggested she imagine her emotions “had a voice.” She was, she wrote, in “total rage mode.” The mismatch is real, and Woebot never claimed to fix it.

The honest reading of the user feedback is this. Woebot was a CBT skill builder, not a conversation partner. People who came to it expecting therapy or open-ended dialogue were disappointed. People who came to it expecting structured, evidence-based CBT exercises and a friendly check-in cadence generally got value. The Reddit r/EOOD comment from 2023 captures the typical satisfied user: “It’s a good anonymous way to work through mental/behavior issues in the moment they occur and apply CBT thinking to them. The daily consistency reminder was also very helpful.”

Our experience score is 38/100 (poor), which reflects the category-relative experience inside the AI companion space. Short-term memory was constrained by the decision-tree architecture, long-term memory effectively did not exist between sessions, and onboarding broke for any user without an access code. The score is not a verdict on whether the product worked for its actual purpose. For its purpose (guided CBT skill practice, not companionship), Woebot worked well for many users. It just was not built to score well on the dimensions our category-relative rubric measures.

Why did Woebot shut down

The reasons matter for the broader category. Speaking with STAT News on July 2, 2025, founder Alison Darcy gave two intertwined reasons. First, the cost of fulfilling FDA marketing-authorization requirements for WB001 as a regulated postpartum depression digital therapeutic had grown beyond what the company could sustain. Pursuing FDA clearance for a digital therapeutic costs on the order of mid-eight figures over multi-year timelines; the Breakthrough Device designation accelerates review but does not lower the cost of running the pivotal trials. Second, Woebot Health wanted to deploy LLM-class generative AI in product, and the FDA has not yet established a regulatory framework for evaluating LLM-driven therapeutic claims. The pre-scripted Woebot, Darcy said, “was an impressive chatbot before more advanced technology was available.” She did not want to keep operating a product she described as outpaced by ChatGPT-class systems while she could not legally ship the next-generation version inside a regulated digital therapeutic. The company is now a private-practice consultancy, not a consumer-product business.

This is the third or fourth significant exit in the regulated-mental-health-AI space in the last 24 months. The category implication is unsettling: companies that did things the slow way (IRB trials, FDA clearance pathways, HIPAA-grade infrastructure, no LLM hallucinations) got outcompeted on user expectations by unregulated LLM-based companions, and could not legally pivot to LLMs themselves under current FDA rules. That is the regulatory gap Darcy was pointing at. Whether the gap closes, and how, will shape what kinds of mental-health AI exist five years from now.

What should former users know about their data

Account data was anonymized on July 31, 2025. Personal identifiers tied to conversation transcripts were severed on that date. The transcript-export window closed earlier, on July 15, 2025; if you did not pull a copy of your conversations before then, you cannot retrieve them now. Per the privacy policy, any retained de-identified records are held for legitimate research and operational purposes consistent with the IRB-approved studies in which Woebot participated. The HIPAA-aligned rights infrastructure (access, rectification, erasure, portability) was preserved through shutdown and is documented in the privacy policy and FAQ.

If you held a Woebot account through an employer or healthcare provider partnership, the partner program may have a separate retention policy for the de-identified data Woebot shared with them. Contact your provider directly about that. The “we never sell to advertisers” pledge was binding and survives the shutdown.

What do people use instead of Woebot

Readers landing on this page often ask which mental health chatbot to use now. We do not name a single replacement, because Woebot’s specific posture (CBT-only, rule-based, free, B2B-distributed, FDA-engaged) is rare. The closest current analogue is Wysa, which is also evidence-based, also B2B-distributed for employers and clinicians, with NHS partnerships in the UK. Replika is more conversational but has neither clinical posture nor HIPAA infrastructure, and our safety review surfaced concerns we did not see at Woebot. Character.AI is recreational and not therapy-oriented. The CompanionWise Safety Index pages let you compare the current options on the same evidence-based 23-dimension framework we used here.

If you are in active mental-health distress, the products listed above are not crisis services. Call or text 988 in the United States (Suicide and Crisis Lifeline) or your local crisis line. Woebot’s own terms of service made the same point in capital letters; we will too.

Frequently asked questions

Is Woebot still available?

No. According to the Woebot Health FAQ, the consumer service ended on June 30, 2025. Existing users were locked out on July 31, 2025, and account data was anonymized that same day. Both app store listings remain visible because the developer kept the binary published, but new users cannot create accounts.

Why did Woebot shut down?

Founder Alison Darcy told STAT News that the company exited the consumer market because of the cost of FDA marketing authorization for WB001, its postpartum depression digital therapeutic, and the regulatory uncertainty around deploying LLM-class generative AI inside a regulated medical product.

Was Woebot HIPAA compliant?

Yes. According to the Woebot Health safety overview, the company adhered to the HIPAA Privacy and Security Rules and treated all user data as Protected Health Information, even when not legally classified as such. The company also held SOC 2 Type 2 with zero exceptions and ISO 13485:2016 quality management certification.

Was Woebot a real AI?

Woebot used natural language processing to route user free-text into pre-scripted CBT decision-tree branches authored by clinical psychologists. Per STAT News reporting on the company’s history, Woebot was rule-based throughout its production lifetime. It did not use generative AI, which is one reason no LLM-style hallucinations were ever inflicted on users.

What happened to my Woebot data?

According to the Woebot Health FAQ, account data was anonymized on July 31, 2025. The transcript export window closed on July 15, 2025. If you did not pull a copy of your conversations before that date, you can no longer retrieve them. Any retained de-identified data is held under the privacy policy.

Can I get a Woebot alternative?

Wysa is the closest current analogue, with similar evidence-based CBT and DBT exercises, B2B distribution to employers and clinicians, and NHS partnerships. Replika and Character.AI sit in different categories: more conversational, no clinical posture. Compare current options on the CompanionWise Safety Index using the same evidence-based framework we applied to Woebot.

Did Woebot work for depression?

The 2017 foundational randomized controlled trial (Fitzpatrick, Darcy and Vierhile, JMIR Mental Health) showed a significant reduction in depressive symptoms versus an e-book control over two weeks. The Dime Society case study documented 18 IRB-reviewed trials with about 1,800 participants. Results varied; Woebot was guided self-help, not a substitute for therapy.
