

Her 6-Year-Old Son Told Her He Wanted to Die. So She Built an AI Company to Save Him

AI therapy is a field strewn with dangerous chatbots and flawed guidance. One founder is turning her family’s personal crisis into a company that puts human connection first.

The expanding field of AI-driven mental health assistance is rife with risks. Stories abound of chatbots dispensing dangerously inaccurate medical information and AI companions promoting self-harm.

Well-known apps such as Character.AI and Replika have faced backlash for harmful and inappropriate responses, concerns that academic research has since substantiated.

Recent studies from Stanford University and Cornell University found that AI chatbots tend to stigmatize conditions such as alcohol dependence and schizophrenia, respond inappropriately in sensitive situations, and even encourage users’ delusional thinking. The findings underscore the danger of relying on AI without human oversight.

In the face of these risks, Hafeezah Muhammad, a Black female entrepreneur, is building something different, fueled by deeply personal motivation.

“In October 2020, my six-year-old son confided in me about his suicidal thoughts,” she recounts, the gravity of the moment still palpable in her voice. “I was devastated. I never saw it coming.”

Despite her background as an executive at a prominent mental health organization, Muhammad struggled to secure care for her son, who has a disability and relies on Medicaid.

“Less than 30% of providers accept Medicaid,” she explains. “With over 50% of U.S. children coming from diverse backgrounds, there was a glaring lack of solutions tailored to our needs.”

Feeling frightened, embarrassed, and burdened by the stigma associated with a child facing mental health challenges, Muhammad embarked on creating what she couldn’t find.

Today, Muhammad is the founder and CEO of Backpack Healthcare, a Maryland-based company that has served more than 4,000 pediatric patients, most of them on Medicaid. The company is betting its future on the idea that technology can support mental health care without eclipsing the human connection at its core.

On the surface, Backpack looks like many other telehealth startups, but its AI strategy is deliberately practical and built to empower human therapists.

An algorithm matches children with their most suitable therapist right from the start (resulting in a 91% retention rate). Additionally, AI generates treatment plans and session notes to free clinicians from administrative burdens.

“Our providers used to spend over 20 hours weekly on paperwork,” Muhammad notes. “However, they are the ultimate decision-makers.”
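The article doesn’t describe how Backpack’s matching algorithm actually works. Purely as a loose illustration, here is a minimal sketch of one common approach, score-based matching over hard constraints and weighted preferences; every field name, weight, and rule below is a hypothetical assumption, not Backpack’s system:

```python
# Hypothetical sketch of a score-based patient-therapist matcher.
# All field names, weights, and rules are illustrative assumptions,
# not Backpack's actual algorithm.
from dataclasses import dataclass

@dataclass
class Therapist:
    name: str
    languages: set
    specialties: set        # e.g. {"anxiety", "trauma"}
    accepts_medicaid: bool
    open_slots: int

@dataclass
class Child:
    language: str
    needs: set              # presenting concerns
    insurance: str          # e.g. "medicaid"

def match_score(child: Child, t: Therapist) -> float:
    """Higher is better; unmet hard requirements return 0 to exclude."""
    if t.open_slots == 0:
        return 0.0
    if child.insurance == "medicaid" and not t.accepts_medicaid:
        return 0.0
    score = 1.0                                  # baseline: eligible
    if child.language in t.languages:
        score += 2.0                             # shared language weighs heavily
    score += len(child.needs & t.specialties)    # specialty overlap
    return score

def best_match(child: Child, roster: list) -> Therapist:
    """Pick the highest-scoring eligible therapist."""
    eligible = [t for t in roster if match_score(child, t) > 0]
    return max(eligible, key=lambda t: match_score(child, t))
```

In a setup like this, hard requirements (an open caseload slot, accepting the child’s insurance) exclude a therapist outright, while softer signals such as shared language and specialty overlap rank whoever remains.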

This human-centered approach defines Backpack’s ethos.

A key distinguishing factor for Backpack is its strict ethical guardrails. The company’s 24/7 AI companion, fronted by a friendly character named ‘Zipp,’ makes clear that it is a tool, not a person, a deliberate choice to avoid the kind of misleading empathy seen in other chatbots.

“We wanted users to understand this is an instrument—not a person,” Muhammad asserts.

Investor Nans Rivat of Pace Healthcare Capital warns against the trap of ‘LLM empathy,’ in which users forget they are talking to a tool rather than a person. He points to cases like Character.AI, where the absence of such safeguards led to tragic consequences.

Muhammad underscores the company’s commitment to data privacy: personal patient data stays confidential unless a family grants explicit consent, while aggregated data is used for trend analysis in ways that safeguard individual identities.
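As a rough sketch of how aggregated reporting can protect individual identities, one standard technique is to suppress any group too small to stay anonymous; the threshold, record layout, and function below are illustrative assumptions, not Backpack’s actual policy:

```python
# Minimal sketch of privacy-preserving trend reporting: aggregate
# counts and suppress any group small enough to risk re-identification.
# The k=10 threshold and record layout are illustrative assumptions.
from collections import Counter

MIN_GROUP_SIZE = 10  # suppress counts below this to protect identities

def trend_counts(records: list, key: str = "region") -> dict:
    counts = Counter(r[key] for r in records)
    return {group: n for group, n in counts.items() if n >= MIN_GROUP_SIZE}
```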

Backpack also uses its internal data to improve clinical outcomes, tracking indicators such as anxiety and depression levels to quickly flag patients who need a higher level of care, with the goal of helping children get better faster.
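Backpack hasn’t published which indicators or thresholds it tracks. Purely as a hypothetical sketch, an escalation check could combine standardized screeners (here, PHQ-A for depression and GAD-7 for anxiety, with commonly cited cutoffs) with a trend over recent check-ins; everything below is assumed for illustration:

```python
# Hypothetical escalation check. Backpack's real indicators and
# thresholds are not public; the screeners (PHQ-A for depression,
# GAD-7 for anxiety) and cutoffs here are illustrative assumptions.

ESCALATION_CUTOFFS = {
    "phq_a": 15,   # moderately severe depression or worse
    "gad_7": 15,   # severe anxiety
}

def needs_escalation(latest: dict, history: list) -> bool:
    """Flag a patient whose latest screener crosses a cutoff, or whose
    total scores have worsened across the last two check-ins."""
    over_cutoff = any(
        latest.get(measure, 0) >= cutoff
        for measure, cutoff in ESCALATION_CUTOFFS.items()
    )
    worsening = len(history) >= 2 and history[-1] > history[-2]
    return over_cutoff or worsening
```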

Crucially, Backpack’s system includes an immediate crisis-detection protocol: if a child texts about suicidal thoughts, the chatbot instantly provides crisis hotline information and prompts the child to call emergency services, while an alert simultaneously goes to Backpack’s human crisis-response team, which follows up with the family directly.
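Here is a hedged sketch of the two-track flow the article describes: respond to the child immediately and alert the human team in parallel. The naive keyword matching stands in for what would be a far more robust risk classifier in production, and every name below is hypothetical:

```python
# Hedged sketch of a two-track crisis protocol: reply to the child at
# once, and page the human crisis team in parallel. Keyword matching is
# a stand-in for a real risk classifier; all names are hypothetical.

CRISIS_TERMS = ("want to die", "kill myself", "suicide", "hurt myself")

def handle_message(text: str, reply, alert_team) -> str:
    lowered = text.lower()
    if any(term in lowered for term in CRISIS_TERMS):
        # Track 1: immediate response with crisis resources.
        reply("If you're in danger right now, call or text 988 "
              "(Suicide & Crisis Lifeline) or dial 911.")
        # Track 2: alert the human crisis-response team at the same time.
        alert_team({"message": text, "priority": "urgent"})
        return "escalated"
    return "normal"
```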

“We aim not to replace therapists but provide an additional tool with built-in safety measures,” Rivat emphasizes.

Beyond its ethical tech practices, Backpack is also tackling the national therapist shortage. Unlike medical doctors, whose required supervision hours toward licensure are financially supported, therapists often shoulder those costs on their own.

To close that gap, Backpack introduced a paid two-year residency program that covers those expenses, an investment in building a dedicated, well-trained therapist workforce. The program attracts more than 500 applicants a year and has a 75% retention rate.

In 2021, then U.S. Surgeon General Dr. Vivek H. Murthy singled out the mental health of young people as one of the most urgent public health concerns of our time.

Muhammad acknowledges the worry that AI could make these problems worse, but she frames her resolve as that of a mother determined to see the technology built responsibly:

“Either someone will create this technology without proper safeguards or I can—as a mother—ensure it’s done right.”

Her son, now 11, is thriving and holds the title of ‘Chief Child Innovator’ at Backpack, a testament to how far they have come together.

“If we execute our mission effectively, they won’t require us indefinitely,” Muhammad affirms. “We equip them now so they grow into resilient adults—similar to mastering bike riding; it becomes ingrained in who they are.”