Welcome to Healthcare AI News, your weekly dose of the latest developments and headlines in the world of Healthcare AI.
In this issue, we explore:
✅ Headlines: Microsoft launches new data & AI tools for healthcare insights
✅ Industry: White House wants to deploy advanced AI for healthcare
✅ Interesting Reads: Scientist finds radioactive tracers in food, wins Nobel
✅ Tech: MongoDB boosts app development for healthcare and insurance
✅ Feature: Mental Health Care and AI
📢 Exciting news: based on your feedback, we're increasing our newsletter frequency to Tuesdays and Thursdays starting next week! 📢
Stanford AI in Healthcare Specialization
Imagine being able to analyze data on patient visits to the clinic, medications prescribed, lab tests, and procedures performed, as well as data from outside the health system -- social media activity, credit card purchases, census records, and Internet search logs that contain valuable health information -- and you'll get a sense of how AI could transform patient care and diagnosis.
In this specialization, we'll discuss the current and future applications of AI in healthcare with the goal of learning to bring AI technologies into the clinic safely and ethically.
This specialization is designed for both healthcare providers and computer science professionals, offering insights to facilitate collaboration between the disciplines.
The Stanford University School of Medicine is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians.
View the full CME accreditation information on the individual course FAQ page.
Mental Health Care and AI
Reimagining Mental Health Care: Exploring the future of AI-driven solutions
It’s time for your online psychotherapy appointment.
You log in and type into the chatbox. Your therapist asks how you’ve been. You describe your week in detail. Your therapist responds and asks how these events made you feel.
But it’s not a person on the other side. It’s generative AI.
This is the (fictional) future of AI-enabled mental health care imagined, and warned against, by AI pioneers decades ago. And it's still what many people think of first when they think of AI being used in mental health care today.
But that does not need to be the case. There are many other ways we can—and already do—use AI to address mental health. Without replacing human providers.
Today, we’re exploring two use cases of AI in mental health care we’re particularly excited about. And the risks we need to keep in mind.
Affective Computing AI for mental health diagnostics
One of the trickiest aspects of mental health monitoring and intervention is picking up on signs of mental illness or distress before it escalates.
Pattern recognition is a huge part of that, but mental health screening tools are still too generic to be meaningful for many patients. AI-enabled mental health diagnostic tools bring personalization to mental health symptom detection.
Mental health startup CompanionMx is a great example. Their platform uses voice analysis to screen for depression and anxiety, drawing on an audio diary users record over time. Along with giving users an outlet to describe their experiences, the diary doubles as a record of how a user's mental health fluctuates.
Another tool we’re excited about comes from AI-enabled video game startup thymia. Thymia’s games aren’t just for fun—they’re an AI-based diagnostic assessment used to screen for mental illness before a patient sees a clinician. As a patient-user interacts with the program, their voice, movement, and behavioral data are processed by the AI to quantify mental health.
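To make the idea concrete, here is a very loose sketch of the kind of pipeline such tools build on: extract acoustic features from a recording, then map them to a risk score. The features, weights, and thresholds below are entirely made up for illustration; real products like these learn their models from clinical data.

```python
import numpy as np

def extract_voice_features(samples: np.ndarray, rate: int) -> dict:
    """Compute toy acoustic features from a mono audio signal."""
    frame = rate // 50                        # 20 ms frames
    n = len(samples) // frame
    frames = samples[: n * frame].reshape(n, frame).astype(float)
    rms = np.sqrt((frames ** 2).mean(axis=1))  # per-frame loudness
    return {
        "rms_var": float(rms.var()),           # flat, monotone speech -> low variance
        "pause_ratio": float((rms < 0.1 * rms.max()).mean()),  # near-silent frames
    }

def screening_score(features: dict) -> float:
    """Map features to a 0-1 score with made-up logistic weights."""
    z = 2.0 * features["pause_ratio"] - 5.0 * features["rms_var"] - 1.0
    return float(1.0 / (1.0 + np.exp(-z)))

# Demo on a synthetic "recording": 0.6 s of tone, then 0.4 s of silence.
rate = 16000
t = np.linspace(0, 1, rate, endpoint=False)
signal = np.sin(2 * np.pi * 220 * t) * (t < 0.6)
feats = extract_voice_features(signal, rate)
score = screening_score(feats)
```

The personalization these startups describe comes from tracking such features for one user over many recordings, so the baseline being compared against is the user's own, not a generic population norm.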
One challenge behind these tools, as with many healthcare AI models, is data quality. As more of these models are built and used by real patients, their accuracy and sophistication should improve, eventually allowing them to serve as data sources and research tools for mental health in their own right.
Generative AI for mental health management
Amid shortages of mental health providers and long wait times for crisis resources, people are turning to self-guided apps to help manage mental health issues such as anxiety or even panic attacks.
A good example is Youper, a mental health chatbot. While it comes closer to the "AI-as-therapist" model than the other tools mentioned here, the point of Youper is not to replace traditional psychotherapy but to give patient users a space to reflect and ground themselves in a conversational format.
Many of these tools are becoming so popular that crisis hotlines sometimes refer callers looking for self-help resources to them as a stopgap measure. When getting in to see a mental health professional can take weeks of phone calls and waiting lists, it's easy to see how these tools can help in the meantime.
And given that these hotlines are increasingly using AI for triage and resource referral, we wouldn't be surprised if crisis hotline AI were soon linking directly to these resources as well, if it isn't already.
Combining mental health with AI: Understanding the risks
We’ll say it again. Using predictive analytics to personalize user experience for a mental health journaling app is not the same as replacing human mental health providers with generative AI.
Setting aside the question of whether AI can recreate the empathy and therapeutic relationship that form the foundation of traditional psychotherapy, there are other big concerns.
The biggest are generative AI's current issues with bias and accuracy. Medical AI of any kind tends to reproduce biases present in the data it's trained on, hence the need for data diversity. And when a generative AI chatbot carries those errors and biases into its recommendations, the results can be deeply misleading and even dangerous.
As we covered (read: insisted) in our feature about AI and rural healthcare, we must focus our innovative attention on areas of healthcare with the highest need.
With the current skyrocketing demand for mental health providers, this is definitely one of them.
One of the highest-need areas in healthcare presents a wonderful opportunity for innovative intervention. But it also demands particular care in how we deploy new technologies.
So, now it’s your turn to tell us.
Do you think we’re on the road to seeing AI-as-therapist solutions soon? And what other AI-enabled mental health solutions should we be watching out for?