🔮 AI Predicts Your Mortality
Plus: Chatbots pick up on your feelings | Microsoft partners with Duke Health to reshape healthcare | Differences between IoT and Edge computing | Biogen to acquire Reata Pharmaceuticals
Welcome to Healthcare AI News, your weekly dose of the latest developments and headlines in the world of Healthcare AI.
In this issue, we explore:
✅ Headlines: Can AI-powered robots do the work of a nurse?
✅ Industry: Amazon rolls out its virtual health clinics nationwide
✅ Feature: Workforce ecosystems and AI
✅ Interesting Reads: Clever trick that makes cancer cells self-destruct
✅ Tech: Why AI needs a Red team
✅ Venture Pipeline: Meta and Microsoft vets build shared brain dev. platform
🌟 Advertise With Us 🌟
Boost your brand among healthcare's most influential circles! Our diverse subscriber base boasts top executives, key decision makers, and visionary professionals from leading organizations – the ultimate platform for your brand's success. 🔥
AI transforms how doctors predict health (Read More)
A wearable could detect breast cancer earlier (Read More)
How AI inferences of race in medical images can improve or worsen healthcare disparities (Read More)
Chatbots pick up on your feelings (Read More)
AI predicts mortality based on body measurements (Read More)
Predictive or Generative AI: Which will change healthcare the most? (Read More)
Can AI-powered robots do the work of a nurse? (Read More)
AI-generated data could be a boon for healthcare - if only it seemed more real (Read More)
HEALTHCARE AI NEWS CONSULTING
📢 Seize The Opportunity 📢
We're THRILLED to announce our new venture: a specialized consulting service designed for healthcare industry leaders like you.
What is it?
Consulting Reimagined: We're launching a new consulting service that connects professionals like you with entities such as Venture Capital firms, Private Equity, and Research Institutes. These organizations seek your specialist insights for their strategic decisions.
Simply put: we bridge the gap. You engage directly with these organizations, offering your profound insights, and in return, you receive financial compensation. Our platform stands as your avenue to make an impact on industry decisions while being rewarded for your valuable contributions.
Flexible Consulting: This isn't a job; it's an opportunity. Consult on your schedule and set your rate while keeping your day job. Engagements are typically over the phone and last 30-60 minutes. There are no exclusive contracts or long-term commitments.
A Confidential Environment: We uphold your privacy. It's not about sharing proprietary information; it's about leveraging your knowledge and experience to transform industries.
A Lucrative Venture: An opportunity for above-market earnings awaits, offering an additional income stream.
Join us and turn your expertise into a catalyst for healthcare transformation. Here, you'll not only lend your voice to shape the industry but also build and enhance your personal brand as a respected thought leader. Benefit from our platform to generate additional income while you make a lasting impact on global healthcare. By joining our esteemed consulting circle, you elevate yourself into a role of influence and innovation. You're not just joining us - you're setting the stage for the future of healthcare.
We kept it easy. Simply complete this short form and we will get in touch with you.
Extreme heat threatens the health of unborn babies (Read More)
Microsoft partners with Duke Health to reshape healthcare (Read More)
AI detects diseases in MRI scans that doctors miss (Read More)
Amazon rolls out its virtual health clinics nationwide (Read More)
Should people without diabetes use glucose monitors? (Read More)
How Kaiser Permanente is using AI to meet patients' needs faster (Read More)
UMich's AI-based fraud, waste, and abuse system aims to cut costs and protect patients (Read More)
CareCloud and Google Cloud collaborate to bring AI to ambulatory healthcare settings (Read More)
AI and Crisis Hotlines
Can AI help manage a crisis? These crisis hotlines’ experiences tell a mixed story.
Today, we’re talking about an area of healthcare much of the industry isn’t very familiar with. That is, unless you’re in the mental health space or working as an emergency responder.
Crisis hotlines—including phone and chat options—are a key triage tool emergency services and mental health organizations have relied on for decades. You can even call them a stopgap in mental health care, especially as the burden of mental illness grows and the supply of available providers struggles to keep up.
But crisis lines are also underfunded and staffed almost entirely by volunteers. And even as popular as these services are today, call and chat volumes are only expected to increase.
There’s no doubt AI can help these services be more efficient. But how it’s implemented is the tricky part.
Today, we’re exploring this case study of public-facing AI implementation to learn from the successes and pitfalls two specific crisis lines experienced when they launched AI tools. Let’s dig in.
Success: Triage, response speed, improved review process
With the onset of the COVID-19 pandemic, the kinds of messages received by the National Eating Disorders Association (NEDA) chat-based Helpline started to change.
The volume of messages went up. And the types of messages received by the eating disorder-oriented crisis line more frequently escalated to topics such as suicide, self-harm, and domestic abuse. Staff and volunteer turnover increased. Wait times for responses stretched to a week or more.
So, earlier this year, NEDA launched Tessa, replacing the human staff of their Helpline with an AI chatbot meant to provide an accurate—and much more efficient—triage and resource referral system.
Similar motivations were behind crisis hotline Protocall’s partnership with Lyssn, an AI tool used for analyzing and reviewing recordings of behavioral health encounters.
Since Protocall is one of many call centers receiving calls to the national 988 Lifeline, they’re required to review a certain number of calls for quality. Lyssn helps the already strapped staff handle this review process more efficiently. Protocall also hopes to use the analytics to improve call quality and training across the board.
Unlike Tessa, Lyssn never interacts with callers directly, only doing analysis once the call is done. The technology is now also learning to give counselors feedback directly after the call, which Protocall hopes will further increase the quality and efficiency of their crisis services.
Pitfall: Quality and accuracy
As Protocall undergoes their trial period with Lyssn, the jury is still out on whether the AI tool may cause the crisis line any unexpected quality issues.
With Tessa and NEDA, the story’s a bit different.
If you’ve already heard of Tessa before reading this feature, odds are it’s because of headlines about NEDA shutting the chatbot down. This happened after reports surfaced of the chatbot giving inaccurate—and even harmful—advice.
The (understandable) resulting backlash was swift. Some of the criticism even suggested that AI should never be used to help address sensitive mental health issues like eating disorders.
And to their credit, the creators of the chatbot addressed these concerns—along with others—in an op-ed in STAT, saying that their intention was never to replace the empathetic human touch of mental health care or even crisis counselors, but to make triage and resource referral easier.
And here’s a bonus pitfall: The negative impacts on volunteer and staff morale.
NEDA launched Tessa after the increased pandemic-era demand led to a lot of staff turnover and burnout, yes. But it also occurred after the Helpline staff voted to unionize. Shortly after informing NEDA of this intent, NEDA dissolved the 20-year-old Helpline, replacing it with the AI chatbot.
As you can imagine, staff and volunteers didn’t feel great about this. This is the worst-case scenario many AI critics always warn of: ill-advised replacement of human workers where a human touch is still necessary.
Final thoughts from Healthcare AI News Team
Don’t worry—this isn’t just a bummer cautionary tale.
We do applaud both Protocall and NEDA for these intentions: using AI to make overworked health workers’ jobs easier, and giving patients easy access to valuable health information.
But there are clear lessons to be learned from how Protocall’s work with Lyssn is being praised and why NEDA is being criticized.
When it comes to AI and crisis management, the stakes are incredibly high. You need efficiency. You need accuracy. And you need empathy.
To achieve all three, staff buy-in for the use of the AI is key. And enhanced vigilance is always important when AI interacts with the public—whether that’s patients or volunteer counselors.
That’s a recipe for trust—another very tricky, but necessary aspect of successful crisis management.
Other crisis hotlines are beginning to experiment with AI as well—including the national Crisis Text Line. We’ll be looking forward to seeing how these applications evolve.
So, what are your thoughts?
Do you think AI has a place in crisis hotlines? Do you have other ideas for how AI can be used to improve this line of health-adjacent work? We want to hear from you, so slide into our replies!
Why AI needs a red team (Read More)
How generative AI impacts your digital transformation priorities (Read More)
Bad code stalls developer velocity (Read More)
Differences between IoT and Edge computing (Read More)
How generative AI code assistants could revolutionize developer experience (Read More)
A SPECIAL MESSAGE FROM OUR PETS! 🐾
TWEET OF THE WEEK
Recent MIT grad reveals a worrying trend in #AI 🖥️.
A supposed 'professional' portrait whitewashed her features 👱♀️, prompting her to question if the tech thinks she should be white to be professional.
Time to address #AIbias in tech. 🚫👩🏻➡️👩🏼 #MIT
— Healthcare AI Newsletter (@AIHealthnews)
Aug 1, 2023
What'd you think of today's Newsletter?