Primary care physicians play a critical role in identifying patients at risk for serious mental health issues, including suicidality. But ever-increasing demands on their clinical time can hinder their ability to identify emotional distress in time to intervene. Can artificial intelligence (AI) help?
Medscape spoke with Tom Zaubler, MD, a psychiatrist and chief medical officer of NeuroFlow, about how AI can improve the ability of primary care physicians and other clinicians to screen their patients for suicidal ideation and boost rates of treatment for mental health issues in their patients. This interview has been edited for clarity and length.
How can AI help in suicide prevention and mental health screening in primary care?
Recent studies have demonstrated the potential of AI in mental health screening and suicide prevention. One method is natural language processing (NLP), which can analyze patients' journal entries for signs of suicidal thoughts or behaviors. This technology has shown promise in detecting suicidal ideation in patients who may not report such thoughts on traditional screening tools like the Patient Health Questionnaire-9 (PHQ-9). AI can be part of an integrated approach to identifying and supporting individuals at risk for suicide, including those with no psychiatric history who may nonetheless be at risk. A recent study by [Maria] Oquendo and colleagues found that one fifth of patients who attempt suicide do not meet the criteria for a mental health disorder.
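To make the idea concrete, here is a deliberately oversimplified sketch of journal-entry screening in Python. The phrase list and function name are illustrative assumptions, not any vendor's lexicon; production systems such as NeuroFlow's rely on trained, clinically validated NLP models rather than hand-written keyword lists.

```python
import re

# Illustrative phrase list only (an assumption for this sketch);
# deployed systems use trained, clinically validated NLP models.
RISK_PATTERNS = [
    r"\bkill myself\b",
    r"\bend my life\b",
    r"\b(?:want|wish) to die\b",
    r"\bno reason to (?:live|go on)\b",
]

def screen_journal_entry(text: str) -> bool:
    """Return True if an entry contains language suggestive of suicidal ideation."""
    lowered = text.lower()
    return any(re.search(pattern, lowered) for pattern in RISK_PATTERNS)

# An entry like this could score low on the PHQ-9 yet still warrant review.
entry = "Slept fine, ate fine. Some days I just wish to die."
if screen_journal_entry(entry):
    print("Flag entry for clinician review")
```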
Improved screening is obviously important, but in some ways it's not the most important part of the problem. Limited access to specialized mental health care is a critical obstacle to treating patients with acute psychiatric needs.
How can primary care doctors effectively connect patients with mental health support, given the scarcity of mental health professionals?
Primary care doctors can leverage technology to extend mental health support. This includes using platforms for safety screening and giving patients immediate access to local and national resources and digital interventions. Alerts can be sent to professionals within the practice, or to staff employed by technology companies, who can offer immediate support, including suicide safety planning and counseling. Users can hit a button to "Find a Therapist." And if a user acknowledges feelings of self-harm, NLP within the app detects that language and "urgent alerts" are sent to the clinicians overseeing the patient's care. If someone is flagged, a social worker or member of a response services team intervenes and calls the person at risk to tailor care. These interventions do not always require a psychiatrist or master's-prepared clinician; they can be managed effectively by trained paraprofessionals, who can provide suicide safety planning and lethal-means-restriction counseling and can assess the need for escalation of care.
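A rough sketch of how such an alert might be routed follows. Every name here (`UrgentAlert`, `route_alert`, the print-based notification) is a hypothetical stand-in, not NeuroFlow's actual implementation; the point is only that the first responder need not be a psychiatrist, with escalation handled as a separate decision after outreach.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class UrgentAlert:
    patient_id: str      # hypothetical identifier
    trigger_text: str    # the flagged journal excerpt
    created_at: datetime

def notify_response_team(alert: UrgentAlert) -> None:
    # Stand-in for a real paging/notification integration.
    print(f"[URGENT] patient {alert.patient_id}: {alert.trigger_text!r} "
          f"-> response team outreach at {alert.created_at:%H:%M} UTC")

def route_alert(alert: UrgentAlert) -> None:
    """Initial outreach goes to trained paraprofessionals (safety planning,
    lethal-means-restriction counseling); clinicians are looped in only
    if the responder judges that care needs to be escalated."""
    notify_response_team(alert)

route_alert(UrgentAlert("pt-001", "some days I just wish to die",
                        datetime.now(timezone.utc)))
```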
How is technology likely to manifest in physician practices in the near future to support mental health care?
Automated screening platforms for depression and anxiety, alerts for physicians when patients screen positive, and integration with collaborative care models are a few of the ways technology will become part of clinical practice. Additionally, advanced data analytics and predictive modeling using electronic health records and claims data will help identify high-risk patients. Technologies like voice recognition and machine learning can analyze patient journals and, possibly in the future, social media feeds to detect mental health issues. These technologies aim to extend and augment the capabilities of healthcare practices, improving the identification and management of patients at risk for mental health issues.
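For a sense of what predictive modeling on structured records can look like, here is a minimal sketch using synthetic data and scikit-learn. The features, labels, and example patient are invented for illustration and carry no clinical validity.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-ins for features derived from EHR/claims data
# (e.g., prior ED visits, PHQ-9 scores, missed appointments).
rng = np.random.default_rng(seed=0)
X = rng.random((200, 3))
y = (X[:, 0] + X[:, 1] > 1.2).astype(int)  # toy label, not clinical truth

model = LogisticRegression().fit(X, y)

# Score a new (synthetic) patient; in practice a high score would prompt
# human outreach, not an automated diagnosis.
risk = model.predict_proba([[0.9, 0.8, 0.1]])[0, 1]
print(f"Estimated risk score: {risk:.2f}")
```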
Are these technologies as effective in pediatric populations, and are there any specific challenges?
Technologies for mental health screening and support are effective in pediatric populations, with certain age-specific considerations and legal restrictions on technology use. For adolescents and older children comfortable with technology, digital tools can significantly impact mental health care. For younger children, technology must facilitate information-gathering from various sources, including parents and teachers. Despite challenges, technology is crucial for early identification and intervention in pediatric mental health, potentially shortening the time to diagnosis and improving outcomes.
The statistics are horrifying. One third of adolescent girls have seriously considered suicide in the past year; 13% have attempted it. So there's a need in the adolescent population, and in the preadolescent population too, because there's an 8- to 10-year lag between the onset of symptoms and the diagnosis of mental illness. If we can shorten that lag, we see improved performance in schools, decreased truancy, greater economic achievement, and so on. It makes such a profound difference. Not to mention it saves lives. So, yes, technology is critical in a pediatric population. It exists and it's happening right now. There are challenges, but the goal can be met.
A 2014 study found that 45% of people who died by suicide had visited a primary care physician in the preceding month, and only 23% of people who died by suicide had not seen a primary care physician within the past year. What does that say about the importance of screening at the primary care level?
The fact that a significant percentage of individuals who die by suicide have visited a primary care physician in the month, or at least the year, before their death underscores the critical role of primary care in suicide prevention. Primary care settings have real potential to identify and intervene with individuals at risk, which makes the case for integrating effective mental health screening and support technologies into primary care practices.
In other words, we're not talking about a marginal benefit.
No, the potential benefit is huge. The United States Preventive Services Task Force did not endorse universal screening for suicide in its 2023 recommendations; they felt, and I accept that conclusion, that there wasn't enough evidence [at the time] to support it. But when you talk to suicide researchers, what you hear is that providing suicide assessments as far upstream as possible is critical, especially as more and more research shows that 20% of people who die by suicide have no psychiatric pathology at all. I believe the evidence base will soon support a recommendation for universal screening in adults. And I believe it is especially important to screen for suicidal ideation in kids, given the high rates of suicide in this population.
Medscape Family Medicine © 2024 WebMD, LLC
Any views expressed above are the author's own and do not necessarily reflect the views of WebMD or Medscape.
Cite this: AI and Suicide Prevention in Primary Care: A Q&A - Medscape - Mar 22, 2024.