YOU KNOW THE STEREOTYPE OF TALK THERAPY: a couch, a patient lying supine, and a quiet room. But what about the future? Might we incorporate artificial intelligence (AI)-powered software into mental health management?
We now have artificial intelligence apps that analyze the human voice and text. I find the prospect of machines invading this therapeutic space both exciting and dangerous.
Let’s look at how artificial intelligence-powered software is poised to change mental health care.
Artificial intelligence and healthcare
Artificial intelligence has crashed into my field of cancer management. For example, AI helps us read mammograms.
Once we have a breast cancer diagnosis, artificial intelligence and machine learning tools help us characterize the cancer’s biology (for example, the “robot” helps determine whether cells have too much of a receptor called HER-2). In treatment, artificial intelligence tools are working their way into radiation therapy.
Artificial intelligence and mental health
Artificial intelligence and machine learning-driven analyses can help assess mental health. The timing may be good: a 2021 American Psychological Association survey of psychologists showed an increase in demand for treatment.
Clinicians reported the most significant increase in treating anxiety disorders, depressive disorders, and trauma- and stress-related disorders.
You would not be surprised to learn that there is a shortage of mental healthcare workers. A 2021 Kaiser Family Foundation study reported that not a single American state can meet its demand for mental health services.
Could artificial intelligence tools make mental health services more accessible? I recently learned from an MDLinx article that we may soon be saying, “the robot will see you now.”
MDLinx points to an AI-powered mental healthcare app, Woebot. This free app, available on iOS and Android devices, uses artificial intelligence and natural language processing to evaluate mental health. It then recommends management centered on interpersonal psychotherapy, dialectical behavior therapy, or cognitive behavioral therapy.
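Woebot’s internals are proprietary, so purely as an illustration, here is a minimal sketch of the kind of text analysis such an app might run on a user’s message. It uses an off-the-shelf sentiment scorer (NLTK’s VADER); the thresholds and routing responses are hypothetical choices I made for this example, not anything drawn from Woebot.

```python
# Illustrative only: a toy mood-screening step of the kind a chat app
# *might* run on a user's message. The thresholds and responses below
# are hypothetical, chosen just for this sketch.
# Requires: pip install nltk
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon fetch
analyzer = SentimentIntensityAnalyzer()

def screen_message(text: str) -> str:
    """Score a message's emotional valence; VADER's 'compound' runs -1 to +1."""
    compound = analyzer.polarity_scores(text)["compound"]
    if compound <= -0.5:
        return "strongly negative: offer a CBT-style reframing exercise"
    if compound < 0.05:
        return "neutral or mildly negative: ask a gentle follow-up question"
    return "positive: reinforce and continue the conversation"

print(screen_message("I can't sleep and everything feels hopeless."))
print(screen_message("Today actually went pretty well."))
```

A real system would layer far more on top of this, such as crisis-language detection, conversation context, and clinical oversight, but the sketch shows why plain text gives an algorithm something to work with.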
Does text chatting with a robot produce outcomes similar to talking with a clinician in person? The two approaches appear comparably effective at building a therapeutic alliance: aligning clinician and patient on treatment goals and forging a patient-clinician bond. Patients appear genuinely connected to their robots.
Artificial intelligence, ethics, and privacy
Artificial intelligence used in this fashion is fraught with ethical and privacy issues. For example, we have the black box problem: we often do not know what data informs the artificial intelligence analysis, and how the algorithm reaches its conclusions can be a mystery even to the technology experts who created it.
Algorithms can also be biased, for example, by associating mental illness with certain races. Moreover, privacy can be a problem. Because these apps are consumer health products, federal privacy rules (HIPAA) do not apply unless the information is sent to or received from a healthcare provider.
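To make the bias worry concrete, here is a minimal, entirely synthetic sketch (my own toy example, not drawn from any real app) of how skewed data collection can teach a model to lean on a demographic flag that should be clinically irrelevant:

```python
# Illustrative only: synthetic data showing how sampling bias leaks into
# a model's predictions. Nothing here reflects any real clinical dataset.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
symptom = rng.normal(size=n)          # a genuine clinical signal
group = rng.integers(0, 2, size=n)    # a demographic flag; should be irrelevant
label = (symptom > 0.5).astype(int)   # ground truth depends only on symptoms

# Biased sampling: most positive cases from group 0 are dropped, so the
# demographic flag ends up spuriously correlated with the diagnosis.
keep = (label == 0) | (group == 1) | (rng.random(n) < 0.3)
X = np.column_stack([symptom, group])[keep]
y = label[keep]

model = LogisticRegression().fit(X, y)
print("coefficients [symptom, group]:", model.coef_[0])
# A clearly nonzero weight on 'group' means the model absorbed the sampling
# artifact: patients with identical symptoms now score differently by group.
```

The point is not the specific numbers but the mechanism: a model faithfully learns whatever patterns its training data contains, including the patterns the data collectors introduced by accident.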
I am skeptical of consumer-grade products in the health realm, especially given these concerns about bias and privacy. Nevertheless, artificial intelligence tools will someday supplement the many excellent clinicians who help those with mental health challenges.