AI and Mental Health: Repowering Psychology?

Posted by Peter Rudin on 3 July 2020 in Essay

Mental Health Care                  Picture Credit: ohmybot.net

Introduction

According to Wikipedia, psychology is the science of behaviour and mind. It includes the study of conscious and unconscious phenomena as well as feeling and thought. Psychologists draw on the emerging findings of brain research to understand the full variety of human behaviour linked to those findings. As a social science, psychology aims to understand individuals and groups by establishing general principles and researching specific cases.

The majority of psychologists are involved in some kind of therapeutic role, practicing counselling. Some do scientific research on a wide range of topics related to mental processes and behaviour and typically work in university psychology departments or teach in other academic institutions (e.g., medical schools, hospitals). Some are employed in industrial and organizational settings in areas such as human resources or in forensic investigation and other aspects of law.

As a discipline, psychology has long sought to fend off accusations that it is a “soft” science. Philosopher of science Thomas Kuhn’s 1962 critique implied that psychology lacked an overarching theory found in mature sciences such as chemistry and physics. Because some areas of psychology rely on research methods such as surveys and questionnaires, critics assert that psychology is not an objective science. Sceptics have suggested that personality or thinking and emotion cannot be directly measured and are often inferred from subjective self-reports.

Psychotherapy and the role of the psychologist

Psychotherapy gives patients gripped by emotional conflict a sense of control, allowing them to manage their reflexes and regain composure through conscious behavioural change. These transformational changes can be brought about by improving patients' listening skills and reflective capacities, creating awareness and making them more attentive and better able to intervene. When the visionary 20th-century psychoanalysts were developing their theories, there was little empirical research to guide their work. Freud, Jung, Adler and the other pioneers based their theories mainly on their own limited personal experience with a small number of clients.

Humanistic psychology, also referred to as humanism, became popular in the 1950s as a movement reacting against the restrictive approach of psychoanalysis. The humanistic approach sought to engage the whole person, not just fragmented parts of the personality, considering issues such as free will, personal growth, self-actualization, self-identity, death, loneliness, freedom and meaning. Abraham Maslow, who formulated a hierarchy of human needs, and Carl Rogers, who created and developed client-centered therapy, pioneered humanistic psychology. A survey of over 2,000 psychotherapists conducted in 2009 revealed that Carl Rogers was rated the most influential of all psychotherapists, even more so than Freud. Rogers's ideas about the importance of empathy and the core conditions for a therapeutic relationship have permeated the fields of counselling and psychotherapy: listen to the client, be respectful and non-judgmental, be empathic, and help clients find their own solutions.

Psychotherapy is mostly practiced in a one-to-one setting between therapist and patient. A sympathetic rapport between the parties is an emotional prerequisite for a successful therapeutic engagement; what works with the personality of one therapist might not work with another. Good psychotherapists are a rather rare species, and the time and cost required for a sustained, successful therapy are significant.

The mental health problem and the shortage of psychologists

Approximately 15.5% of the global population is affected by mental illness, and the number is rising. In the United States, one in five adults suffers from some form of mental illness. The ramifications go beyond our families and cultures, as mental health also has a tremendous economic impact through the cost of treatment as well as lost productivity. According to a report published by the World Health Organization (WHO) in October 2019, one in four people in the world will be affected by mental or neurological disorders at some point in their lives. Around 450 million people currently suffer from such conditions, placing mental disorders among the leading causes of ill-health and disability worldwide. Treatments are available, but nearly two-thirds of people with a known mental disorder, such as depression, never seek help from a health professional. Stigma, discrimination and neglect prevent care and treatment of people with mental disorders, and there is only one psychiatrist per 100,000 people in over half the countries of the world, says the WHO.

This critical shortfall of psychiatrists and other mental health specialists exacerbates the crisis. Nearly 40% of Americans live where there is a shortage of mental health professionals, and 60% of U.S. counties do not have a psychiatrist. Those who do have access to mental health professionals often forgo treatment because they cannot afford it. Despite this gloomy situation, about USD 201 billion is spent on mental health care annually, making mental health the most expensive sector of the US healthcare system after the treatment of heart-related problems. Confronted with a worsening trend, it comes as no surprise that mental health professionals are considering the application of AI to improve the efficiency and reach of mental health care.

Aligning Psychotherapy and AI 

One main concern of psychological therapy is helping the individual build self-awareness. Today the therapist's resources for this are referrals to psychological literature and relevant tools or methods. Tools implemented on computers do not contradict the fundamentals of therapeutic methods. On the contrary, they aim to support the individual's construction of self-knowledge, as well as making pertinent information available for the psychotherapist's work, improving the quality and effectiveness of therapy. An algorithm is understood as a finite set of activities which, when applied in a certain order over a range of data, performs a transformation, producing a certain result. Although such a process is necessary, it is not sufficient for the practice of psychotherapy; algorithms must be complemented with heuristics for decision making. Consequently, the processes of psychology and computation are not in conflict, but rather complementary.

The idea of replacing human interaction with AI-based communication technology was first explored more than 50 years ago with ELIZA, an early natural language processing program created at the MIT Artificial Intelligence Laboratory by Joseph Weizenbaum. Users typed requests on a keyboard and ELIZA presented replies on a display, simulating conversation with a "pattern matching" and substitution methodology that gave users an illusion of understanding. Directives on how to interact were provided by "scripts", which allowed ELIZA to process user inputs and engage in discourse following the rules and directions of the script. The most famous script, DOCTOR, simulated a psychotherapist (in particular, Carl Rogers, who was well known for simply parroting back at patients what they had just said) and used rules, dictated in the script, to respond to user inputs. As such, ELIZA was one of the first chatterbots. Weizenbaum regarded the program as a means of demonstrating the superficiality of communication between man and machine, but was surprised by the number of individuals who attributed human-like feelings to it. The discovery that empathy can be artificially introduced into man-machine interaction remains fundamental to the design of today's therapeutic AI systems. Natural language generation (NLG) and emotion sensing did not exist in ELIZA's day; in this respect it is astonishing that AI support for therapeutic counselling is still at a very early stage.
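The pattern-matching-and-substitution approach described above can be illustrated with a short sketch. The rules and pronoun reflections below are invented for illustration; they are not taken from the actual DOCTOR script, but they show the mechanism: match the input against a regex, capture the user's phrase, "reflect" its pronouns, and echo it back inside a canned template.

```python
import random
import re

# A tiny ELIZA-style "script": each rule pairs a regex pattern with
# response templates; {0} is filled with the captured user phrase.
RULES = [
    (r".*\bI feel (.+)", ["Why do you feel {0}?", "How long have you felt {0}?"]),
    (r".*\bI am (.+)", ["Why do you say you are {0}?"]),
    (r".*\bmy (.+)", ["Tell me more about your {0}."]),
]
FALLBACKS = ["Please go on.", "I see. Can you elaborate?"]

# Pronoun "reflection" so captured phrases read naturally when echoed back.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

def reflect(phrase: str) -> str:
    """Swap first/second person words and strip trailing punctuation."""
    words = [REFLECTIONS.get(w.lower(), w) for w in phrase.split()]
    return " ".join(words).rstrip(".!?")

def respond(user_input: str) -> str:
    """Return the first matching rule's response, else a fallback."""
    for pattern, templates in RULES:
        match = re.match(pattern, user_input, re.IGNORECASE)
        if match:
            template = random.choice(templates)
            return template.format(*(reflect(g) for g in match.groups()))
    return random.choice(FALLBACKS)
```

A session like `respond("I am overwhelmed")` yields "Why do you say you are overwhelmed?": the program has no model of emotion or meaning, only surface transformations, which is exactly the superficiality Weizenbaum wanted to expose.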

Practical applications of AI to assist in solving mental health problems

Therapeutic AI can free up doctors to work more with their patients by analysing patient records and generating reports based on the data, handling administrative tasks, and even assisting with diagnosis or intervention. In addition, empathy focused AI-applications can help patients to create awareness of their own emotional states. Chatting with a computer may not be the first thing that springs to mind when you think of therapy, but for some patients, interacting with a chatbot or a robotic companion powered by AI can have significant benefits. Experience shows that many people interact with avatars the way they would with another human. Additionally, virtual therapists may encourage more in-depth sharing compared to a real one-to-one therapeutic session. The following provides a partial list of companies offering AI-supported therapeutic services:

  • woebot.com: built by psychologists who worked at Stanford, the app is an automated conversational agent, or chatbot, that monitors users' moods and provides a venue in which users can express their thoughts and emotions through therapeutic conversations.
  • quartethealth.com: flags possible mental conditions and can refer patients to a provider or a computerized cognitive behavioural therapy program.
  • ginger.io: supported by proprietary technology built on the world's largest mental health data set, it brings together coaches, therapists and psychiatrists to ensure that members receive the appropriate level of care when they need it.
  • companionmx.com: enables patients being treated for depression, bipolar disorder and other conditions to create an audio log in which they can talk about how they are feeling; the AI system analyses the recordings, looking for changes that support proactive health monitoring.

None of these apps or platforms has made a significant impact on the mental health market so far, and venture-capital support for their growth is minimal. The providers of these services share the understanding that psychologists cannot be replaced by intelligent machines, yet there is agreement that the efficiency of psychotherapy can be significantly improved by internet-connected AI.

Conclusion: Where to go from here

The mental health workforce in the United States is barely keeping up with the growing need for its services. According to the Bureau of Labor Statistics (BLS), more than 577,000 mental health professionals were practicing in the U.S. in 2017. Yet among the 46.6 million adults with mental health problems, only 19.8 million (42.6%) received mental health services. Additionally, as people become more aware of the value of good mental health, demand is rising and access to mental health services is becoming increasingly difficult.

Trust in AI-based conversational therapy is an absolute prerequisite for advancing the market further. However, without adherence to ethical standards and the protection of privacy, supported by government-endorsed regulations, trust cannot be achieved. Psychotherapists need to become the drivers of this technology while continuing to offer one-to-one video counselling. Chatbots and avatars need to replicate the voice or appearance of the therapist to generate the empathy required for a sustainable patient relationship. Further improving chatbot and avatar technology, while gaining global access to qualified mental health data, is vital to solving a growing problem of huge proportions.

One Comment

  • So far we have observed what AI in human communication can do and where it leads to – see Facebook. Having tried a psychology chatbot myself I realized soon that we were just going round in circles, not leading anywhere in particular.
    AI needs a lot of improvement before it is useful in psychotherapy and not making people more angry than they are. Obviously, there is a large need for improvement of mental health anywhere, not just in the US.
