Artificial Intelligence and Psychiatry


By Marvin Ross

Not sure about any of you, but I’m not interested in discussing my mental state with a computer even though, in 2021, companies that focus on mental health managed to get five billion dollars in funding for their artificial intelligence (AI) ventures. That, according to the New Yorker magazine, is more than double the funding for any other medical issue. It’s a bandwagon reminiscent of the money that poured into cannabis in Canada when it was being legalized. A report in December 2022 pointed out that medical pot has been a bloodbath for investors.

In addition to the problems of psychiatric AI I’ll mention below, two things bother me about it. The first is an apocryphal story from the days of mainframe computers. Programmers decided to try to translate from English to Russian and fed into a computer “the spirit is willing but the flesh is weak.” What came out in Russian was “the wine is good but the meat is off.” OK, these days Google Translate is pretty accurate, but virtual chats are still a pain.

I recently used an online virtual assistant when a package I ordered was marked as delivered but never arrived. I live in an apartment building, and the post office leaves larger packages in secure boxes when they won’t fit into the usual mail slot. A key for the secure box is left in the mail slot. I got the key, but the wrong package was in the secure box. Clearly, what happened was that the postal worker left me the wrong key. An easy mistake to make and to correct.

The online virtual assistant could not comprehend this and kept telling me the package had been delivered, so I gave up and tried chat. Usually, chat means there is a human on the other end, but it was still my friend the virtual assistant. I called and got a voice-activated virtual assistant who also could not understand and kept telling me, in effect, “you got your package so go away.” I screamed “human” into the phone, and the virtual assistant told me it would be a very long wait. I tried for a few more minutes, screamed “human” again, and was transferred.

There was no long wait. The human understood what had happened, and the next day the postal worker called to tell me that she had put the correct key into my mail slot.

Now imagine dealing with that scenario when the psychiatrist is software. And there are other problems, as the New Yorker article pointed out.

One attempt was a computer model to predict suicide, based on analyzing suicide notes to determine what the researchers called “the language of suicide.” This was combined with audio recordings of patients to try to identify the sounds people make when suicidal. The result was a system that could classify people as suicidal, mentally ill but not suicidal, or neither. The AI model agreed with humans about 85% of the time.

But there were times when the model and the human did not agree. The machine did not know what the clinician knew: whether there was a history of self-harm, whether the patient had a supportive family, and whether there were any suicidal plans. That is crucial knowledge a machine cannot have. The Veterans Administration in the US uses an AI program called Reach Vet to screen people. It has managed to reduce psychiatric admissions by 8% and documented suicide attempts by 5%, but it has not reduced suicide mortality.

Software systems based on large language models, in which a computer becomes familiar with billions of words, can assemble sentences. They can answer questions, write code and generate stories, but they have no idea what anything they are saying means. They are like autocorrect, with no comprehension of context. The New Yorker cites one such system, developed by Facebook’s parent company, that once stated that Elon Musk had been killed in a car accident. Another system called Replika, described as an AI companion that cares, once made aggressive sexual advances towards a user.

Computers can help us, but so far these computerized psychiatric tools cannot substitute for a human shrink.


