chahaleden

Woebot: My session with an AI "therapist"

Last updated: Oct 27, 2020

As in the other texts we dived into, the author devotes considerable space to defining the terms at the center of the research; semantics seems to be at the core of the work. This made a lot of sense for the algorithm, but it was all the more striking to discover through this week's reading, Figuring the Human in AI and Robotics, how broad and even controversial it can be to define words that felt common and obvious. What does it mean to be "humanlike"? How do you distinguish a human from a non-human? Why is it important to know from which point of view they are being defined?

After opening up these different ways of thinking, the author suggests a classification of three elements that could lead us to address the experiences of machines as "humanlike":

  • Embodiment

  • Emotion

  • Sociality

One of the examples that aroused my curiosity, falling under the Emotion category, was the use of AI as a psychologist.

Like the robots described in the text, Cog and Kismet, which are trained to read emotions after those emotions have been thoroughly analysed and classified, Ellie was developed as part of the SimSensei project by the USC Institute for Creative Technologies. It is an agent, embodied as an identifiable (cartoon) human, that is able to read emotions through sensory mapping of the face and body. It detects patterns and reacts accordingly, for example with a smile at the right moment, to make a connection with the "patient": nodding, engaging in small talk, and so on.

Screenshots from a video showing the way Ellie/SimSensei works and interacts with a patient

Not having access to this kind of experiment, I decided to explore this week's research through a live session with what is presented as an accessible, widely used AI therapist: Woebot. It's an application developed by Stanford University researchers that markets itself as providing "Compassionate solutions for human problems" and puts the communication focus on "heart" and feeling like "a friend".

Woebot's presentation on its website

The first session with Woebot on the Woebot App

Although I have no previous experience of human therapy, and I wasn't actually seeking it, I was glad I wasn't relying on the app for any health condition.

It appears that this democratized version of a therapist is far from competing with a human exchange. The empathy is overstated, and the exchange is out of the user's hands: the user takes part in a highly directed "discussion". At best, it felt closer to a kind of motivational coach.

The designers did not opt for a humanlike bot, but for an almost cliché version of a robot.

