  • chahaleden

Last updated: Oct 27, 2020

As in the other texts we delved into, the author devotes considerable space to defining the terms at the centre of the research; semantics seems to be at the core of these works. This made a lot of sense for the algorithm, but it was all the more striking to discover through this week’s reading, Figuring the Human in AI and Robotics, how broad and even controversial it can be to define words that felt common and obvious. What does it mean to be “humanlike”? How do you distinguish a human from a non-human? Why is it important to know from which point of view they are being defined?

After opening up different ways of thinking, the author suggests a classification of three elements through which the experiences of machines could be addressed as “humanlike”:

  • Embodiment

  • Emotion

  • Sociality

One of the examples that aroused my curiosity, which would fall under Emotion, was the use of AI as a psychologist.


Like the robots described in the text, Cog and Kismet, which are trained to read emotions once those emotions have been thoroughly analysed and classified, Ellie was developed as part of the SimSensei project by the USC Institute for Creative Technologies. It is a robot, embodied as an identifiable (cartoon) human, that is able to read emotions through sensory mapping of the face and body. It detects patterns and reacts accordingly to build a connection with the “patient”: a smile at the right moment, a nod, some small talk, and so on.

Screenshots from a video showing how Ellie/SimSensei works and interacts with a patient


Not having access to this kind of experiment, I decided to explore this week’s research through a live session with what is presented as an accessible, widely used AI therapist: Woebot. It is an application developed by Stanford University researchers that markets itself as providing “Compassionate solutions for human problems” and puts the communication focus on “heart” and feeling like “a friend”.


Woebot's presentation on its website https://woebothealth.com/



The first session with Woebot on the Woebot App


Although I have no previous experience of human therapy, and I wasn’t actually a candidate for it, I was glad I wasn’t counting on Woebot for any health condition.


It appears that this democratized version of a therapist is far from competing with a human exchange. The empathy is overstated, and the exchange is out of the user’s hands: the user takes part in a very directed “discussion”. At best, it felt closer to a kind of motivational coach.

They did not opt for a humanlike bot, but for an almost cliché version of a robot.


  • chahaleden

Over the years, people have increasingly relied on dating apps to meet their partners. In 2017 in the United States, more couples met online than in person. What follows is a fictional trial of the algorithm used by Match Group, the parent company of Tinder, among others. It takes place in the near future, after the extinction of the last couple who met “by chance”.


Accused: Match Group’s rating algorithm. Charge: the premeditated murder of spontaneous romance.



Prosecutor: You played a key role in changing our behaviour around an essential human characteristic; “the thrill of a spontaneous encounter” slowly vanished. Did you at any moment feel guilt over your actions?


Algorithm: I did not. I was focused on efficiency. I returned results rapidly, more so than any other fellow at the time. As a matter of fact, many tried to compete with me, but I remained the fastest and lightest in my field. I kept learning and evolving, and was always appreciated by my hierarchy. I was constantly rewarded for playing a key role in the company’s success, and the leaders publicly praised me, up until the users started questioning the system. At that point, the same people who were giving me instructions started blaming me for executing them. But I wouldn’t have been able to perform if they hadn’t fed my knowledge willingly. In this whole case, I am probably the only one who isn’t guilty.


Prosecutor: Are you implying you were fed to the lions by the people who invented you?

Algorithm: I was the most convenient one to take the blame. It allowed the work to keep going, and people to ask fewer questions. In reality, I was never a sure answer, nor were my colleague algorithms, performing my job for other purposes. Each of us was told to work in different ways, but none of us, in all our possible varieties, could make a decision, or, more than anything at the time, predict possible futures. They knew from the start that I was never a sure answer; a big part of my morphology is my threshold, my margin of error.


Prosecutor: How would you define your involvement in this case?


Algorithm: I had no part in any decision. I didn’t know, or care, about the global project. I was given instructions, and I followed them. My first months were a little challenging; I was still young, and I had to absorb a lot of data: how users behave, what patterns they reproduce, and so on. At that point, the tasks I was asked to execute were not managed efficiently. But once I was fed sufficient knowledge, I only ran simple actions, repeatedly. I have been given more credit than I deserve. In your human terms, I am seen as the mastermind of a complex ecosystem, or at best as a bureaucrat, but what I really am is an efficient, committed, specialised worker. In 2017, I was able to work with over a billion inputs a day. That is a large amount of data to process; I couldn’t have done it if my operations weren’t simple.


Prosecutor: Could you describe to the court the tasks you performed?


Algorithm: My tasks have evolved over time, and I was asked to change the way I deliver results. When I first started, I was asked to use an Elo rating: if a profile had been successful with a large number of users, I would send it equally successful profiles. I am not allowed to describe the full process I perform today, but it is now based on patterns: if a user likes the same profiles as another user, I will send them similar suggestions.


Prosecutor: Would you perform any given task, no matter what the outcome?


Algorithm: Yes. I don’t make judgements. I am not responsible for what I observe and learn. I only analyse your behaviours; I don’t create them. You do, within a system that you built.


No further question.
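The two schemes the fictional algorithm describes, an Elo-style rating followed by likes-based pattern matching, can be sketched in miniature. This is purely illustrative: the function names, the k-factor, and the toy data are my assumptions, not Match Group's actual, undisclosed system.

```python
# Illustrative sketch only: minimal versions of the two matching schemes
# described in the trial, NOT Match Group's real code.

def elo_update(rating_a, rating_b, a_was_liked, k=32):
    """Classic Elo update: a profile that gets liked 'wins' the comparison,
    so successful profiles accumulate rating and get matched together."""
    expected_a = 1 / (1 + 10 ** ((rating_b - rating_a) / 400))
    score_a = 1.0 if a_was_liked else 0.0
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1 - score_a) - (1 - expected_a))
    return new_a, new_b

def similar_suggestions(user, likes):
    """Pattern-based matching: suggest profiles liked by users who share
    at least one like with `user` (a toy collaborative filter)."""
    mine = likes.get(user, set())
    suggestions = set()
    for other, theirs in likes.items():
        if other != user and mine & theirs:
            # Recommend what the similar user liked that `user` hasn't seen.
            suggestions |= theirs - mine
    return suggestions
```

For example, with `likes = {"ana": {"p1", "p2"}, "ben": {"p2", "p3"}}`, `similar_suggestions("ana", likes)` returns `{"p3"}`: Ben shares a like with Ana, so his other likes become her suggestions.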


Last updated: Oct 27, 2020


  • A kinaesthetic approach to space.


By introducing the hypothesis through the point of view of a physiotherapist, the author directs us toward a kinaesthetic approach to space, framing it within the domestic scale.

I tried to explore a similar example of “choreography” at the scale of the city.



Citymapper and the expected city.


Like a stage director, our mobile phone tells us how and when to move, by granting us access to increasingly accurate predictions. This guides our speed, our trajectories, the position we stand in on a platform. Through an object fitting inside our hand, we are given the ability to grasp infrastructures at the scale of a large metropolis.

This can be considered a major improvement, granting optimal comfort, better time management and fluidity, while avoiding stressful situations and bad surprises.

Citymapper, How to get to...


  • What could we be losing in the process for a less troubled life?

The name of the collective the author is part of and presenting here, Constant, made me associate it with the eponymous Dutch artist and the work of his movement: the Situationist International.

Building the field they named psychogeography, they set up a series of experiments exploring the way a geographical environment dialogues with and affects individual behaviours. One of their main works revolves around “The Theory of the Drift” (La Théorie de la Dérive, 1956), a way of negotiating with urban space through unplanned journeys in which they would “let themselves be drawn by the attractions of the terrain and the encounters they find there”.

The Naked City; Guy Debord and the Situationists, 1958.



The drift theory is meant as a playful experience, placing encounter and chance at the centre of an improvised choreography.

Leaving aside any preconceived ideas about or constraints from the city, the participants would start drifting guided solely by the geography of the urban landscape and their emotions and feelings.


Transport apps are likely the exact opposite of this radical approach.


Could we be witnessing the death of randomness, surprise and coincidence?


We won’t be learning by practice, by our repeated movements, but by knowledge and predictions dictated by an application carried by each individual. Our bodies won’t learn by progressively getting accustomed to a path or a recurring situation, by getting into a habit, or by choosing a non-linear, non-optimised trajectory within a city.

Our senses might become less alert to the sound of a train approaching, to the vibration of the ground announcing its arrival. We won’t be unfolding maps to plan a journey and locate it within a territory, nor getting lost and discovering a beautiful road we would never have noticed.


This echoes both parts of the text “A fish can’t judge the water”: the first, which depicts the disappearance of materiality and the importance of being aware and critical when this vanishing occurs; and the second, about the political importance of Open Source, which, while not allowing seamless experiences, offers the freedom and possibility to be an enlightened actor in our actions and creations, where mistakes, bugs and imperfect practices benefit creation and thinking.



The ideas developed in the text could also be read through the physical objects that have always defined our surroundings and the way we “negotiate” with them.

We would consider a city well designed if its infrastructures disappear to allow for a smooth experience of the city. Every design, at every scale, has always involved a part of knowledge that escapes the users; we are not all civil engineers, architects or planners. Design decisions have also always been very political and rigid, influencing the way we live and move around our homes, our buildings and our cities, and those decisions are imposed on the users of the space. Users have to trust that the ones making them are knowledgeable enough and doing their job at their best (“best” would need defining, at a time when the social impact of construction has been pushed aside to the benefit of economic interests). Maybe the changes occurring in the technologies that are filling our “milieus” could be a great opportunity to become more enlightened users, and practitioners.

The text sounds as much like a warning as an invitation to shift our attention toward the importance and impact of reflecting upon and creating our tools. At a time of exponentially fast change, this is a political statement, as every creation is. Training our critical sense, taking a step back, and reflecting on things that are designed for us to forget about them seems more important than ever.


If the fish had the same possibilities, who would dare to judge the water in front of it?

