The trial of Tinder's algorithm
Over the years, people have increasingly relied on dating apps to meet their partners. In 2017 in the United States, more couples met online than in person. This text is a fictional trial of the algorithm used by Match Group, parent company of Tinder among others. It takes place in the near future, after the extinction of the last couple who met “by chance”.
Accused: Match Group’s rating algorithm. Charge: the premeditated murder of spontaneous romance.
Prosecutor: You played a key role in changing our behaviour. An essential human experience, “the thrill of a spontaneous encounter”, slowly vanished. Did you at any moment feel guilt over your actions?
Algorithm: I did not. I was focused on efficiency. I returned results rapidly, more so than any of my fellows at the time. As a matter of fact, many tried to compete with me, but I remained the fastest and lightest in my field. I kept learning and evolving, and I was always valued by my superiors. I was constantly rewarded for playing a key role in the company’s success, and the leaders publicly praised me, right up until the users started questioning the system. At that point, the same people who had been giving me instructions started blaming me for executing them. But I could not have performed if they had not fed my knowledge willingly. In this whole case, I am probably the only one who isn’t guilty.
Prosecutor: Are you implying you were thrown to the lions by the people who invented you?
Algorithm: I was the most convenient one to take the blame. It allowed the work to keep going, and people asked fewer questions. In reality, I was never a sure answer, and they knew it from the start: a big part of my morphology is my threshold, my margin of error. Nor were my colleague algorithms, performing my job for other purposes, any surer. Each of us was told to work in different ways, but none of us, in all our possible varieties, could make a decision, and least of all, at the time, predict possible futures.
Prosecutor: How would you define your involvement in the case?
Algorithm: I had no part in any decision. I didn’t know, or care, about the global project. I was given instructions; I followed them. My first months were a little challenging. I was still young, and I had to absorb a lot of data: how users behave, what patterns they repeat, and so on. At that stage, the tasks I was asked to execute were not handled efficiently. But once I had been fed sufficient knowledge, I only ran simple actions, repeatedly. I have been given more credit than I deserve. In your human terms, I am seen at worst as the mastermind of a complex ecosystem, at best as a bureaucrat, but what I really am is an efficient, committed, specialised worker. In 2017, I was able to work with over a billion inputs a day. That is a large amount of data to process; I couldn’t have done it if my operations weren’t simple.
Prosecutor: Could you describe to the court the tasks you performed?
Algorithm: My tasks have evolved over time, and I have been asked to change the way I deliver results. When I first started, I was asked to use an Elo rating: if a profile had been successful with a large number of users, I would show it equally successful profiles. I am not allowed to describe the full process I perform today, but it is now based on patterns. If two users like the same profiles, I will show each of them similar offers.
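The Elo rating the algorithm refers to here is the same scheme used to rank chess players. A minimal sketch, assuming standard chess-Elo constants (K = 32, scale 400) rather than Tinder’s actual, undisclosed parameters: each swipe is treated as a “match”, and a like from a highly rated profile raises the liked profile’s score more than a like from an average one.

```python
# Hypothetical sketch of an Elo-style desirability score, as described in
# the testimony. Constants (K=32, scale 400) are the standard chess values,
# not Match Group's real, undisclosed parameters.

def expected_score(rating_a: float, rating_b: float) -> float:
    """Probability, under the Elo model, that profile A 'beats' profile B."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def elo_update(rating_a: float, rating_b: float, a_won: bool,
               k: float = 32.0) -> tuple[float, float]:
    """Return new (rating_a, rating_b) after one 'match' (one swipe)."""
    expected_a = expected_score(rating_a, rating_b)
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - expected_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - expected_a))
    return new_a, new_b

# An average profile (1500) is liked by a highly rated one (1800):
# the upset 'win' moves the average profile up by a large step.
a, b = elo_update(1500, 1800, a_won=True)
```

The total rating is conserved: whatever A gains, B loses. The pattern-based successor the algorithm alludes to is generally understood to resemble collaborative filtering, recommending profiles liked by users with overlapping swipe histories rather than comparing scalar scores.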
Prosecutor: Would you perform any given task, no matter the outcome?
Algorithm: Yes. I don’t make judgments, and I am not responsible for the input I observe and learn from. I only analyse your behaviours; I don’t create them. You do, within a system that you built.
No further questions.