Is everything an interface?
How can we define the boundaries of interfaces?
It is striking to read Johan Redström’s article Design and Technology in Situated Computing 22 years after it was written, particularly the way situatedness is framed in relation to the ecosystem of objects surrounding us, and how those objects could be linked to spaces hosted online.
Redström also points to the evolution of common reactions to landline phones compared to personal mobile phones.
Both these examples would be tackled differently today, especially if we consider current research into Augmented Reality. One of the biggest challenges will be the ability to populate space seamlessly. Meta foresees the evolution of Facebook within that perspective: instead of placing comments on a virtual wall, you would place “post-its” that your friends could find when they visit that place. In a restaurant, for example, you could leave a personal note, a rating, or a comment on the place.
This article is particularly interesting to me as a way of putting into perspective the research I have been pursuing on a very similar subject. After using markerless AR to place events in the city of London in a first-year project, I would like to apply the same concept indoors.
For that, I am looking at a technology that is rather new, at least in its accessible form. Ultra-wideband (UWB) technology could significantly modify our way of being within our ecosystem of objects. UWB positioning uses a three-point system: two fixed anchors and one tag worn by the person moving through the space. In practice, this would mean a shift away from conventional sensors: when someone enters a space, lights could be turned on or off, and the same goes for screens or any other device. Within this new paradigm, where is the interface? Is it the UWB anchors? Or the person themselves? Is it the physical environment or the virtual one that is directing it?
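To make the two-anchors-plus-one-tag idea concrete, here is a minimal sketch of how a system might estimate the wearer's position from the tag's measured distances to the two anchors and trigger a device when they enter a room. The anchor coordinates, the zone bounds, and the "lights on" action are all illustrative assumptions, not part of any real UWB API; note also that two anchors alone leave a mirror ambiguity, which is why the function returns two candidate points.

```python
import math

def locate(anchor_a, anchor_b, dist_a, dist_b):
    """Intersect the two range circles around the anchors.

    Returns up to two candidate tag positions: with only two anchors,
    the position is ambiguous across the anchor baseline.
    """
    ax, ay = anchor_a
    bx, by = anchor_b
    d = math.hypot(bx - ax, by - ay)  # anchor-to-anchor distance
    # Distance along the baseline from anchor A to the chord of intersection
    a = (dist_a**2 - dist_b**2 + d**2) / (2 * d)
    h_sq = dist_a**2 - a**2
    if h_sq < 0:
        return []  # inconsistent ranges: the circles do not intersect
    h = math.sqrt(h_sq)
    # Point on the baseline closest to both candidates
    px = ax + a * (bx - ax) / d
    py = ay + a * (by - ay) / d
    # Unit vector perpendicular to the baseline
    ox, oy = -(by - ay) / d, (bx - ax) / d
    return [(px + h * ox, py + h * oy), (px - h * ox, py - h * oy)]

def in_zone(point, zone):
    """True if point (x, y) lies inside the rectangle (x0, y0, x1, y1)."""
    x, y = point
    x0, y0, x1, y1 = zone
    return x0 <= x <= x1 and y0 <= y <= y1

# Two anchors 4 m apart on the x-axis; the tag reports ~2.83 m to each,
# so the candidates sit 2 m above and below the baseline midpoint.
candidates = locate((0.0, 0.0), (4.0, 0.0), math.sqrt(8), math.sqrt(8))
living_room = (0.0, 0.0, 4.0, 3.0)  # assumed rectangular zone, in metres
if any(in_zone(p, living_room) for p in candidates):
    print("lights on")  # stand-in for driving a real lamp or screen
```

In a real deployment the ambiguity would be resolved by a third anchor or by assuming the tag stays inside the room, and the ranges would come from a UWB radio rather than hard-coded values.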