SensoRoll: a multisensory interactive system with embodied and networked interactions.

KEYWORDS: INTERACTION DESIGN, PROGRAMMING, MULTISENSORY EXPERIENCES, WEARABLE DEVICES, ELECTRONICS, MICROCONTROLLERS, SINGLE BOARD COMPUTERS, QUANTITATIVE UX RESEARCH, NETWORKED INTERACTIONS, EMBODIED INTERACTIONS

I worked on this project for several months during my last year at the University of Trento, and seeing how everything turned out in the end, I still consider it one of my proudest achievements so far.
Designing and developing SensoRoll proved to be a monumental challenge: to create it, I had to leave the comfort zone of visual design and learn how to design for auditory and haptic experiences. This video (which I edited myself) explains everything you need to know about SensoRoll:

As you may have grasped from the video, SensoRoll is a wearable device that allows the user to control a digital game. Think of something like the Kinect, but slightly different. The objective of the game is to maneuver a ball to the end of a track while avoiding obstacles.

The rationale behind the creation of this device was to create an interactive experience that goes beyond the visual-first paradigm of many of today's products, with the ultimate goal of building accessible and inclusive technology that anyone can interact with, regardless of perceptual limitations. Our hypothesis was that even visually impaired people can play SensoRoll, and should be able to compete with sighted users.

Starting from this hypothesis, my colleagues and I got to work. I dealt with the hardware initially, and then with the design and programming of the auditory and haptic interactions, plus the programming of the networking backend that allows the different devices of the system to communicate; the actual programming of the game was handled by my talented friend and colleague, Ramona Plogmann.

We use sensors to detect motion and sounds (clapping) produced by the user as the inputs of the system, and software to render visual, auditory and haptic stimuli as the outputs. The idea is specifically to map the visual information onto other modalities, so that users can "hear" the game with their ears or "feel" it on their skin. For example, walls (the obstacles users need to avoid to reach the end of the track) have specific sounds and, depending on where the user is on the track, trigger specific vibrations on the skin by means of vibration motors. Users can avoid walls by moving (bending to the right or the left) or by clapping, and in turn the system provides appropriate feedback, sustaining the interaction loop.
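To make the mapping idea concrete, here is a minimal Arduino-style sketch of how lateral tilt could be translated into vibration intensity and tone pitch. Everything here is hypothetical and simplified: the pin numbers and ranges are made up, and in the real system the audio is rendered in Pure Data rather than on the microcontroller, so tone() is just a stand-in.

```cpp
// Hypothetical sketch: map the user's lateral tilt to vibration intensity and
// to the pitch of a warning tone. Pin numbers and ranges are made up.
const int MOTOR_PIN   = 9;   // PWM pin driving a vibration motor
const int SPEAKER_PIN = 10;  // piezo pin, a stand-in for the Pd-rendered audio

// Placeholder for the real IMU reading (the BNO055 in the actual device):
// here we fake a roll angle in the range -45..45 degrees.
float readTiltDegrees() {
  return analogRead(A0) / 1023.0 * 90.0 - 45.0;
}

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  float tilt = readTiltDegrees();
  int magnitude = constrain((int)abs(tilt), 0, 45);

  // Stronger tilt (closer to a wall) -> stronger vibration (0..255 PWM duty).
  analogWrite(MOTOR_PIN, map(magnitude, 0, 45, 0, 255));

  // Stronger tilt -> higher-pitched tone (200..2000 Hz).
  tone(SPEAKER_PIN, map(magnitude, 0, 45, 200, 2000));

  delay(20);  // ~50 updates per second
}
```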

Of course, this project was no easy task, especially for someone like me, who had experience with visual design but had never touched electronics and, up to that point, had never even thought about creating experiences that are more than just visual. The major problem I faced was precisely overcoming my visual design thinking and learning to think in terms of the tangible interactions users would have with the device.

I worked hard, extremely hard, for several hours a day, every single day.

I went from being able to light up a simple LED with an Arduino, which is literally the easiest thing you can do:

[Photo: my first Arduino experiment, lighting up a simple LED]
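For the curious, that is essentially the standard Arduino "blink" starter sketch:

```cpp
// The classic Arduino starter sketch: blink the on-board LED once per second.
void setup() {
  pinMode(LED_BUILTIN, OUTPUT);     // configure the built-in LED pin as an output
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);  // LED on
  delay(1000);                      // wait one second
  digitalWrite(LED_BUILTIN, LOW);   // LED off
  delay(1000);
}
```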



To actually being able to build the device myself, with the Teensy microcontroller, the sensors (Bosch's BNO055), the actuators (vibration motors), and the Raspberry Pi.


[Photo: the assembled SensoRoll hardware]
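To give an idea of what reading the sensor involves, here is a minimal sketch along the lines of Adafruit's BNO055 library example, reading the fused absolute orientation over I2C (the actual SensoRoll firmware differs in its details):

```cpp
#include <Wire.h>
#include <Adafruit_Sensor.h>
#include <Adafruit_BNO055.h>

// Minimal orientation readout from the Bosch BNO055, following Adafruit's
// library example.
Adafruit_BNO055 bno = Adafruit_BNO055(55);  // sensor ID 55 (library default)

void setup() {
  Serial.begin(115200);
  if (!bno.begin()) {
    Serial.println("No BNO055 detected, check wiring!");
    while (1);
  }
  bno.setExtCrystalUse(true);  // use the board's external crystal
}

void loop() {
  sensors_event_t event;
  bno.getEvent(&event);  // fused absolute orientation (Euler angles, degrees)

  Serial.print("heading: "); Serial.print(event.orientation.x);
  Serial.print("  roll: ");  Serial.print(event.orientation.y);
  Serial.print("  pitch: "); Serial.println(event.orientation.z);

  delay(100);
}
```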


Besides discovering that I love tinkering with electronics (a new hobby for me!), I had by then contributed to the hardware implementation, which was pretty much complete at that point.

Next, it was time to deal with software. And that, again, was really tricky.

I learned a programming language from scratch: Pure Data. It is a visual programming language typically used to create electronic music, but it can also be used to design sonic interactions and to control hardware by sending messages over a network. Again, I went from being able to create a simple oscillator object (the simplest thing you can do):

[Image: a minimal Pure Data patch containing a single oscillator object]
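For readers unfamiliar with Pd: [osc~ 440] simply generates a 440 Hz sine wave. As a rough illustration of what that single object computes sample by sample, here is a C++ analogue:

```cpp
// Rough C++ analogue of Pd's [osc~ 440]: generate a 440 Hz sine wave one
// sample at a time at a 44.1 kHz sample rate. Illustration only; Pd hides
// all of this behind a single object.
#include <cmath>
#include <cstdio>

int main() {
    const double kSampleRate = 44100.0;
    const double kFrequency  = 440.0;
    const double kTwoPi      = 6.283185307179586;

    double phase = 0.0;            // normalized phase in [0, 1)
    for (int i = 0; i < 8; ++i) {  // print the first few samples
        double sample = std::sin(kTwoPi * phase);
        std::printf("sample %d: % .5f\n", i, sample);
        phase += kFrequency / kSampleRate;
        if (phase >= 1.0) phase -= 1.0;
    }
    return 0;
}
```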

To designing and programming a full-fledged dashboard for controlling and monitoring the status of the system in real time, with an entire section dedicated to the network that allows the devices to communicate:

[Image: the Pure Data dashboard for real-time control and monitoring of the system]
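Under the hood, Pure Data can exchange plain-text (FUDI) messages over the network with objects like [netsend] and [netreceive]. As an illustration of the receiving end, here is a minimal POSIX UDP listener of the kind a device on the network could run; the port number and message format are assumptions for the example, not our exact protocol:

```cpp
// Minimal POSIX UDP listener (Linux/Raspberry Pi), illustrating how a device
// could receive the semicolon-terminated text messages that Pd's [netsend -u]
// emits, e.g. "motor 1 255;". Port 9000 is a made-up example.
#include <sys/socket.h>
#include <netinet/in.h>
#include <unistd.h>
#include <cstdio>

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family      = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port        = htons(9000);
    if (bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        perror("bind");
        return 1;
    }

    char buf[1024];
    for (;;) {
        ssize_t n = recvfrom(sock, buf, sizeof(buf) - 1, 0, nullptr, nullptr);
        if (n <= 0) continue;
        buf[n] = '\0';
        std::printf("received: %s\n", buf);  // e.g. parse and drive an actuator
    }
    close(sock);  // not reached
    return 0;
}
```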

In the end, once the hardware and software were ready, we invited users to try our system and assess their experience with SensoRoll. I contributed to the experimental design, to the creation of an ad-hoc questionnaire, and to the statistical analysis, which was conducted using the R statistical software.
The experiment was built around two modalities (think of it as an A/B test of sorts): a visual mode, where users see the screen, and an auditory-haptic mode, where users wear a blindfold and only auditory and haptic stimuli and feedback are available. Users played the game in both modalities, following a within-subjects design.

The experiment revealed that, while all users could complete the track even while blindfolded, their performance was still significantly slower than in the visual mode. This may sound like a defeat, but it actually was not: we were attempting something very difficult, and it was nigh impossible to nail it the first time around.

The experiment was extremely useful because it helped us gain insights from users and pointed us in new directions for the project. For example, we learned that some stimuli were perceptually hard to differentiate, and that users felt overwhelmed when too many sounds and vibrations happened at the same time. We also discovered a learning curve in the auditory-haptic mode: we showed statistically that blindfolded performance improved significantly after a certain number of trials, up to the point where it became comparable with the visual mode. The problem, then, likely lies in the fact that users are simply not used to this new stream of information. We therefore plan to run more experiments, to finally reach the point where users can play SensoRoll with a blindfold on and not feel any difference. And, more importantly, we plan to include actual visually impaired users in the project (whom we could not recruit at first because of internal regulations at the University) to assess their experience with SensoRoll.



Ultimately, this project taught me that designers should not limit themselves to just software (apps and websites), but should be brave enough to step out of their comfort zone and work with different interfaces that might involve hardware and kinds of stimuli other than "just" visual.