MediaLAB Amsterdam is now the Digital Society School! You are viewing an archive of MediaLAB projects. Content on this website may be outdated, and is without guarantee.

Sense VR

Exposure Therapy with Virtual Reality

Team

Commissioner:

Description

The Virtuous Cycle of Immersion

“The goal of VR storytelling is to tell a story that will stimulate emotions that will influence action” – Shin (2017)

In 1980 Marvin Minsky used the term “telepresence” to describe the feeling a human operator may have while interacting with teleoperation systems. Later, telepresence was translated into “immersion”, which generally describes a deep mental involvement in something. During the second sprint we studied immersion in the VR environment, focusing on the process from the user’s perspective. By putting the individual at the centre of studies on immersion, embodied cognition and the theory of flow, we were able to find some new points of view on our topic and validate parts of our concept.

Janina’s physical mind map of “The virtuous cycle of immersion” (a.k.a. the never-ending story).

Not surprisingly, the first important notion is again individuality. The personality traits of the user (the so-called “Big Five”) need to be taken into account before the user’s level of immersion can be analyzed. As Weibel et al. (2010) state, previous studies only measured the level of immersion in terms of user satisfaction and positive feedback, leaving open the question of how personality traits influence the overall experience of immersion. More recent work shows that certain individual factors, such as openness to experience and high imagery ability, have a positive effect on presence – and, in turn, on immersion (Weibel et al. 2010). During our concept sketching we faced this problem too, and found ourselves asking what motivates our target group. What makes 15 to 24 year old women get out of bed every morning?

 

Figure 1. A screenshot of Janina’s mind map (which is as fluid as the concept of immersion itself). With the help of this map, we can see that, in addition to “openness to experience”, the emotional involvement of the user might also be relevant for the level of immersion. And as you can see, the design perspective states that in order “to facilitate immersion and to enhance sensations of presence, prior assessment of the users’ personality might effectively help tailoring the most suitable environment for each user” (Weibel et al. 2010).

In addition to the personality traits of each user, the notion of body and the sense of embodiment are important for defining the level of immersion. The VR environment not only makes users feel different kinds of emotions but also changes who they are in the virtual space (Shin 2017). A fully immersive VR environment offers a sense of embodiment, which can be defined as a state in which users feel themselves to be part of the VR environment. At the same time, users can also feel that VR components are parts of their own bodies (see Bailey et al. 2016). In fact, users often create a virtual body of themselves inside the immersive VR environment as “an analog of his or her own biological body” (Kilteni et al. 2012).

“In other words, the embodied cognition in VR allows users to feel a sense of embodiment.” (Shin & Biocca 2017)

As Shin and Biocca (2017) conclude, this sense of embodiment comes from our embodied cognition, which we process by being physically “present” somewhere. Marvin Minsky (2006) would probably count presence among his suitcase-like words, which have no clear definition but which can help us understand our brain processes. However, we will define presence for now as a state of mind and immersion as an experience over time (Shin 2017). In fact, presence is not unique to VR, as watching a movie or simply reading a text can also induce a feeling of presence (Baus and Bouchard 2014). Presence and immersion together, however, are both strongly related to our ability of perspective taking and perception (Rudrauf et al. 2017).

If immersion in the VR environment is defined as “an experience over time” and not an instant, temporary connection (which we would probably call presence), it means that the user’s cognition is highly involved in this process. The VR environment, as well as the stories told in VR, is constantly reprocessed through the user’s sense-making. As Shin (2017) argues, “users actively create their own VR, based on their understanding of the story, their empathetic traits, and the nature of the medium.” Therefore, the role of VR developers is only to propose an immersion; the processing of the immersion is left to the user. It is important to understand both presence and immersion as fluid states that are reprocessed and redefined by the user (Shin 2017).

Some other necessary elements for immersion and the engagement of the user are the ability of perspective taking, the concept of flow, and empathy. Of course, the ability of perspective taking involves the 360-degree nature of the VR environment. Unlike in other media, in virtual reality the user is not limited to 2D perspective taking and perception, but is allowed all kinds of projective transformations (see the mathematical model of embodied consciousness by Rudrauf et al. 2017). This can also mean that in some scenarios less is left to the imagination, as the user is given the ability to observe the VR environment from every angle before making predictions. In comparison, the TV screen as a medium constantly and unconsciously forces us to take into account the invisible cues framed out of the screen – and to fill them in with our own imagination.

The concept of flow is said to be a determining feature for the ability to empathize with and embody VR stories (Shin 2017). First of all, it can be argued that immersive technology and VR produce empathy only because of the sense of embodiment – they place the user at the centre of the experience. However, Shin (2017) also differentiates presence and flow from each other, stating that “Presence can be immersion into a virtual space, whereas flow can be an experience of immersion into a certain user action.” Flow, in this context, always involves action.

Figure 2. The Flow Model of Mihaly Csikszentmihalyi (1997). As we can see, the arousal level sits between anxiety/fear and the state of flow. This figure partly follows Russell’s circumplex model of affect (1980).

The common view is that breaks in presence (BIPs) need to be avoided in order to maintain the user’s immersion in the VR environment. Slater and Steed (2000) were the first to define the concept of a BIP, describing it as any event whereby the real world becomes apparent to the participant. However, this feature could be studied further in relation to our topic of phobias and their treatment. Would it be possible to play with breaks in presence in order to help people conquer their fears? In fact, some exposure therapists argue that for some phobias distraction might hinder the process, whereas for others it might actually work as a facilitating feature (McNally 2007).

We also took a look at calm technology and how it could be relevant for the level of immersion. Of course, the immersive experience needs to be mediated as transparently as possible, without the technological interference becoming an obstacle for the user. For people with a low immersive tendency, all the technological aspects probably need to be designed more carefully, but as previous studies show, the technological aspects by themselves are not the main ingredients of immersion (Weibel et al. 2010; Shin 2017).

One popular point of view on the definition of immersion is the coherency of the environment. This means that the VR environment doesn’t necessarily need to follow the common laws of physics, as long as it follows some kind of coherent laws and rules. Implemented proficiently in virtual reality, this feature can turn out to be beneficial in some cases, and it is something we will try to keep in mind.

As Shin (2017) concludes:

“Future research should see immersion as a cognitive dimension alongside consciousness, awareness, understanding, empathizing, embodying, and contextualizing, which helps users understand the content and stories delivered.”

Understanding the concept of immersion is important for us as we are working with different kinds of phobias and fears, focusing on their treatment with exposure therapy in VR. As the main challenge is to change the fear memory of the user, the VR environment needs to be immersive and realistic enough for the behavioral change to carry over in vivo as well – in other words, to have a long-term impact.

Next, by taking a deeper look into the different levels of our consciousness (i.e. trying to make sense of Minsky’s theories) and searching for all kinds of relative “black holes” in our emotional processing, we hope to find new insights for emotion recognition during the coming sprints.

Figure 3. One proposed model of immersion with the levels of user experience and the quality of the experience (Shin 2017). We find this model too simple to capture all aspects of immersion, but it clearly positions the sense of embodiment and empathy as guarantors of the quality of user experience in VR environments. Baus and Bouchard (2014) also name three aspects from which to measure the quality of the user’s experience in VR: the feeling of presence, the level of realism and the degree of reality.

 

Sources:

Bailey J., Bailenson J., and Casasanto D. (2016). “When does virtual embodiment change our minds?” Presence. Vol 25(3): 222-233.

Baus O. and Bouchard S. (2014). “Moving from Virtual Reality Exposure-Based Therapy to Augmented Reality Exposure-Based Therapy: A Review” Frontiers in Human Neuroscience. Vol 8: 112.

Kilteni K., Groten R. & Slater M. (2012). “The sense of embodiment in virtual reality”. Presence 21(4): 373–387.

McNally R. (2007). “Mechanisms of exposure therapy: How neuroscience can improve psychological treatments for anxiety disorders” Clinical Psychology Review. Vol 27(6): 750-759.

Minsky, M. (2006). The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind. Simon & Schuster, NY.

Rudrauf D., Bennequin D., Granic I., Landini G., Friston K., and Williford K. (2017). “A mathematical model of embodied consciousness” Journal of Theoretical Biology. Vol 428: 106-131.

Shin, D. (2017). “Empathy and embodied experience in virtual reality environment: To what extent can virtual reality stimulate empathy and embodied experience?”. Computers in Human Behavior, January 2018, Vol 78: 64-73. Available online:

http://www.sciencedirect.com/science/article/pii/S0747563217305381

Shin, D. & Biocca F. (2017). “Exploring immersive experience in journalism”. New Media & Society. September 30, 2017

Slater M. and Steed A. (2000). “A virtual presence counter” Presence: Teleoperators and Virtual Environments. Vol 9(5):413–434

Weibel D., Wissmath B. & Mast F. (2010). “Immersion in Mediated Environments: The Role of Personality Traits” Cyberpsychology, Behavior, and Social Networking. Vol 13(3): 251-256.

Research trip to Twente University


Janina and Agnetha excited for the symposium 

During our first sprint we took part in a one-day symposium titled “Wearables in Practice: Physiology and Emotions 4TU”, held at the University of Twente in Enschede. We got a lot of new insights from experts working in the field of sensor technology and individual health.


Joyce Westerink’s self-tracking with a skin conductivity sensor, combined with taking notes throughout her day

Joyce Westerink, a former physicist, has been focusing on wearable well-being in her research. She confirmed that skin conductivity is a reliable measure of arousal levels, although the delay can vary from 6 to 10 seconds. In her study of stress levels in teachers, she also visualized the data so that the teachers could interpret their stress levels by themselves. Westerink gave a brief insight into the psychophysiology of communication and the intimacy of heartbeats. In her experiment – placing two individuals on opposite sides of a room and asking them to walk towards each other – she found that hearing a personal heartbeat made the subjects stop approaching each other sooner. As she suggests, hearing a personal heartbeat creates intimacy between the subjects. She strongly believes that this feature could also be used to improve our relationships with each other. Westerink argued that heart rate as a connection is more readily accepted by users than heart rate as information. For our next sprints we will keep in mind the possibility of offering the subject a visualization of his/her emotional data.
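Westerink’s 6-to-10-second latency matters as soon as you try to pair skin conductivity data with the events that caused it. A minimal sketch of one way to handle it (all names and thresholds here are our own illustration, not code from the symposium): look for the response in a window shifted forward by the assumed sensor delay.

```python
# Align skin-conductance samples with the event that (presumably) caused them.
# Assumes the physiological response appears 6-10 s after the event, per the
# latency Westerink reported. Function and variable names are illustrative.

def response_window(samples, event_time, min_delay=6.0, max_delay=10.0):
    """Return samples inside the expected response window after an event.

    samples: list of (timestamp_seconds, conductance_microsiemens) tuples.
    """
    return [(t, v) for (t, v) in samples
            if event_time + min_delay <= t <= event_time + max_delay]

def peak_response(samples, event_time):
    """Peak conductance inside the window, or None if no samples landed there."""
    window = response_window(samples, event_time)
    return max((v for _, v in window), default=None)

# Example: one sample per second, with a bump ~8 s after an event at t=100.
samples = [(t, 2.0) for t in range(100, 107)] + \
          [(107, 2.9), (108, 3.4), (109, 2.8), (110, 2.1)]
print(peak_response(samples, event_time=100.0))  # 3.4
```

A naive lookup at the event timestamp itself would read the flat baseline and miss the response entirely; the shifted window catches it.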


One study looking at how to track emotions in people with Borderline Personality Disorder, using a self-evaluation app in combination with heart-rate-tracking wristbands

The emphasis of the symposium was on the individual. Almost every speaker highlighted the role of wearables in self-regulation and self-reflection, as well as the individual interpretation of one’s own physiological data. For example, Matthijs Noordzij stated that smartwatch technology can be used as a means of warning, informing and/or educating people. However, in order to integrate smartwatch technology into our everyday lives, it is necessary to design predictive models with the help of sensors. As Noordzij highlighted, we should invent a new approach to dealing with the data by focusing more on design research.


One pilot study on wearable sensor tracking of vital signs.

During the symposium, we also participated in a workshop to test the effects of “surprise” and “excitement”. One participant in each team wore a wristband that tracked their heartbeat, with the data sent to a companion app on their smartphone. All data from the day was stored there, and when the wristband detected a rising pulse it made a sound, prompting the wearer to open the app and add notes about what had happened at that moment. The participants then had to sit on a balloon: the excitement of sitting on it, and the anticipation of whether or not it would explode, raised their heart rates. Some balloons did indeed explode, and we could clearly see the changes in pulse in the app. There was some delay, however, between the actual “shock” and the moment the sensor detected it and gave feedback to the user, making it somewhat uncertain what actually triggered the emotion. We were asked to give our input on the product and our thoughts on how to improve it. This use of technology to track emotions is definitely interesting for our project, and in our next sprint we will try to push forward and test some of these ideas.


Workshop session with heartbeat sensor, smartphones and ballooons



The team keeping a happy mood after a long day in Enschede 🙂 

The symposium gave us a lot of insights into current findings in the field of wearable technologies and how we can implement such technologies in our own final products. We talked to quite a few researchers about their experience and knowledge in the field, and we were highly impressed with their research so far. With all this in mind we are excited to start the next sprint with new knowledge and broader insights. We got feedback on our ideas, and people seemed to agree that solving the problem we are tackling could be very useful in many different fields. We hope we can come up with a great solution!

AN INTRODUCTION TO OUR CHALLENGE…

Four people from around the globe were brought to the Dutch capital, Amsterdam, and put together as a team at the Amsterdam MediaLAB to face the challenge from Triple: Yujie Shan (CN), Janina Saarnio (FI), Christiaan van Leeuwen (NL) and Agnetha Mortensen (NO), with guidance from their coach Tamara Pinos from Ecuador. All with different backgrounds and skills, in the prospect of learning from each other and growing together as a team!


GOAL (CHALLENGE):

“The development (and open-source release) of software that can determine the user’s emotional state based on sensory data (e.g., head motion, pupil dilation, heart rate variability) that is produced while she or he interacts with a virtual reality (VR) environment. This software can serve as a ‘middle layer’ between app and user, allowing the VR environment to [adapt to] the user’s present emotional state (i.e., valence and arousal levels).”
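To make the “middle layer” idea concrete, here is a deliberately simplified sketch of what such a component could look like: a function that turns raw sensor readings into an arousal estimate a VR app can query. Every name, threshold and weight below is a placeholder of our own, not part of Triple’s brief or any final design.

```python
# Hypothetical sketch of a sensor-to-arousal "middle layer".
# It scores how far each signal sits above a per-user resting baseline
# and averages the (clipped) results into a single arousal value in [0, 1].

from dataclasses import dataclass

@dataclass
class SensorFrame:
    heart_rate_bpm: float        # e.g. from a wristband
    skin_conductance_us: float   # microsiemens, from a skin sensor
    pupil_dilation_mm: float     # from eye tracking inside the headset

def estimate_arousal(frame: SensorFrame, baseline: SensorFrame) -> float:
    """Arousal in [0, 1]; the divisors are arbitrary normalization constants."""
    hr = max(0.0, frame.heart_rate_bpm - baseline.heart_rate_bpm) / 60.0
    sc = max(0.0, frame.skin_conductance_us - baseline.skin_conductance_us) / 5.0
    pd = max(0.0, frame.pupil_dilation_mm - baseline.pupil_dilation_mm) / 2.0
    return min(1.0, (hr + sc + pd) / 3.0)

baseline = SensorFrame(heart_rate_bpm=65.0, skin_conductance_us=2.0,
                       pupil_dilation_mm=3.0)
frame = SensorFrame(heart_rate_bpm=95.0, skin_conductance_us=4.5,
                    pupil_dilation_mm=4.0)
print(estimate_arousal(frame, baseline))  # 0.5
```

The VR app would poll this value each frame and adapt the scene accordingly; estimating valence is harder and would need more than these three signals.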

Over the last few weeks we have been attacking our challenge from different angles and trying to figure out different ways to meet our goals. In the startup week, we had a 3-day sprint where we got to play with the different tools in the Makers Lab and create different prototypes to show our client Triple. We’ve dug deep into emotions, sensor technology, VR and Augmented Reality (AR) to try to understand our problem, get better insights into the tools available and get an overview of our options and potential solutions. Last week we went to Triple’s headquarters to meet our clients, talk to them about our ideas and hear their thoughts and input. What were they actually expecting from us, and how could we best collaborate to achieve our goals? Triple is giving us a lot of freedom in how to tackle the challenge, but will provide us with tools and expert knowledge in order to reach our goals.


Getting an overview of our challenge and generating ideas for prototyping in the MakersLab


One of our prototypes made in the MakersLab: a form of Twister based on emotions, where the different emotions have different textures that represent the feeling generated.

After our meeting with Triple we decided to focus on one specific emotion that we can test and conduct experiments on. After overall research on emotions we decided to focus on fear, as a lot of research has already been conducted on this emotion, especially in relation to detecting it through sensor technology. Everybody can get scared and feel fear when something external triggers the fight-or-flight response in the amygdala. When you are afraid your heart rate rises, you start sweating and your skin temperature drops slightly. All of this is trackable through different sensors, and our challenge now is to select the sensors that can best detect the emotion of fear.
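As a thought experiment, the three physiological markers above could be combined into a naive rule-based fear flag. This is only an illustration of the idea (the thresholds are invented, and any real detector would need per-person calibration and likely a trained model rather than fixed cut-offs):

```python
# Illustrative only: flag a possible fear response when all three signals
# mentioned above move away from baseline at once (heart rate up, sweating
# up, peripheral skin temperature down). Thresholds are placeholders.

def possible_fear(hr_delta_bpm: float,
                  conductance_delta_us: float,
                  skin_temp_delta_c: float) -> bool:
    """Each delta is (current reading - resting baseline) for one person."""
    heart_racing = hr_delta_bpm > 15.0       # placeholder threshold
    sweating = conductance_delta_us > 1.0    # placeholder threshold
    cooling = skin_temp_delta_c < -0.5       # skin cools under stress
    return heart_racing and sweating and cooling

print(possible_fear(25.0, 2.0, -1.0))  # True: all three signals fire
print(possible_fear(25.0, 0.2, -1.0))  # False: no sweating spike
```

Requiring all three signals makes the rule conservative: a racing heart alone could just as well be excitement, which is exactly the ambiguity we will need to design around.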


Janina and Christiaan playing around with skin-conductivity sensors

Today we had an expert consultation with Joanneke Weerdmeester (PhD candidate at the Developmental Psychopathology research group at the Behavioural Science Institute in Nijmegen) to share our progress and ideas and get insights into how to tackle our problem further. Joanneke’s PhD project focuses on the development and validation of biofeedback games for children with anxiety, and we found her field of knowledge highly relevant for our project. She was very helpful in advising us on the different routes we can take with our project, and she gave us a lot of relevant information in relation to games and sensor technology. She has previous experience with different sensors used in projects she has worked on, such as the games DEEP and Nevermind. Her input definitely gave us some ideas on how to progress further.


First-week prototype on detecting emotions through self-reporting of four basic emotions: sad, happy, scared and neutral.
The players would look at pictures in different categories, from which we selected four pictures that we thought would represent the different emotions. We would then see whether the participants reported the same emotions as we had assumed. The prototype was made using Makey Makey.


Janina and Rory on our way to our first meeting with our clients


On our way back to the MediaLAB in Amsterdam after our first client meeting with Triple in Alkmaar. Everyone super happy, while Christiaan and Agnetha are lost in the virtual world of distractions.

We are excited to continue this journey further, trying and failing, and hopefully, we can contribute to an amazing open source software that can benefit different stakeholders and research institutes 🙂

Until next time,

Meraqi,
Janina, Christiaan, Rory and Agnetha
(Team Sense VR)