
Sense VR

Exposure Therapy with Virtual Reality


LAB Fest and Final Presentation

On Wednesday the 17th of January, MediaLAB Amsterdam arranged LABFEST, a final expo where we could showcase our prototypes and talk to people from the industry about our projects. We got a lot of nice feedback and are happy with the end product we came up with. Quite a lot of people showed up, and we were excited to talk to them about our prototype and the future possibilities of our Virtual Reality Exposure Therapy!

Below you can see a few pictures from the expo :)

 

Team photo in front of our stand at the Expo!

 

Pavel is trying out the Relaxation Therapy before entering the VRET.

 

 

Our business cards ready to be handed out!

 

Our team member Janina at the Expo

FINAL PRESENTATION FOR OUR CLIENTS!

On Wednesday morning we presented our project to our clients and let them try out the VRET prototype!

We are also going to publish a research paper that goes into more detail about our design-thinking process and explains the computational model and our prototype. If you are interested in continuing with the ideas or want to know more about our project, feel free to contact us at VRMediaLAB2017@gmail.com.

Thanks to everyone who has contributed and helped us out in one way or another, and a big thanks to our coach Tamara Pinos Cisneros and our lead programmer Danny Dorstijn!

Jingle Bells Ring After Sinterklaas

We were happy to get a professional photo of our team before the end of the project. Here we are! Team MERAQI is back in high spirits!

This sprint is 1 week shorter than previous ones. We made the most of it and had a lot of fun.

Finnish Lunch Cooking

Our happy Finnish girl, Janina, cooked a Finnish lunch for MediaLAB on 4 December. By the way, Finland celebrated 100 years of independence on 6 December. We, creators from all over the world, celebrated this historical moment in advance by eating Finnish spinach pancakes!

 

Sinterklaas Party

Sinterklaas left some gift boxes at MediaLAB before he went back to Spain. On 6 December, almost all the MediaLAB members gathered to read our letters and open our gift boxes from Sinterklaas. Some got chocolate letters, and some received toys. The whole room was filled with laughter and happiness. What a wonderful evening!

 

Experiment

We set up a room with black curtains and aluminum foil, run either without lighting or with RGB lighting. The main goal of this experiment was to test changes in heart rate when users were exposed to something fearful in a virtual environment. Subjects were first told to watch a calming video in order to establish a resting heart rate as a baseline. Then they were asked to play a game called Slender Man, which provided an individual gaming experience. Half of the participants were exposed to RGB lights that changed color based on the user's heart rate during the game, while the other half played in a dark room. Meanwhile, we recorded the users' behavioral responses during the game.
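Conceptually, the light control is just a mapping from heart rate to color. The sketch below shows that kind of mapping in Unity C#; the 40 bpm cap and the blue-to-red gradient are placeholder choices for illustration, not our exact settings.

```csharp
using UnityEngine;

// Minimal sketch (placeholder thresholds, not our exact values):
// map how far the current heart rate sits above the resting baseline
// to a color for the RGB lights, from calm blue to alarming red.
public static class HeartRateToColor
{
    public static Color Map(float currentBpm, float restingBpm)
    {
        // 0 = at or below baseline, 1 = 40+ bpm above baseline (assumed cap).
        float elevation = Mathf.Clamp01((currentBpm - restingBpm) / 40f);
        return Color.Lerp(Color.blue, Color.red, elevation);
    }
}

// Example use inside a MonoBehaviour that owns a Light component:
//   light.color = HeartRateToColor.Map(currentBpm, restingBpm);
```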

 

Snow White Amsterdam

On Monday morning, 11 December 2017, the whole of Amsterdam was covered with white snow. In the afternoon, a heavy storm with big snowflakes hit the city. Trains and flights were cancelled due to an orange weather alert. We could not continue our experiment that day since we could not get any participants.

The data analysis at the end of the experiment showed that there was not much of a relation between heart rate range and how scary participants found the game. We are now trying to figure out a computational model that makes sense in other ways.
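For readers who want to run a similar check: the analysis boils down to correlating each participant's heart rate range with their reported scariness. Below is a generic Pearson correlation helper in C#; the parameter names are our own illustration, and no real data is included.

```csharp
using System;
using System.Linq;

// Illustrative only: Pearson correlation between heart rate range
// (max - min bpm during the game) and a self-reported "scary" rating.
public static class CorrelationCheck
{
    public static double Pearson(double[] x, double[] y)
    {
        double meanX = x.Average(), meanY = y.Average();
        double cov = x.Zip(y, (a, b) => (a - meanX) * (b - meanY)).Sum();
        double sdX = Math.Sqrt(x.Sum(a => (a - meanX) * (a - meanX)));
        double sdY = Math.Sqrt(y.Sum(b => (b - meanY) * (b - meanY)));
        return cov / (sdX * sdY); // close to 0 means "not much relation"
    }

    // Usage (arrays hold one value per participant):
    //   double r = Pearson(heartRateRangePerParticipant, scaryRatingPerParticipant);
}
```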

 

Christmas Party

We set up a lovely Christmas tree in our office and took a Christmassy group photo with our coach Tamara at the end of 2017. In addition, HvA invited its employees, including us, to a fancy Christmas party with yummy food, drinks, and loud music in a party hall, where we drank and danced till late. Tamara's enthusiasm for dancing was very impressive.

In 2017, we experienced ups and downs together in our project. We hope it will only make us stronger in 2018!

Cheers!

 

Failure Is Success In Progress

Our Adventure never stops!

Nijmegen Day

We went to Enjoy VR in Nijmegen on 14 November to get more inspiration from underwater VR games.

After experiencing several underwater games in VR, we realized that our underwater concept so far was not unique enough. However, if we managed to integrate biofeedback into the underwater VR experience, that would be a big step forward for the VR field.

After that, we went to the VR LAB, where we were asked to stand on a metal mesh and rebuild the landscape of Nijmegen in VR by placing landmarks taken from other cities. The people who designed this interactive game shared their ideas about the importance of the metal mesh under the user's feet, which creates presence in the game through haptic feedback.

 

At a previous translate session we got in contact with Joanneke Weerdmeester, who works at Radboud University Nijmegen. She gave us the opportunity to visit the university and receive feedback from a group of experts experienced with EEG, therapy, game development, behavioral science, heart rate and other sensors. Our initial idea was to incorporate heart rate sensors and EEG in the concept to estimate the emotion of the user so the environment could be adapted accordingly. Unfortunately, the experts told us a few things that will change the concept drastically. They explained that a VR headset causes electrical interference with the EEG and therefore gives very inaccurate readings with a lot of noise. Furthermore, there are probably not many people who are afraid of jellyfish or some of the other underwater phobias who actually go to a therapist to get help for them. This was a significant reality-check moment.

It was an upsetting evening, and the feeling lingered for a few days. However, smiles were still on our faces, as you can see in the photo below. A few days later, we started to rethink our concept and came up with a new one that could be applied to art therapy. In addition, we decided not to go for EEG but to take behavioral responses into account in our concept instead.

Pending

The following Monday, Triple visited us at MediaLAB and we served Norwegian food for the MediaLAB lunch. The reindeer meat in gravy and the sweet porridge dessert did not taste bad at all!

After a few days of catching up with Triple about our reality check in Nijmegen, we decided to continue with our underwater concept and to focus for now on a computational model that uses a heart rate sensor as biofeedback.

Tinkering

At the end of the week, with the help of some computer geeks from HvA, we tinkered with heart rate sensors (Garmin, MioLINK and Arduino) and found out that connecting these sensors to a computer via Bluetooth is not an easy job if you do not have enough programming knowledge. We then decided to try ANT+ instead of Bluetooth to build the connection between the heart rate sensor and Unity in our computational model. It turned out that was not an easy job either!
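A simpler fallback (not the pipeline we ended up with) would be to skip the wireless part entirely and have an Arduino pulse sensor print one BPM value per line over USB serial, read on the Unity side with System.IO.Ports. The sketch below illustrates that route; the port name and baud rate are placeholders, and it assumes Unity's API compatibility level is set so that System.IO.Ports is available (.NET 4.x rather than the .NET Standard 2.0 subset).

```csharp
using System.IO.Ports;
using UnityEngine;

// Minimal sketch: read BPM values that an Arduino prints one-per-line
// over USB serial and expose the latest value to other scripts.
public class SerialHeartRateReader : MonoBehaviour
{
    [SerializeField] private string portName = "COM3"; // placeholder
    [SerializeField] private int baudRate = 9600;      // placeholder

    public int CurrentBpm { get; private set; } = 60;  // resting default

    private SerialPort port;

    private void Start()
    {
        port = new SerialPort(portName, baudRate) { ReadTimeout = 50 };
        port.Open();
    }

    private void Update()
    {
        try
        {
            // One line per reading, e.g. "72\n", written by the Arduino.
            string line = port.ReadLine();
            if (int.TryParse(line.Trim(), out int bpm) && bpm > 30 && bpm < 220)
            {
                CurrentBpm = bpm;
            }
        }
        catch (System.TimeoutException)
        {
            // No new reading this frame; keep the last value.
        }
    }

    private void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```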

“Failure is success in progress.” (attributed to Albert Einstein)

We have to get up and keep trying!

 

#Inspiration

At the beginning of the third sprint we spent a few days outside the MediaLAB to gather some inspiration for redefining our concept.

Dutch VR Days

On Thursday the 26th of October we visited the Dutch VR Days, where we got a chance to test the newest VR headsets (including the 8K Pimax, the only one that didn’t make Chris feel dizzy) and to feel the overwhelming excitement in the growing field of VR and AR.

To start, we all lined up to test the PlayStation VR. Rory got a chance to play a deep-ocean game in which the user is transported into the water and, while gradually sinking, the environment gets darker and the fish get bigger. At some point (apparently when the sharks appeared) she felt so scared that she needed to take the headset off and escape from the VR environment. Janina also had to face her fears by playing a VR simulation of a “hall of horror” – the kind familiar from amusement parks (she screamed at the beginning of the game and felt pretty embarrassed about that). Agnetha played a shooting game on PlayStation, but she wasn’t really convinced about the quality of the game as there were a lot of glitches.

We also tested some other scary games in order to get new insights for redefining and redesigning our concept. Something we found impressive was a scary game called “Hell Eluja”, which was made interactive by adding a tablet to control the VR experience in real time. The rules of the game were quite simple: the person with the tablet (the second player) was asked to find the user experiencing the VR by placing cameras, and then to spawn monsters to catch and kill the user. Rory totally freaked out, and at the end we found her under the table. One idea for our concept could be to let the therapist “control” the exposure therapy based on the user's behavioral reactions.

In addition to exploring different kinds of VR headsets and environments (including a physical water pontoon where one could swim with dolphins!?) and other crazy, though genius, ideas about how to make the most of VR (check out Sensiks, for example), we also sat down for a while and listened to a few experts from the field. Isabel Tewes, now working as a developer strategist at Oculus, gave a talk about user engagement. Even though the topic was not directly relevant to our project (since in the end we hope the user won’t need to come back to our VR environment), we were able to collect a few points that we should perhaps take into account when redesigning our concept.

Firstly, Tewes mentioned the importance of the first impression in engaging the user with the VR environment. In fact, the first seconds in VR are crucial for immersion – which of course is essential in our case of exposure therapy. As Tewes noted, there need to be some elements, or “magic”, that help us engage with the VR environment. Interestingly, in all of her examples the engagement was strongly built up by other imaginary subjects in the VR environment (see, for example, Rick and Morty: Virtual Rick-ality). This strongly reminded us of “Henry”, a VR experience about a hedgehog's birthday, in which the user empathises and embodies him/herself in the VR environment even though it is a cartoon and fictional. Taking all of these examples into account, we think something worth studying could be the presence of a third person, or even a fictional subject, in a VR environment, and its relation to empathy, engagement and immersion in virtual reality.

Dutch Design Week @Eindhoven

The following day (Friday the 27th of October) we took a train to Eindhoven to visit the Dutch Design Week and get some more inspiration. The exhibition “We know how you feel tonight” by VPRO Medialab was one of the events in the DDW program that caught our attention.

At the entrance, the exhibition gave a quick look into the short history of emotion tracking. It made the visitor question his/her emotional data, as well as its privacy, openness and future. Visitors could give feedback by typing comments onto either the “Fears” or the “Hopes” sheet concerning the use of their emotional data.

After the short introduction we passed through big black curtains into a dark room. The colorful auras around the people lying on the floor created a slightly spooky feeling. Taking a closer look, we could see that they had different kinds of sensors attached to them. The EEG sensor on their foreheads was programmed to listen to and record the biofeedback from their bodies – and transport them away from the noise of our footsteps into somewhere else. But who knows where?

The emotional aura, appearing in different colors according to the users' emotional data, was built according to Russell’s Circumplex Model of Affect (1980), which is already familiar to us. For the EEG they were using the MUSE sensor, which was developed especially for meditation. However, interpreting the data and visualizing it is never an easy task. According to the exhibition builders, the emotional aura, made visible by lasers, was relatively difficult to build. Overall, the exhibition was worth visiting, and we were especially happy that a lot of space was left for critical thinking. Even the experience as an observer raised different kinds of, and sometimes overlapping, emotions, feelings and questions.

How can you be sure the emotional aura is built from the direct biofeedback of the person? Is there somebody behind the recording machine saving the emotional data of the user – and if so, how will it be reused in the future? Are those people there voluntarily, or did somebody pay them to come? How willing would you be to be attached to those sensors, visibly showing your “emotions” to other people? Does the visualization of emotional biofeedback create intimacy between people?

Digging into the last question, we believe that the intimacy of biofeedback might be put to use in the future. From the therapy point of view, the visualization of emotional data can help build an empathetic environment, which in turn can help us take different kinds of perspectives. For example, the film producer Chris Milk calls virtual reality the “ultimate empathy machine”, something he also explored by producing a project for the United Nations: the 360-degree film “Clouds Over Sidra”, which transports the user into the life of a 12-year-old Syrian refugee girl. In his TED talk, Milk argues that the empathy activated via VR, in particular, can help us change the world. However, the notion of empathy raises controversial debates in the field. Is empathy something special to VR, or is it more a question of immersiveness and the different routes to it? As we argued in our previous blog post, the embodiment of the user can also lead to immersion and engagement in virtual reality, as well as to the state of flow.

The VPRO Medialab states on their website: “Our emotions have been our own until now, but it seems that this is about to change.” Do you agree or disagree?

The Virtuous Cycle of Immersion

“The goal of VR storytelling is to tell a story that will stimulate emotions that will influence action” – Shin (2017)

In 1980 Marvin Minsky used the term “telepresence” to describe the feeling a human operator may have while interacting with teleoperation systems. Later, telepresence evolved into “immersion”, which generally describes a deep mental involvement in something. During the second sprint we studied immersion in the VR environment, focusing on the process from the user's perspective. By putting the individual at the centre of studies about immersion, embodied cognition and the theory of flow, we were able to find some new points of view on our topic and get some validation for our concept.

Janina’s physical mind map of “The virtuous cycle of immersion” (aka. never ending story).

Not surprisingly, the first important notion is again individuality. The personality traits of the user (the so-called “Big Five”) need to be taken into account before the user's level of immersion can be analyzed. As Weibel et al. (2010) state, previous studies mostly measure the level of immersion in terms of the user's satisfaction and positive feedback, while leaving open the question of how personality traits influence the overall experience of immersion. Recent studies suggest that certain individual factors, such as openness to experience and high imagery ability, have a positive effect on presence – followed up by immersion (Weibel et al. 2010). During our concept sketching we also faced this problem, finding ourselves asking what motivates our target group. What makes 15-to-24-year-old women get out of bed every morning?

 

Figure 1. A screenshot of Janina’s mind map (which is as fluid as the concept of immersion itself). With the help of this map, we can see that in addition to “openness to experience”, the emotional involvement of the user might also be relevant to the level of immersion. And as you can see, from the design perspective, in order “to facilitate immersion and to enhance sensations of presence, prior assessment of the users’ personality might effectively help tailoring the most suitable environment for each user” (Weibel et al. 2010).

In addition to the personality traits of each user, the notion of the body and the sense of embodiment are important for defining the level of immersion. The VR environment not only makes users feel different kinds of emotions but, moreover, changes who they are in the virtual space (Shin 2017). A fully immersive VR environment offers a sense of embodiment, which can be defined as a state in which users feel themselves to be part of the VR environment. At the same time, users can also feel that VR components are parts of their own bodies (see Bailey et al. 2016). In fact, users often create a virtual body of themselves inside the immersive VR environment as “an analog of his or her own biological body” (Kilteni et al. 2012).

“In other words, the embodied cognition in VR allows users to feel a sense of embodiment.” (Shin & Biocca 2017)

As Shin and Biocca (2017) conclude, this sense of embodiment comes from our embodied cognition, which we process by being physically “present” somewhere. Marvin Minsky (2006) would probably count presence among those suitcase-like words that don’t have a clear definition but that can help us understand our brain processes. However, for now we will define presence as a state of mind and immersion as an experience over time (Shin 2017). In fact, presence is not unique to VR, as watching a movie or simply reading a text can also induce a feeling of presence (Baus and Bouchard 2014). Nevertheless, presence and immersion together are both strongly related to our ability of perspective taking and to perception (Rudrauf et al. 2017).

If immersion in the VR environment is defined as “an experience over time” and not an instant, temporary connection (which we would probably call presence), it means that the user's cognition is highly involved in this process. The VR environment, as well as the stories told in VR, is constantly reprocessed through the user's sense-making. As Shin (2017) argues, “users actively create their own VR, based on their understanding of the story, their empathetic traits, and the nature of the medium.” Therefore, the role of VR developers is only to propose an immersion; the processing of the immersion is left to the user. It is important to understand both presence and immersion as fluid states that are reprocessed and redefined by the user (Shin 2017).

Other necessary elements for immersion and user engagement are the ability of perspective taking, the concept of flow, and empathy. Of course, perspective taking involves the 360-degree nature of the VR environment. Unlike in other media, in virtual reality the user is not limited to 2D perspective taking and perception, but is allowed all kinds of projective transformations (see the mathematical model of embodied consciousness by Rudrauf et al. 2017). This can also mean that in some scenarios less is left to the imagination, as the user is able to observe the VR environment from every angle before making predictions. In comparison, the TV screen as a medium constantly and unconsciously forces us to take into account the invisible cues framed out of the screen – and to fill them in with our own imagination.

The concept of flow is said to be a determining feature for the ability to empathize with and embody VR stories (Shin 2017). First of all, it can be argued that immersive technology and VR produce empathy only because of the sense of embodiment – it places the user at the centre of the experience. However, Shin (2017) also differentiates presence and flow from each other, stating that “Presence can be immersion into a virtual space, whereas flow can be an experience of immersion into a certain user action.” Flow, in this context, always involves action.

Figure 2. The flow model of Mihaly Csikszentmihalyi (1997). As we can see, the arousal region lies between anxiety/fear and the state of flow. This figure partly follows Russell’s circumplex model of affect (1980).

The common view is that breaks in presence (BIPs) need to be avoided in order to maintain the user's immersion in the VR environment. Slater and Steed (2000) were the first to define the concept of a BIP, describing it as any event whereby the real world becomes apparent to the participant. However, this feature could be studied further in relation to our topic of different kinds of phobias and their treatment. Would it be possible to play with breaks in presence in order to help people conquer their fears? In fact, some exposure therapists argue that for some phobias distraction might hinder the process, whereas for others it might actually work as a facilitating feature (McNally 2007).

We also took a look at calm technology and how it could be relevant to the level of immersion. Of course, the immersive experience needs to be mediated as transparently as possible, without technological interference becoming an obstacle for the user. For people with a low immersive tendency, all the technological aspects probably need to be designed more carefully, but as previous studies show, the technological aspects by themselves are not the main ingredients of immersion (Weibel et al. 2010; Shin 2017).

One popular point of view on the definition of immersion concerns the coherence of the environment. This means that the VR environment doesn’t necessarily need to follow the ordinary laws of physics, as long as it follows some kind of coherent laws and rules. Proficiently implemented in virtual reality, this feature can surely turn out to be beneficial in some cases, and it is something we will try to keep in mind.

As Shin (2017) concludes:

“Future research should see immersion as a cognitive dimension alongside consciousness, awareness, understanding, empathizing, embodying, and contextualizing, which helps users understand the content and stories delivered.”

Understanding the concept of immersion is important for us, as we are working with different kinds of phobias and fears, focusing on their treatment with exposure therapy in VR. As the main challenge is to change the fear memory of the user, the VR environment needs to be immersive and realistic enough to carry the behavioral change of the user over into real life (in vivo) – in other words, to have a long-term impact.

Next, by taking a deeper look into the different levels of our consciousness (i.e. trying to make sense of Minsky’s theories) and searching for all kinds of relevant “black holes” in our emotional processing, we hope to find new insights for emotion recognition during the coming sprints.

Figure 3. One proposed model of immersion, with the levels of user experience and the quality of the experience (Shin 2017). We find this model too simple to capture all the aspects of immersion, but it clearly positions the sense of embodiment and empathy as guarantors of the quality of the user experience in VR environments. Baus and Bouchard (2014) also name three aspects by which to measure the quality of the user's experience in VR: the feeling of presence, the level of realism and the degree of reality.

 

Sources:

Bailey J., Bailenson J. and Casasanto D. (2016). “When does virtual embodiment change our minds?” Presence, Vol. 25(3): 222-233.

Baus O. and Bouchard S. (2014). “Moving from Virtual Reality Exposure-Based Therapy to Augmented Reality Exposure-Based Therapy: A Review.” Frontiers in Human Neuroscience, Vol. 8: 112.

Kilteni K., Groten R. and Slater M. (2012). “The sense of embodiment in virtual reality.” Presence, Vol. 21(4): 373-387.

McNally R. (2007). “Mechanisms of exposure therapy: How neuroscience can improve psychological treatments for anxiety disorders.” Clinical Psychology Review, Vol. 27(6): 750-759.

Minsky M. (2006). The Emotion Machine: Commonsense Thinking, Artificial Intelligence, and the Future of the Human Mind. Simon & Schuster, NY.

Rudrauf D., Bennequin D., Granic I., Landini G., Friston K. and Williford K. (2017). “A mathematical model of embodied consciousness.” Journal of Theoretical Biology, Vol. 428: 106-131.

Shin D. (2017). “Empathy and embodied experience in virtual reality environment: To what extent can virtual reality stimulate empathy and embodied experience?” Computers in Human Behavior, January 2018, Vol. 78: 64-73. Available online: http://www.sciencedirect.com/science/article/pii/S0747563217305381

Shin D. and Biocca F. (2017). “Exploring immersive experience in journalism.” New Media & Society, published online September 30, 2017.

Slater M. and Steed A. (2000). “A virtual presence counter.” Presence: Teleoperators and Virtual Environments, Vol. 9(5): 413-434.

Weibel D., Wissmath B. and Mast F. (2010). “Immersion in Mediated Environments: The Role of Personality Traits.” Cyberpsychology, Behavior, and Social Networking, Vol. 13(3): 251-256.

The Journey from Sprint 2 Sprint!

Time flies and our project is taking shape and transforming into new possibilities. The last few weeks have been very hectic trying to get everything ready for our meeting with Triple.

In the first week of our sprint we did a Visual Thinking and Communication Workshop (www.loesbogers.com) with two former MediaLAB workers. They taught us how to doodle and sketch in order to visualize our ideas in an easy way. It was a lot of fun and we even made some drawings of each other.

Building on the first sprint, we decided to move from an overall look at the emotion of fear and narrow it down to looking more specifically at anxiety among young women aged 15-24.

As we had already done a lot of research about fear in our previous sprint, we decided to take it to the next level and focus on anxiety, since anxiety is rooted in many irrational fears that affect people's everyday lives. Seeing how big an issue anxiety is, especially among young women and girls in their teens, we decided to direct our attention towards girls and young women aged 15-24: Generation Z, with a slight overlap into the tail end of the Millennial generation. We did this because we wanted to focus on a group of people on whom we can have a big impact with the product we come up with.

 

Anxiety disorders are also the most common mental disorders in Western countries, and as many as 4 out of 100 people experience anxiety globally. In addition, women are twice as likely as men to experience anxiety, and anxiety levels among young women and girls are higher than ever and keep rising. A growing number of academic studies find that mental health problems have been soaring among girls over the past 10 – and in particular the last five – years. The physical responses are also easier to detect in women than in men.

Many young girls and women face a huge range of pressures. In particular, the rise of social media means they need to always be available, they may seek reassurance in the form of likes and shares, and they are faced with constant images of ‘perfect’ bodies or ‘perfect’ lives, making it hard not to compare themselves to others. Fear of failure and fear of being left out socially are especially troubling for young girls. NHS data shows a 68% rise in hospital admissions for self-harm among girls under 17 in the past decade. Body dissatisfaction is seen in about 10% of girls at primary school but jumps sharply in early adolescence, as puberty starts, and about 50% of girls aged 15 feel that they are too fat. During this period girls tend to self-objectify more than boys, experience more teasing about weight and shape, and perceive more pressure from friends and family to be thin.

Through research and surveys we have become aware of many of the triggers that make young girls and women feel inadequate, such as social media, peer pressure, body dissatisfaction, bullying, stress at school, FOMO (fear of missing out), and the tendency to ruminate. Understanding the fears of young girls and women will definitely help us in our process towards making an adaptable and immersive environment that can empower them to become more confident in their own skin, and eventually free of anxiety.

Through the VR environment we aim to:
– Improve their ability to adapt to life’s challenges (e.g. flexibility, optimism, social competence, emotional management and positive self-esteem)
– Face fear by building courage (mental, intellectual, emotional and moral)
– Develop the intellectual courage not to fear what others think
– Learn self-acceptance that creates inner strength
– Boost buoyancy and adaptation to the challenges children and young people face

With a well-defined target group, we decided to do our translate session at Rotterdam University of Applied Sciences, where our team member Christiaan gave a pit-stop presentation about our project so far. He designed a good-looking poster that he presented to the rest of his class. Besides looking at other projects and gaining some new insights, we also got a lot of good feedback from different teachers on how to improve our ideas.

As research, surveys and interviews have taken up a lot of our time these last two sprints, we decided that it was time to get creative and generate some cool concepts to present to our client Triple during our next meeting. With good knowledge of our target group, we could brainstorm several different ideas for how to help them overcome their fears, anxieties and phobias.

 

Working on our presentation right up until our meeting, we delivered three different concepts to Triple for how to help our target audience.

Below you can see the moodboards of the three concepts we presented to Triple.

Virtual Group Therapy (MORPG) 

Emotional Internal Journey

Cinematic Empathic Experience

After feedback from Triple, we have chosen to go with the Emotional Internal Journey concept and are excited to continue working on it for the rest of the project. We still have a lot of work to do, and the next sprint will be focused on researching the integration of the Oculus Rift with the sensor technology, and on how to extract and transform the data to create the adaptable environment that can be used as a form of exposure therapy for people with different phobias.

Below you can see our first storyboard from the selected concept. Drawing made by Rory!

Until next time!

Love,

MERAQI

 

Research trip to Twente University


Janina and Agnetha excited for the symposium 

During our first sprint we took part in a one-day symposium titled “Wearables in Practice: Physiology and Emotions 4TU”, held at the University of Twente in Enschede. We got a lot of new insights from experts working in the field of sensor technology and individual health.


Joyce Westerink’s self-tracking with a skin conductivity sensor, in combination with taking notes throughout her day

Joyce Westerink, a former physicist, has been focusing on wearable well-being in her research. She confirmed that skin conductivity is a reliable signal for measuring arousal levels, although the delay can vary from 6 to 10 seconds. In her study about stress levels in teachers, she also visualized the data so that the teachers could interpret their stress levels themselves. Westerink gave a brief insight into the psychophysiology of communication and the intimacy of heartbeats. In her experiment – placing two individuals at opposite sides of a room and then asking them to walk towards each other – she found that hearing the other person's heartbeat made the subjects stop approaching each other sooner. As she suggests, hearing a personal heartbeat creates intimacy between the subjects. She strongly believes that this feature could also be used to improve our relationships with each other. Westerink argued that heart rate as a connection is more readily accepted by users than heart rate as information. For our next sprints we will keep in mind the possibility of offering the subject a visualization of his/her emotional data.


One research project looking at how to track emotions in people with borderline personality disorder, using a self-evaluation app in combination with heartbeat-tracking wristbands

The emphasis of the symposium was on the individual. Almost every speaker highlighted self-regulation and self-reflection through wearables, as well as individual interpretation of one’s own physiological data. For example, Matthijs Noordzij stated that smartwatch technology can be used as a means of warning, informing and/or educating people. However, in order to integrate smartwatch technology into our everyday lives, it is necessary to design predictive models with the help of sensors. As Noordzij highlighted, we should invent a new approach to dealing with the data by focusing more on design research.


One pilot study on wearable sensor tracking through vital signs. 

During the symposium, we also participated in a workshop to test out the effects of “surprise” or “excitement”. One participant in each team wore a wristband that tracked their heartbeat, with the data sent to an app on their smartphone. All the data from the day is stored there, and when the wristband detects that your pulse is increasing it makes a sound, so you can go to the app and add notes about what happened at that moment. The participants in this experiment had to sit on a balloon, and the excitement of sitting on it, together with the anticipation of whether or not it would explode, increased their heart rate (arousal level). Some balloons did indeed explode, and we could clearly see the changes in pulse in the app. There was some delay, though, between the actual “shock” and the moment the sensor detected it and gave feedback to the user, which made it a bit uncertain what had actually triggered the emotion. We were asked to give our input on the product and our thoughts on how to improve it. This use of technology to track emotions is definitely interesting for our project, and in our next sprint we will try to push forward and test out some of the ideas.


Workshop session with heartbeat sensor, smartphones and balloons



The team keeping a happy mood after a long day in Enschede 🙂 

The symposium gave us a lot of insight into current findings in the field of wearable technologies and how we might implement such technologies in our own final product. We talked to quite a few researchers with experience and knowledge in the field, and we were highly impressed with their research so far. With all this in mind, we are excited to start the next sprint with new knowledge and broader insights. We also got feedback on our ideas, and people seemed to agree that a solution to the problem we are attacking could be very useful in many different fields. We hope we can come up with a great one to tackle it!

AN INTRODUCTION TO OUR CHALLENGE…

Four people from around the globe came to the Dutch capital, Amsterdam, and were put together as a team at MediaLAB Amsterdam to face the challenge from Triple: Yujie Shan (CN), Janina Saarnio (FI), Christiaan van Leeuwen (NL) and Agnetha Mortensen (NO), with guidance from their coach Tamara Pinos from Ecuador. All with different backgrounds and skills, with the prospect of learning from each other and growing together as a team!


GOAL (CHALLENGE):

“The development (and open-source release) of software that can determine the user’s emotional state based on sensory data (e.g., head motion, pupil dilation, heart rate variability) that is produced while she or he interacts with a virtual reality (VR) environment. This software can serve as a ‘middle layer’ between app and user, allowing the VR environment to adapt to the user’s present emotional state (i.e., valence and arousal levels).”
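To make the “middle layer” idea concrete, here is a minimal sketch of what such a layer could look like in Unity C#: raw sensor readings go in, an estimated valence/arousal state comes out, and the application only talks to the interface. This is our own illustration of the brief, not code from Triple; the field names and the naive heart-rate-only estimator are placeholders.

```csharp
using UnityEngine;

// Illustrative sketch of the "middle layer" idea, not Triple's spec:
// raw sensor data in, an estimated emotional state (valence/arousal) out.
public struct SensorSample
{
    public float heartRateBpm;        // from a heart rate sensor
    public float pupilDilationMm;     // from eye tracking
    public float headMotionMagnitude; // e.g. angular speed of the headset
}

public struct EmotionalState
{
    public float valence; // -1 (negative) .. +1 (positive)
    public float arousal; //  0 (calm)     .. 1 (excited)
}

public interface IEmotionEstimator
{
    EmotionalState Estimate(SensorSample sample);
}

// Deliberately naive baseline estimator: arousal from heart rate only,
// valence left neutral. A real model would fuse all signals over time.
public class HeartRateOnlyEstimator : IEmotionEstimator
{
    private readonly float restingBpm;

    public HeartRateOnlyEstimator(float restingBpm) { this.restingBpm = restingBpm; }

    public EmotionalState Estimate(SensorSample sample)
    {
        return new EmotionalState
        {
            valence = 0f,
            arousal = Mathf.Clamp01((sample.heartRateBpm - restingBpm) / 40f)
        };
    }
}
```

The point of the layer is only that the VR application asks the estimator for the current state and adapts the environment accordingly, while the way the state is computed can be swapped out behind the interface.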

Over the last few weeks we have been attacking our challenge from different angles and trying to figure out different ways to meet our goals. In the start-up week, we had a three-day sprint where we got to play with the different tools in the Makers Lab and create prototypes to show our client Triple. We’ve dug deep into emotions, sensor technology, VR and augmented reality (AR) to try to understand our problem, get better insight into the available tools, and get an overview of our options and potential solutions. Last week we went to Triple’s headquarters to meet our clients, talk to them about our ideas, and hear their thoughts and input. What were they actually expecting from us, and how could we collaborate most effectively to achieve our goals? Triple is giving us a lot of freedom in how to tackle the challenge, but will provide us with tools and expert knowledge to help us reach our goals.


Getting an overview of our challenge and generating ideas for prototyping in the MakersLab


One of our prototypes made in the MakersLab. A form of Twister based on emotions, where the different emotions would have different textures that would represent the feeling generated. 

After our meeting with Triple we decided to focus on one specific emotion that we can test and run experiments on. After an overall look at emotions, we decided to focus on fear, as this is an emotion on which a lot of research has already been conducted, especially in relation to detecting it through sensor technology. Everybody can get scared and feel fear when something external triggers the fight-or-flight response in the amygdala. When you are afraid, your heart rate rises, you start sweating, and your skin temperature can drop. All of this is trackable through different sensors, and our challenge now is to select the sensors that can best detect the emotion of fear.
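Purely as an illustration of how such signals could be combined (the thresholds are placeholders and this is not a validated detector), a first naive rule might look like this:

```csharp
// Naive, illustrative rule only; thresholds are placeholders, not validated.
// Flag "possible fear" when the heart rate is well above baseline while
// skin conductance is rising and skin temperature is falling.
public static class NaiveFearRule
{
    public static bool PossibleFear(
        float heartRateBpm, float restingBpm,
        float skinConductanceDelta,  // change over the last few seconds
        float skinTemperatureDelta)  // change over the last few seconds
    {
        bool elevatedHeartRate = heartRateBpm > restingBpm + 20f;
        bool sweating = skinConductanceDelta > 0f;
        bool cooling = skinTemperatureDelta < 0f;
        return elevatedHeartRate && sweating && cooling;
    }
}
```

In practice, individual baselines and sensor delays make any single rule like this unreliable, which is exactly why selecting and combining the right sensors is the hard part.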


Janina and Christiaan playing around with skin-conductivity sensors

Today we had an expert consultation with Joanneke Weerdmeester (PhD candidate in the Developmental Psychopathology research group at the Behavioural Science Institute in Nijmegen) to share our progress and ideas and to get insights into how to tackle our problem further. Joanneke’s PhD project focuses on the development and validation of biofeedback games for children with anxiety, and we found her field of knowledge highly relevant to our project. She was very helpful in advising us on the different routes we can take with our project, and she gave us a lot of relevant information about games and sensor technology. She has previous experience with different sensors used in projects she has worked on, such as the games DEEP and Nevermind. Her input definitely gave us some ideas on how to progress further.


First-week prototype for detecting emotions by self-reporting four basic emotions: sad, happy, scared and neutral.
The players looked at pictures in different categories, for which we selected four pictures that we thought would represent the different emotions. We then checked whether the participants reported the same emotions as we had assumed. The prototype was made using MakeyMakey.


Janina and Rory on our way to our first meeting with our clients


On our way back to the MediaLab in Amsterdam after our first client meeting with Triple in Alkmaar. Everyone super happy while Christiaan and Agnetha are lost in the virtual world of distractions. 

We are excited to continue this journey, trying and failing, and hopefully we can contribute an amazing piece of open-source software that can benefit different stakeholders and research institutes 🙂

Until next time,

Meraqi,
Janina, Christiaan, Rory and Agnetha
(Team Sense VR)