Batsy's Heroic Adventures

Course: Virtual Environments

Guidelines for Empathy Building for Bats with VR

Project Overview

About the Opportunity

Since the COVID-19 pandemic, people have made increasing attempts to eradicate bat populations out of an irrational fear of disease. There is also little awareness of the ecological importance of bats, which pollinate plants, aid the fight against climate change, and control harmful insects like mosquitoes.

According to AustinBats.Org,
"Bats are America's most rapidly declining and threatened warm-blooded animals. Alarming losses of free-tailed bats have been reported though their population status is inadequately monitored. Even the Congress Ave. Bridge bats appear to be in decline."

Academic studies have shown that VR can build empathy for animals by letting humans experience their day-to-day lives.
This project aims to test and contribute to the design guidelines for building empathy towards animals defined in that academic literature.


Role & Team

UX Methods I Used

I worked as the Lead UX Researcher on this project alongside Ishita, Amanda, Fanyi, and Soojin.

Research phase:
• Academic literature review
• User journeys
• Usability Testing
• Data analysis

Design phase:
• Storyboards
• UX writing
• Wireframing
• Prototyping

Tools

Canva, Unity, Figma, Qualtrics

Timeline

Fall 2022 (8 weeks - October to December)

Problem Statement

"How might we help people empathize with bats and reduce the irrational fear resulting from the COVID-19 pandemic?"

Project Roadmap

Since our project involved information architecture that needed to be tested for an exhibit, we employed the agile design methodology. This iterative process helped us seamlessly combine the multiple layers (the content research, the application and the 3D rendering of the space) crucial for the objectives we set out during ideation.

Sneak Peek

  1. Onboarding
  2. Discovery & Reservations for interesting exhibits
  3. The AR Token Game & Rewards
  4. Detailed Content & Audio Guides
  5. AR Enhanced Memories for sharing on social media

Click on the ethereal landscape below to play my VR game.

Research & Empathize

To understand the problem space more concretely, I conducted desktop research to learn more about the lives of bats and what studies on VR could teach me about designing virtual environments for combating zoophobia (the fear of animals).

#In what ways do bats contribute to humans and the environment?

My secondary research into the importance of bats revealed the following:

01
Bats reduce the use of pesticides by killing harmful pests.

They save farmers a billion dollars annually in avoided pesticide costs by eating migrating pests and reducing egg-laying on crops.

02
They're also responsible for pollination and seed dispersal of plants like the blue agave.
03
Bats carry no more diseases than other animals.

However, unfounded speculation and misleading research only strengthen the stigma against them. Colonies of millions have already been destroyed.

#What do we already know about building empathy through academic literature on VR?

I reviewed 10 academic studies on designing effective VR experiences and their impact on empathy. Key insights are given below:

01
Users modify their thoughts and behaviors according to their avatars.

For instance, people with conventionally attractive avatars stand closer to peers in virtual spaces.

02
Embodying someone's daily routine creates better understanding of and empathy for their frustrations.

For instance, embodying the avatar of a gender minority and experiencing sexism helps people empathize more with gender minorities.

03
Low refresh rates, jarring movements and long drawn experiences cause simulator sickness.
04
Users adjust themselves to the bodies of their avatars even if their own might be different.

For instance, a person using a fox's avatar may use their foot as their tail, even if no such instructions were given by the game. A strong sense of immersion into the virtual environment can create strong feelings of ownership towards the avatar's body.

#Defining pain points: How do people view bats? What attitudes do they hold about the animal?

I interviewed colleagues and team members to learn how people predominantly view bats and to define who we were designing for. I also used desktop research to guide my understanding of the major user pain points:

Indifference & general fear

"I don't know, I've never thought of bats. But if I have to, I guess they're scary"

Fear of diseases

"Weren't they the cause of covid? I'm sure they must spread many other diseases too"

Misconceptions from popular culture

"I remember seeing Batman and bats really creeped me out since then".

Ideation & Brainstorming

The pain points made it clear that much of the problem stems from the stigma and mystery surrounding bats. It is difficult to empathize with a creature that one fears and knows little about. That became our guiding principle.

Our initial idea centered on a "Day in the Life" narrative. The experience would take the user through multiple activities that bats engage in, making them aware of their importance and the perils they face. However, further brainstorming sessions made us realize that a gamified experience would be better and we spent time exploring this further.

As the desktop research revealed, it was also important to create a strong sense of immersion by developing the setting to increase the user's sense of ownership of Batsy's (our main character's) body.

#Storyboarding:
The Game's Basic Challenge

We settled on a game idea where the objective would be to capture as many mosquitoes as one can while embodying a bat. The bat would capture mosquitoes placed around its path and help the humans in this way.
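The catch mechanic described above can be sketched as a simple proximity check, assuming the game registers a catch when the player presses the button close enough to a mosquito. The positions and the catch radius below are illustrative assumptions, not values from our Unity build.

```python
import math

def caught(bat_pos, mosquito_pos, button_pressed, radius=0.3):
    """Return True when the catch button is pressed within `radius` of the mosquito.

    Positions are (x, y, z) tuples; the radius is an assumed value.
    """
    if not button_pressed:
        return False
    # Euclidean distance between the bat and the mosquito
    return math.dist(bat_pos, mosquito_pos) <= radius

print(caught((0, 0, 0), (0.1, 0.2, 0.0), True))  # True
```

In the actual prototype this check ran each frame in Unity, paired with a 'snap' sound and haptic feedback on success.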

#Deciding on the Equipment

We decided to use the Meta Quest 2 because it is user-friendly, affordable and widely available.
It does not confine the user with a wire, unlike the Rift, and its accurate motion tracking and higher refresh rate make the experience immersive while keeping motion sickness at bay.

The mid-fi prototype featured an urban forest with a freely moving bat character. It also featured mosquitoes that could be 'caught' by pressing a button on the Oculus controller. However, it presented a risk of simulator sickness due to the height of the bat asset and its largely unhampered pathway.

⬤ Female/19/Younger siblings
⬤ Male/23
⬤ Female/26
⬤ Female/26
⬤ Male/28
⬤ Male/40
⬤ Male/41
⬤ Female/49/Parent
⬤ Female/65/Grandparent

User Journeys

By imagining our intended audiences, we adapted our product to how the physical space, the application, and interpersonal interactions would shape the user experience.


Wireframing & Prototyping

Sketching

The initial wireframes were drawn on paper to work out the flow of the game and the features and characters we wanted to add. Some ideas included pleasant background music to set the mood and a 'snap' sound with haptic feedback on successfully catching mosquitoes.

An important point of contention was how the user would embody the agent. Our secondary research suggested that first-person embodiment results in the highest levels of IVBO (Illusion of Virtual Body Ownership), but this would not give us the desired results in terms of empathizing with the avatar, as the user can't see it. Another approach would be having the user follow the avatar in a third-person POV. This would result in increased empathy, but lessened IVBO.

We decided on a hybrid POV (point-of-view) approach. The camera would be positioned slightly above the bat, allowing the user to see the bat as it moves around without being as divorced from the game as a full third-person POV.
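The hybrid POV amounts to offsetting the camera a little above and behind the avatar. A minimal sketch, assuming illustrative offset values (the real Unity camera rig used its own tuned numbers):

```python
def hybrid_camera_position(bat_pos, up_offset=0.5, back_offset=1.5):
    """Return an (x, y, z) camera position slightly above and behind the bat.

    `up_offset` and `back_offset` are assumed values chosen so the avatar
    stays visible without full third-person detachment.
    """
    x, y, z = bat_pos
    # Raise the camera and pull it back along the forward (z) axis
    return (x, y + up_offset, z - back_offset)

print(hybrid_camera_position((0.0, 2.0, 5.0)))  # (0.0, 2.5, 3.5)
```

Keeping the offsets small preserves some Illusion of Virtual Body Ownership while still letting the user see Batsy, which was the trade-off we were aiming for.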

Physical Space


For the physical space, I researched Islamic architecture to inform the aesthetic of our imagined physical space. Based on my content research, my team calculated the footfall of the Austin Public Library and sketched out rooms with various ways of organizing information. These sketches were later visualized through AutoCAD.


Hi-Fi Prototype


Usability Testing

Public Demo

A VR showcase was held at the University of Texas at Austin where students and faculty could try out different projects.

Research Question

The research questions that guided my user testing were based on the earlier literature review. I created a pre-testing and a post-testing questionnaire to answer the following: How does immersion impact empathy-building in users?

Users were asked to rate feelings of self-presence, spatial presence, and empathy on a Likert scale of 1 to 5. A total of 22 people took the survey.
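The percentage scores reported below can be derived by averaging the 1-to-5 Likert ratings and expressing the mean as a fraction of the maximum. A minimal sketch; the sample ratings are invented for illustration, not our survey data:

```python
def likert_to_percent(ratings, scale_max=5):
    """Average a list of 1-to-`scale_max` Likert ratings as a percentage."""
    return round(100 * (sum(ratings) / len(ratings)) / scale_max, 1)

# Illustrative self-presence ratings: a mean of 3.3/5 maps to 66%
sample = [3, 4, 3, 4, 3, 3, 4, 3, 3, 3]
print(likert_to_percent(sample))  # 66.0
```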

Self-presence

• Average self-presence score: 66%
• 72% felt the avatar "represented them".

Spatial presence

• Average spatial presence score: 80%
• 90% felt they were inside the virtual world.

Empathy

• Average empathy score: 77%
• 73% felt compassion for Batsy.

Final Results:

Can VR Make Users Feel More Empathic Towards Bats?


The game boosted empathy and improved attitudes toward bats in 10 (48%) participants: 3 who were already positive became more positive, 3 shifted from neutral to positive, and 3 shifted from negative to positive.
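The pre/post attitude comparison can be sketched as a simple tally of participants whose post-test attitude ranked higher than their pre-test one. The response pairs below are invented for illustration, not the study's data:

```python
from collections import Counter

# Hypothetical (pre, post) attitude pairs from matched questionnaires
responses = [
    ("negative", "positive"),
    ("neutral", "positive"),
    ("positive", "positive"),   # no change: not counted as improved
    ("negative", "neutral"),
]

# Ordinal ranking of attitudes, from most to least negative
ORDER = {"negative": 0, "neutral": 1, "positive": 2}

# Count improvements, grouped by the starting attitude
improved = Counter(pre for pre, post in responses if ORDER[post] > ORDER[pre])
print(dict(improved))  # {'negative': 2, 'neutral': 1}
```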


Insights

Empathy through first-person perspective

The incorporation of the first-person perspective helps people go through the daily experiences of bats and gives them a better understanding of their experiences.

Reduced simulator sickness

The preset path greatly reduces simulator sickness for most users and allows easy access to targets.

Easy to follow and user-friendly

The game is easy to follow, and the controls are intuitive. The color scheme enables users to distinguish targets from other objects. The storyline is also well-developed enough to incite empathy and interest.

Lack of onboarding

The game doesn’t test whether the user has understood the controls.

Storylines and actionable advice

The game can be further expanded to accommodate intended storylines and actionable advice. Initially, I planned to include multiple levels to show the various stages of a bat's life and emphasize their significance.

Lack of accessibility

The game is not fully accessible to people with disabilities or people from other linguistic backgrounds. VR's technological limitations partially contribute to this.

Emotionally compelling and simple graphics

People find it easier to empathize with simpler, animated faces as opposed to realistic and detailed ones. The choice of the cute bat avatar and corresponding world-building evoke warmth and compassion for the bat.

Object of inquiry: Lo-fi, hand-drawn sketches transposed into screens on Figma

Method: In-person and face-to-face over Zoom (lo-fi app), moderated think-aloud protocol

Participants: 5 lo-fi testers - young professionals, grad students, and older adults

Lo-fi Findings & Recommendations

From this round of testing, we developed the following key takeaways:

Takeaway 1

Nomenclature: The names of several pages confused users, e.g. Art, Explore, My Library - it was challenging for users to complete information-seeking tasks on the first pass when they were unsure what they would find on these pages.

Takeaway 2

Navigability: The hamburger menu needed to be workshopped to include, exclude, and reorder functions. Additionally, it was challenging for users to locate the Princess Badr exhibit because they weren't sure where to navigate, e.g. Exhibits, My Events, My Master Library.

Takeaway 3

Core Functionality: Some users weren’t sure they grasped the main purpose of the app. An e-reader? A booking site? Additional information? The AR game? We decided to focus less on booking and e-reading, and more on providing additional information + the AR game.

Mid-fi Prototyping & Usability Testing

On the basis of lo-fi feedback, we created an updated mid-fi Figma prototype:

Mid-fi Usability Testing

We conducted mid-fi testing right after spring break to further develop our phone app and to understand user behaviors at in-person exhibits.

Object of inquiry: Simulated physical environment + mid-fi mobile app prototype in Figma

Method: In-person, moderated think-aloud protocol + contextual inquiry

Participants: 5 graduate students, aged 23-28

Mid-fi Testing Sessions

We set up our simulated environment in the PCL grad student lounge.

Secondary Research

The UX cycle usually emphasizes generative and evaluative research. For this project, I learned to distill UX guidelines from academic literature and validate them through my design.

Hardware Limitations

This project challenged me to work with the limitations of the Oculus gear as I modified my design and research around refresh rates, rendering, simulator sickness and even the strictures of an immersive space.

The Value of VR

It was interesting to learn about the various opportunity spaces and gaps in VR capabilities. The ability of VR to help people empathize, given certain design decisions, can be harnessed for artistic and social experiments.
