Assignment 5 Report



1.    Introduction and Overview

Our Mixed Reality (MR) application supports spider exposure therapy by immersing users in a mixed reality environment, allowing them to confront their fears as lifelike spiders crawl on their hands and walls. This provides a safe and controlled way to work towards overcoming arachnophobia.

The only change to the application since Assignment 4 is the removal of the spider walking around the user’s hand. This feature was removed because the inverse kinematics did not interact well with the irregular hand-tracking mesh, causing the spider’s legs to deform incorrectly. However, with more time and resources, this would be an integral part of later revisions of the program.

2.    Technical Development

The application is designed to run on the Meta Quest Pro or Meta Quest 3, and it takes advantage of the full-colour passthrough capabilities of these headsets through Meta’s passthrough API. While the application would also work on a Quest 1 or Quest 2, the low-resolution, black-and-white passthrough on those headsets would make the experience sub-optimal.
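As a rough illustration only, the sketch below shows how passthrough might be surfaced at runtime using the Oculus Integration SDK’s OVRPassthroughLayer component. The exact setup (adding the layer to the camera rig and enabling Insight Passthrough on the OVRManager) is done in the editor and varies between SDK versions, so the property usage here is an assumption rather than a record of our actual configuration.

    using UnityEngine;

    // Hedged sketch: reveal full-colour passthrough behind the virtual content.
    // Assumes an OVRPassthroughLayer has been added to the camera rig in the
    // editor and that Insight Passthrough is enabled on the OVRManager.
    public class PassthroughStarter : MonoBehaviour
    {
        [SerializeField] private OVRPassthroughLayer passthroughLayer;

        private void Start()
        {
            passthroughLayer.hidden = false;       // show the real-world feed
            passthroughLayer.textureOpacity = 1f;  // fully opaque passthrough
        }
    }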

The spider’s legs are procedurally animated: ray-casts determine where to place each foot, and a basic inverse kinematic (IK) rig determines the angle of each of the spider’s joints needed to achieve the desired foot position. Movement of the spider is controlled by a script that uses kinematics to determine the spider’s velocity over time and displacement; this velocity is then used with a linear interpolation function to make the spider’s movement look more realistic overall. The spider accelerates at a constant rate, capped at a maximum velocity while it is walking between two points. Near the end of the spider’s journey, the velocity is eased down with a sinusoidal function to ensure that the spider reaches the destination in a finite amount of time without overshooting the end point. The script determines the location to move the spider towards by reading the position of a game object; this way, the game object can be moved anywhere in the scene and the spider will walk towards it.
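The sketch below illustrates this movement model under our stated assumptions: constant acceleration capped at a maximum speed, a sinusoidal ease-out inside a slow-down radius, and a target game object to walk towards. The class and field names are illustrative rather than the project’s actual identifiers, and Vector3.MoveTowards stands in for the linear interpolation step described above.

    using UnityEngine;

    // Illustrative sketch (not the project's actual script): the spider
    // accelerates at a constant rate up to a maximum walking speed, then
    // eases out sinusoidally near the target so it never overshoots.
    public class SpiderMover : MonoBehaviour
    {
        [SerializeField] private Transform target;          // object the spider walks towards
        [SerializeField] private float acceleration = 0.5f; // m/s^2
        [SerializeField] private float maxSpeed = 0.3f;     // m/s
        [SerializeField] private float slowRadius = 0.2f;   // distance at which easing begins

        private float speed;

        private void Update()
        {
            Vector3 toTarget = target.position - transform.position;
            float distance = toTarget.magnitude;
            if (distance < 0.001f) return; // already at the destination

            // Constant acceleration, capped at the maximum walking speed.
            speed = Mathf.Min(speed + acceleration * Time.deltaTime, maxSpeed);

            // Inside the slow-down radius, scale the speed with a sine curve
            // so the spider decelerates smoothly to a stop at the target.
            float easedSpeed = distance < slowRadius
                ? speed * Mathf.Sin(distance / slowRadius * Mathf.PI * 0.5f)
                : speed;

            // Step towards the target; MoveTowards clamps the step so the
            // spider never overshoots the end point.
            transform.position = Vector3.MoveTowards(
                transform.position, target.position, easedSpeed * Time.deltaTime);
            transform.rotation = Quaternion.LookRotation(toTarget.normalized, transform.up);
        }
    }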

The transition of the spider between the wall and the hand is handled by changing the spider’s parent when it reaches the hand. This way, when the wall is moved into place during calibration, the spider follows it, and when the spider moves onto the hand, it then follows all hand movements made by the user.
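A minimal sketch of this hand-off is shown below, assuming the hand carries a trigger collider tagged "Hand"; both the tag name and the trigger-based detection are assumptions for illustration.

    using UnityEngine;

    // Minimal sketch: re-parent the spider when it meets the hand so it
    // inherits all subsequent hand movement. The tag name is an assumption.
    public class SpiderParentSwitch : MonoBehaviour
    {
        private void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Hand"))
            {
                // worldPositionStays = true keeps the spider's current world
                // pose while attaching it to the hand's transform hierarchy.
                transform.SetParent(other.transform, true);
            }
        }
    }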

Due to a lack of automatic wall generation in Meta’s passthrough API, the user must manually calibrate the wall at the start of the experience. This is done by having the user place both hands on the wall; a vector is created between these two points and its vertical (y) component is set to 0. This vector is then transformed into a quaternion, as this is the standard way of expressing orientation in Unity. However, a single vector does not contain enough information to fully define a quaternion, so Unity must make an assumption about the remaining degree of freedom (OpenAI, 2023). To resolve this, the wall is assumed to be perfectly vertical, and the plane is transformed accordingly.
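Under those assumptions, the calibration step could look like the sketch below. Quaternion.LookRotation is given Vector3.up as its second argument, which is exactly where the perfectly-vertical-wall assumption is encoded; the class, field, and method names are illustrative.

    using UnityEngine;

    // Illustrative sketch of the wall calibration: build a horizontal vector
    // between the two hand positions, derive the wall's normal, and orient
    // the wall plane under the assumption that the wall is perfectly vertical.
    public class WallCalibrator : MonoBehaviour
    {
        [SerializeField] private Transform leftHand;
        [SerializeField] private Transform rightHand;
        [SerializeField] private Transform wallPlane;

        public void Calibrate()
        {
            // Vector between the two touch points, flattened vertically.
            Vector3 alongWall = rightHand.position - leftHand.position;
            alongWall.y = 0f;

            // The wall's outward normal is horizontal and perpendicular to
            // the along-wall direction (vertical-wall assumption).
            Vector3 normal = Vector3.Cross(alongWall.normalized, Vector3.up);

            // Place and orient the wall plane; Vector3.up supplies the
            // missing degree of freedom when building the quaternion.
            wallPlane.position = (leftHand.position + rightHand.position) * 0.5f;
            wallPlane.rotation = Quaternion.LookRotation(normal, Vector3.up);
        }
    }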

 

3.    3D Content

Because this application relies heavily on MR passthrough, there are minimal 3D models in the scene the user occupies. Much like an Augmented Reality (AR) application, the only models are the ones the user sees on top of the real-world passthrough.

Table 3D model


Figure 1: A 3D Model of a Table

This is a simple table 3D model created by us. In the first scene it serves as a surface for the spider to sit on under the glass. The material is a simple shader with a solid colour.

Glass 3D model


Figure 2: A 3D Model of a Glass

This is a simple glass 3D model created by us. In the first scene it serves as a cover for the spider to sit under while it is on the table. The user can lift it so that no barrier remains between them and the spider. The material is a standard shader with a solid colour and some transparency.

Spider 3D model


Figure 3: A 3D Model of a Spider with Animation Rigging

This is a simple model of a spider; it has an armature that can be animated to make it appear to walk. In the final revision of the application, the legs will be controlled by an inverse kinematic system, and the feet will be placed so that they always contact the user’s hand. The 3D model and textures were both obtained from this location:

https://assetstore.unity.com/packages/3d/characters/animals/insects/animated-spider-22986

 

4.    Usability Testing

While the main use case of this application is a controlled environment with a psychologist trained in using it, some people may wish to use the application on their own. As such, the interface needs to be intuitive and understandable for the user.

To test our application, we performed a simple field study with a small group of people. We told them the purpose of the application and then provided them with a headset that already had the application loaded. From there, each participant was left to figure out the application on their own. After they finished the experience, we gathered feedback about what they thought of the interface to determine whether it is intuitive enough.

After conducting our usability tests, we collected the data and feedback from observations made during the tests, as well as verbal feedback given by the participants. The testing showed us a number of pitfalls in our application’s current interface design. All three participants who tested our application agreed that the calibration text was too long yet not descriptive enough to fully explain how to calibrate the wall position. One young participant failed to read the calibration text entirely. Another point raised by a participant was the lack of consistency in which hand is used to interact with the application: the application requires the user to use their right hand to interact with the menus, but only allows the left hand to interact with the spider on the wall. This would likely be fixed by making all controls ambidextrous in the final application.

The findings from the testing mostly boil down to confusion about what to do next in the application and a lack of continuity in the interactions. This is likely due to some of the idiosyncrasies of our interactions; for example, few people would expect that placing a hand on a wall would cause a spider to walk over to it. This largely cannot be helped, as the application is mainly designed for controlled environments in which, when the therapist thought the patient was ready, they would tell them to put their hand on the wall as a kind of final test or exposure.

 

5.    Addressing the Results of the Usability Testing

To address the user feedback, we primarily need to add clearer in-application guidance about which actions the user can perform next to advance the experience. One such clarification would be notifying the user that they can place their hand on the wall to have the spider climb onto it. Additionally, refactoring the spider transition system so that the spider can climb onto either hand would be a priority, to reduce confusion and increase accessibility.

Another change we would make is to reword the calibration text so that it is more descriptive while using fewer words; this would make it more appealing to read and help the user understand the application better.

To further improve the accessibility of our application, we would refactor the menu interaction system so that either hand can be used to point at and press buttons on the UI. This way, both left- and right-handed people could easily interact with our application.

Unfortunately, time does not permit the implementation of any of these changes, as most of them would require a large refactoring of the program’s core code to allow for more flexibility.

 

6.    References

KatriSoft. (2021, September 13). Unity Hand Tracking with Oculus Quest 2 and UI Canvas - Tutorial. Retrieved from YouTube.

lchaumartin. (2021, March 30). SpiderProceduralAnimation. Retrieved from GitHub: https://github.com/lchaumartin/SpiderProceduralAnimation

OpenAI. (2023, October 18). Crie script Unity para planos [Create Unity script for planes]. Retrieved from ChatGPT: https://chat.openai.com/share/7f14770c-7538-4db8-8a32-cc732ae38b0c

OpenAI. (2023, October 22). Unity LineRenderer Canvas Check. Retrieved from ChatGPT: https://chat.openai.com/share/5cf0dc07-82ee-44f5-af81-c2229bbbb205

OpenAI. (2023, October 20). Unity Script: Point Object. Retrieved from ChatGPT: https://chat.openai.com/share/089b9bd3-cc20-461b-b35f-0874e9ad43f3

OpenAI. (2023, October 22). Unity Vertical Plane Creation. Retrieved from ChatGPT: https://chat.openai.com/share/7a0ea4db-0c90-47f2-b79c-73bee9227924

Sounds and Green Screens. (2021, March 30). Ding! - Sound Effect.
