VR Development

What began as a concept for a design class, combined with my belief in the future potential of VR, has grown into a real interest in VR development. I document my learning below.

XR Haptics Manager PRO

In development. The system is fully functional with the features I intended to release for the students I worked with. It will be updated in the near future to clean up leftover code, add a README and tutorial, and post images here. Stay tuned!

Building Virtual Worlds - VR ULA

As the Undergraduate Learning Assistant for MI 482 Building Virtual Worlds at Michigan State University, I am responsible for providing technical help to students developing VR applications in Unity. I apply my experience making VR games in Unity to help students set up the base project and their Quest 2 headsets, maintain GitLab repositories, and troubleshoot or find solutions through C# scripting.

This position gave me the opportunity to research and explore compelling topics such as networking and haptic feedback in VR.

This position lasts 15 weeks, ending in April 2024.

VR Haptics Presentation - MI 482

As the tech help for MI 482 Building Virtual Worlds, I noticed that students were not including haptic feedback in their projects. Because haptic feedback is crucial to an immersive, polished game, I worked with my professor to build the class's understanding of how to use haptics.

I wanted to take the guesswork out of using haptic feedback, so I created a base system for the class. During my research, I found that the OpenXR haptic solution is very limiting: it only accepts flat numbers as haptic impulses and overrides any existing impulses. To solve this, I am working on an in-depth haptic manager system that I plan to complete in time to give to the students. The system includes dynamic types for reading haptic data, built-in curve haptics, additive and multiplicative haptic impulses, and potentially priority and operation ordering.
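To give a sense of the curve-haptics idea, here is a minimal sketch of how curve-shaped feedback can be approximated on top of Unity's flat-amplitude haptic API by sampling an AnimationCurve every frame. The class and field names are my own for illustration and are not the actual Haptics Manager PRO API.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: approximates curve-shaped haptics on top of the
// flat-amplitude haptic API by sampling an AnimationCurve each frame
// and sending short impulses. Not the actual Haptics Manager PRO code.
public class CurveHapticsSketch : MonoBehaviour
{
    [SerializeField] private XRNode hand = XRNode.RightHand;
    [SerializeField] private AnimationCurve amplitudeCurve = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);
    [SerializeField] private float duration = 0.5f;

    public void Play()
    {
        StartCoroutine(PlayCurve());
    }

    private IEnumerator PlayCurve()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(hand);
        float elapsed = 0f;

        while (elapsed < duration)
        {
            // Sample the curve at normalized time and send an impulse that lasts
            // roughly one frame, so the next sample effectively replaces it.
            float amplitude = Mathf.Clamp01(amplitudeCurve.Evaluate(elapsed / duration));
            device.SendHapticImpulse(0, amplitude, Time.deltaTime);

            elapsed += Time.deltaTime;
            yield return null;
        }

        device.StopHaptics();
    }
}
```

Sending an impulse that lasts about one frame each update is what lets the curve's shape come through, even though the underlying API only accepts flat amplitude values.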

Dueling Darkride - VR Game Jam

In March 2024, I joined a four-person team to build a VR game in 48 hours for the Spartasoft March 2024 game jam. The team consisted of myself as a programmer, another programmer, a level designer/generalist, and a friend who was brand new to game development.

Dueling Darkride is a two-player online arcade game in which players are strapped with bug guns and compete to shoot the most targets while riding a minecart through space. Throughout the ride, players reach several shop stations where they can spend points on a variety of upgrades. But they must pick carefully, because the player with the most points at the end wins.

I developed the gun mechanics from scratch around a list of easily adjustable gun settings. On top of the settings, I created a system for applying modifiers to the guns; the modifiers are stored separately from the base settings, so a gun is easy to reset. I also created a reload gesture, a wrist UI for the score, and 3D buttons that interact with the guns.
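As a rough illustration of that base-settings-plus-modifiers structure (with names I made up for this sketch, not the jam project's actual code):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the "base settings + separate modifiers" idea:
// the base values never change, and effective stats are recomputed from
// the modifier list, so resetting a gun is just clearing that list.
[System.Serializable]
public class GunSettings
{
    public float fireRate = 2f;      // shots per second
    public float damage = 10f;
    public int projectilesPerShot = 1;
}

public enum ModifierOp { Additive, Multiplicative }

[System.Serializable]
public class GunModifier
{
    public string statName;          // e.g. "damage" or "fireRate"
    public ModifierOp op;
    public float value;
}

public class UpgradableGun : MonoBehaviour
{
    [SerializeField] private GunSettings baseSettings = new GunSettings();
    private readonly List<GunModifier> modifiers = new List<GunModifier>();

    public void AddModifier(GunModifier mod) => modifiers.Add(mod);

    // Resetting the gun never touches the base settings.
    public void ResetGun() => modifiers.Clear();

    public float GetDamage() => ApplyModifiers("damage", baseSettings.damage);
    public float GetFireRate() => ApplyModifiers("fireRate", baseSettings.fireRate);

    private float ApplyModifiers(string statName, float baseValue)
    {
        float result = baseValue;
        foreach (GunModifier mod in modifiers)
        {
            if (mod.statName != statName) continue;
            result = mod.op == ModifierOp.Additive ? result + mod.value : result * mod.value;
        }
        return result;
    }
}
```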

If I were to do this project over again, I would plan our time more effectively. Unfortunately, we failed to put together a functional game loop in the final hours, and better time management and task prioritization would have prevented that. I also wish I had done something different with the spread on the guns. It isn't intuitive what the spread strength value does, and the spread does not feel good to players. I think it would be better to replace spread entirely: a shot would hit the target if the aim is within X degrees of it, and an accuracy stat could widen that threshold.
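A small sketch of that angle-threshold idea, with hypothetical names and values, might look like this:

```csharp
using UnityEngine;

// Hypothetical sketch of the angle-threshold idea that could replace spread:
// a shot registers as a hit if the aim direction is within some number of
// degrees of the target, and an accuracy stat widens that window.
public static class AimAssistSketch
{
    public static bool ShotHits(Transform muzzle, Transform target,
                                float baseAngleDegrees, float accuracyStat)
    {
        // Higher accuracy -> larger allowed angle between aim and target.
        float allowedAngle = baseAngleDegrees * accuracyStat;

        Vector3 toTarget = (target.position - muzzle.position).normalized;
        float aimError = Vector3.Angle(muzzle.forward, toTarget);

        return aimError <= allowedAngle;
    }
}
```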

Our team is hoping to find time to fix up the project and release a build, and in the future we want to recreate the map and build a polished game loop.

Monarch Simulator - VR Game Jam

In February 2024, I created a VR game in 48 hours by myself as a part of the Spartasoft February 2024 Game Jam. 

My personal goal for this project was to make a fun, casual VR game by myself. I built the entire world and wrote all of the code to make the game function in just 48 hours. The systems and mechanics I made include a simple character AI, a confined lever, a releasing trapdoor, a score system, a spawning system, and an overall game manager. I also created a wristwatch UI to display the total number of characters dropped through the trapdoor.
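As an illustration of how the spawning and score pieces can fit together (hypothetical names, not the actual jam code), the core idea is small:

```csharp
using UnityEngine;

// Hypothetical sketch of the spawning + score idea: characters spawn on a
// timer, and a trigger volume under the trapdoor counts each one that falls
// through. The count is what a wristwatch UI would display.
public class TrapdoorCounter : MonoBehaviour
{
    [SerializeField] private GameObject characterPrefab;
    [SerializeField] private Transform spawnPoint;
    [SerializeField] private float spawnInterval = 3f;

    public int DroppedCount { get; private set; }

    private void Start()
    {
        // Spawn a character repeatedly on a fixed interval.
        InvokeRepeating(nameof(SpawnCharacter), 0f, spawnInterval);
    }

    private void SpawnCharacter()
    {
        Instantiate(characterPrefab, spawnPoint.position, spawnPoint.rotation);
    }

    // Attach this script to a trigger collider placed below the trapdoor.
    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Character"))
        {
            DroppedCount++;
            Destroy(other.gameObject);
        }
    }
}
```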

Updated 3/18/24 with improved AI, bugfixes, post-processing, and lighting.

Pandora's Boxing - VR Game Jam

In November 2023, I formed a team of six to participate in a 48-hour game jam. The goal was for another programmer and me to learn how to create a VR game with Unity's OpenXR SDK in preparation for my ULA position in the spring. We ran into a lot of technical difficulties, which unfortunately cost us 24 of the 48 hours allowed in the game jam. Regardless, we made a cool project.

I learned how to do rapid development testing on my Quest 2 in Unity. Prior to this jam, I tested a small Unity VR project on my Quest 2 by building the project for every test, which was far too inconvenient. I learned about the Oculus Link cable but opted to use Oculus Air Link with SteamVR, a wireless solution. This allowed my teammates and me to test our VR project directly from the Unity Editor. Since then, I have learned about the Oculus Developer Hub and its XR Simulator, which I am eager to use in my ULA position.

On Pandora's Boxing, I allowed the player to hold down the trigger to convert their hand into a boxing glove for battling enemies. This was pretty simple to do by setting up a UnityEvent listener that is invoked whenever the trigger is pressed down. Next, I made a sword object that the player could pick up, using the OpenXR Interactive Object component. Using the controls on the component, I designed it so that the sword instantly snaps to and stays in the player's hand when grabbed. I used the object's delta position to calculate its velocity, and a high enough velocity allows the player to damage enemies. This event-based system could extend to other weapons with ease.
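A minimal sketch of that delta-position velocity check, with hypothetical names rather than the jam project's actual code, could look like this:

```csharp
using UnityEngine;

// Hypothetical sketch of the velocity check described above: the sword's
// speed is estimated from its change in position each frame, and only
// swings above a threshold speed deal damage.
public class SwingDamageSketch : MonoBehaviour
{
    [SerializeField] private float damageSpeedThreshold = 2f; // meters per second
    [SerializeField] private float damage = 25f;

    private Vector3 lastPosition;
    private float currentSpeed;

    private void Start()
    {
        lastPosition = transform.position;
    }

    private void Update()
    {
        // Estimate speed from the delta position since the last frame.
        currentSpeed = (transform.position - lastPosition).magnitude / Time.deltaTime;
        lastPosition = transform.position;
    }

    private void OnTriggerEnter(Collider other)
    {
        // Only register a hit when the swing is fast enough.
        if (currentSpeed >= damageSpeedThreshold && other.TryGetComponent(out Enemy enemy))
        {
            enemy.TakeDamage(damage);
        }
    }
}

// Minimal placeholder enemy so the sketch compiles on its own.
public class Enemy : MonoBehaviour
{
    public void TakeDamage(float amount) { /* reduce health here */ }
}
```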

This project opened my eyes to the possibilities of the OpenXR SDK. The most confusing concept to wrap my head around was the interactable system and its inputs. Now that I understand how both work and have practiced with them, I can extend this knowledge to create more interesting features in the future.