Justin Nam talked about the work behind the Broken Rock project, explained how the VR interactions were developed, and shared the technical solutions that streamlined the team's project management.
Hi everyone! My name is Justin Nam, and I am a Virtual Reality major graduating from Ringling College of Art and Design. I have worked intensively with VR technology, specializing in VR user design and technical art. I interned at ContinuumXR, a VR medical company, and collaborated on many projects, such as the Moffitt Cancer Center MRI Experience and the Sarasota Housing Authority Architectural Visualization. On Broken Rock, I applied my skills in tech art and VR interaction design.
Broken Rock was the senior thesis project that I worked on alongside my teammates Xavier Rodriguez, Joseph Janssen, and Quinn Kuslich. It is a VR psychological narrative in which the user plays a character named Jed, who has to traverse a cave to find his missing daughter. Jed does not have a close relationship with his daughter; their history is marked by trauma and gaslighting. Within the cave, Jed faces many hallucinations induced by psychedelic mushrooms.
The project took about a year to complete, spanning two and a half semesters. As a team, we went through many phases, including pitches, pre-production and prototyping, and multiple milestones over the course of development. These phases were very insightful. The project was initially going to be five chapters long, but with so many VR applications out there, it was necessary to keep the entire experience to about 15-20 minutes. After a lot of feedback from instructors, we had to cut down the project's scope, working through multiple story drafts and changing the pacing of the level sequences many times.
User Presence & Feedback
It’s important to make the user feel a sense of agency: the user should feel attached to the experience, as if they were actually a part of it. To that end, we integrated many VR interactions needed to progress through the story, grounding them in the typical activities of a cave expedition, like climbing, writing in a journal, grabbing items from a backpack, and more.
Development required a lot of playtesting. We tested for bugs, comfort, and accessibility to ensure the experience felt as good as it could. We also implemented “quality of life” features such as camera fades, locomotion vignettes, and UI notifications.
Developing VR Interactions
Broken Rock was made in Unreal Engine 4, and the programming was done with systems such as Blueprints, UMG, and Level Sequences. Many of the interactions were prototyped before full production, so we knew the project was viable within the time frame. I programmed a lot of the VR gameplay in Broken Rock, including the level sequences, climbing, journal, and map.
I collaborated with my peer Joseph Janssen to create the climbing interaction, and we went through many iterations on it. To build it, we researched existing VR games with climbing, such as The Climb. Coding the basic grab-and-move mechanic was doable, but to make the interaction feel more realistic, we had to give it a parabolic trajectory, meaning the user should be able to launch themselves to other rocks.
It was a hard mechanic to recreate, especially without any existing resources online, so I had to get creative. After a lot of experimentation, I settled on a physics-based method: adding velocity to the player pawn based on the speed and direction of the gripping hand. I optimized the code by having custom events start and stop the update loop so it doesn't execute continuously. Knowing each and every node in Unreal's flow control was essential to building this VR interaction.
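The idea can be sketched in plain C++ (the actual implementation was done in Blueprints; the struct and function names here are hypothetical stand-ins, and the scale factor is an assumed tuning value):

```cpp
#include <cassert>
#include <cmath>

// Minimal 3D vector standing in for Unreal's FVector.
struct Vec3 {
    float x = 0, y = 0, z = 0;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator-(const Vec3& o) const { return {x - o.x, y - o.y, z - o.z}; }
    Vec3 operator*(float s) const { return {x * s, y * s, z * s}; }
};

// On release, the pawn is launched with the hand's last velocity mirrored:
// pulling the hand down/back propels the body up/forward, and gravity then
// carries the pawn along a parabolic arc.
Vec3 LaunchVelocityOnRelease(const Vec3& handPosPrev, const Vec3& handPosNow,
                             float deltaSeconds, float launchScale) {
    Vec3 handVelocity = (handPosNow - handPosPrev) * (1.0f / deltaSeconds);
    return handVelocity * -launchScale; // opposite of the hand's motion
}

// Simple ballistic step: where the pawn ends up t seconds after launch.
Vec3 BallisticPosition(const Vec3& start, const Vec3& velocity, float t,
                       float gravityZ = -980.0f /* cm/s^2, UE4 default */) {
    Vec3 p = start + velocity * t;
    p.z += 0.5f * gravityZ * t * t;
    return p;
}
```

In the Blueprint version, the equivalent of `LaunchVelocityOnRelease` only fires from a custom event while a grip is active, which is what keeps the logic from executing every tick.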
Another interaction I made was a journal that serves as a menu. The project originally called for a diegetic main menu, and a journal was a perfect fit. I programmed it so that holding a pencil over text buttons lets users navigate the menus. Over many iterations, I added functionality such as summoning the menu with the pause button, a book-opening animation, and grabbing the pencil from anywhere on the journal. It ended up working reliably in our final build.
Through this interaction, I learned that it's key to keep Blueprint systems as clean and simple as possible. I had to communicate between two different systems, so it was best to keep the code linear and avoid casting back and forth between the journal Blueprint and the UMG widget.
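One common way to keep that communication linear is to have the widget expose events that the owning actor binds to once, so data flows in one direction and neither side ever casts to the other. A minimal C++ sketch of that pattern (the project itself used Blueprint event dispatchers; `JournalWidget` and `JournalActor` are hypothetical names):

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <vector>

// The widget exposes an event (like a UMG event dispatcher) without
// knowing anything about the actor that owns it.
struct JournalWidget {
    std::function<void(const std::string&)> OnButtonHovered; // bound by owner
    void PencilHover(const std::string& buttonName) {
        if (OnButtonHovered) OnButtonHovered(buttonName);
    }
};

// The journal actor owns the widget and binds to its event once.
// Data flows one way: widget -> actor. No casting in either direction.
struct JournalActor {
    JournalWidget Widget;
    std::vector<std::string> MenuLog;
    JournalActor() {
        Widget.OnButtonHovered = [this](const std::string& name) {
            MenuLog.push_back(name); // e.g. open the hovered menu page
        };
    }
};
```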
The last interaction I want to mention is the map. It was necessary to give users information about where they were so they wouldn't get lost during the narrative. I programmed the backpack's pocket flaps to open on hover and made the map grabbable from the backpack. A red X on a map is a convention most people are familiar with, so I used it: while the user holds the map, a red X icon shows where they stand in the level.
As with the climbing interaction, this code only runs while the user is holding the map. I looked up many tutorials online, took in the knowledge, and simplified it. The map works by tracking waypoints I placed around the level: the code compares the player's distance to each waypoint and marks whichever is closest.
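The distance comparison is a straightforward nearest-neighbor search. A self-contained C++ sketch of the idea (the real version was a Blueprint; the function names are illustrative):

```cpp
#include <cassert>
#include <cstddef>
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

// Squared distance avoids a sqrt per waypoint; ordering is unchanged.
float DistSquared(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Returns the index of the waypoint closest to the player; the red X is
// then drawn at that waypoint's position on the map. Only evaluated
// while the map is held.
std::size_t NearestWaypoint(const Vec3& player, const std::vector<Vec3>& waypoints) {
    std::size_t best = 0;
    float bestDist = std::numeric_limits<float>::max();
    for (std::size_t i = 0; i < waypoints.size(); ++i) {
        float d = DistSquared(player, waypoints[i]);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```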
It is also worth noting some of the technical work I did that expedited development and helped the team's project management. One issue we encountered was that making a grip pose for every item would have ballooned the team's workload. Instead, we found a workaround inspired by Job Simulator: a system called Tomato Presence, in which a grabbed item replaces the user's “hand”. The method proved successful, so I worked the code into our existing VR grab system. As a result, development became more efficient and the experience more user-friendly.
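The core of Tomato Presence is simple: on grab, hide the hand mesh and let the attached item stand in for the hand; on release, restore the hand. A minimal C++ sketch under those assumptions (the actual system was Blueprint-based; `HandController` is a hypothetical name):

```cpp
#include <cassert>
#include <string>

// On grab, the hand mesh is hidden and the item snaps to the motion
// controller, so the item itself becomes the user's "hand". On release,
// the hand mesh comes back. No per-item grip pose is needed.
struct HandController {
    bool handMeshVisible = true;
    std::string attachedItem; // empty = nothing held

    void Grab(const std::string& itemName) {
        attachedItem = itemName;
        handMeshVisible = false; // item replaces the hand
    }
    void Release() {
        attachedItem.clear();
        handMeshVisible = true;  // hand reappears
    }
};
```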
For the environment, our team used a Rock Master Material. With material instances, the team could easily adjust parameters for specific rock assets, including swapping textures and adding dirt and wet masks with the Material Functions I created. I set it up so these masks could be applied procedurally via world position or manually with vertex painting.
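One plausible shape for such a mask, sketched in C++ rather than material nodes (the threshold, blend band, and combine rule here are assumptions for illustration, not the project's exact graph):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Procedural weight: surfaces below a chosen world height fade toward
// fully "wet" over a blend band, mimicking a world-position-based mask.
float ProceduralWetMask(float worldZ, float waterZ, float blendHeight) {
    float t = (waterZ - worldZ) / blendHeight; // 0 at the line, 1 deep below
    return std::clamp(t, 0.0f, 1.0f);
}

// Final mask: the procedural weight combined with a hand-painted vertex
// color channel, taking whichever is stronger, so artists can paint
// extra wetness on top of the automatic result.
float FinalMask(float proceduralWeight, float vertexColorR) {
    return std::max(proceduralWeight, std::clamp(vertexColorR, 0.0f, 1.0f));
}
```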
I also streamlined a level sequence system for our team. Its features included a voiceover macro that reported when the current voice line had finished playing, and event dispatchers that let Blueprints fire events in the level. This gave us a viable workflow for the linear narrative and its endings.
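The voiceover check reduces to tracking elapsed playback time against a line's duration; a sequence step advances only once the line has played out. A tiny C++ stand-in for that macro (names and structure are illustrative, not the actual Blueprint macro):

```cpp
#include <cassert>

// Stand-in for the voiceover macro's state: one playing voice line.
struct VoiceLine {
    float durationSeconds = 0.0f;
    float elapsedSeconds = 0.0f;
};

// The level sequence polls this before advancing to the next beat.
bool IsVoiceLineFinished(const VoiceLine& line) {
    return line.elapsedSeconds >= line.durationSeconds;
}

// Advance the playhead each tick.
void TickVoiceLine(VoiceLine& line, float deltaSeconds) {
    line.elapsedSeconds += deltaSeconds;
}
```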
There were many challenges in this project, and we had to overcome them and find efficient ways to solve problems. One important tip I've learned: research a lot and test a lot.
Even if the answer isn't easy to find, gathering many resources and accumulating that knowledge can lead you to your goal. I solved most of the project's issues this way, or by drawing on prior experience. Playtesting and iterating on that feedback is also critical to development; an outside perspective makes it much easier to find bugs.
Overall, this project has been a long and exciting process of learning new things. I am proud of the result and look forward to continuing to explore new territories within VR.
Justin Nam, VR Technical Artist
This content is brought to you by 80 Level in collaboration with Unreal Engine. We strive to highlight the best stories in the gamedev and art industries. You can read more Unreal Engine interviews with developers here.