Sports Scramble: Virtual Reality (VR) Game

Original Platform / Publish Date: Oculus Quest / May 2019

Armature Team Makeup: Game Designers (3), Engineers (7), Artists (7), Animation & FX (2), Audio (3), Production (3), & QA (1) 

My Role: UX Designer, UI Artist, 3D modeler, and Unreal 4 implementer

My Tools: Adobe Illustrator, Autodesk Maya, Perforce & Unreal Engine 4 (UE4)

Work Environment: 100% In-Person

Timeline: July 2018 - February 2019 (8 months of development)

Website: MetaQuest Store

My Role

The Publisher and Game Director had established the basic gameplay during pre-production. My role was to design the UX interactions for 3D UI in a 360° space. I was also responsible for defining the UI look, modeling the UI, hand-texturing it to “simulate light,” and implementing it in Unreal Engine 4 with basic Blueprint animation scripting.


COMPETITOR RESEARCH: I played VR games already released on the Oculus Rift to see how other developers had handled interactions. Most projected UI elements onto flat planes and relied on standard game UX interactions. Since the Game Director wanted 3D animated UX interactions in world space, I incorporated the simple, still-evolving VR interaction standards and did R&D on what 3D UI the target hardware could handle.

TECHNICAL LIMITATIONS: The gameplay was so processor-intensive that any intricate UX interactions had to happen while gameplay was paused. For the same reason, the UI elements had to “simulate” having shadows on them, but could not cast shadows or receive light on their surfaces in the game world.
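The “simulated light” approach amounts to baking shading into the texture instead of computing it at runtime. As a rough illustration of the idea only (the falloff curve, brightness values, and texture size below are assumptions, not production values):

```python
# Illustrative sketch: precompute a top-down "light ramp" for a texture
# column so an unlit (flat-shaded) material looks lit. The brightness
# endpoints and resolution are made-up values, not from the shipped game.

def bake_light_ramp(height, top_brightness=1.0, bottom_brightness=0.35):
    """Return per-row brightness values, brightest at the top row (row 0)."""
    if height == 1:
        return [top_brightness]
    ramp = []
    for row in range(height):
        t = row / (height - 1)  # 0.0 at the top edge, 1.0 at the bottom
        ramp.append(round(top_brightness + t * (bottom_brightness - top_brightness), 3))
    return ramp

ramp = bake_light_ramp(8)
# Brightness falls off smoothly from the "lit" top edge to the shadowed bottom.
```

Because the values are baked offline, the material never touches the dynamic lighting path at runtime, which is the whole point on constrained mobile VR hardware.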

Defining the Game Flow

BASIC LEVEL STRUCTURE: Based on the technical limitations and the gameplay established in pre-production, there were certain lobby areas where we could build more complex UX interactions. While in-game, UI was kept to a minimum and was passively introduced. Some of the UX interactions formed a joint feedback loop with the environment, so I worked closely with environment artists and designers to make sure a user knew when positive and negative interactions occurred.


BASIC LEVEL LAYOUT: I created high-fidelity mockups in Illustrator since the Game Director had already established the visual language. We iterated on visuals and user flow in 2D, and once approved, the 3D creation and implementation commenced. Additionally, since the user was literally at the “center” of the experience, I took view angles into account so that all important information sat within one field of view. If we strayed from that, animated textures or arrows moved the user’s eye to the necessary UI element.
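The view-angle constraint can be expressed as a simple cone test against the user’s forward direction. A minimal sketch of that check, assuming a unit-length forward vector and a placeholder comfort threshold (the 45° half-angle is an assumption, not a value from the shipped game):

```python
import math

# Illustrative sketch: decide whether a UI element sits inside the user's
# comfortable field of view. Positions are world-space tuples; "forward"
# is assumed to be a unit vector. The FOV limit is a placeholder.

def angle_from_forward(user_pos, forward, element_pos):
    """Angle in degrees between the user's forward vector and the element."""
    to_elem = [e - u for e, u in zip(element_pos, user_pos)]
    length = math.sqrt(sum(c * c for c in to_elem))
    dot = sum(f * c for f, c in zip(forward, to_elem))
    return math.degrees(math.acos(dot / length))

def in_field_of_view(user_pos, forward, element_pos, half_fov_deg=45.0):
    return angle_from_forward(user_pos, forward, element_pos) <= half_fov_deg

# A panel straight ahead passes; one far off to the side would instead
# get an animated texture or arrow cue to pull the user's eye toward it.
ahead = in_field_of_view((0, 0, 0), (0, 0, 1), (0, 0, 5))    # True
to_side = in_field_of_view((0, 0, 0), (0, 0, 1), (5, 0, 1))  # False
```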


PROCESS: We wanted as close to a 1:1 translation as possible from the approved 2D Illustrator wireframes to the 3D world. These were the basic steps I had to figure out to make that a reality:

  • Export Illustrator vector curves as DXF files

  • Import the DXF files into Maya

  • Project curves onto a 2D plane

  • Extrude the curves to make 3D shapes

  • Make all parts of an assembly share the same origin point, so when imported into UE4, the animated parts worked together correctly

  • Unwrap the model

  • Paint “light” ramps onto the UV texture to simulate a light source using the atlas sheet. Also, if a texture would animate in UE4, make sure the UVs were all aligned correctly so the animation would move as designed.

  • Export the model

  • Import the model in UE4

  • Apply the correct atlas sheet texture OR material to the appropriate models

  • Apply animation scripting as necessary

  • Bring the 3D UI element into the 3D level to set up the interface and create the user flow with game designers and environment artists, based on the wireframes
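The “animation scripting” step above often comes down to scrolling UVs over time, which is what a UE4 material Panner node computes; this is also why the UV alignment step matters. A language-agnostic sketch of that underlying math (the scroll speed is an illustrative value, not one from the game):

```python
# Illustrative sketch of the math behind a UV-panning texture animation:
# offset UV coordinates by elapsed time and wrap into [0, 1) so the
# texture tiles. Misaligned UVs would make pieces scroll out of sync.

def pan_uv(u, v, time, speed_u=0.25, speed_v=0.0):
    """Offset UVs by elapsed time; wrap like a tiling texture."""
    return ((u + speed_u * time) % 1.0, (v + speed_v * time) % 1.0)

uv = pan_uv(0.5, 0.5, time=3.0)  # u scrolls by 0.75 and wraps to 0.25
```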

Key Takeaways

This was probably my most technically demanding project, but it was fairly simple from a UX perspective. Most of my UX energy went into keeping interactions and layouts consistent, so that all decision making stayed in the user’s field of view.
