Frostbound - Schell Games
Platform: Google Daydream
Published: November 2016
Roguelike Adventure Survival VR game
Frostbound is a game developed by Schell Games for Google's virtual reality platform, Daydream. It is a unique adventure survival game with roguelike aspects. In Frostbound, you immerse yourself in an icy landscape filled with beauty and peril, guiding a small expedition of Elkin, elk-like creatures, through a beautiful yet dangerous wilderness towards a distant mountain and the abandoned city on top of it. Check out our launch trailer below!
AI and Graphics Engineer
This was an interesting challenge, as we had to make a fully fledged VR game on a new platform, with all its limitations, in two and a half months. I was one of 3 engineers on this project and was in charge of the AI and graphics programming. My main AI tasks were to program the behavior of the wolves and the wolf king. Due to the tight schedule, I was given complete ownership over the enemies: I programmed their AI using goal-oriented state machines and pathfinding, and worked closely with the rest of the team to iterate on their behavior based on user testing and feedback.
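To give a rough flavor of the goal-oriented state machine approach (this is an illustrative sketch, not the shipped code; the states, thresholds, and names are all invented), an enemy's current goal, hunting versus surviving, drives which state it transitions into each update:

```python
from enum import Enum, auto

class State(Enum):
    PATROL = auto()
    CHASE = auto()
    ATTACK = auto()
    FLEE = auto()

class WolfAI:
    """Toy goal-oriented state machine: the wolf's current goal
    (hunt or survive) decides which state it moves into."""
    def __init__(self, attack_range=2.0, sight_range=10.0, flee_health=20):
        self.state = State.PATROL
        self.attack_range = attack_range
        self.sight_range = sight_range
        self.flee_health = flee_health

    def update(self, dist_to_player, health):
        # The survival goal overrides the hunting goal at low health.
        if health < self.flee_health:
            self.state = State.FLEE
        elif dist_to_player <= self.attack_range:
            self.state = State.ATTACK
        elif dist_to_player <= self.sight_range:
            self.state = State.CHASE
        else:
            self.state = State.PATROL
        return self.state

wolf = WolfAI()
print(wolf.update(dist_to_player=15.0, health=100))  # State.PATROL
print(wolf.update(dist_to_player=5.0, health=100))   # State.CHASE
print(wolf.update(dist_to_player=1.0, health=10))    # State.FLEE
```

In practice each state would also drive pathfinding and animation; the value of the goal-oriented framing is that designer feedback maps onto a few tunable thresholds rather than scattered conditionals.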
I was also the graphics engineer on the team. One of the first things I did was implement the day-night cycle, where the lighting and visual FX change with in-game time. I wrote the majority of the shaders for the game, from character and environment shaders to UI shaders, and added many other visual effects. Throughout, my main goal was to keep everything as performant as possible and use as few draw calls as I could. I was largely successful: the final game ran at 60 FPS, the target framerate for Daydream games.
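At its core, a day-night cycle interpolates lighting parameters against in-game time. A minimal sketch of that idea follows, with made-up dawn/dusk thresholds rather than the game's actual values:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by t in [0, 1]."""
    return a + (b - a) * t

def sun_intensity(hour):
    """Piecewise-linear sun intensity over a 24-hour in-game day:
    dark at night, ramping up through dawn, full at midday, and
    ramping down at dusk. Thresholds are illustrative only."""
    if hour < 6 or hour >= 18:
        return 0.0                              # night
    if hour < 8:
        return lerp(0.0, 1.0, (hour - 6) / 2)   # dawn: 6 -> 8
    if hour < 16:
        return 1.0                              # daytime
    return lerp(1.0, 0.0, (hour - 16) / 2)      # dusk: 16 -> 18

print(sun_intensity(12))  # 1.0
print(sun_intensity(7))   # 0.5
```

The same interpolation scheme extends to sun color, fog, and FX parameters, evaluated once per frame and fed to the shaders.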
For both AI and graphics, I developed engine tools that let the artists and designers modify gameplay and visuals easily on their end, without my help. This proved very useful and saved a lot of iteration time down the road, well worth the time I put in early on. The best part was that I was on this game from start to release, which let me experience the development cycle in full.
I Expect You To Die - Schell Games
Platforms: Oculus Rift with Touch, PSVR
Published: December 2016
Play as a super spy in this VR escape the room game
I Expect You To Die is a virtual reality puzzle game where you play as an elite secret agent who must survive deadly situations in immersive, dangerous locations. The game was originally developed for the Oculus Rift and uses the Oculus Touch controllers, though it can also be played with a mouse and keyboard. The studio ported it to PlayStation VR in 2016, where it can be played using either the PS Move controllers or a DualShock controller. The game, an original IP from Schell Games, has been widely praised by critics and fans alike. Check out the launch trailer below!
I was brought onto the team when we were porting the game to PlayStation VR. The team was having trouble getting the game to run at 90 FPS across all scenes and possible actions, so my role was to optimize it until it hit that target. The main issue was reducing draw calls: I wrote a number of shaders, modified code on the graphics end, and worked closely with the artists to achieve optimal static and dynamic draw call batching in our problematic scenes.
Annihilator VR - Schell Games, Legendary Entertainment
Platform: GearVR, Google Cardboard
Published: Jan 2017
Google Play Store: Annihilator VR
An Interactive Comic Book VR Experience
This was a project that Schell Games partnered with Legendary Entertainment to publish. The VR experience was based on the Annihilator comic, and the goal was to figure out what an interactive comic book could be in VR and then build it. The end result is something genuinely immersive that makes good use of the medium.
Gameplay and Build Engineer
My main roles on this project were gameplay and build engineering. I added several gameplay features, such as tracking the user's gaze and modifying the game UI according to where they are looking, to help guide them toward where they should be looking. Since I came onto the project late, a lot of my responsibilities also involved reading through previously written code, fixing bugs, and improving and tweaking features. This gave me a lot of experience parsing through complex code, as the principal engineer who wrote most of it is one of the smartest people I have met. He created a graphical node editor that designers used to direct the story flow throughout the experience, and I had the pleasure of going through that code and helping add features to it.
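The gaze check behind this kind of UI reduces to a cone test between the view direction and the direction to the UI element. A small sketch of that test, with an assumed 15-degree cone rather than the project's actual tuning:

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    mag = math.sqrt(sum(c * c for c in v))
    return tuple(c / mag for c in v)

def is_gazed_at(head_pos, gaze_dir, target_pos, cone_deg=15.0):
    """True when the target lies within a cone around the gaze ray —
    the test a gaze-driven UI can use to decide what to highlight."""
    to_target = normalize(tuple(t - h for t, h in zip(target_pos, head_pos)))
    gaze = normalize(gaze_dir)
    cos_angle = sum(a * b for a, b in zip(gaze, to_target))
    return cos_angle >= math.cos(math.radians(cone_deg))

# Looking straight ahead (+z): a panel directly in front is gazed at,
# a panel far off to the side is not.
print(is_gazed_at((0, 0, 0), (0, 0, 1), (0, 0, 5)))  # True
print(is_gazed_at((0, 0, 0), (0, 0, 1), (5, 0, 1)))  # False
```

Running this test every frame per UI element is cheap, and the cone angle becomes a single designer-tunable value for how forgiving the gaze targeting feels.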
I also helped create build plugins for Unity to automate the build process for Google Cardboard and GearVR, changing the settings and plugins included in the build depending on which platform we were targeting. Each platform required minor changes to the Unity build settings and the exclusion of certain plugins from the APK, which was inefficient to do manually. The build tool automated all of this, so making a build for either platform became a one-step process.
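The core of such a build tool is a per-platform table of settings and plugin exclusions. A toy sketch of that idea, with invented plugin names and settings (the real tool drove Unity's build pipeline rather than returning a plain dict):

```python
PLATFORM_CONFIGS = {
    # Illustrative entries: each platform names its VR SDK and the
    # plugins that must be excluded from its APK.
    "cardboard": {"vr_sdk": "cardboard", "exclude_plugins": ["OVRPlugin"]},
    "gearvr":    {"vr_sdk": "oculus",    "exclude_plugins": ["CardboardSDK"]},
}

def make_build_plan(platform, all_plugins):
    """Return the plugin list and settings for a one-command build."""
    config = PLATFORM_CONFIGS[platform]
    plugins = [p for p in all_plugins if p not in config["exclude_plugins"]]
    return {"vr_sdk": config["vr_sdk"], "plugins": plugins}

plan = make_build_plan("gearvr", ["OVRPlugin", "CardboardSDK", "CommonLib"])
print(plan)  # {'vr_sdk': 'oculus', 'plugins': ['OVRPlugin', 'CommonLib']}
```

Centralizing the differences in one table is what makes the per-platform build a single command instead of a manual checklist.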
Nova - VR Exploration with Viacom Next
Platforms: HTC Vive, Oculus DK2
Duration: Jan - May 2016
Prototyping VR experiences for showcase at Viacom Next
Project Nova. What exactly is it? At the Entertainment Technology Center (ETC) at Carnegie Mellon University (CMU), students spend an entire semester working on a single project for external clients. Nova was a 6-person project with Viacom Next as our client. We set out to explore the field of virtual reality (VR), uncovering new types of experiences and mechanics made possible by VR and utilizing them to their full potential. To that end, we developed 2 main VR demos using the HTC Vive for Viacom Next to showcase in their demo room to visitors. I was the producer of this project and also one of the programmers on the team.
This was a very exploratory project: we spent many weeks prototyping and researching VR mechanics, producing 15 prototypes, before finally moving forward into the development stage to build the 2 core demos. You can visit our project site at the link above for extensive details about the project, our development blog, our findings in VR, the prototypes we made, and the members of the team. Below is a public promotional video for the project that can help you learn more about it too.
Lightscape - Demo One
Follow the light out of the darkness
This demo was made using the HTC Vive and was born out of our desire to provide new perspectives to a player in VR. We also wanted to make full use of the Vive's room-scale tracking and have the player navigate the entire space. We place the player, blind, in a completely dark 3D world. The only way they can see is by shaking their controllers or pressing the trigger button, which releases sound waves in the form of light particles that reveal outlines of the environment as they collide with it. The player then needs to find their way to the exit. The basic concept is simple: avoid red and follow blue. We take the player through dynamically loaded levels that make the experience feel like one continuous maze. As they exit one level, a new level loads around them, so they walk round and round within a 5-by-5-foot square area while feeling like they are in a much bigger space. We slowly add in more puzzle elements that force them to run around corners, to duck, and even to crawl on the ground. Watch the gameplay trailer below to see what it is like!
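The continuous-maze trick can be sketched as a tiny level streamer: when the player reaches an exit, the previous level is swapped out and the next one is loaded around their current position. A toy version with made-up level names:

```python
class LevelStreamer:
    """Sketch of the continuous-maze trick: swapping levels in place
    makes circling a small tracked area feel like one long maze."""
    def __init__(self, levels):
        self.levels = levels
        self.index = 0
        self.loaded = [levels[0]]   # only one level exists at a time

    def on_exit_reached(self):
        # Unload the current level and load the next one in its place.
        self.index = (self.index + 1) % len(self.levels)
        self.loaded = [self.levels[self.index]]
        return self.loaded[0]

streamer = LevelStreamer(["intro", "corners", "crawl", "exit"])
print(streamer.on_exit_reached())  # corners
print(streamer.on_exit_reached())  # crawl
```

Keeping only one level resident at a time also keeps the scene cheap to render, which matters when every frame has to hit the VR framerate.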
Nova Robotics - Demo Two
Build your own robot
This demo was made using the HTC Vive and is a robot construction sandbox. It was born out of a desire to explore how much creative freedom we could give the player and to test the feasibility of using VR for more than just games: sandbox building environments where users have the freedom to build their own creations, as in Tilt Brush. The player is given a large amount of freedom to build their own robot using the many different weapons and body parts we provide. They can then deploy their robot into a city, where the objective is to destroy as much of it as they can within a time limit. Construction in a toybox environment shows off a lot of what makes VR, and the HTC Vive in particular, great. Room-scale movement, combined with the ability to pick up, move, and assemble objects in a 3D setting, immerses the player in ways a non-VR game just couldn't.
I had two main roles: producer and programmer. As producer, I was in charge of client-team and faculty-team communications, documentation, playtesting, implementing Scrum methodology, keeping our website updated, and making sure we reached our project goals. It was my first time as producer, and I learnt a lot about being a better and more effective team leader by the end of the project. As a programmer, I had different responsibilities in the first and second halves. The first half consisted of a lot of prototyping, so each programmer developed several solo prototypes. I explored various types of movement mechanics, different perspectives in VR, social VR (letting people outside VR use smartphones to play with people in VR), and more.
During the second half, having decided on our two demos, Lightscape and Nova Robotics, my main role, along with the other programmers, was to build them in a month. On Nova Robotics, I helped develop the part connection system and the ability to pick up items with the Vive controllers, programmed all of the game's UI and made it VR-compatible, and helped develop the gameflow control system and the movement of the robot. For Lightscape, I played more of a design role, helping to design the levels, tutorials, and gameplay elements.
This project was extremely educational for me. I learnt a lot about how to design and build VR experiences, as well as what sorts of mechanics do and don't work in VR. I also learnt how to lead a team and communicate effectively with clients. It felt a lot like a real-world production project, and it was especially useful because we encountered, and overcame, many of the issues that real production studios face.
Junk Food Pilgrimage
Platforms: Android Devices
Duration: Jan - Apr 2016
Published: Mar 2016 on Google Play Store
Website: Junk Food Pilgrimage GGJ2016
I participated in the game jam held in Pittsburgh with a team of 5 others, and we ended up making a game we fondly coined Junk Food Pilgrimage. It is a mobile game, available on the Google Play Store to all Android users. It is primarily an endless runner where players can play as either a cookie or a chocolate bar, and the gameplay differs for each character. The cookie continually wobbles, so the player has to keep tapping to keep it balanced and upright. The chocolate bar continually hops up and down, and the player needs to tap rhythmically to make it jump higher and higher without falling to the ground. We originally developed the game for the game jam and decided to keep working on it in order to release it on the app store. The team consisted of 1 sound designer, 2 3D and 2D artists, 1 UI designer, and 2 programmers, of whom I was one.
I was one of the main gameplay and network programmers on the team. During the game jam we made the game networked multiplayer: players could see 'ghosts' of other players' characters playing the game at the same time, which added an interesting competitive incentive. I was in charge of getting real-time multiplayer working over the network, as well as helping with the gameplay programming. Moving forward, however, we decided to forgo real-time multiplayer for the app store release due to server costs and the fact that we wanted to release the game as a fun personal project rather than try to make money off it.
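The ghost mechanic essentially amounts to each client broadcasting its runner's position and rendering the latest snapshot received from every other player. A minimal sketch of the client-side bookkeeping (player IDs and positions invented; the real game moved these snapshots over the network):

```python
class GhostTracker:
    """Minimal sketch of the 'ghost' idea: store the latest position
    snapshot received for each remote player, and render every
    snapshot except the local player's own."""
    def __init__(self):
        self.snapshots = {}  # player_id -> (x, y)

    def receive(self, player_id, position):
        # Called whenever a position update arrives from the server.
        self.snapshots[player_id] = position

    def ghosts_for(self, local_player_id):
        # Everything except the local player gets drawn as a ghost.
        return {pid: pos for pid, pos in self.snapshots.items()
                if pid != local_player_id}

tracker = GhostTracker()
tracker.receive("cookie_1", (10.0, 0.0))
tracker.receive("choco_2", (12.5, 1.0))
print(tracker.ghosts_for("cookie_1"))  # {'choco_2': (12.5, 1.0)}
```

Because ghosts are purely cosmetic, lost or late updates only make a ghost stutter; nothing in the local player's run depends on them, which keeps the networking forgiving.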
Publishing to Play Store
While developing the game further, I documented much of my development and publishing process for the Google Play Store. There were many steps involved, so I decided to make a guide for others to use in the future. The guide not only walks through the steps needed to publish to the Play Store, but also offers other tidbits: a section about publishing from Unity, tips on reducing APK size, how to gain visibility for your app on the Play Store, and an overview of Google Play Services and how to use them in Unity. I hope this helps anyone who sees it!
Temple of Balls
Platform: Google Tango
Duration: Oct 2015
An AR action game where you need to dodge all the traps to survive
Players use the Google Tango and find themselves trapped inside a temple. There are 3 levels, each escalating the difficulty of the traps the guest has to dodge. The mainstay trap across all 3 levels is gargoyles shooting fiery balls at you: they fire from each of the walls, so you need to be constantly on the lookout to avoid them. Later levels introduce different traps for a progression of difficulty, with breaks in between to give guests a chance to rest. There are 4 rounds in total, and in the last round the guest can escape and win. We make use of the Tango's motion tracking to have the guest move around in real life, adding a level of physical engagement that makes the act of dodging that much more fun. Check out a gameplay video below!
Programmer, Sound Designer
I was one of the core gameplay programmers and the sound designer on this game. My main programming responsibilities were setting up the Google Tango environment in Unity and integrating the Tango's motion tracking features into our game. In addition, I was in charge of programming the mechanics of the traps that the player needed to dodge, as well as the audio systems. We made this game within a week, so we tried to focus on a single feature and make it fun.
Duration: Sep 2015
Recall your memories by painting them to life
You play as a famous artist with failing eyesight who tries to recall his memories by painting them to life. Using the Kinect, guests wield a paintbrush and palette to paint away the darkness and reveal objects in the world. By interacting with various objects, they learn more about the life of the character they are playing and travel through different rooms. The memories progress from the most recent back to the painter's childhood.
Programmer, Sound Designer
I was the sound engineer and lead writer for the game. I wrote the dialogue and storyboarded the main story, and was in charge of making and integrating all sound effects, background music, and dialogue. All the music in the game is my own original composition, made on a synthesizer. I also handled the audio programming: I set up a global event-based audio player system that triggers sound effects and music on certain events.
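A global event-based audio player can be as simple as a registry mapping event names to audio cues. A toy sketch of the pattern, with invented event and clip names (the real system lived inside Unity and actually played the clips):

```python
from collections import defaultdict

class AudioEventBus:
    """Sketch of an event-based audio system: gameplay code raises
    named events, and the audio layer maps each event to its cues."""
    def __init__(self):
        self.handlers = defaultdict(list)
        self.played = []

    def register(self, event, clip_name):
        # Audio setup code binds clips to gameplay events up front.
        self.handlers[event].append(clip_name)

    def raise_event(self, event):
        # Gameplay code fires events without knowing about audio at all.
        for clip in self.handlers[event]:
            self.played.append(clip)   # stand-in for playing the clip

bus = AudioEventBus()
bus.register("painting_revealed", "brush_shimmer")
bus.register("room_entered", "memory_theme")
bus.raise_event("painting_revealed")
print(bus.played)  # ['brush_shimmer']
```

The payoff of this decoupling is that gameplay code never references audio assets directly, so cues can be swapped or layered without touching gameplay logic.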
Ending Scene Theme
Keepin' It Realtime
Technologies: Unity, Node.js, Sails.js
Duration: Oct - Dec 2014
Playing online multiplayer games in realtime
This was my final team project for 15-437: Web Applications, done in a team of 4. We created a website to host multiplayer games built in Unity, playable in real time. Each user sees a game lobby screen once they start a game, and then has the option of either hosting a new game or joining any currently hosted game.
We have since taken the site offline, but you can run it yourself. The site was developed using Sails.js. You will need to install Sails.js and several other libraries, but once you do, you can clone the GitHub repo, go into the keepin' it realtime folder, and run the sails lift command in that folder. That will run the site on your local server.
My main responsibility was creating the games and making them multiplayer-compatible over the network. I created one original game for the site, a space explorer game, and made it multiplayer. I also took a chess game I had made previously and modded it for network play. Finally, I took a Unity demo project called Survival Shooter and modified its source code to make it multiplayer as well.
I created a master networking script in C# that I was able to reuse across all 3 games, with slight modifications for each, to let players host games over the network and play with others. I also built an in-game chat system, and designed and implemented a scoreboard for each game that updates every 10 seconds with the 10 highest scores. Lastly, I helped with the design of the site and implemented the friending system for registered users. A unique aspect of the site, though, is that users do not have to sign up to play: there is an option to play as a guest.
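The per-refresh scoreboard computation is just a top-n selection over the recorded scores. A sketch of that step (player names invented; the site ran this on a 10-second timer):

```python
import heapq

def top_scores(scores, n=10):
    """Recompute the top-n scoreboard from (player, score) pairs,
    as a periodic refresh would."""
    return heapq.nlargest(n, scores, key=lambda entry: entry[1])

scores = [("alice", 420), ("bob", 910), ("guest_7", 640)]
print(top_scores(scores, 2))  # [('bob', 910), ('guest_7', 640)]
```

Selecting the top n with a heap avoids fully sorting the score list on every refresh, which matters once the score table grows.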
Below are images showing the various features of the site, such as the login page, the profile page, and each of the game pages, each with a site chat room and scoreboard.
Simulating Propagation of RNA Viruses
Technology: OpenGL, CUDA
Duration: Mar - May 2014
This was the final project I did with a partner for 15-418: Parallel Computer Architecture and Programming at CMU. We simulated from scratch how 3 RNA viruses (HIV-1, Ebola, Influenza A) propagate at a macro level from person to person in six different cities around the world. We developed the simulation in C++ and sped it up using NVIDIA's parallel computing architecture, CUDA, achieving a substantial speedup. The simulation itself was displayed using OpenGL.
My main tasks were developing the algorithms behind the propagation of the viruses and creating the data structures representing the real-world environments for the viruses to propagate in. Essentially, I created the entire sequential simulation in code, my partner hooked it up to OpenGL and created a visualization, and then we worked together to integrate CUDA and parallelize the simulation. You can check out the project website for more extensive details if you are interested, and also download the final report below.
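While the project's actual propagation model isn't reproduced here, a person-to-person step of this kind of simulation looks roughly like the following sketch. Each infected person's transmissions are evaluated independently, which is the property that makes such a step amenable to CUDA-style parallelization (one thread per person):

```python
import random

def propagate_step(infected, contacts, transmit_prob, rng):
    """One sequential step of a toy person-to-person propagation model:
    each infected person may transmit along each of its contact edges.
    Illustrative only; the transmit model and graph are invented."""
    newly_infected = set()
    for person in infected:
        for neighbor in contacts.get(person, ()):
            if neighbor not in infected and rng.random() < transmit_prob:
                newly_infected.add(neighbor)
    return infected | newly_infected

# A tiny contact graph: person 0 knows 1 and 2, person 1 knows 3.
contacts = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
state = propagate_step({0}, contacts, transmit_prob=1.0, rng=random.Random(42))
print(sorted(state))  # [0, 1, 2]
```

In the parallel version, the inner loop's writes become the only synchronization point, since every person's outgoing transmissions can be computed without reading another thread's updates for the same step.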
Research on Smarter Game AI
Engine: Skyrim Creation Kit
Duration: Jan - May 2015
Making virtual companion characters behave more intelligently
This research project focused on investigating how to make virtual companions behave more intelligently around human players in a game. Many games have such virtual companions, and they are usually built on developer-tailored state machines or behavior trees. However, these don't give the player the opportunity to teach the companion character smarter behaviors.
I worked together with 2 others on this project. We built the system on Skyrim, an adventure video game with non-player characters that can be recruited as companions to help you fight. These companions are fairly unintelligent by default and do not understand any kind of complex tactics. We looked into applying search-based planning algorithms to make companion characters behave differently based on each individual player's behavior, shaping their goals around the player's play-style. Below is a top-down view of the tiny portion of the game that we used for most of our testing.
My research objective for the semester was first to design and implement a model of the Skyrim game world for the AI planner to use. This took up most of my time and involved not only creating data structures representing all aspects of the game world, but also developing more specific enemy AI vital to our testing. After this was done, the last part of my research involved integrating our AI algorithms into the game model and iterating on the algorithm based on testing.
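To give a flavor of search-based planning (this is not our actual Skyrim planner; the states and actions below are invented), a breadth-first search over world states can produce an action sequence that reaches a goal state:

```python
from collections import deque

def plan(start, goal, actions):
    """Tiny search-based planner: breadth-first search over world
    states, where each action maps a state to a successor state."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if state == goal:
            return steps
        for name, apply_action in actions:
            nxt = apply_action(state)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, steps + [name]))
    return None  # goal unreachable

# States are (companion_position, enemy_alive) tuples: the companion
# must close to position 2 before its attack has any effect.
actions = [
    ("advance", lambda s: (min(s[0] + 1, 2), s[1])),
    ("attack",  lambda s: (s[0], False) if s[0] == 2 else s),
]
print(plan((0, True), (2, False), actions))  # ['advance', 'advance', 'attack']
```

The appeal of planning over hand-authored state machines is that the goal itself can change, for example to mirror a player's observed play-style, and the same search produces new behavior without new authored logic.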
This serves as a summary of the work we did over the semester.