Automated Occluders For GPU Culling
Game Developer Magazine (September 2011 Issue)
2014 – Present
Unreal Motion Graphics
I worked on Unreal Motion Graphics (UMG), Unreal Engine's UI authoring framework.
2012 – 2014
At Organic Motion I worked on L.I.V.E., a system that maps an actor's motion to an avatar in real time. Combined with facial motion and voice retargeting, it lets an actor play any role. I also did some GPU optimization work on OpenStage, the markerless motion capture booth Organic Motion also provides.
2010 – 2012
I joined Activate3D in mid-2010, about a year before the Kinect for the Xbox 360 came out. Our tiny startup set out to bridge the physical disconnect between the living room and the virtual world: letting the player drive the avatar through intent and move about the virtual world in a way we had yet to see.
I did a lot of work studying how humans move, developed machine learning algorithms for inferring intent, built several interaction methodologies for standard 2D applications usable at the 6–10 foot range, and explored many cutting-edge ideas with my fellow engineers. In mid-2012 Activate3D was acquired by Organic Motion. Check out some of the videos below of the different things we created!
Emergent Game Technologies
2007 – 2010
Products: Gamebryo LightSpeed 3.0, 3.1, 3.2
Toolbench is a pluggable tools framework written in C#, similar in design to Eclipse and its plug-in architecture. It hosts a World Builder, Script Editor, Debugger, and more.
When I first joined Emergent, Toolbench was just getting off the ground. I worked primarily on the tool's infrastructure and on the level editor, World Builder, but I was involved at some level with almost every plug-in in Toolbench. Even though the technology was fairly new at the time, we ended up using a fair amount of WPF in the tool's UI.
You can watch the following videos of Toolbench being demoed on Gamebryo LightSpeed 3.0.
Oxel is a tool I work on in my spare time for generating occlusion volumes from the voxelized structure of a mesh. It uses a very robust voxelization method to first build an octree of the scene, then attempts to fill the interior volume with low-poly structures.
To see the Game Jams I’ve participated in head over to my [Game Jams] page.
Virtuoso (HI-FIVES Project)
Affiliation: North Carolina State University
HI-FIVES stands for Highly Interactive Fun Internet Virtual Environments in Science.
I worked on this research project at NCSU under Dr. Michael Young. Funded by a grant from the NSF, the project's goal was to create a new environment for teaching students through video games. I was involved with HI-FIVES from the start, working alongside another programmer. We built the project as a mod for Half-Life 2, using Valve Software's Source engine as the foundation for a level builder and script editor. It was designed so that teachers (or anyone) with no experience in game development could, after a short time, build and script worlds for their students to learn in.
One piece of the project I was responsible for was the scripting environment. Unfortunately the Source engine did not ship with one, which meant I had to add it. I started by attaching the Lua virtual machine to the engine, so that any natively built object could be controlled by Lua scripts instead of precompiled C++ code. However, for non-programmers to be able to write a game, the scripting environment had to be visual. To complement the visual representation built by my partner on this project, I wrote the backend framework containing all the classes that represent the various script components. Combined with a client/server communication layer that transmitted changes and kept all the clients synchronized, multiple people could edit the same environment together and then observe the results.