I am a game developer and this is my portfolio.
Eye tracking is, in my opinion, very important for virtual and augmented reality HMDs. It is relatively cheap hardware-wise, and in practice it can serve as another form of input. Successfully utilizing gaze tracking increases immersion and provides easier ways to interact with objects.
Many games use depth of field to increase immersion, but for a properly lifelike experience in a virtual reality head-mounted display a simple depth of field effect is not enough. Gaze-contingent depth of field, on the other hand, simulates vision effects such as vergence and focus much more closely to real life, which reduces eye strain and increases immersion.
The video above shows my implementation of such a depth of field effect, built with the Tobii EyeX eye tracker and Unity3D.
The user can switch between two versions of the gaze-dependent depth of field.
The first version blurs objects both closer to and further away from the focal point.
The second version is more lifelike: the depth of field can only be observed when the user focuses on a nearby object.
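The core logic can be sketched in a few lines (plain Python rather than the actual Unity implementation; the function and parameter names are my own, purely for illustration). The gaze point determines the focus depth, and each pixel's blur then grows with the difference of inverse depths, a common thin-lens approximation:

```python
def blur_radius(pixel_depth, focus_depth, aperture, max_blur):
    """Blur for one pixel, given the depth of the object under the gaze.

    Thin-lens approximation: blur grows with the difference between the
    inverse depths of the pixel and of the focal plane, clamped to a
    maximum blur radius.
    """
    blur = aperture * abs(1.0 / pixel_depth - 1.0 / focus_depth)
    return min(blur, max_blur)

# Focusing at 10 m: an object at the focal plane stays sharp,
# while a nearby object at 2 m is visibly blurred.
blur_radius(10.0, 10.0, 1.0, 2.0)  # in focus -> 0.0
blur_radius(2.0, 10.0, 1.0, 2.0)   # out of focus -> positive blur
```

The "more lifelike" second version could simply skip the effect whenever the focus depth is large, so that focusing on distant objects produces no visible blur.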
Having an eye tracking solution in a virtual reality HMD also opens up the possibility of a more personalized level of detail (LOD) system.
Currently, level of detail in games is chosen based on the distance from the object to the camera: the greater the distance, the lower the object's polygon count. But the player sometimes looks at the background and may notice LOD popping or general unpleasantness in the distant scenery.
With an eye tracking solution a new form of LOD can be achieved, one that prioritizes objects close to the point the player is looking at. Objects far from the player's gaze need not be rendered at the highest LOD even when they are close to the camera. Combined with gaze-contingent depth of field, this kind of LOD further masks LOD popping and improves performance, since only the objects actually being observed are rendered at maximum quality.
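As a sketch of the idea (plain Python, not actual engine code; the thresholds and names are illustrative assumptions), a gaze-aware LOD picker could take the worse of two penalties, one from camera distance and one from angular distance to the gaze point:

```python
def select_lod(camera_distance, gaze_angle_deg,
               num_levels=4, max_distance=100.0, foveal_angle_deg=10.0):
    """Pick a LOD level: 0 is the most detailed, num_levels - 1 the least.

    An object is degraded either because it is far from the camera or
    because it is far from where the player is looking, whichever is worse.
    """
    by_distance = min(camera_distance / max_distance, 1.0)
    by_gaze = min(gaze_angle_deg / foveal_angle_deg, 1.0)
    penalty = max(by_distance, by_gaze)
    return min(int(penalty * num_levels), num_levels - 1)

# An object under the player's gaze keeps full detail near the camera,
# while an object equally close but far from the gaze can be degraded.
select_lod(5.0, 2.0)   # near camera, near gaze -> 0
select_lod(5.0, 30.0)  # near camera, far from gaze -> 3
```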
This was a short-term project for Inition.
The project consisted of using the Kinect v2 to create three interactive experiences.
The Inition team was responsible for the art direction and the assets, while I handled the programming.
“Project Foundation” is a multiplayer game prototype developed by a small group of people. Each one of us wore many hats, and the result is the product of an enormous collaborative effort and a lot of trial and error.
My goal was to create an interesting car selection menu for a video game. I came up with the idea of having the cars drive up and away, stopping in front of the camera, and I really wanted to see it in action.
After the main part was completed I decided to focus on making it look better. Some of the things that helped the overall look include adaptive depth of field and a dirty-lens camera effect.
Eric James from Ins3D made some improvements to my implementation:
link to the freevi forum page
link to his improved version
The following post was written before the release of Unity 5.
Even with the new version of Unity (where render textures are free), my interlacing method still performs better on some devices.
After successfully completing the first round of freevi's challenge, I received a free Commander 3D tablet from them.
The first version of their Unity SDK used the usual method of rendering interlaced 3D: render textures.
At the time, render textures were only available to Unity Pro users. To develop and publish an application for the Commander 3D tablet, developers had to buy Unity Pro and Unity Pro Android licenses. This could cost them $3,000 up front, or $150 per month on subscription (12-month minimum contract).
I tried to look at the problem from a different perspective.
Why not, instead of rendering the two images to textures and interlacing them, mask out half the pixels of each camera's image and render both directly? Fortunately, the shader for such a mask was very easy to write.
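Conceptually the masking works like this (a CPU-side Python sketch of the idea, not the actual shader code): the left camera keeps only the even rows and the right camera only the odd rows, so when both render directly to the screen the row-interleaved image emerges without any intermediate render texture.

```python
def interlace(left_rows, right_rows):
    """Row-interleave two equally sized images: even rows come from the
    left eye, odd rows from the right eye.

    On the GPU the same result is achieved by having each camera's
    shader discard the rows belonging to the other eye, which is what
    removes the need for render textures.
    """
    assert len(left_rows) == len(right_rows)
    return [left_rows[y] if y % 2 == 0 else right_rows[y]
            for y in range(len(left_rows))]

interlace(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"])
# -> ["L0", "R1", "L2", "R3"]
```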
The final result was quite surprising: it performed better on the tablet than the original render texture approach, gaining 2-10 fps, and it saved developers $3,000 per seat.
The GitHub repo can be found here:
I would like to note that this method of 3D interlacing is not hardware specific: if you need to interlace two or more cameras, you can use it.
Zen Build is a free puzzle game currently available for Android and Web. Slot different pieces together to solve numerous puzzles while enjoying relaxing music and a minimalistic design.
Favourite quotes from players:
"Best indie game ever saw this game on 9gag and i immediately downloaded it. i rarely give reviews but this one really caught me. hats off to the developer. very easy to play but challenging."
~ Christiano Nabong (google play)
"Innovative Relaxing 3D puzzle with full rotation control. Have to be a little precise on where a piece goes or it will kinda fling it at the main body and bounce off. Nice physics touch!"
~ Kitty Phipps (google play)
"A very interesting and somewhat challenging game. This really helped me calm myself after today. 11/10 will be watching for updates!"
~ wilcojar000 (gamejolt)
Through the years I have developed numerous game prototypes. Here you can find videos of some of my favourites.