After finishing my initial capstone prototype I wasn’t too happy with it: even though it worked nicely, it was too boring for my taste. So I decided to spin it off into something crazier.
A game that will live in your hand through projection mapping.
Calibrating the hand has two main stages: aggregating the right number of points, and then shifting the camera. To aggregate the points needed to successfully map your hand with the projector, we save the 3D position of the Leap Motion’s index finger together with the 2D point where the mouse thinks that index finger is. The image below shows the calibration process.
This image shows what happens when the points aren’t aggregated correctly: the projection ends up offset from your hand.
In contrast, this image shows how it looks in openFrameworks when the hand projection has been calibrated successfully.
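Under the hood, collecting (3D finger position, 2D projector pixel) pairs like this is the classic camera-calibration problem: with enough correspondences you can solve for a 3×4 projection matrix using the Direct Linear Transform. Here is a minimal sketch in Python with NumPy; the function names and the DLT approach are my own illustration, not the actual openFrameworks code:

```python
import numpy as np

def estimate_projection(points_3d, points_2d):
    """Direct Linear Transform: solve for the 3x4 matrix P such that
    [u, v, 1]^T ~ P [X, Y, Z, 1]^T for every (3D point, 2D pixel) pair."""
    assert len(points_3d) >= 6, "DLT needs at least 6 correspondences"
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    A = np.array(rows)
    # Null-space solution via SVD: the right singular vector with the
    # smallest singular value minimizes ||A p|| subject to ||p|| = 1.
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 4)

def project(P, point_3d):
    """Map a 3D Leap Motion point to projector pixel coordinates."""
    x = P @ np.append(np.asarray(point_3d, dtype=float), 1.0)
    return x[:2] / x[2]
```

Once `P` is estimated, every new Leap Motion finger position can be pushed through `project` to find where the projector should draw, which is what makes the image "stick" to the hand.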
Last semester I was the technical lead for a project called Marioneta! The team was one experience designer, one sound designer, one artist, and two programmers (me being one of them).
Marioneta was a project for the Pittsburgh Children’s Museum: a puppet gesture-recognition and mirroring experience built with the Microsoft Kinect v2. The project’s main focus is an endless experience in which everything in the world reacts to the guests’ actions as they become and impersonate a puppet.
The installation is currently running in the museum, so you can go check it out!
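The mirroring idea itself is simple: the puppet should move like the guest’s reflection, so the tracked skeleton gets flipped across the vertical plane before driving the puppet, and gestures are read off the joint positions. A toy sketch in Python (the joint-dictionary layout and the `arms_raised` gesture are hypothetical; the real project used the Kinect v2 SDK in Unity):

```python
import numpy as np

def mirror_skeleton(joints):
    """Flip each tracked joint across the vertical plane (negate x)
    so the puppet behaves like a mirror image of the guest."""
    return {name: np.array([-x, y, z]) for name, (x, y, z) in joints.items()}

def arms_raised(joints):
    """Toy gesture check: both hands above the head."""
    return (joints["HandLeft"][1] > joints["Head"][1]
            and joints["HandRight"][1] > joints["Head"][1])
```

Note that a symmetric gesture like this one survives mirroring unchanged, which is exactly what you want for a mirror-style installation.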
After one week of playing with the Leap Motion at the Entertainment Technology Center (Carnegie Mellon University), a team of two artists, one sound designer, and two programmers (including myself) created a new and easy way to experience the thrill of being a DJ in a Unity3D app.
Working with the Leap Motion we ran into its limitations, but we were able to work around them and create an engaging, fun experience for both naïve and experienced guests. It was a lot of fun working on this project; even though we only had one week, I think we made something very cool and unique.
All the art assets are driven by a sound analyzer (specifically, the Fast Fourier Transform). Every time the user altered the music, all the art assets reacted to it as well, giving the user a sense of power over the world. The lights also played a big part in the experience: they too reacted to the DJ’s actions, changing the mood of the world as the user experimented with the controls and DJ’d away!
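For the curious, the core of an analyzer like that is just an FFT over a short audio window, with the magnitudes grouped into frequency bands that scale the visuals. A minimal Python/NumPy sketch (the real project used Unity3D’s audio spectrum data; the band edges here are my own choice):

```python
import numpy as np

def band_levels(samples, sample_rate,
                bands=((20, 250), (250, 2000), (2000, 8000))):
    """FFT a mono audio window and return one level per frequency band,
    e.g. bass/mids/treble values to scale reactive art assets."""
    # Hanning window reduces spectral leakage at the window edges.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
```

Each frame, the game feeds the latest audio window through this and maps the per-band levels onto asset scale, color, or light intensity, which is what makes the whole world pulse with the user’s mix.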
After playing with UNIMOVE, I finally got the latest Unity wrapper to connect six PSMoves to my Mac!
Time to do some fun stuff!