Space Nietzsche Bot


“A color for the weather” @space_nietzsche

I wanted to get experience using Node.js, so I decided to build my bot with it. After looking at many bots out there and following a couple of tutorials on how to make a bot, I decided to create my own!

Every 30 minutes, Space Nietzsche Bot selects a city and grabs its current weather information. From that data, I use the temperature, precipitation, and wind bearing to calculate a color representation. I then create an image filled with that color and post a new tweet.
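The exact formulas aren't spelled out here, but the idea is that each weather value nudges a different channel of the color. A minimal sketch of that logic (in C++ for illustration; the bot itself is Node.js, and every formula below is invented for the example):

```cpp
// Illustrative only: the bot is written in Node.js and its real mapping
// differs; this just shows how weather values can drive a single color.
#include <algorithm>
#include <cmath>

struct Color { int r, g, b; };

Color weatherToColor(double tempC, double precipMm, double windBearingDeg) {
    const double kPi = 3.14159265358979;
    // Temperature drives a cold-to-warm blend: -20 C reads blue, 40 C reads red.
    double t = std::clamp((tempC + 20.0) / 60.0, 0.0, 1.0);
    int r = static_cast<int>(255 * t);
    int b = static_cast<int>(255 * (1.0 - t));
    // Precipitation feeds the green channel; heavier rain shifts the hue.
    int g = static_cast<int>(std::clamp(precipMm * 20.0, 0.0, 200.0));
    // Wind bearing (0-360 degrees) nudges all channels up or down a little.
    int shift = static_cast<int>(30.0 * std::sin(windBearingDeg * kPi / 180.0));
    auto clampByte = [](int v) { return std::clamp(v, 0, 255); };
    return { clampByte(r + shift), clampByte(g + shift), clampByte(b + shift) };
}
```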

Originally the tweet used a static template: “The current color weather of city <bla> is <img>”

I realized this was very boring, so I added some adjective randomness to make the tweet more interesting! Now it is: “The <adjective and connector> color of <city> is <adjective and connector>.”

While adding the randomness, I also added a random variable that sometimes appends a random hashtag to the tweet.
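Putting the template and the randomness together looks roughly like this (same caveat: the real bot is Node.js, and the adjective and hashtag lists here are made up):

```cpp
// Sketch of the randomized tweet template; the word lists are invented.
#include <cstdlib>
#include <ctime>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::srand(static_cast<unsigned>(std::time(nullptr)));
    std::vector<std::string> adjectives = {"moody", "radiant", "restless"};
    std::vector<std::string> hashtags   = {"#weather", "#color", "#botart"};

    std::string adjective = adjectives[std::rand() % adjectives.size()];
    std::string tweet = "The " + adjective + " color of the city is <img>";

    // A random variable decides whether this tweet also gets a hashtag.
    if (std::rand() % 3 == 0) {
        tweet += " " + hashtags[std::rand() % hashtags.size()];
    }
    std::cout << tweet << std::endl;
}
```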

The bot can be found here:

@space_nietzsche
webpage

Final Project Research

Cubepix Demo Test – by Xavi’s Lab

I am interested in doing something with projection mapping mixed with hardware like Arduino. I really liked this project because it is a very good combination of hardware and projection mapping. I don't like the fact that the installation is super loud. I know that is because of the motors, but if I end up using both projection mapping and Arduinos, I would try to minimize the number of motors to keep the volume as low as possible. Apart from that, I really like the effect the boxes create when they are rotating! It looks very cool. I also liked how they added the Kinect as an interactive input; I would definitely add one too. I really think the Kinect adds a very special touch to experiences like this.

FLOW 1 | KINECT PROJECTOR DANCE

This project is more focused on the Kinect and projection mapping. As mentioned earlier, I feel that when it comes to tracking, the Kinect's depth camera makes it easier to create interesting things. I really like all the effects the dancers create. It looks like they are the ones disturbing the environment, and I really like how that looks. I also liked the minimal color palette. The high contrast makes a stronger connection between the dancers and the disruption they create in the background mapping. I have seen many effects similar to this one, so I would definitely try to innovate on the canvas, maybe by adding a real static object to the scene, or combining it with LEDs or some type of sensor that would alter the canvas and make it less flat.

Creative Canvas – OF & ofxAddons

Creative Canvas

ocean_canvas

I have always liked image manipulation but had never had the chance to really experiment with it. I took this assignment as a chance to see what I could create by extracting information from any image. After looking at the openFrameworks addons, I decided to use ofxUI, ofxCV, ofxTriangleMesh, and ofxColorQuantizer.

I used ofxUI to give the user (me) control over manipulating the image. ofxCV extracts the main information from the image, mostly the contours. ofxColorQuantizer and ofxTriangleMesh are both used to add effects to the image.

So Creative Canvas breaks an image into its main color palette and finds the contours based on those colors. You can switch between contours by pressing ‘c’ to see what it selects.

In the UI, there are five main features that can be applied to the image: “draw triangle”, “draw mesh”, “draw outline”, “draw particles”, and “random”. These features decide how you are going to draw over the current contour. If you select random, it will randomly pick one of these effects.
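A rough sketch of the contour half of that pipeline, using ofxCV's color tracking (not the project's actual code; I leave out the ofxColorQuantizer call that would produce `paletteColor` rather than guess its exact API, and the thresholds here are arbitrary):

```cpp
// Sketch: find the contours of regions close to one quantized palette color.
// The 'c' key in Creative Canvas cycles through contours found this way.
#include "ofMain.h"
#include "ofxCv.h"

void findContoursForPaletteColor(ofImage& img, ofColor paletteColor) {
    ofxCv::ContourFinder contourFinder;
    // Track regions of the image that are close to this palette color.
    contourFinder.setTargetColor(paletteColor, ofxCv::TRACK_COLOR_RGB);
    contourFinder.setThreshold(30);       // color-distance tolerance (arbitrary)
    contourFinder.setMinAreaRadius(10);   // ignore tiny blobs
    contourFinder.findContours(img);

    // Each polyline is one contour; effects are drawn over these.
    for (std::size_t i = 0; i < contourFinder.size(); i++) {
        contourFinder.getPolyline(i).draw();
    }
}
```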

You can loop through the image as long as you want, and very cool effects come out (most of the time).

city_canvas

city
tree_canvas

Many iterations went into the color selection and into how the mesh and particles should look and move. Initially I was thinking of using a particular color palette for all the virtual objects (triangles, particles, etc.), but it looked so busy that I decided to keep the same palette as the image.

The code can be found here: https://github.com/mariale888/Creative_Canvas

Voro Snake – Parametric Object

Voro Snake

I was looking at some very cool stuff done with Voronoi cells and decided to give them a try. I wanted my object to be as close to the user as possible, so I decided to let the user create it by drawing a shape with the mouse. After the shape is drawn, a Voronoi cell pattern fills in the shape the user drew.

To be able to manipulate it, I added three main constraints: width, height, and number of subdivisions. A UI lets the user easily change these parameters to construct their shape.
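To show the core idea, here is a minimal brute-force Voronoi fill in plain openFrameworks (a sketch only; the project itself likely relies on an addon, and the width/height/subdivision parameters are not wired up here):

```cpp
// Brute-force Voronoi sketch (illustrative; not the project's actual code).
// Sites come from points sampled along the user's mouse-drawn line; each
// pixel takes the color of its nearest site, producing the cell pattern.
#include "ofMain.h"
#include <limits>

void drawVoronoi(const std::vector<ofVec2f>& sites, int width, int height) {
    ofPixels pixels;
    pixels.allocate(width, height, OF_PIXELS_RGB);
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            // Find the nearest site to this pixel (squared distance suffices).
            std::size_t nearest = 0;
            float best = std::numeric_limits<float>::max();
            for (std::size_t i = 0; i < sites.size(); i++) {
                float dx = x - sites[i].x, dy = y - sites[i].y;
                float d = dx * dx + dy * dy;
                if (d < best) { best = d; nearest = i; }
            }
            // Derive a stable color from the winning site's index.
            pixels.setColor(x, y, ofColor::fromHsb((nearest * 37) % 255, 180, 230));
        }
    }
    ofImage img;
    img.setFromPixels(pixels);
    img.draw(0, 0);
}
```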

parametric_3
parametric_1

You can reset it at any point and redraw another line, or keep changing the parameters of the object. I called it Voro Snake because I realized that most of the shapes end up looking like a very abstract snake 🙂


The code can be found here: https://github.com/mariale888/AbstractParametric

Motion Capture & VR

I was doing some research for my ETC semester project, which involves motion capture, when I came across the two projects I am going to discuss.

Map Visibility Estimation for Large-Scale Dynamic 3D Reconstruction:

This project tracks movement and then dynamically generates the movement paths of the tracked objects. Markers are attached to the objects to be tracked so the motion capture cameras can see them, and human joints are tracked automatically, similar to how the Microsoft Kinect does it. It is a research project here at CMU whose main focus is creating more accurate motion detection through optimal camera selection: in other words, selecting the right cameras for each point (in a very small nutshell). It was done by Hanbyul Joo, Hyun Soo Park, and Yaser Sheikh. I found this project very inspiring because all the raw movement data creates beautiful color patterns and shapes, and because it solves almost all the issues the Kinect encounters when tracking humans. I feel that a very cool installation could be created with this type of technology because the entire human body is being tracked in a 3D cube/space. This would allow for a completely immersive tracking experience!

After finding this project, I decided to see what else like it is out there, and I found this:

NuFormer – Virtual Reality-Video Projection:

This project tries to combine virtual reality with motion capture to fully engage the user in the experience. Apart from that, it generates a projection of the user in the virtual space to show the audience what he/she is seeing and experiencing. These types of experiences are being explored to see how far we can push VR and fool our brains. This one was made by NuFormer. I really liked the concept of combining virtual reality with motion capture (this is probably what we are going to end up doing in my ETC project), but I feel this project was just a proof of concept. I didn't find the experience that engaging; yes, the art is nice, but with so much power, something more creative would have been better. Something that would really put the user on the edge!

Marioneta

Last semester I was the technical lead for a project called Marioneta! The team was made up of one experience designer, one sound designer, one artist, and two programmers (me being one of them).

Marioneta was a project for the Pittsburgh Children’s Museum: a puppet gesture-recognition and mirroring experience built with the Microsoft Kinect v2. The project’s main focus was to create an endless experience in which everything in the world reacts to the users’ actions as they become and impersonate a puppet.

The installation is currently up at the museum; you can go check it out!

 

Watch Me Grow – PAEYC’s Hackathon

PAEYC’s Hackathon

WatchMeGrow_logo-02

What an experience! Mixing teachers and programmers to make teachers' dream apps come true in 48 hours was amazing! I knew there were many unsolved problems in the educational environment whose solutions would make a teacher's day more pleasant, but after listening to all these teachers pitch their ideas, I realized the problem was bigger than I thought. They struggle with day-to-day activities that only a teacher would understand. Many problems I didn't even know could be problems are impacting kids' education every day. This made me very happy to be part of this hackathon. I knew that after the 48 hours I would not only learn a lot and have a lot of fun, but also help someone solve some headaches.

During the hackathon, a group of two developers and I were able to make a cross-platform app in Unity to help a teacher keep track of each kid's level of learning. The app was made for 3- to 5-year-old kids who were learning to identify the alphabet, numbers, colors, and emotions.

It had two main sides, a teacher side and a kid side; a rough sketch of the underlying data model follows the two lists below.

Teacher:

  • Add kids to the list and select their level for each field of study
  • See the status of each kid
  • Adjust each kid’s level in each field of study depending on how they are doing

Student:

  • Select one of the fields of study and play a learning game
  • Records right and wrong answers for the teacher’s information

The Last Egg

A 3-vs-3, competitive, team-based playground game played with PSMoves. Members of two teams run around protecting the teammate holding the virtual egg while the other team tries to get it.
The game was a “Jury Selection” at IndieCade ’14 and was showcased in the Big Game Arena.

The new DJ experience with LEAP MOTION

After one week of playing with the Leap Motion at the Entertainment Technology Center (Carnegie Mellon University), a team of two artists, one sound designer, and two programmers (including myself as a programmer) was able to create a new and easy way to experience the thrill of being a DJ in a Unity3D app.

Working with the Leap Motion we realized its limitations, but we were able to work around them to create an engaging and fun experience for both naïve and experienced guests. It was a lot of fun working on this project; even though it was only one week, I think we were able to make something very cool and unique.

All the art assets are driven by a sound analyzer (to be precise, we used the Fast Fourier Transform algorithm). This meant that each time the user altered the music, all the art assets reacted to it as well, giving the user a sensation of power over the world. The lights also played a big part in the experience: they reacted to the DJ’s actions, changing the mood of the world as the user experimented with the controller and DJ’d away!
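The project itself was built in Unity, but the same FFT-driven idea looks roughly like this in openFrameworks (a sketch under that substitution; the band count, scaling, and file name are arbitrary):

```cpp
// Sketch of FFT-driven visuals (illustrative; the actual project was Unity).
// ofSoundGetSpectrum() returns the current FFT magnitudes of playing audio,
// so louder frequency bands draw taller bars, reacting live to the music.
#include "ofMain.h"

class ofApp : public ofBaseApp {
public:
    ofSoundPlayer music;

    void setup() override {
        music.load("track.mp3");   // hypothetical file name
        music.play();
    }

    void draw() override {
        int nBands = 64;
        float* spectrum = ofSoundGetSpectrum(nBands);
        float barWidth = ofGetWidth() / float(nBands);
        for (int i = 0; i < nBands; i++) {
            // One bar per frequency band: louder band -> taller bar.
            float h = spectrum[i] * ofGetHeight();
            ofDrawRectangle(i * barWidth, ofGetHeight() - h, barWidth - 2, h);
        }
    }
};

int main() {
    ofSetupOpenGL(1024, 576, OF_WINDOW);
    ofRunApp(new ofApp());
}
```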