OpenNI may have been discontinued, but Mac owners with a Kinect 1 can still create amazing interactive experiences in Max/MSP.
The Kinect camera provides three-dimensional data about a person's movement in physical space. How can this movement be translated into video effects and sound? This class focuses on mapping movement to parameters of video and audio effects. Using 2D data from Jitter and the cv.jit library and 3D data from the Kinect, movement is translated into interactive sound and video effects.
Two Days: November 14, 15, 2015
2 pm – 6 pm Saturday and Sunday
A presentation of work created in the class will be scheduled for the following week
Cost: $200 / $180 for members
596 Broadway, #602 | New York, NY 10012
Subway: F/M/D/B Broadway/Lafayette, R Prince, 6 Bleecker
This class is for video and installation artists, musicians, dancers, and others who are interested in creating video or audio effects using the movement of the body. You will learn how to capture data from the Kinect and use motion tracking methods to map movement onto parameters that can affect video and audio.
Have you ever wanted to modulate the speed of a movie as you move from left to right in front of a video wall? Do you want to mix two videos and pan the audio as you move in space? Perhaps you want to create an interactive orchestra where different positions in space trigger a variety of instruments to play. Or you are a musician who wants to create an interactive dance that generates music through movement. If so, this class is for you.
This class will be hands-on, with the students following along with the instructor. The project will be presented the following week at Harvestworks.
Since the OpenNI framework was discontinued, Mac users and owners of the Kinect 1 have had limited resources for taking creative advantage of their devices. There are, however, objects available that enable Mac users to capture Kinect 1 data in Max/MSP.
You will learn how to combine 3D data from the Kinect and 2D data from the cv.jit library to create interactive audio-visual experiences with the movement of the body. We will use Synapse to capture 3D Kinect data and Jean Marc Pelletier's cv.jit library to refine our motion tracking. Once the motion tracking is in place, we will monitor motion in 3D space and map body movement data to different video and audio parameters. Using the movement of your body, you will control video effects such as playback speed and looping, mix videos with the jit.qt.movie object's attributes, and manipulate objects such as jit.gl.videoplane by applying transformations and visual effects. You will also learn how to trigger audio samples and affect parameters such as volume and panning with your movement.
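The mapping step described above — taking raw tracking coordinates and rescaling them into the range a video or audio parameter expects — is conceptually what Max's [scale] object does. As a rough illustration only (the function and the example ranges are ours, not part of the class materials), here is the same idea in Python:

```python
def scale(value, in_low, in_high, out_low, out_high):
    """Linearly map value from [in_low, in_high] to [out_low, out_high],
    in the spirit of Max's [scale] object (no clipping applied)."""
    return out_low + (value - in_low) * (out_high - out_low) / (in_high - in_low)

# Hypothetical example: map a tracked X position in a 640-pixel-wide
# frame onto a stereo pan value between -1 (left) and 1 (right).
pan = scale(320, 0, 640, -1.0, 1.0)  # center of frame -> pan 0.0
```

The same one-line formula applies whether the destination parameter is pan, volume, playback speed, or a jit.gl.videoplane transformation — only the output range changes.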
Topics to be covered in class:
1• Capture depth data with the Kinect 1 and the jit.synapse object
2• Apply motion tracking methods using the cv.jit library and extract X, Y, Z data
3• Calibrate motion tracking data to match the input range required for video and audio manipulation
4• Map motion tracking data to parameters of video and audio effects
5• Manipulate a QuickTime movie and videoplanes with movement
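Raw Kinect joint data tends to jitter, so the calibration stage (step 3 above) often includes smoothing the incoming values before mapping them. One common approach — sketched here in Python purely as an assumption about how such a filter might look, analogous to Max's [slide] object — is a simple one-pole low-pass filter:

```python
class Smoother:
    """Exponential smoothing for noisy tracking data,
    similar in spirit to Max's [slide] object."""
    def __init__(self, factor=0.5):
        self.factor = factor  # 0..1; smaller values = smoother output, more lag
        self.value = None

    def update(self, sample):
        # First sample initializes the filter; later samples move
        # the output only part of the way toward the new reading.
        if self.value is None:
            self.value = float(sample)
        else:
            self.value += self.factor * (sample - self.value)
        return self.value

# Hypothetical use: feed in jittery Z (depth) readings; the output
# settles gradually instead of jumping with every noisy frame.
s = Smoother(0.5)
for z in [100, 200, 200]:
    smoothed = s.update(z)
```

Smoothing before mapping keeps a video or audio parameter from flickering when the tracked body position fluctuates frame to frame.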
Required for class:
This class will use the Synapse object to capture depth data from the Kinect 1 camera and Jean Marc Pelletier’s cv.jit library for motion tracking.
It is recommended that you download and install the cv.jit library and the jit.synapse object prior to class.
Students will need to have a Kinect 1 camera.
Some experience with Max/MSP/Jitter is recommended but not required.
Sofia Paraskeva is an artist who explores sound and visuals in the context of leading-edge technology. Using tools such as Max/MSP/Jitter, she designs visual and aural narratives for interactive installations and performance. She is a concept creator, a visual effects designer, and a natural video editor. Her work spans interactive art and design, visual design, motion graphics, and experimental video and sound. She is currently involved in projects that use computer vision and custom-made interactive wearable interfaces, such as a wireless bodysuit and sensor gloves, to create sound textures and visuals. Her interactive installation Rainbow Resonance, a computer vision musical interface, has been presented at the Queens Museum of Art, the Mind's Eye program at the Solomon R. Guggenheim Museum, the New York Hall of Science, Harvestworks, and the Brooklyn Conservatory. Sofia holds a BA in Visual Studies and a BA in Media Communication. She is a graduate of the Interactive Telecommunications Program at New York University.