In this workshop, we’ll look at extending the functionality of Ableton Live with Max/MSP. Specifically, we’ll see how a Max for Live Device can receive audio and MIDI from Live, as well as creative ways to process that data. We’ll also look at the Live API, which lets Max control almost any part of the Live Set automatically. This has profound implications for sound design, sequencing and algorithmic composition.
Sunday, April 17th, 2016 (4 hours)
Time: noon – 4 pm
Harvestworks, 596 Broadway #602, New York, NY 10012
Making music with Ableton Live is fun, obviously. Making music with Max/MSP is also fun. Making music with Ableton Live and Max/MSP together is almost too much fun. In this workshop, we’ll look at how to make Max and Live work together to accomplish almost any compositional task we can imagine. We’ll see how to use Max to build a new Live instrument using signal processing techniques unique to Max. We’ll also build a Max for Live device that can create generative rhythmic sequences, adding algorithmic composition to Live. Finally, we’ll build a Max for Live device that can walk through the Live Object Model, controlling the Live Set automatically.
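As a taste of the kind of generative sequencing involved, here is a minimal sketch of a Euclidean rhythm generator, one common technique for algorithmic rhythm. It is written in plain JavaScript, the language of Max's `js` object; the function name and the simple accumulator method are illustrative choices, not the workshop's actual code.

```javascript
// Distribute `pulses` onsets as evenly as possible across `steps` slots
// (a Euclidean rhythm), using a simple accumulator rather than Bjorklund's
// full algorithm. Returns an array of 1s (onsets) and 0s (rests).
function euclideanPattern(pulses, steps) {
  var pattern = [];
  var bucket = 0;
  for (var i = 0; i < steps; i++) {
    bucket += pulses;
    if (bucket >= steps) {
      bucket -= steps;
      pattern.push(1); // an onset lands on this step
    } else {
      pattern.push(0); // rest
    }
  }
  return pattern;
}
```

For example, `euclideanPattern(3, 8)` produces a rotation of the familiar tresillo pattern; inside a Max for Live device, each 1 could trigger a note on successive ticks of a transport-synced clock.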
Sam Tarakajian is a Brooklyn-based developer and musician. He has worked at Apple, Cycling ’74 and the New York Public Library. His work centers around interface design for musical and creative tools. A recent project, the Rhythm Necklace app for iOS, lets the user create complex polyrhythms by touching and tweaking geometric shapes. Occasionally, his love for Max/MSP bubbles over into a YouTube video for the Delicious Max tutorial series.