I’ve recently tried out two very different synthesizer apps for the iPhone: Noise.io and Bebot [app store links]
Noise.io positions itself as a “Pro Synth,” while Bebot positions itself amongst the more toy-like music apps like Band and the various air guitar apps (their prices reflect that positioning too – Noise is $10 while Bebot is $2).
However, they share a common quality – the main interface is a touch-sensitive X/Y surface à la the Korg Kaoss Pad; the X axis controls pitch and the Y axis controls some sort of filter or effect wet/dry setting.
I’m not going to do an exhaustive review, but I’ll say this: Noise.io is an amazing iPhone app – it allows for more features and control than anyone could expect on a mobile device. And I think it is the lesser of the two apps right now.
Why is this? Bebot does two things right. First, it opens directly to a playable screen, with no buttons or feedback other than the beautifully rendered and animated robot character (who, by the way, gives the best depiction of how a parametric EQ *sounds* by moving his lips into different expressions – you really have to see it to know what I’m talking about). Second, it gives you just enough control.
It’s not that I don’t appreciate the high amount of control that Noise.io offers, it’s just that when I’m in the mood to exercise that kind of control I’m far more likely to sit down with Reason or Ableton Live than to poke around on my phone. Bebot realizes that a phone-based program *can* be as powerful as a desktop app, but is going to have an inherently different usage. Bebot offers an iPhone-appropriate experience that is far easier to just whip out and play.
I’ll close with this: I love Noise.io, and I think their product shows a lot of the raw horsepower of the iPhone platform, and their collaboration with Intua on Beatmaker compatibility really breaks new ground. Interface-wise, though, Russel Black is really onto something with Bebot.
Mac users can usually tell when a program is “Mac-like” or not. I think iPhone users are starting to tell whether apps are “iPhone-like”, and Bebot is certainly the more “iPhone-y” of the two apps.
Phases one and two are good to go: collect data about how students process notation, and create a couple of live examples.
New: qualifier for phase 2b
The examples I’m planning to make are not necessarily interactive (i.e., you can read them but not write them), but I’d like them to be; this may have to be a limitation of the project’s final form. I just don’t want the Quartz pieces to be simple visualizations. At the very least, they will be able to represent pre-programmed MIDI notes. At most, they will be clickable for writing music (this will be a challenge).
Phase 3 will be something that branches off of the best of phase 2b. Might be a performance using this notation? Don’t really know yet, but I’ll keep you posted.
Hope you like this tutorial I put together for my high school classes: How to make a simple vocoder using Pure Data.
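The tutorial itself lives at the link above, but the core channel-vocoder idea is simple enough to sketch in plain Python: band-pass the modulator (voice) and carrier (synth) into matching bands, follow the envelope of each modulator band, and use it to scale the matching carrier band. This is a rough companion sketch, not a transcription of the Pd patch – the band count, Q, and frequency range here are arbitrary choices of mine.

```python
import math

def biquad_bandpass(samples, fc, q, sr):
    """Band-pass a list of samples with an RBJ-cookbook biquad (0 dB peak gain)."""
    w0 = 2 * math.pi * fc / sr
    alpha = math.sin(w0) / (2 * q)
    b0, b1, b2 = alpha, 0.0, -alpha
    a0, a1, a2 = 1 + alpha, -2 * math.cos(w0), 1 - alpha
    out, x1, x2, y1, y2 = [], 0.0, 0.0, 0.0, 0.0
    for x in samples:
        y = (b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2) / a0
        x1, x2 = x, x1
        y1, y2 = y, y1
        out.append(y)
    return out

def envelope(samples, sr, smooth_hz=30.0):
    """Envelope follower: rectify, then smooth with a one-pole low-pass."""
    coef = math.exp(-2 * math.pi * smooth_hz / sr)
    env, state = [], 0.0
    for x in samples:
        state = coef * state + (1 - coef) * abs(x)
        env.append(state)
    return env

def vocode(modulator, carrier, sr, n_bands=8, lo=200.0, hi=4000.0):
    """Sum of carrier bands, each scaled by the matching modulator band's envelope."""
    n = min(len(modulator), len(carrier))
    out = [0.0] * n
    span = max(n_bands - 1, 1)
    for b in range(n_bands):
        fc = lo * (hi / lo) ** (b / span)  # log-spaced band centers
        mod_band = biquad_bandpass(modulator[:n], fc, 6.0, sr)
        car_band = biquad_bandpass(carrier[:n], fc, 6.0, sr)
        env = envelope(mod_band, sr)
        for i in range(n):
            out[i] += car_band[i] * env[i]
    return out
```

In Pd the same structure would be a row of `bp~` objects feeding envelope followers, multiplied against a band-passed carrier and summed.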
If you follow my Twitter, you saw a mention of this. Now here it is for all the world. Please don’t sue me if you own the copyright to this song – I only did this because the basketball team needed it!
Cha Cha remix.mp3
P.S. This was done in Ableton Live with liberal use of the “Beat Repeat” plugin and Impulse to sample the drum kit.
Must post this great art/web mashup from @minimalspace, a.k.a. Jason Sloan, an interactive media professor from Baltimore. It’s called “Cloud.s” and it’s just fantastic.
Picture this but with music, and you’ll see where I might be going with my project.
Phase one: research the effectiveness of graphical notation at conveying rhythm to HS students. Hypothesis: a MIDI grid is more effective than traditional notation in this specific area.
Phase two: HS novices devise alternate notation styles that depict time cycle/duration versus pitch. I take the best few examples and make living versions of them using a combination of Quartz Composer and either a DAW like Logic or a custom PD patch (whichever works better for demonstration of the concept).
Phase three: adapt the best visualization for an art installation type use. Maybe link up the PD patch to something that breathes like a Twitter feed, and use that to generate incoming notation data for the QC patch.
What do you think out there? Too broad? Too narrow? Hit me up on Twitter with your ideas.
Overall a good conference. Had a great conversation with Angela Siefer of shinydoor.com on why social networking should be used and encouraged in the classroom. She was impressed we’re going paperless this year too, which is nice. Lots of good reactions from the show floor on Wednesday – see videos here. Next up: graduate project time, some end of the year performance stuff for the MTers, DJing the art show, and more. Also a top secret collab I’m not talking about may or may not be in the works.