The modern mobile A/V production studio

Over at my favorite tech site The Verge, Jordan Oplinger demonstrates how on-the-go production is done in 2012. Note: no specialized equipment beyond the necessary kit to capture signal. Once it’s captured, everything goes straight through to post on the MBP. This is in stark contrast to the situation last decade, when companies tried to sell things like the Streamgenie to live-produce content on the go. The metaphor of the “media truck” is slowly dying.

I find it interesting that she records video and audio separately, just as they do in the movies – there may be an easier way to do this, but there’s no denying that The Verge always has the highest-quality A/V coverage of the events they attend.

Also notice the complete absence of a “video” camera. In 2012, a camera is a camera is a camera, and DSLRs are the best kind of camera, whether you’re shooting video or stills.

Real VJ’ing comes to the iPad – finally!

iPad apps, to me, are at an awkward middle-child phase. Some are incredibly powerful standalone products (like the Korg iMS-20) that do one thing and do it well, and others are “baby” versions of professional software – cool, but limited in real-world usefulness.

Performance apps have a real opportunity here: on stage you only need a limited amount of raw power, but you do need “real” responsiveness and interface options so the crowd doesn’t think you’re playing with a toy.

I haven’t found the holy grail for music apps yet (see the opportunity, o’ certain company whose products I adore?), but today I found a great solution for visualists and VJs who lack the hardware for pro V-mixing or simply feel like packing light: TouchViz by hexler.

TouchViz reminds me a lot of that crazy Quartz Composer patch for video mixing from a few years back, except it is easy to use and actually works. Load up to 120 video clips (you’ll want to make or find these loops first) and then mash around with live effects and mixing during the set. Two channels. You’ll need the VGA-out cable for your projectors.

The only downsides I can see after using it for a little bit: there’s no audio-reactive option (definitely one for this developer’s to-do list, since every iPad has a microphone), and I’d really REALLY like to see an option for setting projection-map zones – it wouldn’t be too hard to implement a 3D effect layer for this, or maybe even just an editable garbage matte.
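For the curious, “audio reactive” just means driving an effect parameter from the live sound level. Here’s a minimal sketch of the idea in Python – the buffer contents, floor, and ceiling values are made up for illustration; a real app would read buffers from the iPad’s mic:

```python
import math

def rms(samples):
    """Root-mean-square loudness of one audio buffer of -1.0..1.0 floats."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def effect_intensity(samples, floor=0.02, ceiling=0.5):
    """Map loudness onto a 0..1 effect parameter, clamped at both ends."""
    level = (rms(samples) - floor) / (ceiling - floor)
    return min(1.0, max(0.0, level))

quiet = [0.01] * 512      # near-silence stays below the floor
loud = [0.4, -0.4] * 256  # a hot signal pushes the effect up

print(effect_intensity(quiet))  # 0.0
print(effect_intensity(loud))   # roughly 0.79
```

Feed that 0..1 value into, say, a blur radius or a color shift, and the visuals pulse with the room.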

$9.99 on the App Store – a little more expensive than VDMX, but a lot cheaper than Resolume, and a whole heck of a lot more portable. Will EMG replace our custom-hacked Game Boy cartridge with this next year?

Making a hardcore video wall using Quartz Composer

Quartz Composer, originally known as “Pixelshox Studio”, is 100% free. If you own a Mac, you already have Quartz Composer, but it likely isn’t installed. Insert your latest system disc and there should be an “Optional Installs” icon. This will install all of the developer tools, including Xcode, iPhone Simulator, and so forth. Don’t worry, it’s worth it.

If you’ve used other patchers like Max/MSP or Pure Data, this will look somewhat familiar. The main difference in the QC patcher is that patches tend to flow from left to right rather than from top to bottom. Also, “macros” (sub-patches) are far more commonly used in QC than in the aforementioned programs.

Fig. 1: A blank Quartz Composer window (left) and view (right)

Click on “Patch Library”. This brings up a palette with all of the available patches. Type “video” into the search box and double-click “Video Input”.

This adds a video input patch. This object does nothing yet – it’s an “input” with nothing to “output” to.

Now, we make an output. In the patch library find “Billboard”. This is a 2D object layer that has whatever you want “printed” on it. Hook the “image” output of Video Input into the “image” input of Billboard. It should look like this:

Fig. 2: One video input hooked to an output.
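If patchers are new to you, the wiring above is just dataflow: one node produces an image, another consumes it. A toy Python analogy – the class names mirror the QC patches, but this is obviously not QC itself, just the shape of the idea:

```python
class VideoInput:
    """Stands in for the QC "Video Input" patch: it produces frames."""
    def __init__(self, device):
        self.device = device

    def image(self):
        return f"frame from {self.device}"

class Billboard:
    """Stands in for the QC "Billboard" patch: it draws whatever it's wired to."""
    def __init__(self, image_source):
        self.image_source = image_source  # the "noodle" between the two patches

    def render(self):
        return f"drawing {self.image_source.image()}"

# Wiring the "image" output into the "image" input:
panel = Billboard(VideoInput("Built-in iSight"))
print(panel.render())  # drawing frame from Built-in iSight
```

The whole four-panel wall is just this pattern repeated: more sources, more billboards, one wire each.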

Now all you need to do is copy the input/output pair once for each panel of the wall. The only other step is manually setting the video device to use: select “Video Input”, click “Inspector”, change to the “Settings” tab, and pick the input device.

Mac OS X only allows two simultaneous FireWire video inputs. I’m not sure how many USB inputs are allowed, but I got away with using the built-in iSight on the iMac running the patch to create two of the four panels. One is zoomed in on the left side and the other on the right, creating the illusion of two discrete video feeds.
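That crop-and-zoom trick isn’t QC-specific – you’re just showing different regions of the same frame full-size in each panel. A toy sketch, treating a frame as a list of pixel rows (the 4×8 “frame” is fabricated for illustration):

```python
def crop(frame, x0, x1):
    """Keep columns x0..x1 of every row; shown full-size, this reads as a zoom."""
    return [row[x0:x1] for row in frame]

# A toy 4x8 "frame" where each pixel value is just its column index.
frame = [list(range(8)) for _ in range(4)]
width = len(frame[0])

left_panel = crop(frame, 0, width // 2)       # one panel shows the left half...
right_panel = crop(frame, width // 2, width)  # ...the other shows the right half

print(left_panel[0])   # [0, 1, 2, 3]
print(right_panel[0])  # [4, 5, 6, 7]
```

In QC you’d get the same result by adjusting each Billboard’s crop/position inputs rather than slicing arrays, but the principle is identical.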

Finally, check out the source file of my Quartz Composer patch – the video inputs will not work (unless you have the exact same cameras hooked up that I did), but it should be educational nonetheless.

How to chroma key using only iMovie and a green screen

- Chroma key background (muslin cloth or even green posterboard works)
- Video loop or picture (find some here)
- iMovie 09 (it came with your Mac if you bought it after January 2009)

Before you begin, read a tutorial or two about chroma key backgrounds. There are lighting issues and the like you should be aware of before you can expect perfect results. That being said, iMovie is rather forgiving about the range of chroma it keys out.
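Under the hood, a chroma key is conceptually simple: any foreground pixel close enough to the key color gets replaced by the background pixel, and the tolerance is that “range of chroma”. A toy per-pixel sketch – the pixel values and tolerance here are made up, and iMovie’s actual algorithm is certainly more sophisticated:

```python
def chroma_key(fg, bg, key=(0, 255, 0), tolerance=120):
    """Composite fg over bg, swapping in bg wherever fg is near the key color."""
    def distance(a, b):
        # Euclidean distance between two RGB tuples
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return [bg_px if distance(fg_px, key) < tolerance else fg_px
            for fg_px, bg_px in zip(fg, bg)]

foreground = [(10, 250, 20), (200, 40, 40)]  # a greenish pixel and a red pixel
background = [(0, 0, 255), (0, 0, 255)]      # a frame of a blue video loop

print(chroma_key(foreground, background))
# [(0, 0, 255), (200, 40, 40)] - the green pixel is keyed out, the red one kept
```

A bigger tolerance forgives uneven lighting on the screen but starts eating into greenish parts of your subject – which is exactly the tradeoff those lighting tutorials are about.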

Enable iMovie’s advanced controls. You can find this in the Preferences.

Throw your video loop into a new project. Make sure the loop is long enough to run all the way through your greenscreen footage. (For our 5-minute announcements, I keep a 10-minute loop “just in case”.)

Drag your greenscreen footage on top of the video loop. Try to line it up with the very edge of where you want it to start. A little menu will pop up – select “Green Screen”.

If you want to add overlays or any other video “tracks”, you will need to export the unedited greenscreen + loop and reimport it into iMovie. Then, edit the movie and do all your other titles, overlays, etc.

Voila! iMovie 09 is very generously (and somewhat secretly, given that these features are off by default) giving away a feature that I’m sure sold many copies of Final Cut Express.

Despite the obvious multiple-track limitation (which can be worked around), adding 10 minutes to post-production can make the announcements look significantly better, and it’s totally worth it.