LHS Music Tech Promo video

It’s been a while since I’ve made a nice video outlining what I actually do at Lebanon High School.  Many people do not realize that I only teach Music Tech.  There is no band, choir, or guitar class that I cover.  We have one of the only programs in the country with the participation levels to warrant such an arrangement.

My course track is unique too.  There are few college paths devoted purely to Music Tech.  As of 2014, you still cannot get a Music Education degree with technology as your concentration; most widely regarded programs are at the master’s or doctoral level.  Thus, the curriculum sequence takes a wider media production standpoint, allowing students to explore fields like broadcasting and theatre production, or to apply the tech to a traditional music degree.  Many students use the experience to bolster their applications to undergrad programs like these.  Some will go into a related media field such as film production.  Other students will just graduate and fly off to LA to become famous.  Either way, I feel like we have tapped into a totally different kind of track than what I see happening at other schools – one that others, such as the CCM E-Media program and the NKU Electronic Media program, are also serving.

But don’t trust me.  Let the students speak for themselves:

I’ll also be speaking about my program at the TI:ME Online Symposium this Monday at 3pm EDT, if you are interested in learning more about Music Tech & Media Production at Lebanon HS.  It’s free for TI:ME members, and easy to register for through the website.

How to avoid overbuying in your A/V studio

As a teacher, I’m constantly assaulted with ads and promotional materials for “pro-quality” studio gear for schools.  Companies assume that most teachers aren’t gearheads and don’t pay much attention to the equipment they’re buying, so long as it works well.  This gives them a huge opportunity for overselling, potentially causing you to exhaust your budget on items you don’t need, or that could have been much cheaper.

Too many schools overbuy, and I think a big contributor to our school’s successful program is my absolute resistance to the idea.  Here’s an example my students like to make fun of: At a convention performance, the kids got a chance to walk around the trade show.  They saw this “AMAZING” video production hardware that cost around $15,000.  I’ll bet a lot of your schools own one.  They came to me and asked, “wait – how much is our stuff? Because this thing’s reel looked exactly like what we produce in our tiny little studio.”

So without further ado, here’s how to do what a Tricaster does for about 1/4 the price:

(disclaimer: our announcements are pre-taped, so some of this might not apply to your situation)

The Tricaster is generally used for mixing multiple camera sources and performing a chroma key (green or blue screen) to replace the background.  Additionally, it can overlay pictures and text.  You know, your basic Steve Brule stuff.
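To demystify what the box is actually doing, here’s a minimal sketch of a chroma key in plain numpy.  This is my illustration, not the Tricaster’s actual algorithm – a real keyer does color-space conversion, spill suppression, and soft edges – but the core idea is just “replace every pixel close to the screen color with the background frame”:

```python
import numpy as np

def chroma_key(foreground, background, key=(0, 255, 0), tolerance=80):
    """Replace pixels near the key color with the background frame.

    foreground/background: HxWx3 uint8 arrays; key is the screen color.
    Crude nearest-color masking -- an illustration, not a production keyer.
    """
    fg = foreground.astype(np.int16)
    # Per-pixel distance from the key color (sum of channel differences)
    dist = np.abs(fg - np.array(key, dtype=np.int16)).sum(axis=2)
    mask = dist < tolerance  # True wherever the green screen shows through
    out = foreground.copy()
    out[mask] = background[mask]
    return out

# Tiny demo: a 2x2 "frame" where two pixels are pure green screen
fg = np.array([[[0, 255, 0], [200, 10, 10]],
               [[10, 10, 200], [0, 255, 0]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 128, dtype=np.uint8)  # flat gray background
result = chroma_key(fg, bg)
```

The green pixels get swapped for the background; everything else passes through untouched.  Overlays work the same way, just with the mask coming from the graphic’s alpha channel instead of a color match.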

So first off, let’s start by remembering that everything the Tricaster can do, your computer probably also can do “in post.”  We started on this theory when I first took over our announcements.  Our first signal flow looked like this:

Sort of, at least.  I’m pretty sure I ran the camera through something like a Pinnacle Hollywood converter to make it a Firewire source for the iMac, but the principle is at least right.  We filmed against a green screen, did the chroma key and overlays in iMovie (which requires two renderings, since iMovie doesn’t support more than one video “track” at a time).  It was inefficient, but it was cheap (and it worked).

Year two, we basically did the same thing, but added some extensions.  I took the shotgun mic off the camera and ran it to a MacGyvered mic boom we made, so the audio would be much clearer and require less boosting in post.  We also found an old piece of kit that came with the camera to do the digitizing, which made importing from tape and other sources a little easier:

By this point, I was sending home a laptop with a student every night so they could do the editing as homework.  They would deliver the laptop back to me early the next morning, and we’d show the announcements to the school.  I didn’t like this setup, because sometimes the students either a) couldn’t do this or b) didn’t do a good enough job, and there was no time to fix problems.

So year three was all about cutting back on the amount of post production needed.  This meant eliminating one of the rendering processes.  Since we can’t avoid the final render, I decided it was time to get a video mixer capable of chroma key.  The cheapest machine of this type on the market to this day is the Roland V-4.  It costs $1000, and is really designed for VJs, or news anchors who like their transitions synced to a song or something.

This saves us time by recording the background along with the anchors, and still gives our editors (who work the following class period) the freedom to choose music and pictures.

The fatal flaw with both this setup and the Tricaster is the lack of true HD support.  Everything that runs in and out of the V-4 is SD quality, so even if we upgraded to an awesome camera it would still get downgraded to SD in post.  Not that it’s a pressing issue right now, since our closed circuit system would do this anyway.

But let’s say we upgraded cameras tomorrow.  How would we do this?  HD video mixers are extremely expensive, and are not something I’m interested in buying.  I’ve thought about this, and for now it seems the best solution would be to go back to year one’s signal flow and change to something like BoinxTV for doing chroma key and overlays.  Three years ago, our computer wasn’t powerful enough to run Boinx without hiccups, but when we do get a new camera, we’ll definitely have the horsepower to handle Boinx.

So, for a studio that started with a Streamgenie and DVD-RAM for recording daily announcements, we’ve done a lot of upgrading with very little money.  Don’t let any salesman tell you that high quality school video is only doable for over $10,000 – always start simple, buy equipment that goes up to your ability level, and you won’t fall into the “overbuying” trap.

The modern mobile A/V production studio

Over at my favorite tech site The Verge, Jordan Oplinger demonstrates how on-the-go production is done in 2012. Note: no specialized equipment, beyond the necessary kit to capture signal. Once it’s captured, everything goes straight through to post on the MBP. This is in stark contrast to the situation last decade, when companies tried to sell things like the Streamgenie to live-produce content on the go.  The metaphor of the “media truck” is slowly dying.

I find it interesting that she records video and audio separately, just like they do in the movies – there may be an easier way to do this, but there’s no denying that The Verge always has the highest quality A/V coverage of the events they attend.

Also notice the complete absence of a “video” camera.  In 2012, a camera is a camera is a camera, and DSLRs are the best kind of camera, whether you’re taking video or stills.

Making a hardcore video wall using Quartz Composer

Quartz Composer, originally known as “Pixelshox Studio”, is 100% FREE. If you own a Mac, you already have Quartz Composer, but it likely isn’t installed. Insert your latest system disc and there should be an “optional installs” icon. This will install all of the developer tools, including Xcode, the iPhone Simulator, and so forth. Don’t worry, it’s worth it.

If you’re familiar with other patchers like Max/MSP or PureData, this will look somewhat familiar. The main difference in the QC patcher is that patches tend to flow from left to right rather than from top to bottom. Also, “macros” or sub-patches are far more commonly used in QC than in the other aforementioned programs.

Fig. 1: A blank Quartz Composer window (left) and view (right)

Click on “Patch Library”. This brings up a palette with all of the available patches. Type “video” into the search box and double-click “Video Input”.

This adds a video input patch. This object does nothing yet – it’s an “input” with nothing to “output” to.

Now, we make an output. In the patch library find “Billboard”. This is a 2D object layer that has whatever you want “printed” on it. Hook the “image” output of Video Input into the “image” input of Billboard. It should look like this:

Fig. 2: One video input hooked to an output.

Now all you need to do is copy the input/output objects. The only other step is to manually set the video device to use. You do this by selecting “Video Input” and clicking “Inspector”. Change to the “settings” tab and select the input device.

Only two simultaneous FW video inputs are allowed in Mac OS X. I’m not sure how many USB inputs are allowed but I got away with using the Built-in iSight off the iMac running the patch to create two of the four panels. One is zoomed in on the left side and the other zoomed in on the right, creating the illusion of two discrete video feeds.
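The zoom trick above is just cropping: take one frame, keep the left half for one panel and the right half for the other, and stretch each back to full width.  Here’s a rough sketch of that idea in numpy (my illustration of the concept, not what Quartz Composer does internally – QC’s crop and billboard patches handle the scaling properly, while this uses naive column repetition):

```python
import numpy as np

def split_zoom(frame):
    """Fake two camera feeds from one frame.

    Crops the left and right halves, then stretches each back to full
    width by repeating every column (nearest-neighbor "zoom").
    """
    h, w, _ = frame.shape
    left_half = frame[:, : w // 2]
    right_half = frame[:, w // 2 :]
    # np.repeat along axis 1 doubles the width of each half
    return np.repeat(left_half, 2, axis=1), np.repeat(right_half, 2, axis=1)

# Demo frame: left half white, right half black
frame = np.zeros((4, 8, 3), dtype=np.uint8)
frame[:, :4] = 255
panel_a, panel_b = split_zoom(frame)
```

Each panel comes out at the original frame size, so side by side they read as two discrete feeds even though only one camera is behind them.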

Finally, check out the source file of my Quartz Composer patch – the video inputs will not work (unless you have the same exact cameras hooked up that I did) but it should be educational nonetheless.