Category "Tutorials"

December 28, 2008

Compressor annotation plist format

Compressor accepts annotations on the command line via a plist file. This lets you add metadata like keywords, author, and so on. But Apple has neglected to document the file format. Luckily, it wasn't too hard to reverse engineer by poking at the traffic the Compressor GUI sends to qmaster.

Under the root node, just make keys named '' or .producer or whatever. Then populate the values as you wish.
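Since the format is undocumented, here's a minimal sketch of generating such a plist with Python's plistlib. The key names below are hypothetical placeholders, not confirmed Compressor keys - check a traffic capture for the real ones:

```python
import plistlib

# NOTE: these key names are illustrative guesses; sniff the Compressor
# GUI's qmaster traffic (as described above) for the real ones.
annotations = {
    "com.example.producer": "Jane Doe",
    "com.example.keywords": "lecture, physics",
}

with open("annotations.plist", "wb") as f:
    plistlib.dump(annotations, f)  # writes a standard XML plist
```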

Attached is a sample plist file with a producer and keyword field. You'll probably need to 'view source' to actually see it in your browser. Merry Christmas.


Posted by at 2:17 PM | Tutorials

December 3, 2008

Upscaling Made Easy

HomeCinemaChoice has a solid article about upscaling, the process of taking a standard definition source and making it HD-ish.

There's a lot of confusing terminology and marketing-speak used to describe the various technologies out there, so it's nice to have it broken down in a straightforward manner.

[Via Engadget]

Posted by at 10:10 AM | Tutorials

Category "News"

October 7, 2008


As promised, I'm posting some sample code for converting from DFXP to SCC. Actually, the SCC generation is contained in a class (captionConvertClass.php) that's independent of the DFXP parsing. To use it, all you do is:

$myClass = new captionConvert(startingTimecode);
$finishedCaptions = $myClass->outputCaptions();

The "startingTimecode" item is somewhat important. Quicktime files can have timecode tracks that don't start at zero, but DFXP captions are always relative to a 00:00:00.00 start time. So, if the timecode track of your movie begins at 01:00:00:00 (as any file coming out of FCP will) but your DFXP file starts at 00:00:00.00, you need to let the converter know so that it adds the right offset.
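The offset arithmetic can be sketched like this (a Python illustration of the idea, not the class's actual code; frame rate and non-drop format are assumptions, using 30 fps for simplicity):

```python
# Sketch of the starting-timecode adjustment: DFXP times are zero-based,
# so add the movie's timecode-track start before emitting SCC times.
FPS = 30  # assumption; real SCC work must handle 29.97/drop-frame properly

def tc_to_frames(tc):
    h, m, s, f = (int(x) for x in tc.replace(";", ":").split(":"))
    return ((h * 60 + m) * 60 + s) * FPS + f

def frames_to_tc(frames):
    f = frames % FPS
    s = frames // FPS
    return "%02d:%02d:%02d:%02d" % (s // 3600, (s // 60) % 60, s % 60, f)

# a caption at 10 seconds into the DFXP, movie timecode starting at 01:00:00:00
print(frames_to_tc(tc_to_frames("00:00:10:00") + tc_to_frames("01:00:00:00")))
# → 01:00:10:00
```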

Otherwise, it's pretty simple and pretty basic. There are lots of things it doesn't do (SCC formatting, proper drop frame handling, foreign characters, etc.). But at least it's a start. If you want to see conversion done right, look at SCC Tools, which is much more feature complete, at the expense of being much more complicated and ... perl.

captionConvert.phps (sample dfxp parsing)

captionConvertClass.phps (plain text -> scc conversion)

Posted by at 2:24 PM | News | Tutorials

October 3, 2008

Captioning with Compressor

There's a lot of misinformation on the net about how to do proper Quicktime captioning using Compressor. Here's the deal.

To start with - Compressor 3.0.4 can caption MPEG-2 files, as well as MOV and M4V files. When it first shipped, it could only deal with MPEG-2, and some sources haven't been updated to reflect the additional formats. That means you can add a proper closed captioning track to a Quicktime movie and have it playable on an iPhone, Apple TV, etc.

Next, the captions must be in the Scenarist Closed Caption (SCC) format. This is a really funky format, as my recent rants have attested. The definitive site on the format is that of the SCC Tools project. There's not a ton of other information out there, as much of the spec is locked up in a design document that'll run you $170.

I believe the commercial MacCaption application will output SCC, as will a handful of other applications. Next week, I'll post some sample code for converting DFXP (flash) captions into SCC.

So, to get started, open Compressor and add your video. Highlight the background space of your imported video and then click the "additional information" tab.


Now, select "choose" at the top and point to your SCC file. Click save at the bottom of the pane.

Now you just need to pick your preset (again, any mpeg-2, mov or m4v preset) and submit the job as per usual.

Posted by at 3:11 PM | Tutorials

September 29, 2008

Build your own Perian

Perian 1.1.1 came out today, with fixes for flash video playback and a number of other things. This gives me a good chance to mention something we do when using Perian with Media Mill.

Because Perian doesn't always play nice with other codecs on your system, I was long hesitant to add it to the codec pack we use for the Media Mill cluster. But, a few months back I realized I could do a custom build of Perian that has just the codecs I want. It's not very hard at all. Here's the deal.

First, get Xcode 3.1 - you'll need to sign up for a free apple developer account.

Next, open up your terminal, move to a directory you want to work in, and type "svn co" which will give you the most recent Perian source.

Picture 2-1

Next, browse to that folder in Finder and double click the xcode project file. Find the FFusionCodec.r file and double click it.

It's a fairly readable file, with sections for each codec. Just comment out or delete the codecs you don't want. In my case, I kept just a few.

Picture 3-1

When you're done, hit the project setting dropdown and switch to deployment, then press the big 'build and go' button. A while later, it'll finish, and you can find the "build" folder in the same place you found the xcode project. In there, find the deployment folder, and in there, find your nice shiny new Perian.component. Drop it in your /Library/QuickTime folder and you're done.

If you want to double check that you've gotten rid of the codecs you don't want, grab a copy of Fiendishthngs and run it before and after installing.

Posted by at 9:43 PM | Tutorials

September 25, 2008

Documentation for Communicators Forum

Later today I'll give a presentation on simple video production to a group on campus. Attached to this post is the handout that I mention in the presentation.

Communicators Handout-1

Posted by at 1:12 PM | Tutorials

Category "Misc"

September 22, 2008

Creating a film at 1000fps

Stealing another post from ProVideoCoalition - they have a 'making of' film about the creation of a short film for Vision Research, the creators of the Phantom high speed camera. I'm a sucker for high speed film, so I had to post it. Take a look at the finished product as well.

Posted by at 9:17 AM | Misc | Tutorials

August 21, 2008

Open Directory for Editing Suites

I'm currently working on switching our edit suites over to an Open Directory authenticated setup, with centralized storage of permissions. The idea is that we want a student to be able to sit down at any Mac editing station and have the dock look the same, Final Cut behave the same, etc. I figured I'd offer a few tips for folks trying to do similar things.

I'm an Open Directory newbie. There's plenty of documentation and training material out there, but often it's overly complicated. For a setup like this, here's a few things I've found helpful.

  1. In workgroup manager, make sure each user has a local home (/Users/username) and a network home (afp://server/sharename/username) and leave the network home highlighted.
  2. Create a group to assign the preferences to, and then make each editor a member of that group
  3. In mobility preferences for the group, be sure that you 'always manage' all of the sync options, even if you don't intend to use background or manual sync
  4. Sync a non-existent folder in each of the important 'home directory' folders. Otherwise, folders like Music, Pictures, etc. won't exist for users on machines other than the one they do their first login on:

Picture 4-4

That's pretty much it. You should be able to leave the OD server in 'basic' mode, and do all of your work within Workgroup Manager, aside from setting up the afp share. Then just add the server within Directory Utility on the clients and you're set. Syncing is very quick and painless.

Posted by at 12:06 PM | Tutorials


July 24, 2008

Flash Video Bitrate Calculator

Adobe has a little web app to help you figure out the right bitrate to use when encoding flash video, either Sorenson, On2 VP6 or H264. I poked around a little bit and it seems right in line with my normal assumptions. Might be handy for folks struggling to get good quality video on the web.

Posted by at 9:43 AM | Misc | Tutorials

July 21, 2008

Get embedded SDI timecode into FCP via AJA

Anyone who's used one of AJA's capture cards knows that, while the AJA control panel shows embedded SDI timecode, Final Cut Pro can't read it, and thus you're stuck using RS422 even when you don't need control. DigitalVideoEditing has a quick little tutorial on using AJA's capture tool to work around that issue. Neato.

Posted by at 3:23 PM | Tutorials

January 3, 2008

Hanging a Plasma or LCD TV on Plaster Lath Walls

So the holidays are over, it's back to blogging for me.

As you might have gathered from the title, this is going to be a slightly different kind of post. However, I recently had to (you guessed it) hang a plasma on plaster lath walls, and found the information on the internet totally lacking. Hopefully Google will help other folks find this post.

 Sith33 1143Sofar Pictures Picture-39

The truth is, there's nothing too special involved. There are only two potential snags. Follow these instructions at your own risk - I'm not a handyman. Heck, I'm barely a man! But I do have a mighty fine drill...

First, finding studs on a plaster lath wall can be a nightmare, as the lath tends to confuse the stud finder. You can narrow it down by doing the 'tap tap tap' test, but I tend to second and third guess myself when I go that route. I have a Zircon stud finder with a 'deep scan' feature that does a pretty good job of finding studs. Unfortunately, it finds lots of other things too. So, assuming your studs are 16 inch on center, measure off an area ~20 inches wide and go back and forth many times, lifting and recalibrating the stud finder each time. You should be able to narrow down an approximate area.

Next, get out a smallish masonry bit and drill a test hole through the plaster. Then switch to a normal drill bit to drill through the lath. You should drill for just a short distance and then hit air, and very likely your drill will crash into the wall and you'll fall off the ladder. If you keep meeting resistance, you found a stud, hoorah! Now, repeat the same step 3/4" to the left and right of that hole. If you still find that you're on the stud, great, your first hole is centered (ish). If one of the new holes didn't hit stud, you know the opposite one is centered.

Now, measure 16 inches from your 'good' hole and drill another hole. If you hit the stud, congrats, you're doing great. If not, it's back to the drawing board. Perhaps you don't have 16 inch on center studs, or perhaps you hit a horizontal beam the first time.

Now, widen out the hole in the plaster with a masonry bit the same diameter as the lag bolts you'll be using with your mount (see the mount instructions). You don't want to be screwing into the plaster itself, only the stud. From here, you should be able to follow the mount instructions for drilling a pilot hole, and then placing your lag bolts.

Be careful not to torque the bolts too tight - listen very carefully for sounds of the plaster cracking. This is the second potential pitfall with plaster - being too aggressive with the screwgun can crush the plaster and weaken the overall mount.

Hopefully, you've now got a solid, secure mount on your wall. Give it some good horizontal and vertical tugs, listening and watching carefully for any movement or signs of instability. Now, it's time to hang your plasma. Woot!

Posted by at 10:09 AM | Tutorials

October 19, 2007

MKV to MOV for big files

For most users, getting a matroska (MKV) file into Quicktime is as simple as installing the Perian codec. It makes things joyously simple. You can even do file->save as to rewrap your video as something else. Great!

It gets problematic when you have a really big (over 2gig) MKV file. QT will just crash! Oh no!

So, here's my workaround. Not for the faint of heart.

Start by getting the current CVS version of MPEG4IP. You'll need SDL and libtoolize to build it.

Also get mkvtoolnix and install that.

Extract the video and audio tracks from the mkv file using mkvextract:

mkvextract tracks <mkv filename> 1:part1.h264 2:part1.ac3

This will need to grind for a while, but eventually you'll have your demuxed tracks.

Next, you need to use mp4creator to wrap the H.264 elementary stream in a proper mp4 box. You'll probably get a warning about an invalid SEI message. Ignore that.

/usr/local/bin/mp4creator -create=part1.h264 -rate=29.97 "My Video.mp4"

Next, we need to add the audio. Unfortunately, mp4creator can't handle ac3 audio. You'll either need to convert the audio to AAC, and then use mp4creator to merge them, or use Quicktime Pro. I prefer the latter - open the ac3 file in QT, select all, copy, then open your mp4 and select add->add to movie.

Now, when you go to save, Quicktime will likely yell at you. You need to mark an in point a second into the video and an out point a second from the end, and then select edit->trim. Then you can do "file->save as" and move on.

What a pain, huh?

Posted by at 12:39 PM | Tutorials

October 1, 2007

iTunes as your only media management tool

I know, I know, I just posted saying that I'm somewhat disenchanted with the iTunes Store (and just wait till I post my rant about iPhone 1.1.1) ... however, iTunes itself is still my digital media jukebox of choice.

With that in mind, and with the impending arrival of an appleTV, I've decided to commit to iTunes as my complete digital media manager. Not just audio - video too.

There are two routes to getting video into iTunes. If you want the video to work seamlessly on iPods, iPhones and AppleTVs, you need to use something like Quicktime Pro or Visual Hub to transcode the video. This is a slow, lossy process and I have no interest in that.

Option two is to wrap the video in a Quicktime wrapper. This doesn't touch the video data at all, just makes it look like a Quicktime file. It'll still have XVid or Windows Media (or whatever) data inside. To do this, open the file in Quicktime, and do file->save as. Make sure to create a new file, not a reference file. The resulting mov can be dropped directly into iTunes. If you add the Perian codec to your appleTV, the files will work with that device as well (I hope - I don't have an Apple TV yet to test this with).

That's great if you just have a few files, but what if you've got hundreds of gigabytes of video? You need automation! Other folks have created Applescripts to do this, but Quicktime 7.2 broke them all. So, I've created one that works with 7.2. Make sure you've got interface scripting turned on in the "universal access" system preference panel. Then just drop a load of videos onto this droplet and let it go to work. - SaveAsMov script

From there, you'll need to properly organize all your videos. Check out Doug's AppleScripts for iTunes page for a load of scripts to make this easier. I particularly like "set video kind" and "track names with incremented number."

Dig it.

Posted by at 10:14 AM | Tutorials

September 4, 2007

Lots of great After Effects tutorials

I don't know why I've never watched any of these, but Creative Cow has a great collection of (primarily) After Effects tutorials available for free. Great quality and great content!

Posted by at 1:35 PM | Tutorials

August 20, 2007

Make XSan not suck

This is one of those posts that I'm doing just in case somebody stumbles upon it in the future while troubleshooting an issue. Since we started the Media Mill project, our XSan install has been relatively flakey. We haven't had any data loss or anything, but it's always taken forever to mount a volume after a boot, and the controllers often lost track of the clients.

It finally bothered me enough recently to dig down and solve the issue. It turns out the biggest problem (and this won't surprise Xsan veterans) was DNS. Even though the machines don't have DNS servers specified, they were still apparently waiting for DNS timeouts on every single Xsan transaction. OSX in general has obscenely long timeout periods for DNS, which was creating a cascade of problems within Xsan.

The solution was to create a hosts file. Just edit /etc/hosts (you'll have to sudo to do this) and add all of the hosts that access xsan, as well as hostnames for them. "man hosts" for an explanation of the formatting.
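For reference, the hosts file format is just an address followed by one or more names per line - a minimal sketch with placeholder addresses and hostnames:

```  mdc1.xsan.example   mdc1  client1.xsan.example client1
```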

The trick is to run "sudo lookupd -flushcache" afterwards, or you won't get the benefits. Then, launch XSan Admin and marvel at how quickly it responds. Marvel too at how quickly hosts mount the SAN volume after boot.

There are a ton of other things you can do to make XSan behave better. Anyone thinking about installing should take a look at, the home of all things Xsan.

Posted by at 12:35 PM | Tutorials

August 16, 2007

Creating vignettes using Color

Digital Video Editing has a nice tutorial about using Color to create vignettes. I've just dipped a toe in Color, but it's a very powerful and exciting tool.

Posted by at 3:15 PM | Tutorials

July 30, 2007

Dynamic Flash metadata injection with PHP

So, between the summer lull in industry news and my own globe hopping, the blog has been pretty sparse lately. I think things will now begin to normalize, and I figured I'd start it off with a bit of info about a project I've been working on.

First, some history. Streaming flash video normally requires an expensive Flash Media Server license. Otherwise, you're limited to progressive download from http. Some very clever folks have, in the past, figured out how to do streaming over http using php. They accomplish this by adding a bunch of metadata in the flv file which cross references keyframe timestamps with byte offsets. So, if you have a keyframe 50 seconds into your video, they'd include where in the physical file data to find that. This allows you to seek in the video file without having to decode the whole flash video stream - you just need to read the metadata and do an fseek to an appropriate offset.
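The keyframe index idea can be sketched simply: map keyframe timestamps to byte offsets, then resolve a seek request to the nearest earlier keyframe. The values below are made up for illustration:

```python
# Toy keyframe index: parallel lists of timestamps (seconds) and byte
# offsets into the FLV file. A seek request snaps back to the last
# keyframe at or before the requested time.
import bisect

keyframe_times = [0.0, 5.0, 10.0, 15.0, 20.0]            # seconds
keyframe_offsets = [400, 51200, 102800, 154000, 205600]  # byte positions

def seek_offset(seconds):
    """Return the byte offset of the last keyframe at or before `seconds`."""
    i = bisect.bisect_right(keyframe_times, seconds) - 1
    return keyframe_offsets[max(i, 0)]

print(seek_offset(12.3))  # → 102800 (the 10-second keyframe)
```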

It gets slightly more complex, in that you also need to rebuild a proper flash header so that the file still has the right format, but for the most part that's not too difficult either.

There are a number of projects out there to make this pretty easy. Start by taking a look at the phpstream project. These solutions all require you to do the metadata injection with a separate tool, either the closed-source and Windows-only flvmdi, or the open-source, ruby-based flvtool2. Neither of these is optimal if you're looking to either do injection dynamically, or integrate injection into an existing automation workflow.

Luckily, there is another project, somewhat neglected, called flv4php. It implements many of the necessary FLV-parsing routines in php, allowing you to build your metadata array directly within php and write it back to the FLV file, or store it separately. You can even do this at runtime, if you're so inclined. I'd recommend against that particular approach if you're dealing with long flash videos, as there is a significant amount of processing overhead involved.

If you browse around the source, you'll find a php4 and php5 version of flv4php. The php4 version has many more features, but the php5 version has sample code for implementing metadata injection. Take a look at the test2.php code to get started. However, for long files, replace the line that says

$ts = number_format($tag->timestamp/1000, 3);

with

$ts = $tag->timestamp/1000;

to prevent php from adding commas to your timestamps and thus breaking flash.

The test2.php code writes out a .meta file, containing the metadata for your video. That leaves the issue of how to read that data back in and inject it appropriately. That code is below.

To actually view streaming FLVs, you can use JeroenWijering's free flv player. Take a look at the source if you're curious how it all works on the client side.

Anyways, here's how to play back a .meta file in conjunction with an FLV file. $streamPos is the offset within the video that we're seeking to (it comes from Flash as a byte offset already). A bunch of this was stolen from the php4 tree of flv4php.

$fp = fopen( $targetFile, 'rb' ); // binary mode matters on some platforms
header('Content-Type: flv-application/octet-stream');
header('Content-Disposition: attachment; filename="' . $fakeName . '"');
header("Pragma: public");
header("Cache-Control: must-revalidate, post-check=0, pre-check=0");

fseek($fp, 0);

$hdr = fread( $fp, $FLV_HEADER_SIZE );
fseek($fp, 0);
$bodyOfs = (ord($hdr[5]) << 24) + (ord($hdr[6]) << 16) + (ord($hdr[7]) << 8) + (ord($hdr[8]));

echo fread($fp, $bodyOfs + 4);

$metadataFile = $targetFile . ".meta";
$metad= file_get_contents($metadataFile);
echo $metad;
$chunkSize = 4096;
$skippedOrigMeta = empty($origMetaSize);
if ($streamPos == 0) {
    // no seek requested - we're already positioned just past the header
} else {
    fseek($fp, $streamPos);
}

while (! feof($fp)) {
    // if the original metadata is present and not yet skipped...
    if (! $skippedOrigMeta) {
        $pos = ftell($fp);
        // check if we are going to output it in this loop step
        if ($pos <= $origMetaOfs && $pos + $chunkSize > $origMetaOfs) {
            // output the bytes just before the original metadata tag
            if ($origMetaOfs - $pos > 0) {
                echo fread($fp, $origMetaOfs - $pos);
            }
            // position the file pointer just after the metadata tag
            fseek($fp, $origMetaOfs + $origMetaSize);
            $skippedOrigMeta = true;
        }
    }
    echo fread($fp, $chunkSize);
}
fclose($fp);
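As a sanity check on the header math, the $bodyOfs calculation above just reads a big-endian 32-bit integer from bytes 5-8 of the FLV header (the DataOffset field). Here's the same computation sketched in Python, on a hand-built minimal header:

```python
# Decode the DataOffset field of an FLV header: "FLV", version 1,
# flags 0x05 (audio+video), then a big-endian uint32 DataOffset of 9.
import struct

header = b"FLV\x01\x05\x00\x00\x00\x09"  # minimal 9-byte FLV header
body_ofs = struct.unpack(">I", header[5:9])[0]
print(body_ofs)  # → 9
```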

Posted by at 9:12 AM | Tutorials

June 11, 2007

Using a V1U in Final Cut Pro

I haven't had a chance to use a Sony V1U camera yet, but I'm excited to get a chance. Digital Content Producer has a nice article up on using the camera with Final Cut Pro. This is only important if you want to edit a native 24p timeline (which you probably do).

Check out their review of the V1U as well for more insight. (I think I've linked to it before?)

Posted by at 11:05 AM | Tutorials

March 12, 2007

Deinterlacing for Fun and Profit

Deinterlacing is one of those things that often gets overlooked in video production. When you're delivering video on the web, it's really important to deinterlace it, or you'll see all kinds of interlacing ugliness when your video is displayed on an (inherently progressive scan) computer monitor.

There are a load of ways to deinterlace an image, ranging from the most basic - just throwing away every other field - to crazy complex - optical flow analysis with motion adaptive interpolation. As you might guess, these range in processing-costs from essentially free to very expensive.

Seeing as my email is down at the moment, and seeing as it's spring break, I decided to spend some time playing with the various deinterlacing options available within Compressor. I mention Compressor specifically because all of the Media Mill presets make use of the basic "deinterlace - blur" filter to force all video to progressive. When the "100% frame size" bug is fixed, I'll be adding some straight progressive presets, without any deinterlacing.

Deinterlacing is most important when you're not doing any rescaling of the frame - when you're shrinking a 1080i frame down to 320x180, it tends to all get blurred away anyways.

Here's a 1080i60 frame, after being run through an H264 encode with no deinterlacing.

Greenscreen Tutorial-Original

It looks pretty nice, but notice the interlacing blur in Mike's hand. This is the sort of thing that can get really ugly on a panning shot. It's a bit harder to see in the web jpeg because the images get shrunk slightly. Here's a 100% crop:


Next, the same frame but with a "Deinterlace - Blur" applied. I'm using blur because it's the nicest of the "cheap" deinterlace filters available within Compressor.


Well, his hand looks better, but look at the tripod legs and Rebecca's dress. They're terrible! This is really bad artifacting, and it really bothers me because it looks so unnatural.

What happens when we throw just a little bit more (20%) cpu time at the problem? Here's a frame that has the "Frame Control" deinterlace applied, with the "fastest" option selected, line averaging.


Wow, much better! The diagonal elements look good, and his hand has a proper motion blur, like it should.

One tip - when you're using the frame control deinterlace within Compressor, be sure to also set the "output fields" dropdown to "progressive," or the deinterlace won't have any effect at all.

If we throw even more CPU time at the problem, you get another ~10% improvement in vertical resolution, but it's really not worth the effort.

So, the verdict? I've been doing it all wrong. Starting today, the highest quality Media Mill presets will use frame controls for deinterlacing. This means the 1080p, 720p (just added!) and "Very Large" 480p presets. This only affects the Quicktime presets for now, as I need to do more testing with WMV.


Posted by at 1:00 PM | Tutorials

February 19, 2007

First episode of Gear Media Tech

I missed this a couple weeks ago, but the first episode of Gear Media Tech is out. GMT is a new video podcast from Leo and the Pixel Corps. The first episode is about mixers. It's worth a watch, but don't assume that everything they say is gospel. Particularly the whole latency / balanced thing, the phantom stuff and the EQ stuff. Meh.

Posted by at 11:04 AM | Tutorials

February 7, 2007

Video Training Podcast

I'm super excited about this - a few of the student staff at the studios are working to produce a series of video tutorials. We've just put the first tutorial online - a basic overview of a lighting instrument. We've got a large list and are hoping to start churning them out routinely.

The podcast will be available from the iTunes Store in a day or so, but for now, you can subscribe to them by clicking below.

Podcast Link (small videos)
Podcast Link (large videos)

Posted by at 4:02 PM | Tutorials

January 31, 2007

Scopes? Huh?

CreativeMac has a nice little tutorial video about using scopes within Final Cut Pro. It's a nice introduction to what the different scopes do and why you should use them. Take a look, then upgrade to some real scopes ...

(via FresHDV)

Posted by at 9:59 AM | Tutorials


January 29, 2007

Color voodoo

The process by which real life turns into digital video is, by all accounts, pretty arcane. Not necessarily because of bad design decisions (though there are plenty of those) but just because real life is very complex, and our tiny little brains haven't figured out great ways to capture it.

In any case, color sampling and color spaces are an area that trips up many folks in this field. You've got analog ranges, digital ranges, 8bit, 10bit, 12bit, and then you've got color profiles on the computer to contend with. has a two part article (part 1, part 2) dealing with color spaces in video. Take a look if you're at all interested in understanding why your image changes colors when you move it from Photoshop to Final Cut, or from your Mac to a PC. Keep in mind, though, that all of this happens separately from the color subsampling step. The detail lost by going to 4:2:2 or 4:2:0/4:1:1 is independent of the latitude lost due to color space conversion.
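To put rough numbers on that last point, here's a quick sketch of the sample counts behind the common subsampling schemes (8-bit samples, ignoring color space conversion entirely):

```python
# Average bytes per pixel at 8 bits: every pixel keeps its luma sample;
# chroma counts are Cb+Cr samples per 4-pixel group (a 2x2 block for
# 4:2:0, a horizontal run of 4 for 4:1:1 - the average is the same).
def bytes_per_pixel(scheme):
    luma = 4
    chroma = {"4:4:4": 8, "4:2:2": 4, "4:2:0": 2, "4:1:1": 2}[scheme]
    return (luma + chroma) / 4.0

for s in ("4:4:4", "4:2:2", "4:2:0", "4:1:1"):
    print(s, bytes_per_pixel(s))  # → 3.0, 2.0, 1.5, 1.5
```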

Posted by at 11:23 AM | Misc | Tutorials

December 4, 2006

Large Scale P2 Production

Check out this article from regarding large scale P2 production. One of the challenges with P2-based cameras (like the Panasonic HVX-200) is that you can't feasibly maintain your footage on the original media. Even on a major production, burning through $30,000 in P2 cards every day isn't a realistic option. So, you've got to come up with a workflow to offload that content as it's being shot. The article goes into great detail about the process being used on the production of a significant TV pilot.

Posted by at 1:00 PM | Tutorials

November 17, 2006

BMW "Precipice" Ad overview

StudioDaily has an interview with Ben Grossman from the Syndicate, discussing the creation of the BMW "Precipice" advertisement. It's a pretty cool bit of CG, as they had to shoot the commercial during the day, in the dry, but present a commercial which showed the car at night in the rain. Lots of rotoscoping and particle effects ensued, and the end result is pretty impressive.

Posted by at 1:07 PM | Tutorials

October 25, 2006

Greenscreen Fun!

Everyone else is linking to it, so I might as well too...

General Specialist has a guide to getting the most out of your greenscreen. There are some excellent tips there, such as how to configure your camera to make keying easier, and how to direct your talent to get the best performance on set. It's actually a really good piece, so check it out.

Posted by at 1:54 PM | Tutorials

October 3, 2006

Howto: Stream Video with Flash Media Server

As of late, I've been working on a project that involves streaming flash video with Flash Media Server. I've been a bit frustrated by the process, and wanted to post some tips I've picked up along the way. Follow the jump if you care ...

First off, if you just need to stream FLV files off an FMS, take a look at the Livedoc entry for this topic. If you follow those steps directly, you should be able to stream video. Just launch Flash8, go file->import->video, enter your rtmp:// address and you're set.

If you screw something up with the main.asc file, or forget to include it, you'll need to restart the server after fixing the problem. What you'll quickly learn as you work on further main.asc development is that server restarts become habitual. This is a bit more trouble if you're working on a server that hosts live applications, but I suppose you shouldn't be doing development work on such a server.

If you want to go a step further and start implementing security on the server - that is, real security, not just security through complexity - you have two options. Which are actually only one option. Dig?

You see, when you read the documentation about FMS, you'll find out about something called the "Access DLL," which is represented by a file called libconnect.dll. This is supposed to be a wondrous API through which you can write C code to control access to your content.

The problem is, it doesn't exist. At least, not in any way that I've discovered. You see, there's supposed to be an SDK and sample code to aid one in developing such a module. The documentation makes repeated mentions of this sample code. But nobody seems to have ever seen said sample code. Adobe phone support was helpful enough to suggest that I purchase a $10,000 support contract in order to ask whether such an SDK actually exists. Fantastic. A bit out of my budget though.

However, main.asc, the file that you need in place in order to stream your content, gives you a pretty good level of access control. They're even nice enough to document it:

//Add security here


You can do a few things to enhance your security. You can check the client.referrer property to make sure that the SWF that's connecting to you actually came from a valid host. If you want to get really sneaky though, add a query string to the SWF file when you load it.

Anything you pass in on the query string to the swf will be visible to the Flash server inside your main.asc actionscript. That means that you can pass information from the http server to the Flash server dynamically.

If you don't use the FLVPlayback builtin in Flash8, you have a lot more options for client->server interaction. You don't need the hacky query string solution; you can actually pass objects back and forth. However, FLVPlayback is supremely nice, so it's a shame not to use it.

Anyways, my hatred of Flash is slowly waning as I grow more comfortable with the tools. I'm far from liking it, but I'm closer to tolerating it.

Posted by at 9:08 PM | Tutorials

Category "Tutorials"

April 3, 2006

Reverse Engineering a Synergy

The Ross Synergy MD and MD-X lines of production switchers have a pretty cool ability to load images over the network and play them out through what Ross calls a "global store." This is a great feature for image playout, as it doesn't require extra equipment, and it gives you three scriptable channels of images. With extra software upgrades, you can also play animations this way.

There are a few downsides, from my point of view. The Synergy requires specially formatted TGA images. Furthermore, the included software for moving images from a computer to the Synergy only works on Windows, and cannot reformat images internally. With that in mind, I decided to write my own interface software. Wanna know how? Click the link.

The first issue to deal with is that Ross had little interest in providing me with any protocol documentation or source code. Not surprising really, but it meant I needed to do a bit of investigation. Luckily for me, the Synergy is built on an open platform (Linux) and uses industry-standard protocols for the transfer. Here's an Ethereal dump of an upload. I was already logged in when I initiated the transfer, but then logged in again at the end.

Picture 1

So, we can see a few things happening. The image is appended with _0000 and then uploaded over FTP. Then an XML file is sent. Next, we jump over to HTTP, call a couple of URLs, and then download an XML file. You'll also see the username and password information at the end of the dump.

That's all pretty simple. The only stumbling block is the XML file that gets uploaded. Luckily, it's easy to decipher as well, if we change our filter a bit to look at the ftp-data traffic.

Picture 2

Again, nothing exotic - just a description of a file, with lots of extra settings that don't concern us at this point.

So, now we know how to post the image to the Synergy, and how to make the Synergy aware of the new files (those http calls). Next we need to get the images into a format that the Synergy understands.
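Putting the capture together, the whole transaction can be sketched in a few lines of Python. The host name, refresh URL, and credential handling below are placeholders inferred from the dump, not documented Ross API:

```python
import ftplib
import urllib.request

# Placeholder host - the real switcher's address goes here.
SYNERGY_HOST = "synergy.example.com"

def synergy_remote_name(name):
    """The capture shows uploads renamed with an _0000 suffix."""
    return name + "_0000.tga"

def upload_still(local_path, name, user, password):
    """Mimic the transaction seen in the Ethereal dump: FTP the image,
    then poke the switcher over HTTP so it notices the new file."""
    ftp = ftplib.FTP(SYNERGY_HOST, user, password)
    with open(local_path, "rb") as f:
        ftp.storbinary("STOR " + synergy_remote_name(name), f)
    # The small XML descriptor seen in the ftp-data stream would be
    # sent the same way (its fields are omitted here).
    ftp.quit()
    # Then a couple of HTTP calls tell the switcher to rescan its
    # store - "/refresh" is a stand-in for the URLs in the capture.
    urllib.request.urlopen("http://%s/refresh" % SYNERGY_HOST).read()
```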

The TGA files need to be 24-bit or 32-bit (with an alpha mask). If you're running your Synergy in SD, you should also take the pixel aspect ratio into account. RLE compression is OK. Also, for whatever reason, TGAs generated by The Gimp won't work.

One unfortunate thing I've discovered is that if you ask the QuickTime API to do your conversion to TGA for you, you'll get different results depending on the platform. On Windows XP, you'll get 24-bit TGAs, whereas on OSX you'll get 32-bit TGAs with an empty alpha channel. Not the end of the world, but a bit of a bummer.
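If you'd rather sidestep The Gimp and QuickTime's platform quirks entirely, the uncompressed 24-bit flavor is simple enough to write by hand. A minimal Python sketch (no RLE, no resizing - just the header and BGR pixel data):

```python
import struct

def write_tga24(path, width, height, pixels):
    """Write an uncompressed 24-bit TGA.
    pixels: list of (r, g, b) tuples, row by row, top to bottom."""
    header = struct.pack(
        "<BBBHHBHHHHBB",
        0,        # no image ID field
        0,        # no color map
        2,        # image type 2: uncompressed truecolor
        0, 0, 0,  # color map spec (unused)
        0, 0,     # x, y origin
        width, height,
        24,       # bits per pixel
        0x20,     # descriptor: top-left origin
    )
    with open(path, "wb") as f:
        f.write(header)
        for r, g, b in pixels:
            f.write(bytes((b, g, r)))  # TGA stores pixels as BGR
```

Whether this particular flavor keeps a given Synergy happy is something you'd want to verify against files the official Image Mover software produces.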

So, throw together a bit of Targa resizing and FTP/HTTP code, and you've got yourself a really nice replacement for Image Mover. I call mine Synergizer, and will post it in a few days, once I'm relatively sure it won't destroy your $100,000 switcher ...

Posted by at 9:46 AM | Tutorials

Category "Tutorials"

March 23, 2006

Getting video framerate through the Quicktime API

This is a topic in which pitfalls abound, but I think I've just about sorted out the various ways to get an accurate read on a video file's framerate. I'm using REALbasic for this, but the calls should be pretty generic in any language that talks to QuickTime.

If you poke around a bit, you'll find that the Movie.Timescale property is "sometimes" the number you're looking for. In particular, video files generated by Apple applications (FCP, Compressor) seem to have their timescale set to the framerate. However, many other files will have their timescale set to the sample rate of the audio track, or to something else entirely.

If you can use the timescale, go for it. It's much faster than the alternative. Note that it may come in as an integer - so 2997 instead of 29.97.

What I'm doing is testing whether the timescale number makes reasonable sense in the context of video. I figure I'm not likely to have a video with a framerate higher than 60fps, or lower than 10fps. If the timescale result falls somewhere outside those bounds, I fallback to an alternative method.

By using the GetNextInterestingTime QuickTime call - or, in REALbasic, NextInterestingVideoTimeMBS (requires the MonkeyBread plugin) - you can measure the time between frames. NextInterestingVideoTimeMBS will return the current time in the movie and the duration between two "interesting times" (which usually equate to frames). The number you're most interested in is the duration. By calling it a few times, you can check that the duration of each frame is the same.

Once you've got your duration number, you can divide your Timescale by the duration, and get the number of frames per second. You can even reset the Timescale to your new value (multiplied by 100) and then use the TimeDuration call to get the length of the movie in seconds.
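In code, the whole heuristic boils down to something like this Python sketch (the function and the 10-60 fps bounds are mine, and the duration values stand in for what GetNextInterestingTime would hand you):

```python
def framerate(timescale, frame_durations):
    """Guess a movie's fps from its QuickTime timescale, falling back
    to measured frame durations when the timescale is clearly
    something else, like an audio sample rate."""
    # Timescales like 2997 are fps * 100, so try both readings.
    for scale in (timescale, timescale / 100.0):
        if 10 <= scale <= 60:
            return scale
    # Fall back: if every sampled frame has the same duration,
    # fps is just timescale divided by that duration.
    if frame_durations and len(set(frame_durations)) == 1:
        return timescale / float(frame_durations[0])
    return None  # couldn't make a sensible guess
```

So a movie with timescale 44100 (audio sample rate) and a measured per-frame duration of 1470 units comes out as 30 fps.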

Seems to work pretty reliably on the videos I've thrown at it. Anybody have better methods?

Posted by at 2:19 PM | Tutorials

Category "Tutorials"

March 14, 2006

Creating video framegrabs from the command line

So, I've been hacking around with different methods for getting stills from video files via the command line. This is part of a larger project I'm working on, but I figured I'd document what I've come up with to hopefully make things easier for folks in the future.

Click the link for more...

So, first off, if you're not working on a Mac OSX / OSX Server platform, your options are pretty limited. You're pretty much guaranteed to end up using something related to ffmpeg, unless you go for commercial software.

In my case, this needs to happen via some interaction with a php script, so the easiest approach is to use the ffmpeg-php extension. This is actually a very cool extension, though it's rather fiddly to get installed right. You can get all sorts of information about a video, and extract whatever information you need.

A few downsides present themselves though. First off, ffmpeg is what we might fairly call "patent encumbered." Because nobody is paying the MPEG-LA for use of the mpeg codecs, ffmpeg is in violation of any number of patents. Now, you're fairly unlikely to be personally harmed for using it, but it's important to be aware of the legality.

Next, and more pressing, ffmpeg is terrible for doing frame grabs. You get lots of control, but it gets slower as a linear function of how far into the video you need to seek. It almost seems like it decodes every frame of the video up to and including your target frame. On my 2GHz G5, it can grab the first frame of a video instantly, but grabbing a frame from 10 minutes in can take up to 60 seconds. That's unacceptable. I need to poke in the code a bit more to get a better sense of why that's happening - it happens with both the CVS and stable branches, no matter how I compile it.
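For what it's worth, how you ask ffmpeg to seek matters: passing -ss before -i requests input-side seeking, which can be much faster than decoding up to the target, though how well that works varies by ffmpeg version and codec. A Python sketch of building such a command (the function name is mine):

```python
def grab_frame_cmd(movie, seconds, out_image, ffmpeg="ffmpeg"):
    """Build an ffmpeg frame-grab command. Putting -ss before -i asks
    ffmpeg to seek the input (fast, keyframe-based) rather than
    decode every frame up to the target."""
    return [
        ffmpeg,
        "-ss", str(seconds),   # seek before opening the input
        "-i", movie,
        "-vframes", "1",       # grab a single frame
        "-y", out_image,       # overwrite the output if it exists
    ]
```

Hand the resulting list to subprocess.run(cmd, check=True) to actually run the grab.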

As best as I can tell, if you're on a non-Mac platform, that's really the only free way to do it. So let's move on to the Mac options.

First, you can use AppleScript (via osascript) to make QuickTime do it for you. You'll need QT Pro. This may be a good option for very low-volume uses where you can monitor the display. Unfortunately, my experience with AppleScript and QuickTime is that things often go awry. Further, you're limited to performing one operation at a time, which is tough when you're dealing with web access. You'd need to create a backend to do batch processing at intervals - no fun!

Next, you can use the QT_Tools software. This is a very lightweight set of applications which make calls into the Quicktime API. If you're looking to learn the Quicktime API, these are actually pretty helpful too. The source is available, but developer support is limited.

Semi-related, you should take a look at the movtoy4m software. It also uses the Quicktime API via the command line to do simple video processing. It's pretty helpful for pulling basic information. It is essentially abandoned at this point.

So that's where I'm at. I think for my solution, I'll end up using the QT_Tools route. I'm somewhat tempted to learn how to write a PHP extension and just code my own PHP-QuickTime extension. I get the feeling though that this wouldn't be nearly as easy as it seems. Perhaps I'll just make some very stripped-down binaries I can call from within PHP.

Anyone have a fantastic option that I haven't thought of?

Posted by at 6:02 PM | Tutorials

Category "Tutorials"

March 10, 2006

Fantastic HDV Articles

Take a look at these two fantastic articles about using HDV on the set of "24." Really well written stuff...

Posted by at 10:59 AM | Tutorials

Category "Tutorials"

March 6, 2006

24 frames of sadness

I find it absolutely astounding that support for 24p/24f HDV video post production is so poor. Final Cut can't do it right, Avid can't do it right, Premiere can't do it right.

I understand that something like Final Cut can't be "turned on a dime" so to speak, but seeing as 24p was the buzzword to have in a product over the last few years, I can't understand why support has been so slow in coming. As best I can tell, neither Canon nor JVC does anything particularly unpleasant when they write the signal to tape.

I wanted to take a moment to quickly run through the different options for cutting 24p/24f on a Mac, to try and sum up why they're all terrible. Follow the jump for more fun and depression!


First up, you can try capturing your video with log and capture. This will fail. But at least you can say you tried. Final Cut doesn't understand the 24fps timebase in any HDV mode.

Next, you can buy a product like Lumiere HD or HDVxDV. Lumiere HD will definitely work with the JVC, and rumors tell of support for the Canon. You are recompressing the video to a new format (most likely DVCProHD, though you can use other QT codecs), but otherwise it's pretty straightforward. Lumiere HD also gives you the option of going back to tape. It is, however, $180.

HDVxDV is $100 cheaper, and definitely supports the Canon format, but does not seem to be as robust a product as Lumiere HD. Many folks who are using it are running into showstopper bugs.

The next option is decidedly free-er but a lot more work. First, you use DVHSCap from the Apple FireWire SDK to ingest the video to M2T files. Then, use MPEG Streamclip (a fantastic program) to transcode the video into DVCProHD 1080p24. Then (sometimes at least) use Cinema Tools to conform the file to a 24 or 23.98 timebase. Then modify an FCP sequence to be 1080p24 DVCProHD, drop the file in, and hope it all works. Usually it won't.

Some folks are also hinting that you can use the HDV Apple Intermediate Codec preset in Final Cut Pro. While this will capture the video successfully, you'll be working in a 29.97 timebase. You can use Cinema Tools to reconform your video so that the timing is right, but I still haven't figured out a clean way to make the audio match.

Truth be told, all these options are terrible. Recompressing is never a good thing, and going to DVCProHD creates the potential for some pixel-aspect issues as well.

Please, Apple/Avid/Adobe... we beg you. Support these HDV modes!

(the graphic is for the folks who said I needed more... graphics)

Posted by at 10:15 AM | Tutorials

Category "Tutorials"

March 1, 2006

Slow Motion Video

There's been a lot of talk on the net lately about the various pros and cons of overcranking with the HVX-200 versus deinterlacing and undercranking the XL-H1. Here's my take on this issue.

One of the coolest features about the HVX-200 is the ability to vary the framerate. You can undercrank down to something like 4fps, or overcrank up to 60fps. This allows for really nice looking slow motion when the 60fps video is played back at 24fps. I always thought that was one of the coolest features of the Varicam, and I was a bit surprised to see it show up in the HVX-200. If you know you're going to be doing a lot of slow motion, that feature is probably a deciding factor in favor of the Panasonic.
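The math behind the overcranking trick is simple: the playback speed is just the timeline rate divided by the shooting rate. A trivial helper (the function is mine, purely for illustration):

```python
def playback_speed(shoot_fps, timeline_fps):
    """Fraction of real time at which overcranked (or undercranked)
    footage plays back. 60 fps laid into a 24 fps timeline runs at
    40% speed; 4 fps footage runs at 6x."""
    return timeline_fps / float(shoot_fps)
```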

Recently, a few folks on the DVInfo forums have been playing with ways to get similar results from the Canon XL-H1. Here's the gist:

Shoot 1080i60
Deinterlace to 1080p60 - you have to transcode with the DVCProHD 720p60 preset in Compressor, using the fancy frame controls
Conform the result to a 24fps timebase using Cinema Tools
Drop the result into a DVCProHD 1080p24 timeline in FCP

Since the Panasonic doesn't have all that much vertical resolution (remember?) one could argue that this method can produce images that are on par with the native progressive images from the Panasonic.

It all seems like a bit of a dog's breakfast though. It's not practical for a real workflow - the time spent in Compressor alone will take ages. Moreover, I've been doing some side-by-side tests and I'm just not convinced that the results look any better than making a speed adjustment to the original HDV clip in Final Cut Pro. Perhaps I'm doing something wrong... more experimentation is required ...

Posted by at 10:40 AM | Tutorials

Category "Tutorials"

January 11, 2006

What's 24p?

The new generation of HDV cameras has started a big debate regarding the definition of 24p. Here's the deal.

First off, what does 24p mean? 24 frames per second (like a motion picture), progressive frames. This means, unlike normal NTSC video which samples half a frame (called a field, consisting of every other line of a frame) every 60th of a second, a whole frame is sampled every 24th of a second. The idea is to get a look on video that approximates the traditional cinematic feel.

So, what's the right way to do it? In an ideal world, we'd have a CCD or CMOS chip in the camera capable of sampling a whole frame at a time, recording to a tape format that recorded 24 individual frames each second. It's not an ideal world though, is it?

In the prosumer space, the first camera to make big waves in this area was the Panasonic DVX-100. It owes much of its success to the fact that it actually did the 24p thing right. It used progressive CCDs so that it could capture 24 individual frames. It then gave you the option of two different well-documented pulldown patterns for putting those 24 frames into the 60 fields of NTSC video. It was straightforward, it was reversible, and it looked darn good.
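That cadence is easy to picture in code. A quick Python sketch of the classic 2:3 pulldown, one of the well-documented patterns (this is my own illustration, not anything from the camera):

```python
def pulldown_2_3(frames):
    """2:3 pulldown: each group of four film frames (A, B, C, D)
    becomes ten video fields (AA BBB CC DDD), which is how 24
    frames/sec fit into NTSC's 60 fields/sec. The DVX-100's other
    pattern, 2:3:3:2, keeps whole frames together so it's easier
    to reverse in post."""
    cadence = [2, 3, 2, 3]
    fields = []
    for i, frame in enumerate(frames):
        fields.extend([frame] * cadence[i % 4])
    return fields
```

Because the pattern is regular, an editing system can simply discard the duplicated fields to recover the original 24 progressive frames - that's what "reversible" means here.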

What the hell went wrong?

HDV has introduced a number of new problems. First off, the highest resolution within mainstream HD (the stuff that gets broadcast) is 1080i, an interlaced format running at 60 fields per second, just like standard definition. The first round of real HDV cameras, the Sony FX1 and Z1U, were optimized for this resolution. Due to this, their CCDs were designed for interlaced recording. Sensing that there was a demand for 24p recording, Sony added a "CineFrame" setting, which attempts to give a "film look" by deinterlacing the footage and tossing out some fields to get a 24 frame per second rate. For reasons I don't fully understand though, Sony chose a very odd cadence, or timing pattern. This gives motion shot in the CineFrame mode a very unrealistic look. The footage then goes through a pulldown to fit it into a traditional 60i form, but because of the odd cadence, this isn't easily reversible. Getting a true 24p timeline from footage shot on the Sony would be quite a bit of work. Frankly, it's unacceptable to anyone but the soccer-mom videographer.

Hoping to capitalize on the frustration the Sony settings caused, JVC released the GY-HD100 at NAB last year. You can learn more about my feelings towards this camera in a previous post, but needless to say, it doesn't thrill me. However, if all you need is progressive video at 24fps, and you can deal with a camera that's using a bit of an odd spec, it might be the camera for you. Unfortunately, the best you're going to do is 720p24, not a full 1080p24. This is a pretty significant resolution drop.

The Panasonic HVX-200 attempts to pick up where the DVX-100 left off. You get very similar options for recording 24p, and because it's using DVCProHD on a p2 card, you can get native 24p files. The only downside is that the Panasonic CCDs may not be quite as high resolution as one might hope. Judging from the footage coming off of it though, it's still a beautiful camera.

Canon goes a route similar to Sony with their XL-H1, but with some key differences. They're still using a 1080i CCD, so you won't get native progressive frames. However, when doing 24p recording, the Canon clocks the CCDs at 48Hz, so that they're capturing 48 distinct moments in time. Then, using some fancy deinterlacing algorithms, it blends fields to make 24 discrete frames. While this does cost some clarity and resolution, you at least avoid the cadence issues of the Sony. Canon also uses what they call a "24f" recording format, which puts 24fps on tape instead of using a pulldown to get to 60i. Unfortunately, this means they've introduced yet another format that won't work with other HDV devices. Fun!

So, if you want to shoot absolute, "true" 24p, you need an HD100 or HVX200. If you want to shoot "nearly true" 24p, get the Canon. And if you want to shoot "not really 24p at all", get the Sony. At this point, if you need to edit your video in Final Cut Pro, you're pretty much stuck with the Sony or the Panasonic.

I'm sure it's all perfectly clear now, right? Ask questions!

Posted by at 4:39 PM | Tutorials