Overdubbing – Recording Voice-overs

Post-Production | By: indie

Sometimes, despite your best efforts to create the optimal environment for filming each time you make a video, you’ll discover unforeseen problems. These issues can crop up just as often in your audio recordings as they do in your video. In fact, even in a completely isolated area that is free from noise, you may discover anomalies in your sound recordings while reviewing your footage.

Whether you’ve recorded audio track(s) together with the video or using an entirely separate system, it’s impossible to account for every error you’ll encounter. Things such as white noise produced by fans or air rushing through ventilation ducts can go unnoticed while you’re in the middle of a set, only to show up in your production audio later on. Interference from electronic devices is often inaudible on set, but can still disturb your camera’s recording. Wind noise from an outdoor shoot can make its way into even protected microphones, causing drop-outs and annoying crackles and pops.

Thus, you may need to resort to a process called overdubbing, which involves inserting sounds during post-production to replace the corrupted or unintentionally altered sounds captured on set. When you overdub an actor’s dialogue, the recorded result is called a voice-over. The term is also sometimes used to describe narration from someone ‘outside’ the scene, but in this case we’re speaking specifically about voice overdubbing of an actor who appears physically within the scene.

Techniques For Recording Voice-overs

I approach recording vocal overdubs for a film or video in almost the same way I would go about grabbing vocals from a singer. The main difference is in the pacing and sensory input you’re providing to your actor; after all, the most favorable conditions for overdubs would involve the actor being in the same mood or ‘state’ that he/she was in when the original scene was shot. My recording setup for overdubbing is described below.

Recording Studio

I have two separate rooms – a control room and a ‘sound booth.’ The sound booth is really just any space that can be closed off with a door. In my old apartment, it was a walk-in closet. Now I use a spare bedroom. The control room is where my DAW is located, and thus where I am located during the recording.

Microphones

I place my studio vocal mic in the sound booth on a stand to match the actor’s height as closely as possible. Use whatever microphone you have available to you – even a cheap lavalier mic from Radio Shack can work. I then have another vocal mic in the control room with me; this is called a ‘talk-back’ mic, and it allows me to communicate with my actor without having to yell through two closed doors and a hallway and interrupt the flow of the session (or the quality of the recording, for that matter).

Other Equipment

The actor and I both wear “earmuff-style” headphones, the type with full bass response that covers and makes a seal around your ears. These headphones are both connected to the same audio source, and I use a splitter to send signal to both at the same time.

Placement

Actors should be standing at all times, even if they are overdubbing a scene in which they were sitting, slouching, or even lying in bed. This promotes anatomically correct use of the diaphragm and allows the actor to employ a larger range of movements to enhance their performance.

A windscreen placed between the mic and the actor can help cut down on unwanted pops and clicks from some of the sharper hits, such as “p”, “t”, and “s” sounds. Whether using a windscreen or not, it may sometimes help to place the actor at an off-angle to the microphone to prevent direct airflow onto the mic while speaking. As a quick test, have the actor start talking and place your hand in front of their mouth, cupped slightly but not touching their face. Over a few seconds you will start to feel whether they tend to project air in certain directions more than others in their normal way of speaking. Use this to determine their angle and placement in relation to the microphone.

Sensory Input

In order to ensure the most accurate performance, the actor should at least be able to hear him or herself during the scene, and ideally see the edited footage as well. I find that it’s easy enough to just let the actor listen to each line one at a time, without providing a video feed from a secondary monitor and a long cable connected to my workstation at the other end. As long as you can let the actor hear what they’re saying and use your talk-back mic to coach them into state, you should be able to get a decent performance that is similar in pitch, speed, and tone to the original.

Pacing

This varies from actor to actor. Some actors prefer to have repeated playback in their headphones so they can hear the line over and over and practice speaking it again directly over top of it. This way they are sure to get a result that rhythmically matches the original. Other actors just like hearing the line a few times and then repeating it on their own, without any background noise to distract them from their current performance. Work with your actor to figure out what makes them most comfortable.

Example

Hopefully coming soon is a video example of a scene’s original recording, followed by the same scene using vocal overdubs recorded in-studio. You’ll be able to hear that the dialogue stands out much more, and that overdubbing gives you greater control over your background noise and sound effects.

Non-Linear Editing Tutorial

Post-Production | By: indie

In this free video editing tutorial, I’m going to explain how to use a non-linear video editing system in general terms, covering the basic functionality so you’ll have an understanding of what you need to know no matter which software you choose. This is the kind of stuff I paid the big bucks to learn during my studies at university.

When you create a new project in your video editing software program, you’re probably going to have several options as to the size, quality and frame rate of your project. For now, just choose whatever is the default or standard setting. In North America, DVD quality video is 720×480 pixels of resolution (meaning it’s 720 pixels wide by 480 pixels high) at 29.97 frames per second. These settings are what is called the NTSC video standard.
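If it helps to see those numbers spelled out, here’s a quick reference sketch in Python – purely illustrative, not something you type into your editor. The odd-looking 29.97 figure is really the fraction 30000/1001, a leftover from the switch to color TV:

```python
# North American (NTSC) standard-definition project settings, for reference only.
NTSC_SD = {
    "width": 720,                  # pixels across
    "height": 480,                 # pixels down
    "frame_rate": 30000 / 1001,    # = 29.97... frames per second
    "interlaced": True,            # most SD sources are interlaced
}
print(f"{NTSC_SD['width']}x{NTSC_SD['height']} at {NTSC_SD['frame_rate']:.2f} fps")
```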

The interface of your NLE may have several panels or component windows, but I’ll describe the three main panels you should look for. The first is the timeline, storyboard viewer, or clip sequencer, which is where you place video clips to be played back in your final product.

Some editing programs like Windows Movie Maker and Pinnacle Studio offer storyboard/clip sequencer interfaces because they are aimed at making it easier for novices to perform non-linear video editing tasks. Your editing software may have only a storyboard view, only a timeline view, or both, but timeline editing gives you a great deal more control over the editing process.

The second element to look for is a preview window that lets you see your edited video and any effects applied to it. Certain effects may need to be rendered before you can see what they will look like in the final exported video. Some software performs rendering on the fly, meaning it processes the effect when you place it onto your video, but for especially processor-intensive effects you will need to set the program to do a preview render.

Preview windows usually appear as a large black box at first, before you have added any video to your timeline or sequencer, and should be pretty easy to recognize.

The third and final item to look for is the library or collection window. This is where media files are stored after they are imported into your project. From here you can drag the various video clips, images, and backgrounds to your sequencer or timeline.

Once you have these three areas located and identified, it isn’t very hard to perform basic editing. Use your software’s import function (it might be File > Import or a similar command, otherwise look to your library panel for a file explorer view) to get video from your hard drive and bring it into your project. After it’s been imported, drag a video clip to the sequencer or timeline.

Now here’s where timeline editing starts to distance itself from the sequencer/storyboard view. When you drag a video clip to the timeline, it appears on a video track in its entirety, with any sound below it on an audio track. You can use the editor’s razor or slice tool to split the video clip into multiple segments at any point.

This is how non-linear video editing gets its name; once your clips have been split, you can click and drag them around and place them in any order you choose.

A storyboard or sequencer, on the other hand, basically takes the clips you have and plays them one after the other. You fill up the storyboard boxes with one video clip each and that’s about it.
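If you like to think in code, here’s a tiny Python sketch of what the timeline is doing conceptually. The class and function names are made up for illustration – no real NLE works exactly like this – but it shows why cutting and reordering never touches your original footage:

```python
from dataclasses import dataclass

@dataclass
class Clip:
    source: str      # path to the captured footage on disk
    start: float     # in-point, in seconds of the source file
    end: float       # out-point, in seconds of the source file

def split(clip, at):
    """Razor tool: one clip becomes two; no video data is copied."""
    return [Clip(clip.source, clip.start, at), Clip(clip.source, at, clip.end)]

# One 60-second clip dragged onto the timeline...
timeline = [Clip("interview.avi", 0.0, 60.0)]
# ...split at the 20-second mark...
first, second = split(timeline[0], 20.0)
# ...and played back in the opposite order. That's non-linear editing.
timeline = [second, first]
print(timeline)
```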

Playback of your timeline or sequencer should display the video in your preview window. Most editing software programs have a scrubber you can drag along the timeline to play the video back frame by frame in your preview window.

Get Familiar

With this basic overview of how a digital NLE works you should at least be able to bring in a video, cut it up, and move your clips around to organize your shots into a basic video. But this only scratches the surface of what’s possible in most editing programs.

You can make video clips semi-transparent and add multiple video and audio layers, add transitions and fades between clips, and place effects that alter the look of your video entirely; how you do this depends on your video editing software.

Play around with your NLE's basic functionality as described above just to get a feel for how things work, and then start experimenting with its more advanced capabilities. Your software’s help section should get you past any snags, and there are lots of other free editing tutorials available on the web that deal with specific editing programs.

Making Your Video Look Like Film

Lighting, Post-Production | By: indie

Cinematic Magic – Making Your Video Look Like Film

There are several things you can do in post-production to make your video look like film, and I’m going to talk with you about them in this article. But it’s important to understand that video, by nature, will never be film. It’s like the electronic instruments they’ve made – pianos and drums that sound so close to their ‘real’ counterparts that only the most trained ear can hear the difference and only the most astute player can feel it – yet there is still a difference.

The Way Film Looks

It would be fantastic if the video you just shot on your handheld camcorder could look just like film does when you see a movie in the theater, right? Well, beyond the simple fact that good sound design does more than you probably realize to make a movie great, you first have to know what you’re looking for.

So let me ask you a question. What does film look like?

Unless you’re sitting there with your hand raised, ready to shout out a bunch of specific qualities you love about film off the top of your head, chances are you couldn’t rattle them off even if you had time to think about it. Can you really describe in concrete terms the qualities that make film look the way it does?

What makes it so special? So majestic to take in?

Well, for those of you who aren’t scholars, directors, or film students, I’ll tell you why you want that film look so badly in your projects.

Film’s Characteristics

Film is imperfect; it physically moves through the camera as it captures imagery. It’s moving at a high rate of speed, and any dust, hairs, or other abnormalities present during each instant are captured as well. The film rolling through a camera reel is only exposed for 1/24 of a second, and as it passes through underneath the lens it captures an image based on the light it is exposed to. It has a depth and color quality all its own because of the chemical reaction that takes place during this fraction of an instant.

Following are several characteristics of film, an explanation of each, and tips on what you can do during both production and post-production of your film to mimic each characteristic.

Film Cameras Have A Narrower Depth Of Field

The measurement of how much of an image is in focus is called the depth of field. I have a whole page devoted to depth of field, actually. For example, you might have a person standing in front of a mountain range and both would be perfectly clear and visible. This image is said to have an extremely wide depth of field. An example of an image with a narrow depth of field would be a close-up of a table fork, where even the tablecloth right beneath the fork is blurred.

In film as opposed to video, the depth of field is quite a bit shallower by default. Movies get part of their magical quality from the fact that when our main character is standing in front of a crowd of people, you don’t see the whole crowd and everyone’s faces clearly. It’s just our hero, standing there, and he’s clear and sharp and crisp, while everyone in the crowd behind him is blurred.

Good filmmakers use this narrow depth of field to their advantage. The eye is naturally drawn to the part of an image that is most in focus, so everything that’s blurred becomes part of the background. When you shoot a video and your camera is zoomed all the way out (wide), there is effectively no separation between subject and background; it’s a flat piece of scenery with a bunch of objects in it.

There is a way you can recreate the film look with video, and that is by narrowing your depth of field. You’ll need to set up your shots a little differently in order to do this, and if you’re indoors you may be limited by space. But try using the following technique and see if you can make your video look like film by perfecting it.

Film Look – Tip #1: To set up a narrow depth of field, dolly back so the camera is physically much further away from your subject (in fact, the further away, the better) and zoom in. While zoomed in, every tiny bit of movement or shakiness is magnified, so these shots are best attempted while using a tripod with a still or relatively still shot.

When you are significantly further away and zoomed in (telephoto), you’ll begin to see the background blurring out and your subject coming into view clearly. To tweak this shot and adjust the focus exactly how you want it, switch your camera to manual focus mode if it has one and if you feel comfortable doing so.

Film is Slower and Softer

Film isn’t composed of pixels like digital video, so it has a smoother, softer look to it. Film also shows motion blur more readily because it has a slower frame rate than video. Since standard NTSC video is recorded at 29.97 frames per second, and most of it is interlaced, video doesn’t carry the same quality as film because the images are captured and displayed differently.

Interlacing is the method by which video is processed to save bandwidth for broadcasting. Interlaced video splits each frame into two fields – one holding the odd scan lines and one holding the even – captured a fraction of a second apart. Some high-end video cameras shoot in progressive (full-frame) mode, which captures whole frames at once, but most record interlaced video and are therefore prone to visible combing along the scan lines when motion occurs.

When you begin the editing and post-production process, you can immediately do a couple of things to change the format so that you can make your video look more like film.

Film Look – Tip #2: In your video editing program’s project settings, set your project’s frame rate to 24 frames per second. If there is an option that allows you to de-interlace the footage, select that option as well. These two adjustments are subtle, but they’ll get you that much closer to the film look when you export.
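If you want to experiment with this conversion outside your editor, the free FFmpeg tool can do both steps from a script. This is just a rough sketch with placeholder filenames – it assumes FFmpeg is installed on your system and isn’t a substitute for your NLE’s own project settings:

```python
import subprocess

# Placeholder filenames; "tape_capture.avi" stands in for your interlaced
# 29.97 fps source footage.
subprocess.run([
    "ffmpeg",
    "-i", "tape_capture.avi",
    "-vf", "yadif",     # de-interlace filter
    "-r", "24",         # resample the output to 24 frames per second
    "film_look.mp4",
], check=True)
```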

Film Handles Extreme Darkness and Light More Easily

You may have had an experience with a digital camera where you went to snap a photo and the device took a second (or several seconds) to finish. When the photo came up, it was blurry and out of focus. Digital photography and video do this often because they require more light.

So in situations where there isn’t a lot of light, the iris tries to automatically adjust itself to compensate for what it perceives as a lack of light. Any movement on your part during this time causes the blurring effect you see in dark digital photos. A digital camera needs sufficient light in order to gather the amount of information it thinks it needs to take a good picture.

Film has an exceptional tolerance for more extreme levels of darkness and light; video starts to degrade when things get dark, but film simply takes in what light it can and presses that light into its imagery.

Even on a regular, normally exposed frame of film with average lighting you can see that its levels – the difference between the darkest color and the lightest color in the image – are much wider than that of most video. When you set about making your video look like film, there are ways to adjust the spectrum of light vs. dark in your image.

The closer to true black an image becomes, the more it ‘pops’ out at the viewer. Video taken in low-light conditions tends to be flat and can appear to have a grayish screen or filter over it. You’ll have to shoot your scenes with more light when you use video, just because that’s the way video works. But you can still make video look like film.

Film Look – Tip #3: In your digital editing program, find your video effects panel or menu and look for an effect called Levels. Add this to your video and make the necessary adjustments until the darkest spots in your videos are close to black.

Each Level filter is adjusted differently, so I can’t give you an exact interface method for getting the right picture. It’s your video though, so mess around with the settings and keep tweaking stuff until you find the look you think is the best you can achieve within your editing program.
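For the curious, a Levels adjustment boils down to remapping pixel values, something like this simplified NumPy sketch. The variable names and the example black/white points are mine, not from any particular plug-in, but it shows why pulling the black point up toward the darkest pixel in your shot makes the image ‘pop’:

```python
import numpy as np

def levels(frame, black=16, white=235):
    """Stretch pixel values so `black` maps to 0 and `white` maps to 255."""
    frame = frame.astype(np.float32)
    out = (frame - black) / (white - black)          # normalize to 0..1
    return (np.clip(out, 0.0, 1.0) * 255).astype(np.uint8)

# Example: a flat, grayish 8-bit frame (values 40..200) gains true blacks.
flat_frame = np.random.randint(40, 200, size=(480, 720, 3), dtype=np.uint8)
punchy_frame = levels(flat_frame, black=40, white=200)
```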

Film Captures Closer To True Color

With digital grading and CGI being used more and more to add vivid color and special effects to films, it’s becoming less common for film to make it from the camera all the way into theaters and home video media without having been run through a computer.

Remember, film captures light – video captures a digital interpretation of that light. The color captured on film doesn’t have to conform to its closest computerized interpretation of how to display that color.

Film Look – Tip #4: Find and add a Color Correction plug-in to your project within your digital editor. Bring up the saturation a little bit and play with the gamma settings to adjust the overall lightness level. If necessary, a brightness/contrast effect can also be used to offset any increases in gamma.
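To give you a rough idea of what those two knobs actually do to the pixels, here’s a simplified NumPy sketch – my own approximation, not the actual algorithm inside any color-correction plug-in. Gamma bends the mid-tones brighter or darker, and saturation pushes each pixel’s color away from its own gray value:

```python
import numpy as np

def color_correct(frame, gamma=1.1, saturation=1.2):
    rgb = frame.astype(np.float32) / 255.0
    rgb = rgb ** (1.0 / gamma)                  # gamma: lift or lower mid-tones
    gray = rgb.mean(axis=-1, keepdims=True)     # per-pixel gray value
    rgb = gray + saturation * (rgb - gray)      # push colors away from gray
    return (np.clip(rgb, 0.0, 1.0) * 255).astype(np.uint8)
```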

The Highly Sought-After “Film Look”

Hopefully you’ll find these tips useful in getting that “film look” everyone seems to be after, but keep in mind that the way your video looks to you now might change in the future. If you’re a relatively inexperienced filmmaker, you may look back one day, slap your forehead and exclaim, “what was I thinking?!”

Use your best judgment both on set and at the editing bay. Be aware that filmic style changes over the years and the latest fads and methods in films coming out right now might be old hat in a few years. Try less to mimic what you see on the big screen and more to develop your own unique style.

Do things you think will be pleasing to your viewers, but don’t trade your creativity for a trendy or overdone gimmick, because it’ll probably wear out faster than you’d like. If this article has provided you with useful tips on making your video look like film, don’t hesitate to read on if you need a refresher on video production and broadcast standards.

Green Screening

Post-Production | By: indie

You can do some pretty gnarly things with a green screen, but you don’t need a high-budget studio to set up your own! Generally either blue or green will work, but for digital video a bright green is best.

Construction

Camera shops sell huge rolls for backdrops – we have an 8-foot wide roll for both blue and green at the office – but even this isn’t necessary. Check your local hardware store for tarps – they come in great colors for screening.

You can also buy fabric and create a sheet out of that, but whatever you end up using remember that a matte or non-shiny surface works best. So if you decide to buy some green paint and cover a board, sheet, tarp, or wall with it, make sure not to get glossy paint.

Setup

Hang or prop your green screen against a bare wall in your house; it doesn’t matter where you are as long as you can get sufficient and even lighting. Use a light meter if you’ve got one, but if not you should make a visual check through your camera lens and try to get the entire screen as evenly lit as possible.

The more even your lighting scheme, the more consistent your green color will come out on tape or film, and therefore the easier it will be to remove.

Green Screening During Post-Production

Both Adobe Premiere and Adobe After Effects have great video filters for green screening your footage. Regardless of the editing software you use, look for an effect or filter called color key or chroma key.

Keying

Using a Key filter like these will let you select a color within your video image and remove all pixels of that color, within a certain threshold so that colors close to it on the spectrum can be removed as well.

Sometimes I find it helps to use multiple key filters, especially if my lighting hasn’t resulted in a fully consistent green throughout. Select pixels closest to your person (or subject, if it isn’t a person) to get the cleanest effect.
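In code terms, a chroma key just measures how close each pixel is to your chosen green and makes the close ones transparent. Here’s a bare-bones NumPy sketch – the key color and threshold values are illustrative only, and real keyers are considerably smarter about edges:

```python
import numpy as np

def chroma_key(frame, key_color=(60, 200, 70), threshold=80):
    """Return an alpha channel: 0 (transparent) where a pixel matches the key color."""
    diff = frame.astype(np.float32) - np.array(key_color, dtype=np.float32)
    distance = np.sqrt((diff ** 2).sum(axis=-1))    # color distance per pixel
    return np.where(distance < threshold, 0, 255).astype(np.uint8)

# Running it more than once with different key colors and thresholds
# mimics stacking multiple key filters in your editor.
```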

Garbage Matte

By keying out shades of green closest to your subject, you may find that there’s a ‘flickering’ effect around the edges of your video where other shades of green are getting through. You can use a tool called a garbage matte to clean up your edges.

To use a garbage matte, apply the tool to your video frame and then drag each point inward from the sides and corners, until you have eliminated most of the outer portion. The matte automatically makes everything outside itself transparent and keeps everything inside visible.
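Conceptually, the garbage matte is a second mask stacked on top of your key: everything outside the shape is forced transparent regardless of color. A rectangular version takes only a couple of lines of NumPy (real garbage mattes let you drag arbitrary polygon points, but the principle is the same):

```python
import numpy as np

def garbage_matte(alpha, top, bottom, left, right):
    """Force transparency outside a rectangle, keeping the keyed alpha inside it."""
    matte = np.zeros_like(alpha)
    matte[top:bottom, left:right] = alpha[top:bottom, left:right]
    return matte
```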

Desaturation

Even if you have used great lighting and have done a thorough job keying, you might still notice a green ‘halo’ around the subject as it moves. Sometimes this is a ring of green, and sometimes it’s as subtle as a greenish tinge where the background reflected itself onto the subject.

You can use a desaturation filter to take the color out of the halo and turn it to white or gray instead. It will still be there if you look closely, but at the very least it should reduce the visibility of the halo and help it to blend in with whatever background you throw behind it.

Backgrounds

With your keying, matte and desaturation complete, you should now have a layer of transparency in your video. If you put another video or image on a layer beneath it, you should be able to see through those areas. If not, you did something wrong. Go back up to the top of this page and re-read it.

Digital Workstations

Post-Production | By: indie

Digital Workstations – The Specs

A digital workstation is nothing more than a normal computer equipped with a few tools that allow it to handle multimedia production. In order to turn your computer into a workstation for digital filmmaking, you need to supply it with (or make sure it already has) two things: a way to get video footage from your camera onto your computer, and a way to manipulate the footage once it’s there. I’ll also go over some of the technical details that will make an editing workstation worth its salt in terms of speed and storage space.

Data Transfer

With a camera that records video onto a tape or other linear medium, you will need to spend the time capturing your footage. Hard disk cameras sometimes give you the option of simply transferring full video files from the camera to the computer, but either way you will need to come up with the best way to do this.

Almost all computers produced over the last several years have built-in USB ports, and digital cameras usually have USB jacks on their interfaces. But the USB standard uses shared bandwidth – meaning its total speed must be divided among every device that’s being used. On a dedicated machine with no other USB devices, this makes no difference to you, but if you’ve got a printer, a Bluetooth receiver, a mouse, and a webcam all connected through USB, be aware that your video transfer rates are going to be affected by these other devices and the post-production stage of your digital filmmaking effort may suffer as a result.

A better option for video transfer is FireWire, which does not share bandwidth between multiple connected devices that use the same protocol. HDMI is becoming more prevalent with HD camcorders, but while this is a true digital standard, the requisite equipment is still relatively cost-prohibitive.

Many cameras also have A/V component output and S-Video output, but by far the cheapest and highest-quality standard is currently FireWire. Making your computer compatible with FireWire is often as simple as purchasing an inexpensive FireWire card (around $20) that fits into one of your computer’s PCI slots.

Digital Filmmaking – Video Production

The second part of this equation is giving your computer the right software it needs to handle the capturing and editing of video. Lots of people ask me what the best editing software for digital filmmaking is, but while Final Cut Pro and Adobe Premiere are generally considered the most robust programs at the prosumer level, you can achieve great results using other less expensive packages like Sony Vegas or Pinnacle Studio.

The only software programs I’ve used that are so basic I wouldn’t recommend them to even the beginning user would be Windows Movie Maker and iMovie. Sure, if they’re all you’ve got for the time being, go ahead and play around with them. But they don’t give you nearly the degree of control and flexibility you’ll probably crave once you start cranking up your video projects.

Hard Drives and Storage

Video takes up an absolute ton of space, especially video of decent quality. You can produce videos for YouTube by capturing them at 320×240 (half normal resolution) and still come out with decent stuff for the web, but with Flash and streaming HD video coming into the forefront, you don’t want to limit yourself before you even start.

If you can get a separate hard drive on your workstation – or at least partitioned space on one hard drive – just for video storage, your machine’s performance and your ability to organize your captured footage should improve greatly. Playing video from a hard drive that is already taking care of running an operating system and the other programs you’re using is like asking for dropped frames and slow refresh rates.

As a rough guideline, a 250GB hard drive should be able to store almost 17 hours of standard-resolution DV-format AVI video. Working with an 80GB drive, that number is reduced to a little over 5 hours.
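Those figures are easy to sanity-check. Assuming DV comes in at roughly 13–14GB per hour and a drive’s usable space is a bit under its advertised capacity, a quick back-of-the-envelope calculation lands in the same ballpark:

```python
DV_GB_PER_HOUR = 14            # rough data rate for standard-definition DV AVI
for advertised_gb in (250, 80):
    usable_gb = advertised_gb * 0.93        # rough formatted capacity
    hours = usable_gb / DV_GB_PER_HOUR
    print(f"{advertised_gb}GB drive: roughly {hours:.0f} hours of DV footage")
```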

Your hard drive’s RPM speed affects how quickly it can access your video files, so generally the higher this number is the better. When dealing with large files like videos, a 5,400 RPM drive may be passable, but you won’t get the same reliable results as you would with a 7,200 RPM or faster drive.

Processor and Memory

Your computer’s processor and bus speeds are definitely important, but you can get away with a slower processor as long as you have lots of RAM. In fact, RAM does more to allow a computer to operate quickly than its processor speed in most cases. I remember going into a LAN gaming house one time to play games with some friends. The games were running really well in 3d, first-person style, so I was curious and checked out the specs of the machine I was on. While the processor was nothing amazing at the time, I found that the machines were loaded with as much RAM as they could take! They had traded processor speed for RAM capability.

Peripherals

It goes without saying that you should get a comfortable chair and use proper ergonomics when working with your keyboard and mouse so you don’t end up with carpal tunnel after the countless hours of video editing you’re sure to be doing. Beyond that, having two or three monitors to work with is a huge help when you’re working in multimedia production; a greater viewable area simply reduces the time and effort your editing takes.

Even if you can’t afford more than one monitor or a video card that supports them, you can still work with just one. But again, make sure that your ergonomics are right – posture, distance from the screen, angle of your wrists, and height of your chair. These might seem like small things, but over time they can cause – or prevent – the soreness and aches that come from your hypnotic editing trance.

The Result

Hopefully these tips will help you put together a workstation that meets your budget and lets you get your digital filmmaking production finished as comfortably and effortlessly as possible. If you have any additional tips or suggestions, just drop me a line by visiting our contact page.

Digital Video Editing

Post-Production | By: indie

Your videos can benefit strongly from a good NLE (non-linear editor). Using computers, we can alter the sequence of our shots and adjust qualities of the imagery in our videos to a great degree using a process called digital video editing. Here’s how it works.

Before Digital Video Editing – Film Cutting

The term “cutting room floor” originated from the fact that before digital video editing technology came to the forefront, every film editor had to literally cut film by hand into long strips. Imagine a standard film being shot at 24 frames per second, with standard 35mm film being 16 frames per foot; that should give you some idea of just how long an hour’s worth of film would be.
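To put a number on it: at 24 frames per second, an hour of footage is 24 × 60 × 60 = 86,400 frames, and at 16 frames per foot that works out to 5,400 feet – roughly a mile of film that had to be handled, cut, and spliced by hand.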

Even when TV studios started using tape machines and linear video editing systems, editors still had to fast forward and rewind to exactly the point they wanted to start from, set the master tape to record to the dubbing tape, and then repeat the process for every single shot.

I actually had a test in one of my video classes in college where we had to edit several shots together on one of these old machines, and do it within a certain time limit. That archaic contraption was no match for my ridiculous skills, of course. Ahem… anyway. Thank goodness we now have computers that make digital video editing so much easier!

Non-linear digital video editing allows you to cut, move, and order sequences without scissors, tapes or dial scrubbers. Once you understand how it works and familiarize yourself with your video editing software of choice, you’ll be a step ahead of all the amateurs out there.

The Non-Linear Editing Concept

A non-linear story or game is one where you take your own path. It’s one of those Choose Your Own Adventure books where you jump from page to page, or a computer role-playing game that gives you the choice to branch off onto several different paths along the way.

Obviously, your final product is going to be a linear story or situation told visually. But in order to get there, the same concept that applies to the books and games described above will be true of how you edit your footage. Non-linear digital video editing is made possible because when you capture video to your computer it becomes a digital file, even if it was analog while it was on your camera.

Say you’re having a dinner party, and you’re trying to figure out where to seat your guests. So you write each guest’s name down on a 3×5 index card, set the table, and place a card on each plate. Now you have a visual scheme of where everyone will be and you can imagine the situation in your head as everyone sits down to eat.

Suddenly you realize that Aunt Sally can’t possibly sit next to Uncle Henry! They had a falling out a couple of months ago and they won’t want to be next to each other. Likewise, you can’t seat Cousin Joe across from your friend Nick because they’d cause such a ruckus by horsing around that it could be a distraction to others at the table.

Old linear editing systems were like a seating arrangement that you couldn’t change. Once the index cards hit the plates, they were there to stay. With digital video editing, you can shuffle the cards around until your seating arrangement is perfect – and you can do it well before your guests arrive.

This is accomplished because you have the ability to move sequences of video around as many times as you want, with no degradation in quality – the only outlay is your time. Take a look at my Non-Linear Editing Tutorial for some in-depth digital video editing instructions.

Capturing Video Footage

Post-Production | By: indie

Once you’ve come across that magic moment and gotten it on film (or tape, or hard disk space) you need to get that footage onto your workstation so you can edit it. That is, unless you’re using in-camera editing. This process is called capturing video footage, and is simply the means by which you transfer the footage to a place where you can alter it in some way from its original format, length, sequential order, and/or quality.

Hard disk cameras are the easiest to capture from, because instead of having to play your tape or DVD back and record that footage as a digital video file in real-time, you can simply drag and drop the files from the camera’s hard drive to your workstation. It’s essentially the same thing as getting photos off of a digital camera – once you’re connected, you can pull up a folder view of all the files available and grab whichever ones you want.

With any other technology, you’ll need to sit down and connect your camera to your computer workstation and choose the segments you want to use. You could watch the tape and assemble a detailed shot list, complete with timecode locations for each one. This may be necessary if you were not present or behind the camera during a particular shoot, in fact.

Chances are that if you were in the director’s or cameraman’s role, you have a pretty clear idea of what footage you got.

Make the Connection

The first thing to do is decide how you’re going to make the connection between your device and the computer/workstation. RCA connectors, for example, can go through a mini-plug or DVI jack on your camera and be plugged into a breakout box or sound card, depending on the hardware you have on your machine for capturing video footage. For detailed hardware specs, check out the digital workstations page.

An S-Video cable will transmit good footage, but it cannot carry sound. USB technology is fast and reliable, but FireWire is better. Choose your connection method based on the resources you have and the available interface on your camera.

Transfer and Device Control

There are lots of good programs out there that will capture video footage. Most editing software has its own built-in capture application, and my favorite is Pinnacle Studio’s Capture tab. Certain types of graphics cards may also come with software utilities that allow you to capture from an external connection on the card itself. MainConcept’s MPEG Encoder also has a capture utility, and it’s one of the better ones I’ve used in the past.

If you have an older analog camera, tape location and recording are going to be two separate functions for you. Your play, stop, fast-forward and rewind will have to be set on the camera, while the record function is controlled on your capture software.

Newer digital cameras are manufactured with a feature called device control, which detects when their output jacks are connected to an exportable source. Within the capture software (as long as it also has the ability to use device control) you’ll have an interface that mimics the buttons on the camera and will let you control the play/pause/stop and tape location from your computer.

From Footage to Filesize

Think of capturing like filming, except you are recording exactly what’s on the device’s media a second time to a digital file. Your capture settings are what will now determine the video’s resolution, frame rate, quality, and bandwidth – the amount of space needed to store a video file.

Some video formats, like MPEG, have variable bandwidth settings that let the size shrink or expand within certain parameters based on the number of colors and the speed of motion within a given set of frames. More so than audio, a video’s file size can vary widely, and the largest uncompressed videos can take up large amounts of space on your workstation.
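The math behind that is straightforward: file size is just data rate multiplied by running time (divided by 8 to get from bits to bytes). A quick sketch, using a 6 Mbps bitrate purely as an example figure:

```python
def video_size_gb(bitrate_mbps, minutes):
    """File size = data rate x duration; divide by 8 to go from bits to bytes."""
    return bitrate_mbps * 1_000_000 * minutes * 60 / 8 / 1_000_000_000

print(video_size_gb(6, 60))   # a one-hour, 6 Mbps MPEG file is roughly 2.7 GB
```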

Changes in quality are also much more readily apparent in video as opposed to audio. Even with the slightest amount of compression, digital video can begin to show artifacts in the form of squaring or pixelation. You’ll need to learn how to make a video so that you know what to look for when these digital artifacts appear.

The Rule of Quality

A digital signal can never be made better than its previous iteration. In other words, it’s virtually impossible to get a clear, crisp result if your original capture is of poor quality.

It’s like if you pour a bottle of water into a glass; it will never be more purely water than when it first emerged from the bottle. As you drink it and carry it around with you, the bacteria from your mouth, any contaminants on the glass itself, dust from the air, and any other particles that come in contact with the water continue to contaminate it.

Without boiling or some other purification method, you have no way to make your glass of water live up to a sealed bottle. In the digital world, there is no purification method that doesn’t further alter the footage from its original source. There are plenty of filters and effects you can add to clean up certain aspects of the image, but they’ll all change the fundamental make-up of the footage. So start with your best possible source to achieve your best results.

Getting the Right Stuff

You may find that you’re watching and re-watching segments as you capture them, or that you’re having some trouble starting and stopping at the right points. If most of your footage is at least usable and your storage volume is sufficient, you may prefer to simply start capture and record an entire tape or disc in one go.

This is fine, but be aware that all the little pieces you don’t end up using will be taking up space on your workstation for as long as you want to keep your project file working in your editor. When you edit a video, it isn’t actually imported into your project file – the project simply references the copy of the video wherever it is stored on your hard drive.

This means that if you move the file, delete it, change the file name, or change any folder name along the file path, your editing software isn’t going to be able to find it and it will ask you what’s going on. Capturing in small segments can be a lot more time consuming, but if space is at all an issue then it’s the best way to go – even if you are working on a more lengthy piece.
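Going back to the idea that your project only references your media, here’s a toy Python illustration of why a moved or renamed file causes trouble – the structure below is invented for the example, and every editor stores its project data differently:

```python
import os

# A project is essentially a set of edit decisions pointing at files on disk.
project = {
    "clips": [
        {"file": "D:/captures/scene01.avi", "in": 0.0, "out": 42.5},
        {"file": "D:/captures/scene02.avi", "in": 3.0, "out": 18.0},
    ]
}

# Roughly what your editor does on open: check that the references still resolve.
for clip in project["clips"]:
    if not os.path.exists(clip["file"]):
        print(f"Media offline: {clip['file']} – please relink or locate the file.")
```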

You can also organize your work better if you have several clips and you can more easily find and access them. That way you won’t have to scrub through a super-long piece of footage trying to find the segment or shot you need. In most situations, capturing video footage is a process determined by whichever connections and technologies are available to you, but hopefully this gives you at least an overview of the basics.