🎬 NLE Overview

Start at the Beginning

How I Learned to Stop Worrying and Love Resolve

Historical context is useful in many situations, and definitely so when it comes to Non-Linear Editing software. I get many questions like “Why does Hollywood use Avid?” or “Why is Premiere becoming increasingly unstable?”. I can’t answer these questions entirely, but a look at the broader picture is most useful in understanding why these applications existed in the first place. Equally useful is considering where their creators are aiming to take them in the future…

A Historical Tangent

Before discussing any specific NLE, let’s run through a brief primer on film post production.

[Image: 35 mm film test frames (“Essais caméra - conformité du cadre - 5 photogrammes”). Andrew Lih, CC BY-SA 2.0 (https://creativecommons.org/licenses/by-sa/2.0), via Wikimedia Commons]

In case your inference skills struggle, “films” are so-named because they were initially shot on film: 35mm, photochemically processed celluloid. Exposed film is removed from the camera, and that original negative is protected at all costs (something covered further in the sections on DIT work). The negative is “contact printed” to a “positive” print so creative folks can view “dailies” and editors have something less valuable to edit from (called the “work print”). Generally only the “circled takes” are contact printed. A list of where all the cut points lie, i.e. an “Edit Decision List” or EDL (a feature we still use), is then generated by the editor, and a “negative cutter” takes that final edit and matches it back up to the original negative film.
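
Since EDLs are still with us, it’s worth seeing how simple they are. Below is a minimal sketch: the reel names and timecodes are invented, and the parser assumes the classic CMX3600-style column layout, so treat it as a toy rather than production code.

```python
# Toy CMX3600-style EDL and parser. Reel names/timecodes are invented.
SAMPLE_EDL = """\
TITLE: REEL_1_ASSEMBLY
FCM: NON-DROP FRAME
001  A001  V  C  01:00:10:00 01:00:14:12 00:00:00:00 00:00:04:12
002  A002  V  C  02:03:00:05 02:03:02:17 00:00:04:12 00:00:07:00
"""

def parse_events(edl_text):
    """Yield (event, reel, src_in, src_out, rec_in, rec_out) per cut."""
    for line in edl_text.splitlines():
        parts = line.split()
        if parts and parts[0].isdigit():  # event lines start with a number
            num, reel, _track, _transition = parts[:4]
            src_in, src_out, rec_in, rec_out = parts[4:8]
            yield int(num), reel, src_in, src_out, rec_in, rec_out

for event in parse_events(SAMPLE_EDL):
    print(event)
```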

NLE stands for “Non-Linear Editor”. It’s a fancy way of saying that you can pick any portion of any clip and edit it anywhere you want in your existing video. Back in the tape or celluloid editing days this wasn’t so easy, as the film or tape had to be linearly advanced to the correct point for every edit. As digital video replaced tape-based editing, the software grew to do far more than a simple cut and splice, but the early days were surprisingly primitive.
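
If you like thinking in data structures, the “non-linear” part is easy to picture: a timeline is just an ordered list of references into source media, so an edit anywhere is a cheap list operation. A toy sketch (the clip names are mine, not any real NLE’s API):

```python
# A timeline as an ordered list of (source clip, in-frame, out-frame)
# references. "Non-linear" just means we can cut in anywhere, instantly.
timeline = [
    ("interview_A", 120, 480),
    ("broll_city", 0, 96),
]

# Drop a reaction shot between the two existing clips -- no shuttling
# tape, no physical splice, just a list insert.
timeline.insert(1, ("interview_B", 300, 372))

for clip, src_in, src_out in timeline:
    print(f"{clip}: frames {src_in}-{src_out}")
```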

It gets a bit messy, since dissolve transitions/superimpositions make things tricky, so you’ll hear terms like “dupe neg” or “interpositive” just for those effect shots. An interpositive is just a “stepping stone” format made by optically projecting the original negative. A dupe neg is just a new negative onto which that interpositive stock is optically projected (or contact printed). This process is eventually done for the film in its entirety (so those “effect” shots essentially go through the whole original negative > interpositive > internegative > release print chain twice).

Color Timing

So the edited negative film was historically run through a “film analyzer” (Google “Hazeltine”), a tool which helped the operator (called the “color timer”) determine optical adjustments to the red, green and blue “printer lights” so the film would be exposed and color balanced properly in its final print (technically, these adjustments are first used to create a “First Answer Print” for creatives to review, and more rounds of changes and answer prints can follow). The “color-timed” negative is used to create an “interpositive” (I.P.), and that is what is finally used to transfer to home video and broadcast. These intermediate film formats, even the positive ones, aren’t designed to be viewed as-is: they are gamma-neutral and physically built to withstand the abuse of repeated printing. And the print stock used for exhibition has darker blacks and crunchier shadows, necessary for looking correct with a bright light shining through the film. The point is, these stocks are custom-tailored for each part of the process; you cannot, for example, digitally scan a print stock and display it digitally without seeing crushed shadows. This back-and-forth process of making color adjustments to “intermediate” formats of the film is the genesis of our modern “digital intermediate”. For theatrical release (on film), the interpositive is printed to a duplication stock, creating a “dupe neg”, which is used to create bucketloads of “release prints” for exhibition; more dupe negs can be created if you have a massive theatrical release.
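
For a feel of the arithmetic involved, printer lights are specified in “points” per channel, and a commonly cited figure is about 0.025 log10 exposure per point, i.e. roughly 12 points to a stop. The sketch below also assumes a 0-50 scale centered at 25, which is one convention among several:

```python
# Back-of-envelope printer-light math. Assumes ~0.025 log10 exposure
# per point (so ~12 points per stop) and a 0-50 scale centered at 25;
# both are common conventions, not universal standards.
NOMINAL = 25
LOG_E_PER_POINT = 0.025

def exposure_multiplier(points):
    """Linear exposure change relative to the nominal printer light."""
    return 10 ** ((points - NOMINAL) * LOG_E_PER_POINT)

# A timer's correction of R27 G25 B23 nudges each channel's exposure:
for channel, points in (("R", 27), ("G", 25), ("B", 23)):
    print(f"{channel}{points}: x{exposure_multiplier(points):.3f}")
```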

So from the ’60s onward, these color timing tools allowed an operator to make basic R, G, B changes (fixing exposure and color casts) to a contact-printed film. But as technology progressed, it made sense to do this color timing from the original negative and write the “color corrected” image straight to video tape rather than back out to film. Many low budget films were written to tape, and the entire editorial and delivery process never involved film again (film was simply the acquisition format). Though vastly inferior in visual quality, tape could be cheaply acquired and edited (compared to film). A machine was created to take celluloid film and transfer it to video; it essentially “televised” a “cinematic” medium, and the “telecine” was born. It was the ability to manipulate color at the same time that brought the telecine into film post production. Newer CCD telecines shined white light through the film, then prismatically split it into R, G and B. Those R, G, B values could be modified by the color timer, and additional hardware was developed to integrate with the telecine system and add further functionality (e.g. the da Vinci systems that Resolve descends from). The differing frame rates of film and video are also converted in the telecine process. The photo below shows an old-school telecine room.

http://exploitationfilmsrestoration.blogspot.com/2016/06/restoration-basics.html
[Image: a telecine system. DRs Kulturarvsprojekt from Copenhagen, Denmark, CC BY-SA 2.0 (https://creativecommons.org/licenses/by-sa/2.0), via Wikimedia Commons]
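
That frame-rate conversion deserves a concrete picture. Going from 24 fps film to ~30 fps interlaced NTSC video, telecines used the famous “2:3 pulldown”: four film frames are spread across ten video fields, i.e. five video frames. A small sketch of the cadence:

```python
# 2:3 pulldown: film frames A, B, C, D contribute 2, 3, 2, 3 fields,
# and consecutive fields pair up into interlaced video frames.
CADENCE = [2, 3, 2, 3]

def pulldown(film_frames):
    fields = []
    for frame, count in zip(film_frames, CADENCE * (len(film_frames) // 4)):
        fields.extend([frame] * count)
    return [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]

print(pulldown(["A", "B", "C", "D"]))
# [('A', 'A'), ('B', 'B'), ('B', 'C'), ('C', 'D'), ('D', 'D')]
# Two of the five video frames mix fields from different film frames.
```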

The process of using film has changed since then. As computers grew in power and mass storage became more feasible, it became possible to scan portions of the film into a computer, process visual effects digitally, then output those shots back to film. The digital realm allows for far more creative compositing and adjustment than optical printing did. Eventually it was possible to scan the majority (Pleasantville) or entirety (O Brother, Where Art Thou?) of a film rather than just select VFX sequences; the aesthetic and story of those films relied on the emergent technology. Soon, movie theaters were equipped with digital projectors, and once the film had been recorded and digitized it could stay digital all the way through exhibition. And in our modern era, even acquisition has largely moved off film, making for an all-digital pipeline.

Blackmagic Resolve

The video below was created before I was born, but I like to think it’s one of the reasons I came to earth. So without further ado, let me introduce my software of choice:

“Resolve” was one of several products owned by da Vinci Systems (itself an outgrowth of VTA, makers of Resolve’s great-great-granddaddy, the “Wiz” system). It was born around 2004, during a period of financial turmoil for the company, and was initially divided into separate versions for specific tasks like DI color correction and visual effects. But it was a parallel-processing, GPU-based system. Unlike the alternative da Vinci systems of the time (e.g. da Vinci “Classic”, “Renaissance” or “2K”), Resolve was developed in a way that it could eventually run on standard hardware (or at least had the potential to, as consumer PCs grew in power).

Resolve started as “DaVinci”: a color correction tool that required lots of hardware and cost more than my first home. Its hardware+software configurations ranged from $200k to over $800k. At that point, the customer base was (understandably) a small group (~100) of industry professionals. I’m just old enough to remember the glory days of “color timing”, when an operator would run film through the DaVinci on its way to videotape: the telecine process we just looked at. These DaVinci systems worked in tandem with the telecine. In 2009, as that specialized software+hardware model ceased to function well (the same trouble Avid’s having now, but we’ll get into that), Blackmagic bought the tech at a good price and passed the savings on to the customer. Like, majorly passed on the savings. By 2011, a consumer could buy this Ferrari of post production tools for just $1,000.

To make a long story short, over time Blackmagic added their own editing code to the color tools, bought similarly capable but struggling software from Eyeon (compositing) and Fairlight (audio), and soon had one application that could do editorial, color, sound and VFX under one roof. And all for the price of… free. There’s a “Studio” version that adds some things many people find they never need, and even then it’s only $300, for a perpetual license rather than a subscription. I’ve used Resolve since before its acquisition by Blackmagic, and since they bought it I’ve only paid once for each of the licenses I own. How is this possible? Nobody really seems to know; it’s just Black Magic.

I love Resolve as an editor. It has the potential to capture what Adobe hasn’t been able to crack yet, and what Apple has no interest in pursuing: the pro market. I find it combines the best of all the NLEs I’ve used: FCP X’s magnetic timeline, Premiere’s versatility and playback performance, and Avid’s “we’re interested in professionals” attitude.

If you continue reading, you’ll get a general sense that Apple played things too quickly, trying to innovate far beyond what customers could tolerate, while Avid relied too much on the status quo. I feel like Resolve sits squarely between the two, innovating at a rate that is comfortable and secure for the professional, yet quickly enough to avoid obsolescence. It allows for great metadata tagging and searching, almost as good as FCP X’s, and its Cut and Edit pages have features that replicate much of what people love about FCP X’s timeline.

And ultimately, for the people I teach, you can’t argue with the price of free. Resolve runs on Mac, Windows and Linux, and has an excellent manual, an extremely engaged development team, class-leading features, team collaboration, a reliable development schedule and company ethos, and an unmatched workflow across ingest/edit/color/sound/VFX. Because it’s platform agnostic, free and fully featured, I can recommend it regardless of OS, budget or scale of project. It simply represents the most obvious choice for what I’m teaching.

If all of that wasn’t enough, take a look at Resolve’s list of credits in high-end film and TV. The inclusion of Fusion alone shows you the caliber of tool now freely at your disposal.

Resolve’s tools are used in the big stuff

Adobe Premiere

The early ’90s meant big changes for the emerging world of digital media. Though Premiere supported a maximum resolution of 160×120 in 1991, it relied less on specialized hardware than the competition.

Adobe had a product that, for some time, was outstanding in its mediocrity. While it got the job done, it simply didn’t boast a feature set or stability that won folks over. Around 2010 or so, things began to really improve. The hardware acceleration of its “Mercury Playback Engine” meant cutting current codecs was possible without transcoding. Its interface got cleaned up quite a bit, small annoyances were addressed, and for the first (and only?) time, it seemed like Adobe was aggressively catering to its professional users. When Apple desecrated Final Cut Pro in 2011, Adobe seized the opportunity, seeking to make Premiere as ubiquitous in the video world as Photoshop is in the photo world. Unfortunately, like Apple, Adobe is a behemoth, and the reality is that out of all the seats of Premiere sold, not that many belong to pro-level users. As of 2019, Adobe seems far more interested in sucking the world into its cloud than in nearly anything else.

Adobe also abandoned the Mac platform for a time with Premiere.

Sure, “Gone Girl” was edited in Premiere, and so was the first Deadpool; but Deadpool 2 was edited in Avid, so what does that tell you? Premiere tracks every piece of media in a project, so a feature-length project could take 25 minutes to open, which forces editors to work reel by reel. Gone Girl required specially-crafted SSD pre-caching of all the media on the shared storage.

In early 2020, Adobe made some exciting announcements. Working in collaboration with the editorial team on Terminator: Dark Fate, they’ve made some changes which, if rolled into the mainstream release, could be exciting news for feature film editors. Adobe is calling the new experience “Productions”, and its highlight features include:

  • Similar to Avid, projects have a new hierarchical nesting structure where scenes, reels, episodes, etc. can all be divided into their own bins and, essentially, their own projects. These can be nested within one another in a way that makes sense of the multitude of assets seen on a bigger show. The structure can be mirrored between the hard drive’s file explorer and the NLE, and each bin can be locked and unlocked so multiple users can change only the contents of the bin they need.
  • Perhaps in part due to this new form of project segmentation and organization, Premiere is now able to work more speedily on massive projects. In other words, it doesn’t sit there indexing media for 20 minutes every time you open a large project.
  • Customizable clip layout inside of bins. (Traditional editors love seeing their “group clip” (AKA multicam) sitting on top of the different shots that comprise it.) This is one of those uniquely Avid things that, silly as it sounds, makes veteran editors happy.

Apple Final Cut Pro

So the guy referenced at the top, Randy Ubillos, created the first few versions of Premiere. Not long after, he started from the ground up on another editor, built on Apple’s QuickTime but made for Macromedia. The product was called “Final Cut”, and it was initially available for both Mac and Windows. Apple ultimately purchased the product, but somewhat passively; it wasn’t really until the incorporation of IEEE 1394 FireWire near the turn of the century that everyone realized how great this combo was. The ability to record with a consumer “DV” camera and digitally transfer the footage to the computer for realtime editing was awesome. This was when I was in middle school, and it rocked my world.

By the time Apple hit Final Cut Pro “7” (never an official name, BTW), it had been adopted in many a post house. TV commercials, some features, and lots of web content enjoyed the use of FCP. Directors like the Coen brothers and David Fincher were all about it. From as far back as 2002 (e.g. “The Ring”) to 2016 (e.g. “Kubo and the Two Strings”), FCP had its place with the big boys in editing. Though it still certainly wasn’t designed for feature film work in the way that Avid was, FCP was reliable, capable, intuitive, and rather universally loved.

Then, in 2011, Final Cut Pro X was announced. Its price and feature set meant editing was ‘democratized’: now everyone could afford a license. It was 64-bit, had a redesigned UI, and was exactly what Apple said we all needed. But it wasn’t. Ironically, though “Pro” sat right there in Final Cut’s name, Apple had removed the professionalism from its software. There was an understandable outcry, and everyone, literally everyone I knew who used it professionally, switched to Premiere or Avid.

By 2019, the wounds had healed sufficiently that a lot of people were giving Final Cut Pro X a go as the negativity surrounding it waned. But it’s usually smaller groups or individuals, and the occasional post house. Apple has to be credited for innovating, and if anyone on the planet has the authority to encourage us to rethink editorial strategy, Randy Ubillos definitely qualifies. FCP X attempts to be “trackless”: instead, you assign roles to clips before editing, which defines their future interactions, and sync and a host of other editing annoyances are somewhat mitigated. The “magnetic timeline” of FCP X is quite usable once you wrap your mind around it, and the software is quite stable, which is huge. However, Apple lost so much trust with professionals that it’ll take a new generation to forget its missteps.
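
A toy model makes the “connected clip” idea concrete. In the sketch below (names are mine, not Apple’s API), a secondary clip is anchored to a primary-storyline clip plus an offset, rather than to an absolute track position, so rippling the primary can’t knock it out of sync:

```python
# Magnetic-timeline sketch: connected clips are anchored to a primary
# clip + offset, so edits to the primary storyline preserve sync.
primary = [("scene_1", 48), ("scene_2", 96)]        # (name, duration)
connected = [("music_sting", "scene_2", 10)]        # anchored to scene_2 +10

def absolute_starts(primary, connected):
    starts, t = {}, 0
    for name, duration in primary:
        starts[name] = t
        t += duration
    return {name: starts[anchor] + off for name, anchor, off in connected}

print(absolute_starts(primary, connected))   # {'music_sting': 58}
primary.insert(0, ("new_opening", 24))       # ripple-insert at the head
print(absolute_starts(primary, connected))   # {'music_sting': 82}
```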

For perspective: there are 1.4 million active seats of Premiere, while there are 3 million sold seats of FCP X (and as of 2019, Resolve reportedly had 2 million users!). Only Apple could make software so thoroughly unusable and ultimately be forgiven for it. We won’t get into what they did to their computers here. Apple has a huge presence, and their ability to sell a symbiotic set of hardware and software makes them a force to be reckoned with, no matter your personal feelings about the brand.

Avid Media Composer

Avid was founded almost immediately after I was born. Correlation, not causation, as far as I’m aware. In those days, film was the only professional option, so when Avid brought the convenience of a digital edit with the ability to “MediaMatch” it back to cut the actual film, the Moviolas were shaking in their boots. Physical cutting of celluloid (for the edit phase) was at last threatened with extinction.

But “digital” had an insatiable appetite. It soon outgrew being a mere intermediate step for the cutting of film; it became the acquisition format for everything lower budget. The consumer digital video boom was a democratization of video-making. The success of software-based video editing in the ’90s makes sense when you understand that the alternative was expensive hardware-based solutions. Avid’s Film Composer was Mac-only and relied on specialized supporting hardware to run. Nowadays, “films” are primarily shot, edited and distributed digitally, and an NLE with its roots steeped in true film production finds those roots becoming almost a virtue rather than a vice.

The Hollywood market is Avid’s, but it’s been the standard due more to tradition than superiority. Modern workflows require reimagining the way things operate, and where Avid’s organizational niche for film-based workflows was once unique, it’s hardly applicable anymore. I edited primarily on “Avid Xpress DV” for quite some time, but it never matched the competition from a value standpoint. Interestingly, it never worked on Windows Vista. But nothing really worked on Windows Vista, so I don’t hold it against them. The thing today is, no software company can focus on making Hollywood-specific editing software and make it work financially in 2019. So quite recently, Avid has made huge strides to compete with the likes of the other NLEs. I still say: if you want to edit in Hollywood, learn Avid.

Because Avid is built around the shared-storage workflows of larger operations, even things like media import can seem confusing to beginners. Avid can bring media in as “linked” media, where it uses third-party “AMA” plugins to read your camera’s native format; linked media is ‘linked’ to the video file on your hard drive, and Avid simply “sees” the media in that folder structure. Alternatively, Avid can “import” media, which is Avid nomenclature for converting from the original camera format to an edit-friendly format (e.g. an MXF-wrapped DNx file) and storing the new files in the Avid MediaFiles folder on your hard drive.
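
For illustration, here’s a hedged sketch of those two ingest models. The ffmpeg transcode is purely my stand-in for Avid’s internal import pipeline (Media Composer does its own conversion), and the paths are invented:

```python
# Sketch of "link" vs "import" ingest. The ffmpeg call is a stand-in
# for Avid's own transcoder, and every path here is hypothetical.
import subprocess
from pathlib import Path

AVID_MEDIA = Path("/Volumes/Media/Avid MediaFiles/MXF/1")

def link_clip(camera_file: Path) -> Path:
    # "Linking" (AMA): the project just references the file in place.
    return camera_file.resolve()

def import_clip(camera_file: Path) -> Path:
    # "Importing": convert to an edit-friendly MXF-wrapped DNx file
    # stored inside the Avid MediaFiles folder.
    AVID_MEDIA.mkdir(parents=True, exist_ok=True)
    out = AVID_MEDIA / (camera_file.stem + ".mxf")
    subprocess.run(
        ["ffmpeg", "-i", str(camera_file),
         "-c:v", "dnxhd", "-profile:v", "dnxhr_hq", "-pix_fmt", "yuv422p",
         "-c:a", "pcm_s16le", str(out)],
        check=True,
    )
    return out
```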

Even if Walter Murch (the legendary editor who won Avid its first editing Oscar) doesn’t like to micromanage his trims frame by frame, much of the editing world relies on it. Avid’s trimming mode was once relatively unique, and those used to it couldn’t live without it. Now it’s not such an exclusive thing.

There are more options, but for the sake of everyone’s sanity, I have limited this list to what are, for better or worse, the most popular choices in a professional post environment. I loved the editor Vegas for a long time; its audio features were second-to-none. Edius also had its place. But in 2019, it’s hard for me to recommend anything more strongly than Resolve.

Final Note

Let me reiterate: all the NLE options listed above are very capable editors. Especially as you begin, the software will not be the limitation. Go create something.