Visual effects are post-production effects. It may seem like semantics, but it’s important to distinguish this category from special effects, which happen during production, “in-camera.” Something like an explosion could be (and often is) a mix of a special effect applied on set via a real explosion and an enhancement of that explosion via a computer-generated smoke simulation.
The world of visual effects has changed dramatically over time. The number of visual effects shots in an average feature film is far higher than most people guess, and most of those shots are designed to be invisible to the audience.
Because of the increasing demand for VFX work and the falling barrier to entry (in both cost and skill), it’s worth adding some basic visual effects to your skill set. In an upcoming section we’ll look at what sorts of VFX every editor should know, but for now let’s look at a few categories and definitions.
Often when we think of visual effects we envision the addition of something to the footage that was shot. This can be additional “live action” footage or completely computer-generated content. The term we use for layering these elements is “compositing”.
Much of the VFX work of the past relied on practical in-camera effects. When compositing was an optical process on photochemical film, it was easier to get things right in camera. Nowadays, however, crafts like matte painting and miniature work have changed dramatically, with most of the work being done digitally. There’s also been a huge increase in the amount of 3D computer-generated work appearing on screen. In today’s world you are well served to know at least some 3D, and I’d consider it an essential part of any VFX artist’s arsenal.
I’m going to briefly discuss some of these tools just to give you an idea of what’s out there. “Which software should I use?” is often a first question for beginners, and to some degree I include this brief overview to explain the preference for the free software Fusion and Blender espoused in most of this training.
This one gets first mention because it’s the one I’ll get the most questions about. Most of the VFX work I’ve done in my life has been in After Effects. It’s an excellent compositing and motion graphics tool.
After Effects makes getting elements in place, stacked in the correct layer order and timed correctly, easier than any other software. I think it’s easily the top dog among motion graphics applications. That said, its layer-based approach isn’t as well suited to feature-film compositing work.
After Effects has no true 3D space, and this is a wall I’d run into constantly. When you want your particle system to interact with your car model, there’s just not a clean way to do it. Also, using one asset as a matte, or combining it with other mattes during matte creation, is very simple in a node-based compositor and very cumbersome in a layer-based one like After Effects.
All in all, I’ve seen skilled After Effects users do amazing compositing work and skilled Fusion users do incredible motion graphics. So as always, there are no rules. You’d be best served learning the one you have access to, and learning it well.
Houdini has seen rapid growth over the past few years. It has excellent simulation and effects tools and is the king of “procedural” workflows. It’s fantastic software, and it’s priced accordingly.
Maya has long been an “industry standard.” It’s compatible with decades-old pipelines and customizable enough for a lot of studio work. It’s also a big piece of software, and even harder than Blender to get into.
Cinema 4D integrates with After Effects, using AE as its compositor, and it’s easier to learn. It’s pricey for the full feature set, though; by the time you make it anywhere near comparable to Blender, you’re auctioning off the family farm.
DaVinci Resolve
- The available Resolve FX differ between the free and Studio versions
- A node’s transform offset works as a basic clone stamp
- A colorist’s time can be cheaper than a VFX artist’s
Fusion
- Node-based
- Easy-to-share comps
- The best NLE integration (it’s built into Resolve)
- Educational resources are difficult to find
Nuke
- Node-based
- The industry standard for feature-film compositing
- Pricey
Blender
- Open source
- Free
- Very complete toolset
- Steeper learning curve
- Includes the Eevee real-time render engine