Tools

Slugline. Simple, elegant screenwriting.

Red Giant Color Suite, with Magic Bullet Looks 2.5 and Colorista II

Needables
  • Sony Alpha a7S Compact Interchangeable Lens Digital Camera (Sony)
  • Panasonic LUMIX DMC-GH4KBODY 16.05MP Digital Single Lens Mirrorless Camera with 4K Cinematic Video (Body Only) (Panasonic)
  • TASCAM DR-100mkII 2-Channel Portable Digital Recorder (TASCAM)
  • The DV Rebel's Guide: An All-Digital Approach to Making Killer Action Movies on the Cheap (Peachpit), by Stu Maschwitz

Entries in Visual Effects (84)

Wednesday
Nov 26, 2008

Adobe Can Has Science

Dan Goldman is an old friend of mine from ILM. He now works for Adobe's top-secret God Dammit Put This In A Product Now division.


Interactive Video Object Manipulation from Dan Goldman on Vimeo.

It's no Cartoon effect, but it's got promise.

Monday
Oct 27, 2008

Scarlet Specs Countdown, OpenCut 3

RED will release the revised specs for Scarlet and Epic on November 13th.

OpenCut 3 is a VFX challenge: create and composite backgrounds for a scene shot against a vaguely greenish screen. It doesn't seem quite the creative win-win of previous OpenCut contests (although you can win Adobe Creative Suite 4: Master Collection and one week's free rental of a RED One), but I encourage you to enter, if only to learn why I recommend in The Guide that you stay away from virtual backgrounds if at all possible!


OpenCut 3 - Sensored REDVFX Challenge from Silverado on Vimeo.

Wednesday
Oct 1, 2008

What Should Adobe Do With Premiere Pro?


Merge it with After Effects.

Why would I want to see one of my favorite applications bloat to encompass a program about which I am, at best, ambivalent? The answer lies with the post-production ruler of the roost, Autodesk’s “Advanced Systems,” AKA Flame, Smoke and Inferno.

“Do it in Flame” is the battle cry of producers everywhere who want to get creative work done quickly and well, and leave with a finished product in their hand. “The Flame,” a term which colloquially encompasses Flame, Inferno, and to an increasing degree Smoke, has a reputation for interactive feedback, realtime operation, high quality, and seamless integration.

The interactive feedback comes from a combination of a dedicated disk array and hardware-accelerated compositing operations. “High quality” means up to 12 bits-per-channel color. Integration means that Flame can be used for onlining and mastering, interoperating with NLEs and other systems in the Autodesk family such as Smoke.

Flame has a great toolset. Many of its features are unique, or best-in-class. But many more of them are rather mundane and even behind the times. Still more of Flame’s features are simply odd, having developed within their own ecosystem instead of cross-pollinating with common entry points to digital compositing such as Photoshop and Shake. While a bit convoluted, Flame is a unique and accomplished tool that deserves its stellar reputation.

But some of the mystique that surrounds Flame is more perception than reality. Yes, you can play things back in real time—after you render them, a process which can sometimes be hardware accelerated, sometimes not, especially at high bit-depths. And those high bit-depths are not as high as the 16-bit integer color common to most inexpensive desktop solutions, such as After Effects and Fusion. Flame notoriously lacks pervasive floating-point color support, a feature ubiquitous at every level of compositing from Apple’s Motion to The Foundry’s Nuke. As integrated as a Flame might be into a facility’s post pipeline, that often simply means that it depends greatly on supporting players such as Avid, Smoke, or the kid in the back room with Photoshop, in order to appear as a complete solution to the client.

The other reason for Flame’s excellent reputation is that its high cost commands an equally high-value operator. The biggest perceived value of the Flame is its speed and interactivity, but these qualities rest more on the artist’s shoulders than on the system itself. Flame and Smoke do need to render. Their many hardware-accelerated features do not often add up to “real time.” Flame artists learn the same kind of client-management skills that colorists and offline editors master: They read the vibe in the room, tweak things when the attention is on them and disguise the subsequent rendering (which Autodesk brilliantly rebadges as “processing”) with well-timed chit-chat. “Where are you guys going to dinner tonight?” “Anyone want these Lakers tickets? I can’t use them. OK, here’s that new version…”

And once you render, you’ve got a new asset. You must decide to either build on it destructively, which is tempting because of the speed but possibly disastrous if the client asks to dig back several layers for a change, or work in “batch,” where you have the same kind of issues that every After Effects user faces, in that each tweak prompts a complete recalculation from square one. But for that price you maintain infinite flexibility. And incur the need for a render farm, which heavily-Flame-dependent shops do maintain, at great cost.

The inarguably great thing about Flame is that it is a responsive environment in which to be creative. While some VFX tasks require relentless hammering on a known set of assets (the domain of Shake and Nuke), some require more purely creative problem-solving. What should this effect look like? We need to add pizzazz! We really need something cool here and we didn’t shoot anything—what do you have lying around? Flame is a great tool to build, create, design, and experiment.

Does that sound familiar? It’s exactly what I love about After Effects.

After Effects is also a great creative environment. It’s got all kinds of fun tools and ways of making something out of nothing. It’s intuitive, especially for simple things, and plays nice with Photoshop and Illustrator. It wobbles under the pressure of high-end compositing, but it goes toe-to-toe with Flame for creative problem-solving.

What it lacks is Flame’s trademark interactivity. Because it can’t count on a specific graphics card, After Effects does very little with the GPU. Even on a system capable of realtime playback of, say, a 2K DPX sequence, After Effects requires you to RAM preview, and those RAM previews can be slow to build and fragile to view.

I recently visited an Orphanage artist who had temped together some effects shots for a feature film using After Effects. He had all the shots in one project, each shot a precomp conformed into one master comp that matched the cut from editorial. This is a very common thing to do in After Effects, something I do all the time: using the layered comp as an editorial timeline. It works, but it’s a hack. And in this case, it was a rather fruitless hack, as the guy’s machine didn’t have enough memory to hold a complete RAM preview of that lengthy comp. The artist had the right idea, to create a context in which to constantly check his animation, but it just didn’t work.

All day long he’s working in the individual shot precomps, at whatever resolution and quality settings he finds efficient. And every once in a while he wants to view his work in context. For that, he could be working compressed. He hardly needs uncompressed RGB for a quick little review of the cut. Instead, he tweaks one shot and returns to his timeline to find that he must re-render ten shots that he hasn’t touched. He might want to play with things editorially. For that as well, a true editorial timeline would be preferable to a stack of layers that aren’t really stacked.

That conform timeline should be NLE-style, with an implicit, user-selectable codec. The rendered previews should be cached to disk rather than held in RAM. In other words, that timeline should be Premiere Pro’s.

Adobe would suggest that you can do this now using Dynamic Link. You can indeed have a live connection between After Effects and Premiere. It’s a cool idea, but in practice it doesn’t work. It breaks when you do common things like iterate a project name. You wind up re-rendering in Premiere a lot, and when you do, you trigger expensive After Effects re-renders. Dynamic Link is an implementation of an obvious idea, not a born-of-need workflow enabler. You will only ever see it demoed with the very simplest of After Effects compositions, because with anything substantial, it becomes unbearable. As I’ve mentioned before, the only reason to use After Effects instead of cheaper, simpler options is to do something substantial.

We need seamless integration between the uncompressed, limitless-but-slow world of After Effects and the realtime environment of Premiere.

Rather than RAM preview an After Effects comp, you should process it to a clip, in the codec of your choosing. Every time you RAM preview you can either write over that clip or create a new one. A clip maintains a connection with the After Effects composition that created it, and knows if it is no longer current. It can display a little warning icon if changes have been made in the comp that have not yet been rendered.
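To make that concrete, here’s a rough sketch of the bookkeeping such a clip would need. Everything in it is hypothetical: the names, the idea of fingerprinting a comp’s serialized state to detect changes, and the codec field are my illustration of the concept, not anything Adobe actually ships.

```python
import hashlib
import time
from dataclasses import dataclass, field


def comp_fingerprint(comp_state: bytes) -> str:
    """Hash of the comp's serialized state; any edit changes the fingerprint."""
    return hashlib.sha1(comp_state).hexdigest()


@dataclass
class RenderedClip:
    """A disk-backed preview clip that remembers which comp state produced it."""
    comp_name: str
    codec: str                # e.g. "DV", "ProRes", "10-bit DPX"
    media_path: str           # where the rendered media lives on disk
    source_fingerprint: str   # fingerprint of the comp at render time
    rendered_at: float = field(default_factory=time.time)

    def is_stale(self, current_fingerprint: str) -> bool:
        """True if the comp has changed since this clip was rendered."""
        return current_fingerprint != self.source_fingerprint


# After a render, record the fingerprint; later, compare it to decide whether
# the timeline should show that little "needs re-render" warning icon.
state = b"...serialized comp settings and layer data..."
clip = RenderedClip("Shot_010", "ProRes",
                    "/previews/Shot_010_v3.mov", comp_fingerprint(state))

state = b"...the same comp after a tweak..."
if clip.is_stale(comp_fingerprint(state)):
    print(f"{clip.comp_name}: preview is out of date")
```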

Those clips are real media that you can use, re-use, edit with, and play back in realtime months later.

And they can be nested into a complex editorial timeline, where today they live as DV temps and tomorrow they get bumped up to uncompressed HD finals.

Because that codec of your choice could be anything from DV to uncompressed HD to 10-bit DPX.

The Premiere side of the equation would naturally adopt the complex but powerful color management features of After Effects. You’d be able to edit uncompressed DPX in a Premiere timeline and preview with a Kodak Vision 2383 film stock emulation. Heck, preview it with 2393, more commonly known as Kodak Vision Premiere!
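For the curious: print-stock emulation in a preview path mostly amounts to pushing frames through a display LUT on their way to the monitor. Here’s a toy sketch of that idea; the per-channel curves below are made up as a stand-in, not real Vision 2383 data, and the function names are mine.

```python
import numpy as np


def apply_display_lut(frame: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Map a preview frame through a 1D display LUT, per channel.

    frame: float32 image with values in [0, 1], shape (h, w, 3)
    lut:   shape (n, 3), one output value per LUT entry per channel
    """
    n = lut.shape[0]
    idx = np.clip((frame * (n - 1)).astype(int), 0, n - 1)
    out = np.empty_like(frame)
    for c in range(3):
        out[..., c] = lut[idx[..., c], c]
    return out


# Stand-in LUT: three slightly different per-channel gamma curves.
# A real print-stock emulation LUT would come from measured film data.
steps = np.linspace(0.0, 1.0, 64)
toy_print_lut = np.stack([steps ** 0.9, steps, steps ** 1.1], axis=1)

frame = np.random.rand(1080, 1920, 3).astype(np.float32)
preview = apply_display_lut(frame, toy_print_lut.astype(np.float32))
```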

This arrangement would let you work the way a Flame artist does, constantly able to weigh the benefits of power and speed. Just need to retouch a frame? Go ahead and paint on your last render. Need to dig back to the very bottom layer? Do so knowing that you can re-render everything above it and get the same results as before, plus the fix. And know that every time you process a clip and update it in your timeline, you’re building a realtime HD program that you can blast out to a client monitor, or a deck, or to Adobe Media Encoder. Combine this with speculative background rendering (a la Nucleo Pro), and you have a very powerful system. And this system would run just as well on a laptop as on an eight-core Mac Pro—the only difference would be the codec you’d choose.
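Speculative background rendering, for its part, is conceptually just an idle-time worker that watches for stale preview clips and rebuilds them before you ask. A minimal sketch, assuming the stale-clip tracking from the earlier snippet; the queue, the idle check, and the render callback are all invented for illustration.

```python
import queue
import threading
import time


def speculative_renderer(stale_clips, render, user_is_idle):
    """Re-render out-of-date preview clips whenever the artist isn't interacting."""
    while True:
        comp_name = stale_clips.get()    # blocks until a stale clip shows up
        if comp_name is None:            # sentinel: shut the worker down
            break
        if user_is_idle():
            render(comp_name)            # rebuild the disk-cached preview clip
        else:
            stale_clips.put(comp_name)   # defer it and back off briefly
            time.sleep(0.5)


# Wiring it up (every name here is hypothetical):
stale = queue.Queue()
worker = threading.Thread(
    target=speculative_renderer,
    args=(stale, lambda c: print(f"background-rendering {c}"), lambda: True),
    daemon=True,
)
worker.start()
stale.put("Shot_010")
stale.put(None)   # tell the worker to stop
worker.join()
```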

Now maybe all of this doesn’t require the After Effects dogs and the Premiere cats to live together. Maybe Adobe doesn’t need to make the two programs into one. But there’s not a single After Effects user who couldn’t benefit from a realtime editorial timeline. Nor is there a single Premiere user who hasn’t wished for a feature that already exists in After Effects. In other words, whether After Effects and Premiere literally merge into one product SKU is not important—what is important is that, to someone with both applications installed, the option exists to use them together in such a way that the distinction between them blurs away completely.

If only there was some kind of application that existed specifically to “bridge” the Adobe Creative Suite apps together… Hey, what’s this thing taking up 400MB on my hard drive?


Maybe it’s through Bridge, maybe something else, but what we need is an Über Project that can contain both Premiere and After Effects assets and show how they are connected. I imagine it looking much like Flame’s “Desktop,” which is a sort of media home-base in Flame that serves as everything from a clip browser to a reference library to a convenient place to scratch your chin and ponder your next move. You wouldn’t be the only ones looking at this. Have you seen Eyeon’s Generation? Have you seen how much they’re charging for it?


However you do it, Adobe, you gotta do it. Sure, it will be hard, but only for reasons that would be wrong to base important decisions on. If you do it and do it right, you put a Flame in the lap of every one of your Production Premium customers. You can stop wasting precious development efforts (so much more precious now that your development cycles are shorter) duplicating functionality between your two flagship moving media applications. You’ll force Autodesk off their laurels and get them innovating and competing again. You’ll finally make it possible to host a client-supervised compositing/finishing session with Adobe software. You will put one and one together and the result will be three.

Visit the ProLost Amazon Store for After Effects CS4 or Adobe CS4 Production Premium and support ProLost.

UPDATE 2: Agree? Disagree? Visit the Adobe Feature Request form and make your voice heard. I hear a lot of people say that “Adobe doesn’t listen.” Well they can’t hear you if you don’t step up to the mic.

UPDATE: Some of the lovely reactions to this post on Twitter:

Friday
Apr 11, 2008

The Orphanage Brings VFX And DI Processes Together With Film Master

Straight-up press release for now, more thoughts and analysis of how this relates to prior musings later.

Top visual effects studio The Orphanage has purchased a Film Master finishing system to extend its creative and workflow management services to include DI and colour grading.

With Film Master, The Orphanage is able to harness artistic talent more efficiently and eliminate re-work across the VFX and DI processes for both film and commercial work.

The upcoming Frank Miller film The Spirit, shot with the Panavision Genesis camera, is the studio’s first project that leverages the Film Master. The Orphanage is the lead VFX studio on the film, managing the creation of effects by a number of facilities, and will also perform the final DI/colour grade. On the VFX side, the Film Master is the central repository where all of the VFX work comes together. The system is located in The Orphanage’s San Francisco studio, where it is implemented on the studio’s SAN and has access to all VFX assets for real-time playback and manipulation.

Stu Maschwitz, Co-founder, VFX Supervisor and Director at The Orphanage, said, “If a vendor gives me a shot that’s 99% there, rather than send it back and ask them to perfect the color, I can do that work immediately and potentially have a final shot instead of needing another iteration. Because that shot can then be used in the DI as is, since we’re using Film Master for both processes, it also solves that huge frustration of having VFX artists slave over the look of a shot only to have it re-created again in DI.”

With its comprehensive grading and conforming toolset and modular control panels, Film Master fit The Orphanage’s need for a full, high-end system. Maschwitz said, “We evaluated a number of systems looking for the best combination of a software system with a hardware interface, and an approach that was familiar and comfortable for high-end colorists. Every time I looked under the hood of Film Master I liked what I saw. The way the tools work, the order they’re presented in, how they interact with the panels – it’s all incredibly well thought out.”

Simon Cuff, Digital Vision President and COO, said, “The Orphanage’s implementation of Film Master pulls together two highly creative processes—VFX creation and DI—which are typically serial, and combines them to enhance the creative elements and increase final quality. Film Master offers the most complete set of conform, grading, finishing and image enhancement tools that make it an ideal platform for the converging VFX/DI workflow.”
