Entries in Image Nerdery (53)

Monday
Jun 4, 2012

Fast Ray Tracers

Thanks to @quarterlight for the link to this demo of Clarisse, an interactive ray-traced shading and rendering environment that uses the progressive refinement method I mention in my previous post. From the FAQ:

Clarisse is CPU-based. A basic graphics card supporting OpenGL 2.0 is more than enough.

And:

Clarisse runs perfectly fine on lower-end hardware such as laptops and remains perfectly smooth and interactive. The “What is Clarisse?” video was recorded on a dual-core laptop dating from 2009.

This is what I’m talking about when I say that the ray-traced 3D performance in After Effects CS6 could be better optimized for the CPU.

Still not convinced? Then I guess you didn’t click the progressive refinement link in that last post. It goes to a live, interactive demo of a ray-traced scene running at very high frame rates. The real-time rendering, which includes secondary light bounces (AKA radiosity), is done in your browser.
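If “progressive refinement” sounds abstract, the core idea fits in a few lines: draw a fast, noisy estimate of the frame right away, then keep averaging in more samples so the image cleans up while you keep working. Here’s a toy Python sketch of just the accumulation loop (obviously not Clarisse’s code; render_one_pass is a stand-in for a one-sample-per-pixel ray trace):

```python
# Progressive refinement in a nutshell: show a cheap, noisy estimate of the
# frame immediately, then keep averaging in more samples so the image refines
# while you keep working. Toy sketch only; render_one_pass() stands in for a
# 1-sample-per-pixel ray trace and just returns a known "scene" plus noise.

import numpy as np

WIDTH, HEIGHT = 320, 180
rng = np.random.default_rng(0)

# A stand-in "ground truth" image (a simple gradient) so we can watch the
# running average converge toward it.
yy, xx = np.meshgrid(np.linspace(0, 1, HEIGHT),
                     np.linspace(0, 1, WIDTH), indexing="ij")
scene = 0.5 * xx + 0.5 * yy

def render_one_pass():
    """One noisy 'render pass': the scene plus Monte Carlo-style sample noise."""
    return scene + rng.normal(0.0, 0.3, scene.shape)

accum = np.zeros_like(scene)
for n in range(1, 65):
    accum += render_one_pass()   # add one more sample per pixel
    estimate = accum / n         # running average = the image you display
    if n in (1, 4, 16, 64):
        rmse = np.sqrt(np.mean((estimate - scene) ** 2))
        print(f"pass {n:2d}: RMS noise {rmse:.3f}")  # noise falls ~ 1/sqrt(n)
```

The point is that the very first pass is already a usable picture, and every pass after that only makes it better. That’s what keeps the interaction smooth even on a laptop CPU.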

Thursday
Oct 13, 2011

Real Men Comp With Film

Be careful having dinner with Mike Seymour.

He was sharing with me his nerd-joy over being able to interview Jon Alexander at ILM about the history of optical compositing. I offhandedly mentioned that I had once, out of pure lifelong curiosity, re-created the optical bluescreen extraction workflow using After Effects.

Oops. The next day Mike was in my office with a camera. Watch this whole video. My bit is nerdelicious, Jon’s is wonderfully insightful and grounding, and it all adds up to a great taste of what fxphd members get to enjoy heaping tons of.

Read the companion article here.

Friday
Aug 19, 2011

Realistic Lens Flares

When technology and art intersect, it’s often the case that those who know how have no idea why, and those who know why have no idea how.

I can’t stop thinking about this SIGGRAPH video. It shows realistic lens flares being computed in real time using a ray tracing technique. There are some lens artifacts shown here of a complexity and beauty that I’ve never seen faked convincingly before.

Lens flare is caused by light passing through a photographic lens system in an unintended way. Often considered a degrading artifact, it has become a crucial component for realistic imagery and an artistic means that can even lead to an increased perceived brightness.

Jesus nerds. “Increased perceived brightness?” That was the best sales pitch you could give on why being able to synthesize realistic lens flares is worthwhile?

Lens flares are awesome because they are fricking crazy. They are completely unreal. They increase the veil of unreality between the audience and the movie. They are beautiful. They are tiny imperfections magnified by orders of magnitude. They are aliens. And scary buildings. We give them sound effects and music cues. They make movies bigger than life because they have nothing to do with life.

And I want yours.

In the Vimeo comments the poster said:

Anamorphic optics are currently not supported, but this is not a principal limitation of the rendering scheme.

If they had put anamorphic examples in this video I think I’d be standing on their lawn with a boom box right now.

Physically-Based Real-Time Lens Flare Rendering — Hullin, Eisemann, Seidel, Lee

Monday
Mar 8, 2010

Converting 30p to 24p

As the long-awaited 24p firmware update for the Canon 5D Mark II draws near, I joined Mike Seymour on episode 57 of the Red Centre podcast to talk about how excited I am that it marks the end of painful workarounds for the 5D’s no-man’s-land frame rate of 30.0 frames per second.

For as long as I’ve had my 5D Mark II, I’ve avoided using it for any projects that I could not shoot 30-for-24, i.e. slowing down the footage to 23.976 fps, using every frame. My 5D has been a gentle overcrank-only camera. There are plenty of occasions to shoot 30 frames for 24-frame playback—we do it all the time in commercials to give things a little “float,” or to “take the edge off” some motion. I still do this often with my 7D. Whatever frame rate I shoot—24, 30, 50, or 60—I play it back at 24. Just like film.
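For the arithmetic-minded: nothing is interpolated in a 30-for-24 conform. You simply reinterpret the 30.0 fps frames at 23.976 fps, so every frame is kept and the clip plays back about 25% slower. Here’s the math as a quick Python scribble (my numbers, not any particular tool’s):

```python
# Shooting 30-for-24: reinterpret 30.0 fps footage at 23.976 fps, keeping
# every frame. Nothing is synthesized; the clip simply plays back slower.

shot_fps = 30.0
conform_fps = 24000 / 1001          # 23.976... fps

slowdown = shot_fps / conform_fps   # ~1.2513, a gentle ~25% overcrank
print(f"slowdown factor: {slowdown:.4f}")

# A 10-second take becomes:
shot_seconds = 10.0
frames = shot_seconds * shot_fps
print(f"{frames:.0f} frames play back as {frames / conform_fps:.2f} s at 23.976 fps")
```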

Folks ask me about 30p conversions often. Twixtor from RE:Vision Effects is a popular tool for this, as is Apple’s Compressor. Adobe After Effects has The Foundry’s well-regarded Kronos retiming technology built-in. All of these solutions are variations on optical flow algorithms, which track areas within the frame, try to identify segments of the image that are traveling discretely (you and I would call these “objects”), and interpolate new frames based on estimating the motion that happened between the existing ones.
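To make that concrete, here is a bare-bones sketch of motion-compensated interpolation using OpenCV’s Farneback dense optical flow. To be clear, this is not what Kronos or Twixtor actually do internally; there’s no occlusion handling, no segmentation, no blending of forward and backward warps. It just shows the basic move: estimate per-pixel motion between two frames, then warp partway along it to manufacture a frame that was never shot. The filenames are placeholders.

```python
# Bare-bones motion-compensated interpolation: estimate dense optical flow
# from frame A to frame B, then warp A partway along that flow to synthesize
# an in-between frame. A toy illustration only; real retimers add occlusion
# handling, bidirectional warping, and lots of cleverness on top of this.

import cv2
import numpy as np

def interpolate(frame_a, frame_b, t=0.5):
    """Synthesize a frame at time t (0..1) between frame_a and frame_b."""
    gray_a = cv2.cvtColor(frame_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(frame_b, cv2.COLOR_BGR2GRAY)

    # Dense per-pixel motion estimate from A to B (Farneback method).
    # Positional args: prev, next, flow, pyr_scale, levels, winsize,
    # iterations, poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 4, 21, 3, 7, 1.5, 0)

    # Backward warp: for each output pixel, look back along a fraction t of
    # the flow vector and sample frame A there.
    h, w = gray_a.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x - t * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - t * flow[..., 1]).astype(np.float32)
    return cv2.remap(frame_a, map_x, map_y, cv2.INTER_LINEAR)

# Hypothetical usage: make an in-between from two consecutive 30p frames.
a = cv2.imread("frame_0001.png")    # placeholder filenames
b = cv2.imread("frame_0002.png")
mid = interpolate(a, b, t=0.5)
cv2.imwrite("frame_mid.png", mid)
```

Even in this crude form you can see where the approach shines (smooth, well-tracked regions) and where it falls apart (anything that moves too far, or hides behind something else, between frames).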

This sounds impressive, and it is. Both The Foundry and RE:Vision Effects deservedly won Technical Achievement Academy Awards for their efforts in this area in 2007. And yet, as Mike and I discuss, this science is imperfect.

In August of 2009 I wrote:

I’m not saying that you won’t occasionally see results from 30-to-24p conversions that look good. The technology is amazing. But while it can work often, it will fail often. And that’s not a workflow. It’s finger-crossing.

On a more subtle note, I don’t think it’s acceptable that every frame of a film should be a computer’s best guess. The magic of filmmaking comes in part from capturing and revealing a narrow, selective slice of something resonant that happened in front of the lens. When you use these motion-interpolated frame rate conversions, you invite a clever computer algorithm to replace your artfully crafted sliver of reality with a best-guess. This artificiality accumulates to create a feeling of unphotographic plasticness.

Of course, it’s often much worse than a subtle sense that something’s not right. Quite often, stuff happens in between frames that no algorithm could ever guess. Here’s a sequence of consecutive 30p frames:

Nothing fancy, just a guy running up some stairs. But his hand is moving fast enough that it looks quite different from one frame to the next.

Here’s that same motion, converted to 24p using The Foundry’s Kronos:

Blech.

Again, don’t get me wrong—these technologies are great, and can be extremely useful (seriously, how amazing is it that the rest of the frame looks as good as it does?). But they work best with a lot of hand-holding and artistry, rather than as unattended conversion processes.

(And they can take their sweet time to render too.)

I’m so glad we’re getting the real thing.