Continuing the ongoing series of interviews with creative artists working on various aspects of movie and TV productions, it is my pleasure to welcome Chris A. Peterson. In this interview he talks about the transition of the industry from film to digital, the role of an editor in bringing stories to our screens, and what stays with him after productions wrap up. Around these topics and more, Chris dives deep into his work on “American Crime Story: Impeachment”.

Kirill: Please tell us about yourself and your path to where you are today.

Chris: My dad was a huge movie buff. He was a big Stanley Kubrick fan, so I saw every Kubrick film at least ten times as a kid, including “A Clockwork Orange”, where he fast-forwarded through the really brutal parts. I gained my love of film through Kubrick and Spielberg.

When I was a senior in high school, I took a film and video class as an elective. Every class the teacher would show a movie, and I thought most of them were great, but it wasn’t until we watched “The Graduate” and saw the scale of artistry in that film that I was totally hooked. I said, this is what I want to do. At that point I had already applied to colleges, and I changed my major right then to film.

During that same film and video class in high school, we had to put a couple of short films together. We had to shoot them, act in them, do everything – and I loved the editing process the most. I thought the production part of it felt chaotic, but I loved the amount of control that I had in the editing. It’s a quiet room and I felt like I could think more. And later on in college, as we were doing all these different parts – directing, shooting, writing, acting – I always felt like I came back to editing. It was a place I felt comfortable in, a place where my creative strengths were.

Kirill: How different was it back then when you were starting out if you look at the tools at your disposal from then to now?

Chris: I was filming on videotape in high school in the early ’90s, and then editing on videotape as well. It started the same way in college, and towards the end of my undergraduate years we got into shooting 16mm film and cutting on flatbeds. My college was not well funded in that department, so we didn’t have any computer editing systems.

It wasn’t until I graduated that I had my first experience working on a computer-based non-linear editing system. My first job after graduating college was on a documentary. When the director asked if I had ever used a Media 100 (an old non-linear computer editing system), I completely lied [laughs] and said I knew how to use it. So he left me there with the computer, and I found the how-to book under the desk.

Kirill: So that was before the age of YouTube?

Chris: That was in the mid-’90s — way before YouTube. Flipping through the book I learned how to edit on a computer in about 30 minutes, mostly because I was scared of losing my job [laughs].

I liked editing tape to tape and cutting on a flatbed, but once I got to cut on a computer, it completely changed the world for me.

Kirill: Do you think there’s any advantage for people who came up during that time of transition from film to digital, or is holding on to those memories more of a nostalgic notion of how it felt to hold a strip of film in your hands?

Chris: For me, having worked on a 16mm film flatbed was transformative, and I think it still has an effect on how I edit to this day.

A lot of times, a cut you were doing on a flatbed would take at least 10 minutes, and if you were cutting multiple tracks of audio, it could get much more complicated. So every time you were going to do a cut, you really had to think about what you were doing. Every time you look at a 16mm film strip, you see that every shot is made up of a bunch of individual frames, and that frames have value.

In non-linear editing these days you can do and undo everything in seconds. You gain speed, but one of the issues is sometimes forgetting how important each frame is — two frames this way or two frames that way, what difference does it make? It matters.

If I were taking a film editing class, I would make everyone start out on a flatbed, just to understand the importance of a frame.


Aurora 1.0.0

December 7th, 2021

It started last September as a proof of concept of porting some of the Radiance ideas to the reactive programming model, and more specifically, into the Compose world. A couple of months later, what started as Mosaic was renamed to Aurora. I originally planned to be done by spring of this year, but things like these take a bit more time.

And now it’s here, the very first release of Aurora – a library for building modern, elegant and fast Compose Desktop applications. If you’re familiar with Radiance, you will see some of the same concepts in Aurora, especially around theming and projections. And if not, I’d love to hear your thoughts on what Aurora has to offer.

Radiance has been around for more than 15 years now, and I’m in it for the long haul for Aurora as well. I’d love for you to take this first Aurora release (code-named Arctic) for a spin. Stay frosty for more features coming to Aurora in 2022!

Here are a couple of Gradle snippets that you might find useful. Both Radiance and Aurora are multi-module projects, and both have the SVG transcoder that converts SVG images into Java / Kotlin classes that contain pure canvas commands, without any additional runtime dependencies on libraries such as Batik. In order to run the transcoder locally, you need to point its classpath at all of these dependencies, and this is what this post is about.

To get all multi-module dependencies into a single local folder, I use a build.gradle snippet that goes along these lines:


task getFlatDependencies(type: Copy) {
    // Copy every resolved dependency jar into a single flat folder
    into 'build/libs'
    from {
        // Only look at leaf modules, and grab the compile classpath of each one
        subprojects.findAll { it.getSubprojects().isEmpty() }.
                collect { it.configurations.compileClasspath }
    }
}

Note the usage of findAll { it.getSubprojects().isEmpty() } that filters out folders that are not modules themselves, but rather contain other modules.
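For context, here is a minimal sketch of the kind of module layout that makes this filter necessary. The module names below are made up for illustration and are not the actual Radiance / Aurora modules (shown as a settings.gradle.kts):

// settings.gradle.kts - an illustrative layout, not the actual project structure
include(":component")              // a leaf module with its own sources and dependencies
include(":theming")                // another leaf module
include(":tools:svg-transcoder")   // ":tools" only aggregates other modules; the leaf is the transcoder

In a layout like this, subprojects also includes the intermediate :tools project. Since that project has subprojects of its own, the filter drops it, and only the leaf modules contribute their compile classpaths to the flat build/libs folder.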

Now, what about the Gradle Kotlin DSL? It took me a bit to find the right way to do it, as there are slight behavior differences in how classpaths are accessed for different configurations. Here is what I ended up with in build.gradle.kts:


tasks.register("getFlatDependencies") {
    subprojects {
        // Grab the desktop runtime classpath configuration of each module that has one
        val runtimeClasspath =
            project.configurations.matching { it.name == "desktopRuntimeClasspath" }
        runtimeClasspath.all {
            // Copy every resolved jar of that configuration into a single flat folder
            for (dep in map { file: File -> file.absoluteFile }) {
                project.copy {
                    from(dep)
                    into("${rootProject.projectDir}/build/libs")
                }
            }
        }
    }
}
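And to close the loop on pointing the transcoder’s classpath at these jars, here is a minimal sketch of wiring the flattened folder into a locally run tool. This is illustrative only: runTranscoderLocally is a made-up task name, and com.example.TranscoderMain is a placeholder rather than an actual Radiance or Aurora entry point.

// build.gradle.kts - illustrative sketch; com.example.TranscoderMain is a placeholder main class
tasks.register<JavaExec>("runTranscoderLocally") {
    dependsOn("getFlatDependencies")
    // Put every jar that getFlatDependencies copied into the flat folder on the classpath
    classpath = fileTree("${rootProject.projectDir}/build/libs") { include("*.jar") }
    mainClass.set("com.example.TranscoderMain")
    // Plus whatever arguments the tool expects for its input and output locations
}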

Radiance 5.0.0

October 29th, 2021

As detailed in the previous post, Radiance 5.0.0 is out now. It marks a fresh milestone for the Swing libraries I’ve been working on since 2004, and closes the first major chapter in the journey to unify and streamline the way Swing developers can integrate my libraries into their projects.

This document details the changes made to modules, packages and individual API classes.

The next major undertaking for Radiance is to move towards direct rendering with no offscreen bitmaps. That might take a bit longer than the usual 6-month cycle. Keep an eye on this tracker for the progress.