The art and craft of screen graphics – interview with Todd A. Marks

February 13th, 2018

Continuing the ongoing series of interviews on fantasy user interfaces, it’s an honour to welcome Todd A. Marks. Over the last 25 years Todd has been deeply involved in all aspects of screen graphics for feature film, from design to playback, weathering wave after wave of changes in the underlying screen and projection technologies. To sample just a part of his work would be to mention movies such as “Solaris”, “Deep Impact”, “Abduction”, “Spectral”, “Date Night”, “Anchorman 2” and “The Internship” (as well as the upcoming “Venom”).

As we talk about the intricate technical details of wrangling older CRT screens and syncing them with pre-digital cameras, setting up efficient and reliable playback systems on set, and finding the right solutions for productions of different shapes and sizes, Todd dives deep into two particular films. The first one is the seminal “The Net” that, almost 25 years later, manages to stay fresh and relevant as our lives become ever more entangled with digital tools, platforms and networks. The second is the recently released “Steve Jobs” for which the team had to go back to find the original hardware and software, and make it appear both authentic to the time and appealing to the more discerning modern audience.


Todd Marks in front of a bank of monitors during a McDonald’s commercial shoot. Courtesy of Todd Marks.

Kirill: Please tell us about yourself and the beginning of your career.

Todd: I’m from California. I grew up in Sunnyvale and Cupertino in Silicon Valley. I graduated from Homestead High School (’81) – which is the same high school that both Jobs and Wozniak had graduated from more than a decade earlier. During the summer between middle school and high school I took a computer course with Steve Headley, who was the computer teacher at Homestead. It was still the era of IBM punch cards that you ran through the reader.

My dad was a project manager on aerospace projects at Lockheed in the Valley. We were one of the very first families that I knew of that had a computer in the house. It was a Southwest Technical 6800 with about 8k of RAM. We initially used a black-and-white TV as a monitor, a tape cassette player from RadioShack to load BASIC, and a World War II teletype as our printer. My dad built a variety of Heathkit electronic projects during my childhood. He built both my brother’s and my 25″ color TVs from kits. We each had those in our rooms, which was pretty cool back then.

When I was younger, I thought I wanted to be an actor. I was into drama and television / film production. Between my junior and senior years of high school I took a film class at De Anza College in Cupertino, and I really enjoyed it. They liked what I was able to do, and they asked me to work there, which I did part time while I was a senior in high school. After I graduated from Homestead, I went to De Anza College full time, and continued working in the film department as well as their community access TV studio. I worked on several film and video projects during that time. I received a Film/TV Production degree from De Anza, and later went to Foothill College and got a Business degree.

Sidenote: The De Anza College film department was in back of Flint Center, up four flights of stairs. It just so happens that Flint Center is the place where most of the first act of the “Steve Jobs” movie takes place. During production, we were filming in some of the same rooms where I had learned film making 30 years before!

After working for a while on industrial and music videos, I moved to Los Angeles in 1988 to get a Marketing degree at CSUN, Cal State University, Northridge. My thinking was that the film industry is in LA, and that this is where you had to go if you were really serious about getting into the film business.

While at CSUN, I started a Mac hardware, software, sales and consulting business. That was back when we had the Mac SE/30 and similar models. I also joined a couple of professional film and TV organizations as a student member, allowing me to meet people who were working professionally. One of those people helped get me my first job when I graduated. I worked as a Production Assistant (PA) on Carol Burnett’s TV series “Carol and Company”. It was a very cool first post-college job.


Todd Marks setting up monitors for the additional photography work on “The Day the Earth Stood Still” [2008]. Courtesy of Todd Marks.

That job ended up having no direct bearing on what I’m doing now. My career path happened, in part, by being at the right place at the right time. I was at a book store with my then girlfriend (now wife), hanging out in the computer-books area, when some guy asked a Mac question. We both chimed in to answer him. Afterwards, I started talking with that other guy, and it turned out he was a big-time first assistant director for feature films. I got his contact info, and told my girlfriend, “this guy could really change my life.” So I made sure I stayed in touch with him, and then became friends with him (David Sosna).

At that time, around 1992, I was looking to get into film producing. But something got in the way of that goal…


Todd Marks running the LAX Control Tower graphics during the pilot for the “Scorpion” TV show [2014]. Courtesy of Todd Marks.

David introduced me to the director John Badham (David and John had worked on “WarGames” together). John was a Macintosh power user and a fan of Apple like I was. I ended up doing computer consulting work for John and his assistants in their office. It was a good way to get on the Warner Bros studio lot back then! Around that time, John and David were preparing to start production on “Point of No Return” with Bridget Fonda. Since there were some major scenes with computers in them, David asked me if I wanted to help, by assisting with the computer stuff that was going to be part of the movie. I of course jumped at the opportunity. That was my first big feature film, and I learned about this job that I never knew existed – the person in charge of making the computer screens for the movies.

The production had already arranged to have a computer and video playback engineer on set with his special computer and video gear. The custom gear syncs the camera, computers, and monitors to each other so that you can get rid of the “roll bar” and film clean images of the screens.

My job was to make sure that we had the right graphics and computers at the right time on the set, and I thought that it was a really cool job. It was a combination of many of my skills and things I liked to do – computer stuff, design with a technical aspect to it, working on set, and doing something that directly has an impact on the movie.

Sidenote: Luckily, I didn’t have to start as somebody’s assistant and then spend years and years working my way up. I was able to leapfrog that by doing this kind of work.

About a year later I got a call from John Badham’s office to come in and talk to them about their new movie, “Drop Zone”. They sent me a script with all this cool computer stuff in it. I did a breakdown, and created storyboards with an artist friend of mine. We sketched out all the computer scenes to give them an idea of how things would look. I went to the meeting with three copies of the storyboards, as I figured that I was going to meet with John and his assistant to discuss the details. When I arrive, I’m brought into John’s office and greeted by what seems like a wall of people. It turns out that I was meeting with John AND all the department heads! – I think it was about ten people. I was introduced to everyone, and then said, “Oh, we are going to need more copies of these storyboards.” [laughs]

I made the presentation, answering questions about how we’d do specific computer sequences and deal with other technical aspects of the story. That meeting went really well, and I was confident, even though I knew they had another playback coordinator to interview. A few days later, I went off to MacWorld, which was our annual pilgrimage to San Francisco. While I was there, I got paged by John’s office (we had pagers back then!). I found a pay phone at Moscone Center, called them, and they offered me the job.

It was very exciting. That was my first big job as the head of a department, as the computer and video playback supervisor (the title we now call it), and it was my first feature film on location. We filmed in Florida and in Los Angeles. It was both fun and challenging, and a great learning experience. It was the launch of my career.

After that I worked on “Species”, and then someone recommended me to director Irwin Winkler who was prepping to work on “The Net”.

I went in to meet with them, and discussed a variety of ideas and concepts. I got hired, and “The Net” became my next big project.


Director Irwin Winkler, along with Sandra Bullock and Todd Marks, on the floor of MacWorld in SF, one of the filming locations for “The Net” [1995]. Courtesy of Todd Marks.

Kirill: I think one of the interviews said that most of the technical parts of the script were written by you and your team.

Todd: It was a combination. We did a lot of brainstorming. The producer Rob Cowan was significantly more tech-savvy than the director, Irwin Winkler, who was in his mid-sixties at the time. We worked very closely with Rob to flesh things out and develop story concepts. I did the graphics work with Harold and Alex Mann, who have a technology and consulting company, Mann Consulting, in San Francisco. In preproduction, we were hired as technical consultants. We were helping to rewrite parts of the dialog to make more sense.

So we did add a lot to the story, and we came up with some of the ideas, pitching different concepts, some of which were incorporated. We also consulted with a couple of the big black hat guys of that time.

I believe that we came up with the idea for the pi symbol and the keystroke combination to get into the system. You wouldn’t easily figure out that you have to hold these three keys while clicking on that. It’s a bit of a stretch, but it was “plausible”.

We weren’t credited with script consulting per se, at least in the credits. But we did get paid a bit extra for doing that.

Sidenote: Years later, when they did “The Net” TV series, I was hired as a technical consultant. They would send me scripts, and I would go through and clean up storylines, and I’d comb through different tech articles, like from Wired, and I’d send those in to the writers to give them ideas.

Since I’ve always been more of a Mac guy, I typically leaned everything more towards the Mac (that was what I was comfortable with, and they were more reliable when working on set). For “The Net” we used some PCs, but mostly Macs. Pretty much the whole film hinged on the work that I and the people who worked with me were doing. Who knew that twenty-odd years later I’d still be doing interviews about this movie?


Screen graphics of “The Net” [1995].

Kirill: If you look at the scenes where she is ordering pizza or airplane tickets online, that didn’t exist at the time. You were essentially trying to extrapolate where things might be going. On something like this, is there a concern about how it might look twenty years after the movie is out, or do you look at a production as a product of its time?

Todd: It depends on when the film is supposed to take place. “The Net” was supposed to be a “present day” film. It was still the early days of being online; services like Prodigy and other early Internet companies like AOL were introducing new capabilities and features at a fairly rapid pace. Although the graphics in “The Net” look cartoony by today’s standards, they were forward-thinking and visually ahead of what was available.

We saw the computers and graphics as characters in the film; like an actor, they had a story to tell. My guys were trying to be very accurate. They did programming, and they knew the real tools. We wanted to present them in a visually interesting way that would tell the story quickly, and not be silly-looking. It’s a very fine line.

I have a term that describes my philosophy, or set of rules when we are designing and creating playback graphics: VIPAL – visually interesting, plausible and logical.

Ideally, the computer screen tells that story clearly, concisely, and believably, and we may only have a few seconds to convey that story point to the audience.


Screen graphics of “The Net” [1995].

We see plenty of bad examples on TV and in the movies. People over-simplify. They may not know the technically correct way, so they come up with some folder that says “Top Secret” or some other silly stuff.

For “The Net”, we chose to use realistic-looking technical data – IP addresses and things like that. Even so, people online still complained that the IP addresses had too many numbers and were wrong. The studio would not allow us to use real IP addresses, so we had to add an extra digit. 98% of the audience are not going to know the difference, and to those that do, it at least looks real.
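
As a side illustration of that “extra digit” trick (an editor’s sketch, not part of the production workflow; the two addresses below are made up in the on-screen style), Python’s standard ipaddress module shows why such an address can never collide with a real host while still looking plausible:

```python
# Editor's sketch: why an "extra digit" keeps a movie IP address fictional.
# Any octet above 255, or a fifth group, fails IPv4 validation, so the
# address cannot resolve to a real host while still reading as "real".
import ipaddress

for addr in ("75.748.86.91", "199.23.5.101.19"):  # made-up, screen-style addresses
    try:
        ipaddress.ip_address(addr)
    except ValueError:
        print(addr, "-> not a valid IPv4 address, safe to put on screen")
```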

Getting back to your question, as far as looking forward, it depends on the project. For “The Net” we weren’t thinking, “Is this going to look right in twenty years?” That wasn’t the timeframe of the film. When I did “Star Trek: Nemesis”, we worked within their design style, which you couldn’t really change too much. When we went into the alien ships, we were a bit more free to come up with symbology and other things that would still play to an audience’s understanding, but look different enough.

One of my pet peeves is when shows use 16×9 and/or 4×3 screen shapes on alien or futuristic spaceships. In some of the films that I’ve done, we create bezels to break screens up into different shapes. In “Solaris” we have an onboard lab with long narrow screens, where we took a big screen and put up a bezel to break it up into non-traditional shapes. Each opening in the bezel creates a different screen. I personally like to do that to make it look visually different from what is familiar to us here on Earth.

When we saw “Rogue One”, I really enjoyed it. But I kept getting pulled out of the film when things were too similar to things that we have now, like the army helmets, and the mesh seats on the vehicles. It seems lazy to me not to design something that looks different enough from what we currently have, especially if it’s “a long time ago” in a galaxy far away. Why would they have it exactly like we do?


Screen graphics of “The Net” [1995].

Kirill: If you look at “Rogue One” as the immediate prequel to “A New Hope”, they couldn’t really introduce any advanced technology.

Todd: I agree with that with regard to the monitors’ graphic style. It looked great because it matched what they had in the next movie.

But if you look at Luke Skywalker’s helmet, it didn’t look like a helmet that you would get presently on Earth. But if you look at the military stuff in “Rogue One”, it just seemed all too much like what we currently have. It wasn’t enough of a change. It looked like it came from an army surplus place. That pulled me out.

That’s a sideline, but that’s how I like to separate what we’re doing: if it’s supposed to be from some other place or time, it shouldn’t look like what we currently have.


Screen graphics of “The Net” [1995].

Kirill: Still talking about “The Net”, is it surprising to see how one particular movie kind of sticks around, and even gets more appreciation over time?

Todd: It’s a testament to the story and what we were able to do. Even though it does look dated in certain aspects, it feels good that we were able to create something like that. You look at identity theft, not being able to prove who you are, having your information easily manipulated – those ring so true today. That part of the movie is still very relevant.

You also have ordering tickets and pizza online. In 1994, a year before “The Net” was released, Pizza Hut in Santa Cruz, CA started taking orders online. It was very basic; you emailed them your order. It wasn’t an interactive experience. We were trying to look at where we were then, thinking about what would be available, and how to make it visually interesting for the audience to see.


Screen graphics of “The Net” [1995].

There was also the virus. Viruses are usually not very interesting to see, because you don’t really see what’s going on inside one. There is no visual demonstration. So we were thinking about how we were going to show that. Maybe it’s getting into the software running on the graphics cards and eating away the pixels from one layer to the next.

Another thing that was silly at the time was that it was a multi-platform virus, as it worked on both Macs and PCs. There was nothing at the time that was like that, but we had to circumvent reality: if there was something like that, how would it work? And now we do have those things that can work across platforms.

We were lucky that certain things continued to move forward. You look at when Dale was flying the plane, and his instruments go all wonky and he crashes. At that time it wasn’t necessarily something that somebody could do. But we were thinking: what if somebody were able to do it?


Screen graphics of “The Net” [1995].

Kirill: And then you see a more visual sequence in the second “Die Hard” when the bad guys take over a church and turn it into a mobile air traffic control center.

Todd: Right. And now you see it and it’s actually possible. Part of the reason why I think people still enjoy “The Net” are those aspects that are a reality now, and even more so than back then [laughs]. Back then it was much more of a fascination because it was so new to most people. It was this big scary thing on the horizon that somebody could take your identity, and now it’s something that actually happens. It’s a whole other type of fear because they know it’s true.

Kirill: If I look at the productions you’ve done since “The Net”, I see that you do a lot of video playback on TV screens that we see in movies. How have things changed for you in that area, with the technology evolving so rapidly in the last 10-15 years?

Todd: Let me give you an example from when we did “Deep Impact”, which had a tremendous amount of both computer and video screens, sometimes in the same set. We had to get an MSNBC set ready for the second week of production, and it was a huge undertaking to build that entire working set on a soundstage. We had five miles of cable underneath the stage, and all those video monitors and computer screens. Almost every single one of them was a CRT, and that required special syncing hardware to keep the same frame rate as the camera. You had to have monitors that were capable of doing that, and special video cards and other hardware that would drive them at the right rate.

In that case, a lot of the video playback was done using 3/4″ videotape decks that were modified to run at 24 frames (instead of 29.97), with special hardware so they could all genlock together and refresh the CRTs’ video frames at the same time, in sync with the film cameras (to avoid getting a “roll bar” in the frame of the monitors).


“Deep Impact” [1998]: Mission Control Set – 3 projected video sources, 5 different computer animation sequences.

Now with LCDs and other modern display technologies, we can often run the screens off of $50 media players. That’s a huge difference. Costs have dropped dramatically, and screen sizes and resolutions have grown tremendously.

One of the other big changes is with video editing. On “Deep Impact”, we spent around $150K on online video editing. And now we can do all of that and significantly more on our laptops.

We had so much editing that the editing company we hired gave us offices to use during the production. My entire department had offices in this beautiful high-rise building in Burbank. It was also very convenient because we could just walk across the hall to the editing bay. My department budget on that film was around $1.3M, and I’ve never come close to that ever again.

I had a huge staff then, and it was required for something of that nature. Now we do all that and more with a fraction of the people and budget, which is what today’s software and hardware tools allow us to do. We can be on set with our laptops, editing and having our shots ready. Rendering is significantly quicker.

On “Deep Impact” we put helmet cams on all the astronauts on the surface of the comet. They’re all hanging by wires, and we had to do wire removal because we had to show those camera feeds in the ship before the visual effects company would get those shots, which would happen months later. We had two teams working around the clock in our office doing wire removal with a very early Mac compositing program. We ended up doing way more than they ever used, but that’s one of those things where you never know what you’ll be seeing on camera.

At the time, that software was a bit of a breakthrough. It allowed us to do something that we were not able to do previously. We were using early releases of that software. Every few years we’d get another hardware or software tool which allowed us to do way more than what we could in the past.


“Deep Impact” [1998]: Dr. Wolf (Charles Martin Smith) calculates the path of the comet. Featuring six technically detailed interactive computer animation sequences. Filmed at Mt. Wilson Observatory in CA. Courtesy of Todd Marks.

Kirill: What about the cost of the monitors?

Todd: Monitor cost was always a huge factor. In “Star Trek”, which we did in 2001, the bridge was built on a giant moving platform with a hydraulic ram. We couldn’t use the old CRTs that they had used in previous films and TV. I had to move them up to LCD screens, which required all sorts of changes in the way they had done things. They’d always done CRTs under smoked plexiglass at 640×480 resolution.

And here I come with 15″ and 18″ LCDs, and they don’t work behind smoked plexiglass. So we were trying to use clear plexi, and eventually we had them cut a clean opening around the edge of each screen. The screens were mounted flush up against the back of the enclosure. They didn’t work that well off-angle, so when they were lining up shots, they had to be aware of the angle at which they were seeing the screen. That’s one case where we started pushing things with current technology, and LCDs were pretty expensive at the time.

I had a deal to get them at a discounted price, and we ended up using quite a few. Then we went on to do “Solaris” right after, and we purchased the monitors from “Star Trek” at 50% of what we had paid for them. On “Solaris” we wanted them to look more futuristic, and we pulled the plastic bezels off of each monitor. That left exposed metal housings, which gave a more industrial look. We put wire-mesh cable covers over the cables to make them look industrial as well.

When I first started working, pretty much everything we did was displayed on computer and video CRT monitors. Because they typically ran at 29.97 fps or above, they needed to be run through 24-frame video standards converters. Most of them were built and sold by Keith Schindler, and thus were nicknamed “Schindlers”. For the most part, these took a normal video signal and converted it to 48Hz vertical (24×2); it was then “genlocked” through a sync box that was connected to the film camera(s). We would then adjust them so that the shutter of the camera and the refresh rate of the monitor were timed in a way that eliminated or greatly minimized the roll bar.

Sidenote: We would also use custom-modified 24-frame sync video cards (computer video cards with a special daughter card) that would be connected to a sync generator (if you had the green light, you knew that everything was in perfect sync); this was also connected and synced with the film camera(s). There were so many headaches [laughs]. Different cameras had different sync hardware that didn’t always work properly, or someone didn’t know where the special cable was that was needed to make it work with the camera, or the shutter would start slowly drifting, and you’d need to track down what went wrong, or what cable needed to be replaced… There were so many headaches back then.
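
The arithmetic behind that roll bar is simple enough to make concrete. Here is a minimal sketch (an editor’s illustration using standard NTSC and film-camera numbers, not code from any production): when the monitor’s refresh is not an exact multiple of the camera’s frame rate, the refresh band lands in a different spot on every captured frame and appears to crawl; genlocking the display to 48Hz (24×2) zeroes that drift.

```python
# Editor's sketch of the "roll bar" arithmetic. Each captured film frame
# sees some whole number of monitor scans plus a leftover fraction; that
# leftover is how far the dark refresh band moves from frame to frame.

def rollbar_drift(refresh_hz: float, camera_fps: float) -> float:
    """Fraction of screen height the band appears to move per frame."""
    scans_per_frame = refresh_hz / camera_fps
    leftover = scans_per_frame % 1.0
    return min(leftover, 1.0 - leftover)   # shortest apparent crawl direction

print(rollbar_drift(59.94, 24.0))  # NTSC CRT vs film camera: ~0.4975 -> visible crawl
print(rollbar_drift(48.0, 24.0))   # 48Hz "Schindler" genlock: 0.0 -> clean screen
```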


Screen graphics of “The Net” [1995].

Then plasma monitors came out, but only a handful of particular models would sync at 24 frames. The very early Sony plasmas had to be modified to be synced. There was a guy at Pioneer who had previously done 24-frame work in film, so he knew that it was a necessity. He had the engineering staff build a special mode so that the monitors could be switched into 24-frame mode if you knew a special key combination on the remote.

Then we started getting into LCDs and LED panels, and different types of projection. LCD projection was usually fine for 24 frames. Some of the other technologies were not that great. You had to run the projectors at a higher rate, like 48 or 96 Hz, to prevent fluttering on film or video cameras. There’s always some technical challenge.

When we were doing “Abduction” in 2010, a lot of the screens were moving to larger LCDs, and manufacturers were changing the backlight components. We had issues with many different manufacturers that were using technology that would pulse on camera. If you adjusted the level of the backlighting to anything below 100%, the screen would pulse (on camera). You had to disable certain auto-dimming modes. Every hardware progression also brings some headaches for us, as we have to figure out how to make it work, how to color-correct it, how to adjust the exposure on the monitor without creating other issues.
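
As a rough illustration of why a dimmed backlight pulses on camera (an editor’s sketch with assumed numbers – a 220Hz PWM backlight and a 180-degree shutter at 24fps – not measurements from the shoot): a strobing backlight means each exposure window catches a slightly different amount of “on” time.

```python
# Editor's sketch: PWM-dimmed backlights strobe on and off. Unless the PWM
# rate divides evenly into the shutter timing, each frame's exposure
# integrates a different amount of light, which reads as pulsing on camera.
# The 220Hz rate and 50% duty cycle below are assumed, illustrative numbers.

def exposure_light(start_s, shutter_s=1/48, pwm_hz=220.0, duty=0.5, steps=10_000):
    """Fraction of one exposure window during which the backlight was on."""
    dt = shutter_s / steps
    lit = sum(1 for i in range(steps)
              if ((start_s + i * dt) * pwm_hz) % 1.0 < duty)
    return lit / steps

# Consecutive 24fps frames start at different PWM phases -> different brightness:
for frame in range(4):
    print(f"frame {frame}: {exposure_light(frame / 24.0):.3f}")
# With duty=1.0 (backlight at 100%) every frame prints 1.000 -> no pulsing.
```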

When we went from analog to digital, it was great for certain things. But some monitors wouldn’t allow you to color-correct the screen when it was receiving a digital signal, and some of the features and menus were just disabled.


Todd Marks on the Bridge of The Enterprise during the filming of “Star Trek: Nemesis” [2001]. Courtesy of Todd Marks.

Kirill: Sounds like you are doing most of your work on set and not in post-production.

Todd: Correct. We did do “Spectral” in 2016, where we created the graphics, overlays and HUDs all during post-production. That’s not how we prefer to do it. There have been a few others: in “Date Night” we did the three holographic screens in the cell-phone tracking scene, and on “Spider-Man 2” we were hired to design and build a replacement screen for one of Doc Ock’s computer systems. But, for the most part, we prefer to do as much as we can live on set, during production.

Sometimes we will do everything on set, but there might be aspects that they need to burn in later. Then we will provide them with those graphics elements. Sometimes it works out well, and sometimes it’s a disappointment for us because they don’t follow our guidelines. This happened on “Anchorman 2”. We had a design guide for what things should look like when GNN goes “on air”. We designed guides on how things should look when the network first started, and how the graphics get more sophisticated and complex the longer GNN is on the air. During production, when we were on the set, we followed the guidelines, and then we handed all the elements over to post-production. But it’s out of your hands; you don’t have control. Luckily, most people won’t notice, but to us it’s a big deal.

For “Spectral” we created all these different screens, and as I watched the film, in a few different scenes there are a handful of monitors with duplicate images. Ugh! Why did I give them all those design rules and alternate versions if they were going to use the same image on all four monitors?! That’s when it’s really frustrating. When we’re there on set, we can control it so that everything is the way it’s “supposed to be”, because we have deemed it to be correct within the confines of the story. When you hand it off to someone else, you lose that control.


Filmed in Steve Jobs’ actual garage, this workbench was part of the set (including three era-correct software screens and monitors) for the “Steve Jobs” movie [2015]. Courtesy of Todd Marks.

Kirill: Bringing you to “Steve Jobs” that you did in 2015, and the world of technology in the mid-1980s. How do you go about recreating the software and the hardware of that time to look both authentic for those who experienced that technology growing up, and visually appealing for the younger audiences that are used to technology that is so much more powerful?

Todd: That’s a very good question. As far as the hardware went, one of the challenges was finding all the old Mac and NeXT hardware. Much of the equipment came from our supervising engineer Ben, who had acquired it over the years on eBay, Craigslist and other sources. I had brought him, along with our video engineer John Monsour, in on the previous Steve Jobs movie.

I got called in to be on that other Steve Jobs film, but they had an extremely small budget. I was already starting work on “The Internship”, and I couldn’t do both films at the same time. I brought Ben in, because I knew that he and John could handle it. John Monsour was the video engineer who I met on my very first feature film, “Point of No Return”. We call him the godfather of 24-frame sync. He has a special mind that can design and build custom-made, hand-wired video cards and daughter cards. He can go in and look at how things are built, work around the issues, and figure out how to make the device sync to the film camera.

He was the video engineer whom Apple found to sync their equipment for the early Mac commercials. During the production of the latest Steve Jobs movie, we were at the Flint Center in Cupertino at De Anza College, filming scenes that took place in 1984. In this sequence, we were recreating the event that launched the Macintosh. We were playing the original Apple commercials on projection screens; these were actually commercials that John had worked on thirty years before. And here he was, helping us get all these computers syncing again thirty years later, and recreating things that he had done previously. It was a pretty cool set of circumstances.


A collection of customized Macintoshes. Custom-built video modification boards were added to the Macintoshes to make them camera-ready for 24-frame sync, a very difficult process. “Steve Jobs” [2015]. Courtesy of Todd Marks.

He had gone through his files, finding his old schematics and notes, and he shared other documents from people who had worked on the original mouse and other components.

He was very much involved in figuring out how to make the old Macs and Lisas and NeXT monitors sync. It’s really intricate, and beyond any normal person’s capabilities to even understand what is being done, let alone do something as intricate and technically challenging as that.

I have pictures of the custom circuit boards that John and a small team had built inside the Macintoshes. He had rewired a serial port to become a special connector that went to a VGA connection. We basically made the Macs into 24-frame monitors, except for the standalone one, which had its own custom 24-frame card inside. It had a little switch on the back that you could turn on, and then we could sync the camera to that one particular Macintosh. It is featured in the dressing room scene with Steve Jobs, when his young daughter Lisa creates an image in MacPaint on the computer.

We had to find that old hardware, make it look new, and then make it sync, which in itself was a huge undertaking because we didn’t have a lot of time. John and a small team spent a lot of late nights making custom components. We were building stuff literally the night before it was to be used on set. Each iteration of each Mac and NeXT and iMac had different requirements to make them work “on camera”, and the 24-frame crystal sync video cards had to be “re-flashed” for each computer system. It certainly wasn’t a turnkey system!

As for the software, we were very fortunate. We were looking for the original Apple Macintosh presentation, with the “Chariots of Fire” music and the scrolling “Hello I’m a Macintosh” text, and we could not find it. I was making calls to Susan Kare and speaking to Andy Hertzfeld, and Andy told me about this kid in Michigan who had an archive of early Mac software and imagery. Andy gave me a link to this Dropbox archive, and there was tons of amazing stuff in it – images and videos and emulators.


A small portion of the video gear used to run the presentations for the Macintosh Product Launch scenes at Flint Center for “Steve Jobs” [2015]. Courtesy of Todd Marks.

We called this 17-year-old kid, whose name is Tom Frikker. Initially, he told us that he didn’t have permission to let some of the material in his archive out. He told us that he had become an archiver of early Mac stuff, and that he had flown to Cupertino the year before and gotten an early Mac signed by all the people at the 30-year Mac anniversary event at Flint Center. While he was there, he made all these connections with the original Mac team and other early Apple employees. After he made these connections, he was being sent golden masters and betas of all the software that nobody else has. One person connects him to another person, and it becomes this tremendous archive of resources.

He literally started working with us in between his classes in high school, building software for us. If we needed to make a presentation, he would screen-grab it all and create these little QuickTime videos for us. He and his parents flew out from Michigan and joined us on set. When we were shooting some of the first scenes in the Flint Center, including a sequence in the dressing room with the Macintosh, we would ask him which version of MacPaint they’d have been using at that time, and he’d find it on one of his floppies to copy it over. He just knew all of that stuff, and where to check if he needed to find out.

He was a godsend for us in that respect. He ended up staying for days, working on set, getting to meet all the different people. He also got to be in the movie; as they do a pan across the audience, he is one of the people in the front row applauding. He’s in the trailer for the film, and even got into the special “thank you” section at the end.

When we started working on the NeXT portion of the film, we were given other connections. Since I had grown up in the Sunnyvale/Cupertino area, I knew friends who had worked at Apple early on. One of my friends was an intern for John Sculley’s secretary. He had interesting connections and information, so he connected me with some other people. One of the guys who we got connected with was Keith Ohlfs who, unfortunately, died recently at the age of 52, not long after the film was released. He had created the opening presentation for the original NeXT product launch, and he was very much connected to the whole NeXT community that still exists after all these years.

He came into our production office in South San Francisco with a big box of original invitations and programs and name badges, and all sorts of amazing materials that we just would not have had access to. But the thing that we were really looking for was the video of the original NeXT product launch. We could not find it anywhere. There were ones from later launches, but not from the original one. You could find six-second video snippets online, and we thought that somebody had to have that video somewhere.

We asked Keith if he could put out an email to his NeXT friends to see if anybody had a copy of it, and sure enough, after a day or two somebody replied that they might have a copy on VHS. We asked to have it FedExed over, and the next day we had two VHS tapes, which we had digitized. They were pretty horrifically old, with a timecode at the bottom, and very dark. But it was gold. It was something that very few people had actually seen originally, and in the last twenty-odd years pretty much nobody had seen it.

You see a very young and nervous Steve Jobs presenting to the crowd, and later at the press conference. After the film was released, Tom Frikker cleaned it up and released it online, and it got all sorts of attention because it had been missing for so long, and people started connecting to it. That’s part of what we had to do for this film – make the connections to all the people who were involved early on.


VIP/Press Room Set. Screens and computers were modified for 24-frame sync. “Steve Jobs” [2015]. Courtesy of Todd Marks.

Kirill: It sounds like a very faithful recreation of the events.

Todd: We tried to be as accurate with all the software as we could. When it came to the presentations, we had to be a little bit more visually interesting. We were very careful to not be beyond the capabilities of what was technically possible at that time.

In the first act, when Jobs is making the presentation on the Mac, we had a video camera that was both SD and HD. We ran the SD feed through a 200-foot-long cable to degrade it, through our system and into the projector, so that it was kind of noisy, because the projectors of that time were pretty low-resolution. We tried to pay attention to those details to make it not look too good, so that you could see the progression as you go through the acts, as things get better, bigger and sharper.

In the iMac keynote scene, during the introduction of the iMac, “Steve Jobs” calls a cameraman out on stage to show the audience a live-feed walk-around demonstration of the beauty and features of the iMac. We think we used the same model of camera as in the original 1998 presentation. Although not all of our presentation graphics made it into the final cut of the film, you can see more of them in behind-the-scenes videos that are included on the BluRay and DVD.


Custom-built video modification boards were added to the Macintoshes to make them camera ready for 24-frame sync. “Steve Jobs” [2015]. Courtesy of Todd Marks.

In certain situations, we had to make up some of the stuff. During the NeXT act, we used a design style and a burgundy color that, according to the original NeXT design people we spoke to, Steve Jobs would never have approved of. It’s a movie and not a documentary, so there’s a little bit of a stretch here and there. Sometimes we do have to do things a little differently visually. We used the program Keynote to run all the presentations, even though it wasn’t out yet at the time. But it was something that Steve Jobs had designed for making presentations.

We talked to several of the original engineers and people who ran the presentations at those product launches. A lot of them happen to still be video technicians or projection technicians working in the San Francisco area. Being in the Bay Area certainly provided the opportunity for more authenticity in the film, and we wanted to have people who were involved in the original events to get as much insight as possible.

Now, how the other parts play out in the movie – dialog and action – is not as accurate as what we did, especially the whole part around the NeXT computer not having an operating system. That was a bit ridiculous. And the whole thing about the Guy Kawasaki letter wasn’t right, because the letter didn’t actually come out until years later. Aaron Sorkin took some poetic license there [laughs] to stretch the truth a bit.

Kirill: How much time out of your year does a production like “Steve Jobs” take?

Todd: That was a four-month project. We were in production from January till March, which was very rushed due to tight time constraints, because some locations were available only at specific times. Our second location, which was the San Francisco Opera House, was negotiated for months. The last movie to shoot there was “Foul Play” with Goldie Hawn, and that was only for two days, and only in the audience section. We were shooting all over the place, and we had to work around the schedule of the ballet, which is their number one client.

We had to do all the things to not upset the ballet and to not mess it up for the Opera House. We did these bizarre all-night shifts for a couple of nights, flipping back and forth for several weeks.

On average a big project is about three to four months. “Deep Impact” was nine months. We started early on with a very long pre-production because there was a lot to be prepared. There can be a lot of downtime between jobs. There are a lot of people that work constantly, but that has never been my thing. It’s nice to have time off, but not too much [laughs].


The Roland VR-50HD was used to mix the multiple types of media used in the projected presentations during the NeXT and iMac product launch sequences. “Steve Jobs” [2015]. Courtesy of Todd Marks.

Kirill: You’ve talked about the constant changes in screens and monitors, and the evolution of technology in both hardware and software. Does it nonetheless get easier with every year that passes?

Todd: There are certain aspects that are easier, but there are also a lot more demands on what we’re expected to do, and the time frame that we’re expected to do it in. Pre-production on films used to be kind of a set thing, where they had a certain number of weeks or months, and they keep cutting it shorter and shorter. A lot of shows don’t want to start you until a week before production, so you don’t have a lot of time to do things.

As far as technology goes, obviously having a MacBook with all the different outputs, and editing and graphic capabilities, is a tremendous tool. All that time that we spent editing on “Deep Impact”, and now we can do all that and more on my laptop at a significantly higher resolution. There are always new software tools that come out and allow us to do new things. Sometimes we cross into other areas that we didn’t necessarily do previously. There’s more visual effects work in certain cases, and often there’s a blending between departments.

You’ll have photographs that need to be manipulated with certain settings, or to put somebody’s face on somebody else’s body. Certainly doing that now is much easier than it used to be. Sometimes the art department will do that, sometimes we’ll do that, and sometimes they’ll start it and we’ll finish and animate it.

There are things that are easier and there are things that are more challenging because of the new capabilities of the hardware.


Original “Blue Bondi” iMacs specially modified to be synched at 96Hz allowing them to be filmed cleanly by a 24 frame camera on the set of “Steve Jobs” [2015]. Courtesy of Todd Marks.

Kirill: How do you describe what you do to somebody that you’ve just met at a party, somebody that is not intimately familiar with your field? Are people surprised that every screen that is seen in a movie needs to have its content designed and put there on purpose?

Todd: There are certainly people who don’t understand, or haven’t really thought about it. The way that I typically describe this is that we do screens on screen. I always say, “Picture the bridge of the Enterprise, and all those monitors and all those graphics. Anything that’s on the screen is my domain. We’re responsible for acquiring and creating all that content, for playing it all back, and for making sure that the monitors are in sync, are exposed correctly and are the right color temperature. If it’s on a screen or projected, it’s my responsibility.” From smartphones and tablets, laptops, bullpens, hacker labs, control rooms, and spaceship consoles, to functioning TV studio sets, we handle any and all screens, as well as all aspects of playback. We oversee the creation of the playback graphics. There’s a lot of work put into it, from design to scheduling to management.

Then I show some pictures, and that’s when some people who thought that they got it actually get it. It clicks better. My website has lots of great imagery and samples of our work, and we have a direct link to the interview that Vox did with me. It’s a great piece, and it goes into a lot more detail about what’s involved in being a video playback supervisor.

Unless people come to set and watch what we have to do, it is very difficult to grasp the complexity of our job. People think that we just plug in a TV and put some images on it. Yes, but let’s go into some details. You have a scripted scene where the actor has to do this thing, and click on this, and this happens, and meanwhile there are forty screens behind that actor with other video and/or graphics on them. We have to make sure that all these things are happening in sync, and that the right things are coming up, and if there’s a technical glitch, we hold up production because they can’t shoot.

The first assistant director is leaning over your shoulder while you’re trying to fix the problem, and asking “How much longer?” And you tell them that it’s going to be five minutes, and five minutes later they come over and ask if we’re ready. I’ve had that happen on more shows than I care to remember, especially on big complicated sets when there are so many things in there that can go wrong. Somebody plugs in to our power, and all of a sudden we have noise in half the monitors, and we have to go and track down who’s plugged in. Or you have some weird interference coming from one of the wireless camera systems that is causing weird noise to appear on one of the screens.

Another one that happened more recently was where we were doing a scene on an iPhone, and were using another iPhone to trigger the action. It worked perfectly in rehearsal, and then when they started rolling, all of a sudden things were not working. The camera department is using wireless zoom controllers, the sound department is using wireless microphones, another department has some other wireless gadgets. Any number of things can be causing interference, and we’re there scrambling, trying to figure out why it’s not working.

I try to describe the pressure that we are under. I tell people to imagine giving a presentation to a few people in their company, and the projector isn’t working. How nerve-wracking can that be for them? And now imagine that you’re on a movie set with a hundred people, ten high-priced actors, and a short-tempered director, and you’re trying to make something work and it’s not working, and you’re holding up the entire production. That’s pressure [laughs]. That’s what we have to encounter quite often, because it’s a technically challenging job.

Things can often go wrong. Everything goes fine until they start rolling, and your computer freezes. It was just waiting until they started rolling to crash [laughs].


Todd Marks behind the scenes running the cockpit instrument controls during “Flight” [2012]. Courtesy of Todd Marks.

Kirill: Is it becoming harder for you to compete against the variety of screens that we have in our daily lives?

Todd: It certainly is a challenge. Creating an interface that is viable, that tells the story, and that doesn’t look fake takes time. And we often don’t have that time. If there’s a computer scene, and let’s say that Apple has given us the computer placement, we can use their OS and their programs. That’s great for us, because we want to be practical as much as we can – calendar, browsers, etc. That makes it way easier than building it from scratch and then making it look similar, but different enough so that you’re not getting into trouble.

As far as the average viewer these days goes, people indeed are more savvy about what computers look like and what they can do. There are shows like CSI where they do these ridiculous searches and zoom-ins – some people enjoy that, but some are just frustrated because it is so ridiculous and wouldn’t happen like that.

It’s certainly harder to build differently-looking content that is modern-looking enough while still being able to convey the story. You look at our monitors now, and whatever small text size they use, and then you look at a typical TV show, where everything is enlarged. There’s a very fine line, at least for me. We can make it a little bigger, but make it too big and it looks ridiculous. There’s always that fight with certain directors or production designers who don’t quite understand, and ask to make it bigger. You can do a close-up of the monitor; you don’t have to read it from across the room!


Alex Mann, Rutherford Wilson and Todd Marks on the set of “The Net” [1995]. Courtesy of Todd Marks.

Kirill: Do you keep track of design trends that come and go in the software around us? Does something like the recent switch away from skeuomorphism towards flat design find its way into these productions, or are blue lines on dark backgrounds here to stay?

Todd: The higher-contrast stuff typically reads well. There are shows that will do very normal-looking, monotone stuff. If I put my regular monitor running the Mac mail program on camera, it’s going to look pretty bad – one big glowing blob. That’s why you tend to see the inverted visuals.

A mostly white screen glows bright and can draw a lot of attention to itself, and you don’t necessarily want the audience to be focusing on this big white blob at the back of the room when the actors are in front. You have to look at the scene and at the surroundings, and play to your setting. We’re there to complement and help establish a look and a feel. So unless it’s specifically scripted, we’re not there to steal the attention.

Sometimes you’ll see shows where they obviously did not take the time or did not have the right people to adjust the monitors. You will see very cool-blue, un-color-corrected monitors, which typically means that they’ve hired somebody who doesn’t know how to color-correct, or maybe left it to a props person to do it. Or it may be an overexposed screen, glaring and calling too much attention to itself.

Sometimes you’re in a situation, and it has certainly happened to me, where you don’t have time to go in and adjust every single monitor, or they’ll switch the f-stop on you and not tell you. We have to ride that very carefully. There are some DPs [directors of photography] that shoot on film, where you don’t have the immediate feedback of HD. We have light meters and color temperature meters, and we have to check everything. I carry an 80A (blue filter) with me on set; I can use it to look at a monitor and see how it would look with tungsten (3200 degrees Kelvin).
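
For the curious, the arithmetic behind that filter trick is standard photographic math (an editor’s sketch using published values, not Todd’s own numbers): color temperature shifts combine linearly in mireds, and an 80A filter is roughly a -131 mired shift, which is what lifts 3200K tungsten to about 5500K daylight.

```python
# Editor's sketch: filter math in mireds (micro-reciprocal degrees).
# An 80A filter's published shift is about -131 mireds, which converts
# 3200K tungsten light to roughly daylight balance.

def kelvin_to_mired(k: float) -> float:
    return 1_000_000 / k

def mired_to_kelvin(m: float) -> float:
    return 1_000_000 / m

MIRED_SHIFT_80A = -131          # standard published value for an 80A filter

tungsten_k = 3200
result_k = mired_to_kelvin(kelvin_to_mired(tungsten_k) + MIRED_SHIFT_80A)
print(round(result_k))          # ~5510K, close to daylight
```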

When we’re shooting HD and you can see the camera feed on an off-set monitor, it’s much easier. You can look at it with the DP, and see if the monitor’s brightness, contrast and/or color temperature need to be adjusted. That’s really helpful, and that tool has made it much easier to not make mistakes. I see “Team America” now, and there’s the scene at Kim Jong-Il’s palace with 50″ plasmas that one of my guys had color-corrected completely wrong. We were all trying to get it right, and then you look at it on camera, and it’s way too magenta. It kills me [laughs]. You can’t win them all.


Director Anne Fletcher and Todd Marks on the set of “The Guilt Trip” [2012]. Courtesy of Todd Marks.

As far as the trends go, we will look at tools and designs that we might want to emulate depending on the projects. Sometimes in a police or an FBI drama we would try to get real software for mugshots or fingerprints, or look at the real stuff and then emulate and enhance it. I look at the stuff that I designed years ago in Photoshop, and everything has beveled edges and drop shadows. OK, that’s so dated now [laughs], but at the time it looked good. There are certainly trends that we go through. And sometimes you do something completely different and futuristic.

I already mentioned that we did very regimented stuff for “Star Trek”, and then as we went into “Solaris”, we started from a clean slate. I took a lot of graphics people from Trek with me, and brought in a couple of new designers to do some really cool stuff. We had Rick Sternbach, who designed a lot of the ships and lots of other things for Star Trek. He built us a library of elements so that my entire graphics team had a base to work from. And then we thought about the elements, the layers and the windows, floating around through the screen; when you click on one, it comes forward. It had a screensaver feel to it with all the floating windows, a very calm feel that played well with the whole look of the film.

And it wasn’t like “Iron Man” or one of those other films that has lots of slick lines and moving numbers. There’s plenty of that out there, where it basically feels like a bunch of slick After Effects templates. Some people really like that, and it’s certainly cool for certain things. But there are so many screens that look completely impractical to me. No one would be able to use that information, because there’s so much moving around. Where is the eye supposed to go? Too many things in the same color.

For me it’s very important to establish a logic to the interface as much as we can. There are certain cases where you have to go with lines and circles, like if you’re doing something like a sniper HUD. But so many of the HUD designs are completely overused.

Kirill: One of the arguments that I hear in favor of those elements is that they allow the audience to have some sense of familiarity. If there’s a lot of fast-moving text, that probably means a lot of information to process. And a red flashing gigantic text means trouble.

Todd: I have to be able to tell the audience what to look at, and where, within one to two seconds. Some of the cutaways are simply that short. I can’t have a screen of all the same-looking information. The idea, though, is not to make it look so ridiculous that it pulls you out of the story. You’re not going to have a 5″-tall flashing warning, in most cases at least, because our screens typically don’t do that.

You have to walk that fine line of conveying the message, and making it visually interesting, plausible and logical. I go back to my rule of VIPAL, and that’s the rule I live by. There are always circumstances where it might not make much sense, and you have to make it look cool. So sometimes that happens, and you’re not able to get what you want as far as a look or a feel on the screen.


Perry Freeze, Danny Boyle and Todd Marks posing in front of Desi and Lucy on the set of the “Steve Jobs” movie [2015]. Courtesy of Todd Marks.

Kirill: Bringing you to the world of tools that you use for work and pleasure, are there any big pain points that you find yourself having to deal with in your daily interactions with computers?

Todd: There are certainly some pieces of software that have become so complex that they’ve lost their ease of use. But you continue to learn the complexity of certain types of software. As far as the interfaces go – keyboards, mice, trackpads or pen input devices – everybody has the tools that they gravitate towards. Perry Freeze, who works with me, likes to use his Wacom tablet, and I’ve tried but failed to find a comfortable connection there, probably just because I haven’t done it long enough. Some of these input devices are good for some things, and sometimes you’re certainly fighting against the interface.

But the capabilities and the speed of the tools are so tremendous now. Being able to render high-def video and animation elements in minutes or less, compared to hours and hours before, is extremely helpful in situations where we’re having to make changes or build stuff from scratch.

When we did “Spectral” in 2016, as we were turning out “final” graphics work and uploading it to Dropbox – again, a majorly helpful tool – they asked if it was possible to modify a screen into a different aspect ratio. Very quickly, I was able to go into After Effects, rearrange a few pieces, re-render it, and upload it to Dropbox five minutes later. Before, that would have taken a day or two [laughs].

It’s also about knowing your tools. I’ve learned so many shortcuts over the years out of necessity, shortcuts that other people just don’t know. Say, if I need to make a glitch in a video, I’d go into QuickTime, export out frames at different distortions, and throw them in one frame at a time to build a little glitch piece. I’ve done so many little things with keyboard shortcuts on set. There are tools now that do some of that, but before, you had to be creative in the ways you did things.

What used to be MacroMind Director, and sadly is now the discontinued Adobe Director, was our go-to tool for interactive motion graphics for playback. It had great capabilities for sprites. We built typing gags where any key typed on the keyboard would reveal the right letter. You could create self-contained applications, export them to PC or Mac, and then play them fullscreen as an interactive app. The problem is that Adobe let it die a slow, painful death. We were hoping that they’d sell it off. But no. It’s a crime to lose that tool, because it was so useful.
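
That typing gag is simple to sketch in any toolkit. Here is an editor’s illustration in Python/Tkinter (the real work was done in Director, and the scripted line below is made up): whatever key the actor hits, the next scripted character appears, so on-camera typing is always letter-perfect.

```python
# Editor's sketch of a Director-style "typing gag": any keypress reveals the
# next character of the scripted line, so the actor can hammer the keyboard
# and the screen still types out exactly what the scene calls for.
import tkinter as tk

SCRIPT = "ACCESS GRANTED. RETRIEVING PERSONNEL FILES..."   # made-up scene text

root = tk.Tk()
root.configure(bg="black")
label = tk.Label(root, text="", font=("Courier", 24), fg="#33ff66",
                 bg="black", anchor="w", justify="left")
label.pack(fill="both", expand=True, padx=24, pady=24)

pos = 0
def on_any_key(_event):
    global pos
    if pos < len(SCRIPT):                 # ignore extra keystrokes at the end
        pos += 1
        label.config(text=SCRIPT[:pos])   # reveal the next scripted character

root.bind("<Key>", on_any_key)
root.mainloop()
```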

There are other tools that we’ve learned to use. Apple’s Keynote is remarkably functional for certain types of playback graphics. You can fake the typing gags, you can fake some different effects, you can make things clickable, and you can make an interactive thing that works on an iPhone or an iPad. I can have it playing on one phone and drive it from another one. For example, if they want to do a sequence where it looks like an iPhone is getting an incoming call, I can remotely control the Keynote animation from another phone by clicking through the pieces.
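
The remote-drive idea generalizes beyond Keynote. As a sketch of the pattern (an editor’s illustration; the port number and "CUE" message are invented, and this is not how Keynote itself works under the hood): one device plays the graphics and listens on the set’s local network, and a second device sends a trigger at the scripted moment.

```python
# Editor's sketch of a remote playback cue: a player device listens for a
# trigger message; an operator's device sends it when the scene calls for
# the action (e.g. an "incoming call" animation). Port and message invented.
import socket

CUE_PORT = 9999          # hypothetical port
CUE_MSG = b"CUE"

def listen_for_cues(on_cue):
    """Run on the playback device: fire on_cue() for each trigger received."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", CUE_PORT))
    while True:
        msg, _addr = sock.recvfrom(64)
        if msg == CUE_MSG:
            on_cue()

def send_cue(player_ip):
    """Run on the operator's device at the scripted moment."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(CUE_MSG, (player_ip, CUE_PORT))

# Example: listen_for_cues(lambda: print("start incoming-call animation"))
```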


Screen graphics of “The Net” [1995].

Kirill: Do you think that sometimes the technology in our lives, particularly software, changes just for the sake of change itself and not necessarily to make things better for its users?

Todd: That’s a problem, and some companies are better than others. Sometimes a program ages out and is no longer being developed. It’s stuck on a previous version of the OS and can no longer work on the newer versions. That happened with Director, and it has happened with a few other programs we used to use often. One of my favorite programs for creating particle effects was Wondertouch particleIllusion, and while I can use it on some of my newer machines, they’ve stopped updating it. It’s starting to look dated, and you’re not getting new particles. Obsolescence is always a problem.

Sometimes software gets changed so much that you don’t even know what you’re using. That happened with Microsoft Word and Excel, numerous times. I have kind of stopped using Office almost entirely. I don’t have it on any of my machines now. Most of the things I do in Pages and Numbers, but they are certainly limited. I was trying to do a mail merge the other day, and you can do it in Numbers in something like 17 steps. So I just typed in the names myself [laughs].

The same thing happened when we moved off of OS 9. There were these amazing programs, so quick and so wonderfully optimized. Now Contact and Now Up-to-Date were the most amazing address book and calendar tied together. They were so quick and so useful, and then they never made a proper update. There were a lot of tools that we left behind, and a lot of efficiency that we left behind, as we’ve moved forward.

You get new benefits and new tools, but you also leave behind some of the other stuff that was significantly more straightforward. We’ve lost the simple interface that Apple was once known for; it’s no longer that simple. Just go into your console and read everything that is going on there – one thing after another. Why is my machine running so slow? There are a thousand messages of what’s happening [laughs].

Kirill: What keeps you going? Why do you stay in the industry?

Todd: It’s still an excitingly challenging job. I love working on movies. The environment of a production is like nothing else that I’ve ever done. You have a family of very creative people getting together to solve problems and to make magic. You’re creating the reality of the film. Having an opportunity to have a direct impact on a production is always a thrill for me.

To go back and see a movie, and to see that something is there because I did it, is still cool. Those big screens are there in “The Internship” because I did all that product placement. They wouldn’t be there if I hadn’t been able to do that. There are all sorts of things like that.

What keeps me going is the love of production, the opportunity to be a part of movie magic, and to create something that continues on past the time that we did it. Like “The Net”, which is still referenced, and that’s kind of cool. You do something that people still enjoy.

Here’s one last story that’s pretty cool. We had a scene in “Deep Impact” where Morgan Freeman, as the president, is talking to the crowds in Times Square about the impending doom and what the government’s plans were. Our production crew, actors, and several hundred extras were actually in Times Square at three in the morning on a Sunday night, and at that time (1997) there was only one main big screen there. Our director, Mimi Leder, wanted the president’s speech to play out loud so that the crowds could hear it and react. The problem was that there were no actual speakers for the big-screen system. The other problem was that the control room where the video played from was in New Jersey.

So in order for us to play the sound for the crowd, my video tech and I set up these giant speakers facing the crowd. We had an exact duplicate of the video (with sound). We were on the phone with the guy in the New Jersey control room, and we needed to start playing the video at the same time. They’d roll the cameras and yell “PLAYBACK”, we’d count 3-2-1, and we’d both push PLAY. Remarkably, it worked pretty well, even though the sound and the video were running separately, in two different places! That’s how we were able to accomplish what the director wanted. Creative thinking and technology saved the day! A great memory!


“Deep Impact” [1998]: On board “The Messiah”, Todd Marks gives instructions to the computer and video playback crew, while director Mimi Leder gives direction to the actors. Courtesy of Todd Marks.

And here I’d like to thank Todd A. Marks for taking the time out of his busy schedule to walk me down memory lane, starting from “The Net” and moving all the way forward through the industry’s changes in both hardware and software to the present day, and for sharing the behind-the-scenes images. Todd’s next production, “Venom”, will be opening in theaters later this year. And if you’re interested in reading additional interviews about the wonderful world of screen graphics and user interfaces for film and TV, click here for more.