What a year it has been for Territory Studio and its founder David Sheldon-Hicks. It’s only been nine months since our interview about the studio’s screen graphics work on “Prometheus” and “Guardians of the Galaxy”, and in the meantime Territory’s work has graced the big screen in three major motion pictures! In this interview David talks about exploring the technical and human aspects of advances in the field of artificial intelligence in “Ex Machina”, the futuristic interfaces of the alien universe of “Jupiter Ascending” and the vast screen graphics canvas of “Avengers: Age of Ultron”. He touches on the collaboration with directors and production designers to distill and refine the visual language of the interfaces, the process of rapid prototyping in the pre-production phase and the tight back-and-forth adjustments of on-set playback sequences, and the technical aspects of projecting interface sequences onto translucent glass screens. In the last part of the interview David takes a deep dive into the world of virtual reality interfaces, ranging from enhancing cinematic experiences to making better use of content in our everyday work and leisure scenarios.
Kirill: Last time we spoke about your studio and its earlier work. And in the meantime you’ve had Jupiter Ascending, Ex Machina and Avengers: Age of Ultron come out, with a lot of work that you’ve contributed to these productions. And on top of that, you’ve started exploring the world of virtual reality.
David: We actually worked on Jupiter Ascending a year prior to that interview. The project took a while to come out (6 Feb 2015), so we haven’t been able to talk about it until recently.
Kirill: How does it feel that you have this pile of work that you’re sitting on and can’t tell anybody about?
David: In some ways it’s really frustrating, especially for the people in the studio who have worked on it. But we’ve all signed NDAs, and we don’t want to ruin it for the fans by saying something that might spoil the story.
Yet, it is quite nice to have a bit of time to prepare our portfolio and showcase, to fully do it justice in terms of the story behind all the work. When you’ve just finished a film, you’re so exhausted and still in that space. Sometimes it’s hard to look at the project objectively and fully appreciate all of the thinking and the ideas that went into it. When you have space and time – around six months or so ideally – you get enough distance to join up the dots between the deliberate and intuitive.
This gets really interesting when it comes to our approach to technology in a film like Avengers. By reflecting on how the script and story points influenced our research, we can sometimes see how quickly perceptions and expectations around innovations evolve.
Looking back we can see how public opinion regarded a piece of technology at the moment we were making the film, and how it changed over the 6 or 12 months leading up to its release – and hopefully we’re not behind technology!
Kirill: Does it worry you that the world of consumer technology is evolving so rapidly around us in that regard? Can your work quickly become outdated?
David: As a user, yes, because it’s hard to keep up sometimes. But as far as the work being outdated, in some ways we have to celebrate that. We play on the fact that it’s a film of the moment that reflects the cultural experiences of that point in time. It’s not just about the technology, but also about what the technology says about us as a culture. And I’ve always loved that about unashamedly 80s movies. For example, the idea of hoverboards in “Back to the Future” reflected our cultural expectations that skateboarding, so huge at the time, would exist in the future.
So the idea of a date stamp is not such a bad thing. If a film does date because of its views on technology, it can be quite enjoyable. And there’s an element of nostalgia in that, perhaps ten years into the future. You look back on it with fondness the same way you’ll look back at your current iPhone 5/6 when you’re holding your new iPhone 20 or whatever it might be.
Of course, there are other ways to explore technology and different films date in different ways.
In Ex Machina, Alex Garland [writer and director] was more interested in near-future thinking rather than techno-fantasy. He had done a lot of research around AI for the film to keep the story grounded, and we also undertook a lot of research into UI and UX in terms of where the technology is going. So our work in that film reflects the evolution of user experience towards simplicity, personalisation and ease of access, while it also reflects the narrative layers in the story.
Embedded just underneath the lovely uncluttered UI is access to programming code: we wanted it to feel that Nathan had created this OS for the benefit of the public, yet always kept close to the code, and could easily access and manipulate it.
So, while the work that we delivered made a loose statement on where UI and UX are going, it was also a technological window into narratives playing out around the film – a narrative device for the characters to monitor the activities around the building, for example.
I’m sure Ex Machina will date in a slightly different way than other films of that genre, because its focus is not to comment on future technology and it’s not tying itself into the cultural zeitgeist in terms of style. As a film about the thinking around AI and our relationship to technology at this time, it’s less a statement about aesthetics and more of a timeless piece.
Kirill: It feels like it’s tapping into the ongoing conversation around our anxiety as human beings that human society will be replaced by AI in our lifetime.
David: I think it’s angst about us being replaced by technology in general. Are our jobs going to be OK? Are we going be replaced by robots at some point? Will a big corporate invent a piece of software that makes designers obsolete?
We all have those concerns, but I think Ex Machina goes deeper than that. It touches on a lot of other themes that have nothing to do with technology, but everything to do with our cultural perceptions and prejudices, about relationships, gender equality and the right to control.
Kirill: And you mentioned that you’ve worked closely with the production designer Mark Digby.
David: Yes, and there were a couple of things going on in the art department, which was heavily involved with the look of the set at Pinewood. We worked with on-set graphic designer Andrew Tapper to develop the right branding for Blue Book. He wanted it to feel modernist, to be about the simplicity of UI with clean color palettes that had a slight Scandinavian feel to it, partly pulling from the minimalist interior set of the house and the stunning glacial environment.
So we were tying into those key aesthetics, but it kept coming back to the understanding of code.
Kirill: And going back to what you’ve mentioned about your wait for Jupiter Ascending to be released, you don’t even know how much of your work will be cut out and how much will make it into the final movie. How does that feel?
David: It fills me with fear. I have a lot of staff here who work really hard on producing stunning graphics, and they want to be able to see that work in the film. We always want to be proud of the work and our contribution to the story, but it’s always tentative because we never know exactly which screens will feature in the final cut. Fortunately, we’ve been lucky so far and a lot of our work has made it into the films.
Also, the fact that a lot of our work is on-set screen graphics means that while our screens are in shot, the directors are free to use them as background ambiance or to make a close-up narrative point – often we just don’t know how the set will translate in the final edit. Sometimes we may get an indication from the directors, but we can never be sure what makes it in.
This was the case on Jupiter Ascending, which in terms of process was very similar to Zero Dark Thirty. Kathryn Bigelow assumed that we would be delivering all the graphics for VFX after the film had been shot, and the Wachowskis were under the same impression.
Originally they asked for all the screens to be green, but as on-set screen graphics is my creative field, I asked for a meeting with Mark from Compuhire, Hugh Bateup the production designer and Peter Walpole the set decorator. While we’re happy to deliver to VFX and often do so, we were able to convince them that we could work with their butterfly screens on set and have live screens in front of the actors literally hours before the shoot. Our work with Ridley Scott on the set of Prometheus convinced them, and they asked us to do a test with one of their butterfly glass panels, hung at 45 degrees to the user. The problem with that angle, especially if you’re shooting at head height, is that the camera lens gets the light of the screen projected at it – but we were able to compensate for the optical warp distortion of the angle and the light bounce by invert-warping our designs and animations. We used that technique on Prometheus, and we had to really refine it for Jupiter Ascending. And the Wachowskis loved the immediacy and tangibility of it!
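The “invert-warping” David describes is, in essence, the classic pre-warp trick: if you know how a tilted flat panel distorts the artwork from the camera’s point of view, you can model that distortion as a planar homography and apply its inverse to the frames before playback, so the distortions cancel on camera. Here is a minimal sketch of the idea in plain numpy; the corner coordinates are purely illustrative, not measured values from any production, and real pipelines would warp whole images (e.g. with an image-processing library) rather than single points.

```python
import numpy as np

def homography(src, dst):
    # Direct Linear Transform: solve for the 3x3 homography H that
    # maps each src point to its dst point (H is defined up to scale).
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The null vector of A (smallest singular vector) holds H's entries.
    _, _, Vt = np.linalg.svd(np.array(A))
    return Vt[-1].reshape(3, 3)

def apply_h(H, pt):
    # Apply H in homogeneous coordinates and divide out the scale.
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Corners of the artwork as designed (a 1920x1080 frame)...
design = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
# ...and where those corners appear on camera once the glass panel is
# tilted (illustrative keystone-distorted positions, not set data).
observed = [(210, 90), (1700, 40), (1760, 1050), (160, 990)]

H = homography(design, observed)   # design space -> distorted camera view
H_inv = np.linalg.inv(H)           # the pre-warp that cancels the tilt

u, v = apply_h(H, design[0])       # where a design corner lands on camera
x, y = apply_h(H_inv, (u, v))      # pre-warping maps it back exactly
```

With four point correspondences the homography fits exactly, so pre-warping every frame through `H_inv` makes the playback appear undistorted from the camera position the correspondences were measured at.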
It was a lot of screens on a lot of ships, with different angles, shapes and effects. One amazing effect was created by the backlit bevelled etchings that Hugh had applied in the glass panels. Once the panels were lit, the combination of physical lighting with kinetic animations created a beautiful meld of physical design and digital motion design – a really unique effect.
Kirill: So everything was on set, even those angular hexagonal consoles?
David: Yes, which was fun. Obviously the holograms were done afterwards in VFX. But all those hexagonal screens were on-set projections. I suspect that someone boosted them in post-production to make them clean and vibrant.
We had Ryan and Nick (the two main designers on the project) on set with the Wachowskis, with the actors working on the main set. The Wachowskis would have a look at the graphics – Lana would be up on set with the actors, and Andy back in the tent looking at the monitor, checking how it was working. He would then walk around to our guys asking them to change the color to pink, for example, or have a different word pop up in a box. And they’d be quickly changing stuff as everybody was getting ready for the next take, rendering it out and sending it to the screen. It was not quite real-time, but we addressed that feedback on the graphics as quickly as we could.
And we like to do it that way because then it doesn’t end up being replaced in post, and the actor can see it in front of him and can react to it with authenticity. We’re always talking about connecting with the performance, and I think it’s really important for the work that we do. The design doesn’t control the actor’s performance in any way, but they’re able to point and look at the right thing on the screen and when they deliver a line, it’s in reference to something they see. And if the camera is at the right angle, you pick up both the screen context and the actor’s relationship to that digital content.
That’s really important to us and we like working with directors in that way. Those conversations with directors allow us to understand what they want out of the technology and graphics that we’re putting on set. We often work on the VFX delivery and at that point it tends to become more disconnected from the high-level conversations around the editorial and narrative and shot choice. At that point you’re more or less reacting to the shot that you’re given, but you’re not necessarily aware of the story that’s preceding or following it, which is not always ideal.
That disconnect exists for secrecy, obviously. But if we’re on set, seeing different takes, seeing why actors are doing certain things and why the director of photography is framing in a particular way, it becomes less reactive. We can offer up ideas and be a little bit proactive about our involvement in the feature film.
Kirill: Does it help that with all-digital on-set equipment you and everybody else can monitor what’s being captured by the camera right then and there?
David: Exactly. It was incredible on “Prometheus” because Ridley Scott was using several stereoscopic cameras, and you would see everything on the monitor as he was almost editing it right there. He would only do one or two takes, but that’s all he needed because he had one camera for each main character. So he could cut between all the different shots and you knew that he had an entire sequence in the bag.
For continuity with that way of filming you need to make sure that all of your screens line up in exactly the right way, and that is why these conversations are so important, and why it’s so important for us to be on set.
Kirill: How much planning happens in the pre-production phase?
David: We go through a lot of rapid prototyping to flesh it out and understand some of the problems we might come up against. Often people look at my work as creative director and think that it’s about the detail – that you look at that little piece of text, or where a particular menu goes. But a lot of my work on a film, if you compare it to what the designers and the animators do, is to look at the broader picture. How do these eight screens talk about this character or this set? Do they work as an overall brush stroke? Do they sit well in the overall picture on a wide framing?
And then my directors and motion designers are concerned about what happens when the camera goes over the character’s shoulder. Does it still look believable in a close shot? There’s a real balance between the technical details – making sure that every detail matters – and the broader picture of how the entire environment looks. Do all these screens tell an overall story? And that became true on Avengers: Age of Ultron as well.
Kirill: When did you start working on it?
David: We started in November 2013. It feels like such a long time ago, and it’s been our biggest project so far. We’ve had a couple of VFX replacements on the film, and as far as I’m aware, we were the only on-set graphics company. From everything I’ve seen, apart from a couple of the bigger VFX holographic 3D deliveries, it’s about 90% our work.
Kirill: Was it overwhelming at the beginning of the process to consider the scope of that work?
David: When we were first involved with the project, there was no final script and we didn’t know how many screens we’d be delivering. As a project evolves, sometimes you have characters that are not particularly scientific or technical, and you don’t see why they would have anything particularly complicated in terms of UI. But as your understanding of the character and story develops in meetings with the director or the production designer, you can suddenly find that the project is scaled up and goes from one computer screen to thirty. Other times it’s scaled down.
The requirement on screens changes as you’re working through pre-production, and sometimes into production. We definitely worked on one set that didn’t make it into the final film as a whole, but the screen graphics got split up and used in various other places.
When we first started working on Avengers: Age of Ultron, we knew that some of our screens would help the audience locate the action – so an Iron Man suit schematic would immediately say ‘this is Stark’s lab’. But we knew that Joss and Charlie really wanted the screens to reflect more rounded characters. So we began by breaking the tasks down by character: Bruce Banner, Tony Stark, some of the villains that we knew would require screens, and the new character Dr. Cho. We designated color palettes, design systems and content that would be appropriate to those characters.
For instance, Cho was a super-advanced scientist / doctor. The character could perform amazing surgery on some of the other heroes, so we designed advanced UI systems that were bespoke to her. To support her character, we did a lot of research into how advanced medical technology is developing. There were also a few technical discussions about 3D printing at the time, when the technology was quite fresh. We looked into medical uses and discussed 3D printing skin, 3D printing bone and other things.
For Banner and Stark, we explored the characters’ backstories and interests and how to reflect those in digital form. For example, Banner is interested in the ways plant biology and chemistry could influence the human body, because obviously he wanted to solve his own problem. And Stark was heavily into engineering research projects. We thought about his background, his love of classic cars and motorbikes and hotrods. We thought that maybe he’s reverse-engineering some of that technology. Maybe he’s looking at what NASA is doing and trying to understand how satellite technology works. Maybe he’s got his own satellite network up there that he’s making use of. And what other technologies are feeding into the Iron Man suit?
If you were building Iron Man and his suit now, who would you need to reach out to, to contract highly skilled metal engineers, hydraulics and power supplies? What sort of information would those providers give you? How would you use that information? What software systems are you using to look at those things?
Today Territory has different departments that work on projects as varied as product design, brand experience and digital apps. We can use these in-house skills to inform our understanding of real world processes and who the best placed providers are.
Recently there have been a lot of conversations with airplane manufacturers and the suppliers of those specific technologies, asking questions such as: how do you go from a concept to a piece of technology, and move that through the 3D software, design and engineering processes? How do you test that out? Are you running simulations? Are you stress-testing the materials? The understanding that we can tap into means that our designers can pull those references in to give the screens a bit of context and believability.
Kirill: When are these explorations and conversations happening? Do they involve the director and the production designer, given that, as you mentioned, the script wasn’t even finalized when you started working on the film?
David: At that beginning stage I want to understand the director’s vision for the film. What is he doing with the film that’s different from his previous one? Will we be repeating what other companies have already done, or could Territory contribute something new, adding value to the project?
And the key thing that came out of the conversation with Joss Whedon [writer / director] was about getting under the skin of the characters. Joss wanted to play on some of the human nature of those characters, and some of their weaknesses, and also get a little bit dark and gritty with the tone of the film.
Charles Wood [production designer] had another idea: that the screen graphics should marry high-tech Marvel UI with real-world data, context and material. For instance, if we’re doing MRI scans of the brain, how does that look when it’s merged with the highly stylized world of Marvel and the Avengers and the comic-book feel? We really enjoyed exploring how to ‘mash up’ those two spaces, discovering what visual languages and systems could come out of that.
That was particularly true of Banner. We brought in a lot of real-world imagery and content, and moved that into a more advanced Marvel-style superhero UI. It worked really well – giving the character a visual language that is highly stylized and beautiful, yet embedded in something more authentic and tangible to our own experiences of user interfaces and computer technology.
Kirill: The screens that I’m seeing in your portfolio for Age of Ultron are really dense, with a lot going on if you compare them to your other productions.
David: Yes, and the density is deliberate to reflect the character. Tony Stark needs to feel as though he’s one step ahead of everyone. You see this amazing drive and energy and intelligence in his character as he chucks around data, almost ahead of the system. And he’s used to working with Jarvis, a computer intelligence that has been designed to anticipate his next move and offer up pre-emptive content and information. This relationship between Stark and Jarvis is something I really loved about the Iron Man films and the original Avengers film.
There’s another reason for such a multi-layered approach, especially with Cho and Banner and, to a degree, Stark. This film was all about the group coming together, and working together in a collaborative way, and becoming aware of the collaborative environment and experience. So we explored how to bring a group to a project, with each individual accessing different aspects of it, yet working together in a way that is more useful than working in isolation.
We looked at current cloud developments, collaborative workflow software and zoomable map technology, and extrapolated that information can exist in one space, with a UX and UI that enabled ‘windows’ into a much larger data matrix. This worked well for Banner and Cho’s screens, which looked at different details of the same information.
Our designs became as much about suggesting activity at the periphery of the screen as about the details at its centre. And it was important to Charles specifically to layer this data, suggesting depth that could be moved through, a bit like layers of clouds. The characters could move through a layer of cloud and things would become clearer, then drop down again for more detail, pan around and then pop back out.
Kirill: Do shooting considerations and aesthetics affect the kinds of screens you primarily focus on? Should I be “worried” that we’re primarily seeing larger “desktop” screens instead of smaller handheld devices?
David: I think the screen selection is made based on what works as a narrative device for the story. Sometimes a little screen becomes that character’s personal space, and maybe it’s a connection between one character and another. The big screens are about bigger sets and making those sets feel alive. Mobile phones won’t help you out in the same way as a bank of large screens that can show a lot of ongoing activity when people are working hard to figure something out.
In the real world we have certain screens and certain content for certain functions. I use my iPad at home on my sofa, I use my phone on the train, and I use my laptop at work. Each use just feels natural to its purpose, and it’s the same within a film.
Kirill: Do you think that the proliferation of screens around us is somehow translating into feature productions? It feels like I’m seeing more and more screen time devoted to screen graphics.
David: I think it should happen more [laughs]. From what I’ve seen, I honestly can’t ask for any more of our screens to be in there and I think that Joss has been very generous with screen time, because they serve a certain purpose within the narrative space. An actor can talk about a location of a moving target and where it’s going, but it would be very complicated to describe it, and it wouldn’t be a particularly elegant way of delivering it. But you see a map on the screen with a target, and you see an enemy or something moving towards it, and you get a story very quickly in a shot.
This is especially true for large-scale battles with multiple heroes, helping the audience understand the relationships between those different heroes and foes, and where we are across those different locations. Screens even offer a really good way to relocate the audience sometimes. A lot of green screens? We’re definitely in Banner’s lab. Purple screens? We’re definitely with Cho now. It helps place the action and deliver particular story points.
You go back to the original Star Wars, and you saw X-Wings going down the trench and two red lines kind of targeting in, and you got that story point immediately. I think screens do that, and we’re comfortable with that kinetic moving backdrop in our lives. So it feels odd when it’s not there in a working environment, and essentially the Avengers are at work. When they’re not battling an army of big robots, they’re back at their base trying to figure something out.
These days we all access content, we figure things out, we use our tools, and those tools are screen-based. That is the reality of our lives, and in a film you have to make that interesting and sexy and tell a story. It’s a balancing act between the authenticity of how we understand technology and computer interfaces, and the dullness of that everyday reality. You’re moving it towards something more entertaining and vibrant, but still not too crazy.
Kirill: How does it go for you to work with different directors on these different franchises, each with his or her own approach to storytelling?
David: They’re all completely different, and you learn from every single director. There’s a reason that he or she is heading up a huge film. They’re incredibly talented. They know their stuff, especially when it comes to storytelling. I’m always learning from them and I always want to work closely with the director or the production designer whenever I can. The wealth of knowledge and craft that they have is phenomenal.
And while what we bring to the table is unique and specific, they map that out across a much bigger project and understand how it ties into other facets. More than anything, they want to tell a great story, a story that entertains or informs or moves someone in a certain way. I’m always learning from that and I’m always trying to figure out how we can do that with our medium. At the basic level, does this thing feel like a relevant piece of technology to the film? Does this help the actor in some way in their performance? Does it tell a particular part of the story that they couldn’t tell through other means?
Ultimately with movies like “Her” it can actually have an emotional impact on the audience.
Kirill: And that one was very low-key on the actual screen graphics.
David: In terms of screen graphics, yes. But in terms of technology and the operating system, it had a very high impact on the movie itself. I’m sure there are other means of doing that, perhaps holographically or stylistically. I’m thinking about Terminator, where seeing through his eyes and his understanding of the world is pretty powerful. Seeing the world through a robot’s vision ties back into Ex Machina and, more recently, our work on the short film “Miles”.
You start thinking about how robots perceive the world. It’s interesting, and playing that back through the film can be quite child-like.
Kirill: Or the work you did on Jupiter Ascending with organic flowing lines and non-rectangular screens for the alien ship interfaces. You start pushing the boundaries to explore what can happen to an interface when it is not designed by a human.
David: Absolutely. You go back to Guardians of the Galaxy, where we were exploring how different alien races would understand information. How would different alien races understand an advertisement or a gambling screen?
Jupiter Ascending was interesting because Hugh Bateup [production designer] wanted to explore how we depict forces that we are not aware of yet. You’re talking about gravitational fields or generating a wormhole. How do you depict something that isn’t visible to the naked eye? How do we hint at that process through data? Some of those things were happening on Prometheus as well. It goes back to the story points that just can’t be explained through dialog or music or a visual effects shot.
It’s quite exciting for us to be a small part of the storytelling process, and using data to do that. And it’s really using data and digital content to tell a small part of the story that couldn’t be told so eloquently in another medium.
Kirill: Does it flow back to other work that your studio is doing?
David: We’re working on a couple of different projects at the moment. We have virtual reality [VR] headsets in the mix, and they’re throwing up new problems and crossing over into different fields of interest for our studio. It’s product design, gaming, architecture, fashion, marketing, telling a story. How would viewing a film through a VR headset go? How do we make that work? Is that relevant or necessary? Are there particular story types that work really well for a VR experience?
And how does UI feature in that? If you can move your head anywhere, are there particular places where you put UI? Will it behave in a particular way, and why are you accessing this information through VR? These technologies are coming through, and we’re thinking about the purposes and useful ways of applying them, tracking back to the best interactive moments and the best ways of presenting information – for instance when you’re dealing with augmented reality, where the information is reactive to the space it’s projected over.
Kirill: There are a lot of hardware and software companies throwing a lot of money into their own VR headsets. Does it feel to you that we’re just around the corner from VR actually being usable as a mass product?
David: Definitely. I think that films have cued up the excitement around seeing the world in a different way. Think about Blade Runner or Ghost in the Shell. They were tantalizing the moment they came out, with the idea that we don’t need to re-engineer ourselves in some way to augment our understanding of the world. Now the technology is coming through that can sit in our field of vision and help us understand the world differently.
In the same way, virtual reality feels like a complete paradigm shift. It’s a way in which we can look at the world, understand the world, present information that helps us view things differently. You need content, you need information that is relevant to that new medium, and it feels like that is the question that everyone is asking at the moment. What entertainment, what content, what information is relevant to that technology space and that way of looking at the world?
For the first time in a long time we’re not talking about a flat rectangular screen, and that’s exciting for designers. We get to solve new problems. Movable type opened up a framework for graphic design; then you move forward to the Bauhaus and modernism and our understanding of laying out information – that was another shift. Then you jump forward to the Internet, being able to pull information from multiple different sources and cross-collaborate on flows – another shift.
It genuinely feels like VR is in the same place where we were at with mobile phones. In the UK in the 1980s the city bankers had these big brick mobile phones, and the general public would look at those and think that it’s not a usable piece of technology for everyday life. It’s expensive and impractical. And you fast-forward twenty years, and everybody has a mobile phone – my parents, my grandparents.
I think that we’re going to be in the same place with VR. There’s now a technology that makes sense, and we get that certain technologies are used in certain spaces. I don’t know if we’re going to see people walking in streets with VR headsets.
Kirill: That’s what I’m afraid of: ending up so physically detached from society.
David: I think there will be acceptable spaces and times for the use of VR. It might be work, it might be cinema, it might be a time where you sit down and get consumed by the content that is being promoted and presented to you. There’s a definite social aspect to it. Maybe we’ll all be sitting in our offices with VR headsets on, waving our arms around. There’s probably a film just around the corner that will help to promote that.
The medium definitely makes sense for certain content that will be heightened and leveraged in the right way. I think that’s what all these technology providers are trying to figure out. How does this content plug into the ecosystem to feel relevant to the mass market? These are the questions that we’re helping to figure out with other teams at Territory.
Film helps us. It presents those questions earlier than we would encounter them with our other clients. And it helps us consider them in an entertaining way. What works for a performance of an actor might not work at home. Tom Cruise can wave his hands around in Minority Report, but I’m not going to write an email doing that.
Kirill: That’s the fantasy part of Fantasy UIs, and it’d depend on how close the film is supposed to be to our present time.
David: Absolutely. I thought Minority Report was great, and maybe now we need a VR headset film to address that and make it feel exciting.
Kirill: The most interesting part about Minority Report was that “think tank summit” where they had invited scientists and technologists to discuss and explore this specific path in such a structured way.
David: Absolutely. VR is such a personal space. Iron Man, to a degree, is a precursor to that. And you might have different people collaborating in the same virtual space. There’s lots of uses, and it’s us executing it in the right way for it to feel relevant to the broader public.
Kirill: I can only hope that this materializes in a decently cohesive way in my lifetime. I’m thinking back to around 4-5 years ago where we had a glut of augmented reality [AR] mobile apps, and that fizzled out. It feels like there needs to be a visionary with enough money to throw at it.
David: If you have any of those, I’d love to speak to them [laughs].
Kirill: Going back to your film work, if I pause one of your movies and zoom into the finer details, would I happen to come across some hidden Easter Egg?
David: I think our artists may have worked their names in here and there as part of some bit of code or data in the UI, but certainly not in details that are of any significance in the film.
And the thing about Marvel is that they plan a long way ahead. Some is kept under wraps, and the bigger plan is revealed later on. You can pick up on the depth to which the directors think, and build that into the DNA. As a screen designer, if you look at things in a wider context, you start to notice details. Sometimes we work on something, having had conversations with the director and the production designer, and not understood certain decisions. You follow their lead and broadly understand some of the stylistic decisions, and then later on when you see the film or the sequel, you suddenly understand the reasoning behind them. It was very obvious to me that Joss had put a lot of thought into mapping out things that were happening. Maybe not everything was in place, but the threads are going in particular directions, and he wanted to imbue that in everything that was designed and visible in the film.
Kirill: That’s what keeps the hardcore fans watching and analyzing every single frame in every single movie in the franchise.
David: A lot of us here are some of those hardcore fans. It’s important for us. We enjoy it. It’s pleasurable to add that depth of thought into the work. For example, we received a 3D model of the Iron Man suit from ILM [Industrial Light and Magic], and spent probably half a day trying to open the thing because it’s so detailed. And that level of craftsmanship means that we can actually take a small chunk of an engineered piece of metal and design a screen out of it.
Every single company and department and individual is committed to adding that level of detail, depth and richness in their work. It comes from the core DNA, which comes from the director and the script. As long as those have a strong foundation, then all the work that we’re building comes from a good place. You can build those layers and that depth to be meaningful. And you can trace them to other narratives and characters in comic books and all the research that goes on behind the scenes.
Kirill: And you want to have passionate people and passion in yourself to be willing to work a year on a project such as this.
David: That’s really easy – it’s Avengers! The hard part is the personal pressure. You’re sitting at your desk and you’ve dreamed about this moment. You’re going to be working on a UI display for Stark. You’ve wanted to do this for some time and you open the 3D model of Iron Man that you got from ILM, and you have two days to design and animate the thing, because you have another ten that you have to do. And we are all putting our names to it.
It’s self-inflicted pressure. Obviously the director wants the film to look great and to have meaning and to tell a story. But, in terms of design and visual impact and impressing your peers – it’s all there. There’s no lack of motivation on any of these projects.
We had direct conversations with the director and the production designer about AI on Ex Machina. And it felt really relevant to the moment. And I loved the Matrix films. They were incredibly influential to me, and to work with those directors on Jupiter Ascending, and to see them and their process, and to be a part of their storytelling was really influential. So each one of these projects doesn’t lack motivation. It’s a lot of pressure that you apply to yourself, and you’re constantly worrying about things.
You go to the cinema to watch the film and you start asking questions. Have I missed something? Is it going to work against the action? How have they edited this? Have they taken that scene out and suddenly our graphics won’t make any sense? And you know that doesn’t happen because they’re considering every last element.
Kirill: That’s the part where you don’t have full control over how your work is incorporated into the film, through post-production, VFX and editing.
David: But they’re always making it better. That’s the beauty of everyone being at the top of their game. From the editor to the sound designer, everyone is working to elevate the work to another level. And good-looking actors really help as well!
We can’t complain. Yes, absolutely everyone after us leaves their imprint on the process and on the screen graphics. But there are no complaints there, it’s all part of the process. And going back to the collaborative process, it’s absolutely essential for our work to be elevated up to that level when the whole film is being judged by the public and will live or die based on audience figures. It’s very demanding and cut-throat in that respect. If you do your job well, it stands the test of time, it’s entertaining and it fulfills a spectrum of other criteria.
And here, yet again, I’d like to thank David Sheldon-Hicks for the wonderful work Territory Studio is doing on screen graphics and movie user interfaces, and for sharing the background materials for the interview. Both “Ex Machina” and “Avengers: Age of Ultron” are playing in theaters now.