In “Mercy,” Chris Pratt plays a police detective accused of murdering his wife. The film is set in the near future, when the judicial system has been turned over to an AI “judge” played by Rebecca Ferguson. Until the final third of the film, everything takes place in a cyber-courtroom. Pratt’s character is trapped in a chair and has just 90 minutes to defend himself, but he has access to all the records, all the surveillance, and all the witnesses he wants. That means it all plays out on a gigantic screen. And that means the film’s editors, Austin Keeling and Lam T. Nguyen, had to make sure that the audience could try to solve the mystery by absorbing a great deal of information (files, security camera footage, interviews) on that screen.
In an interview with RogerEbert.com, Keeling and Nguyen talk about that process, as well as about finding ways to help audiences absorb a lot of information without getting lost in the data.
This interview has been edited for clarity.
Everyone looks at screens all day. How do you overcome the audience’s screen fatigue and make looking at screens so vivid and involving?
AUSTIN KEELING: We ARE all looking at screens all the time, so keeping it fresh and interesting was definitely one of the biggest challenges. We got lucky that the script is about a murder investigation, so everything shown on the screens is directly linked to the central mystery. We made sure to justify the existence of each screen, only including it if it was connected to the detective work that Chris Pratt’s character engages in throughout the film. This draws the audience in, as each new screen presents a new clue or piece of information, allowing them to sleuth right alongside the main character.
LAM T. NGUYEN: Our objective was to create an immersive experience for our audience, and what sets this format apart is the incorporation of 3D elements in the screens. This unique hybrid format combines traditional filming techniques, screen-life elements, and 3D effects. During the editing process, we developed a digital camera that applied a blur (rack focus) effect and pan moves to our POV shots. This technique allowed us to establish a cinematic visual language that made the film entertaining and engaging, enabling our audience to follow the story effectively.

How do you direct our attention to what you want us to see and sneak information into the edges of what we see so we need to watch it again?
AK: Another huge challenge was juggling the massive quantity of assets in this film, both footage and graphics. We built the film by assembling “wide shots” of the Mercy courtroom and populating them with all the necessary assets for a given scene (footage of Judge Maddox, the courtroom background, video calls, security footage, websites, emails, etc.). We meticulously placed and animated each of these assets to exist in a master wide, and then created a “digital camera” to act as Chris’s POV. With this, we could zoom in and use keyframes to animate the camera to “look around” at various points in the room. This allowed us to focus the camera (and the audience’s attention) on key details throughout the Mercy courtroom while still retaining the extensive supplementary content at the edges.
LN: While editing this with our director, Timur Bekmambetov, we discussed that any important information should be placed in the center of the frame. We made a conscious effort to center everything important and then populate the surrounding screens afterward. We relied on our characters’ emotions to determine how quickly or slowly we looked at the screens. Since multiple screens are visible at any given time, everything has to be consistent and realistic to match the primary screen we are looking at. So there are definitely some easter eggs in there.
How do you balance what seems familiar or at least recognizable with futuristic graphics that still have to feel real enough for us to believe and process them?
LN: During pre-production, we explored all visual aspects of the film with Timur and the VFX team. There was a point at which things looked super futuristic and cool. However, Timur insisted on pulling that back and keeping it closer to our reality. Since the story takes place in 2029-2030, we wanted to advance the technology without straying too far from what we’re used to, so it’s more relatable for our audience when they experience the film. When I first met Austin, I brought up the Apple Vision Pro (since it was a new release at the time), and we used it as a baseline to develop the film’s visual language.
AK: We wanted to make sure that the Mercy chamber was not too far-fetched or futuristic (the film takes place only a few years in the future, after all). We played around with a bunch of different designs and animations, but ultimately the collaboration with Timur and the VFX team led us to a simple, unadorned style for most of the floating screens. Lam and I looked at the Apple Vision Pro early on as a jumping-off point for organizing and animating screens in 3D space. And all of the websites and video call formats are based on existing references. Essentially, we tried to keep a level of familiarity in the UI and functionality of the Mercy system so that audiences would believe in the scenario and ease into the chaos and tension of this world.

Photo credit: Justin Lubin
© 2025 Amazon Content Services LLC. All Rights Reserved.
What do your home screens look like? Cluttered or very organized?
AK: I keep my home screen very clean and organized! I actually only have three folders on my desktop and nothing else. So the sequences in the film with all the screens of the Municipal Cloud flying past Chris were a fun challenge to tackle.
LN: I get annoyed if I leave notification badges on apps on my phone or laptop. I always try to clear them out. I’m a minimalist, so my home screen is very organized. It’s the only way I’m able to focus, haha. My edit timeline, however, can look very scattered and overwhelming at times, but I clean it up after every version of the edit.
What was your first conversation with Bekmambetov about the film like? What did he say his priorities were?
LN: Meeting Timur for the first time was an honor. He’s a true visionary who constantly thinks outside the box. I recall that the first thing he told me was to do research on stock footage and user experience. He wanted the movie to be as authentic as possible. Our priority was also to focus on the story’s dialogue first, then populate it with visuals afterward. He placed a lot of trust in us and allowed us to explore various approaches to present to him, and we would find a cohesive vision together.
AK: Timur was incredibly collaborative and trusting from day one. We had a general discussion about his vision for the film, and then he let us take a crack at putting together a pre-vis using stock footage, temp graphics, storyboards, and a table read Chris Pratt had done.
The main priority Timur had us focus on in this first pass was perfecting the rhythm of the back-and-forth dialogue between Chris Raven and Maddox, and using that as a backbone for the rest of the elements in the film. Lam and I built this version of the movie from the ground up (in less than three weeks!), and immediately started working with Timur to try out new ideas. We had daily discussions in which Timur would respond to the current edit and give us notes and experiments to try.
We continued to tweak the pre-vis right up until principal photography began, at which point the locked pre-vis cut served as a guideline for the shoot. Lam and I then began swapping out the temp assets for dailies as the footage came in. And once the shoot was over, we spent another 5 or 6 months in the editing room with Timur, shuffling, whittling, and experimenting with the materials we had to create the best version of the film possible.
How did you coordinate with composer Ramin Djawadi?
AK: Working with Ramin was a great experience. We first worked with an Australia-based music editor named Rod Berling, who took some early ideas from Ramin (as well as material from his previous scores) and delivered temp score options for key sequences in the film. As we progressed and got notes from Timur, Ramin would send over more ideas and themes, which Rod then incorporated into his edits. It was a seamless process that allowed us to work with temporary music that tested many of the key sounds and themes Ramin eventually incorporated into the final score.
LN: Our initial meeting with Ramin was to discuss the emotions for each scene. So we reviewed our editor’s cut that had a temp score. Ramin provided his input and fresh insights on some scenes. Once he sent samples, they seamlessly integrated into the cut. Our notes were minimal, but he was an excellent collaborator.
How did you coordinate with the VFX team?
LN: Editorial and VFX had to work in complete unison at every stage of the edit. We developed a unique turnover workflow because each shot comprises 10 to 18 screens. While VFX for a single shot was manageable, replicating the visuals of the entire edit posed a significant challenge. Eventually, we established a workflow that became a true synergy for us. Our VFX Supervisor Axel Bonami, VFX Producer Bryony Duncan, and the entire VFX team were incredibly collaborative.
Timur, being a highly visual individual, insisted on approving the film’s visual aesthetics during the editing phase. Subsequently, VFX had to meticulously replicate every movement, rack focus, and edit placement. The VFX team demonstrated exceptional attention to detail and delivered an outstanding final look for the 3D glass effect, as well as the immersive, surreal environment they created.
AK: We worked very closely with the VFX team throughout the entire process, and we were lucky to have both Editorial and VFX in the same post house for the entire post-production timeline (first in Los Angeles, and then in Sydney). Because we were always making so many changes in the editing room, we had to communicate with VFX multiple times a day and continuously keep each other in the loop.
Almost no decision in the edit was made without input from the VFX side of the process; in fact, many of the biggest creative conversations on the film occurred in the editing rooms between Timur, Lam, and me, the producers, and the VFX team. Shout out to VFX Supervisor Axel Bonami and VFX Producer Bryony Duncan for being such great collaborators on this wild journey!
What movies did you see growing up that made you think about the role of the editor?
LN: I remember seeing “Memento” for the first time and being in awe of it. The film’s editing was masterfully crafted, and I studied it many times. That inspired me to become an editor. I admired Dody Dorn for editing that film and recall thinking it would be amazing to meet her someday. So it was very serendipitous that Dody jumped on board with our film at such a late stage in the process. She was complimentary about what Austin and I had done, and she was amazing to work with, helping us get this film to the finish line. So we definitely want to give a shout-out to Dody Dorn for working on this film with us.
AK: That’s a tough one, but I guess it was the films I saw in the early 2000s that first made me notice editing and think about the art of filmmaking in general. Two movies that come to mind are “Requiem for a Dream” (for its very noticeable style and rhythm of editing) and “Magnolia” (for how it juggles and weaves together so many disparate storylines and characters).
What are you doing next?
AK: I’m still waiting for another movie to come my way, but when I’m not editing, I am the co-founder of an LA-based immersive theater company called E3W Productions. I’m currently developing two immersive theater activations, opening in New Orleans and Los Angeles later this year.
LN: Currently, I’m finishing up a feature called “Drifter” with director/actor Sung Kang (of the “Fast and Furious” franchise). I’m very excited for the film to come out, as we are doing a unique marketing push for it.