Jon Favreau has long treated the film set as a laboratory for emerging technology. Having pioneered the use of "The Volume" — the massive, wraparound LED walls that replaced traditional green screens for The Mandalorian — the director is now turning to spatial computing to address a fundamental disconnect in large-format cinematography: the gap between the scale of an IMAX frame and the small monitors used on set during production.
During a recent appearance on The Town podcast, Favreau revealed that he commissioned Disney to build a custom Apple Vision Pro application for the production of the upcoming feature The Mandalorian and Grogu. The software allows him to virtually sit in a simulated IMAX theater while lining up shots, providing a sense of spatial fidelity that a standard production monitor cannot match. By viewing the live camera feed through the headset, Favreau can judge how a composition will actually translate to an audience sitting before a seven-story screen — in real time, on set, before committing to a take.
The monitor problem
The challenge Favreau is addressing is not new. Since IMAX cameras entered mainstream narrative filmmaking — a trend accelerated by Christopher Nolan's adoption of the format beginning with The Dark Knight in 2008 — directors have contended with a persistent mismatch. The aspect ratio and sheer physical scale of an IMAX projection bear little resemblance to the seven-inch or seventeen-inch monitors that populate a typical video village. Compositions that feel balanced on a small screen can appear cavernous or poorly weighted when projected onto a surface several stories tall. Conversely, subtle details that register on a monitor may vanish entirely in a vast auditorium.
Traditional workarounds have been limited. Some directors rely on experience and intuition. Others review dailies in full-size screening rooms at the end of each shooting day, a workflow that introduces delay and makes real-time adjustments impossible. Favreau's approach collapses that feedback loop: the headset becomes, in effect, a portable IMAX theater that travels with the director from setup to setup.
The technical plausibility of this approach rests on the Vision Pro's display hardware. Apple's headset uses micro-OLED panels with a pixel density high enough to simulate a large-format screen at close range without visible pixelation — a threshold that earlier generations of VR headsets could not reliably clear. Whether the simulation is a perfect proxy for a true IMAX projection is a separate question, but the goal is directorial confidence in framing, not color-accurate mastering.
Spatial computing finds a professional foothold
Favreau's use case is notable less for its novelty than for what it signals about where spatial computing may gain traction. Consumer adoption of the Vision Pro has been a subject of persistent debate since the device launched, with questions about price, comfort, and the breadth of its app ecosystem. Professional production environments operate under different economics. A headset that costs a few thousand dollars is a rounding error on the budget of a tentpole film, and the value proposition — replacing an expensive, time-consuming screening-room workflow with an on-set tool — is concrete and measurable.
Other filmmakers have already explored adjacent territory. Jon M. Chu reportedly used the Vision Pro during post-production on Wicked for remote editing and review. But Favreau's application sits further upstream in the pipeline, embedded in the live production process itself. That distinction matters: post-production tools augment an existing workflow, while on-set tools can reshape how decisions are made in the moment.
The broader pattern is consistent with Favreau's career arc. The Volume, which debuted on The Mandalorian in 2019, was initially regarded as an experiment; within a few years, LED-wall virtual production had become an industry-wide technique adopted across film and television. Whether a virtual screening room follows the same trajectory depends on factors beyond one director's enthusiasm — the maturity of the software, the willingness of studios to invest in custom applications, and the degree to which other filmmakers find the tool genuinely useful rather than merely novel.
What remains clear is that the gap between capture and exhibition — between what a director sees on set and what an audience experiences in a theater — has been a persistent friction point in large-format filmmaking. Favreau's experiment does not eliminate that gap, but it compresses it in a way that no physical monitor can. The question is whether the rest of the industry follows, or whether this remains an idiosyncratic tool for a director who has always been unusually willing to build his own.
With reporting from Engadget.