When technology, creativity, and emotion converge – innovation happens.
At Tomorrowland’s UNITY show inside the iconic Sphere in Las Vegas, Prismax achieved something unprecedented:
the first fully real-time visual experience ever performed live in the Sphere.
For the final hour of UNITY, every pixel on the Sphere’s 16K x 16K curved canvas was generated and controlled in real time – a historic milestone in immersive entertainment.
A world first in the world’s most advanced venue
The Sphere is not just another screen – it’s a living architecture of light, space, and motion.
Its massive interior LED surface envelops 18,000 people in a 180° world of pure visual storytelling.
Traditionally, content for the Sphere is pre-rendered and pre-calibrated because of the immense computational power required to run visuals at that scale and fidelity.
But Prismax, together with Unreal Engine and Disguise, pushed the boundaries further.
For the very first time, the show’s final hour was performed fully in real time, allowing the visuals to react dynamically to the music, lighting, and live energy of the room.
Rethinking the content pipeline
Leading the technical vision behind this breakthrough was Ruben Gorissen, Prismax’s CTO and head of innovation.
“Running real-time graphics at 16K by 16K resolution across a 180° LED dome isn’t just a scaling challenge – it’s rewriting how live visuals are created,” Ruben explains.
“We built a custom operating tool capable of delivering uncompromised visual fidelity, with real-time responsiveness and near-zero latency.”
To make this possible, Prismax integrated a hybrid Unreal Engine + Disguise workflow that combined:
- Real-time rendering with dynamic shader systems and procedural lighting;
- Show-control synchronization of DMX (lighting), OSC (lasers), and MIDI (controllers) to the musical score of UNITY;
- A VR application for previewing scenes and iterating between different camera and tree positions.
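One way to picture how several control protocols can feed a single real-time renderer is a router that normalizes each protocol's native value range onto shared parameters. This is a minimal, hypothetical sketch in Python using only standard DMX (8-bit) and MIDI (7-bit) value ranges; the class and parameter names are invented for illustration and do not reflect Prismax's actual tooling.

```python
# Hypothetical sketch: normalizing heterogeneous show-control inputs
# (DMX, OSC, MIDI) into one 0.0-1.0 parameter space a renderer reads
# each frame. Names and ranges are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class VisualParams:
    """Normalized 0.0-1.0 parameters the renderer samples per frame."""
    values: Dict[str, float] = field(default_factory=dict)

    def set(self, name: str, value: float) -> None:
        # Clamp so a stray control value can never push the renderer
        # outside its expected range.
        self.values[name] = max(0.0, min(1.0, value))


class ControlRouter:
    """Routes protocol-specific messages onto shared visual parameters."""

    def __init__(self, params: VisualParams):
        self.params = params

    def on_dmx(self, channel: int, value: int) -> None:
        # DMX channels carry 8-bit values (0-255).
        self.params.set(f"dmx/{channel}", value / 255.0)

    def on_midi_cc(self, controller: int, value: int) -> None:
        # MIDI continuous controllers carry 7-bit values (0-127).
        self.params.set(f"midi/cc{controller}", value / 127.0)

    def on_osc(self, address: str, value: float) -> None:
        # OSC floats are assumed pre-normalized by the sender here.
        self.params.set(address, value)


params = VisualParams()
router = ControlRouter(params)
router.on_dmx(1, 255)             # full-intensity lighting channel
router.on_midi_cc(74, 64)         # a fader near mid-travel
router.on_osc("/laser/hue", 0.3)  # normalized laser color input
```

The design choice being illustrated: by converting every protocol to the same unit range at the boundary, the rendering side never needs to know whether a parameter came from a lighting desk, a laser controller, or a MIDI surface.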
The power of real-time creation
By harnessing Unreal Engine’s real-time rendering capabilities, Prismax could blend cinematic fidelity with interactive adaptability.
Instead of rendering frames weeks in advance, the visuals were generated, composited, and manipulated in the moment, giving creative operators the ability to adjust light, atmosphere, and motion live inside the Sphere – an unprecedented level of creative control.
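The distinction between baked frames and frames generated in the moment can be reduced to a toy loop that re-reads live operator input before every frame. Everything below is invented for illustration; it is not the Unreal Engine or Disguise pipeline, just the shape of the idea.

```python
# Toy illustration of real-time vs. pre-rendered playback: each frame
# re-samples a live operator input instead of replaying baked values.
from typing import Callable, List


def render_frame(t: float, brightness: float) -> str:
    # Stand-in for a real renderer: returns a description of the frame.
    return f"t={t:.2f}s brightness={brightness:.2f}"


def realtime_show(duration_frames: int,
                  operator_brightness: Callable[[int], float]) -> List[str]:
    """Render frames on a 60 fps timeline, querying live input per frame."""
    frames = []
    for i in range(duration_frames):
        t = i / 60.0                 # frame time on a 60 fps timeline
        b = operator_brightness(i)   # live input, not baked weeks ahead
        frames.append(render_frame(t, b))
    return frames


# An operator fading the lights up over the first few frames:
frames = realtime_show(3, lambda i: min(1.0, i * 0.5))
```

In a pre-rendered pipeline, `operator_brightness` would have been evaluated once at render time; here it is evaluated at show time, which is what lets the visuals react to the room.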
“It’s not just about visuals,” Ruben adds. “It’s about transforming technology into a living instrument – one that responds emotionally.”
Partnerships that push the boundaries
This world-first achievement was made possible through deep collaboration with Epic Games’ Unreal Engine and Disguise, whose real-time media servers and virtual production tools allowed Prismax to blend innovation with stability on a massive scale.
The result: a seamless, fully responsive visual environment that evolved with the show’s final act – a breathtaking crescendo that united music, motion, and emotion in perfect harmony.

A milestone for the industry
With this real-time integration, Prismax has not only expanded the creative potential of the Sphere but also redefined what’s possible in large-scale immersive entertainment.
This project stands as proof that the future of live visual storytelling lies in real-time technology – where art, code, and performance become one.
“This was a defining moment,” says Ruben. “We didn’t just play visuals – we performed them.”