02
Pixera: End to End 2110
In February 2025, working with Landon Looper, I was invited by Conor McGill, Director of PIXERA USA, to contribute a keynote presentation and live demonstration to the SMPTE ST 2110 End-to-End Showcase at PIXERA's Santa Monica studio. The invitation came out of a conversation at the 2024 NAB Show — the kind of conversation where two people realize they're asking the same question from different directions. PIXERA's question was about the future of networked media infrastructure. My question was the same one I always ask: how do you build a system where the technology disappears and what's left feels alive?
The demonstration I built is called The Exquisite Sammich, named after the Exquisite Corpse — the surrealist technique where multiple artists contribute to a single work without seeing each other's contributions, trusting the collision to produce something none of them could have made alone. That's exactly what we built: a real-time generative world running simultaneously across Unreal Engine, TouchDesigner, Unity, a Panasonic UE160 PTZ camera running neural depth mapping, PIXERA media servers, and a live LED wall output through Megapixel VR and ROE Visual, all networked together over SMPTE ST 2110.
Each system had a completely different role. The camera fed live video through a neural network that generated real-time point clouds and fluid simulations in TouchDesigner. Those fluid simulations, including a cascading waterfall built as an active particle system, ran on a dedicated PIXERA PX4 server. Unreal Engine and Unity handled real-time rendering on separate PX4 servers. A fourth server managed UI and system control. Panasonic's Kairos handled live switching. All of it flowed as uncompressed ST 2110 streams across a Netgear AV network, composited in real time and output to the LED wall simultaneously.
What made it work — and what made it feel like something more than a technical demonstration — was that none of it had seams. The waterfall wasn't a video playing. It was a live particle simulation revealing the Unreal Engine world behind it as the particles dispersed, as if water itself were pulling back a curtain. The entire ecosystem breathed together. Water, light, motion, all generated by completely different engines, coexisting as naturally as elements in nature do because the infrastructure was built to let them.
This is proof of concept for something I believe is coming: media production as a living system rather than a pipeline. Not a signal chain from camera to output but a network of collaborating intelligences, each contributing something distinct, none of them aware of the seams. The technology disappears. What's left is a world.
It all began at the 2024 NAB Show, where we ran into a longtime friend who had recently joined the PIXERA team. They told us about PIXERA’s new studio in Santa Monica, built as a hub for innovation in real-time media workflows.
In the middle of the show floor, we found ourselves in an impromptu conversation with Conor McGill, Director of PIXERA USA. The discussion quickly turned into something bigger: how could PIXERA's tools and our expertise in live, interactive workflows push the limits of SMPTE ST 2110? How could we bridge technology and creativity in a way that feels organic and alive?
We shared our past projects that experimented with AI, real-time rendering, and generative media, and the conversation naturally evolved into a bigger idea: What if we built something together? Something that not only showcased technical excellence but also created a new kind of interactive world?
A few months later, PIXERA reached out with the idea for the ST 2110 End-to-End Showcase, and we knew we had to be part of it.
For this showcase, we wanted to break away from traditional production workflows and instead craft a living, generative world: a space where digital elements evolved together like an ecosystem. The idea became known as The Exquisite Sammich, inspired by the Exquisite Corpse, a surrealist art technique where multiple artists contribute to a single work without seeing the full picture.
We designed a real-time generative nature system, where multiple software engines and media servers worked together as if they were part of the same organic world. Each system had a unique role, but they communicated seamlessly over SMPTE ST 2110, ensuring that every element, whether fluid simulation, environmental rendering, or real-time compositing, felt like part of a unified whole.
This entire experience was powered by a fully networked infrastructure, with contributions from our partners, including:
- PIXERA media servers handling compositing and output
- Panasonic’s Kairos for live production switching
- Megapixel VR and ROE Visual for LED processing and display
- Netgear AV and Matrox Video for high-bandwidth networking and conversion
At the heart of the Exquisite Sammich was a fully networked SMPTE ST 2110 setup, built to move uncompressed video and data across multiple processing nodes without a hitch. We integrated PIXERA media servers, Unreal Engine, TouchDesigner, Unity, and a mix of hardware components to create a dynamic, real-time nature simulation. The whole system relied on a high-speed Netgear AV network, linking everything together through ST 2110 multicast streams.
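At the transport layer, each ST 2110 essence (video, audio, or ancillary data) is a stream of RTP packets on its own IP multicast group, which any node on the network can subscribe to. The sketch below shows the bare mechanics of that subscription using only Python's standard library. The multicast address and port are hypothetical examples; in a real ST 2110 system they would come from the SDP description the sender publishes.

```python
import socket
import struct

# Assumption: example multicast group and RTP port for one video essence.
# Real values are advertised by the sender, not hard-coded.
GROUP = "239.1.1.10"
PORT = 20000

def open_st2110_receiver(group: str, port: int) -> socket.socket:
    """Open a UDP socket joined to an ST 2110 multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # IP_ADD_MEMBERSHIP takes the group address plus the local
    # interface to join on (0.0.0.0 = let the OS pick).
    mreq = struct.pack("4s4s", socket.inet_aton(group),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Usage (each datagram received is one RTP packet of the essence):
#   sock = open_st2110_receiver(GROUP, PORT)
#   packet = sock.recv(2048)
```

This is only the IP plumbing; everything above it (timing, packing, synchronization) is what the ST 2110 suite and the switch fabric handle.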
Our goal was to build a digital world that felt alive. We wanted elements like water, air, and organic motion to interact as if they were part of the same ecosystem, even though they were generated by completely different technologies. Just like in nature, where everything coexists seamlessly, our setup blended real-time physics simulations, machine learning depth mapping, and high-end rendering into one cohesive experience.
A Panasonic UE160 PTZ camera, running over SMPTE ST 2110, captured live video, which was then processed through a neural network for real-time depth mapping. This allowed us to generate point clouds and fluid simulations inside TouchDesigner, forming the foundation of this constantly evolving environment.
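The step from a neural depth map to a point cloud is a standard pinhole back-projection: every pixel with an estimated depth becomes an XYZ point. A minimal sketch of that math is below; the intrinsics (focal length, principal point) are made-up placeholder values, since the real ones depend on the UE160's lens and the depth model's output resolution.

```python
import numpy as np

# Assumption: placeholder intrinsics for a 320x240 depth map.
FX = FY = 800.0        # focal length in pixels
CX, CY = 160.0, 120.0  # principal point

def depth_to_point_cloud(depth: np.ndarray) -> np.ndarray:
    """Back-project a depth map (meters per pixel) into an (N, 3) XYZ cloud."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    # Pinhole model: x = (u - cx) * z / fx, y = (v - cy) * z / fy
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop pixels with no depth estimate

# In the showcase pipeline the depth map came from the neural network
# running on the live camera feed; here it would be e.g.:
#   cloud = depth_to_point_cloud(depth_estimate)
```

TouchDesigner performs this kind of conversion on the GPU, but the geometry is the same.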
To keep everything running smoothly, we split the computational load across multiple systems. The fluid simulations, like a cascading waterfall, were handled on a dedicated PIXERA PX4 server. Meanwhile, Unreal Engine and Unity managed real-time rendering on separate PX4 servers, ensuring smooth compositing and interactivity. A fourth PIXERA server (PX2) was used for UI control and system management, making it easy to adjust parameters and interact with the environment in real time.
One of the most visually striking elements was the waterfall. It wasn't just a graphic; it was an active particle simulation that dynamically revealed the Unreal Engine-generated world behind it. As particles moved and dispersed, they acted as a natural transition layer, creating the illusion of depth, as if water were refracting light and revealing hidden layers underneath. Seeing it all come together in real time really underscored the power of networked media workflows and what's possible when you mix physics-based simulations with real-time rendering.
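Conceptually, the reveal works like an alpha matte driven by particle density: where the particles are dense, the water layer shows; where they disperse, the rendered world behind it comes through. A toy CPU version of that idea, with arbitrary dimensions and particle counts chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumption: toy resolution and particle count for illustration.
H, W, N = 64, 64, 5000

def particle_alpha(xs, ys, h=H, w=W):
    """Splat particle positions into a coverage map, normalized to a 0..1 matte."""
    alpha = np.zeros((h, w))
    np.add.at(alpha, (ys.astype(int) % h, xs.astype(int) % w), 1.0)
    return np.clip(alpha / alpha.max(), 0.0, 1.0)

def composite(fg, bg, alpha):
    """Dense particles show the foreground (water); sparse regions
    reveal the background (the rendered world)."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])

xs = rng.uniform(0, W, N)
ys = rng.uniform(0, H, N)
alpha = particle_alpha(xs, ys)
# White "water" over a black "world" stands in for the real render layers.
frame = composite(np.ones((H, W, 3)), np.zeros((H, W, 3)), alpha)
```

The production version ran as a GPU particle system in TouchDesigner composited in PIXERA, but the layering logic is the same.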
The processed data was sent to PIXERA’s compositing engine, where it was combined with real-time rendering from Unreal Engine to generate a fully immersive scene. This composited image was encoded as an ST 2110 stream and distributed to multiple outputs, including a ROE LED wall powered by a Megapixel VR processor and a projection system from AV Stumpfl. Meanwhile, Panasonic’s Kairos served as the event’s live production core, integrating camera feeds and additional media for real-time switching and output.
By leveraging ST 2110 for both video and data transport, the system enabled low-latency, high-fidelity media processing, allowing multiple tools and render engines to collaborate in real time without signal degradation. This demonstrated how networked workflows can seamlessly integrate AI-driven image processing, live video, and high-end visual effects in a unified, IP-based production environment, bridging the gap between digital and organic worlds.
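Under the hood, every ST 2110 essence packet begins with the standard 12-byte RTP header defined in RFC 3550, which carries the sequence number used to detect loss and the PTP-locked timestamp that lets independent streams align. A minimal parser for that fixed header, as a sketch of what a receiver inspects on every packet:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550) that every
    ST 2110 essence packet begins with."""
    if len(packet) < 12:
        raise ValueError("packet shorter than an RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,          # always 2 for RTP
        "marker": bool(b1 & 0x80),   # set on the last packet of a video frame
        "payload_type": b1 & 0x7F,
        "sequence": seq,             # detects loss and reordering
        "timestamp": ts,             # media clock, PTP-locked per ST 2110-10
        "ssrc": ssrc,                # identifies the stream source
    }
```

A receiver uses the sequence number to spot dropped packets and the timestamp to line each essence up against the others on the common PTP timebase.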
We want to extend our deepest thanks to PIXERA and the entire AV Stumpfl team, especially Conor McGill, for inviting us to be part of this showcase and trusting us to bring something truly experimental to life.
To the partners, Panasonic Connect, Megapixel VR, ROE Visual, Netgear AV, Matrox Video, Creative Technology, and FUSE Technical Group, thank you for bringing your expertise, your technology, and your willingness to collaborate. And to everyone who attended, engaged, and explored The Exquisite Sammich with us, thank you. This is just the beginning of something bigger, and we can't wait to explore what's next.
Until next time.