Our UWB-based camera tracking solution consists of a small rover mounted on the camera and a minimum of 8 beacons positioned around the stage, which together calculate the camera's position to an accuracy of ±2 cm. An internal inertial measurement unit (IMU) measures pan, tilt, and roll. The AirPixel rover is connected to a Control Unit, from which data is sent to the render engine.
Because AirPixel uses radio (UWB) rather than optical tracking, its beacons can be placed behind a green/blue screen.
AirPixel works in bright sunlight, rain, mist etc., with a very small camera payload.
AirPixel covers stages of virtually unlimited size without any deterioration in position quality.
AirPixel features an inbuilt IMU providing pan, tilt and roll in addition to X, Y, and Z position.
AirPixel can integrate FIZ data to provide a complete data set to the render engine. We support popular rendering solutions, such as Unreal Engine, MotionBuilder & Disguise d3.
In the above video, Julian Thomas, Managing Director at Racelogic, explains how VIPS (VBOX Indoor Positioning System), since renamed AirPixel, is used as a tracking system to capture the position and orientation of the camera in virtual production and motion capture.
AirPixel consists of a number of UWB beacons which are placed around the stage and a small UWB receiver with inbuilt IMU that's mounted on top of the camera. The beacons communicate with each other to track the camera position at all times, whilst the internal IMU measures the camera pan, tilt and roll.
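The beacon-ranging step described above can be illustrated with a minimal 2D multilateration sketch. This is not Racelogic's actual algorithm, just a textbook least-squares example assuming known beacon positions and clean UWB range measurements:

```python
def multilaterate_2d(beacons, ranges):
    """Estimate an (x, y) position from beacon positions and measured ranges.

    Linearises the range equations by subtracting the first beacon's
    equation, then solves the resulting least-squares problem via the
    normal equations (a 2x2 linear system).
    """
    (x0, y0), r0 = beacons[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(beacons[1:], ranges[1:]):
        # 2(x0 - xi)x + 2(y0 - yi)y = ri^2 - r0^2 - (xi^2 + yi^2) + (x0^2 + y0^2)
        rows.append((2.0 * (x0 - xi), 2.0 * (y0 - yi)))
        rhs.append(ri**2 - r0**2 - (xi**2 + yi**2) + (x0**2 + y0**2))
    # Normal equations: (A^T A) p = A^T b, solved directly for 2 unknowns
    a11 = sum(a * a for a, _ in rows)
    a12 = sum(a * b for a, b in rows)
    a22 = sum(b * b for _, b in rows)
    b1 = sum(a * r for (a, _), r in zip(rows, rhs))
    b2 = sum(b * r for (_, b), r in zip(rows, rhs))
    det = a11 * a22 - a12 * a12
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det
```

With four or more beacons the system is over-determined, which is why extra beacons improve robustness to noisy ranges; the real system also fuses the IMU for orientation and smoothing.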
The UWB receiver is also connected to a VIPS/FIZ Processing Unit, which combines the position and orientation data with the lens information and sends it to the render engine.
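The combined data set can be pictured as one per-frame record of pose plus FIZ lens values. The structure and field names below are hypothetical, and JSON is used purely for readability; a real tracking link would typically use a compact binary protocol such as FreeD:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TrackingPacket:
    """Hypothetical per-frame sample: camera pose plus FIZ lens data."""
    x: float      # position in metres
    y: float
    z: float
    pan: float    # orientation in degrees
    tilt: float
    roll: float
    focus: float  # FIZ values, normalised 0..1 for this sketch
    iris: float
    zoom: float
    timecode: str

def encode(packet: TrackingPacket) -> bytes:
    # Serialise for transmission to the render engine (illustration only)
    return json.dumps(asdict(packet)).encode("utf-8")

sample = TrackingPacket(1.20, 0.85, 1.60, 45.0, -5.0, 0.0, 0.42, 0.5, 0.1,
                        "01:00:00:12")
payload = encode(sample)
```

The render engine then has everything it needs each frame: where the camera is, where it is pointing, and how the lens is set.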
AirPixel supports popular rendering solutions such as Unreal Engine, MotionBuilder, and Disguise d3, with new support always being added. Talk to us about how we can integrate with your workflow today.
This is the latest demo of our UWB/IMU-based camera tracking solution. The results in the video are all generated in real time with no post-processing.
In this setup, 12 fixed beacons are placed around the room, and the position and orientation are calculated on the receiver on top of the camera. The system updates at 100 Hz, but the output to the camera and computer is genlocked at 30 fps.
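The 100 Hz-to-30 fps step can be sketched as a simple sample-and-hold downsampler: for each genlocked frame tick, take the most recent tracker sample. This is an assumption about how such a stage works in general, not a description of AirPixel's internals:

```python
def genlock_output(samples, frame_rate=30.0):
    """Downsample a 100 Hz tracker stream to genlocked frame ticks.

    `samples` is a time-ordered list of (timestamp_seconds, pose) tuples.
    Returns one (tick_time, pose) pair per output frame, holding the
    latest sample available at each tick.
    """
    if not samples:
        return []
    out = []
    t0, t_end = samples[0][0], samples[-1][0]
    i = 0      # index of the latest sample not after the current tick
    frame = 0
    while True:
        tick = t0 + frame / frame_rate
        if tick > t_end:
            break
        while i + 1 < len(samples) and samples[i + 1][0] <= tick:
            i += 1
        out.append((tick, samples[i][1]))
        frame += 1
    return out
```

A production system would interpolate between samples and compensate for latency rather than simply holding the last value, but the frame-tick structure is the same.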
The tracking data is fed into Unreal, which generates the virtual studio, and the green screen video is composited with the graphics using an Ultimatte 12. AirPixel can also read the focus, iris and zoom values from a variety of cameras and controllers and feed these into Unreal.
LED walls are simply amazing tools for making films and TV programs, and they were the next stage for our ongoing camera tracking development. This video shows the ARRI stage in Uxbridge, where we have put 12 of our UWB beacons around the top of the walls and on various places around the sides.
The position and rotation of the camera are measured by our AirPixel system on the camera and sent to the Unreal Engine, which then generates the graphics presented on the walls. You can see the perspective of the trees changing as the camera tracks in front of the car, giving the impression of a real 3D environment from the camera’s perspective.
The in-camera effects are immediate and very realistic, saving a lot of time in post-production, with many obvious benefits to cost and workflow.
AirPixel is the ideal indoor positioning system for the film, games, TV and VR industries. Using UWB positioning techniques to deliver real-time, precise indoor tracking, AirPixel matches the accuracy of optical tracking systems whilst not requiring a clear line of sight. Possible applications include:
We are particularly keen to talk with integrators and technical partners wishing to explore just how effective VIPS can be when applied to challenges facing the media and entertainment industry.