Antilatency: camera tracking for Unity (demo inside)

Hi there, Virtual Production newbies, professionals and everyone in between! We know that setting up a VP studio can be quite overwhelming, especially when choosing a tracking system. There are various options to choose from, and this quick demo shows you how to use Antilatency with Unity. You can track cameras, objects and even body parts.

In this case, we used a tracking area set up on a truss system, attached one tracker to the camera and two trackers to the legs. All tracking data was transmitted to Unity via a proprietary radio protocol and used for real-time rendering. We didn’t apply any post-processing or cleanup — only real-time data.

Let us know what you think, and share your results with Antilatency if you’ve had the chance to work with it before. We’re also open to suggestions on what demos we should do next, so fire away!


Hey @iamrosokolov … looks very cool. Does your implementation take advantage of Live Capture so that Antilatency input can be aligned with other motion capture tools?

Thank you! In this demo we use our own Unity SDK. Thanks for the Live Capture link; we’ll jump into it and do some research.
