Carnegie Mellon University
July 25, 2025

PatternTrack for Seamless Multi-User AR

By Ashlyn Lacovara

Chris Harrison, affiliated faculty with the Extended Reality Technology Center (XRTC), and his lab, the Future Interfaces Group, showcased a new tracking technique called PatternTrack at this year's CHI conference. The research aims to simplify and synchronize multi-user augmented reality (AR) experiences, especially when users share the same physical space.


When multiple people use AR devices like iPhones, iPads, the Apple Vision Pro, or the Meta Quest 3 in the same room, ensuring that everyone sees the same virtual content, aligned the same way, is surprisingly difficult. Current systems require complex calibration, extra infrastructure, or visually distinctive environments, which makes consistent cross-device AR hard to achieve in practice.

PatternTrack is a tracking system that stands out for its simplicity, accuracy, and versatility. Here's what sets it apart:

  • No extra hardware or setup is required — it works without markers, external cameras, or pre-scanning the space.

  • It uses the infrared structured light already emitted by the VCSEL-driven depth sensors built into many modern AR devices.

  • It performs well on plain, featureless surfaces — where traditional tracking often fails.

  • It provides real-time 3D tracking, accurately determining both the position and orientation of nearby users' devices (a simplified sketch of this kind of relative pose recovery follows the list).
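
At the core of a system like this is working out where one device sits relative to another. The snippet below is not the published PatternTrack pipeline; it is a minimal sketch, assuming the same projected infrared dots have already been detected and matched as 3D points in each device's own coordinate frame, of the standard rigid-alignment (Kabsch/Umeyama) step that recovers a relative position and orientation from such correspondences. The function estimate_relative_pose and all example values are hypothetical.

# Minimal sketch (Python + NumPy), not the authors' implementation: recover the
# rigid transform (rotation R, translation t) that maps points expressed in
# device A's coordinate frame into device B's frame, given matched 3D points.
import numpy as np

def estimate_relative_pose(points_a, points_b):
    """Kabsch/Umeyama-style rigid alignment.

    points_a, points_b: (N, 3) arrays of corresponding 3D points in device A's
    and device B's coordinate frames. Returns (R, t) such that, for each pair,
    b = R @ a + t (exactly in the noise-free case, least-squares otherwise).
    """
    centroid_a = points_a.mean(axis=0)
    centroid_b = points_b.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (points_a - centroid_a).T @ (points_b - centroid_b)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection appearing in the SVD solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = centroid_b - R @ centroid_a
    return R, t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dots_a = rng.uniform(-1.0, 1.0, size=(20, 3))   # hypothetical dots seen by device A (meters)
    theta = np.deg2rad(30.0)                        # ground-truth relative yaw
    true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0,            0.0,           1.0]])
    true_t = np.array([0.5, -0.2, 1.3])             # ground-truth relative position (meters)
    dots_b = dots_a @ true_R.T + true_t             # the same dots in device B's frame
    R, t = estimate_relative_pose(dots_a, dots_b)
    print(np.allclose(R, true_R), np.allclose(t, true_t))  # True True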

The team tested PatternTrack across six different surface types and at inter-device distances of up to 2.6 meters. The results speak to its reliability (a short sketch of how such errors are typically measured follows the list):

  • Average position error: 11.02 cm

  • Average angular error: 6.81°
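
For context on how numbers like these are defined: positional error is typically the straight-line (Euclidean) distance between the tracked and ground-truth device positions, and angular error is the geodesic angle between the tracked and ground-truth orientations. The snippet below is a small sketch of those two measures, not the study's evaluation code; the function names and example values are illustrative only.

# Sketch of the two error measures (Python + NumPy); values below are made up.
import numpy as np

def position_error_cm(t_est, t_true):
    """Euclidean distance between two positions given in meters, returned in cm."""
    return float(np.linalg.norm(t_est - t_true) * 100.0)

def angular_error_deg(R_est, R_true):
    """Geodesic angle, in degrees, between two 3x3 rotation matrices."""
    # The relative rotation R_est @ R_true.T rotates by exactly the error angle;
    # its trace gives that angle via trace(R) = 1 + 2*cos(theta).
    cos_theta = (np.trace(R_est @ R_true.T) - 1.0) / 2.0
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

if __name__ == "__main__":
    t_true = np.array([1.20, 0.00, 0.80])   # hypothetical ground-truth position (m)
    t_est = np.array([1.28, 0.03, 0.75])    # hypothetical tracked position (m)
    theta = np.deg2rad(5.0)                 # hypothetical 5-degree orientation error about z
    R_true = np.eye(3)
    R_est = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])
    print(f"position error: {position_error_cm(t_est, t_true):.2f} cm")   # ~9.90 cm
    print(f"angular error:  {angular_error_deg(R_est, R_true):.2f} deg")  # ~5.00 deg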

PatternTrack represents a major step forward in enabling seamless, infrastructure-free shared AR experiences. From multiplayer gaming to collaborative virtual design, this innovation paves the way for richer, more immersive interactions — without the need for complicated setup or calibration.