Toward Multimodal Privacy in XR: Design and Evaluation of Composite Privatization Methods for Gaze and Body Tracking Data
Azim Ibragimov, Ethan Wilson, Kevin R. B. Butler, Eakta Jain
Stop applying privacy mechanisms to each sensor stream independently. Deploy composite obfuscation that treats gaze and body telemetry as a unified attack surface. Prioritize this for social XR platforms where behavioral biometrics pose the highest re-identification risk.
XR headsets leak identity through combined gaze and body telemetry. Naively pairing unimodal privacy mechanisms—eye privatization plus body privatization—fails to prevent re-identification when attackers fuse both signals.
Method: The authors introduce composite privatization methods that jointly obfuscate gaze and body tracking data rather than protecting each stream independently. They first show that an attacker can re-identify users by correlating patterns across modalities even when each stream is individually privatized. Their composite approach then applies coordinated noise injection and temporal smoothing across both data streams simultaneously, reducing re-identification accuracy while preserving interaction fidelity for XR applications.
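The coordination idea can be sketched in code. This is an illustrative toy, not the authors' actual mechanism: it couples the two streams by driving their per-frame noise from a single shared random draw, then applies exponential smoothing to each. All names and parameters (`noise_scale`, `alpha`, the 0.01 gaze-to-body scale) are hypothetical choices for the sketch.

```python
import random

def privatize_multimodal(gaze, body, noise_scale=0.5, alpha=0.3, seed=None):
    """Jointly obfuscate a gaze stream (degrees) and a body stream (meters).

    Toy sketch of composite privatization: one shared Gaussian draw per frame
    perturbs both modalities, so the noise is correlated across streams and
    cannot be averaged out by attacking each stream independently. An
    exponential moving average then smooths each stream over time.
    """
    rng = random.Random(seed)
    out_gaze, out_body = [], []
    smoothed_g = smoothed_b = None
    for g, b in zip(gaze, body):
        shared = rng.gauss(0.0, 1.0)  # single draw coupled across modalities
        g_noisy = [gi + noise_scale * shared for gi in g]
        b_noisy = [bi + 0.01 * noise_scale * shared for bi in b]  # assumed scale
        if smoothed_g is None:
            smoothed_g, smoothed_b = g_noisy, b_noisy
        else:
            smoothed_g = [alpha * n + (1 - alpha) * p
                          for n, p in zip(g_noisy, smoothed_g)]
            smoothed_b = [alpha * n + (1 - alpha) * p
                          for n, p in zip(b_noisy, smoothed_b)]
        out_gaze.append(list(smoothed_g))
        out_body.append(list(smoothed_b))
    return out_gaze, out_body
```

Because the noise term is shared, an adversary who subtracts estimated noise from one stream necessarily distorts their estimate of the other, which is the intuition behind treating both streams as one attack surface.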
Caveats: The evaluation covers a specific set of attack models; real-world adversaries may develop novel fusion techniques that exploit correlation patterns the tested attacks do not.
Reflections: How do composite privatization methods scale to additional modalities like hand tracking or facial expressions? · What is the minimum interaction fidelity threshold users will tolerate for privacy gains in different XR contexts? · Can machine learning models trained on privatized multimodal data maintain utility for personalization features?