Augmented Reality Privacy Challenges
Augmented and extended reality devices pose unique privacy challenges in a rapidly changing industry. With new products such as Meta’s “Metaverse” and Apple’s augmented reality services, it can be difficult to understand how such technologies may impact privacy going forward. Extended reality devices integrate sensors and communication systems to provide a more detailed form of augmented reality that includes a digital twin of the real world. Extended reality technology also typically includes a higher degree of positional tracking (such as eye and pointer tracking) to further enhance the experience. However, these enhancements require additional cameras to properly register what the user is trying to interact with (Slocum et al., n.d.). Many systems also use microphones, or components that monitor local networks, to improve the user experience.
Because of the large amount of data being collected, personal data may inadvertently be captured by these devices. Using digital image recognition, nearby individuals may be accidentally tagged or otherwise associated with a location or process. For example, a user wearing a recording device may record unaffiliated individuals in a specific area, and automatic tagging may then geolocate those individuals to the original location. This is an unintended and unauthorized privacy intrusion: individuals may find that they receive more targeted advertising because an additional service has become aware of their location. The sharing of this data will likely never involve the data subject.
With the advent of deepfakes, attacks using AR/XR technologies may increase. Because a user can record a subject without their knowledge, it may be possible to gather enough footage, stitched together from otherwise random XR recordings, to produce a deepfake mimicry of that subject.
This additional level of spatial data collection may also provide a new vector for deanonymization: avatar-based attacks. As key physical characteristics such as common motions, gait, and physical measurements are increasingly collected by sensor technologies, it is already possible to associate a known set of criteria with an individual based on movements alone (Falk et al., 2021).
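To make the idea concrete, the matching step of such an attack can be sketched as comparing a statistical "signature" of captured motion against signatures previously enrolled for known individuals. This is a minimal illustration only: the feature set (per-axis mean and standard deviation of head-pose samples) and the similarity measure are hypothetical simplifications, not the method used by Falk et al. (2021).

```python
import math

def motion_signature(samples):
    """Reduce a sequence of (roll, pitch, yaw) head-pose samples to a
    crude statistical signature: per-axis mean and standard deviation.
    (Hypothetical feature set; real attacks use far richer features.)"""
    sig = []
    for axis in zip(*samples):
        mean = sum(axis) / len(axis)
        var = sum((v - mean) ** 2 for v in axis) / len(axis)
        sig.extend([mean, math.sqrt(var)])
    return sig

def cosine(a, b):
    """Cosine similarity between two signature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def reidentify(unknown_sig, enrolled):
    """Return the enrolled identity whose signature best matches
    the unknown sample's signature."""
    return max(enrolled, key=lambda name: cosine(unknown_sig, enrolled[name]))
```

Even this toy version shows why the attack is worrying: the "enrollment" data need not come from the victim's consent, only from prior sensor captures.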
Malware specifically targeting XR systems has also been developed. Some of it acts to disorient or confuse individuals wearing XR devices by adjusting roll, pitch, and yaw values until the user loses their bearings. More privacy-centered malware classifies nearby individuals and uses behavioral identification to statistically infer what tasks those data subjects may be performing (such as social media use on a cellphone or spreadsheet editing on a laptop). This may allow threat actors to more precisely target watering-hole-style attacks going forward.
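A defensive counterpart to the disorientation attack described above is plausible: monitor the orientation telemetry stream and flag sudden jumps in roll, pitch, or yaw that no natural head movement would produce. The sketch below is a hypothetical illustration; the threshold and window size are invented tuning values, not parameters from any real headset SDK.

```python
from collections import deque

class OrientationGuard:
    """Flag suspicious jumps in (roll, pitch, yaw) telemetry of the kind
    a disorientation attack might inject. max_step_deg and window are
    hypothetical tuning values for illustration only."""

    def __init__(self, max_step_deg=30.0, window=5):
        self.max_step = max_step_deg
        self.recent = deque(maxlen=window)  # recently accepted samples

    def check(self, roll, pitch, yaw):
        """Return True if this sample looks like an injected perturbation;
        accepted (normal) samples are kept for comparison with the next one."""
        sample = (roll, pitch, yaw)
        if self.recent:
            prev = self.recent[-1]
            jump = max(abs(a - b) for a, b in zip(sample, prev))
            if jump > self.max_step:
                return True  # reject: larger step than plausible head motion
        self.recent.append(sample)
        return False
```

In practice such a filter would sit between the sensor fusion layer and the renderer, so that flagged samples are dropped or smoothed rather than displayed.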
Overall, XR and AR systems allow for greater flexibility both in the workplace and recreationally. However, these systems carry serious implications for both data privacy and user safety. As the technology advances, so too do the threats both enabled by and directed at that technology. Only through continuous improvement of regulations and defensive technologies can we ensure the safety of users and subjects alike.
References:
Falk, B., Meng, Y., Zhan, Y., & Zhu, H. (2021). POSTER: ReAvatar: Virtual Reality De-anonymization Attack Through Correlating Movement Signatures. Proceedings of the 2021 ACM SIGSAC Conference on Computer and Communications Security, 2405–2407. https://doi.org/10.1145/3460120.3485345
Slocum, C., Zhang, Y., Abu-Ghazaleh, N., & Chen, J. (n.d.). Going through the motions: AR/VR keylogging from user head motions.