Enabling Privacy-Friendly AR Experiences

As AR moves toward everyday use, novel privacy concerns arise (e.g., environmental sensing techniques capturing sensitive physical areas or bystanders without their consent). To mitigate these risks, my PhD work establishes a foundation for adapting AR interactions to minimize exposure of sensitive data and to guide the development of privacy-friendly AR interfaces.

Thumbnail of Privacy Equilibrium Toolkit: Virtual avatars walk through a simulated environment and trigger a system-driven negotiation of AR sensing when they come into proximity with each other.

Privacy Equilibrium: Balancing Privacy Needs in Dynamic Multi-User Augmented Reality Scenarios

UIST 2025

Shwetha Rajaram, Jiasi Chen, Michael Nebeling

Thumbnail of privacy-driven AR adaptation technique design space. The top of the image shows an axis from system-driven to user-driven techniques, and the left shows an axis from full to partial sensing access. Illustrations of different adaptation techniques (e.g., silent speech, pre-defined selection areas) are mapped across the axes.

Exploring the Design Space of Privacy-Driven Adaptation Techniques for Future Augmented Reality Interfaces

CHI 2025, Honorable Mention

Shwetha Rajaram, Macarena Peralta, Janet G. Johnson, Michael Nebeling

Thumbnail of Reframe storyboarding system. In the middle of a cartoonish city environment, a Graffiti Artist character stands and sprays spam AR content into the environment. The outline of AR glasses and a hand gesturing at the character are shown.

Reframe: An Augmented Reality Storyboarding Tool for Character-Driven Analysis of Security & Privacy Concerns

UIST 2023

Shwetha Rajaram, Franziska Roesner, Michael Nebeling

Thumbnail of multi-user AR sharing technique design space. The top row shows 7 common AR interaction techniques (e.g., virtual menus, gestures, proximity-based interaction). These map to illustrations of access control proposals from our study below.

Eliciting Security & Privacy-Informed Sharing Techniques for Multi-User Augmented Reality

CHI 2023

Shwetha Rajaram, Chen Chen, Franziska Roesner, Michael Nebeling

Customizing Interactions with XR and GenAI-Enabled Systems

With wearable intelligent systems that support task continuity across settings (e.g., AR devices, AI-enabled smart glasses), users’ goals and their perceptions of privacy risks frequently evolve. Through internships and other projects, I developed interaction techniques that allow users to rapidly align context-aware systems with their goals.

Thumbnail illustrating glasses-based interaction techniques. A woman asks her assistant to remind her what happened in a previous meeting, and uses gestures to navigate the system responses. She receives haptic guidance through a wristband.

Gesture and Audio-Haptic Guidance Techniques to Direct Conversations with Intelligent Voice Interfaces

CHI 2025

Shwetha Rajaram, Hemant Bhaskar Surale, Codie McConkey, Carine Rognon, Hrim Mehta, Michael Glueck, Christopher Collins

Thumbnail of BlendScape (AI-generated video-conferencing environment, representing a creative office space for design ideation). Nic is rendered in the top left, while a video feed below renders his hand pointing to a physical notebook. Nathalie is rendered on the bottom right.

BlendScape: Enabling End-User Customization of Video-Conferencing Environments through Generative AI

UIST 2024, Honorable Mention

Shwetha Rajaram*, Nels Numan*, Balasaravanan Thoravi Kumaravel, Nicolai Marquardt, Andrew D. Wilson

Thumbnail of AI-generated environment from the SpaceBlender system. A hexagonal 3D environment shows a blended library and living spaces with large sunny windows.

SpaceBlender: Creating Context-Rich Collaborative Spaces Through Generative 3D Scene Blending

UIST 2024

Nels Numan*, Shwetha Rajaram*, Balasaravanan Thoravi Kumaravel, Nicolai Marquardt, Andrew D. Wilson

Thumbnail of augmented chemistry notebook created with the Paper Trail system. Intermixed with hand-written notes are an AR video tutorial for titration and AR audio playback controls. Shwetha's hand points at the play button to activate audio.

Paper Trail: An Immersive Authoring System for Augmented Reality Instructional Experiences

CHI 2022

Shwetha Rajaram, Michael Nebeling

Thumbnail of XRStudio system. The top right shows a video feed of Michael in his office, wearing a VR headset and drawing with his controllers. To the left, Michael is rendered within the VR environment and his drawings are visible to student avatars.

XRStudio: A Virtual Production Technology Probe for Immersive Instructional Experiences

CHI 2021

Michael Nebeling, Shwetha Rajaram, Liwei Wu, Yi Fei Cheng, Jaylin Herskovitz

Thumbnail of MRAT's in-situ interaction visualizations. AR points and arrows show different events captured from an AR usage session, such as a Status Update.

MRAT: The Mixed Reality Analytics Toolkit

CHI 2020, Best Paper

Michael Nebeling, Maximilian Speicher, Xizi Wang, Shwetha Rajaram, Brian Hall, Zijian Xie, Alexander Raistrick, Michelle Aebersold, Edward Happ, Jiayin Wang, Yanan Sun, Lotus Zhang, Leah Ramsier, Rhea Kulkarni