Foveated Rendering and Distance Perception in VR
I study how gaze-contingent foveated rendering changes egocentric distance judgments in VR, using Meta Quest Pro eye tracking in Unity.
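As a minimal sketch of the gaze-contingent part, the snippet below maps a scene point's angular eccentricity from the current gaze ray to a coarse shading-quality tier. The `gazeOrigin`/`gazeDirection` fields are assumptions standing in for whatever the eye tracker provides (e.g., Meta's OVREyeGaze component); the thresholds are illustrative, not the study's actual parameters.

```csharp
using UnityEngine;

// Illustrative gaze-contingent quality selection, not the study's actual
// implementation. Assumes an eye-tracking source fills in a world-space
// gaze ray each frame (here abstracted as plain fields).
public class GazeContingentFoveation : MonoBehaviour
{
    [Tooltip("Filled each frame by the eye tracker (assumed).")]
    public Vector3 gazeOrigin;
    public Vector3 gazeDirection = Vector3.forward;

    [Tooltip("Eccentricity thresholds in degrees for the quality falloff.")]
    public float fovealDeg = 5f;
    public float midDeg = 15f;

    // Map a world-space point to a shading-quality tier:
    // 0 = full quality at the fovea, 2 = cheapest in the periphery.
    public int QualityTier(Vector3 worldPoint)
    {
        Vector3 toPoint = worldPoint - gazeOrigin;
        float eccentricity = Vector3.Angle(gazeDirection, toPoint);
        if (eccentricity <= fovealDeg) return 0;
        if (eccentricity <= midDeg) return 1;
        return 2;
    }
}
```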
VR, AR, and AI researcher and lecturer at UMass Boston. Distance perception, foveated rendering, HCI, and robot learning with Isaac Sim.
I am a computer scientist working at the intersection of VR, AR, AI, graphics, and HCI. I build gaze-aware systems and study distance perception in immersive scenes. I also design human-in-the-loop data collection and evaluation for robot learning.
Stack highlights include C++, Python, C#, Unity, OpenGL, GLSL, and Maya. Current work focuses on eye-tracked foveated rendering and VR-to-Isaac Sim data pipelines.
I build a VR teleoperation pipeline to collect demonstrations in NVIDIA Isaac Sim and test whether VR-collected data improves robot learning on pick-and-place tasks.
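A minimal sketch of the VR side of such a recorder is below, assuming demonstrations are exported as timestamped end-effector poses plus a gripper signal and replayed in Isaac Sim offline. The CSV schema and field names are illustrative assumptions, not the pipeline's actual format.

```csharp
using System.Globalization;
using System.IO;
using UnityEngine;

// Illustrative demonstration recorder: samples a tracked controller pose
// at a fixed rate and logs it for offline replay. The output format is an
// assumption for this sketch.
public class DemoRecorder : MonoBehaviour
{
    [Tooltip("Tracked controller transform driving the robot end effector.")]
    public Transform controller;
    [Range(0f, 1f)] public float gripValue; // set by the input system

    private StreamWriter writer;

    void Start()
    {
        writer = new StreamWriter(
            Path.Combine(Application.persistentDataPath, "demo.csv"));
        writer.WriteLine("t,px,py,pz,qx,qy,qz,qw,grip");
    }

    void FixedUpdate()
    {
        // Record one timestamped pose + gripper sample per physics step.
        Vector3 p = controller.position;
        Quaternion q = controller.rotation;
        writer.WriteLine(string.Format(CultureInfo.InvariantCulture,
            "{0:F4},{1:F4},{2:F4},{3:F4},{4:F4},{5:F4},{6:F4},{7:F4},{8:F2}",
            Time.time, p.x, p.y, p.z, q.x, q.y, q.z, q.w, gripValue));
    }

    void OnDestroy() => writer?.Dispose();
}
```

Fixed-rate sampling keeps the trajectory evenly spaced in time, which simplifies retiming the poses to the simulator's control loop on the Isaac Sim side.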