Research Assistant at Indiana University
Summary
Developed a user-tracking system using computer vision and a CNN to analyze spatial behavior in VR environments, achieving 97.86% recognition accuracy. Collaborated with researchers to collect and analyze interaction data from 100+ participants, improving user experience and spatial design.
Impact
Automated data collection processes, reducing manual effort by 90% and enhancing efficiency in XR research.
My Contribution
Developed the automation pipeline
Collected data from 30 participants
Analyzed the data in Tableau
Team
1 UX Designer
3 Developers and Researchers
2 Architects
Timeline
1 Year
Tools
Google Docs
Figma
Zoom
Jupyter
Background
This project aims to understand how users navigate virtual worlds; the resulting behavioural patterns help improve the interior design of various floor plans.
Data Collection
Initially, to collect data from a hundred participants, the architects had to manually trace each participant's path, one at a time.
Automation
Automated the data-collection process by creating an AI model that detects the user's location from the egocentric view.
Reviewed existing methods of AI automation in VR to identify ways to build the model and visualize user-navigation data.
Interactive Geography - Ben Rydal Shapiro
Developed a hybrid model combining SIFT (Scale-Invariant Feature Transform) and ResNet (Residual Neural Network) for accurate location detection.
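The hybrid idea can be sketched in miniature: extract two feature vectors per frame (one standing in for SIFT keypoint descriptors, one for a ResNet embedding), concatenate them, and match against per-room reference vectors. This is an illustrative sketch, not the project's actual pipeline; the room names, feature values, and `predict_room` helper are all invented for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def predict_room(sift_vec, resnet_vec, references):
    """Concatenate both feature vectors and return the best-matching room."""
    query = sift_vec + resnet_vec
    return max(references, key=lambda room: cosine(query, references[room]))

# Hypothetical per-room reference vectors (SIFT part + ResNet part).
references = {
    "gallery":  [0.9, 0.1, 0.0] + [0.2, 0.8],
    "atrium":   [0.1, 0.9, 0.1] + [0.7, 0.1],
    "corridor": [0.0, 0.2, 0.9] + [0.5, 0.5],
}

print(predict_room([0.8, 0.2, 0.1], [0.3, 0.7], references))  # → gallery
```

Combining a hand-crafted descriptor with a learned embedding lets the matcher fall back on texture cues when the embedding is ambiguous, which is the motivation for the hybrid design.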
Captured screenshots from various locations in a virtual museum.
Cropped the images and removed UI icons from the data to improve recognition accuracy.
Added augmentation parameters to generate more similar training data, further improving recognition accuracy.
Created labels for each image and organized the images into per-label folders.
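The augmentation step above can be sketched library-free: each screenshot (here a 2-D grid of brightness values) is expanded into several variants via a horizontal flip and a small brightness jitter. The real pipeline would operate on image files with an imaging library; the grid values and variant count here are made up for illustration.

```python
import random

def augment(image, n_variants=3, seed=0):
    """Return the original image plus n_variants jittered copies."""
    rng = random.Random(seed)
    variants = [image]  # keep the original
    for _ in range(n_variants):
        # Decide once per variant whether to flip horizontally.
        flip = rng.random() < 0.5
        rows = ([list(reversed(r)) for r in image] if flip
                else [list(r) for r in image])
        # Jitter brightness slightly, clamping to the valid 0-255 range.
        delta = rng.randint(-10, 10)
        variants.append([[max(0, min(255, px + delta)) for px in r]
                         for r in rows])
    return variants

screenshot = [[10, 200], [30, 40]]  # toy 2x2 "screenshot"
dataset = {"gallery": augment(screenshot)}
print(len(dataset["gallery"]))  # 1 original + 3 augmented variants → 4
```

Keeping the original alongside the variants means the model still sees each room exactly as captured, while the jittered copies make recognition less sensitive to viewpoint and lighting.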
We switched to a new floor plan for the study.
We captured screenshots of the new rooms and rebuilt the model on top of the base model used previously.
Adjusted the training parameters to fit the new data, yielding a more accurate recognition system, and added a more elaborate traversal path.
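One way to picture the floor-plan switch: keep the frozen feature extractor and rebuild only the per-room reference vectors from screenshots of the new rooms. This is a simplified stand-in for retraining on the base model; the room names, feature vectors, and `fit_references` helper are invented for the sketch.

```python
def fit_references(labeled_features):
    """Average the feature vectors per room to form new reference vectors."""
    refs = {}
    for room, vectors in labeled_features.items():
        dim = len(vectors[0])
        refs[room] = [sum(v[i] for v in vectors) / len(vectors)
                      for i in range(dim)]
    return refs

# Hypothetical features extracted from screenshots of the new floor plan.
new_rooms = {
    "lobby":  [[1.0, 0.0], [0.6, 0.4]],
    "studio": [[0.0, 1.0], [0.2, 0.8]],
}
refs = fit_references(new_rooms)
print({room: [round(x, 2) for x in v] for room, v in refs.items()})
```

Because only the references change, the expensive part of the model is reused across layouts, which is what makes switching floor plans cheap.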
Conducted a pilot test with users and our team in the VR environment.
Implemented think-aloud protocols to gather qualitative insights during user interactions.
Used Tableau to evaluate user emotions and feedback from the survey data, building a picture of each participant's VR experience.
Created visual representations of user movement patterns and behavior, along with visualization of survey results.
Created a correlation metric to identify how emotions relate to one another and change with the VR experience.
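The correlation idea can be illustrated with a Pearson coefficient between pairs of survey scales. The ratings below are invented, not study data; they are chosen only to show the negatively correlated calm/stress pattern described in the findings.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-participant ratings (1-5) from post-session surveys.
calm   = [4, 5, 3, 4, 2]
stress = [2, 1, 3, 2, 3]
print(round(pearson(calm, stress), 2))  # ≈ -0.94: calmer starts, less stress
```

Computing this coefficient for every pair of emotion scales yields the correlation matrix that the Tableau visualizations summarize.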
Identified key pain points and opportunities to enhance VR spatial design and user experience.
Being calm at the start reduces stress, but doesn't fully counter sadness or anger.
Introducing joy early blocks most negative emotions while exploring the environment.
Fear and anxiety don’t fade naturally in VR and may even cause dizziness.
Developed a website to showcase findings and model performance, making insights accessible to stakeholders.
Planned for scaling the system using AWS for broader accessibility and real-time analysis, ensuring the model can be used for multiple layouts.
Big thanks to our team and especially Prof. Hoa Vo for guiding us in this study.
Feel free to drop me a message anytime.
I'm all ears for cool and creative ideas!
Let's make people's lives better with the help of AI.
desnehashsh@gmail.com
Resume