Research Assistant at Indiana University

Occupant Behaviour Analysis in Virtual Reality

Summary
Developed a user-tracking system using computer vision and a CNN to analyze spatial behavior in VR environments, achieving 97.86% accuracy. Collaborated with researchers to collect and analyze interaction data from 100+ participants, improving user experience and spatial design.

Impact
Automated data collection processes, reducing manual effort by 90% and enhancing efficiency in XR research.

Brief

My Contribution

Developed the automation

Collected data from 30 participants

Analyzed data in Tableau

Team

1 UX Designer

3 Developers and Researchers

2 Architects

Timeline

1 Year

Tools

Google Docs

Figma

Zoom

Jupyter

Opportunity

The Problem

Background
This project aims to understand how users navigate virtual worlds, revealing behavioural patterns that can improve the interior design of various floor plans.

Data Collection
Initially, to collect data from a hundred participants, an architect had to manually trace each participant's path, one at a time.

My Solution

Automation
Automated the data-collection process by creating an AI model that detects a user's location from the egocentric view.

Research

Literature Review

Reviewed existing methods of AI automation in VR to identify ways to build the model and visualize user navigation data.

Interactive Geography - Ben Rydal Shapiro

SIFT + ResNet-50

Developed a hybrid model combining SIFT (Scale-Invariant Feature Transform) and ResNet-50 (a Residual Neural Network) for accurate location detection.
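
To make this concrete, here is a minimal sketch of how such a hybrid feature extractor could look, assuming PyTorch, torchvision, and OpenCV. The structure mirrors the idea (SIFT for local texture, ResNet-50 for global context), but all names and values are illustrative, not the project's exact implementation:

```python
import cv2
import numpy as np
import torch
import torchvision.models as models
import torchvision.transforms as T

# ResNet-50 backbone with the classification head removed,
# used as a 2048-d global feature extractor.
resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
resnet.fc = torch.nn.Identity()
resnet.eval()

preprocess = T.Compose([
    T.ToPILImage(),
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

sift = cv2.SIFT_create()

def extract_features(bgr_image: np.ndarray) -> np.ndarray:
    """Concatenate pooled SIFT descriptors with a ResNet-50 embedding."""
    # SIFT: local keypoint descriptors, mean-pooled into a fixed 128-d vector.
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    _, descriptors = sift.detectAndCompute(gray, None)
    sift_vec = descriptors.mean(axis=0) if descriptors is not None else np.zeros(128)

    # ResNet-50: one global 2048-d embedding of the whole frame.
    rgb = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        cnn_vec = resnet(preprocess(rgb).unsqueeze(0)).squeeze(0).numpy()

    return np.concatenate([sift_vec, cnn_vec])  # 128 + 2048 = 2176-d feature
```

The combined vector can then feed any classifier (e.g., a linear layer or k-NN) that maps features to room labels.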

Base Model

Data Collection

Captured screenshots from various locations in a virtual museum.

Data Cleaning

Cropped the images and removed overlay icons from the data to improve recognition accuracy.
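
A small illustrative cleanup pass, assuming Pillow and a fixed overlay region; the paths and crop coordinates are hypothetical:

```python
from pathlib import Path
from PIL import Image

RAW_DIR, CLEAN_DIR = Path("screenshots/raw"), Path("screenshots/clean")
CLEAN_DIR.mkdir(parents=True, exist_ok=True)

for img_path in RAW_DIR.glob("*.png"):
    img = Image.open(img_path)
    w, h = img.size
    # Drop the top strip where menu icons overlay the egocentric view.
    img.crop((0, int(0.1 * h), w, h)).save(CLEAN_DIR / img_path.name)
```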

Data Augmentation

Added augmentation parameters to generate additional similar images, improving recognition accuracy.
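
A sketch of what such an augmentation pipeline might look like with torchvision transforms; the specific parameter values are assumptions, not the ones used in the study:

```python
import torchvision.transforms as T

augment = T.Compose([
    T.RandomResizedCrop(224, scale=(0.8, 1.0)),   # small viewpoint shifts
    T.ColorJitter(brightness=0.2, contrast=0.2),  # lighting variation
    T.RandomRotation(degrees=5),                  # slight head tilt
    T.ToTensor(),
])
```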

Data Labelling

Labelled each captured image and organized the images into one folder per label.
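
With a folder-per-label layout, the dataset loads directly via torchvision's ImageFolder; folder names here are hypothetical:

```python
# Expected layout (folder names hypothetical):
#   screenshots/clean/lobby/img_001.png
#   screenshots/clean/gallery_a/img_042.png
import torchvision.transforms as T
from torchvision.datasets import ImageFolder

dataset = ImageFolder("screenshots/clean", transform=T.ToTensor())
print(dataset.classes)         # alphabetical room labels mapped to class indices
print(len(dataset), "images")
```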

Pivot


New Floor Plan

We switched to a new floor plan for the study.


Taking Screenshots Again

We had to capture screenshots of the new rooms and rebuild the model on top of the base model developed earlier.


Improved Model

Tuned the training parameters to fit the new data, producing a more accurate recognition system, and added a more elaborate traversal path.
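
A hedged sketch of what retraining on the new floor plan could look like, reusing the base model's backbone; all paths, the room count, and hyperparameters are illustrative assumptions:

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from torch.utils.data import DataLoader
from torchvision.datasets import ImageFolder

NUM_ROOMS = 8  # hypothetical number of rooms in the new floor plan

train_loader = DataLoader(
    ImageFolder("screenshots/new_plan", transform=T.Compose([
        T.Resize((224, 224)), T.ToTensor()])),
    batch_size=32, shuffle=True)

# Reuse the base model's backbone weights, but attach a fresh head
# sized for the new room labels ("base_model.pt" is a hypothetical path).
state = torch.load("base_model.pt")
state = {k: v for k, v in state.items() if not k.startswith("fc.")}
model = models.resnet50()
model.load_state_dict(state, strict=False)
model.fc = torch.nn.Linear(model.fc.in_features, NUM_ROOMS)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # low LR for fine-tuning
criterion = torch.nn.CrossEntropyLoss()

model.train()
for epoch in range(20):  # epoch count is illustrative
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```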

Data Collection

Pilot Test

Conducted a pilot test with users and our team in the VR environment.

Think Aloud

Implemented think-aloud protocols to gather qualitative insights during user interactions.

Analysis

Sentiment

Used Tableau to evaluate survey data on user emotions and feedback, helping us understand participants' VR experience.
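
Survey data typically needs reshaping before Tableau can chart it per emotion; a minimal pandas sketch with hypothetical file and column names:

```python
import pandas as pd

surveys = pd.read_csv("survey_responses.csv")  # hypothetical export of survey scores
long = surveys.melt(
    id_vars=["participant_id"],
    value_vars=["calm", "stress", "joy", "sadness", "anger", "fear"],
    var_name="emotion", value_name="score")
long.to_csv("tableau_emotions.csv", index=False)  # one row per (participant, emotion)
```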

Data Visualizations

Created visual representations of user movement patterns and behavior, along with visualization of survey results.
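
As an example of the movement visualizations, one participant's traversal path can be drawn from logged coordinates; this matplotlib sketch assumes hypothetical file and column names:

```python
import matplotlib.pyplot as plt
import pandas as pd

# Logged positions for one participant (file and columns are hypothetical).
path = pd.read_csv("participant_01_path.csv")  # columns: timestamp, x, y

fig, ax = plt.subplots()
ax.plot(path["x"], path["y"], linewidth=1)                       # traversal path
ax.scatter(path["x"].iloc[0], path["y"].iloc[0], label="start")  # entry point
ax.scatter(path["x"].iloc[-1], path["y"].iloc[-1], label="end")  # exit point
ax.set_title("Participant 01 traversal path")
ax.legend()
plt.show()
```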

Correlation

Built a correlation matrix to identify how emotions depend on each other and change with the VR experience.
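
A minimal version of that computation in pandas, with emotion column names assumed from the insights below:

```python
import pandas as pd

# One row per participant, one column per self-reported emotion score
# (file and column names are assumptions).
surveys = pd.read_csv("survey_responses.csv")
emotions = surveys[["calm", "stress", "joy", "sadness", "anger", "fear"]]

corr = emotions.corr(method="pearson")  # pairwise emotion correlations
print(corr.round(2))                    # e.g. calm vs. stress should be negative
```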

Insights

Identified key pain points and opportunities to enhance VR spatial design and user experience.

Stress

Being calm at the start reduces stress, but doesn’t counter sadness or anger completely.

Joy

Introducing joy early blocks most negative emotions while exploring the environment.

Fear and Anxiety

Fear and anxiety don’t fade naturally in VR and may even cause dizziness.

Final Steps

Web Development

Developed a website to showcase findings and model performance, making insights accessible to stakeholders.

AWS Scaling

Planned for scaling the system using AWS for broader accessibility and real-time analysis, ensuring the model can be used for multiple layouts.

Credits

Big thanks to our team and especially Prof. Hoa Vo for guiding us in this study.

Let's Connect

Feel free to drop me a message anytime.
I'm all ears for cool and creative ideas!
Let's make people's lives better with the help of AI.

desnehashsh@gmail.com

Resume

LinkedIn
