
Virtual Reality 

The Multimodal Virtual Reality Traffic project is a research effort to create a fully immersive virtual reality environment with realistic visual and auditory elements. The project was designed to support research sponsored by the National Institutes of Health by simulating realistic traffic situations for participants.

Download Published Works


Purpose of Virtual Reality Project

"We strive to enhance pedestrian safety through innovative technology. Our primary goal is to create realistic traffic scenarios that allow us to study the responses of individuals across various age groups and with different auditory and visual abilities. By simulating these environments, we aim to develop insights that contribute to safer urban spaces for everyone." 

Research Questions

  • Is the traffic simulation realistic?

  • Does the addition of auditory information enhance overall immersion?

  • Can users reliably report their responses using the VR controller?


Methods

This project employed physiological, quantitative, and qualitative methodologies to address the research questions and gain a comprehensive understanding of user behavior in the VR environment.

Methods Used

  • Eye Tracking

  • Questionnaire

  • Completion Rate and Time

  • Think-Aloud Protocol


Eye Tracking Methodology

An essential task of the VR simulation is to provide a realistic representation of moving vehicles, particularly in how users perceive and react to them. To evaluate the perceptual realism of the simulation, eye tracking data collected from the integrated headset was used to address the following questions:


  1. Are users looking at the vehicle when it enters the scene?
    In real-world environments, people typically orient their attention to dynamic objects that may pose a risk, such as oncoming vehicles. If users consistently fixate on the vehicle upon its appearance, this suggests that the simulation is capturing their attention in a way that mirrors natural behavior.

  2. Are users able to follow the moving vehicle with their eyes?
    Smooth pursuit eye movements are a natural response to tracking moving objects. If users are able to follow the vehicle’s path with minimal loss of gaze, this indicates that the motion is both visible and believable, supporting the realism of the vehicle's dynamics.

  3. Are users using environmental cues to make judgments on the vehicle's movement?
    In this simulation, judgments about vehicle motion should rely solely on the vehicle’s trajectory, speed, and appearance—not on environmental cues like shadows or road markings. If eye movement patterns show that users are focusing on the vehicle rather than extraneous visual elements, it suggests that they are using the correct visual information, which supports both the ecological validity and controlled design of the simulation.

 

By evaluating users’ visual attention to the vehicle and their ability to follow it naturally, these questions help determine whether the VR environment elicits behaviors consistent with those seen in real-world scenarios, thereby serving as an indicator of simulation realism.
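One way to operationalize the first question is to test, for each eye-tracking sample, whether the gaze ray passes within the vehicle's bounding sphere. Below is a minimal sketch of that check; the function name, coordinate conventions, and sample data are illustrative assumptions, not the project's actual analysis pipeline.

```python
import math

def gaze_hits_vehicle(gaze_dir, vehicle_center, vehicle_radius):
    """Return True if a unit gaze ray from the headset origin passes
    within `vehicle_radius` meters of the vehicle's center.

    gaze_dir       -- (x, y, z) unit vector from the eye tracker
    vehicle_center -- (x, y, z) vehicle position in head coordinates
    vehicle_radius -- bounding-sphere radius of the vehicle (meters)
    """
    # Project the vehicle center onto the gaze ray.
    t = sum(g * c for g, c in zip(gaze_dir, vehicle_center))
    if t <= 0:  # vehicle is behind the viewer
        return False
    # Closest point on the ray to the vehicle center.
    closest = tuple(g * t for g in gaze_dir)
    return math.dist(closest, vehicle_center) <= vehicle_radius

# Hypothetical samples: (gaze direction, vehicle position in meters).
samples = [((0.0, 0.0, 1.0), (0.5, 0.0, 10.0)),    # gaze near the vehicle
           ((0.707, 0.0, 0.707), (0.5, 0.0, 10.0))] # gaze far off-target
hits = [gaze_hits_vehicle(g, c, 1.5) for g, c in samples]
hit_rate = sum(hits) / len(hits)
```

The fraction of on-vehicle samples in the window after scene entry would then serve as a per-participant measure of initial attentional capture.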


Perceptual Immersion

As this VR system was designed to study perception and interactions in traffic settings, the virtual environment needed to offer a convincing simulation of moving vehicles. Beyond eye movement data, it was important to directly assess participants’ subjective experiences of visual and auditory realism. To do this, we used a post-experience questionnaire to capture participants’ perceptions of motion and spatial presence—key indicators of immersion.


Questionnaires are well suited for this purpose because they allow researchers to probe users’ internal experiences and judgments—such as whether an object seemed to move realistically in depth or whether visual and auditory cues were integrated into a coherent percept. These types of experiences are difficult to infer from behavioral data alone.


Participants responded to the following questions:

  • Did the visual simulation of the objects look like the objects moved in depth toward you? Why?

  • Did the auditory simulation of the objects sound like the objects moved in depth toward you? Why?

  • When you both heard and saw the approaching object, how often did you perceive a single approaching object rather than two separate approaching objects? (Scale from 1 – Always to 5 – Never)

  • Did the object ever seem like it was coming from any direction not directly in front of you?

 

These responses helped evaluate whether the multisensory presentation of moving vehicles was immersive, coherent, and spatially accurate—factors essential to a realistic VR traffic environment.
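Responses on the 1 (Always) to 5 (Never) unity-perception scale can be summarized with simple descriptive statistics. The sketch below uses hypothetical ratings; the function and the "rating ≤ 2" fusion criterion are assumptions for illustration.

```python
def summarize_unity(ratings):
    """Summarize responses on the 1 (Always) - 5 (Never) unity scale.

    Returns the mean rating and the share of participants who reported
    perceiving a single fused audiovisual object most of the time
    (rating of 1 or 2).
    """
    mean = sum(ratings) / len(ratings)
    fused_share = sum(r <= 2 for r in ratings) / len(ratings)
    return mean, fused_share

# Hypothetical responses from ten participants.
ratings = [1, 2, 2, 1, 3, 2, 1, 4, 2, 2]
mean, fused = summarize_unity(ratings)
```

A low mean and a high fused share would indicate that the visual and auditory streams were being integrated into a single coherent percept.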

 

Responses


Because this VR system was intended for use in empirical studies, it was essential to ensure that participants could reliably provide responses and advance through trials within the virtual environment. To accomplish this, the VR controller was mapped with two distinct input functions:


  • The response button was located at the 3 o’clock position on the main circular button.

  • The next trial button was assigned to the trigger on the back of the controller.
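The 3 o'clock mapping on the circular button could be implemented by classifying each trackpad press by its angle. A sketch under assumed conventions (touch coordinates in [-1, 1] with (1, 0) at 3 o'clock; the sector width is an arbitrary illustrative choice):

```python
import math

def press_region(x, y, sector_half_angle=45.0):
    """Classify a circular-trackpad press by angle.

    (x, y) is the touch position reported by the controller, with
    (1, 0) at the 3 o'clock position. Returns 'response' if the press
    falls within +/- sector_half_angle degrees of 3 o'clock,
    else 'other'.
    """
    angle = math.degrees(math.atan2(y, x))  # 0 degrees = 3 o'clock
    return 'response' if abs(angle) <= sector_half_angle else 'other'
```

Under this scheme a press near the right edge of the pad registers as a response, while presses elsewhere on the circle are ignored or logged as errors.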

 

To evaluate the usability of this input method, participants completed a response task during which both hit rate (i.e., whether the correct button was pressed) and time-to-completion were recorded. These metrics allowed us to determine whether participants could correctly identify and use the appropriate buttons, and how efficiently they could do so.
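Both metrics can be computed directly from per-trial logs. A minimal sketch; the field names and example values are assumptions:

```python
def usability_metrics(trials):
    """Compute hit rate and mean time-to-completion from trial logs.

    Each trial is a dict with:
      'correct' -- True if the designated response button was pressed
      'rt'      -- seconds from prompt onset to a recorded response
    """
    hit_rate = sum(t['correct'] for t in trials) / len(trials)
    mean_rt = sum(t['rt'] for t in trials) / len(trials)
    return hit_rate, mean_rt

# Hypothetical log of three trials.
trials = [{'correct': True, 'rt': 1.2},
          {'correct': False, 'rt': 3.4},
          {'correct': True, 'rt': 1.6}]
hit_rate, mean_rt = usability_metrics(trials)
```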

When difficulties were observed—such as hesitation, incorrect button presses, or repeated attempts—a think-aloud protocol was introduced. This method provided qualitative insight into participants’ thought processes, helping us understand the sources of confusion or delay in the interaction. This combination of performance metrics and verbal feedback helped inform refinements to the input design to better support intuitive interaction in future studies.


Results

The findings from the above studies informed a set of recommendations aimed at enhancing the VR system’s realism, usability, and suitability for future empirical research.


Eye Movement Results and System Updates

The eye tracking data revealed the following:


  • Initial Gaze Behavior: Participants consistently fixated on the vehicle as it entered the scene, indicating that the simulation effectively drew user attention to relevant motion.

  • Tracking Limitations: Participants struggled to visually track the vehicle when its speed exceeded 60 km/h, suggesting a perceptual or hardware limitation in smooth pursuit at higher speeds.

  • Use of Environmental Cues: Despite instructions, participants frequently relied on background elements to judge the vehicle’s motion rather than using the vehicle's movement alone.

 

Changes Implemented:


  • Speed Adjustment: Vehicle speed was capped at 60 km/h to support accurate visual tracking within the VR environment.

  • Cue Control: Environmental cues (e.g., background motion, fixed reference points) were removed or minimized to ensure participants relied solely on the vehicle's movement when making judgments.

 

These updates helped align the simulation more closely with the study’s perceptual goals, improving control over the visual variables being tested.
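The tracking limitation above can be quantified with smooth-pursuit gain: the ratio of gaze angular velocity to target angular velocity, where values near 1.0 indicate accurate tracking and values well below 1.0 indicate the eye falling behind the vehicle. A sketch with illustrative sample data (the sampling rate and angles are assumptions):

```python
def pursuit_gain(gaze_angles, target_angles, dt):
    """Estimate smooth-pursuit gain from sampled horizontal angles.

    Both inputs are equally spaced angle samples in degrees; dt is the
    sample interval in seconds. Gain is mean gaze angular velocity
    divided by mean target angular velocity.
    """
    n = len(gaze_angles) - 1
    gaze_vel = (gaze_angles[-1] - gaze_angles[0]) / (n * dt)
    target_vel = (target_angles[-1] - target_angles[0]) / (n * dt)
    return gaze_vel / target_vel

# Hypothetical 90 Hz samples: gaze covers 8 degrees while the
# vehicle covers 10 degrees, i.e. the eye is falling behind.
gaze = [0.0, 2.0, 4.0, 6.0, 8.0]
target = [0.0, 2.5, 5.0, 7.5, 10.0]
gain = pursuit_gain(gaze, target, dt=1/90)
```

A systematic drop in gain at vehicle speeds above 60 km/h would be the quantitative signature of the tracking breakdown that motivated the speed cap.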


Questionnaire Results and Audio Design Updates

Participant responses to the immersion questionnaire revealed several issues with the auditory simulation:


  • Sound Source Localization: Participants noted that the vehicle sound did not always appear to come from the direction of the vehicle, reducing the realism of the experience.

  • Volume Inconsistency: The volume of the vehicle sounds was frequently mismatched—either too loud or too soft—relative to the size and speed of the associated vehicle, disrupting immersion.

 

Changes Implemented:


  • Environmental Reflection Modeling: To improve spatial accuracy, we adjusted the audio to reflect off virtual surfaces (e.g., ground, buildings) as it would in the real world. This created a more natural and believable soundscape.

  • Dynamic Sound Modulation: Vehicle sound was modulated not only by speed but also by vehicle size, helping to match users' expectations of how larger or smaller vehicles should sound at various velocities.

 

These refinements contributed to a more cohesive multisensory experience, helping users perceive the vehicle’s presence and motion more naturally.
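The dynamic modulation described above could be sketched as a simple gain model in which loudness falls off with distance and scales with speed and vehicle size relative to a reference passenger car. The function, reference values, and linear scaling are illustrative assumptions, not the project's actual audio engine:

```python
def vehicle_gain(distance_m, speed_kmh, length_m,
                 ref_distance=1.0, ref_speed=50.0, ref_length=4.5):
    """Sketch of a gain model for a vehicle's engine sound.

    Loudness falls off with 1/distance and scales linearly with speed
    and vehicle length relative to a reference passenger car, so a bus
    at 60 km/h sounds louder than a compact car at the same distance.
    """
    attenuation = ref_distance / max(distance_m, ref_distance)
    speed_scale = speed_kmh / ref_speed
    size_scale = length_m / ref_length
    return attenuation * speed_scale * size_scale
```

In practice an engine of this kind would also feed the reflection model above, so that the modulated source is what bounces off virtual ground and building surfaces.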

Input Usability Results and Interaction Flow Updates

The response task and think-aloud protocol uncovered several usability challenges with the input method:


  • Slower Response Times: Participants responded more slowly when using the button at the 3 o’clock position on the circular input, compared to the more intuitive trigger button.

  • Inconsistent Presses: Users frequently pressed other areas of the circular button, indicating difficulty with targeting the designated location.

  • Interaction Confusion: Think-aloud feedback revealed that participants occasionally reversed the intended input order—pressing the trigger to start the next trial before submitting a response—resulting in skipped data.

 

Changes Implemented:


  • Expanded Response Area: The full circular button was enabled for submitting responses, reducing targeting precision requirements and improving response speed.

  • Interaction Safeguards: The system was updated so the trigger (Next Trial) button would remain disabled until a valid response had been recorded, ensuring data completeness and preserving the integrity of the study design.

 

These adjustments improved the intuitiveness and reliability of the input system, supporting smoother and more accurate participant interactions within the VR environment.
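The interaction safeguard amounts to a small state machine: the trigger does nothing until a response has been recorded for the current trial. A sketch of that logic (class and method names are assumptions):

```python
class TrialInput:
    """Gate the Next Trial trigger until a response has been recorded."""

    def __init__(self):
        self.responded = False
        self.trial = 1

    def press_response(self, value):
        """Record a response from the circular button (any position)."""
        self.response = value
        self.responded = True

    def pull_trigger(self):
        """Advance to the next trial only after a valid response."""
        if not self.responded:
            return False  # trigger stays disabled; no trial is skipped
        self.trial += 1
        self.responded = False
        return True
```

Because the trigger is re-disabled after each advance, a participant who reverses the intended order simply gets no effect from the trigger, rather than silently skipping a trial's data.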

Final Design

Below is an example of one trial run on the VR system. Note that the sound could not be accurately reproduced here because speaker placement varies across playback setups.

Example of Visual Scene
