Smartpole-VR-AWSIM

This is a fork of tier4/AWSIM adapted to support VR-based simulations of Smartpole Interaction Units (SPIUs).
The goal is to simulate infrastructure-to-human interaction experiences and evaluate perceived safety, clarity, and timing of eHMIs (external Human-Machine Interfaces) provided by SPIUs in urban environments.

VR scenarios simulating infrastructure-to-human interactions with smartpoles, including blind spots, intersections, and pedestrian crossings


🔍 Purpose

This project extends AWSIM to enable:

  • Immersive first-person VR experiences of smartpole-based external HMI signals.
  • Human-in-the-loop evaluations for safety perception and usability testing.
  • Scenario experimentation using ROS 2-based simulation, Autoware compatibility, and Unity rendering.

Ideal for researchers and developers working on:

  • Infrastructure-based eHMI interactions with pedestrians using VR
  • Collective perception with different road participants
  • Smart city and V2X deployment studies
  • Safety evaluation of infrastructure HMI systems

🌟 Features

  • Unity-based 3D simulation with VR support
  • Scenarios involving smartpole interactions, traffic lights, and vulnerable road users
  • AWSIM itself runs on Ubuntu 22.04 and Windows 10/11, but VR scenarios are mainly supported on Windows 10/11
  • Forked from tier4/AWSIM and compatible with the latest Autoware
  • Fully integrated with ROS 2 communication (see the sketch after this list)
  • VR scenarios tested on Meta Quest 2 and Meta Quest 3
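
As a rough illustration of the ROS 2 integration, the sketch below subscribes to a hypothetical SPIU eHMI state topic from a Unity script, following the Ros2ForUnity-style API that AWSIM builds on. The topic name /spiu/ehmi_state and the std_msgs/String message type are assumptions for illustration only; the topics actually exposed by this fork may differ.

```csharp
using UnityEngine;
using ROS2;

// Illustrative sketch: listen to a hypothetical SPIU eHMI state topic.
// Topic name and message type are placeholders, not this fork's actual interface.
public class SpiuEhmiListener : MonoBehaviour
{
    private ROS2UnityComponent ros2Unity;
    private ROS2Node ros2Node;
    private ISubscription<std_msgs.msg.String> ehmiSub;

    void Start()
    {
        // A ROS2UnityComponent must be attached to the same GameObject.
        ros2Unity = GetComponent<ROS2UnityComponent>();
    }

    void Update()
    {
        // Create the node and subscription once the ROS 2 backend is ready.
        if (ros2Node == null && ros2Unity.Ok())
        {
            ros2Node = ros2Unity.CreateNode("spiu_ehmi_listener");
            ehmiSub = ros2Node.CreateSubscription<std_msgs.msg.String>(
                "/spiu/ehmi_state",
                msg => Debug.Log("eHMI state received: " + msg.Data));
        }
    }
}
```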

Available eHMI Assets

The repository provides eHMI display assets, each with "GO" and "STOP" display states, for the following unit types:

  • Autonomous Vehicle
  • Smartpole Interaction Unit
  • Delivery Robot
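
The "GO"/"STOP" state of such a display could, for example, be driven by a small state script. The sketch below is purely illustrative and is not the repository's actual eHMI controller; the TextMesh-based panel and the SetState entry point are assumptions.

```csharp
using UnityEngine;

// Illustrative only: toggles a hypothetical eHMI panel between "GO" and "STOP".
// The actual assets in this repository may use different components and logic.
public class EhmiDisplaySketch : MonoBehaviour
{
    public enum EhmiState { Go, Stop }

    [SerializeField] private TextMesh panelText;  // assumed text component on the panel

    public void SetState(EhmiState state)
    {
        if (state == EhmiState.Go)
        {
            panelText.text = "GO";
            panelText.color = Color.green;
        }
        else
        {
            panelText.text = "STOP";
            panelText.color = Color.red;
        }
    }
}
```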

🚀 Getting Started

Start with the base AWSIM tutorial:

👉 Quick Start Demo

Then, check out the SPIU-specific VR simulation guide (coming soon).

Available VR Scenarios

The project includes four specialized VR scenarios, located in Assets/AWSIM/Scenes/Main/ (a scene-loading sketch appears at the end of this section):

  1. BlindSpotVRScenario
    • Focuses on simulating blind spot situations where pedestrians might be hidden from vehicle drivers
    • Tests how smartpole-based eHMIs can help prevent accidents in areas with limited visibility
    • Includes buildings, vehicles, and pedestrians positioned to create blind spot situations

Blind Spot VR Scenario showing smartpole eHMIs helping with visibility in occluded areas

  2. FourWayIntersectionVRScenario
    • Complex urban intersection simulation
    • Tests smartpole interactions in a typical four-way intersection setting
    • Includes traffic lights, multiple lanes, and various road users (pedestrians, vehicles, cyclists)
    • Perfect for evaluating how eHMIs can improve safety and communication at busy intersections

Four Way Intersection VR Scenario demonstrating smartpole eHMIs at a complex urban intersection

  3. DeliveryRobotVRScenario
    • Focuses on interactions between pedestrians and delivery robots
    • Tests how smartpole-based eHMIs can facilitate safe coexistence between pedestrians and autonomous delivery robots
    • Includes sidewalks, delivery robots, and pedestrian paths

Delivery Robot VR Scenario showing interactions between pedestrians and autonomous delivery robots

  4. NightTimeVRScenario
    • Nighttime version of the simulation
    • Tests how eHMIs perform in low-light conditions
    • Includes nighttime lighting and the visibility challenges specific to low-light environments
    • Important for evaluating the visibility and effectiveness of eHMIs after dark

Night Time VR Scenario showing smartpole eHMIs in low-light conditions

Each scenario is designed to test different aspects of smartpole-based eHMIs:

  • Different lighting conditions (day/night)
  • Various traffic situations (intersections, blind spots)
  • Different types of road users (pedestrians, vehicles, delivery robots)
  • Various environmental conditions (visibility, traffic density)

These scenarios work together to provide a comprehensive testing environment for evaluating:

  • Safety perception
  • Clarity of communication
  • Timing of eHMI signals
  • Effectiveness in different real-world situations
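
To switch between these scenarios from a script, for example from a simple session manager during a study, the scenes can be loaded by name with Unity's SceneManager. This is a minimal sketch, assuming the scenes under Assets/AWSIM/Scenes/Main/ have been added to the Build Settings; the ScenarioLoader class is not part of the repository.

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal sketch: load one of the four VR scenario scenes by name.
// Assumes the scenes under Assets/AWSIM/Scenes/Main/ are listed in Build Settings.
public class ScenarioLoader : MonoBehaviour
{
    // Scene names as they appear in Assets/AWSIM/Scenes/Main/.
    private static readonly string[] Scenarios =
    {
        "BlindSpotVRScenario",
        "FourWayIntersectionVRScenario",
        "DeliveryRobotVRScenario",
        "NightTimeVRScenario"
    };

    public void LoadScenario(int index)
    {
        SceneManager.LoadScene(Scenarios[index]);
    }
}
```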

Running VR Simulations on Meta Quest

To run the VR scenarios on Meta Quest:

  1. Prerequisites

    • Windows PC
    • Meta Quest 2 or newer
    • USB-C cable for PC connection
    • Meta Air Link software installed
  2. Setup Steps

    • Connect your Meta Quest to your PC using the USB-C cable
    • Enable Meta Quest Air Link in your Quest settings
    • Open the project in Unity
    • Select your desired VR scenario from Assets/AWSIM/Scenes/Main/
    • Click the Play button in Unity (a headset-detection sketch follows this list)
  3. Controls

    • Look around by moving your head to view the scene from different directions
  4. Performance Tips

    • Ensure your PC meets the minimum VR requirements
    • Close unnecessary background applications
    • Adjust graphics settings in Unity if needed
    • Use a high-quality USB-C cable for optimal performance
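
Before pressing Play, it can also help to confirm that Unity actually detects the headset. The check below is a convenience sketch using Unity's UnityEngine.XR input API; it is not part of the repository.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Convenience sketch: log whether an XR headset is detected before starting a session.
public class HeadsetCheck : MonoBehaviour
{
    void Start()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.Head, devices);

        if (devices.Count > 0)
        {
            Debug.Log("Headset detected: " + devices[0].name);
        }
        else
        {
            Debug.LogWarning("No headset detected. Check the Quest Link connection before pressing Play.");
        }
    }
}
```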

📚 Documentation


📄 License

This repository inherits the AWSIM license:

  • Code: Apache 2.0
  • Assets: CC BY-NC

See LICENSE for details.


📬 Contact

Japanese/English OK
Email: [email protected], [email protected], [email protected]

Discord: TBD
Twitter: TBD


Smartpole-VR-AWSIM is an experimental extension for human-centered smart infrastructure research. Contributions are welcome!

If you use this repository in your research, please cite the following paper:

@inproceedings{Chauhan2025SilentNegotiator,
  author    = {Vishal Chauhan and Anubhav and Robin Sidhu and Yu Asabe and Kanta Tanaka and Chia-Ming Chang and Xiang Su and Ehsan Javanmardi and Takeo Igarashi and Alex Orsholits and Kantaro Fujiwara and Manabu Tsukada},
  title     = {A Silent Negotiator? Cross-cultural VR Evaluation of Smart Pole Interaction Units in Dynamic Shared Spaces},
  booktitle = {Proceedings of the 31st ACM Symposium on Virtual Reality Software and Technology (VRST '25)},
  year      = {2025},
  month     = {November},
  pages     = {},
  publisher = {ACM},
  address   = {Montreal, QC, Canada},
  doi       = {10.1145/3756884.3765991},
  isbn      = {979-8-4007-2118-2/2025/11}
}
