Thursday, February 4, 2021
Connected simulation avatar
The avatar of a real person accurately reflects the limb and head movements of that person.

NADS recently wrapped up a multi-year project on connected (also known as distributed) simulation funded by the Federal Highway Administration. While most simulations in driving research involve one human participant, this project involved multiple people interacting in the same virtual environment.

A driver in the NADS-1 simulator could see and interact with an actual pedestrian who was wearing a virtual reality headset and walking around in another part of the facility. The pedestrian and the driver could see each other’s head and limb movement through their digital avatars, and they could gesture at each other to visually communicate.  

The goal? The team wanted to know if participants in the simulator would behave differently if they were interacting with the avatar of a real person instead of a computer-generated (agent) pedestrian.  

The team completed three major aims for the project:  

  • Develop the technology for connecting real-time driving and pedestrian simulators. 
  • Create realistic avatars to represent the tracked motions of real drivers and pedestrians.  
  • Design and conduct an experiment in which pairs of naive drivers and pedestrians interacted in a shared virtual world.


The research team concluded that these types of interactions offer real benefits for safety research.

“It gave us a deeper understanding of how people interact,” said Chris Schwarz, director of engineering and modeling research. 

Not only were human drivers more likely to slow down and stop than the agent vehicles, but human pedestrians were also more likely to cross in front of human drivers than in front of agent vehicles. The presence of a real person thus appeared to affect the behavior of both parties, even though over half of the participants couldn’t distinguish the avatars of real people from computer-generated agents.

Two crossing types were compared: at junctions and at mid-block. The agent vehicles were not programmed to stop at the mid-block crossings, whereas human drivers were more likely to slow down or come to a stop there. When human pedestrians crossed at mid-block, the human driver always stopped to avoid hitting them, but the agent vehicles often collided with them.

Shawn Allen and Chris Schwarz
Chris Schwarz looks over the shoulder of Shawn Allen while collaborating on a project.

Creating compatible software 

In order for the driver to interact with a live pedestrian, two systems had to be merged: the NADS-1 software and the Unity3D game engine, used by the University of Iowa’s Hank Lab pedestrian simulator. 
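The project’s actual networking code and message formats are not described in the article. As a minimal sketch, assuming the two simulators exchange tracked head and limb poses as small timestamped UDP packets (hypothetical field names and port), the handoff might look like:

```python
import json
import socket

# Hypothetical pose sample: head and hand positions from the motion
# tracker, tagged with a timestamp so the receiver can interpolate
# between samples when driving the remote avatar.
pose = {
    "t": 12.345,                  # simulation time, seconds
    "head": [0.0, 1.7, 0.0],      # x, y, z position, meters
    "left_hand": [0.3, 1.2, 0.2],
    "right_hand": [-0.3, 1.2, 0.2],
}

def send_pose(sock, addr, sample):
    """Serialize one pose sample and send it to the other simulator."""
    sock.sendto(json.dumps(sample).encode("utf-8"), addr)

def recv_pose(sock):
    """Receive one pose sample; the caller applies it to the avatar rig."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode("utf-8"))

if __name__ == "__main__":
    addr = ("127.0.0.1", 9750)    # made-up loopback port for the demo
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(addr)
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_pose(tx, addr, pose)
    print(recv_pose(rx)["head"])
```

In a real connected simulation, samples like this would be streamed continuously at the trackers’ update rate, and each side would map the received transforms onto its own avatar skeleton.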

Shawn Allen, engineering coordinator, led the creation of a new virtual environment and built an object database compatible with Unity. To do this, he developed a workflow combining scripts and manual processing to convert objects from the scenario library into Unity.

This work was funded by a $1.8 million grant under the Federal Highway Administration’s Exploratory Advanced Research (EAR) program. The project was done in collaboration with the UI Hank Virtual Environments Lab and Visual Intelligence Lab.

Interaction with bicyclists will be added in future work.

This article originally appeared in the 2020 NADS Annual Report.