NADS miniSim driving simulator uses DI-Guy to inject realism into its driving environment
The recent holiday season marked the one-year anniversary of DI-Guy joining the MÄK team, and what a year it has been! From increasing DI-Guy performance and ease of use, to developing new ways to control characters, to building more realistic character simulations, to creating much more content out of the box, 2014 has been the year of DI-Guy.
With such a strong year behind it and such a strong product on the shelf, it makes sense that the National Advanced Driving Simulator (NADS) trusts DI-Guy’s human character simulation in its NADS miniSim™ driving simulator.
NADS’ miniSim, located at the University of Iowa, is a high-performance driving simulator used for research, development, clinical, and training applications by universities and organizations around the world. The core software is based on state-of-the-art technology developed over decades for the world-famous NADS-1 simulator and used at NADS’ premier driving simulation facility.
When challenged to add high-fidelity human characters and behaviors to its simulations, NADS chose DI-Guy humans; they make the driving environment more realistic by becoming part of the simulation’s surroundings, like buildings, trees, and street furniture. DI-Guy simulated humans also provide both normal and unpredictable traffic interactions, like a person crossing the street or a child running into the street while chasing a basketball.
"Our goal from day one has been to deliver the most realistic driving simulation capability to our customers," said Andy Veit, miniSim Program Manager at NADS. "DI-Guy enables users to populate the driving environment with realistic pedestrians and enhances our ability to orchestrate compelling scenarios."
DI-Guy humans simulate motion and behavior in real time, allowing users to rapidly populate simulation scenarios with intelligent characters that can be played into scenes over a network. DI-Guy characters know how to find their way around an environment, respond to other entities in the simulation, and make seamless transitions from one activity to the next, all while moving naturally like real people.
Read the full January 2015 newsletter now!