VR-Engage is MAK’s multi-role virtual simulator. It lets you play the role of a first-person human character; a vehicle driver, gunner, or commander; or the pilot of an airplane or helicopter. VR-Engage can be deployed as a trainee simulator, a role player station, an instructor aid, a desktop simulation game, or even a VR headset experience. Built on mature, proven technologies, VR-Engage gets its simulation engine from VR-Forces and its game-quality 3D graphics from VR-Vantage.
- A high-fidelity vehicle physics engine for accurate vehicle motion.
- Ground, rotary-wing, and fixed-wing vehicles, and the full library of friendly, hostile, and neutral DI-Guy characters.
- Radio and voice communications over DIS and HLA.
- Sensors, weapons, countermeasures, and behavior models for air-to-air, air-to-ground, on-the-ground, and person-to-person engagements.
- Vehicle- and person-specific interactions with the environment (open and close doors, move, destroy, and so on).
- Terrain agility. As with VR-Vantage and VR-Forces, you can use the terrain you have or take advantage of innovative streaming and procedural terrain techniques.
Simulation (Role-Playing) Capabilities
VR-Engage lets you choose which entity you want to simulate; in other words, you choose the role you want to play. You can play:
- Dismounted soldiers and other human characters.
- A member of the crew of a ground vehicle.
- A helicopter or fixed-wing aircraft pilot.
Dismounted Soldiers and Other Human Characters
If you take the role of a dismounted soldier or other human character, you have access to the full DI-Guy library of friendly, hostile, and neutral characters. Your character has the following capabilities:
- Individual movement (look, run, walk, crawl, crouch, go prone).
- Interact with environment (open/close, move, carry, break).
- Interact with other characters (shoot at and be shot at).
- Sensors (binoculars and night vision goggles (NVG)).
- Weapons (rifles, grenades).
- Use carried gear, such as a flashlight, range finder, or compass.
- Lase targets for other players or simulated entities.
- CGF Assist (passenger, fast-roping). When you are running a VR-Forces scenario, you can give up control of your character and let the simulation engine manage complex behaviors.
- Embark and disembark. Enter and leave vehicles and take on roles of vehicle crew members.
Ground Vehicle Crew
VR-Engage lets you take on the roles of the driver, gunner, or commander of a ground vehicle (one role at a time, of course) that has the following capabilities:
- Physics-based ground vehicle dynamics model.
- Interact with environment (collide, go over, push, destroy).
- Ground-to-ground engagements:
  - Sensors (EO, IR, NVG).
  - Weapons (main gun, machine gun).
  - Signals/countermeasures (smoke grenades).
- Vehicle controls (headlights, tail lights).
- CGF Assist (drive, gun). In a VR-Forces scenario, as you switch roles, the simulation engine can take over the tasks and plans that manage the other roles for the vehicle. For example, if you are the gunner, VR-Forces can manage driving the vehicle.
You can easily switch roles and can disembark from the vehicle to assume the role of a dismounted soldier.
Aircraft Pilot
As a pilot, you can fly an aircraft with the following capabilities:
- Physics-based fixed-wing and rotary-wing dynamics models.
- Head up display (HUD).
- Multi-function cockpit display (flight instruments, map/chart, imagery).
- Air-to-air engagements:
  - Sensors (radar scope, radar warning receiver).
  - Weapons (missiles).
  - Countermeasures (chaff, flares).
- Air-to-ground engagements:
  - Sensors (IR targeting, SAR (requires RadarFX Server)).
  - Weapons (missiles, guns, bombs).
  - Countermeasures (chaff, flares).
- CGF Assist.
You can control your character using the keyboard, mouse, and game controllers. The graphical user interface provides command menus and cockpit and crew cabin displays. Other interfaces include:
- Radio (push-to-talk, radio channels, MAK Data Logger support).
- Control with role specific devices (driver-steering wheel, pilot-HOTAS).
- Display system support (monitors, projectors, Oculus Rift).
- Audio (engine sounds, environmental noise, event effects, warning tones).
VR-Engage includes a built-in Image Generator, based on VR-Vantage IG, that can support multiple display channels, to the limits of your installed graphics card. It can also be extended to fill larger multi-channel display systems by adding VR-Vantage IG Remote Display Engines. This is an example of the flexibility of MAK's product architecture that lets you configure a system of any size.
Regardless of which role your VR-Engage simulator is playing, VR-Engage delivers realistic scene rendering: terrain, entities, interaction effects, weather, clouds, time-of-day, shadows, and high dynamic range (HDR) lighting.
The image generator supports the following configurations:
- Single screen.
- Dual screen.
- Multi-channel distributed rendering (requires VR-Vantage Remote Display Engine). Spread the view across multiple monitors run by multiple computers.
- IR and NVG Sensors using the included CameraFX or optional SensorFX.
- On-screen multi-function display (IR, SAR (requires SensorFX)).
VR-Engage, like VR-Vantage and VR-Forces, lets you use the terrain you have or take advantage of innovative streaming and procedural terrain techniques to simulate on large, complex terrains.
- Terrain agile (all types supported by VR-Vantage and VR-Forces).
- Dynamic terrain. Destroyable buildings, bridges, and props.
- Dynamic ocean. Waves, spray, wakes, wind.
VR-Engage includes a set of useful geo-specific terrains and options for building more.
Integration with VR-Forces
When VR-Engage is used with a compatible version of VR-Forces and other MAK products, you can reap the additional benefits of a common system architecture:
- Common representation of the environment across player and CGF stations, including synchronized weather, time-of-day, and dynamic terrain.
- Build terrains, models, and configurations once, and deploy them across VR-Engage player stations, VR-Forces simulation engines and front-ends, and any other applications that use VR-Vantage IG.
- Role-play multiple entities at a time by switching between manual and CGF control on-the-fly. Take control of VR-Forces-driven entities.
- Configure players with the VR-Forces Simulation Object Editor.
- Create scenarios in VR-Forces and load them in VR-Engage for stand-alone use.
- Automatic WebLVC integration (tablet-based instructor operator stations (IOS), and so on).
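WebLVC carries simulation data as JSON messages over a WebSocket connection. The sketch below builds a minimal attribute-update message of the kind a tablet-based instructor station might send to reposition an entity. It is illustrative only: the field names follow the general shape of WebLVC messages, but the real standard encodes `MessageKind` numerically, and the exact field names should be checked against the WebLVC specification before use.

```python
import json

def make_attribute_update(object_name, attributes):
    """Build a WebLVC-style AttributeUpdate message.

    Field names are illustrative; the actual WebLVC specification
    uses numeric MessageKind codes and defined attribute names.
    """
    return {
        "MessageKind": "AttributeUpdate",  # assumption: named kind for readability
        "ObjectName": object_name,
        "Attributes": attributes,
    }

# Example: a hypothetical IOS repositions an entity (geocentric coordinates).
msg = make_attribute_update(
    "tank-01",
    {"WorldLocation": [4437182.0, -395338.0, 873923.0]},
)
wire = json.dumps(msg)          # what would travel over the WebSocket
decoded = json.loads(wire)
print(decoded["ObjectName"])    # tank-01
```

Because the payload is plain JSON, any web or mobile client that can open a WebSocket can participate, which is what makes tablet-based exercise-support tools practical.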
The following sections list the benefits of different configurations of VR-Engage and VR-Forces.
The VR-Forces GUI (graphical user interface) can serve as a common instructor interface to manage both the player-controlled entities and computer generated forces (CGF) entities, including unified laydown, checkpointing, drag and drop, and scenario save/load.
- One role player can switch between CGF and first-person control to play multiple entities.
- CGF Assist lets the player give control to the VR-Forces simulation engine.
- Players can embark on vehicles in the VR-Engage space or those simulated by a compatible VR-Forces scenario.
- Players can task CGF entities.
A role player can play VR-Forces scenarios that have been copied to VR-Engage.
Integration with Other MAK Products
Previous sections have discussed VR-Engage’s integration with VR-Forces and VR-Vantage. Additionally, like all MAK products, VR-Engage uses VR-Link for network interoperability and supports DIS and HLA. It integrates seamlessly with our other Link products, such as MAK Data Logger for recording and AAR, the MAK RTI for HLA support, and VR-Exchange for data interchange in heterogeneous simulation environments.
After Action Review – using MAK Data Logger
Whether in standalone mode or when playing with other simulation applications, VR-Engage publishes to the DIS or HLA network. Therefore, you can use the MAK Data Logger to record your simulation and play the recording back for after-action-review (AAR).
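The record/replay idea behind AAR can be illustrated generically: capture each network PDU with a timestamp, then re-emit the packets with their original inter-arrival spacing (optionally scaled for fast or slow playback). This is a conceptual Python sketch, not the MAK Data Logger's actual file format or API.

```python
import time

def record(packets):
    """Pair each captured packet with a monotonic timestamp."""
    return [(time.monotonic(), p) for p in packets]

def replay(log, send, speed=1.0):
    """Re-emit packets, preserving original spacing scaled by `speed`."""
    start = log[0][0]
    wall_start = time.monotonic()
    for stamp, packet in log:
        target = wall_start + (stamp - start) / speed
        delay = target - time.monotonic()
        if delay > 0:
            time.sleep(delay)
        send(packet)

# Record two placeholder PDUs, then replay them at 10x speed.
out = []
log = record([b"pdu-1", b"pdu-2"])
replay(log, out.append, speed=10.0)
print(out == [b"pdu-1", b"pdu-2"])  # True
```

A real logger also indexes the stream so an instructor can jump to bookmarks and replay selected time windows during the review.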
Network Interoperability with Other Federates
VR-Engage is natively compliant with DIS and HLA, so it can participate in any topology of networked simulators and simulation support systems, including multi-player classroom environments. It can interoperate with existing simulation applications and third-party simulators, SAFs, and CGFs.
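DIS traffic consists of binary Protocol Data Units (PDUs) broadcast over UDP (port 3000 by convention). The sketch below decodes the standard 12-byte PDU header defined by IEEE 1278.1, which is enough to tell, for example, that an incoming packet is an Entity State PDU from a given exercise. It is a protocol illustration, not VR-Engage or VR-Link code.

```python
import struct

# PDU type codes from the DIS enumerations (IEEE 1278.1).
ENTITY_STATE_PDU = 1
FIRE_PDU = 2

def parse_pdu_header(datagram):
    """Decode the 12-byte DIS PDU header (big-endian network order)."""
    version, exercise_id, pdu_type, family, timestamp, length, _pad = \
        struct.unpack(">BBBBIHH", datagram[:12])
    return {
        "protocol_version": version,
        "exercise_id": exercise_id,
        "pdu_type": pdu_type,
        "protocol_family": family,
        "timestamp": timestamp,
        "length": length,
    }

# A hand-built header: DIS v6, exercise 1, Entity State PDU, 144 bytes long.
sample = struct.pack(">BBBBIHH", 6, 1, ENTITY_STATE_PDU, 1, 0, 144, 0)
hdr = parse_pdu_header(sample)
print(hdr["pdu_type"] == ENTITY_STATE_PDU)  # True
```

Because every DIS federate speaks this same wire format, a simulator that emits well-formed PDUs can join an exercise regardless of which vendor built the other stations; VR-Link wraps this (and HLA) behind a single API.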
Customization and Extension
Although VR-Engage comes ready-to-run out of the box, its design is versatile, allowing system integrators to add modules and complementary products, and to customize and extend VR-Engage to meet program-specific requirements:
- VR-Vantage Remote Display engines provide a multi-channel display.
- SensorFX enhances the fidelity of EO/IR sensors using physically accurate modeling based on the material properties of terrain and objects.
- RadarFX Server generates SAR (Synthetic Aperture Radar) images upon request from an aircraft pilot, which are displayed in VR-Engage’s multi-function displays.
- VR-TheWorld Server provides streaming terrain data (elevation, imagery, land use, and feature layers) through open standards.
- WebLVC Server enables web and mobile Apps that can be used by exercise support staff to manage and stimulate VR-Engage entities (e.g. position entities, change weather, or initiate scenario events from a tablet).
- DiSTI’s GL Studio editor can be used to author and edit interactive cockpit instruments.
- Editors from CM Labs and RT Dynamics allow you to author new vehicle types and change the dynamics of existing vehicle types.
- DI-Guy editors and SDK allow you to add new characters and motions.
And perhaps most important: almost any aspect of the VR-Engage system can be customized or extended by a C++ developer using the VR-Forces or VR-Vantage toolkits.
- Users can configure different user interface control devices using mapping files.
- Users can configure output devices using VR-Forces back-end plugins (MAK can also assist with the Vortex integration with DBOX).
  - For example, stimulating an altimeter in a physical cockpit.
  - For example, driving a motion platform.
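The mapping-file approach means a new steering wheel or HOTAS can be wired up without code changes. The actual VR-Engage mapping-file format is not shown here; purely to illustrate the idea, the sketch below reads a hypothetical INI-style device mapping and looks up which simulation action each control drives. Every file section, key, and value is invented for the example.

```python
import configparser

# A hypothetical device-mapping file; all names are invented for illustration.
MAPPING = """
[driver_station]
device = steering_wheel
axis_0 = steer
axis_1 = throttle
button_3 = horn
"""

def load_mapping(text):
    """Parse an INI-style control mapping into a plain dict."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return dict(cfg["driver_station"].items())

mapping = load_mapping(MAPPING)
print(mapping["axis_0"])  # steer
```

At runtime, a station would resolve each incoming controller event through such a table to decide which simulation input to drive.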
New Human Characters and Weapons
- VR-Engage can take control of any type of VR-Forces human character entity. However, only one character model is used for the first-person point of view (POV), so the hands and sleeves do not change to match the chosen character.
- MAK engineering is required to rig another character for first-person POV (the tools you need are available with VRV-Dev and the DI-Guy SDK, but the process is difficult). This is the same as in VR-Vantage.
- MAK engineering is required to rig new weapon types (change the FPS rig; a simulation engine plug-in or configuration change to alter the simulation model).
New Ground Vehicles
- Users can rig new vehicle models to use one of the vehicle types provided by MAK (e.g., 4-wheel truck, multi-wheel truck, tracked vehicle). This requires the Vortex editor and a simple mapping.
- MAK engineering is required to rig any new vehicle behavior.
New Air Vehicles
- The RT Dynamics editor can alter the parameters of flight dynamics models; the modified model then works in VR-Engage.
- Adding any new vehicle behavior requires MAK engineering.
Custom Cockpit/Cabin Instruments
- Users can change the look, feel, and arrangement of cockpit instruments with the GL Studio editor, as long as the changes are compatible with the interface that our widgets already use.
- Interfaces to new capabilities will require MAK engineering.
Simulation and IG Customization
- Users can add many IG and simulation capabilities with the VR-Vantage and VR-Forces toolkits (VRF-Dev includes VRV-Dev).
  - For example, changing the way a vehicle takes damage could be done by configuring VR-Forces parameters or by adding a VR-Forces plugin.
  - For example, adding a red dot (like a laser-sight spot) to indicate where the player is pointing the gun could be done with a VR-Vantage plugin. Adding or changing particle system effects and other scene representation details can also be done.
- MAK engineering is required to alter VR-Engage behavior.
- Users can configure EO/IR/NVG/SAR sensors with the product GUI.
- To create a new type of sensor, you can use JRM tools (SigSim, SenSim) and plug it into VR-Vantage. (This is the same process as in VR-Vantage.)