
Performance-Driven, Multi-Channel IG + Sensor Views

With VR-Vantage IG’s special effects and physics-based models, participants are immersed in their synthetic world and reap the full benefits of each session. It renders realistic out-the-window (OTW) scenes and physics- and effects-based sensor views that can be deployed on a variety of COTS hardware configurations, from simple desktops to multi-channel displays for virtual cockpits, monitor-based training systems, and AR/VR components.

VR-Vantage IG’s Open Architecture

Every project has unique challenges that demand flexibility and ease of configuration. VR-Vantage IG's open architecture was designed from the ground up to offer the most flexible, high-performance visualization tools on the market. VR-Vantage IG APIs make it possible to accommodate any training, simulation, or mission rehearsal requirement. This approach also maintains compatibility with your existing applications.

 

Key Features

Connectivity

  • CIGI
  • DIS
  • HLA

Renderer

  • Multi-channel
  • Distortion correction

Simulated Environmental Effects

  • Dynamic weather: thunderstorms, rain, snow, clouds
  • Dynamic ocean: waves, surf, wakes, shallow water, and buoyancy (responds to dynamic weather)
  • Particle-based recirculation and downwash effects
  • Rotor wash
  • Particle-based weather systems
  • Procedural ground textures
  • Realistic and dense vegetation

Terrain

  • VR-World Whole-Earth
  • CDB, OpenFlight, and many other formats
  • Dynamic terrain: damage on buildings and bridges
  • Terrain blending options

Sensors

  • Optical Sensor Models (EO/IR/NVG) included
  • Optical Sensor Physics (EO/IR/NVG) with SensorFX Plugin

2D Overlays

  • Heads-up display (HUD) support

Simulated models

  • Realistic lifeforms: animated 3D human characters and animals with cultural diversity
  • Extensive library of high-quality air, land, and maritime moving entity models
  • Dynamic structures: damage on buildings and bridges

System / Performance

  • Open Architecture
  • Commercial-off-the-shelf (COTS) Software
  • Scalable/manageable performance
  • Supports AR / VR

VR-Vantage IG Capabilities

Read the sections below, or download the VR-Vantage IG Capabilities document, to learn more.

Edge-Blended Multi-Channel

VR-Vantage IG is built from the ground up for multi-channel distributed rendering. Users can add VR-Vantage IG Remote Display Engines to extend image generation across all the displays in their training devices. And when a display solution involves curved screens, VR-Vantage IG’s built-in support for Scalable Display Technologies’ software lets users set up image warping and edge blending to match the specific geometry of the display.
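Edge blending works by ramping each channel's brightness down across the overlap region so that the two projected images sum to uniform intensity. The following is a minimal illustrative sketch of the idea only; the actual Scalable Display integration also handles warping, gamma, and measured screen geometry.

```python
def blend_weight(x, overlap_start, overlap_end):
    """Brightness multiplier for the left channel at horizontal position x.

    Ramps linearly from 1.0 (fully this channel) to 0.0 across the overlap
    band. The right channel uses the mirrored ramp, so inside the overlap
    the two contributions always sum to full brightness.
    """
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)
```

For example, at the midpoint of the overlap each channel contributes half brightness, and the mirrored weights of the two channels always sum to 1.0.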

Sensors

VR-Vantage IG offers two options for visualizing sensors: CameraFX and SensorFX.

CameraFX

VR-Vantage IG can simulate the way that the 3D scene would look if an observer were using visual sensor devices such as night vision goggles or viewing the infrared spectrum. This is an observer-specific setting. You can adjust the contrast, blur, and noise of the view.

VR-Vantage IG includes one sensor module – CameraFX, which uses the SenSim libraries from JRM Technologies. The sensors do not take into account the materials of the objects that you are viewing. They simply filter the view to produce the desired effect. If you want physics-based sensor effects, you can add the SensorFX Plug-in to VR-Vantage IG.


SensorFX

SensorFX is an extra cost plug-in for VR-Vantage IG. SensorFX changes VR-Vantage IG from a visual scene generator to a sensor scene generator. SensorFX models the physics of light energy as it is reflected and emitted from surfaces in the scene and as it is transmitted through the atmosphere and into a sensing device. SensorFX also models the collection and processing properties of the sensing device to render an accurate electro-optical (EO), night vision or infrared (IR) scene.

Extensive Sensor Coverage

SensorFX enables you to credibly simulate any sensor in the 0.3-16.0 µm band with VR-Vantage IG, including:

  • FLIRs / Thermal Imagers: 3-5 and 8-12 µm.
  • Image Intensifiers / NVGs: 2nd and 3rd generation.
  • EO Cameras: Color CCD, LLTV, BW, SWIR.


Supports CIGI, DIS and HLA

VR-Vantage IG can receive data from external simulations and be controlled by them using standard simulation protocols. You can connect and disconnect from simulations during runtime and can configure the connections in the graphical user interface (GUI).

VR-Vantage IG supports the following simulation network protocols:

  • Common Image Generator Interface (CIGI) 3.2 and 3.3. 
  • HLA 1.3, HLA 1516, and HLA Evolved. VR-Vantage IG supports the HLA 1.3 specification, the current draft of the IEEE 1516 C++ API maintained by the SISO Dynamic Link Compatible RTI API Product Development Group, and HLA Evolved (IEEE 1516-2010). It has built-in support for the HLA RPR FOM and can support other FOMs through the FOM Mapping feature.
  • Distributed Interactive Simulation (DIS) protocols DIS 4, 5, 6, and 7.

Streaming Video

VR-Vantage IG can send the view in the display window to a video stream. It supports several different open standards. If you have a supported viewer, you can use the simulated video as a flexible alternative to live video for demonstration, development, testing, and embedded training of operational video exploitation systems.

Performance Features

VR-Vantage IG includes tools to help customers measure performance and manage tradeoffs between scene content and performance.

Image generation for immersive display systems requires smooth 60Hz update rates (and VR applications require 90Hz and up). Achieving this standard of performance while maximizing content density is a constant struggle. VR-Vantage IG helps users and system integrators manage this fundamental balance.

High-performance image generators, like VR-Vantage IG, use sophisticated graphics techniques to render beautiful full-motion scenes of the world. Many of these techniques come with a performance cost that can adversely affect frame update rates. VR-Vantage IG exposes all of these techniques to integrators, so they can choose the ones that most improve the rendered scenes for their specific type of simulation. VR-Vantage IG also provides tools to help diagnose performance bottlenecks, which is key to adjusting content and configuration settings to resolve performance problems.

VR-Vantage IG can display two kinds of performance statistics: OSG statistics and VR-Vantage IG statistics.


Performance statistics

The StatsHandler class in the osgViewer library can gather and display the following rendering performance information.

  • Frame rate. osgViewer displays the number of frames rendered per second (FPS).
  • Traversal time. osgViewer displays the amount of time spent in each of the event, update, cull, and draw traversals, including a graphical display.
  • Geometry information. osgViewer displays the number of rendered osg::Drawable objects, as well as the total number of vertices and primitives processed per frame.
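The frame-rate statistic above can be understood as a moving average over recent frame durations. The sketch below illustrates the idea only; it is not the actual osgViewer StatsHandler implementation.

```python
from collections import deque

class FrameRateCounter:
    """Report frames per second as a moving average over recent frames."""

    def __init__(self, window=60):
        self.times = deque(maxlen=window)  # recent frame durations, in seconds

    def record(self, frame_time_s):
        """Call once per rendered frame with that frame's duration."""
        self.times.append(frame_time_s)

    def fps(self):
        """Average frames per second over the window (0.0 before any frames)."""
        if not self.times:
            return 0.0
        return len(self.times) / sum(self.times)
```

Averaging over a window rather than inverting a single frame time keeps the displayed number stable even when individual frames jitter.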

File Caching

VR-Vantage IG saves loaded files in a fast-loading format so that they load quickly from disk the next time you need them. VR-Vantage IG also supports preemptive model loading: you can specify a list of models to load at startup so that there are no stalls when a model is first encountered during a simulation.

Object Instancing

VR-Vantage IG keeps instances of objects in memory and clones or references them instead of loading new copies of the same data from disk again.
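File caching, preemptive loading, and object instancing all boil down to the same pattern: load each asset at most once, keyed by its path, and hand back the cached instance thereafter. The names in this sketch are illustrative, not VR-Vantage API.

```python
class ModelCache:
    """Load each model file once; later requests reuse the cached instance."""

    def __init__(self, loader):
        self._loader = loader   # callable: path -> loaded model
        self._cache = {}        # path -> model instance

    def preload(self, paths):
        """Preemptively load a list of models (e.g. at startup) to avoid
        mid-simulation stalls when an entity first appears."""
        for path in paths:
            self.get(path)

    def get(self, path):
        """Return the cached instance, loading from disk only on first use."""
        if path not in self._cache:
            self._cache[path] = self._loader(path)
        return self._cache[path]
```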

Configurable Render Settings

VR-Vantage IG lets you decide which of the visual features you want to enable and disable, such as advanced lighting, wake and spray effects, shadows, and the ocean height map. This lets you optimize visual quality or performance depending on your needs.

Filtering Entities Using Interest Management

VR-Vantage IG can use interest management to improve its performance in simulations that have high entity counts. Interest management is an implementation of HLA data distribution management (DDM). When interest management is enabled, VR-Vantage IG filters out all entities that are more than a specified distance from the observer.
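The distance-based filtering that interest management performs can be sketched as follows. This is an illustration of the concept only; the real implementation works through HLA DDM regions rather than a per-frame loop.

```python
import math

def filter_by_interest(observer_pos, entities, max_range_m):
    """Keep only entities within max_range_m of the observer.

    observer_pos and each entity position are (x, y, z) tuples in meters.
    Entities beyond the range are filtered out and never processed.
    """
    return [e for e in entities
            if math.dist(observer_pos, e) <= max_range_m]
```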

Light Points and Lobes

VR-Vantage IG offers accurate light points and a highly interactive approach to tuning lighting systems. Interactive editing lets system integrators tune lights in the short amount of time they typically have with subject-matter experts on site, even as late in the delivery process as factory acceptance testing.

Light points are vitally important in flight simulation, where real-world lights are specifically designed to aid pilots in navigation and landing, as well as obstacle avoidance and runway/taxiway operations. For non-aviation uses, VR-Vantage IG’s easy control of lighting systems improves the general appearance of the environment, particularly cities, roadways, and other cultural features.




Terrain

Simulations must exist in the context of a simulated world. VR-Vantage IG supports multiple terrain formats and allows you to combine elevation data, imagery, and feature data to build your simulated environment.

Terrain Agility and Composability

VR-Vantage IG allows you to build your terrain at runtime using a variety of database and vector formats. VR-Vantage IG supports the following broad types of terrain:

  • Terrain models. Static terrains, such as OpenFlight, built using terrain construction applications.
  • Paging terrain. Large area terrains that page in multiple terrain pages, such as MetaFlight.
  • Direct from source. Terrains composed by combining various types of terrain elevation, imagery, and features data.
  • Open streaming terrain. Terrains that stream data from various public and private data sources, such as VR-TheWorld.
  • Procedural terrain. Terrain created using geo-typical imagery based on soil type data.
  • Dynamic terrain. VR-Vantage IG allows changes to OpenFlight models through HLA, DIS, and API calls to support a correlated and robust dynamic terrain environment.

VR-Vantage IG can load OpenFlight, CDB, DTED, CTDB, FBX, and MetaFlight databases. It can load feature data from shapefiles. It can stream 3D Tiles. You can also superimpose geospatial images, such as GeoTIFFs or other raster image files, on the terrain. You can save these composed terrains in the VR-Vantage MTF file format.

 

Streaming and Paging Terrain

For simulation on large terrains, VR-Vantage IG can stream data from external servers or from local directories. VR-Vantage IG can stream elevation and imagery from terrain servers such as VR-TheWorld or other WMS-C (Web Mapping Service-Cached, from Open Geospatial Consortium) and TMS (OSGeo’s Tile Map Service) servers.

VR-Vantage IG uses osgEarth to import streaming terrain elevation and imagery data. (osgEarth is an open source plug-in to OpenSceneGraph, maintained by Pelican Mapping at http://osgEarth.org.) For MetaFlight, VR-Vantage IG has its own pager.
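Tiled services such as TMS address the world as a power-of-two pyramid of tiles, so a pager can fetch exactly the tiles a view needs. Below is a sketch of the standard Web-Mercator tile indexing math (the common XYZ convention; TMS numbers rows from the south, so its row index is flipped).

```python
import math

def lonlat_to_tile(lon_deg, lat_deg, zoom):
    """Return (col, row) of the Web-Mercator tile containing a point
    (XYZ convention: row 0 at the north)."""
    n = 2 ** zoom                                  # tiles per axis at this zoom
    col = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    row = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return col, row

def xyz_row_to_tms(row, zoom):
    """TMS counts rows from the south, so flip the XYZ row index."""
    return (2 ** zoom) - 1 - row
```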

Procedural Terrain

A procedural terrain applies high-resolution geo-typical textures to the terrain based on soil-type information rather than using satellite imagery. This approach provides high-quality visuals over large areas without the performance cost of high-resolution imagery. Customers who use procedural terrain usually insert locally accurate imagery and features for the areas they are particularly interested in. For example, rooftop clutter, such as AC units, hatches, and other mechanical items, can be procedurally added to extruded buildings.

Vegetation

VR-Vantage IG uses SpeedTree software and content for animated real-time 3D foliage and vegetation. SpeedTree models can move with the wind. SpeedTree is developed by Interactive Data Visualization (IDV) (http://www.speedtree.com).

Props

Props are terrain elements, such as buildings, light poles, and vegetation, that you can manipulate through the VR-Vantage IG GUI. VR-Vantage IG can import shapefiles (or other feature-data file formats) and create geometry (props) for the point features in the source data.

Entities

VR-Vantage IG includes many 3D vehicle models, some of which display movement of articulated parts, such as turrets and landing gear. These models can change to show a damaged or destroyed state. You can provide your own models and map them to entity types.

VR-Vantage IG can render real time shadows for entities and lifeforms based on the position of the sun.

Realistic 3D Human Characters

VR-Vantage IG uses DI-Guy software and content for human character animation. VR-Vantage IG comes with DI-Guy functionality built-in, and with a large set of DI-Guy characters, appearances, and animations. If a vehicle has interior geometry, VR-Vantage automatically puts a human character in the driver’s seat.

VR-Vantage IG includes the DI-Guy Character Viewer, which lets you view all of the DI-Guy characters, their various heads and hand items, and their animations.

Cockpits

VR-Vantage IG uses GL Studio software and content to render interactive cockpit instrumentation displays. VR-Vantage IG is delivered with several generic cockpit displays. GL Studio is developed by DiSTI (http://www.disti.com).

Trajectory Smoothing

VR-Vantage IG can smooth the trajectories of moving vehicles to compensate for discontinuous positional data.
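One common way to smooth discontinuous position updates is to move the rendered entity a fraction of the remaining distance toward the latest reported position each frame, rather than snapping to it. This sketch illustrates that general technique; VR-Vantage IG's actual smoothing algorithm may differ.

```python
def smooth_step(rendered_pos, reported_pos, alpha):
    """Advance the rendered position a fraction alpha toward the reported one.

    alpha in (0, 1]: small values smooth aggressively, 1.0 snaps immediately.
    Positions are (x, y, z) tuples.
    """
    return tuple(r + alpha * (t - r) for r, t in zip(rendered_pos, reported_pos))
```

Applied every frame, the rendered position converges on the reported one, turning a positional jump into a short glide.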

Surface Entity Movement

When dynamic ocean is enabled, surface entities bob up and down with ocean swells. Destroyed entities sink beneath the waves.

Ground Clamping

The ground clamping feature can ensure that all ground entities are placed correctly on the terrain surface.
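Conceptually, ground clamping replaces an entity's reported altitude with the terrain height sampled beneath it. A simplified sketch of the idea (the function and parameter names here are illustrative):

```python
def clamp_to_ground(position, terrain_height_at, offset=0.0):
    """Snap an entity to the terrain surface.

    terrain_height_at is any callable (x, y) -> ground height at that point;
    offset raises the model so its origin rests on, not inside, the surface.
    """
    x, y, _z = position
    return (x, y, terrain_height_at(x, y) + offset)
```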

Inset Views

You can display inset views of individual entities. The illustration shows a helicopter hovering over one area of the terrain with an inset view of a vehicle driving through the town.


You can change the observer position and orientation in an inset view using keyboard controls. You can also edit window and channel properties for the window in the Display Engine Configuration Editor.

Sensor Views

VR-Vantage IG can display the view from gimbaled visual sensors simulated by VR-Forces, such as a camera on a UAV. The view is displayed in an inset window that has information about the observer mode and area being viewed. The window has its own observer and you can change the observer mode in the view.


Sound Effects

Sound effects increase the immersive effect of the visual environment. VR-Vantage IG can associate sound effects with entities, fire events, and detonation events. You can enable and disable use of sound effects and you can specify that sounds will be heard only when the observer is within a certain distance of the entity or event that generates them.

Environment

VR-Vantage IG creates a realistic atmospheric environment with sun, moon, clouds, lighting, and precipitation. For the marine environment, VR-Vantage IG renders realistic ocean effects, including waves, swells, wakes, and spray effects. Users can choose among several pre-configured environment conditions or adjust any of the weather and marine features to create custom weather conditions. VR-Vantage IG can display weather conditions sent from VR-Forces.

VR-Vantage IG uses SilverLining software and content to compute lighting and to render the atmosphere. SilverLining is developed by Sundog Software (http://www.sundog-soft.com). VR-Vantage IG uses the Triton SDK, also from Sundog Software, to create the marine environment.

Configurable Weather

VR-Vantage IG lets you set the wind direction and speed, precipitation type and intensity, visibility, and cloud cover. Wind direction and speed affect the dynamic ocean. Together with lighting effects, varying cloud cover creates dramatic environment visualization.

VR-Vantage IG supports fog, dust storms, and hail. Rain can accumulate into puddles and snow can accumulate and blow in the wind.

You can render the effect of rain splashing on the observer’s camera. When dynamic ocean is enabled, waves can splash onto the observer view.


Animated Flags and Windsocks

Flags on terrain and ships, as well as windsocks, respond to wind speed and direction.

Dynamic Ocean

VR-Vantage IG supports a variety of dynamic ocean effects, including:

  • Douglas Sea State. This includes wind waves (wind sea), swell character, and the directions of each. VR-Vantage IG allows each to be separately configured.
  • Wave chop.
  • Surface transparency. The ability to see through the water from above sea level.
  • Underwater visibility. The ability to see underwater.
  • Swell.
  • Surge depth. Lets you calm shallow water to visualize offshore wind and calm harbors.

Lighting

Date and Time

VR-Vantage IG uses a full-year ephemeris model that changes the position of the sun and moon as a function of date and time of day. You can set the date and advance time in real time or at a faster or slower rate. VR-Vantage IG can also advance time based on messages from VR-Forces.
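Advancing time at a faster or slower rate amounts to scaling each real-time step by a rate multiplier before adding it to the simulation clock, which then drives the ephemeris model. A minimal sketch (illustrative, not the product's time-management API):

```python
class SimClock:
    """Simulation clock that advances at a configurable multiple of real time."""

    def __init__(self, start_s=0.0, rate=1.0):
        self.sim_time = start_s   # seconds since the configured start date/time
        self.rate = rate          # 1.0 = real time, 2.0 = twice as fast, etc.

    def tick(self, real_dt_s):
        """Advance by one real-time step and return the new simulation time."""
        self.sim_time += real_dt_s * self.rate
        return self.sim_time
```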

Lighting Effects

VR-Vantage IG supports the following lighting effects:

  • Light Points.
  • Light Lobes.
  • Dynamic lighting.
  • Ephemeris model.
  • Ocean planar reflection.
  • Lens flare.
  • Crepuscular (sun) rays.
  • Atmospheric scattering.
  • Shadows.
  • Fresnel lighting for scene reflections.

VR-Vantage IG supports light lobes as defined in an OpenFlight model. The light can be cast onto the terrain, and the lights can rotate or move like a searchlight or a lighthouse. Because cast lights are computationally expensive, VR-Vantage IG displays only the N closest lights to the observer, where N is configurable.
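Limiting cast lights to the N nearest the observer is a straightforward nearest-neighbor selection; the sketch below shows the idea (illustrative only, not the renderer's code):

```python
import math

def select_cast_lights(observer_pos, light_positions, n):
    """Return the n light lobes closest to the observer.

    Positions are (x, y, z) tuples; only the selected lights would be
    rendered as expensive cast lights each frame.
    """
    return sorted(light_positions,
                  key=lambda p: math.dist(observer_pos, p))[:n]
```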

VR-Vantage IG also supports light points. These are single points that light up in the dark but do not cast light on nearby objects. For example, light points can be used for airport or sea-channel navigation. Lights can be colored, can blink, and can be directional.

Shadows

VR-Vantage IG can display shadows for entities, props, and terrain features. You can enable and disable them and configure many aspects of shadow quality.

Shader-Based Effects Texture Maps

VR-Vantage IG supports several types of shader-based effects texture maps. Texture maps are raster images that apply highly realistic textures to the terrain and models. By applying different types of texture maps to terrain and models, you can improve the visual quality of your simulation without the overhead of high polygon counts.

VR-Vantage IG supports the following types of shader-based effects maps:

  • Normal, or bump, maps. Normal maps give terrains the appearance of relief, such as a rocky landscape.
  • Specular maps. Specular maps affect the highlight color of objects.
  • Ambient occlusion maps. Ambient occlusion maps model areas that do not receive direct light, such as cracks, crevices, and shaded areas of terrain and models. These areas are lit only by ambient light.
  • Reflection maps. Reflection maps affect the reflectivity of surfaces, such as windows, reflecting the sky environment (not the terrain) onto an object.
  • Emissive maps. Emissive maps control the emissivity of whatever they are applied to, based on the current ambient light values.
  • Nightmap maps. Nightmap maps are used when ambient light is low, in conjunction with emissive maps, to control the brightness of pixels.
  • Emissive nightmap maps. Emissive nightmap maps are used when ambient light is low; pixels are 100% bright. You do not need an emissive map, but you cannot control pixel brightness.
  • Lightmap maps. Lightmaps store precomputed lighting for surfaces in a scene.
  • Sensor maps. Sensor maps add an intensity to all three color channels to support night vision goggles and infrared.
  • Gloss maps. Gloss maps control how wide or narrow the specular highlight appears.
  • Flow maps. Flow maps define direction-based distortion, such as water flow.
  • Snow mask maps. Snow mask maps control how quickly snow accumulates and to what depth.
  • Rain mask maps. Rain mask maps control whether an area will have puddles and the amount of accumulation before puddles appear.


Depth of Field

Depth of field controls the area of the scene that is in focus. It is calculated from a focal depth and a focal range. For example, with a focal depth of about 10 meters, fairly close to the observer, and a small focal range, the scene falls out of focus just past that distance.
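In this model, a point is sharp when its distance from the camera lies within the focal range of the focal depth, and blur grows as it leaves that band. A simplified sketch of the idea (the renderer would map the returned value to a blur-kernel radius):

```python
def blur_amount(distance, focal_depth, focal_range):
    """Blur factor for a point at the given distance from the camera.

    Returns 0.0 anywhere inside the in-focus band
    [focal_depth - focal_range, focal_depth + focal_range], then grows
    linearly (normalized by focal_range) as the point leaves the band.
    """
    return max(0.0, abs(distance - focal_depth) - focal_range) / focal_range
```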


Effects and Interactions

VR-Vantage IG displays:

  • Smoking and flaming effects for damaged entities.
  • Trailing effects, such as footprints, wakes, missile trails, and dust clouds.
  • Muzzle flashes.
  • Tactical smoke.
  • Smoke and flame for detonation impacts.

The Particle System Editor gives you greater control over special effects, such as smoke, fire, explosions, debris, dust trails, and weather effects.