
VT MAK Releases VR-Vantage 2.4!

This release enhances the rendering performance of OpenFlight and MetaFlight terrains. Terrains built by offline tools now take advantage of rendering techniques – Indirect Rendering, Bindless Textures, and advanced Occlusion Culling – that were previously only used in procedurally-generated terrains. This upgrade will have the most benefit on databases that are not already highly optimized for VR-Vantage’s render engine.

While substantial architectural changes were made internally to VR-Vantage, they do not significantly alter the VR-Vantage API or configuration, so the upgrade process should be straightforward.

In addition to the performance improvements above, we made a number of smaller, yet important improvements, including:

  • Destructible Terrain with Indirect Rendering – Previous releases used the much faster Indirect Rendering approach for point features on a streaming terrain, but Indirect Rendering was not possible if those point features contained switch nodes. This is no longer the case: buildings with switch nodes (dynamic terrain, destructible states, doors, etc.) now get the same speed-up as any other model by using Indirect Rendering. As a result, dynamic content no longer limits performance in large terrains.

  • Improved Image/Detail Blending – It is now possible to configure VR-Vantage to smoothly transition from real imagery in the distance to high-resolution land-use-based textures close to the camera.
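The distance-based transition described above can be sketched as a simple blend-weight function. This is an illustrative sketch of the general technique, not VR-Vantage's actual API or configuration; the function names and the near/far distances are assumptions chosen for the example.

```python
# Illustrative sketch (not VR-Vantage API): distance-based blending between
# far-range real imagery and near-range land-use-based detail textures.

def smoothstep(edge0: float, edge1: float, x: float) -> float:
    """Hermite interpolation clamped to [0, 1], as in the GLSL built-in."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def detail_weight(camera_distance_m: float,
                  near_full_detail_m: float = 200.0,
                  far_full_imagery_m: float = 2000.0) -> float:
    """Weight of the high-resolution detail texture: 1 near the camera,
    0 in the distance, with a smooth transition in between."""
    return 1.0 - smoothstep(near_full_detail_m, far_full_imagery_m,
                            camera_distance_m)

# A shader would then blend: color = mix(imagery, detail, weight)
print(detail_weight(100.0))   # close to the camera: all detail texture
print(detail_weight(3000.0))  # far away: all real imagery
```

Any smooth monotonic falloff works here; `smoothstep` is a common choice because it avoids a visible seam at the transition edges.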

  • DI-Guys in Vehicles – Too often 3D models are built without humans in them – cars driving around the scene with no driver, or aircraft flying with no pilot. VR-Vantage can now automatically render DI-Guy characters in the driver/pilot seat without simulating an embarked entity on the network. Users can configure the specific character, appearance, and location of the character for their custom 3D vehicle models. This simple change leads to much more realistic immersive scenes, with high-detail characters where you care the most.
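Conceptually, the per-vehicle configuration described above pairs a vehicle model with a character, an appearance, and a seat location. The snippet below is purely hypothetical – the real schema is defined in the VR-Vantage documentation, and every key and value here is an invented placeholder to show the shape of such a mapping.

```python
# Hypothetical sketch only: these keys are NOT VR-Vantage's real
# configuration schema; they illustrate the kind of per-vehicle
# mapping the feature implies.
driver_config = {
    "model": "my_jeep.flt",            # user's custom 3D vehicle model (placeholder name)
    "character": "soldier_male_01",    # DI-Guy character to render (placeholder name)
    "appearance": "desert_camo",       # character appearance (placeholder)
    "seat_offset_m": (0.4, 0.9, 1.1),  # x, y, z offset of the driver seat in model space
}
print(driver_config["character"])
```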

  • CIGI Control of Articulated Parts for Humans – You can now directly control DI-Guy joint angles (elbows, knees, etc.) through CIGI. If your host simulation uses hand controllers, a motion-capture suit, or a video-based motion-capture system such as Microsoft Kinect to capture the motion of a real person, you can now drive the DI-Guy characters within VR-Vantage to mimic that person's motions in real time. This feature brings high-fidelity, fine-grained control of human characters by external data sources to VR-Vantage for some really impressive immersive scenes.
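The data flow above – captured joint angles in, per-joint articulation updates out – can be sketched as follows. This is an illustrative sketch only: the joint-ID table and clamp range are invented for the example, and the tuples produced stand in for, but do not reproduce, the actual CIGI Articulated Part Control packet layout.

```python
import math

# Hypothetical joint IDs: real articulated-part IDs are defined by the
# model/IG configuration, not by this table.
JOINT_IDS = {"left_elbow": 10, "right_elbow": 11,
             "left_knee": 20, "right_knee": 21}

def articulation_updates(captured_deg: dict) -> list:
    """Map captured joint angles (degrees) from a motion-capture source to
    (part_id, angle_radians) pairs, clamping to an assumed human range."""
    updates = []
    for joint, angle in captured_deg.items():
        part_id = JOINT_IDS[joint]
        clamped = max(-160.0, min(160.0, angle))  # assumed plausible range
        updates.append((part_id, math.radians(clamped)))
    return updates

# A noisy tracker may report out-of-range angles; the clamp absorbs them.
print(articulation_updates({"left_elbow": 45.0, "right_knee": 200.0}))
```

In a real integration, each pair would be serialized into a CIGI articulated-part message and sent once per frame, so the character tracks the performer with minimal latency.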

  • Improvements to Sound System API – We have overhauled the API used to control and produce sound in VR-Vantage to make it easier to extend to meet program needs.

  • H.265 Video Streaming – VR-Vantage can now generate streaming H.265 video using on-board GPU hardware to drive high-performance, high-pixel-count displays.
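For context on GPU-accelerated H.265 encoding generally: a common way to produce such a stream outside the product is FFmpeg with the NVIDIA NVENC hardware encoder (`hevc_nvenc`). The sketch below builds such a command line; VR-Vantage's own streamer is configured through the product, and the resolution, bitrate, and URL here are example values.

```python
# Illustrative only: builds an FFmpeg command for GPU H.265 (HEVC) encoding
# via NVENC. This is a generic technique, not VR-Vantage's internal pipeline.

def hevc_stream_command(width: int, height: int, fps: int,
                        bitrate: str, url: str) -> list:
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgra",   # raw frames, e.g. from a render target
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",                              # read frames from stdin
        "-c:v", "hevc_nvenc",                   # NVIDIA GPU H.265 encoder
        "-b:v", bitrate,
        "-f", "mpegts", url,                    # stream as MPEG-TS
    ]

cmd = hevc_stream_command(3840, 2160, 60, "20M", "udp://127.0.0.1:5000")
print(" ".join(cmd))
```

Offloading encoding to the GPU is what makes high-pixel-count output practical: the CPU stays free for simulation while the encoder keeps pace with the renderer.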

In conjunction with this release of VR-Vantage, we are providing a new Data Package. This new Data Package is also compatible with VR-Vantage 2.3.x and VR-Forces 4.6.x. Customers using VR-Forces can reinstall VR-Forces with this data build to gain access to the new terrain and model improvements. However, customers who do not wish to upgrade their VR-Forces data can continue to use VR-Forces 4.6.x with VR-Vantage and the new data.

For a complete description of this release, please see the Release Notes:

 

MAK is dedicated to helping you succeed. If you're a MAK customer, let us know if you would like to receive product download links as a part of our MAK Announce emails. To reach us with questions or comments, please email us.
 

Get more information about who we are and what we do...

 

 
