
System Designs

See the types of simulation systems and components that our customers build with MAK ONE. For each, we identify where MAK ONE products benefit the design.

Explore the possibilities


Sensor Operator Training

Sensor operators and aircraft pilots provide critical intelligence and support during complex, sometimes dangerous missions. To qualify, they must have in-depth control of the ground control station sensor system along with the tactical prowess required to work as a team during missions with unpredictable complications.

To support this training, VT MAK offers flexible and powerful simulation products designed to address the range of Sensor Operator and Pilot training requirements. Here are a few examples:


Use Case #1: Add ISR sensors to an existing training system
Whether you want to add an airborne sensor asset to an existing exercise or to host a classroom full of beginner sensor-operators, VR-Engage Sensor Operator offers a quick and easy way to integrate with simulation systems right out of the box, as shown in Figure 1. Experienced participants can use it to provide intel from their payload vantage point, and beginners can gain baseline training.


Figure 1: A Sensor Operator adds another viewpoint to a JTAC system

 

All MAK products are terrain- and protocol-agile, allowing you to leverage your existing capabilities while attaching a gimbaled sensor to any DIS or HLA entity in your simulation system.
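To make the gimbaled-sensor idea concrete, here is a minimal sketch of the math involved in slaving a sensor viewpoint to a host entity whose pose arrives over DIS or HLA. The Pose struct and sensorViewDirection function are illustrative stand-ins, not VR-Engage’s actual API; in a real integration, the host pose would come from the network layer.

    // Minimal sketch (not VR-Engage's API): slave a gimbaled sensor viewpoint
    // to a host entity received over DIS/HLA. The entity pose would come from
    // the network; here it is a plain struct for illustration.
    #include <cmath>
    #include <cstdio>

    struct Pose { double x, y, z; double heading, pitch, roll; }; // radians

    // Combine host heading/pitch with gimbal azimuth/elevation to get a
    // world-space view direction (flat-earth; roll ignored for brevity).
    void sensorViewDirection(const Pose& host, double gimbalAz, double gimbalEl,
                             double out[3])
    {
        const double az = host.heading + gimbalAz;  // pan relative to airframe
        const double el = host.pitch + gimbalEl;    // tilt relative to airframe
        out[0] = std::cos(el) * std::sin(az);       // east component
        out[1] = std::cos(el) * std::cos(az);       // north component
        out[2] = std::sin(el);                      // up component (negative = down)
    }

    int main() {
        Pose uav{0.0, 0.0, 1500.0, 1.571, 0.0, 0.0};  // flying east at 1500 m
        double dir[3];
        sensorViewDirection(uav, -0.3, -0.5, dir);    // panned left, tilted down
        std::printf("view dir: %.2f %.2f %.2f\n", dir[0], dir[1], dir[2]);
    }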


Use Case #2: Train in-depth system operation

Before operating a sensor system in the real world, operators need in-depth training on the system itself.

This training is done with the combination of VR-Forces as the simulation engine and VR-Engage as the Sensor Operator role player station. VR-Forces is used to design scenarios that teach the essential skills of controlling the sensor gimbal. It provides a way to assign real-world Patterns of Life and add specific behavioral patterns to human characters or crowds. Fill the world with intelligent, tactically significant characters (bad guys, civilians, and military personnel) to search for or track. Create targets, threats, triggers, and events. VR-Forces also provides the computer-assisted flight control for the sensor operator’s aircraft.

VR-Engage can be configured to use custom controllers and menu structures to mimic buttonology and emulate the physical gimbal controls. Adding SensorFX to VR-Engage further enhances fidelity, providing physics-based visuals with the same effects, enhancements, and optimizations as the actual sensor. In short, students can train on a replica of their system configuration.


Figure 2:  Training to operate a particular sensor device 

 


Use Case #3: Train the full airborne mission team

Before integrating with a larger mission, the Sensor and platform operators must learn to operate tactically as a Remotely Piloted Aircraft unit.

These skills can be acquired while training side by side on a full-mission trainer. The stations in Figure 3 use combinations of VR-Forces and VR-Engage to fulfill the roles of the Instructor, the Pilot of the aircraft, and the Sensor Operator.

Pilot:
In the Pilot station, VR-Forces provides the Computer-Assisted flight control of the UAV.

Through the VR-Forces GUI’s 2D map interface, a user can task a UAV to fly to a specific waypoint or location, follow a route, fly at a desired altitude or heading, orbit a point, and even point the sensor at a particular target or location (sometimes the pilot, rather than the Sensor Operator, will want to temporarily control the sensor). A user can also create waypoints, routes, and other control objects during flight. In addition, the VR-Forces GUI can show the footprint and frustum of the sensor to enhance situational awareness (in 2D and 3D).
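For a sense of what a footprint display computes, the basic geometry is simple under a flat-earth assumption. The values below are illustrative, and VR-Forces’ own computation (which accounts for terrain) is not shown here.

    // Illustrative footprint math (not VR-Forces code): a camera at altitude h,
    // looking down at depression angle depr with vertical field of view fov,
    // covers this span of ground ranges along the look direction.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double pi   = 3.141592653589793;
        const double h    = 1500.0;              // altitude above ground, meters
        const double depr = 30.0 * pi / 180.0;   // depression angle below horizon
        const double fov  = 10.0 * pi / 180.0;   // vertical field of view

        const double nearEdge = h / std::tan(depr + fov / 2.0);  // ~2143 m
        const double farEdge  = h / std::tan(depr - fov / 2.0);  // ~3216 m
        std::printf("footprint: %.0f m to %.0f m downrange\n", nearEdge, farEdge);
        // A 2D map draws this span (with the horizontal FOV) as the footprint
        // polygon; the 3D view draws the same angles as a frustum.
    }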

VR-Engage provides manual control of the aircraft, including realistic aircraft dynamics and response to the environmental conditions. In this role, the Pilot can choose to see what the sensor sees, and even share control with the Sensor Operator.

Sensor Operator:
VR-Engage provides the role of the Sensor Operator, letting the user manually control the gimbaled sensor on the platform. In this role, the Sensor Operator gains the required set of advanced skills and tactical training to become an integral part of the mission. They learn to acquire and track targets and to prioritize mission-related warnings, updates, and radio communications.

Instructor Station:
This is where the scenario design gets creative: the instructor can use VR-Forces to inject complexities into the scenario, using its advanced AI to create tactically significant behaviors in human characters or crowds. Tweak the clouds and fog, or produce rain to change visibility. Increase the wind, change its direction, and even jam communications at runtime.

As students advance through full mission training, they learn to support their crewmen in complex missions. They share salient information with each other, operate radios, and communicate with ground teams, rear-area commanders, and other entities covering the target area.


Figure 3: Apply knowledge of systems, weapons and tactics to complete missions together

MAK products are well suited to Sensor Operator training. VR-Engage’s Sensor Operator role is ready to use and connect to existing training simulations; it can be configured and customized to emulate specific sensor controllers, and it can interoperate with the full capabilities of VR-Forces to form full mission trainers.

MAK products can be used for live, virtual and constructive training.

Get in touch and let us help you Get Ahead of the Game.

 


Intelligence, Surveillance, Reconnaissance Systems

MAK software is useful in a multitude of ways to those modeling, simulating, and training with ISR systems.

Thanks to longtime MAK values like interoperability and flexibility, along with some recently added features, MAK products fit whether you’re using the whole suite to model a complete simulation environment or integrating a single MAK component into your system.

In this infographic, we illustrate the multitude of ways that MAK software can contribute to a simulation that models border security scenarios.

 


 

MAK’s ability to model ISR systems comes in many forms:

  • The VR-Engage Sensor Operator role can be used to control a gimbaled sensor while viewing the sensor imagery feed.
  • The vehicle/platform carrying the sensor pod can be flown interactively using the first-person VR-Engage Pilot role.
  • VR-Forces computer generated forces AI can fly aircraft along planned routes while also responding to simulation events.
  • Sensors can be fixed at a location such as on a building or observation tower.
  • VR-Vantage IG renders sensor video and streams it to real (non-simulated) systems like unmanned vehicle ground stations or image processing and analysis tools.
  • At a more abstract level, VR-Forces can model scenarios where entities detect one another and send spot reports to intelligence systems.

All of these configurations benefit from MAK’s full suite of product capabilities.

  • VR-Forces models scenarios involving vehicles, weapons, and human characters.
  • VR-TheWorld Server provides geographic data to create the terrain databases within VR-Forces, VR-Vantage, and VR-Engage.
  • DI-Guy SDK provides human character visualization to integrate into many commercial visual systems.
  • MAK interoperability tools (VR-Link, MAK RTI, VR-Exchange) provide network connectivity between all the components of the ISR system.

 


Command Staff Training

MAK CST



Shipboard Weapons Training System


This Shipboard Weapons Training System immerses trainees in a virtual environment. Students learn to operate the weapons while developing team communication and coordination.

Related Use Case: The French Navy’s ship defence simulators create an immersive environment, allowing personnel to train and practice shipboard defence against any kind of threat.

Shipboard weapons training systems focus on developing coordination and firing skills amongst a gunnery unit at sea. When it comes to appropriate levels of fidelity, it is important to develop a system that replicates the firing process, accurately renders weapon effects, and captures the environmental conditions of operating on a ship in motion. In addition, the training system must stimulate the unit with appropriate threats.

VT MAK’s off-the-shelf technologies transform this system into a realistic simulation environment, providing an ideal training ground for a large variety of training skills and learning objectives.


 

The MAK Advantage:

Choosing MAK for your simulation infrastructure gives you state-of-the-art technology and the renowned ‘engineer down the hall’ technical support that has been the foundation of MAK’s culture since its beginnings.

 

MAK Capabilities within each of the simulation components:

The Dome and the UAV Station — VR-Vantage IG
  • Game/Simulator Quality Graphics and Rendering Techniques take advantage of the increasing power of NVIDIA graphics cards, yielding stunning views of the ocean, the sky, and everything in between.
  • Built-in Support for Multi-Channel Rendering. Depending on system design choices for performance and number of computers deployed, VR-Vantage can render multiple channels from a single graphics processor (GPU) or can render channels on separate computers using Remote Display Engines attached to a master IG channel.
  • 3D Content to Represent many vehicle types, human characters, weapon systems, and destroyable buildings. Visual effects are provided for weapons engagements, including particle systems for signal smoke, weapon fire, detonations, fire, and smoke.
  • Terrain Agility supports most of the terrain strategies commonly used in the modeling, simulation & training industry.
  • Environmental Effects are modeled and rendered with realism: proper lighting, day or night; the effects of shadows; and atmospheric and water effects, including dynamic oceans (tide, swell size and direction, transparency, reflection, etc.). Add multiple cloud layers, wind, rain, snow, and even trees and grass that move naturally with the variations of wind.
  • Sensor Modeling captures the look and feel of the UAV sensor feed. Sensors can be controlled by a participant or given an artificial intelligence plan. Add SensorFX to model physically accurate views from a UAV's sensor, accounting for environmental variations such as fog, dew, snow, rain, and many other factors that influence temperature and visibility; a simplified illustration follows this list. To see how it's done, click here.
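For a rough intuition of what physics-based sensor modeling must account for, the textbook Beer-Lambert extinction law shows how weather drives down the signal reaching a sensor. This is only a simplified illustration with made-up coefficients; SensorFX’s models are far more detailed.

    // Textbook Beer-Lambert extinction: transmittance falls off exponentially
    // with range, with an extinction coefficient set by the weather.
    // Coefficients below are illustrative only, not SensorFX values.
    #include <cmath>
    #include <cstdio>
    #include <initializer_list>

    int main() {
        const double rangeMeters = 3000.0;   // sensor-to-target range
        const double clearAir = 0.05e-3;     // extinction coefficients, 1/m
        const double rain     = 0.5e-3;
        const double fog      = 3.0e-3;
        for (double sigma : {clearAir, rain, fog}) {
            const double transmittance = std::exp(-sigma * rangeMeters);
            std::printf("sigma=%.2f/km -> %.2f%% of target signal reaches sensor\n",
                        sigma * 1000.0, transmittance * 100.0);
        }
    }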
First-person Helicopter Flight Simulator — VR-Engage
  • A high-fidelity vehicle physics engine needed for accurate vehicle motion.
  • Ground, rotary and fixed-wing vehicles, and the full library of friendly, hostile, and neutral DI-Guy characters.
  • Radio and voice communications over DIS and HLA using Link products.
  • Sensors, weapons, countermeasures, and behavior models for air-to-air, air-to-ground, on-the-ground, and person-to-person engagements.
  • Vehicle and person-specific interactions with the environment (open and close doors, move, destroy, and so on.)
  • Standard navigation displays and multi-function display (MFD) navigation chart.
  • Terrain agility. As with VR-Vantage IG and VR-Forces, you can use the terrain you have or take advantage of innovative streaming and procedural terrain techniques.
Instructor Operator Station — VR-Forces and VR-Engage

VR-Forces is a powerful, scalable, flexible, and easy-to-use computer generated forces (CGF) simulation system used as the basis of Threat Generators and Instructor Operator Stations (IOS). VR-Forces provides the flexibility to fit a large variety of architectures right out of the box or to be completely customized to meet specific requirements.

VR-Engage lets the instructor choose when to play the role of a first person human character; a vehicle driver, gunner, or commander; or the pilot of an airplane or helicopter. The instructor takes interactive control of the jet in real time, engaging with other entities using input devices. This adds the human touch for higher behavioral fidelity when needed.

Some of VR-Forces’ many capabilities:

  • Scenario Definition that enables instructors to create, execute, and distribute simulation scenarios. Using its intuitive interfaces, they can build scenarios that scale from just a few individuals in close quarters to large multi-echelon simulations covering the entire theater of operations. The user interface can be used as-is or customized for a training-specific look and feel.
  • Simulation of object behaviors and interactions with the environment and other simulation objects. These behaviors give VR-Forces simulation objects a level of autonomy to react to the rest of the simulation on their own. This saves you from having to script their behavior in detail.
  • It's easy to set up and preserve your workspace, override default behaviors, and modify simulation object models. Simulation objects have many parameters that affect their behavior.
  • Training Exercise Management, allowing the instructor to manipulate all entities in real-time while the training is ongoing.
  • Artificial Intelligence (AI) control, where entities are given tasks to execute missions, like attack plans that trigger when the ship reaches a certain waypoint. While on a mission, reactive tasks deal with contingencies as the CGF AI plays out the mission.
  • 2D & 3D Viewing Control that allows the instructor to switch and save preferred points of view in real time.
  • High-fidelity weapons-interaction components plug in for accurate real-time effects.

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation.
Interested in a demonstration?

 


Air Mission Operations Training Center


Air Mission Operations Training Centers are large systems focused on training aircraft pilots and the teams of people needed to conduct air missions.

To make the simulation environment valid for training, simulators are needed to fill the virtual world with mission support units, opposing forces & threats, and civilian patterns of life. Depending on the specifics of each training exercise, the fidelity of each simulation can range from completely autonomous computer generated forces, to desktop role player stations, to fully immersive training simulators.

Scroll down to watch a video on how VT MAK’s simulation technology fits into an air mission operations training center.

 

The MAK Advantage:

MAK technologies can be deployed in many places within an air mission operations training center.

VT MAK provides VR-Forces, a powerful and flexible computer generated forces simulation used to manage air, land, and sea missions, as well as civilian activity. It can be the ‘one CGF’ for all operational domains.

Desktop role players and targeted fidelity simulators are used where human players are needed to increase fidelity and represent tactically precise decision making and behavior.

Remote simulation centers connect over long-haul networks to participate when specific trials need the fidelity of those high-value simulation assets. MAK offers an interoperability solution that facilitates a common extensible simulation architecture based on international standards. VR-Link helps developers build DIS & HLA into their simulations. VR-Exchange connects simulations even when they use differing protocols. The MAK RTI provides the high-performance infrastructure for HLA networking.

Local simulators, based on MAK’s VR-Engage, take the place of remote simulations — when connecting to remote facilities is not needed. VR-Engage lets users play the role of a first person human character; a vehicle driver, gunner or commander; or the pilot of an airplane or helicopter.

VR-Engage can be used for role player stations, or as the basis for targeted-fidelity or high-fidelity simulators.

MAK products are meant to be used in complex simulation environments — interoperating with simulations built by customers and other vendors. However, big efficiencies are gained by choosing MAK products as the core of your simulation environment.

 

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 


Flight Deck Training


What’s at stake?

Working on the flight deck of an aircraft carrier is dangerous business. Especially if your job is in “the bucket” and you’re supposed to release the catapult to accelerate a jet down the deck to aid its takeoff. You need to set the tension on the catapult after you get the weight from the refueling crew, you need to get the thumbs up from the deck chief, and finally make sure that the pilot is focused on the runway and ready. If you let that catapult go too soon, it’s going to hurt – a lot.

The MAK Advantage

VT MAK has the tools you need to develop a training system to teach flight deck safety procedures. 

  • DI-Guy Scenario can create human character performances that model the activities on the flight deck.
  • With the DI-Guy SDK, you can integrate the performances into your image generator, or you can use VR-Vantage IG, which has DI-Guy built in. 
  • If you need special gestures to meet your training requirements, you can use the DI-Guy Motion Editor to customize the thousands of character appearances that come with all DI-Guy products. Or you can create characters from motion capture files.
  • If your training requires the detail of specific facial expressions, then DI-Guy Expressive Faces will plug right in and give you the control you need.
  • And if you’d like help pulling this all together, MAK is here to help, with renowned product support and custom services to ensure your training system is a success.


Mechanized Infantry Training


Complex training systems typically require similarly complex software and hardware configurations. MAK’s technology breaks with that convention by making it easy to put together amazing training systems with simple system architectures. In this case, four different training setups are brought to life at appropriate fidelities for each player, all in one streamlined package, to yield a comprehensive Mechanized Infantry Trainer.

The MAK Solution:

VR-Vantage provides trainees with high-detail role-specific visual scenes, including scenes with high-fidelity data overlays. VR-Vantage emulates exterior camera views and 2D maps for the driver and commander, and a scope for the gunner with accurate data displays. Instructors use VR-Vantage to observe the exercise from a third-person perspective and evaluate trainees. VR-Vantage can be customized to match performance and resolution needs, and is used on a range of hardware, from lightweight laptops to complex motion-platform simulators.

For instructors looking to control the simulation and incorporate computer-generated forces, VR-Forces is the perfect pairing for VR-Vantage. VR-Forces is a scalable simulation engine that allows instructors to populate the scene with friendly forces, hostile units, civilians, and obstacles. Instructors use VR-Forces to move units around the scene, setting up scenarios or altering a live situation in real time.

With MAK you have choices on how to create a host vehicle simulation. For ground vehicles we’ve found Vortex, by CM Labs, to be an excellent vehicle dynamics solution. Vortex’s contact dynamics simulate all the moving parts of the vehicle, including the interaction with the terrain, water, obstacles, vision systems, grasping, and more. Everything from suspension travel to traction and gearing is accounted for to provide the driver with an enriching, engaging training scenario. RT Dynamics provides the flight dynamics for air vehicles, increasing realism for all maneuvers with physics-based aircraft entities. It also adds new maneuvers such as formation flight, terrain-following flight, and vertical takeoff/landing.

VR-TheWorld Server is a powerful web-based streaming terrain server that lets you stream in elevation, features, and imagery. It streams the terrain database to each station, giving users a synthetic environment within which to simulate.

Both VR-Forces and VR-Vantage include MAK’s networking technology. VR-Link’s protocol-independent API allows both applications to communicate through the industry-standard High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols, including HLA 1.3, HLA 1516, HLA Evolved, DIS, and DIS 7. The MAK Data Logger records and plays back all the network simulation traffic for after action review and analysis. The MAK RTI (runtime infrastructure) is available when connecting to HLA federations using any of these SISO standard protocols: HLA 1.3, HLA 1516, and HLA Evolved.
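The practical value of a protocol-independent API is that simulator code is written once and the wire protocol is selected at startup or in configuration. The sketch below shows that pattern in isolation; the class and method names are invented for illustration and are not VR-Link’s actual classes.

    // Sketch of the protocol-independent pattern VR-Link provides (invented
    // names, not the VR-Link API): one code path, protocol chosen at startup.
    #include <memory>
    #include <string>

    struct EntityState { double pos[3]; double vel[3]; };

    class ExerciseConnection {           // hides DIS vs. HLA behind one interface
    public:
        virtual ~ExerciseConnection() = default;
        virtual void publishEntityState(const EntityState& s) = 0;
    };

    class DisConnection : public ExerciseConnection {
        void publishEntityState(const EntityState&) override
        { /* pack an Entity State PDU and send it over UDP */ }
    };

    class HlaConnection : public ExerciseConnection {
        void publishEntityState(const EntityState&) override
        { /* update the attributes of a registered HLA object instance */ }
    };

    std::unique_ptr<ExerciseConnection> makeConnection(const std::string& protocol) {
        if (protocol == "dis") return std::make_unique<DisConnection>();
        return std::make_unique<HlaConnection>();  // e.g. "hla1516e"
    }

    int main(int argc, char** argv) {
        // The simulation code below is identical for either protocol.
        auto conn = makeConnection(argc > 1 ? argv[1] : "dis");
        EntityState tank{{10.0, 20.0, 0.0}, {5.0, 0.0, 0.0}};
        conn->publishEntityState(tank);
    }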

Want to learn more? Have a look at the VR-Vantage page for more information. Interested in seeing a demonstration?

 


Close Air Support: JTAC Training


As part of a Tactical Air Control Party (TACP), only the Joint Terminal Attack Controller (JTAC) is authorized to say CLEARED HOT on the radio and direct aircraft to deliver their ordnance on a target.

JTACs are relied on to direct and coordinate close air support missions, advise commanders on matters pertaining to air support, and observe and report the results of strikes. Their ability to communicate effectively with pilots and coordinate accurate air strikes can play a huge role in the success of a mission.

Virtual training systems allow JTACs to practice identifying targets, calibrating their locations, requesting air support, and the highly-specialized procedures for communicating with pilots. 

Scroll down to watch a video on how VT MAK’s simulation technology comes together to make up a JTAC simulator.

The MAK Advantage:

The JTAC simulator in this use case takes advantage of simulations built on MAK’s core technologies.

The tight coupling of system components provides a rich simulation environment for each participant. The JTAC simulation is rendered in the dome using VR-Vantage; the flight simulation takes advantage of the VR-Forces first-person simulation engine; and the instructor/role player station uses VR-Forces CGF to populate the synthetic environment and control the training scenarios.

All these system components share a common terrain database and are connected together using VR-Link and the MAK RTI, giving the system integrator the ability to deploy reliably and cost effectively while leaving open the opportunity to expand the system to add bigger and more complex networks of live, virtual and/or constructive simulations.

Choosing MAK for your simulation infrastructure gives you state-of-the-art technology and the renowned ‘engineer down the hall’ technical support that has been the foundation of MAK’s culture since its beginnings.

Capabilities the core technologies bring to the simulators:

JTAC Dome — Built with VR-Vantage
  • Game/Simulator Quality Graphics and Rendering Techniques

    VR-Vantage uses the most modern image rendering and shader techniques to take advantage of the increasing power of NVIDIA graphics cards. VT MAK's Image Generator has real-time visual effects to rival any modern IG or game engine.

  • Multi-Channel Rendering

    Support for multi-channel rendering is built in. Depending on system design choices for performance and number of computers deployed, VR-Vantage can render multiple channels from a single graphics processor (GPU) or can render channels on separate computers using Remote Display Engines attached to a master IG channel.

  • 3D Content to Represent Players and Interactions

    VR-Vantage is loaded with content including 3D models of all vehicle types, human characters, weapon systems, and destroyable buildings. Visual effects are provided for weapons engagements including particle systems for signal smoke, weapon fire, detonations, fire, and smoke.

  • Terrain Agility

    All MAK’s simulation and visualization products are designed to be terrain agile, meaning they can support most of the terrain strategies commonly used in the modeling, simulation & training industry. Look here for technical details and a list of the formats supported.

  • Environmental Modeling

    VR-Vantage can render scenes of the terrain and environment with the realism of proper lighting — day or night, the effects of illuminated light sources and shadows, atmospheric and water effects including multiple cloud layers effects and dynamic oceans, trees and grass that move naturally with the wind.

  • Sensor Modeling

    VR-Vantage can render scenes in all wavelengths: night vision, infrared, and visible (as needed on a JTAC’s dome display). Sensor zooming, depth-of-field effects, and reticle overlays model the use of binoculars and laser range finders.

Flight Simulator — Built with VR-Forces & VR-Vantage
  • Flight Dynamics

    High-fidelity, physics-based aerodynamics model for accurate flight controls using game- or professional-level hands-on throttle and stick (HOTAS) controls.

  • Air to Ground Engagements

    Sensors: a targeting pod (IR camera with gimbal and overlay) and SAR request/response (requires RadarFX SAR Server). Weapons: missiles, guns, and bombs.

  • Navigation

    Standard six-pack navigation displays and multi-function display (MFD) navigation chart.

  • Image Generator

    All the same VR-Vantage-based IG capabilities in a flight simulator/role player station as in the JTAC’s dome display. The flexibility to configure as needed: single screen (OTW + controls + HUD), dual screen (OTW + HUD, controls), or multi-screen OTW (using remote display engines).

  • Integration with IOS & JTAC

    The flight simulator is integrated with the VR-Forces-based IOS so the instructor can initialize the combat air patrol (CAP) mission appropriately in preparation for the close air support (CAS) mission called by the JTAC. All flights are captured by the MAK Data Logger for after action review (AAR) analysis and debriefing. Radios are provided that communicate over the DIS or HLA simulation infrastructure and are recorded by the MAK Data Logger for AAR.

Instructor Operator Station — Built with VR-Forces

VR-Forces is a powerful, scalable, flexible, and easy-to-use computer generated forces (CGF) simulation system used as the basis of Threat Generators and Instructor Operator Stations (IOS).

  • Scenario Definition

    VR-Forces comes with a rich set of capabilities that enable instructors to create, execute, and distribute simulation scenarios. Using its intuitive interfaces, they can build scenarios that scale from just a few individuals in close quarters to large multi-echelon simulations covering the entire theater of operations. The user interface can be used as-is or customized for a training specific look and feel.

  • Training Exercise Management

    All of the entities defined by a VR-Forces scenario can be interactively manipulated in real-time while the training is ongoing. Instructors can choose from:

    Direct control, where new entities can be created on the fly or existing entities can be moved into position, their status, rules of engagement, or tasking changed on a whim. Some call the instructor using this method a “puckster”.

    Artificial Intelligence (AI) control, where entities are given tasks to execute missions, like close air support (CAS), suppressive fire, or attack with guns. While on a mission, reactive tasks deal with contingencies as the CGF AI plays out the mission. In games, these are sometimes called “non-player characters”.

    First person control, where the instructor takes interactive control of a vehicle or human character and moves it around and engages with other entities using input devices.

  • 2D & 3D Viewing Control

    When creating training scenarios, the VR-Forces GUI allows instructors to quickly switch between 2D and 3D views.

    The 2D view provides a dynamic map display of the simulated world and is the most productive for laying down entities and tactical graphics that help to control the AI of those entities.

    The 3D views provide intuitive, immersive situational awareness and allow precise placement of simulation objects on the terrain. Users can quickly and easily switch between display modes or open a secondary window and use a different mode in each one.

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 


Air Traffic Simulation

Bringing Advanced Traffic Simulation and Visualization Technologies to Help Design and Build the Next-Generation ATM System

The U.S. Federal Aviation Administration (FAA) is embarking on an ambitious project to upgrade the nation’s air traffic management (ATM) systems. Each team under the FAA Systems Engineering 2020 (SE2020) IDIQ will need an integrated, distributed modeling and synthetic simulation environment to try out their new systems and concepts. Aircraft manufacturers will need to upgrade their aircraft engineering simulators to model the new ATM environment in order to test their equipment in realistic synthetic environments.

The MAK Advantage:

MAK saves our customers time, effort, cost, and risk by applying our distributed simulation expertise and suite of tools to ATM simulation. We provide cost-effective, open solutions to meet ATM visualization and simulation requirements across a breadth of application areas:

  • Concept Exploration and Validation
  • Control Tower Training
  • Man-Machine Interface Research
  • System Design (by simulating detailed operation)
  • Technology Performance Assessment

VT MAK is in a unique position to help your company in this process. Many players in the aviation space already use MAK products for simulation, visualization, and interoperability. For the ATM community we can offer:

  • Simulation Integration and Interoperability. The US has standardized on the High Level Architecture (HLA) protocol for interoperability, and the MAK RTI is already in use as part of AviationSimNet. We can provide general distributed simulation interoperability services, including streaming terrain (VR-TheWorld Server, GIS Enabled Modeling & Simulation), gateways to operational systems (VR-Exchange), and simulation debugging and monitoring (HLA Traffic Analyzer).
  • Visualization. MAK’s visual simulation solution, VR-Vantage IG, can be used to create 3D representations of the airspace from the big picture to an individual airport. VR-TheWorld can provide the central terrain storage.
  • Simulation. VR-Forces is an easy-to-use CGF for developing specific models of non-commercial aviation entities such as UAVs, fighter aircraft, rogue aircraft, people on the ground, ground vehicles, etc. In addition, as an open toolkit, VR-Forces may well be preferred in many labs over closed ATC simulators.


UAS Operations


 

What’s at Stake?

You are tasked with training a team of sensor payload operators to use UAVs for urban reconnaissance missions in a specific city. Upon completion of training, trainees must be able to comb an area for a target, make a positive identification, monitor behavior and interactions, radio in an airstrike, and then report on the outcome.

An ineffective training environment could lead to additional costs, lost targets, and inefficient surveillance systems. Training with a robust solution strengthens homeland security personnel for a minimal investment.

What Are We Building?

As the instructor, you need to mock up a ground control station with accurate pilot/payload operator role definitions and supply that system with surveillance data from a content-rich simulation environment. You need to construct a scene that is informative, while providing trainees with opportunities to develop their instincts and test their operating procedures based on how the scenario unfolds.

Each UAV must be equipped with an electro-optical camera as well as an infrared sensor mounted to a gimbal. Radio communication between the UAV operators and a central command center must be available to coordinate surveillance and call in airstrikes.
Trainees need to experience the scenario through the electro-optical sensor and infrared sensor with rich, accurate data overlays to provide them with the information they need to communicate positioning and targeting effectively.
Your urban environment requires crowds of people who behave in realistic ways and traverse the city along intelligent paths. When a UAV operator spots someone, they need to be able to lock onto that person while in motion, mimicking algorithmic tracking tools.
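The pointing math behind such a lock is easy to sketch. The following is a generic proportional tracking loop with invented types and gains, not the tracker VR-Forces actually implements.

    // Illustrative tracking loop (not VR-Forces' tracker): a proportional
    // pointing law that eases gimbal pan/tilt toward a moving target each
    // frame, mimicking a video tracker that has locked on.
    #include <cmath>
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Pan/tilt angles (radians) that aim from the sensor position at the target.
    void aimAngles(const Vec3& sensor, const Vec3& target, double& pan, double& tilt) {
        const double dx = target.x - sensor.x, dy = target.y - sensor.y;
        const double dz = target.z - sensor.z;
        pan  = std::atan2(dx, dy);                   // bearing east of north
        tilt = std::atan2(dz, std::hypot(dx, dy));   // negative = look down
    }

    int main() {
        Vec3 uav{0, 0, 1200}, target{400, 900, 0};
        double pan = 0, tilt = 0;
        const double gain = 0.2;                     // gain < 1 smooths the motion
        for (int frame = 0; frame < 5; ++frame) {
            double wantPan, wantTilt;
            aimAngles(uav, target, wantPan, wantTilt);
            pan  += gain * (wantPan  - pan);         // ease toward the target
            tilt += gain * (wantTilt - tilt);
            target.x += 5.0;                         // target keeps moving
            std::printf("frame %d: pan=%.3f tilt=%.3f\n", frame, pan, tilt);
        }
    }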
 

The simulation needs to be adjustable in real time so that the instructor can minimize repeat behaviors and walk the team through different scenarios. Instructors also must be able to judge the effectiveness of a trainee’s technique.

The MAK Advantage:

In this particular case, VR-Forces provides all the software you need to bring your environment to life. 
Watch this MAKtv episode to see how easy it is to set up.


Sensor modeling is a point of strength for VR-Forces. Give your trainees a beautiful, detailed point of view of the scene through the electro-optical sensor, and provide a high-fidelity infrared sensor display when the daylight fades. VR-Forces adds accurate data overlays so that trainees can learn to quickly and accurately read and report based on that information. Instructors can visualize 3D volumetric view frustums to assess trainees’ combing strategies, identify gaps in coverage, and engineer surveillance systems. We model sensor tracking to lock onto targets while they are moving or at a fixed location.

VR-Forces is an ideal tool for scenario development. It can model UAVs in fine detail while allowing instructors to customize those entities based on the scope of a mission. It’s simple to add the gimbal-mounted sensor array needed for this scenario and to define its parameters, including zoom, zoom speed, slew rate, and gimbal stops. Easily populate an urban environment with people by using the group-objects function to add crowds of entities at a time. VR-Forces has features from Autodesk’s Gameware built in, enabling Pattern of Life intelligent flows of people and vehicles, in addition to plotting the locations and tasks of individual entities. Pattern of Life lets you manipulate patterns within the scenario, including realistic background traffic, whether on foot, on the road, or in the air. Certain DI-Guy capabilities have been integrated into VR-Forces, making behavior modeling more authentic thanks to motion capture technology. Now you can train your team to look out for suspicious movements and calibrate their responses based on the actions of the target.

What really makes VR-Forces perfect for training is the ability of instructors to manipulate the scenario in real time. You can keep your trainees from running scenarios that are too predictable by having your target enter buildings, change his mode of transportation, or actively attempt to avoid detection, all during live action.

Interested in Learning More? Have a look at the VR-Forces page for more information.  

Can we interest you in a personal demonstration?

 


Developing Air and Ground Traffic Policy

Developing Air and Ground Traffic Policy in a World Increasingly Populated by UAS

As UAS technologies become more accessible, an increase in air traffic, particularly around urban centers, is inevitable. It will be essential for governments and their agencies to develop policies regarding air traffic and its relationship with ground traffic, specifically for low-flying UASs, and particularly in emergency situations. Well-developed traffic management will maximize safe traffic speed in regular conditions and divert flows efficiently in emergency scenarios when first responders are rushing to a scene. Poor planning may result in economic and human loss. Simulation is an ideal space to test current traffic policies under changing conditions and to research and develop new solutions.

Governments and agencies need a tool that can depict an area modeled after their own and simulate air traffic within it. The tool should be capable of depicting specific types of air traffic, including planes, helicopters, and UASs, as well as airspace demarcation. There needs to be a concurrent display of ground traffic, including pedestrians, bicyclists, and vehicles - particularly around the scene of an incident. Policymakers want to be able to visualize traffic flows and craft response strategies for general and specific situations.

 

The MAK Advantage:

VT MAK offers commercial-off-the-shelf (COTS) technology to construct airspace simulations, backed by a company with an “engineer down the hall” philosophy to help organizations select and implement the most effective solution.

 

VR-Forces provides a scalable computer-generated forces simulation engine capable of populating an environment with air and ground traffic, as well as infrastructure specific to traffic systems. There is plenty of out-of-the-box content of all shapes and sizes, from sUAS up to 747s in the air, and everything from human characters and bicyclists to fire trucks on the ground. If an out-of-the-box model needs to be modified to match local specifications, or if an agency wants to create its own from scratch, MAK’s open API allows for full customization of entity appearance and performance.

 

VR-Forces depicts volumetric airspace regulations, giving policymakers a three-dimensional perspective of air corridors and restricted spaces as they swell and shrink. Crucially, volumetric airspace restrictions can be assigned to impact air and ground traffic systems accordingly. For example, if there were an auto accident, set policies could dictate an air restriction in the area up to a certain height to provide space for UAS emergency response and redirect UAS traffic as long as necessary. At the same time, traffic on the ground within a particular radius may have its speed reduced, or lanes may be opened specifically for first responders to access the scene more readily.
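A minimal sketch of the geometry behind such a restriction, assuming a simple cylindrical volume over the incident site; the types are invented for illustration, and VR-Forces represents restrictions through its own scenario objects.

    // Invented types, for illustration only: a cylindrical airspace restriction
    // over an incident site, with a radius and a ceiling, and a check that a
    // UAS position violates it.
    #include <cmath>
    #include <cstdio>

    struct Restriction { double cx, cy; double radius, ceiling; };

    bool violates(const Restriction& r, double x, double y, double alt) {
        const double dist = std::hypot(x - r.cx, y - r.cy);
        return dist <= r.radius && alt <= r.ceiling;  // inside radius, below ceiling
    }

    int main() {
        Restriction crash{1000.0, 2000.0, 300.0, 120.0};  // 300 m radius, 120 m ceiling
        std::printf("delivery drone at 80 m: %s\n",
                    violates(crash, 1100.0, 2050.0, 80.0) ? "reroute" : "clear");
        std::printf("news helicopter at 450 m: %s\n",
                    violates(crash, 1100.0, 2050.0, 450.0) ? "reroute" : "clear");
    }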

 

Policymakers can calibrate the size and rules applied to air corridors and measure the impact of these changes on the traffic patterns of the city. VR-Forces is capable of depicting traffic density as it shifts with new incidents, even assigning color-coded density maps to better visualize areas of congestion in the air and on the ground.

 

VR-TheWorld allows policymakers to test these impacts inside any city for which they have the terrain data, through a web-based interface. This creates the most realistic testing lab for research and development projects.

 

Want to learn more? Have a look at the VR-Forces page for more information. Interested in seeing a demonstration?

 

 


Incident Management

 


Using the Power of Modeling & Simulation for First Responder Training, Emergency Response Preparedness, and Critical Infrastructure Protection

The homeland security, emergency response, and public safety communities face challenges similar to those dealt with in the military domain: they need to plan and train. But large-scale live simulations are simply too disruptive to be conducted with regularity. Catastrophic emergencies require coordination of local and state public safety personnel, emergency management personnel, the National Guard, and possibly the regular military. Interoperability is a major problem.

On a basic level, simulations require generic urban terrains with multi-story interior models, transportation infrastructure such as subways and airports, and the ability to simulate crowd behaviors and traffic. They may require terrains for specific urban areas or transportation infrastructure. Given the role of ubiquitous communications in the public sector, the ability to simulate communications networks (land-line, cell, data) and disruptions in them may also be important. For specialized emergency response training, the ability to simulate chemical, biological, and radiological dispersion may also be necessary.

The need for simulation and training in this domain is self-evident. The budgetary constraints are daunting for many agencies. The cost-effective solutions that VT MAK has developed for the defense community can provide immediate benefits to homeland security, emergency response, and public safety agencies.

  • First Responder Training
  • Emergency Response Planning
  • Perimeter Monitoring/Security
  • Human Behavior Studies

 

The MAK Advantage:

MAK can help you use simulation systems to keep your homeland secure. Here's how:

  • With VR-Link, VR-Exchange, and the MAK RTI: Link simulation components into simulation systems, or connect systems into world-wide interoperable distributed simulation networks.

  • With VR-Forces: Build and populate 3D simulation environments (a.k.a. virtual worlds), from vehicle or building interiors to urban terrain areas to the whole planet. Then simulate the mobility, dynamics, and behavior of people and vehicles, from individual role players to large-scale simulations involving tens of thousands of entities.

  • With VR-Vantage IG: Visualize the simulation to understand analytical results or participate in immersive experiences.


Instructor Operator Stations



Where does the Instructor Operator Station fit within the system architecture?

Training events are becoming larger and more widely distributed across networked environments. Yet staffing for these exercises is often static, or even decreasing. Therefore, instructors and operators need IOS systems to help manage their tasks, including designing scenarios, running exercises, providing real-time guidance and feedback, and conducting AAR.

Instructor Operator Stations (IOS) provide a central location from which instructors and operators can manage training simulations. An effective IOS enables seamless control of exercise start-up, execution, and After Action Review (AAR) across distributed systems. It automates many setup and execution tasks, and provides interfaces tailored to the simulation domain for tasks that are done manually.

How does MAK software fit within the Instructor Operator Station?

MAK has proven technologies that allow us to build and customize an IOS to meet your training system requirements.

  • Simulation Control Interface – Instructors can create and modify training scenarios. Execution of the scenarios may be distributed across one or more remote systems. The instructor or operator can dynamically inject events into a scenario to stimulate trainee responses, or otherwise guide a trainee’s actions during a training exercise. Core technology: VR-Forces Graphical User Interface
  • Situational Awareness – The MAK IOS includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big-picture, battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, and speed. Core technology: VR-Vantage.
  • Analysis & After Action Review – The MAK IOS supports pre-mission briefing and AAR / debriefing. It can record exercises and play them back. The instructor can annotate key events in real-time or post exercise, assess trainee performance, and generate debrief presentations and reports. The logged data can be exported to a variety of databases and analysis tools for data mining and performance assessment. Core technology: MAK Data Logger
  • Open Standards Compliance – The MAK IOS supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Core technology: VR-Link networking toolkit.
  • Simulated Voice Radios – Optionally includes services to communicate with trainees using real or simulated radios, VOIP, or text chat, as appropriate for the training environment.


Scenario/Threat-Generation Stations

Scenario/Threat Generators  


Where does the Scenario/Threat Generator fit within the system architecture?

Your job is to place a trainee or analyst in a realistic virtual environment in which they can train or experiment. It could be a hardware simulator, a battle lab, or even the actual equipment, configured for simulated input. Then you need to stimulate that environment with realistic events for the trainee to respond to. The stimulation usually comes from a scenario generator, also known as a threat generator. A scenario generator typically simulates the opposing force entities and complementary friendly force entities that the trainees need to interact with.

Trends

A scenario generator should allow training staff to quickly and easily design and develop scenarios that place trainees in a realistic situation. The application should use the proper terminology and concepts for the trainees’ knowledge domain. It should be flexible enough to handle the entire spectrum of simulation needs. The entities simulated by the scenario generator should be able to operate with enough autonomy that, once the simulation starts, they do not need constant attention from an instructor/operator, but they can be managed dynamically if necessary.

In addition to its basic capabilities, a scenario generator needs to be able to communicate with the simulator and other exercise participants using standard simulation protocols. It needs to be able to load the terrain databases and entity models that you need without forcing you to use some narrowly defined or proprietary set of formats. Finally, a scenario generator needs to work well with the visualization and data logging tools that are often used in simulation settings.

How does MAK software fit within the Scenario/Threat generator?

MAK will work with you so that you have the Scenario Generator that you need: a powerful and flexible simulation component for generating and executing battlefield scenarios, customized to meet the particular needs of your simulation domain. For example, higher-fidelity models for a particular platform can be added, new tasks can be implemented, or the graphical user interface can be customized to create the correct level of realism. Features include:

  • Scenario development – Staff can rapidly create and modify scenarios. Entities can be controlled directly as the scenario runs. Core Technology: VR-Forces
  • Situational Awareness – The visualization system includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big-picture, battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, and speed. Core technology: VR-Vantage.
  • Network modeling – The lab can simulate communications networks and the difficulties of real-world communications. Core technologies: Qualnet eXata and AGI SMART.
  • Correlated Terrain – VT MAK’s approach to terrain, terrain agility, ensures that you can use the terrain formats you need, when you need them. We can also help you develop custom terrains and can integrate them with correlated terrain solutions to ensure interoperability with other exercise participants. Core technologies: VR-TheWorld Server, VR-inTerra.
  • Sensor modeling – The visualization component can model visuals through different sensor spectrums, such as infrared and night vision. Core technology: JRM SensorFX.
  • Open Standards Compliance – VR-Forces supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.


Sensor Operator Station



Using the new Sensor Operator capability, a VR-Engage user can perform common surveillance and reconnaissance tasks, such as tracking fixed and moving targets, using a simulated E/O camera or IR sensor with configurable informational overlays. Control the gimbaled sensor immediately using joysticks or gamepads, or configure VR-Engage to work with sensor-specific hand controller devices. VR-Engage has built-in support for HLA/DIS radios, allowing sensor operators to communicate with pilots, ground personnel, or other trainees and role players using standard headsets.

VR-Engage's new Sensor Operator capability can fit into your larger simulation environment in a number of different ways to help meet a variety of training and experimentation requirements:

  • Attach a gimbaled sensor to any DIS or HLA entity, such as a UAV, ship, or manned aircraft - even if the entity itself is simulated by an existing 3rd party application.
  • When VR-Engage is used in conjunction with VR-Forces CGF, a role player can take manual control of a camera or sensor that has been configured on a VR-Forces entity.
  • For a full UAV Ground Control Station, use VR-Forces GUI to "pilot" the aircraft by assigning waypoints, routes, and missions; while using VR-Engage's Sensor Operator capability to control and view the sensor on a second screen.
  • Execute a multi-crew aircraft simulation using two copies of VR-Engage - one for the pilot to fly the aircraft using a standard HOTAS device or gamepad; and a second for the Sensor Operator.
  • Place fixed or user-controllable remote cameras directly onto the terrain, and stream the resulting simulated video into real security applications or command and control systems using open standards like H.264 or MPEG4.

VR-Engage comes with MAK’s built-in CameraFX module, which allows you to control blur, noise, gain, color, and many other camera or sensor post-processing effects; a simplified illustration follows. The optional SensorFX add-on can be used to increase the fidelity of an IR scene: SensorFX models the physics of light and its response to various materials and the environment, as well as the dynamic thermal response of engines, wheels, smokestacks, and more.
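As a rough illustration of what gain and noise post-processing do to a frame, here is generic image math (not the CameraFX API): amplify each pixel, add sensor noise, and clamp.

    // Generic illustration of camera post-processing effects of the kind
    // CameraFX exposes (gain and noise shown); not MAK code.
    #include <cstdint>
    #include <random>
    #include <vector>

    void applyGainAndNoise(std::vector<uint8_t>& gray, double gain, double noiseStd) {
        std::mt19937 rng(42);                           // fixed seed for repeatability
        std::normal_distribution<double> noise(0.0, noiseStd);
        for (auto& px : gray) {
            const double v = px * gain + noise(rng);    // amplify, then add noise
            px = static_cast<uint8_t>(v < 0.0 ? 0.0 : (v > 255.0 ? 255.0 : v));
        }
    }

    int main() {
        std::vector<uint8_t> frame(640 * 480, 60);      // dim, flat test frame
        applyGainAndNoise(frame, 2.5, 8.0);             // brighten a night scene
    }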


Virtual Simulators



Where do Virtual Simulators fit within the system architecture?

Virtual simulators are used for many different roles within Training, Experimentation, and R&D systems. 

  • Trainees – The primary interface for training vehicle and weapons operations, techniques, tactics, and procedures is often a Virtual Simulator. Pilots use flight simulators, ground forces use driving and gunnery trainers, soldiers use first-person-shooter simulators, etc.
  • Test Subjects – Virtual Simulators are used to test and evaluate virtual prototypes, study system designs, or analyze operator behavior.
  • In either case, the Virtual Simulator is connected to other simulators, instructor operator stations, and analysis tools using a distributed simulation network. Collectively, these systems present a rich synthetic environment for the benefit of the trainee or researcher.
  • Role Players – Scenario Generators, Computer Generated Forces, Threat Generators, or any other form of artificial intelligence (AI) can add entities to bring the simulated environment to life. But as good as AI can be, some exercises need real people to control supporting entities to make the training or analysis accurate. In these cases, Virtual Simulators can be used to add entities into the scenarios.

The fidelity of Virtual Simulators can vary widely to support the training objectives within available budgets. The Tech-Savvy Guide to Virtual Simulation goes into great detail about the fidelity of Virtual Simulators. Download the Tech-Savvy Guide here.


How does MAK software fit within a Virtual Simulator?

  • Multi-role Virtual Simulator – MAK's VR-Engage is a great place to start. We’ve done the work of integrating high-fidelity vehicle physics, sensors, weapons, first-person controls, and high-performance, game-quality graphics. VR-Engage lets users play the role of a first person human character; a vehicle driver, gunner, or commander; or the pilot of an airplane or helicopter. Use it as is, or customize it to the specifications of your training or experimentation. As with VR-Vantage and VR-Forces, VR-Engage is terrain agile, so you can use the terrain you have or take advantage of innovative streaming and procedural terrain techniques.
  • VR-Engage can run standalone, without requiring any other MAK products, and is fully interoperable with 3rd-party CGFs and other simulators through open standards. But many additional benefits apply when VR-Engage is used together with MAK’s VR-Forces: immediately share and reuse existing terrain, models, configurations, and other content across VR-Forces, VR-Vantage, and VR-Engage, with full correlation; author and manage scenarios in one place; and switch between player control and AI control of an entity at runtime.
  • Visual System – If you have your own vehicle simulation and need immersive 3D graphics, then VR-Vantage IG is the tool of choice for high-performance visual scenes for out-the-window visuals, sensor channels, and simulated UAS video feeds. VR-Vantage can be controlled through the Computer Image Generator Interface (CIGI) standard as well as the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.
  • SAR Simulation – If your simulator is all set but you need to simulate and send Synthetic Aperture Radar images to the cockpit, then RadarFX SAR Server can generate realistic SAR and ISAR scenes and send them over the network for you to integrate into your cockpit displays.
  • Network Interoperability – Developers who build the Virtual Simulator from scratch can take advantage of VR‑Link and the MAK RTI to connect the simulation to the network for interoperation with other simulation applications using the High Level Architecture (HLA), and Distributed Interactive Simulation (DIS) protocols.

 

 


Image Generators


Where does an Image Generator fit within the system architecture?

Image generators provide visual scenes of the simulation environment from the perspective of the participants. These can be displayed on hardware as simple as a desktop monitor or as complex as a multiple-projector dome display. The scenes can be rendered in the visible spectrum for "out-the-window" views or in other wavelengths to simulate optical sensors. In any case, the image generator must render scenes very quickly to maintain a realistic sense of motion.

The Image Generation section of The Tech-Savvy Guide to Virtual Simulation goes into great detail about the Image Generator and other components of a simulation system. Download the Tech-Savvy Guide here.


How does MAK software fit within an Image Generator?

VR-Vantage is the core of MAK’s image generation solution. It uses best-of-breed technologies to tailor high-quality visual scenes of the synthetic environment to your specific simulation needs. VR-Vantage IG can be used as a standalone image generator, connected to a host simulation via the CIGI, DIS, or HLA protocols. VR-Vantage is also the rendering engine for all of MAK’s graphics applications, including VR-Engage (multi-role virtual simulator), VR-Forces (computer generated forces), and VR-Vantage Stealth (battlefield visualization).

  • Multi-channel distributed rendering – VR-Vantage IG takes advantage of the latest NVIDIA graphics processing units (GPUs) to render to one or more displays per computer. When very large fields of view are needed, VR-Vantage can be set up to distribute the rendering task across multiple computers, each running the VR-Vantage Display Engine. When tight frame synchronization is needed between channels, VR-Vantage supports NVIDIA's professional graphics cards that perform hardware synchronization to an external synchronization source (also known as G-Sync).
  • Distortion Correction – When rendering to complex surface shapes, like those used to surround a cockpit with "out-the-window" video, VR-Vantage IG includes a plugin made by Scalable Display Technologies that warps the image to match the shape of the display surface.
  • High Fidelity Sensor Visualization – VR-Vantage IG can optionally be configured with SensorFX to convert the visual scene into an accurately rendered sensor scene, simulating night-vision goggles, infrared, and other optical wavelengths.
  • Correlated Terrain – VT MAK's approach to terrain – terrain agility – ensures that you can use the terrain formats you need, when you need them. VR-Vantage, VR-Engage, and VR-Forces all support many of the terrain technologies used throughout the industry. We can also help you develop custom terrains and integrate them with correlated terrain solutions to ensure interoperability with other exercise participants.
  • Open Standards Compliance – VR-Vantage supports the Computer Image Generator Interface (CIGI) standard as well as the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. A host-side CIGI sketch follows this list.
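Because CIGI is an open standard, any host simulation can drive VR-Vantage IG. The fragment below shows the shape of a host-side frame using the open-source CIGI Class Library (CCL); the class and method names follow CCL 3.x sample code and should be verified against your CCL release, and the UDP send is omitted.

```cpp
// Sketch of a CIGI host frame built with the open-source CIGI Class
// Library (CCL). Names follow CCL 3.x samples (e.g., MiniHost);
// verify against your CCL release. Sending the packed buffer over
// UDP is left out.
#include "CigiHostSession.h"
#include "CigiOutgoingMsg.h"
#include "CigiIGCtrlV3.h"
#include "CigiEntityCtrlV3.h"

int main()
{
    // Buffer counts/sizes here are typical sample values.
    CigiHostSession session(1, 32768, 1, 32768);
    session.SetCigiVersion(3, 0);
    session.SetSynchronous(true);

    CigiOutgoingMsg& out = session.GetOutgoingMsgMgr();

    CigiIGCtrlV3 igCtrl;              // per-frame IG control packet
    CigiEntityCtrlV3 ownship;         // ownship position for this frame
    ownship.SetEntityID(0);
    ownship.SetEntityType(0);
    ownship.SetLat(35.0);
    ownship.SetLon(-117.0);
    ownship.SetAlt(1500.0);

    out.BeginMsg();
    out << igCtrl << ownship;         // pack packets into the message

    Cigi_uint8* buffer = 0;
    int length = 0;
    out.PackageMsg(&buffer, length);
    // ... send buffer/length to the IG over UDP here ...
    out.FreeMsg();
    return 0;
}
```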

Product Legend

Unmanned Vehicle System (UVS) Simulation


Where does Unmanned Vehicle System (UVS) Simulation fit within the system architecture?

Since unmanned vehicles are by their nature controlled remotely, simulation systems can be designed and built to closely model the actual control stations used in live operation. An Unmanned Vehicle System (UVS) simulation, also known as a Remotely Piloted Vehicle (RPV) simulation, can run as a stand-alone system or be integrated with, or embedded into, a UVS simulator or ground control station.

Because unmanned vehicle systems have become increasingly prevalent in all aspects of defense and homeland security, there is a growing need for simulations that support every phase of the development life cycle, including:

  • Demonstration – Visualizing a new design helps confirm its value. UVS simulations can demonstrate new vehicle designs or concepts within a synthetic environment.
  • Experimentation – Simulations help prove and refine new concepts or Tactics, Techniques, and Procedures (TTPs). Simulated UVSs are used in complex scenarios as part of realistic simulations that are linked to real systems, hardware, and other human-in-the-loop simulators.
  • Research & Development – Unmanned Vehicle System simulations can be used to test guidance, navigation, and control functions of a new or modified UVS without the risk of harming people or property that is inherent in live testing. UVS simulations can provide realistic avionics models, sensor models, and visuals, and emulate real-world controls and communication systems. 
  • Training – Simulations allow pilots, sensor/payload operators, mission commanders, and visual intelligence analysts to train, practice, and analyze decision-making and communication processes.

How does MAK software fit within Unmanned Vehicle System (UVS) Simulation?

  • Control Station – VR-Forces can form the basis of the UVS control station by providing 2D tactical map displays for operator planning and simulated remote operation of the vehicle. 
  • Vehicle Simulation – VR-Forces can also be the simulation platform for the unmanned vehicle, either by treating the vehicle as a standard Computer Generated Forces (CGF) entity or by plugging in third-party modules with higher-fidelity vehicle dynamics models. Developers can also extend VR-Forces to use their own vehicle dynamics and control system models (a toy guidance sketch follows this list).
  • Scenario Generation – Regardless of how the unmanned vehicle is simulated, VR-Forces provides a powerful, easy-to-use scenario generation capability for creating operationally relevant scenarios.
  • Sensor Simulation – VR-Vantage IG and SensorFX provide realistic, accurate 3D perspective scenes that model electro-optical (EO), night-vision (NV), and infrared (IR) sensors.
  • Networking – Developers who build the UAV simulation from scratch can take advantage of VR-Link and the MAK RTI to connect the simulation to the network for interoperation with other simulation applications using the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.
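The control laws themselves can be prototyped without any MAK API at all. As a toy illustration of the kind of guidance model a developer might later wire into VR-Forces through its extension interfaces, the sketch below (entirely our own types and names, not a VR-Forces API) steers a 2D vehicle toward a waypoint at a limited turn rate.

```cpp
// Illustrative only: a tiny 2D waypoint-seeking guidance step of the
// kind a custom UAV model might implement before being integrated
// into a CGF. All types and names here are ours, not a VR-Forces
// interface.
#include <cmath>
#include <cstdio>

struct UavState { double x, y, heading; };   // meters, meters, radians

// One guidance step: turn toward the waypoint at a limited rate,
// then advance at constant speed.
void guidanceStep(UavState& s, double wx, double wy,
                  double speed, double maxTurnRate, double dt)
{
    const double twoPi = 6.283185307179586;
    double desired = std::atan2(wy - s.y, wx - s.x);
    double err = std::remainder(desired - s.heading, twoPi); // wrap to [-pi, pi]
    double maxDelta = maxTurnRate * dt;
    if (err >  maxDelta) err =  maxDelta;
    if (err < -maxDelta) err = -maxDelta;
    s.heading += err;
    s.x += speed * dt * std::cos(s.heading);
    s.y += speed * dt * std::sin(s.heading);
}

int main()
{
    UavState uav{0.0, 0.0, 0.0};
    // Fly toward a waypoint 1 km east, 500 m north: 60 s at 10 Hz.
    for (int i = 0; i < 600; ++i)
        guidanceStep(uav, 1000.0, 500.0, 30.0, 0.2, 0.1);
    std::printf("position: (%.1f, %.1f) m\n", uav.x, uav.y);
    return 0;
}
```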

Product Legend

After Action Review (AAR) / Debrief Systems


Where does the AAR/Debrief component fit within the system architecture?

After Action Review (AAR) systems provide feedback on mission and task performance during training exercises. AAR is a critical part of the training process – it is a structured review, or debrief, for analyzing what happened during an exercise and why it happened. By comparing the actual events with the expected outcomes, instructors and students can identify strengths and weaknesses and decide how to improve performance.

As training exercises become larger, the need grows for automated tools that capture the actions of all training participants, evaluate performance against standard metrics, and provide the information necessary to support a structured debrief of the training event. These tools should be flexible enough to support chronological reviews of an entire exercise or tightly focused reviews highlighting a few key issues.

How does MAK software fit within the AAR/Debrief component?

  • Simulation Recording and Playback – Simulation data can be recorded and played back using the MAK Data Logger. This includes simulated entities and interactions, audio communications, video streams, and other events injected into the simulation. The Data Logger user can pause, slow down, or speed up the playback, and can jump to specific moments during the exercise via a simple, easy-to-use, DVR-like control panel.

  • Annotation – The embedded annotation system enables instructors to bookmark key events during the action and then jump to those events during playback to review specific actions.
  • 2D map and 3D out-the-window visualization – VR-Vantage Stealth provides 2D tactical map displays for situational awareness, while 3D "god's-eye" views create immersive scenes of the simulated environment from the perspective of any trainee. An exaggerated-reality mode combines the best of 2D and 3D techniques into a compelling visual analysis of the training exercise.
  • Analytical Information – Users can track a specific trainee or groups of trainees and view the scene from various perspectives. Informational overlays including trajectory histories, unit status, attacker-target lines, and other tactical graphics further aid in understanding the training scenario as it plays out.
  • Data Mining – The recorded training events can be exported to standard database and spreadsheet applications for data mining, in-depth analysis, and charting or graphing of specific datasets (see the sketch after this list).
  • Open Standards Compliance – MAK software applications are all built with the VR-Link networking toolkit, which supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Developers can use or extend VR-Link to present and capture simulation-specific information.
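As a concrete (and entirely hypothetical) example of the data-mining path, suppose the logged events have been exported to a CSV file with time, entity, and event-type columns; a short pass over the file can then tally fire events per entity for a debrief chart. The file name and column layout below are placeholders, not a documented Data Logger export format.

```cpp
// Illustrative only: tallying events from a CSV export of logged
// simulation data. The file name and column layout (time, entity,
// event type) are hypothetical placeholders.
#include <fstream>
#include <iostream>
#include <map>
#include <sstream>
#include <string>

int main()
{
    std::ifstream in("exercise_events.csv");     // hypothetical export
    std::map<std::string, int> shotsByEntity;
    std::string line;
    std::getline(in, line);                      // skip header row
    while (std::getline(in, line))
    {
        std::istringstream row(line);
        std::string time, entity, eventType;
        std::getline(row, time, ',');
        std::getline(row, entity, ',');
        std::getline(row, eventType, ',');
        if (eventType == "FirePdu")              // count fire events
            ++shotsByEntity[entity];
    }
    for (const auto& kv : shotsByEntity)
        std::cout << kv.first << " fired " << kv.second << " times\n";
    return 0;
}
```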

Product Legend

Terrain Data Servers

Where does the Terrain Data Server component fit within the system architecture?

Distributed simulation designs are constructed by combining loosely coupled, interoperable components. However, while the simulation applications are distributed, terrain data usually is not. Each simulation application typically accesses terrain data stored on its local machine. This approach limits terrain interoperability and reuse: it constrains the amount of data that can be accessed to what can be stored locally and necessitates redundant copying of data from one machine to another. Terrain servers solve this problem by storing large amounts of terrain data and supplying it over the network to client applications.

In addition to the problem of terrain availability, simulations that aspire to high levels of realism must deal with the problem of static terrain. The world is not static. Actions of simulated objects can affect aspects of the environment and these changes need to be available to all participants. Bombs can create craters; vehicles can form berms to provide fortified positions; objects can be destroyed and produce debris. When terrain data is stored locally, there is no good way to propagate terrain changes to exercise participants. A dynamic terrain server solves this issue.

How does MAK software fit within the Terrain Data Server component?

MAK's dynamic terrain server, VR-TheWorld, enables terrain data to be accessed as a shared service that can be seamlessly and interactively viewed from anywhere, at any level of detail, in a variety of formats. The server provides the foundation for current simulations, facilitates centralized management and future scalability, and enables dynamic terrain changes to be propagated to simulation applications; it also lets you interactively stream and view on-demand data in real-time simulations. Simulation users can spin the globe, zoom into a location, drop units, and start simulating.

The server supports the concept of “stateful” objects within the terrain, such as destructible buildings, doors that can be opened or breached, and trees that can fall and block movement along roads. When the state of a terrain object changes, that change is immediately available to all applications using the terrain server, allowing entities to interact appropriately. Furthermore, the terrain skin can be deformable, enabling simulation-specific changes such as craters to be seamlessly “stitched” into the terrain.  

All the terrain details are accessible via open standards, as sketched below.
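For example, a client that wants imagery or elevation for its own display can issue an ordinary OGC-style web request. The host, layer name, and parameter values below are placeholders, not a documented VR-TheWorld URL; the libcurl calls themselves are standard.

```cpp
// Illustrative only: fetching a map tile from a terrain server that
// exposes an OGC WMS-style endpoint. The host, layer, and bounding
// box are placeholder values.
#include <curl/curl.h>
#include <cstdio>
#include <string>

static size_t writeToFile(void* data, size_t size, size_t n, void* fp)
{
    return std::fwrite(data, size, n, static_cast<std::FILE*>(fp));
}

int main()
{
    const std::string url =
        "http://terrain-server.example/wms?SERVICE=WMS&REQUEST=GetMap"
        "&VERSION=1.1.1&LAYERS=imagery&SRS=EPSG:4326"
        "&BBOX=-117.2,34.0,-117.1,34.1&WIDTH=512&HEIGHT=512"
        "&FORMAT=image/png";

    CURL* curl = curl_easy_init();
    if (!curl) return 1;
    std::FILE* out = std::fopen("tile.png", "wb");

    curl_easy_setopt(curl, CURLOPT_URL, url.c_str());
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, writeToFile);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, out);
    CURLcode rc = curl_easy_perform(curl);   // download the tile

    std::fclose(out);
    curl_easy_cleanup(curl);
    return rc == CURLE_OK ? 0 : 1;
}
```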

Product Legend

Dynamic Terrain Server