Virtual Training

Using the Power of Modeling & Simulation for First Responder Training, Emergency Response Preparedness, and Critical Infrastructure Protection

The homeland security, emergency response, and public safety communities face challenges similar to those in the military domain: they need to plan and train. But large-scale live exercises are simply too disruptive to conduct with regularity. Catastrophic emergencies require coordination among local and state public safety personnel, emergency management personnel, the National Guard, and possibly the regular military. Interoperability is a major problem.

Virtual Training Solutions

Check out some of the ways people are using MÄK's products for Defense and Homeland Security today:

Air Mission Operations Training Center

Air Mission Operations Training Centers are large systems focused on training aircraft pilots and the teams of people needed to conduct air missions.  

To make the simulation environment valid for training, simulators are needed to fill the virtual world with mission support units, opposing forces & threats, and civilian patterns of life.

Depending on the specifics of each training exercise, the fidelity of each simulation can range from completely autonomous computer generated forces, to desktop role player stations, to fully immersive training simulators.

Scroll down to watch a video on how VT MÄK’s simulation technology fits into an air mission operations training center.

The MÄK Advantage:

MÄK technologies can be deployed in many places within an air mission operations training center. Here are quick links to the relevant products: VR-Forces, VR-Vantage IG, VR-Link, the MÄK Data Logger, and the MÄK RTI.

VT MÄK provides a powerful and flexible computer generated forces simulation, VR-Forces. It is used to manage air, land, and sea missions, as well as civilian activity, and can be the ‘one CGF’ for all operational domains.

Desktop role players and targeted fidelity simulators are used where human players are needed to increase fidelity and represent tactically precise decision making and behavior.

Remote simulation centers connect over long-haul networks to participate when specific trials need the fidelity of those high-value simulation assets. MÄK offers an interoperability solution that facilitates a common, extensible simulation architecture based on international standards: VR-Link helps developers build DIS and HLA support into their simulations; VR-Exchange connects simulations even when they use differing protocols; and the MÄK RTI provides the high-performance infrastructure for HLA networking.
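
To make the interoperability layer concrete, here is a minimal sketch of what DIS traffic looks like at the byte level: the fixed 12-byte PDU header and the entity identifier record that begin every Entity State PDU under IEEE 1278.1. All numeric IDs below are illustrative; this is exactly the bookkeeping that a toolkit like VR-Link hides behind its protocol-independent API.

    // Hand-encoding the 12-byte DIS PDU header plus the Entity ID record
    // that begins an Entity State PDU (IEEE 1278.1). Illustrative values.
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Append values in network (big-endian) byte order.
    static void putU8(std::vector<uint8_t>& b, uint8_t v)   { b.push_back(v); }
    static void putU16(std::vector<uint8_t>& b, uint16_t v) {
        b.push_back(uint8_t(v >> 8)); b.push_back(uint8_t(v));
    }
    static void putU32(std::vector<uint8_t>& b, uint32_t v) {
        for (int s = 24; s >= 0; s -= 8) b.push_back(uint8_t(v >> s));
    }

    int main() {
        std::vector<uint8_t> pdu;
        // PDU header (12 bytes).
        putU8(pdu, 6);      // protocol version: 6 = IEEE 1278.1a-1998
        putU8(pdu, 1);      // exercise ID (illustrative)
        putU8(pdu, 1);      // PDU type: 1 = Entity State
        putU8(pdu, 1);      // protocol family: 1 = Entity Information
        putU32(pdu, 0);     // timestamp (real apps encode simulation time)
        putU16(pdu, 144);   // length of a basic Entity State PDU, in bytes
        putU16(pdu, 0);     // padding
        // Entity ID record: site, application, entity (illustrative).
        putU16(pdu, 1);
        putU16(pdu, 3001);
        putU16(pdu, 42);

        for (uint8_t byte : pdu) std::printf("%02x ", byte);
        std::printf("\n");
        return 0;
    }

A full Entity State PDU continues with force ID, entity type, world-coordinate position, velocity, dead-reckoning parameters, and markings, none of which you write by hand when a toolkit publishes entities for you.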

Local simulators, based on MÄK’s VR-Engage, take the place of remote simulations when connecting to remote facilities is not needed. VR-Engage lets users play the role of a first-person human character; a vehicle driver, gunner, or commander; or the pilot of an airplane or helicopter.

VR-Engage can be used for role player stations, or as the basis for targeted-fidelity or high-fidelity simulators.

MÄK products are meant to be used in complex simulation environments — interoperating with simulations built by customers and other vendors. However, big efficiencies are gained by choosing MÄK products as the core of your simulation environment.

Get ahead of the game with VT MÄK.

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 

Close Air Support: JTAC Training

As part of a Tactical Air Control Party (TACP), only the Joint Terminal Attack Controller (JTAC) is authorized to say CLEARED HOT on the radio and direct aircraft to deliver their ordnance on a target.

JTACs are relied on to direct and coordinate close air support missions, advise commanders on matters pertaining to air support, and observe and report the results of strikes. Their ability to communicate effectively with pilots and coordinate accurate air strikes can play a huge role in the success of a mission.

Virtual training systems allow JTACs to practice identifying targets, calibrating their locations, requesting air support, and the highly specialized procedures for communicating with pilots.

Scroll down to watch a video on how VT MÄK’s simulation technology comes together to make up a JTAC simulator.

The MÄK Advantage:

The JTAC simulator in this use case takes advantage of simulations built on MÄK’s core technologies: VR-Forces, VR-Vantage IG, and VR-Link, the MÄK Data Logger and the MÄK RTI.

The tight coupling of system components provides a rich simulation environment for each participant. The JTAC simulation is rendered in the dome using VR-Vantage; the flight simulation takes advantage of the VR-Forces first-person simulation engine; and the instructor/role player station uses VR-Forces CGF to populate the synthetic environment and control the training scenarios.

All these system components share a common terrain database and are connected using VR-Link and the MÄK RTI, giving the system integrator the ability to deploy reliably and cost-effectively while leaving open the opportunity to expand the system into bigger and more complex networks of live, virtual, and constructive simulations.

Choosing MÄK for your simulation infrastructure gives you state-of-the-art technology and the renowned ‘engineer down the hall’ technical support that has been the foundation of MÄK’s culture since its beginnings.

Capabilities the core technologies bring to the simulators:

JTAC Dome — Built with VR-Vantage

  • Game/Simulator Quality Graphics and Rendering Techniques

    VR-Vantage uses the most modern image rendering and shader techniques to take advantage of the increasing power of NVIDIA graphics cards. VT MÄK's Image Generator has real-time visual effects to rival any modern IG or game engine.

  • Multi-Channel Rendering

    Support for multi-channel rendering is built in. Depending on system design choices for performance and number of computers deployed, VR-Vantage can render multiple channels from a single graphics processor (GPU) or can render channels on separate computers using Remote Display Engines attached to a master IG channel.

  • 3D Content to Represent Players and Interactions

    VR-Vantage is loaded with content including 3D models of all vehicle types, human characters, weapon systems, and destroyable buildings. Visual effects are provided for weapons engagements including particle systems for signal smoke, weapon fire, detonations, fire, and smoke.

  • Terrain Agility

    All MÄK’s simulation and visualization products are designed to be terrain agile, meaning they can support most of the terrain strategies commonly used in the modeling, simulation & training industry. Look here for technical details and a list of the formats supported.

  • Environmental Modeling

    VR-Vantage can render scenes of the terrain and environment with the realism of proper lighting, day or night, including illuminated light sources and shadows; atmospheric and water effects, including multiple cloud layers and dynamic oceans; and trees and grass that move naturally in the wind.

  • Sensor Modeling

    VR-Vantage can render scenes in all wavelengths: night vision, infrared, and visible (as needed on a JTAC’s dome display). Sensor zooming, depth-of-field effects, and reticle overlays model the use of binoculars and laser range finders.

Flight Simulator — Built with VR-Forces & VR-Vantage

  • Flight Dynamics

    A high-fidelity, physics-based aerodynamics model provides accurate flight controls using game- or professional-level hands-on throttle and stick (HOTAS) controls.

  • Air to Ground Engagements

    Sensors: targeting pod (an IR camera with gimbal and overlay) and SAR request/response (requires RadarFX Server). Weapons: missiles, guns, and bombs.

  • Navigation

    Standard six-pack navigation displays and multi-function display (MFD) navigation chart.

  • Image Generator

    All the same VR-Vantage-based IG capabilities are available in a flight simulator/role player station as in the JTAC’s dome display, with the flexibility to configure as needed: single screen (OTW + controls + HUD), dual screen (OTW + HUD, controls), or multi-screen OTW (using remote display engines).

  • Integration with IOS & JTAC

    The flight simulator is integrated with the VR-Forces-based IOS so the instructor can initialize the combat air patrol (CAP) mission appropriately in preparation for the close air support (CAS) mission called by the JTAC. All flights are captured by the MÄK Data Logger for after action review (AAR) analysis and debriefing. Radios are provided that communicate over the DIS or HLA simulation infrastructure and are recorded by the MÄK Data Logger for AAR.

Instructor Operator Station — Built with VR-Forces CGF

VR-Forces is a powerful, scalable, flexible, and easy-to-use computer generated forces (CGF) simulation system used as the basis of Threat Generators and Instructor Operator Stations (IOS).

  • Scenario Definition

    VR-Forces comes with a rich set of capabilities that enable instructors to create, execute, and distribute simulation scenarios. Using its intuitive interfaces, they can build scenarios that scale from just a few individuals in close quarters to large multi-echelon simulations covering the entire theater of operations. The user interface can be used as-is or customized for a training specific look and feel.

  • Training Exercise Management

    All of the entities defined by a VR-Forces scenario can be interactively manipulated in real-time while the training is ongoing. Instructors can choose from:

    Direct control, where new entities can be created on the fly, or existing entities can be moved into position and their status, rules of engagement, or tasking changed at will. Some call an instructor using this method a “puckster”.

    Artificial Intelligence (AI) control, where entities are given tasks to execute missions, like close air support (CAS), suppressive fire, or attack with guns. While entities are on a mission, reactive tasks deal with contingencies as the CGF AI plays out the mission. In games, these are sometimes called “non-player characters”. (A conceptual sketch of this tasking pattern follows this list.)

    First person control, where the instructor takes interactive control of a vehicle or human character and moves it around and engages with other entities using input devices.

  • 2D & 3D Viewing Control

    When creating training scenarios, the VR-Forces GUI allows instructors to quickly switch between 2D and 3D views.

    The 2D view provides a dynamic map display of the simulated world and is the most productive for laying down entities and tactical graphics that help to control the AI of those entities.

    The 3D views provide intuitive, immersive situational awareness and allow precise placement of simulation objects on the terrain. Users can quickly and easily switch between display modes or open a secondary window and use a different mode in each one.
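
To make the AI-control mode above concrete, here is a conceptual sketch of the tasking pattern, with invented types rather than the VR-Forces API: an entity works through an ordered mission task list, and a reactive task preempts the current task when a contingency trigger fires, after which the mission resumes.

    // Conceptual sketch of mission tasks plus reactive tasks for a CGF
    // entity. Invented types for illustration -- not the VR-Forces API.
    #include <deque>
    #include <functional>
    #include <iostream>
    #include <string>

    struct Entity { std::string name; bool threatDetected = false; };
    struct Task { std::string name; int ticksRemaining = 0; };

    class TaskedEntity {
    public:
        Entity state;
        std::deque<Task> mission;                    // ordered mission tasks
        std::function<bool(const Entity&)> trigger;  // contingency condition
        Task reaction{"take-cover", 2};              // reactive task

        void tick() {
            // A reactive task preempts the current mission task...
            if (!reacting && trigger && trigger(state)) {
                if (active.ticksRemaining > 0)
                    mission.push_front(active);      // ...which resumes later
                active = reaction;
                reacting = true;
                std::cout << state.name << ": REACT " << active.name << "\n";
            }
            // Pull the next mission task when the current one is done.
            if (!reacting && active.ticksRemaining <= 0 && !mission.empty()) {
                active = mission.front();
                mission.pop_front();
                std::cout << state.name << ": start " << active.name << "\n";
            }
            // Advance the active task by one simulation tick.
            if (active.ticksRemaining > 0 && --active.ticksRemaining == 0 && reacting) {
                reacting = false;
                state.threatDetected = false;        // contingency handled
            }
        }

    private:
        Task active{"idle", 0};
        bool reacting = false;
    };

    int main() {
        TaskedEntity flight;
        flight.state.name = "strike-flight";
        flight.mission = {{"ingress", 3}, {"attack-with-guns", 2}, {"egress", 3}};
        flight.trigger = [](const Entity& e) { return e.threatDetected; };

        for (int t = 0; t < 12; ++t) {
            if (t == 4) flight.state.threatDetected = true;  // threat pops up
            flight.tick();
        }
        return 0;
    }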

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 

UAV Surveillance and Operations Training with VR-Forces

What’s at Stake?


You are tasked with training a team of sensor payload operators to use UAVs for urban reconnaissance missions in a specific city. Upon completion of training, trainees must be able to comb an area for a target, make a positive identification, monitor behavior and interactions, radio in an airstrike, and then report on the outcome.
An ineffective training environment can lead to additional costs, lost targets, and inefficient surveillance systems. Training with a robust solution strengthens homeland security human resources for a minimal product investment.

What Are We Building?


As the instructor, you need to mock up a ground control station with accurate pilot/payload operator role definitions and supply that system with surveillance data from a content-rich simulation environment. You need to construct a scene that is informative, while providing trainees with opportunities to develop their instincts and test their operating procedures based on how the scenario unfolds.
Each UAV must be equipped with an electro-optical camera as well as an infrared sensor mounted to a gimbal. Radio communication between the UAV operators and a central command center must be available to coordinate surveillance and call in airstrikes.
Trainees need to experience the scenario through the electro-optical sensor and infrared sensor with rich, accurate data overlays to provide them with the information they need to communicate positioning and targeting effectively.
Your urban environment requires crowds of people who behave in realistic ways and traverse the city in intelligent paths. When a UAV operator spots someone, they need to be able to lock onto them when they are in motion to mimic algorithmic tracking tools.
The simulation needs to be adjustable in real time so that the instructor can minimize repeat behaviors and walk the team through different scenarios. Instructors also must be able to judge the effectiveness of a trainee’s technique.

MÄK’s Solution


In this particular case, VR-Forces provides all the software you need to bring your environment to life.

VR-Forces is an ideal tool for scenario development. It can model UAVs in fine detail, while allowing instructors to customize those entities based on the scope of a mission. It’s simple to add the gimbal-mounted sensor array that we need for this scenario and define its parameters, including zoom, zoom speed, slew rate, and gimbal stops.
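
As a rough illustration of what those parameters mean at runtime, here is a hypothetical sketch (invented types, not the VR-Forces data model): a commanded gimbal pitch is clamped to the mechanical stops, and per-step motion is limited by the slew rate.

    // Hypothetical gimbal parameter model: mechanical stops plus a slew
    // rate limit, applied each simulation step. Not the VR-Forces API.
    #include <algorithm>
    #include <cstdio>

    struct GimbalLimits {
        double minPitchDeg = -110.0;     // stop: slightly past straight down
        double maxPitchDeg =   10.0;     // stop: slightly above the horizon
        double maxSlewDegPerSec = 30.0;  // slew rate limit
    };

    // Move the current pitch toward the commanded pitch, respecting both
    // the slew-rate limit and the mechanical stops.
    double stepGimbalPitch(double currentDeg, double commandedDeg,
                           double dtSec, const GimbalLimits& lim) {
        double target = std::clamp(commandedDeg, lim.minPitchDeg, lim.maxPitchDeg);
        double maxStep = lim.maxSlewDegPerSec * dtSec;
        double delta = std::clamp(target - currentDeg, -maxStep, maxStep);
        return currentDeg + delta;
    }

    int main() {
        GimbalLimits lim;
        double pitch = 0.0;
        // Operator commands a hard look-down; gimbal slews at 30 deg/s.
        for (int i = 0; i < 5; ++i) {
            pitch = stepGimbalPitch(pitch, -90.0, 0.5, lim);
            std::printf("t=%.1fs pitch=%.1f deg\n", (i + 1) * 0.5, pitch);
        }
        return 0;
    }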

Easily populate an urban environment with people by using the group objects function to add crowds of entities at a time. VR-Forces has features from Autodesk's Gameware built in, enabling Pattern of Life intelligent flows of people and vehicles, in addition to plotting the locations and tasks of individual entities. Pattern of Life lets you manipulate patterns within the scenario, including realistic background traffic, whether it’s pedestrian, road, or air traffic. Certain DI-Guy capabilities have been integrated into VR-Forces, meaning behavior modeling is more authentic thanks to motion capture technology. Now you can train your team to look out for suspicious movements and calibrate their responses based on the actions of the target.

Sensor modeling is a point of strength for VR-Forces. Give your trainees a detailed point of view of the scene through the electro-optical sensor, and provide a high-fidelity infrared sensor display when the daylight fades. VR-Forces adds accurate data overlays so that trainees can learn to quickly and accurately read and report that information. Instructors can visualize 3D volumetric view frustums to assess trainees’ combing strategies and any gaps in coverage, and to engineer surveillance systems. Sensor tracking is modeled so operators can lock onto targets whether they are moving or at a fixed location.
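
Simple geometry sits behind the frustum-coverage idea. Over flat terrain, a ray at depression angle d below the horizon from altitude h reaches the ground at range h/tan(d), so a camera with vertical field of view f centered at depression angle t covers ground ranges from h/tan(t + f/2) to h/tan(t - f/2). The sketch below computes that footprint with illustrative values.

    // Ground footprint of a gimbaled camera over flat terrain. A line of
    // sight at depression angle d (below the horizon) from altitude h hits
    // the ground at horizontal range h / tan(d). Illustrative values only.
    #include <cmath>
    #include <cstdio>

    int main() {
        const double h = 1500.0;       // UAV altitude above ground, meters
        const double thetaDeg = 30.0;  // gimbal depression angle, degrees
        const double fovDeg = 10.0;    // vertical field of view, degrees
        const double d2r = 3.14159265358979323846 / 180.0;

        double nearEdgeDeg = thetaDeg + fovDeg / 2.0; // steepest ray: nearest
        double farEdgeDeg  = thetaDeg - fovDeg / 2.0; // shallowest: farthest

        if (farEdgeDeg <= 0.0) {
            std::printf("Frustum edge at/above horizon: footprint unbounded\n");
            return 0;
        }
        double nearRange = h / std::tan(nearEdgeDeg * d2r);
        double farRange  = h / std::tan(farEdgeDeg * d2r);
        std::printf("Footprint: %.0f m to %.0f m downrange (%.0f m deep)\n",
                    nearRange, farRange, farRange - nearRange);
        return 0;
    }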

What really makes VR-Forces perfect for training is the ability of instructors to manipulate the scenario in real time. You can keep your trainees from running scenarios that are too predictable by having your target enter buildings, change his mode of transportation, or actively attempt to avoid detection, all during live action.

Interested in Learning More?


Look at the VR-Forces page for more information, or request a demo.

Virtual Role Player Stations

What's at stake?

Your challenge is to create a place where a human player can step into a virtual world and control a character, vehicle, sensor, or weapon system. Whether the player is sitting in a virtual cockpit flying a flight simulator, driving a desktop tank or HMMWV simulator with a joystick, or navigating a human character through a building with a mouse-and-keyboard-based game interface, that person is a player in the game.

Distributed, virtual simulation environments are built to achieve a variety of goals – training, mission rehearsal, experimentation, concept demonstration, analysis and more. But regardless of the objective, nearly all of these systems incorporate some kind of virtual player station. In virtual training systems, the trainees themselves typically operate virtual player stations, but instructors or white cell role players might also control individual simulated entities using similar applications.

In a Battle Lab environment, simulation engineers or domain experts might evaluate a proposed change to a vehicle system or battlefield doctrine by play-testing it in the virtual world. And in an operational environment, a soldier might familiarize himself with the terrain or mission ahead by driving through an accurately modeled 3D representation of a town or by performing a simulated UAS flyover.

Each virtual player station is tailored to the role, user interface, fidelity, and display system that are appropriate for the task at hand. But a wide variety of applications share a common, core set of requirements and capabilities:

  • Real-time 3D image generation, with 1st person “out-the-window” and 3rd person views
  • Vehicle or avatar navigation and control
  • Terrain database import and interaction
  • Human-machine interfaces (dashboards, heads-up displays, instrumentation, etc.)
  • Interoperability with other Player Stations, SAFs, and constructive simulations (see the dead-reckoning sketch after this list)
  • Weather and environment modeling
  • Physics (vehicle dynamics and collision with the 3D geometry)
  • Sensor modeling and visualization
  • Weapons modeling, including fly-out
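
One concrete piece of the interoperability requirement above is dead reckoning: between entity-state updates, each receiver extrapolates remote entities locally using the algorithm advertised in the sender's Entity State PDU. Here is a minimal sketch of the two simplest world-coordinate algorithms from the DIS standard.

    // Dead reckoning as used in DIS/HLA interoperability: receivers
    // extrapolate a remote entity between state updates. Shown here are
    // the two simplest world-coordinate algorithms from IEEE 1278.1:
    // DRM(F,P,W), constant velocity, and DRM(F,V,W), which adds
    // constant acceleration.
    #include <cstdio>

    struct Vec3 { double x, y, z; };

    static Vec3 add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
    static Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }

    // DRM(F,P,W): p(t) = p0 + v0 * t
    Vec3 drFPW(Vec3 p0, Vec3 v0, double t) {
        return add(p0, scale(v0, t));
    }

    // DRM(F,V,W): p(t) = p0 + v0 * t + 0.5 * a0 * t^2
    Vec3 drFVW(Vec3 p0, Vec3 v0, Vec3 a0, double t) {
        return add(add(p0, scale(v0, t)), scale(a0, 0.5 * t * t));
    }

    int main() {
        Vec3 p0{0, 0, 0}, v0{20, 0, 0}, a0{0, 1.5, 0}; // illustrative state
        double t = 2.0; // seconds since the last entity-state update
        Vec3 a = drFPW(p0, v0, t);
        Vec3 b = drFVW(p0, v0, a0, t);
        std::printf("FPW: (%.1f, %.1f, %.1f)\n", a.x, a.y, a.z);
        std::printf("FVW: (%.1f, %.1f, %.1f)\n", b.x, b.y, b.z);
        return 0;
    }

When the owning simulation detects that its own extrapolation has drifted past a threshold, it issues a fresh update; this is what keeps network traffic low while player stations stay visually correlated.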

MÄK can help.

The MÄK Virtual Player Stations tailor technologies from MÄK and our partners to meet your requirements:

  • Visualization. The visualization system provides high-performance visuals for out-the-window views and simulated UAS video feeds. It comes pre-configured with a broad set of high-resolution vehicle models from Simthetiq and other vendors, has built-in support for human character animation through Boston Dynamics’ DI-Guy, and supports dynamic vegetation using IDV’s SpeedTree. Core technology: VR-Vantage IG
  • Network Interoperability. Our Player Stations are natively compliant with both the DIS protocol and with several versions of the High Level Architecture (HLA), including IEEE 1516 and HLA Evolved. The entity that you control is automatically published onto the network and remote entities will appear in your view so that you can interact with other DIS- or HLA-compliant Player Stations and with CGF entities. Core technology: VR-Link
  • Vehicle Modeling. Vehicle dynamics and physics can be provided at various levels of fidelity depending on your requirements and budget. We have delivered player stations that incorporate the same basic parameterized dynamics models that we use in our VR-Forces CGF. But we have also built customized systems based on RTDynamics’ RotorLib and FixedWingLib aircraft models and on CM Labs’ Vortex models, which provide engineering-grade vehicle physics so that you can see individual wheels bouncing over bumps in the terrain as you drive. (A sketch of a basic parameterized model follows this list.)
  • Terrain Agility. Like all MÄK products, our Virtual Player Stations are Terrain Agile so that we can achieve true terrain correlation with many different simulation and visualization systems in your environment. Our player stations can import terrain using a variety of approaches, techniques, and formats. We can read visual databases in OpenFlight and MetaFlight format, import elevation, imagery, and vector data directly from source, or even stream terrain from remote servers such as MÄK’s VR-TheWorld Server through web-services standards. Core technology: VR-Vantage IG, VR-inTerra, VR-TheWorld
  • Cockpit Displays. Your VPS can include head-up displays, virtual cockpits, and simulated analytic devices. We can create and incorporate custom 2D and 3D HMI elements depending on your needs. Core technology: GL-Studio 
  • Sensor Displays. The visualization component can include infrared, night-vision, and other sensor displays. Core technology: SensorFX – a physics-based sensor visualization plug-in to MÄK’s VR-Vantage, based on JRM Technologies’ SigSim and SenSim products.
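
For contrast with engineering-grade physics such as Vortex, here is a sketch of a basic parameterized longitudinal model of the kind mentioned under Vehicle Modeling above (all parameters illustrative): drive force from throttle, capped by engine power, minus aerodynamic drag and rolling resistance, integrated at a fixed time step.

    // A basic parameterized longitudinal vehicle model, far simpler than
    // engineering-grade physics such as CM Labs' Vortex. All parameters
    // are illustrative.
    #include <algorithm>
    #include <cstdio>

    struct VehicleParams {
        double massKg = 9000.0;          // e.g., a light armored truck
        double maxDriveForceN = 40000.0; // traction-limited drive force
        double maxPowerW = 200000.0;     // engine power cap
        double dragCoefTimesArea = 4.5;  // Cd * frontal area, m^2
        double rollingCoef = 0.015;      // rolling resistance coefficient
    };

    // One integration step of the simple longitudinal model.
    double stepSpeed(double vMps, double throttle01, double dtSec,
                     const VehicleParams& p) {
        const double airDensity = 1.225, g = 9.81;
        double drive = std::min(throttle01 * p.maxDriveForceN,
                                throttle01 * p.maxPowerW / std::max(vMps, 0.5));
        double drag = 0.5 * airDensity * p.dragCoefTimesArea * vMps * vMps;
        double rolling = p.rollingCoef * p.massKg * g;
        double accel = (drive - drag - rolling) / p.massKg;
        return vMps + accel * dtSec;
    }

    int main() {
        VehicleParams truck;
        double v = 0.0;
        for (int i = 0; i < 6000; ++i)   // 60 s at 100 Hz, full throttle
            v = stepSpeed(v, 1.0, 0.01, truck);
        std::printf("speed after 60 s: %.1f m/s\n", v);
        return 0;
    }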

Let MÄK build a custom, cost-effective Virtual Player Station for you based on our proven platform, process, and technology.

Scenario Generators

What's at stake?

Your job is to place a trainee or analyst in a realistic virtual environment in which they can train or experiment. It could be a hardware simulator, a battle lab, or even the actual equipment, configured for simulated input. Then you need to stimulate that environment with realistic events for the trainee to respond to. The stimulation usually comes from a scenario generator, also known as a threat generator. A scenario generator typically simulates the opposing force entities and complementary friendly force entities that the trainees need to interact with.

Trends

A scenario generator should allow training staff to quickly and easily design and develop scenarios that place trainees in a realistic situation. The application should use the proper terminology and concepts for the trainees’ knowledge domain. It should be flexible enough to handle the entire spectrum of simulation needs. The entities simulated by the scenario generator should be able to operate with enough autonomy that once the simulation starts they do not need constant attention by an instructor / operator, but could be managed dynamically if necessary.

In addition to its basic capabilities, a scenario generator needs to be able to communicate with the simulator and other exercise participants using standard simulation protocols. It needs to be able to load the terrain databases and entity models that you need without forcing you to use some narrowly defined or proprietary set of formats. Finally, a scenario generator needs to work well with the visualization and data logging tools that are often used in simulation settings.

MÄK can help.

MÄK will work with you so that you have the scenario generator you need: a powerful and flexible simulation component for generating and executing battlefield scenarios, customized to meet the particular needs of your simulation domain. For example, higher-fidelity models for a particular platform can be added, new tasks can be implemented, or the graphical user interface can be customized to create the correct level of realism. Features include:

  • Scenario development – Staff can rapidly create and modify scenarios. Entities can be controlled directly as the scenario runs. Core Technology: VR-Forces
  • Situational Awareness – The visualization system includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big-picture, battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, and speed. Core technology: VR-Vantage.
  • Network modeling – The lab can simulate communications networks and the difficulties of real-world communications. Core technologies: Qualnet eXata and AGI SMART.
  • Correlated Terrain – VT MÄK’s approach to terrain, terrain agility, ensures that you can use the terrain formats you need, when you need them. We can also help you develop custom terrains and can integrate them with correlated terrain solutions to ensure interoperability with other exercise participants. Core technologies: VR-TheWorld Server, VR-inTerra.
  • Sensor modeling – The visualization component can model visuals through different sensor spectrums, such as infrared and night vision. Core technology: JRM SensorFX.
  • Open Standards Compliance – VR-Forces supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.

Image Generators

What's at stake?

You need to provide realistic scenes of a virtual environment to role players in training simulations or to analysts in engineering simulations. To do this, you'll want an image generator (IG), the visual component of a training or analysis system. Whether the requirement is a high-performance out-the-window view for a full-motion flight simulator or a sensor channel for hardware-in-the-loop testing, an image generator is the visual component of the solution.

Simulations vary greatly, and so do image generators. The IG component for an attack helicopter trainer requires drastically different technology than the visual system for a truck driver trainer.

MÄK can help.

The MÄK Image Generator uses best-of-breed technologies to tailor high-quality visual scenes of the synthetic environment to your specific simulation needs. A MÄK IG system will have some or all of the following components:

  • Rendering Engine – The rendering engine renders 3D scenes with thousands of vehicles and other simulation objects. Core technology: VR-Vantage.
  • Multichannel distributed views – The MÄK Image Generator can manage multiple monitors per computer and drive display engines on multiple computers. Core technology: VR-Vantage IG.
  • High Fidelity Sensor Visualization – The MÄK Image Generator can simulate night vision goggles, infra-red, and other spectrums. Core technology: JRM SensorFX.
  • Correlated Terrain – VT MÄK’s approach to terrain, terrain agility, ensures that you can use the terrain formats you need, when you need them. We can also help you develop custom terrains and integrate them with correlated terrain solutions to ensure interoperability with other exercise participants. Core technologies: VR-TheWorld Server, VR-inTerra.
  • Hardware – Rack mounted or stand-alone PCs built specifically for running the MÄK IG.
  • Open Standards Compliance – The MÄK Image Generator supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Core technology: VR-Link networking toolkit.

After Action Review (AAR) / Debrief Systems

What's at stake?

After Action Review (AAR) systems provide feedback on mission and task performance during training exercises. An AAR is a critical part of the training process. It is a structured review, or debrief, for analyzing what happened during an exercise and why it happened. By comparing the actual events with the expected outcomes, leaders and soldiers can identify strengths and weaknesses and decide how to improve their performance.

As training exercises become larger, there is a greater need for automated tools to capture the actions of all training participants, evaluate performance against standard metrics, and provide the information necessary to support a structured debrief of the training event. These tools should be flexible enough to support chronological reviews of an entire exercise or tightly focused reviews highlighting a few key issues.

MÄK can help.

MÄK can help instructors convert the huge amount of data collected during a training exercise into focused learning points. MÄK uses proven technologies to build a system to meet your training system requirements.  

  • Simulation Recording and Playback – All simulation data can be recorded and played back. This includes simulated entities and interactions, trainee communications, and other events injected into the simulation. The AAR facilitator can pause, slow down, or speed up the playback, and can jump to specific moments during the exercise via a simple, easy-to-use, DVR-like control panel. Core technology: MÄK Data Logger.
  • Annotation – The embedded annotation system enables instructors to bookmark key events during the action and then jump to those events during playback to review specific actions.
  • 2D Map and 3D Out-the-Window Visualization – 2D tactical map displays provide situational awareness. 3D views create immersive scenes of the simulated environment from the perspective of the trainee. An exaggerated reality mode combines the best of 2D & 3D techniques into a compelling visual analysis of the training exercise. Core technology: VR-Vantage.
  • Analytical Information – The AAR facilitator can track a specific trainee or groups of trainees and view the scene from various perspectives. Informational overlays including trajectory histories, unit status, attacker-target lines, and other tactical graphics further aid in understanding the training scenario as it plays out.
  • Data Mining – The recorded training events can be exported to standard database and spreadsheet applications, which can be used for data mining, in-depth analysis, and charting or graphing of specific datasets (a toy example follows this list).
  • Open Standards Compliance – The MÄK AAR system supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Core technology: VR-Link networking toolkit.
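
As a toy example of the data-mining step above, assume the logged events have been exported to a CSV file with columns time,event,shooter,target; this layout is hypothetical and not the MÄK Data Logger's actual export format. The sketch below counts fire events per shooter.

    // Toy data-mining pass over exported AAR events. Assumes a CSV export
    // with columns: time,event,shooter,target. This layout is hypothetical,
    // not the MÄK Data Logger's actual export format.
    #include <fstream>
    #include <iostream>
    #include <map>
    #include <sstream>
    #include <string>

    int main(int argc, char** argv) {
        if (argc < 2) { std::cerr << "usage: aarstats <events.csv>\n"; return 1; }
        std::ifstream in(argv[1]);
        std::string line;
        std::getline(in, line);                   // skip the header row
        std::map<std::string, int> firesByShooter;

        while (std::getline(in, line)) {
            std::istringstream row(line);
            std::string time, event, shooter, target;
            std::getline(row, time, ',');
            std::getline(row, event, ',');
            std::getline(row, shooter, ',');
            std::getline(row, target, ',');
            if (event == "fire") ++firesByShooter[shooter];
        }
        for (const auto& [shooter, count] : firesByShooter)
            std::cout << shooter << " fired " << count << " times\n";
        return 0;
    }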

Instructor Operator Stations

What's at stake?

Training events are becoming larger and more widely distributed across networked environments. Yet staffing for these exercises is often static, or even decreasing. Therefore, instructors and operators need IOS systems to help manage their tasks, including designing scenarios, running exercises, providing real-time guidance and feedback, and conducting AAR.

Instructor Operator Stations (IOS) provide a central location from which instructors and operators can manage training simulations. An effective IOS enables seamless control of exercise start-up, execution, and After Action Review (AAR) across distributed systems. It automates many setup and execution tasks, and provides interfaces tailored to the simulation domain for tasks that are done manually.

MÄK can help.

MÄK has proven technologies that allow us to build and customize an IOS to meet your training system requirements.

  • Simulation Control Interface – Instructors can create and modify training scenarios. Execution of the scenarios may be distributed across one or more remote systems. The instructor or operator can dynamically inject events into a scenario to stimulate trainee responses, or otherwise guide a trainee’s actions during a training exercise. Core technology: VR-Forces Graphical User Interface
  • Situational Awareness – The MÄK IOS includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big-picture, battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, and speed. Core technology: VR-Vantage.
  • Analysis & After Action Review – The MÄK IOS supports pre-mission briefing and AAR / debriefing. It can record exercises and play them back. The instructor can annotate key events in real-time or post exercise, assess trainee performance, and generate debrief presentations and reports. The logged data can be exported to a variety of databases and analysis tools for data mining and performance assessment. Core technology: MÄK Data Logger
  • Open Standards Compliance – The MÄK IOS supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Core technology: VR-Link networking toolkit.
  • Simulated Voice Radios – Optionally includes services to communicate with trainees using real or simulated radios, VOIP, or text chat, as appropriate for the training environment.

 

Vehicle Crew Training Systems


What’s at Stake:

For armies around the world, ground vehicles play important strategic roles. Whether they’re transporting personnel and cargo as part of a convoy, mobilizing for an offensive maneuver, or defending against a roadside ambush, they are an essential piece of the mobile support and sustainment structure.

Developing effective crew training systems to operate ground vehicles is key to maximizing this strategic asset. Proper training improves team communication, combat effectiveness, safety, and driving efficiency. The consequences of a lack of training range from inefficiency to loss of life.

The Challenge:

As the demand for vehicle crews remains high, there is a need to train larger classes. Instructors are looking for systems that simulate vehicles in convoy missions and hostile entities in a high-fidelity synthetic environment. They seek systems that provide role-specific, informative visual interfaces to the crew and creative scenario construction in real time. Instructors want to connect to local or global networks of simulation systems, with maximum hardware flexibility.

The MÄK Solution:

VR-Vantage provides trainees with high-detail role-specific visual scenes, including scenes with high-fidelity data overlays. VR-Vantage emulates exterior camera views and 2D maps for the driver and commander, and a scope for the gunner with accurate data displays. Instructors use VR-Vantage to observe the exercise from a third-person perspective and evaluate trainees. VR-Vantage can be customized to match performance and resolution needs, and is used on a range of hardware, from lightweight laptops to complex motion-platform simulators.

With MÄK you have choices on how to create a host vehicle simulation. For ground vehicles we’ve found Vortex, by CM Labs, to be an excellent vehicle dynamics solution. Vortex's contact dynamics simulate all the moving parts of the vehicle including the interaction with the terrain, water, obstacles, vision systems, grasping, and more. Everything from suspension travel to traction and gearing is accounted for to provide the driver with an enriching, engaging training scenario.

For instructors looking to control the simulation and incorporate computer-generated forces, VR-Forces is the perfect pairing for VR-Vantage. VR-Forces is a scalable simulation engine that allows instructors to populate the scene with friendly forces, hostile units, civilians, animals, and obstacles. Instructors use VR-Forces to move units around the scene, setting up scenarios or altering a live situation in real time.

Both VR-Forces and VR-Vantage include MÄK’s networking technology. VR-Link’s protocol-independent API allows both applications to communicate through the industry-standard High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols, including HLA 1.3, HLA 1516, HLA Evolved, DIS, and DIS 7. The MÄK Data Logger records and plays back all the network simulation traffic for after action review and analysis. The MÄK RTI (runtime infrastructure) is available when connecting to HLA federations using any of these standards: HLA 1.3, HLA 1516, and HLA Evolved.

Want to learn more? Have a look at the VR-Vantage page for more information.
Interested in seeing a demonstration?

Cyber and Electronic Warfare

The Battlefield is Evolving: The Increased Threat of Cyber Attack Affects Strategic Decision Making

 

As technologies continue to advance and become more deeply ingrained in modern life, the threat of a crippling cyber attack or electronic warfare (EW) becomes increasingly probable. In an attempt to mitigate these risks, the Colombian national government (represented by the Ministry of Information Technology), the Higher War School in Colombia, and ITM Consulting Company joined forces to explore the role simulation plays in understanding, preparing for, and combating cyber attacks.

The Challenge:

The organizations needed a tool that could create and model elements vulnerable to cyber attacks, such as radar systems, military and civilian entities, and communication systems. They were also looking for the freedom to redesign the user interface (UI) to match specific scenario needs and create response strategies.

The MÄK Solution:

VT MÄK offers commercial-off-the-shelf (COTS) technology to build EW simulations, backed by a company with an “engineer down the hall” philosophy to help organizations select and implement the most effective solution.

VR-Forces provided a scalable computer-generated forces simulation engine to populate the training environment with targeted infrastructure systems, friendly forces and hostile entities. VR-Forces allowed the organizations to pre-plan scenarios as well as interactively alter a live situation in real time.

VR-Forces provided the flexibility sought by the organizations, including UI customization. The group used this flexibility to conduct three major cyber attack scenarios and create response strategies.

In the first scenario, VR-Forces simulated two aircraft teams. The red team was given a mission to use scanners and jammers to alter the frequency on the blue team’s radar systems; doing this enabled the red team to use attack aircraft to undermine the blue team’s defense system.

The second scenario used VR-Forces to simulate an electronic warfare attack on the Colombian oil infrastructure, a frequent target of terrorist attacks. In this exercise, the red team was instructed to alter the readings on specific valves on a pipeline to ignite fires, while attacks on the blue team’s surveillance systems (via unmanned aircraft) were intended to delay the blue team’s response.

The third scenario highlighted the inherent danger to civilian populations if the turbines in a hydroelectric plant are compromised through a cyber attack. The red team instigated drastic variations in water levels at the plant, which in turn disrupted the power delivered to the nearby town, with detrimental consequences for the simulated population.

The exercises using VR-Forces have contributed to research and development efforts led by the Department of Telematics at ESDEGUE, in particular its line of research in cybersecurity and cyber defense.

“What we were able to do with VR-Forces allowed us to lead the research process for the modeling and simulation of electronic Warfare and Cybernetics; it is through this research that we learn how to best describe the behavior of different cyber attacks and EW tactics to determine scenarios, trends, and courses of action with excellent results,” says Colonel Martha Liliana Sanchez Lozano, the Official Colombian Air Force Chief of Telematics and Program Coordinator of Cybersecurity and Cyber defense at the Higher War School.

Want to learn more? Have a look at the VR-Forces page for more information. Interested in seeing a demonstration?

Joint Incident Management Training

Taking incident management training to the next level with simulation

 

When disaster strikes, communities rely on incident management teams to respond immediately by taking actions to save lives and property. Incident management staff in the United States is commonly divided into two separate groups: Emergency Response Teams (ERTs) and Emergency Operations Center Teams (EOCTs). This setup has been adopted by many states, local governments, and the private sector, and this commonality improves coordination across all layers of public and private bureaucracy.

 

ERTs work on-the-ground to provide life- and property-saving actions, mitigate issues, and gather information. EOCTs work as a command/control center to develop a Plan of Action involving Incident Command, Operations, Communications, Resources Management, Logistics, and Records.

 

The success of the ERTs is directly dependent on the EOCTs’ ability to effectively command and control a situation. To maximize effectiveness, EOC team members require effective, contextualized training. Unfortunately, the lack of simulation available for EOCT training in the past limited the scope of practice available to them in terms of context. Simulation can help take EOCTs to the next level in terms of contextualizing situations and better help them manage peripheral aspects of an operation that are either too expensive or impossible to generate in a real-world environment.

 

Cross-Training: Simulation is an ideal way to cross-train ERTs and EOCTs together in order to build cohesion and organization in the face of emergency. Cross-training can also work well vertically, as federal, state, local and private groups can co-develop mutually beneficial disaster planning. Certain scenarios lend themselves to cross-training in a simulation environment, including Temporary Infrastructure / Supply Delivery, Wildfire – Detection and Extinguishing, High-Rise Building Fire Response, Chemical, Biological, Radiological, Nuclear, or Explosive (CBRNE) Event, and Search and Rescue Operations.

 

Crowd and Traffic Control: MÄK’s human characters can be set up to mimic crowds that would gather at or away from the scene of an incident, giving EOCTs the opportunity to set up barriers and manage flows of people. MÄK’s AI allows crowds to build up and dissipate, intelligently plan paths around temporary obstacles, and flood your scene with busy commuters at rush hour.

 

Structural and Risk Assessment: UAV simulations can be used to train EOCTs to advise ERTs on situational awareness, help them locate injured parties at the scene, and perform structural analysis of damaged infrastructure. The superior vantage point of the UAV provides teams with maximum awareness while preventing ERTs from entering potentially dangerous or contaminated areas.

Logistics Support: MÄK’s simulation technology allows for testing of evacuation mapping and planning. Develop temporary infrastructure designs, evacuate casualties, and simulate supply delivery plans to test the feasibility of your Plan of Action.

Implementation of New Technology: As new technology emerges, agencies can simulate the new resource and determine the value of acquiring it in their model. If an acquisition is made, they can train their operatives to use the new technology via simulation to minimize training costs and maximize effectiveness. This is also effective for training when making policy/procedure changes.

For more information on our human character simulation capabilities, check out our Humans products.

 

 
