
Welcome to the Solutions page.

Our customers build successful simulation systems from components powered by MAK software.

Open the accordion sliders on this page to see how MAK products fit into the components and how our customers arrange those components into systems. Learn about successful projects from customers who have worked with MAK.

Successful Customers – Read their stories

Below are systems that our customers have deployed using various configurations of our products to meet their projects' needs.

Product Legend

The Air Force Test Pilot School Upgrades to VR-Vantage IG

VR-Vantage IG helps Calspan double the value of flight tests to both train test pilots and collect data for industry partners.

 

Air Force Test Pilot School

 

 

After the Flight Sciences Simulator was upgraded to use VR-Vantage IG, we got a chance to catch up with Jay Kemper, Senior Software Engineer at Calspan. We discussed how MAK’s VR-Vantage IG is used by the Air Force Test Pilot School and what they are learning using the VR-Vantage product.

“MAK supports the Air Force Test Pilot School at Edwards AFB with VR-Vantage IG, an image generator product. VR-Vantage IG has been used in the Flight Sciences Simulator to train the next generation of test pilots in everything from aerodynamics to flight control design,” said Mr. Kemper. Students have a one-year curriculum, and courses start every six months, so there is a six-month overlap between programs. A physics-based flight simulation model is used to teach the effects of aerodynamics. For example, instructors halve or double the effect of one of the flight controls so the pilot can experience the impact on the performance and maneuverability of the aircraft.

Each class has a capstone event: a flight test. A flight test is a controlled experiment in which most flight conditions are known (controlled) and specific characteristics are varied so they can be measured, collecting data about the effects of each variable. The school has moved beyond using flight tests as just training exercises; it now also partners with industry leaders (e.g., Cessna and Lockheed) to perform tests needed for research and development. It’s a win-win — the school practices flight testing skills and industry obtains valid flight test data.

“One of the most advanced uses of the simulator is to connect it to a programmable aircraft and use the simulator as a ground cockpit,” said Mr. Kemper. Calspan is studying issues with unmanned aircraft systems, e.g., the effects of delays in satellite communication relays. The Calspan test aircraft is a Learjet 25 with a four-man crew. The aircraft has a programmable flight control system, enabling the crew to get the plane into a controlled situation, then turn control over to the ground station to carry out the test. The ground control station can land the Learjet while the crew stands by with emergency override control in case anything goes wrong. “VR-Vantage uses the data from the aircraft to generate the scenes of the terrain to a very high degree of accuracy, allowing for an aircraft to be landed with no video stream,” said Mr. Kemper. The aircraft sends position, orientation, and status data to the ground station over a C-band radio, and Calspan can achieve update rates up to 60 Hz. The terrain database was built by AFRL using TerraVista (TerraPage format).
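Between telemetry updates (or during relay dropouts), a ground-station image generator typically extrapolates the last received aircraft state forward so the rendered scene stays smooth. A minimal sketch of that idea in Python — illustrative only, not MAK or Calspan code; the function name and the simple constant-velocity model are assumptions:

```python
# Constant-velocity dead reckoning: predict where the aircraft is now,
# given the last received position/velocity and the elapsed time.

def extrapolate_position(position, velocity, dt):
    """Predict position dt seconds after the last telemetry update."""
    return [p + v * dt for p, v in zip(position, velocity)]

# Last update: aircraft at the origin flying 60 m/s north;
# predict its position one 60 Hz frame later.
predicted = extrapolate_position([0.0, 0.0, 0.0], [0.0, 60.0, 0.0], 1.0 / 60.0)
```

Real dead-reckoning algorithms (as standardized in DIS) also extrapolate orientation and may include acceleration terms; this sketch shows only the core idea.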

Calspan is working with the UAS community to help set standards for how to test UAS, since simulation is always an integral part of flight testing. The flight test itself remains essential because simulation can remove some, but not all, doubt. Using the “Workload Build-Up Approach,” simulation can run many iterations to flush out problems before flight.

When asked about MAK’s support, Mr Kemper said, “The support MAK gives is second to none, response time of its salespersons and support has been fantastic. They have gone above and beyond to facilitate their products to fit our needs, from getting licenses to product to plugin support. I’ve even asked about how to perform a certain task in a plugin, and the support responded with a full code that I could just drop in and build!”

Want to learn more? Have a look at the VR-Vantage page for more information. Are you interested in seeing a demonstration?


University of Alabama in Huntsville Uses VT MAK’s VR-Forces to Recreate the Battle of 73 Easting

VT MAK’s modeling and simulation software enhances students’ learning of the Persian Gulf War battle

 

The University of Alabama in Huntsville (UAH) is using MAK’s VR-Forces simulation software to recreate the Battle of 73 Easting, a major United States victory during the 1991 Persian Gulf War, enhancing its students’ learning experience. With VR-Forces, graduate students at UAH have achieved increasingly accurate replication of the historical results.

The simulations are directed by Mikel D. Petty, Ph.D., Associate Professor of Computer Science and Senior Scientist for M&S, Information Technology and Systems Center at UAH.

“The Battle of 73 Easting is arguably the best-documented battle in U.S. history, which makes it a perfect application for both simulation analysis and historical study,” said Dr. Petty.

“Then-Capt. H.R. McMaster’s Eagle Troop of the Second Armored Cavalry Regiment aggressively engaged a much larger Iraqi armored force, destroying a large part of it in only twenty-three minutes. Simulation improves our ability to study the tactics of Eagle Troop and the rest of the regiment from a historical perspective, and recreating historical events in simulation provides insight into the accuracy and capabilities of the simulation.”


 

VR-Forces is MAK’s complete simulation solution. It is a powerful and flexible Computer Generated Forces platform that populates simulated synthetic environments with battlefield entities. Users can create scenarios full of custom, lifelike entities with specific behaviors, making it an ideal platform for training and education. VR-Forces also allows students to alter the parameters of the battle and compare simulation results with historical events. This year, under Dr. Petty’s guidance, his students will investigate how the outcome changes in a notional scenario in which the Iraqi forces field modern Russian T-14 Armata main battle tanks in 1991.

“This is a great application of VR-Forces,” said Dan Schimmel, President and CEO of VT MAK. “We’re thrilled to see a famous battle simulated and analyzed by Dr. Petty and his students at the University of Alabama in Huntsville.”

“MAK has been very accommodating in allowing my students to use VR-Forces for their academic projects,” said Dr. Petty. “The intuitive nature of their graphical user interface and the accuracy of their simulation results make VR-Forces an excellent technology to enhance learning for modeling and simulation students.”

 


Wargaming System to Stimulate C4I System – Raytheon IDS

What’s at stake

Raytheon Integrated Defense Systems selected MAK to develop the simulation system to drive the wargaming & training component of their Command View C4I system that was subsequently delivered to a Raytheon customer.

The C4I system is used for national defense and simulated Command Post Exercises at the division, brigade and battalion levels, providing opportunities for experimentation, doctrine development, and training.

How MAK Helped

MAK used VR-Forces' flexible architecture to develop and deliver a MAK Command and Staff Training System (MAK CST) to stimulate Raytheon’s C4I system. MAK CST acts as the simulation engine for operational level training exercises — it feeds the C4I system with track data and reports from simulated forces.

Commanders and their staff lay down the Order of Battle and participate in the exercise using the same C4I system they would use in a real-world battle, while members of their organization direct the simulated forces by interacting with MAK CST in the simulation cell.

This two-year project resulted in VT MAK delivering a fully compliant system on time and within budget. Deliveries included documentation, training, comprehensive test procedures, and on-site integration support. The newly developed software has been integrated into the COTS VR-Forces product for ease of maintenance and upgrade. The system includes the VR-TheWorld Streaming Terrain Server.


Naval C3 System Upgrade

What's At Stake?

When Terma was chosen to upgrade the Command, Control and Communication (C3) systems on the Royal Danish Navy’s Flyvefisken-class ships, they started looking for commercial off-the-shelf tools to save development time. They needed products with the same flexibility they were building into their C-Flex systems.

The C3 systems technology upgrade is based on the new Terma C-Flex platform and incorporates all operational and technical experience from many years of operating the current C3 systems onboard twenty-one vessels of the Royal Danish Navy. The C-Flex software is a true component-based architecture. Its main advantages are flexibility in configuration, reduced test requirements, openness to third-party software, and reduced maintenance costs.

How MAK Helped

“We chose MAK products because of their knowledge and experience with HLA,” explained Thorsten N. Hansen, project manager. “For scenario generation, we chose their COTS product VR-Forces. VR-Forces is flexible and can easily accommodate special requirements. MAK seems to be serious about making their software highly customizable. We saved a lot of development hours using VR-Forces instead of making our own proprietary scenario generator.”


Air Support Operations Center (ASOC) Total Mission Training

What's At Stake?

The ASOC (Air Support Operations Center) Battlefield Simulation fills a crucial gap in USAF and United Kingdom Close Air Support (CAS) and airspace manager training. The system provides six squadrons with the capability to conduct total-mission training events whenever the personnel and time are available.

How MAK Helped

When the 111th ASOS returned from their first deployment to Afghanistan, they realized the training available prior to deployment was inadequate. They sought an organic training capability focused on the ASOC mission that was low cost, simple to use, adaptable, and available immediately. With the assistance of VT MAK, they developed a complete training system based on VT MAK’s QuickStrike HLA-based simulation.

Through more than two years of spiral development incorporating lessons learned, the system has matured: it can now realistically replicate the Tactical Operations Center (TOC) in Kabul, Afghanistan, the TOC supporting the mission in Iraq, or expand to support a major conflict scenario. The training system provides a collaborative workspace for the training audience and exercise control group via integrated software and workstations that can easily adapt to new mission requirements and TOC configurations.

The system continues to mature. Based on inputs from the warfighter, new capabilities have been incorporated to add realism and simplify the scenario development process. The QuickStrike simulation can now import TBMCS Air Tasking Order air mission data and can provide air and ground tracks to a common operating picture, presented through either TACP CASS or JADOCS.

 


Air Defense Operations

What's At Stake?

Rheinmetall Canada delivered an Air Defense Operations Simulation (ADOS) for the Dutch Army to be used for decision-making training of the staff of the air defense units responsible for Command and Control (C2). The Dutch Army is fielding new Ground Based Air Defense Systems (GBADS) from Rheinmetall and integrating these with Short Range Air Defense (SHORAD) systems, requiring new training systems for tactical decision-making as opposed to purely operator training. The ADOS system is also being used as an aid in developing air defense Doctrine, Tactics, Techniques, and Procedures (DTTP).

How MAK Helped

The training system comprises a simulation network connected to a copy of the operational C2 system. VR-Forces, MAK’s simulation solution, provides scenario generation and modeling of ground and air threats and blue forces, including the mobile GBADS. VR-Vantage is used as the primary interface for the role players, which include the Stinger Weapon Platform platoons, the Norwegian Advanced Surface-to-Air Missile System (NASAMS) fire units, and opposing forces.

All positions are connected via the High Level Architecture (HLA) through the MAK High Performance RTI. Students, making up the Staff platoon, are at copies of the operational C2 system. The MAK Data Logger supports After Action Review.

“MAK’s VR-Forces and VR-Vantage combination of providing out-of-the-box capabilities with a powerful API for easy customization made them the ideal choice as the simulation and visualization component of the ADOS System. The technical capabilities of MAK’s products coupled with their excellent reputation for unsurpassed customer support made the decision to team with MAK an easy one,” said Ledin Charles, ADOS Program Manager, Rheinmetall Canada.


 

NASA Project Wins, Grows with the MAK RTI

 

As aviation technology has improved, commercial air traffic has increased significantly, requiring better airspace management techniques. To improve airspace capacity, safety, and flexibility, NASA’s Air Traffic Operations Laboratory (ATOL) used a massive simulation environment called the Airspace and Traffic Operations Simulation (ATOS) to explore better techniques. As the project’s success led to its growth, NASA required a licensing option that would scale easily with an ever-expanding simulation.

NASA originally needed a tool that could effectively communicate and maintain all the entities involved in its complex, multi-laboratory simulation, which includes more than 400 workstation-based high-fidelity aircraft simulators networked together. The simulation would eventually demand a creative, flexible solution so that licensing restrictions would not hinder its development.

The MAK Advantage:

VT MAK offers commercial-off-the-shelf (COTS) technology to facilitate air traffic management simulations, backed by a company with an “engineer down the hall” philosophy to help organizations creatively solve their implementation issues. The MAK RTI has been used by the ATOL for over a decade to enable its High Level Architecture (HLA) federations to rapidly and efficiently communicate the positioning and actions of entities in the ATOS simulation. It was also used to communicate with external laboratories at NASA LaRC, NASA Ames Research Center, the FAA, and other compliant facilities.

As the simulation’s success led to growth in the number of federates and labs involved, NASA’s needs changed. The ATOL required a cost-efficient way to use unlimited instances of the MAK RTI. MAK’s “engineer down the hall” philosophy played a big role in ensuring success, as we worked out a custom licensing model to meet their needs and ensure that MAK would continue to play a role in the success of the ATOL.

We were able to reach Glover Barker of NASA Langley Research Center for comment: “At NASA Langley Research Center, we have used the MAK RTI libraries since 2005. Our Airspace and Traffic Operations Simulation (ATOS) uses High Level Architecture (HLA), so we initially tried an open source HLA solution. But the quality and reliability were not adequate, so we purchased RTI from MAK. MAK's RTI implementation conforms well to HLA standards, so we could easily substitute MAK RTI for the open source solution. We have been satisfied customers ever since.”

“MAK has steadily improved their product in every new release. MAK's technical support always responds quickly and helpfully when we have questions or problems. When needed, they have sent staff to our site to inspect our environment, and made recommendations to change our configuration. In the cases when we experienced a bug because of the unique way we used their software, MAK has diligently provided fixes for our problems. When we needed a new licensing scheme to fit our usage model, MAK delivered.”

Want to learn more? Have a look at the MAK RTI page for more information. Interested in seeing a demonstration?

 


Air Defense Battle Management

What's At Stake?

The Air Defence Ground Environment Simulator (ADGESIM) was developed by the Defence Science and Technology Organisation (DSTO) to evolve Royal Australian Air Force (RAAF) training systems and prepare personnel for network-centric operations. ADGESIM, a network-centric aerospace battle management application, is a major component of the RAAF strategy to adopt distributed simulation technology.

How MAK Helped

ADGESIM comprises three applications developed by DSTO and YTEK (a DSTO engineering contractor) in C++ and closely integrated with VR-Forces and VR-Link. The ADGESIM Pilot Interface is used to create and fly simulated aircraft entities. All high-resolution entity modeling is done in the background by VR-Forces, and ADGESIM entities have proven compatible with both Australian and US Navy distributed simulation environments. The RAAF routinely runs scenarios with eight Pilot Interface workstations simultaneously without taxing the PC-based ADGESIM system.

Jon Blacklock, Head, Air Projects Analysis, Air Operations Research Branch of DSTO, explained that his research program had identified VR-Forces and VR-Link as the only products with a sufficiently open architecture and usable programming interface to support rapid prototyping and development. “Using a mixture of MAK’s COTS tools and custom-built ‘thin client’ applications, we were able to create ADGESIM in less than six months, for around half the original budget.”

“VR-Forces’ flexibility secured its place in our development,” said Blacklock. “The ease of customization and the API were among the biggest reasons we chose it. At the start it lacked a few features on our wish list, but MAK’s outstanding tech support helped us fill in the gaps. During the six-month development, MAK redeveloped their applications and added features in parallel with our development. It is a testament to MAK’s professional approach that when we got the release version of VR-Forces, just two weeks prior to installation in the first Regional Operations Centre, we plugged in our applications and everything just worked.”

 


Light Armored Vehicle Simulator Demonstration

Arriving on the floor to set up for a trade show, Ivan finds himself in a stressful situation: only four hours to set up a demo of a collaborative Light Armored Vehicle (LAV) simulator. A new partner, ADVENTUREtech, has just introduced a display system that needs to become part of the simulation.

The demo includes four different user stations:

  • LAV Driver in an ergonomic seat inside a 3-channel CAVE display
  • Gunner using a game controller with a single monitor
  • Commander using an Oculus VR headset
  • Observers watching a monitor that displays what the Commander sees

"VT MAK's product suite transformed a booth with a display system into a fully-functional, team-oriented training demonstration."

Scroll down to watch an animation based on the Tech-Savvy Guide to Virtual Simulation.

 

The MAK Advantage:

MAK used its Light Armored Vehicle (LAV) simulator built on VR-Forces and VR-Vantage to set up this system.

VR-Vantage IG provides trainees with high-detail, role-specific visual scenes. VR-Vantage’s multi-channel distributed rendering architecture makes it easy to configure a continuous scene across a CAVE display. A CAVE is a display that projects scenes onto three to six sides of a cube, providing a wide viewing angle that creates an immersive environment for the Driver without the expense of edge-blended dome displays. This display is a good match for the ergonomically positioned driver controls, which make driving natural and keep the operator’s attention on the mission.
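Rendering one continuous scene across CAVE walls hinges on giving each wall an asymmetric (off-axis) view frustum computed from the viewer's position. VR-Vantage handles this internally; the following generic Python sketch only illustrates the underlying geometry, and its function name and coordinate conventions are assumptions:

```python
def offaxis_frustum(eye, wall_left, wall_right, wall_bottom, wall_top, near):
    """Near-plane frustum bounds for a wall lying in the z=0 plane.

    eye: (x, y, z) viewer position; z > 0 is the distance from the wall.
    Returns (left, right, bottom, top) suitable for a glFrustum-style call.
    """
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: project wall edges onto the near plane
    return ((wall_left - ex) * scale,
            (wall_right - ex) * scale,
            (wall_bottom - ey) * scale,
            (wall_top - ey) * scale)

# Viewer centered 2 m from a 4 m x 3 m front wall:
bounds = offaxis_frustum((0.0, 0.0, 2.0), -2.0, 2.0, -1.5, 1.5, 0.1)
```

As the tracked viewer moves off-center, the frustum for each wall becomes asymmetric, which is exactly what keeps the imagery continuous across wall seams.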

The Gunner's view is presented on a single monitor and operated with a commercial game controller. The gunner IG supports daylight visual and night-vision scenes, toggled by a switch on the game controller. This lower-fidelity interface is appropriate when the gunner’s tasks focus on targeting decisions and communicating with the commander, but it could be replaced with higher-fidelity interfaces if shooting skills were important.

The Commander uses an Oculus VR headset, a fully immersive display that reacts to his head position and movements. A video monitor shows observers what the Commander is seeing.

The VR-Forces simulation engine hosts the light armored vehicle ownship simulation. It uses CM Labs’ Vortex physics engine to model the vehicle and gun dynamics. Because it is based on the VR-Forces simulation framework, it can coordinate user inputs from each of the operators, provide terrain access to the Vortex models, and handle the weapons effects, sensors, and damage models for the vehicle.

The terrain used is Simthetiq’s SUROBI VTE, a terrain database hand built from source data from the Surobi valley in Afghanistan. This database provides high performing models with close-up detail of a typical Afghan village. 

The virtual simulator is connected through the network, using VR-Link, to two other simulations. The first is VR-Forces CGF, which simulates neutral, friendly, and opposing-force entities. VR-Forces enables instructors to pre-plan training scenarios in which non-player entities move and interact based on mission plans, reactive tasks, and triggers keyed to the actions of other entities in the simulation. Instructors and role players can interact at several levels: they can give squad leaders tasks and have the members of the squad follow those instructions, command individual entities, or take first-person control of vehicles and human characters.

DI-Guy Scenario adds human characters to the simulation that model the patterns of life in the village and react to the actions of the LAV.

VR-Forces, VR-Vantage, and DI-Guy Lifeform Server all include MAK’s networking technology. VR-Link’s protocol-independent API allows all three applications to communicate through the industry-standard High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols, including HLA 1.3, HLA 1516, HLA Evolved, DIS, and DIS 7. The MAK Data Logger records and plays back all the network simulation traffic for after-action review and analysis. The MAK RTI (run-time infrastructure) is available when connecting to federations using any of the HLA standards: HLA 1.3, HLA 1516, and HLA Evolved.
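For a sense of what actually travels on the wire in a DIS exercise, here is a small Python sketch that packs the fixed 12-byte PDU header defined by IEEE 1278.1. This is a simplified illustration, not MAK library code; a real Entity State PDU carries a full body (entity IDs, position, orientation, dead-reckoning parameters) after this header, and the hardcoded protocol family applies only to entity-information PDUs:

```python
import struct

def dis_pdu_header(pdu_type, length, exercise_id=1, timestamp=0, version=7):
    """Pack the 12-byte DIS PDU header in network byte order.

    Fields: protocol version, exercise ID, PDU type, protocol family,
    timestamp, total PDU length in bytes, and 16 bits of padding.
    """
    ENTITY_INFORMATION_FAMILY = 1  # family for Entity State (PDU type 1)
    return struct.pack(">BBBBIHH", version, exercise_id, pdu_type,
                       ENTITY_INFORMATION_FAMILY, timestamp, length, 0)

# Header for a 144-byte Entity State PDU (type 1) in exercise 1:
header = dis_pdu_header(pdu_type=1, length=144)
```

Toolkits like VR-Link exist precisely so that applications manipulate entity state objects instead of packing bytes like this by hand, and so the same code can emit HLA updates instead of DIS PDUs.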

Want to learn more about virtual simulation?

Interested in seeing a demonstration?

 


Dutch Army Command and Staff Trainer

What's At Stake?

In this use case, the trainees are military commanders who interact with the real C4I interface, which is configured to be in a training mode. Role players contribute to the simulation scenarios. And white-cell operators manage the exercise making sure the scenarios stay on track.

How MAK Helped

Working as a subcontractor to Elbit Systems, VT MAK delivered a modified version of VR-Forces that was used as the simulation engine for a Command and Staff Trainer for the Dutch Army. VR-Forces is used for scenario authoring, simulation execution, and as a role-player station for white-cell operators and instructors.

VR-Forces-simulated entities and other objects are displayed on the C4I system's map. Elbit, the developers of the C4I system, wrote a gateway application using VR-Link and the VR-Forces Remote Control API that translates between the HLA protocols spoken by VR-Forces and the C4I protocols used by the operational C4I system.

The Dutch system employs several VR-Forces simulation engines to share the load of simulating several thousand objects.
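One common way to share a large entity count across multiple simulation engines is a deterministic partition of entity IDs, so that every gateway and display agrees on which engine owns which object. The sketch below illustrates that general scheme in Python; it is an assumption about the approach, not a description of the actual Dutch system's load-balancing logic:

```python
def assign_backend(entity_id, num_backends):
    """Deterministically map an entity to one of num_backends engines."""
    return entity_id % num_backends

# Spreading 3,000 simulated objects evenly across 4 engines:
loads = [0] * 4
for entity_id in range(3000):
    loads[assign_backend(entity_id, 4)] += 1
```

Because the mapping is a pure function of the entity ID, no central coordinator is needed to decide ownership, though real systems may also rebalance by geographic region or entity type.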

This project focuses on the tactical level more than the operational level. For example, it had requirements for 3D views (UAV camera displays and “pop the hatch” out-the-window views for operators playing the role of company commanders); thus, much of the actual simulation is at the entity level rather than at a more constructive, truly aggregate level.

 


CST Experimentation - Norwegian Defence Research Establishment

What's At Stake?

The Norwegian Defence Research Establishment (FFI) has established a demonstrator for experimentation with command and control information system (C2IS) technology. The demonstrator is used for studying middleware, different communication media, legacy information systems and user interface equipment employed in C2ISs.

How MAK Helped

MAK’s VR-Forces computer generated forces simulation toolkit is a component of the demonstrator, serving as the general framework for rapid development of synthetic environments. It is used for describing the scenario and representing the behavior of most of the entities in the environment. The demonstrator’s flexible and extendable HLA-based synthetic environment supports VR-Forces participation as a federate in the HLA federation, allowing it to exchange data with entities and systems represented in external simulation models.


Networking Simulation to C2 Systems

What's At Stake?

Large investments have been made developing simulation systems that model and simulate various aspects of the military environment. Networking these together enables building systems-of-systems that model larger and more complete operational scenarios. Connecting these to operational C2, C3, C4I, and Mission Command systems allows commanders and their staff to have experiential learning opportunities that otherwise could not be achieved outside of actual conflict.  

How MAK Helped

MAK’s VR-Link interoperability toolkit is the de facto industry standard for networking simulations. MAK’s implementation of the HLA run-time infrastructure (RTI) has been certified for HLA 1.3 and HLA 1516, and is currently being certified for HLA 1516-2010. The MAK RTI has been used in many federations, some with over a thousand federates and tens of thousands of simulated entities. MAK’s universal translator, VR-Exchange, has been sold as a COTS product since 2005; it was first developed for the US Army Test and Evaluation Command (ATEC) Distributed Test Event 2005 (DTE5), linking live, virtual, and constructive simulations for the first time. Since then it has been used on many programs and at many sites to link together live (TENA), virtual (HLA), and constructive (DIS) simulations.

MAK has interfaced Battle Command to operational Command and Control Systems including C2PC, Cursor on Target, and JADOCS for the USMC and USAF. Previously we developed an interface for FBCB2.

The US Air Force selected MAK interoperability products for use on the Air Force Modeling and Simulation Training Toolkit (AFMSTT) program and we have successfully completed delivery. The MAK tools include the new MAK WebLVC Server, VR-Exchange, VR-Link and MAK Data Logger. Based on the Air Force's Air Warfare Simulation (AWSIM) model, the AFMSTT system enables training of senior commanders and staff for joint air warfare and operations. MAK's tools are being used to help migrate the AFMSTT system to a service-oriented architecture based on High Level Architecture (HLA) interoperability and web technologies.


Tactical Operations Training – USMC MCTOG

What's At Stake?

The U.S. Marine Corps Tactical Operations Group (MCTOG) uses an advanced human simulation, ECO Sim, to model blue forces, opposing forces, and civilian pattern-of-life to train captains in the two weeks prior to deployment to Afghanistan. ECO Sim trains IED-defeat missions, simulating sophisticated human networks of financiers, bomb makers, safe houses, leaders, and emplacers. These IED networks operate within a larger backdrop of ambient civilian behavior: farmers in fields, children attending school, families going to marketplaces and religious services. The Marine captain commands searches, patrols, and detentions, all while monitoring the battlefield using ISR data provided by UAS and stationary cameras. In addition, ECO Sim has a sophisticated report capability, mimicking the way Marines actually convey and receive information in the battlefield.

 


The Battlefield is Evolving: The Increased Threat of Cyber Attack Affects Strategic Decision Making

 

As technologies continue to advance and become more deeply ingrained in modern life, threats of a crippling cyber attack or electronic warfare (EW) become increasingly probable. In an attempt to mitigate these risks, the Colombian national government (represented by the Ministry of Information Technology), the Higher War School in Colombia, and ITM Consulting Company joined forces to explore the role simulation plays in understanding, preparing for, and combating cyber attacks.

The organizations needed a tool that could create and model elements vulnerable to cyber attacks, such as radar systems, military and civilian entities, and communication systems. They were also looking for the freedom to redesign the user interface (UI) to match specific scenario needs and create response strategies.

 

How MAK Helped:

VT MAK offers commercial-off-the-shelf (COTS) technology to build EW simulations, backed by a company with an “engineer down the hall” philosophy to help organizations select and implement the most effective solution.

VR-Forces provided a scalable computer-generated forces simulation engine to populate the training environment with targeted infrastructure systems, friendly forces and hostile entities. VR-Forces allowed the organizations to pre-plan scenarios as well as interactively alter a live situation in real time.

VR-Forces provided the flexibility sought by the organizations, including UI customization. The group used this flexibility to conduct three major cyber attack scenarios, and create response strategies.

In the first scenario, VR-Forces simulated two aircraft teams. The red team was given a mission to use scanners and jammers to alter the frequency on the blue team’s radar systems; doing this enabled the red team to use attack aircraft to undermine the blue team’s defense system.

The second scenario used VR-Forces to simulate an electronic warfare attack on the Colombian oil infrastructure, a frequent target of terrorist attacks. In this exercise, the red team was instructed to alter the readings on specific valves on a pipeline to ignite fires. Attacks on the blue team’s surveillance systems (via unmanned aircraft) set out to deter the blue team’s response.

The third scenario highlighted the inherent danger to civilian populations if the turbines in a hydroelectric plant are compromised through a cyber attack. The red team in this situation instigated drastic variations in water levels at the plant that in turn disrupted the power and energy generated to the nearby town. The power and energy disruptions brought about detrimental consequences for the simulated town.

The exercises using VR-Forces have contributed to research and development efforts led by the Department of Telematics at ESDEGUE (the Higher War School), in particular its line of research in cybersecurity and cyber defense.

“What we were able to do with VR-Forces allowed us to lead the research process for the modeling and simulation of Electronic Warfare and Cybernetics; it is through this research that we learn how to best describe the behavior of different cyber attacks and EW tactics to determine scenarios, trends, and courses of action with excellent results,” says Colonel Martha Liliana Sanchez Lozano, the Official Colombian Air Force Chief of Telematics and Program Coordinator of Cybersecurity and Cyber Defense at the Higher War School.

Want to learn more? Have a look at the VR-Forces page for more information. Interested in seeing a demonstration?

Simulation Systems – Use cases for training and experimentation

Configure, customize, and extend our products to meet your project requirements.

Product Legend

Shipboard Weapons Training System

Shipboard Weapons Training System

This Shipboard Weapons Training System immerses trainees within a virtual environment. Students learn to operate the weapons and practice team communication and coordination.

Shipboard weapons training systems focus on developing coordination and firing skills amongst a gunnery unit at sea. When it comes to appropriate levels of fidelity, it is important to develop a system that replicates the firing process, accurately renders weapon effects, and conveys the experience of operating on a ship in motion. In addition, the training system must stimulate the unit with appropriate threats.

VT MAK’s off-the-shelf technologies transform this system into a realistic simulation environment, providing an ideal training ground for a wide variety of training skills and learning objectives.

(For narration, make sure the volume is on)

 

The MAK Advantage:

Choosing MAK for your simulation infrastructure gives you state-of-the-art technology and the renowned ‘engineer down the hall’ technical support that has been the foundation of MAK’s culture since its beginnings.

 

MAK Capabilities within each of the simulation components:

The Dome and the UAV station — VR-Vantage IG
  • Game/Simulator Quality Graphics and Rendering Techniques take advantage of the increasing power of NVIDIA graphics cards, yielding stunning views of the ocean, the sky, and everything in between.
  • Built-in Support for Multi-Channel Rendering. Depending on system design choices for performance and number of computers deployed, VR-Vantage can render multiple channels from a single graphics processor (GPU) or can render channels on separate computers using Remote Display Engines attached to a master IG channel.
  • 3D Content to Represent many vehicle types, human characters, weapon systems, and destroyable buildings. Visual effects are provided for weapons engagements including particle systems for signal smoke, weapon fire, detonations, fire, and smoke.
  • Terrain Agility supports most of the terrain strategies commonly used in the modeling, simulation & training industry.
  • Environmental effects are modeled and rendered with realism: proper lighting day or night, the effects of shadows, and atmospheric and water effects including dynamic oceans (tide, swell size and direction, transparency, reflection, etc.). Add multiple cloud layers, wind, rain, snow, even trees and grass that move naturally with variations in the wind.
  • Sensor Modeling for the look and feel of the UAV sensor feed. Sensors can be controlled by a participant or given an artificial intelligence plan. Add SensorFX to model physically accurate views from a UAV's sensor, accounting for environmental variations such as fog, dew, snow, rain, and many other factors that influence temperature and visibility. To see how it's done, click here.
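The visibility effects that a physically based sensor model must account for come down to well-known atmospheric attenuation math. As a rough, generic sketch using the standard Beer-Lambert and Koschmieder relations (illustrative only, not SensorFX's proprietary physics):

```python
import math

def extinction_coefficient(visibility_m: float) -> float:
    """Koschmieder relation: meteorological visibility is the range at which
    contrast falls to ~2%, giving sigma = 3.912 / V."""
    return 3.912 / visibility_m

def transmittance(distance_m: float, visibility_m: float) -> float:
    """Beer-Lambert attenuation of a sensor's line of sight through haze or fog."""
    return math.exp(-extinction_coefficient(visibility_m) * distance_m)

# At a range equal to the stated visibility, contrast drops to about 2%:
print(round(transmittance(1000.0, 1000.0), 3))  # 0.02
```

A real sensor model layers many more factors (wavelength, humidity, thermal contrast) on top of this, but the exponential falloff with range is the common core.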
First-person Helicopter Flight Simulator — VR-Engage
  • A high-fidelity vehicle physics engine provides the accurate vehicle motion needed.
  • Ground, rotary and fixed-wing vehicles, and the full library of friendly, hostile, and neutral DI-Guy characters.
  • Radio and voice communications over DIS and HLA using Link products.
  • Sensors, weapons, countermeasures, and behavior models for air-to-air, air-to-ground, on-the-ground, and person-to-person engagements.
  • Vehicle and person-specific interactions with the environment (open and close doors, move, destroy, and so on.)
  • Standard navigation displays and multi-function display (MFD) navigation chart.
  • Terrain agility. As with VR-Vantage IG and VR-Forces, you can use the terrain you have or take advantage of innovative streaming and procedural terrain techniques.
Instructor Operator Station — VR-Forces and VR-Engage

VR-Forces is a powerful, scalable, flexible, and easy-to-use computer generated forces (CGF) simulation system used as the basis of Threat Generators and Instructor Operator Stations (IOS). VR-Forces provides the flexibility to fit a large variety of architectures right out-of-the-box or to be completely customized to meet specific requirements.

VR-Engage lets the instructor choose when to play the role of a first person human character; a vehicle driver, gunner or commander; or the pilot of an airplane or helicopter. The instructor takes interactive control of the vehicle in real time, engaging with other entities using input devices. This adds the human touch for higher behavioral fidelity when needed.

Some of VR-Forces’ many capabilities:

  • Scenario Definition that enables instructors to create, execute, and distribute simulation scenarios. Using its intuitive interfaces, they can build scenarios that scale from just a few individuals in close quarters to large multi-echelon simulations covering the entire theater of operations. The user interface can be used as-is or customized for a training specific look and feel.
  • Simulating object behaviors and interactions with the environment and other simulation objects. These behaviors give VR-Forces simulation objects a level of autonomy to react to the rest of the simulation on their own. This saves you from having to script their behavior in detail.
  • It's easy to set up and preserve your workspace, override default behaviors, and modify simulation object models. Simulation objects have many parameters that affect their behavior.
  • Training Exercise Management, allowing the instructor to manipulate all entities in real-time while the training is ongoing.
  • Artificial Intelligence (AI) control, where entities are given tasks to execute missions, like attack plans that trigger when the ship reaches a certain waypoint. While on their mission, reactive tasks deal with contingencies and the CGF AI plays out the mission.
  • 2D & 3D Viewing Control that allows the instructor to switch and save their preferred points of view in real time.
  • High-fidelity weapon-interaction components can be plugged in for accurate real-time effects.
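The waypoint-triggered tasking mentioned under AI control can be sketched in a few lines. This is a hypothetical illustration of the idea, not VR-Forces' actual API:

```python
import math
from dataclasses import dataclass, field

@dataclass
class Entity:
    x: float
    y: float
    tasks: list = field(default_factory=list)

def reached(entity: Entity, waypoint: tuple, radius_m: float) -> bool:
    """True once the entity is within radius_m of the waypoint."""
    return math.hypot(entity.x - waypoint[0], entity.y - waypoint[1]) <= radius_m

def tick(entity: Entity, waypoint: tuple, radius_m: float, task: str) -> None:
    """Trigger the reactive task exactly once when the waypoint is reached."""
    if task not in entity.tasks and reached(entity, waypoint, radius_m):
        entity.tasks.append(task)

ship = Entity(0.0, 0.0)
for _ in range(5):           # ship steams east toward the waypoint at (100, 0)
    ship.x += 30.0
    tick(ship, (100.0, 0.0), 25.0, "attack")
print(ship.tasks)  # ['attack']
```

In the real system the triggered "task" would be a full mission plan (routes, rules of engagement, reactions), but the trigger-on-condition pattern is the same.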

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation.
Interested in a demonstration?

 

Product Legend

Air Mission Operations Training Center

Air Mission Operations Training Center

Air Mission Operations Training Centers are large systems focused on training aircraft pilots and the teams of people needed to conduct air missions.

To make the simulation environment valid for training, simulators are needed to fill the virtual world with mission support units, opposing forces & threats, and civilian patterns of life. Depending on the specifics of each training exercise, the fidelity of each simulation can range from completely autonomous computer generated forces, to desktop role player stations, to fully immersive training simulators.

Scroll down to watch a video on how VT MAK’s simulation technology fits into an air mission operations training center.
Click in the bottom corner of the video for volume control and full-screen viewing.

 

The MAK Advantage:

MAK technologies can be deployed in many places within an air mission operations training center.

VT MAK provides VR-Forces, a powerful and flexible computer generated forces simulation used to manage air, land, and sea missions, as well as civilian activity. It can be the ‘one CGF’ for all operational domains.

Desktop role players and targeted fidelity simulators are used where human players are needed to increase fidelity and represent tactically precise decision making and behavior.

Remote simulation centers connect over long-haul networks to participate when specific trials need the fidelity of those high-value simulation assets. MAK offers an interoperability solution that facilitates a common extensible simulation architecture based on international standards. VR-Link helps developers build DIS & HLA into their simulations. VR-Exchange connects simulations even when they use differing protocols. The MAK RTI provides the high-performance infrastructure for HLA networking.
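For a sense of what travels over such a network, here is a minimal sketch of the 12-byte header that prefixes every DIS PDU per IEEE 1278.1. This is generic wire-format packing for illustration, not VR-Link's API:

```python
import struct

def dis_pdu_header(pdu_type: int, length: int, timestamp: int = 0,
                   exercise_id: int = 1, version: int = 7) -> bytes:
    """Pack the 12-byte DIS PDU header (IEEE 1278.1) in network byte order:
    protocol version, exercise ID, PDU type, protocol family, timestamp,
    total PDU length in bytes, and 16 bits of padding."""
    ENTITY_INFORMATION_FAMILY = 1   # protocol family for Entity State (type 1)
    return struct.pack(">BBBBIHH", version, exercise_id, pdu_type,
                       ENTITY_INFORMATION_FAMILY, timestamp, length, 0)

# An Entity State PDU with no articulation parameters is 144 bytes total:
header = dis_pdu_header(pdu_type=1, length=144)
print(len(header))  # 12
```

Toolkits like VR-Link hide this byte-level detail behind object-oriented APIs, and a gateway such as VR-Exchange translates between this format and HLA object/interaction updates.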

Local simulators, based on MAK’s VR-Engage, take the place of remote simulations — when connecting to remote facilities is not needed. VR-Engage lets users play the role of a first person human character; a vehicle driver, gunner or commander; or the pilot of an airplane or helicopter.

VR-Engage can be used for role player stations. Or used as the basis for targeted fidelity or high-fidelity simulators.

MAK products are meant to be used in complex simulation environments — interoperating with simulations built by customers and other vendors. However, big efficiencies are gained by choosing MAK products as the core of your simulation environment.

 

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 

Product Legend

Flight Deck Training

Flight Deck Training

What’s at stake?

Working on the flight deck of an aircraft carrier is dangerous business. Especially if your job is in “the bucket” and you’re supposed to release the catapult to accelerate a jet down the deck to aid its takeoff. You need to set the tension on the catapult after you get the weight from the refueling crew, you need to get the thumbs up from the deck chief, and finally make sure that the pilot is focused on the runway and ready. If you let that catapult go too soon, it’s going to hurt – a lot.

The MAK Advantage

VT MAK has the tools you need to develop a training system to teach flight deck safety procedures. 

  • DI-Guy Scenario can create human character performances that model the activities on the flight deck.
  • With the DI-Guy SDK, you can integrate the performances into your image generator, or you can use VR-Vantage IG, which has DI-Guy built in. 
  • If you need special gestures to meet your training requirements, you can use the DI-Guy Motion Editor to customize the thousands of character appearances that come with all DI-Guy products. Or you can create characters from motion capture files.
  • If your training requires the detail of specific facial expressions, then DI-Guy Expressive Faces will plug right in and give you the control you need.
  • And if you’d like help pulling this all together, MAK is here to help, with renowned product support and custom services to ensure your training system is a success.

Product Legend

Mechanized Infantry Training

Mechanized Infantry Training

Complex training systems typically require similarly complex software and hardware configurations. MAK’s technology breaks that pattern by making it easy to put together amazing training systems with simple system architectures. In this case, four different training setups are brought to life at appropriate fidelities for each player, all in one streamlined package, to yield a comprehensive Mechanized Infantry Trainer.

The MAK Solution:

VR-Vantage provides trainees with high-detail role-specific visual scenes, including scenes with high-fidelity data overlays. VR-Vantage emulates exterior camera views and 2D maps for the driver and commander, and a scope for the gunner with accurate data displays. Instructors use VR-Vantage to observe the exercise from a third-person perspective and evaluate trainees. VR-Vantage can be customized to match performance and resolution needs, and is used on a range of hardware, from lightweight laptops to complex motion-platform simulators.

For instructors looking to control the simulation and incorporate computer-generated forces, VR-Forces is the perfect pairing for VR-Vantage. VR-Forces is a scalable simulation engine that allows instructors to populate the scene with friendly forces, hostile units, civilians, and obstacles. Instructors use VR-Forces to move units around the scene, setting up scenarios or altering a live situation in real time.

With MAK you have choices on how to create a host vehicle simulation. For ground vehicles we’ve found Vortex, by CM Labs, to be an excellent vehicle dynamics solution. Vortex's contact dynamics simulate all the moving parts of the vehicle including the interaction with the terrain, water, obstacles, vision systems, grasping, and more. Everything from suspension travel to traction and gearing is accounted for to provide the driver with an enriching, engaging training scenario. RT Dynamics provides the flight dynamics for air vehicles, increasing realism for all maneuvers with physics-based aircraft entities. It also adds new maneuvers such as formation flight, terrain following flight, and vertical take off/landing.

VR-TheWorld Server is a powerful web-based streaming terrain server that lets you stream in elevation, features, and imagery. It streams the terrain database to each station, giving users a synthetic environment within which to simulate.
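Streaming terrain of this kind is typically organized as a pyramid of map tiles. As a generic sketch of the lookup a client station might perform, assuming a standard Web Mercator z/x/y tiling scheme and a hypothetical server URL (VR-TheWorld's actual service interface may differ):

```python
import math

def tile_for(lat_deg: float, lon_deg: float, zoom: int) -> tuple:
    """Map a WGS84 coordinate to Web Mercator z/x/y tile indices."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return zoom, x, y

def tile_url(base: str, lat_deg: float, lon_deg: float, zoom: int) -> str:
    """Build the request URL a client station would stream this tile from."""
    z, x, y = tile_for(lat_deg, lon_deg, zoom)
    return f"{base}/{z}/{x}/{y}.png"

print(tile_for(0.0, 0.0, 1))  # (1, 1, 1): the tile just south-east of (0, 0)
```

Because every station derives the same tile indices from the same coordinates, all participants stream a consistent view of the shared synthetic environment.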

Both VR-Forces and VR-Vantage include MAK’s networking technology. VR-Link’s protocol independent API allows both applications to communicate through industry standard High Level Architecture (HLA) and the Distributed Interactive Simulation (DIS) protocols, including HLA 1.3, HLA 1516, HLA Evolved, DIS, and DIS 7. The MAK Data Logger records and plays back all the network simulation traffic for after action review and analysis. The MAK RTI (runtime infrastructure) is available when connecting to HLA federations using any of these SISO standard protocols: HLA 1.3, HLA 1516, and HLA Evolved.

Want to learn more? Have a look at the VR-Vantage page for more information. Interested in seeing a demonstration?

 

Product Legend

Close Air Support: JTAC Training

Close Air Support: JTAC Training

As part of a Tactical Air Control Party (TACP), only the Joint Terminal Air Controller (JTAC) is authorized to say CLEARED HOT on the radio and direct aircraft to deliver their ordnance on a target.

JTACs are relied on to direct and coordinate close air support missions, advise commanders on matters pertaining to air support, and observe and report the results of strikes. Their ability to communicate effectively with pilots, and coordinate accurate air strikes can play a huge role in the success of a mission.

Virtual training systems allow JTACs to practice identifying targets, calibrating their locations, requesting air support, and the highly-specialized procedures for communicating with pilots. 

Scroll down to watch a video on how VT MAK’s simulation technology comes together to make up a JTAC simulator.

The MAK Advantage:

The JTAC simulator in this use case takes advantage of simulations built on MAK’s core technologies.

The tight coupling of system components provides a rich simulation environment for each participant. The JTAC simulation is rendered in the dome using VR-Vantage; the flight simulation takes advantage of the VR-Forces first-person simulation engine; and the instructor/role player station uses VR-Forces CGF to populate the synthetic environment and control the training scenarios.

All these system components share a common terrain database and are connected together using VR-Link and the MAK RTI, giving the system integrator the ability to deploy reliably and cost effectively while leaving open the opportunity to expand the system to add bigger and more complex networks of live, virtual and/or constructive simulations.

Choosing MAK for your simulation infrastructure gives you state-of-the-art technology and the renowned ‘engineer down the hall’ technical support that has been the foundation of MAK’s culture since its beginnings.

Capabilities the core technologies bring to the simulators:

JTAC Dome — Built with VR-Vantage
  • Game/Simulator Quality Graphics and Rendering Techniques

    VR-Vantage uses the most modern image rendering and shader techniques to take advantage of the increasing power of NVIDIA graphics cards. VT MAK's Image Generator has real-time visual effects to rival any modern IG or game engine.

  • Multi-Channel Rendering

    Support for multi-channel rendering is built in. Depending on system design choices for performance and number of computers deployed, VR-Vantage can render multiple channels from a single graphics processor (GPU) or can render channels on separate computers using Remote Display Engines attached to a master IG channel.

  • 3D Content to Represent Players and Interactions

    VR-Vantage is loaded with content including 3D models of all vehicle types, human characters, weapon systems, and destroyable buildings. Visual effects are provided for weapons engagements including particle systems for signal smoke, weapon fire, detonations, fire, and smoke.

  • Terrain Agility

    All MAK’s simulation and visualization products are designed to be terrain agile, that means that they can support most of the terrain strategies commonly used in the modeling, simulation & training industry. Look here for technical details and a list of the formats supported.

  • Environmental Modeling

    VR-Vantage can render scenes of the terrain and environment with the realism of proper lighting — day or night, the effects of illuminated light sources and shadows, atmospheric and water effects including multiple cloud layers effects and dynamic oceans, trees and grass that move naturally with the wind.

  • Sensor Modeling

    VR-Vantage can render scenes in all wavelengths: night vision, infrared, and visible (as needed on a JTAC’s dome display). Sensor zooming, depth of field effects, and reticle overlays model the use of binoculars and laser range finders.

Flight Simulator — Built with VR-Forces & VR-Vantage
  • Flight Dynamics

    High-fidelity physics-based aerodynamics model for accurate flight controls using game or professional level hands on throttle and stick controls (HOTAS).

  • Air to Ground Engagements

    Sensors: a targeting pod (IR camera with gimbal and overlay) and SAR request/response (requires RadarFX Server). Weapons: missiles, guns, and bombs.

  • Navigation

    Standard six-pack navigation displays and multi-function display (MFD) navigation chart.

  • Image Generator

    All the same VR-Vantage based IG capabilities in a flight simulator/roleplayer station as in the JTAC’s dome display. The flexibility to configure as needed: Single screen (OTW + controls + HUD), Dual screen (OTW + HUD, controls), Multi Screen OTW (using remote display engines).

  • Integration with IOS & JTAC

    The flight simulator is integrated with the VR-Forces-based IOS so the instructor can initialize the combat air patrol (CAP) mission appropriately in preparation for the close air support (CAS) mission called by the JTAC. All flights are captured by the MAK Data Logger for after action review (AAR) analysis and debriefing. Radios are provided that communicate over the DIS or HLA simulation infrastructure and are recorded by the MAK Data Logger for AAR.

Instructor Operator Station — Built with VR-Forces

VR-Forces is a powerful, scalable, flexible, and easy-to-use computer generated forces (CGF) simulation system used as the basis of Threat Generators and Instructor Operator Stations (IOS).

  • Scenario Definition

    VR-Forces comes with a rich set of capabilities that enable instructors to create, execute, and distribute simulation scenarios. Using its intuitive interfaces, they can build scenarios that scale from just a few individuals in close quarters to large multi-echelon simulations covering the entire theater of operations. The user interface can be used as-is or customized for a training specific look and feel.

  • Training Exercise Management

    All of the entities defined by a VR-Forces scenario can be interactively manipulated in real-time while the training is ongoing. Instructors can choose from:

    Direct control, where new entities can be created on the fly or existing entities can be moved into position, their status, rules of engagement, or tasking changed on a whim. Some call the instructor using this method a “puckster”.

    Artificial Intelligence (AI) control, where entities are given tasks to execute missions, like close air support (CAS), suppressive fire, or attack with guns. While on their mission, reactive tasks deal with contingencies and the CGF AI plays out the mission. In games, these are sometimes called “non-player characters”.

    First person control, where the instructor takes interactive control of a vehicle or human character and moves it around and engages with other entities using input devices.

  • 2D & 3D Viewing Control

    When creating training scenarios, the VR-Forces GUI allows instructors to quickly switch between 2D and 3D views.

    The 2D view provides a dynamic map display of the simulated world and is the most productive for laying down entities and tactical graphics that help to control the AI of those entities.

    The 3D views provide an intuitive, immersive, situational awareness and allow precise placement of simulation objects on the terrain. Users can quickly and easily switch between display modes or open a secondary window and use a different mode in each one.

Want to learn about trade-offs in fidelity? See The Tech-Savvy Guide to Virtual Simulation

Interested in seeing a demonstration?

 

Product Legend

Air Traffic Simulation

Bringing Advanced Traffic Simulation and Visualization Technologies to Help Design and Build the Next-Generation ATM System

The U.S. Federal Aviation Administration (FAA) is embarking on an ambitious project to upgrade the nation’s air traffic management (ATM) systems. Each team under the FAA Systems Engineering 2020 (SE2020) IDIQ will need an integrated, distributed modeling and synthetic simulation environment to try out their new systems and concepts. Aircraft manufacturers will need to upgrade their aircraft engineering simulators to model the new ATM environment in order to test their equipment in realistic synthetic environments.

The MAK Advantage:

MAK saves our customers time, effort, cost, and risk by applying our distributed simulation expertise and suite of tools to ATM simulation. We provide cost-effective, open solutions to meet ATM visualization and simulation requirements across a breadth of application areas:

  • Concept Exploration and Validation
  • Control Tower Training
  • Man-Machine Interface Research
  • System Design (by simulating detailed operation)
  • Technology Performance Assessment

VT MAK is in a unique position to help your company in this process. Many players in the aviation space already use MAK products for simulation, visualization, and interoperability. For the ATM community we can offer:

  • Simulation Integration and Interoperability. The US has standardized on the High Level Architecture (HLA) protocol for interoperability and the MAK RTI is already in use as part of the AviationSimNet. We can provide general distributed simulation interoperability services, including streaming terrain (VR-TheWorld Server, GIS Enabled Modeling & Simulation), gateways to operational systems (VR-Exchange), simulation debugging and monitoring (HLA Traffic Analyzer).
  • Visualization. MAK’s visual simulation solution, VR-Vantage IG, can be used to create 3D representations of the airspace from the big picture to an individual airport. VR-TheWorld can provide the central terrain storage.
  • Simulation. VR-Forces is an easy-to-use CGF used to develop specific models for non-commercial aviation entities such as UAVs, fighter aircraft, rogue aircraft, people on the ground, ground vehicles, etc. In addition, as an open toolkit, VR-Forces may well be preferred in many labs over the closed ATC simulators.

Product Legend

Drone Operations

Drone Operations

 

What’s at Stake?

You are tasked with training a team of sensor payload operators to use UAVs for urban reconnaissance missions in a specific city. Upon completion of training, trainees must be able to comb an area for a target, make a positive identification, monitor behavior and interactions, radio in an airstrike, and then report on the outcome.

An ineffective training environment could lead to additional costs, lost targets, and inefficient surveillance systems. Training with a robust solution strengthens homeland security personnel for a minimal product investment.

What Are We Building?

As the instructor, you need to mock up a ground control station with accurate pilot/payload operator role definitions and supply that system with surveillance data from a content-rich simulation environment. You need to construct a scene that is informative, while providing trainees with opportunities to develop their instincts and test their operating procedures based on how the scenario unfolds.

Each UAV must be equipped with an electro-optical camera as well as an infrared sensor mounted to a gimbal. Radio communication between the UAV operators and a central command center must be available to coordinate surveillance and call in airstrikes.
Trainees need to experience the scenario through the electro-optical sensor and infrared sensor with rich, accurate data overlays to provide them with the information they need to communicate positioning and targeting effectively.
Your urban environment requires crowds of people who behave in realistic ways and traverse the city in intelligent paths. When a UAV operator spots someone, they need to be able to lock onto them when they are in motion to mimic algorithmic tracking tools.
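The lock-on requirement is, at its core, a pointing problem: every frame, recompute the pan and tilt that keep the moving target centered in the sensor. A minimal sketch of that math (generic geometry for illustration, not VR-Forces' tracking implementation):

```python
import math

def gimbal_angles(uav: tuple, target: tuple) -> tuple:
    """Pan (azimuth from north, degrees) and tilt (degrees below horizontal)
    that center the sensor on the target. Positions are (east, north, up) in metres."""
    de, dn, du = (target[i] - uav[i] for i in range(3))
    pan = math.degrees(math.atan2(de, dn)) % 360.0
    tilt = math.degrees(math.atan2(-du, math.hypot(de, dn)))
    return pan, tilt

# UAV at 1000 m altitude; target on the ground 1000 m due north:
pan, tilt = gimbal_angles((0.0, 0.0, 1000.0), (0.0, 1000.0, 0.0))
print(round(pan, 1), round(tilt, 1))  # 0.0 45.0
```

A real tracker also clamps these commands to the gimbal's slew-rate limits and mechanical stops, which is why those parameters matter when defining the sensor array.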
 

The simulation needs to be adjustable in real time so that the instructor can minimize repeat behaviors and walk the team through different scenarios. Instructors also must be able to judge the effectiveness of a trainee’s technique.

The MAK Advantage:

In this particular case, VR-Forces provides all the software you need to bring your environment to life. 
Watch this MAKtv episode to see how easy it is to set up.


Sensor modeling is a point of strength for VR-Forces. Give your trainees a beautiful, detailed point of view of the scene through the electro-optical sensor, and provide a high-fidelity infrared sensor display when the daylight fades. VR-Forces adds accurate data overlays so that trainees can learn to quickly and accurately read and report based on that information. Instructors can visualize 3D volumetric view frustums to assess trainees’ combing strategies, identify gaps in coverage, and engineer surveillance systems. We model sensor tracking to lock onto targets whether they are moving or fixed in one location.
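The frustum-coverage idea can be illustrated with a simplified 2D check: a point on the ground is covered if at least one sensor sees it within its field of view and range. This is an illustrative sketch, not VR-Forces' actual frustum model:

```python
import math

def in_fov(sensor_xy, boresight_deg, fov_deg, range_m, point_xy) -> bool:
    """2D check that a ground point lies inside a sensor's angular field of
    view and range -- a simplified stand-in for a full 3D frustum test."""
    dx = point_xy[0] - sensor_xy[0]
    dy = point_xy[1] - sensor_xy[1]
    if math.hypot(dx, dy) > range_m:
        return False
    bearing = math.degrees(math.atan2(dx, dy)) % 360.0
    off_axis = (bearing - boresight_deg + 180.0) % 360.0 - 180.0
    return abs(off_axis) <= fov_deg / 2.0

def covered(sensors, point_xy) -> bool:
    """A coverage gap exists wherever no sensor sees the point."""
    return any(in_fov(*s, point_xy) for s in sensors)

sensors = [((0.0, 0.0), 0.0, 60.0, 500.0),     # looking north, 60 deg FOV
           ((0.0, 0.0), 180.0, 60.0, 500.0)]   # looking south
print(covered(sensors, (0.0, 400.0)))   # True: inside the north wedge
print(covered(sensors, (400.0, 0.0)))   # False: due east is a coverage gap
```

Sampling many ground points this way is one simple route to the kind of gap analysis instructors use to critique a combing strategy.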

VR-Forces is an ideal tool for scenario development. It can model UAVs in fine detail, while allowing instructors to customize those entities based on the scope of a mission. It’s simple to add the gimbal-mounted sensor array we need for this scenario and define its parameters, including zoom, zoom speed, slew rate, and gimbal stops. Easily populate an urban environment with people by using the group objects function to add crowds of entities at a time. VR-Forces has features from Autodesk’s Gameware built in, enabling Pattern of Life intelligent flows of people and vehicles, in addition to plotting the locations and tasks of individual entities. Pattern of Life lets you manipulate patterns within the scenario, including realistic background traffic, whether on foot, on the road, or in the air. Certain DI-Guy capabilities have been integrated into VR-Forces, meaning behavior modeling is more authentic, thanks to motion capture technology. Now you can train your team to look out for certain suspicious movements and calibrate their responses based on the actions of the target.

What really makes VR-Forces perfect for training is the ability of instructors to manipulate the scenario in real time. You can keep your trainees from running scenarios that are too predictable by having your target enter buildings, change his mode of transportation, or actively attempt to avoid detection, all during live action.

Interested in Learning More? Have a look at the VR-Forces page for more information.  

Can we interest you in a personal demonstration?

 

Product Legend

Developing Air and Ground Traffic Policy

Developing Air and Ground Traffic Policy in a World Increasingly Populated by UAS

As UAS technologies become more accessible, an increase in air traffic, particularly around urban centers, is inevitable. It will be essential for governments and their agencies to develop policies with regard to air traffic and its relationship with ground traffic, specifically for low-flying UASs, and particularly in emergency situations. Well-developed traffic management will maximize safe traffic speed in regular conditions and divert flows efficiently in emergency scenarios when first responders are rushing to a scene. Poor planning may result in economic and human loss. Simulation is an ideal space to test current traffic policies under changing conditions and to research and develop new solutions.

Governments and agencies need a tool that can depict an area modeled after their own and simulate air traffic within it. The tool should be capable of depicting specific types of air traffic, including planes, helicopters, and UASs, as well as airspace demarcation. There needs to be a concurrent display of ground traffic, including pedestrians, bicyclists, and vehicles - particularly around the scene of an incident. Policymakers want to be able to visualize traffic flows and craft response strategies for general and specific situations.

 

The MAK Advantage:

VT MAK offers commercial-off-the-shelf (COTS) technology to construct airspace simulations, backed by a company with an “engineer down the hall” philosophy to help organizations select and implement the most effective solution.

 

VR-Forces provides a scalable computer-generated forces simulation engine capable of populating an environment with air and ground traffic, as well as infrastructure specific to traffic systems. There is plenty of out-of-the-box content of all shapes and sizes, from sUAS up to 747s in the air, and everything from human characters and bicyclists to fire trucks on the ground. If an out-of-the-box model needs to be modified to match local specifications, or if an agency wants to create their own from scratch, MAK’s open-source API allows for full customization of entity appearance and performance.

 

VR-Forces depicts volumetric airspace regulations, giving policymakers a three-dimensional perspective of air corridors and restricted spaces as they swell and shrink. Crucially, volumetric airspace restrictions can be assigned to impact air and ground traffic systems accordingly. For example, if there was an auto accident, set policies could dictate an air restriction in the area up to a certain height to provide space for UAS emergency response and redirect UAS traffic for as long as necessary. At the same time, traffic on the ground within a particular radius may have its speed reduced, or lanes may be opened specifically for first responders to access the scene more readily.
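The height-limited restriction in this example is essentially a cylinder test: a position violates the restriction if it falls inside the circle on the ground and below the ceiling. A minimal sketch with hypothetical names and generic geometry:

```python
import math
from dataclasses import dataclass

@dataclass
class CylinderRestriction:
    """Temporary flight restriction over an incident: a circle on the
    ground extruded up to a ceiling altitude."""
    center: tuple      # (x, y) in metres
    radius: float      # metres
    ceiling: float     # metres above ground

    def contains(self, pos: tuple) -> bool:
        """True if an (x, y, altitude) position is inside the restriction."""
        x, y, alt = pos
        dx, dy = x - self.center[0], y - self.center[1]
        return alt <= self.ceiling and math.hypot(dx, dy) <= self.radius

tfr = CylinderRestriction(center=(0.0, 0.0), radius=500.0, ceiling=120.0)
print(tfr.contains((100.0, 200.0, 60.0)))    # True: reroute this UAS
print(tfr.contains((100.0, 200.0, 300.0)))   # False: above the ceiling
```

Testing each simulated aircraft against the active restriction volumes each tick is one straightforward way such a policy rule could drive rerouting in a traffic simulation.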

 

Policymakers can calibrate the size and rules applied to air corridors and measure the impact of these changes on the traffic patterns of the city. VR-Forces is capable of depicting traffic density as it shifts with new incidents, even generating color-coded density maps to better visualize areas of congestion in the air and on the ground.

 

VR-TheWorld allows policymakers to test these impacts, through a web-based interface, inside any city for which they have terrain data, creating a realistic testing lab for research and development projects.

 

Want to learn more? Have a look at the VR-Forces page for more information. Interested in seeing a demonstration?

 

 

Product Legend

Incident Management

 


Using the Power of Modeling & Simulation for First Responder Training, Emergency Response Preparedness, and Critical Infrastructure Protection

The homeland security, emergency response, and public safety communities face challenges similar to those of the military domain: they need to plan and train. But large-scale live simulations are simply too disruptive to be conducted with regularity. Catastrophic emergencies require coordination of local and state public safety personnel, emergency management personnel, the National Guard, and possibly the regular military. Interoperability is a major problem.

On a basic level, simulations require generic urban terrains with multi-story interior models, transportation infrastructure such as subways and airports, and the ability to simulate crowd behaviors and traffic. They may require terrains for specific urban areas or transportation infrastructure. Given the role of ubiquitous communications in the public sector, the ability to simulate communications networks (land-line, cell, data) and disruptions in them may also be important. For specialized emergency response training, the ability to simulate chemical, biological, and radiological dispersion may also be necessary.

The need for simulation and training in this domain is self-evident, but the budgetary constraints are daunting for many agencies. The cost-effective solutions that VT MAK has developed for the defense community can provide immediate benefits to homeland security, emergency response, and public safety agencies.

  • First Responder Training
  • Emergency Response Planning
  • Perimeter Monitoring/Security
  • Human Behavior Studies

 

The MAK Advantage:

MAK can help you use simulation systems to keep your homeland secure. Here's how:

  • With VR-Link, VR-Exchange, and the MAK RTI: Link simulation components into simulation systems, or connect systems into world-wide interoperable distributed simulation networks.

  • With VR-Forces: Build and populate 3D simulation environments (a.k.a. virtual worlds), from vehicle or building interiors to urban terrain areas, to the whole planet. Then simulate the mobility, dynamics, and behavior of people and vehicles, from individual role players to large-scale simulations involving tens of thousands of entities.

  • With VR-Vantage IG: Visualize the simulation to understand analytical results or participate in immersive experiences.

 

Components – Powered by MAK software

These are a few of the system components that our customers use to build their simulation systems. For each component, we show one or more ways it can be designed to fit into different system designs, and we identify where MAK products benefit each design.

Product Legend

Instructor Operator Stations


Where does the Instructor Operator Station fit within the system architecture?

Training events are becoming larger and more widely distributed across networked environments, yet staffing for these exercises is often static or even decreasing. Instructors and operators therefore need IOS systems to help manage their tasks: designing scenarios, running exercises, providing real-time guidance and feedback, and conducting after action review (AAR).

Instructor Operator Stations (IOS) provide a central location from which instructors and operators can manage training simulations. An effective IOS enables seamless control of exercise start-up, execution, and After Action Review (AAR) across distributed systems. It automates many setup and execution tasks, and provides interfaces tailored to the simulation domain for tasks that are done manually.

How does MAK software fit within the Instructor Operator Station?

MAK has proven technologies that allow us to build and customize an IOS to meet your training system requirements.

  • Simulation Control Interface – Instructors can create and modify training scenarios. Execution of the scenarios may be distributed across one or more remote systems. The instructor or operator can dynamically inject events into a scenario to stimulate trainee responses, or otherwise guide a trainee’s actions during a training exercise. Core technology: VR-Forces Graphical User Interface
  • Situational Awareness – The MAK IOS includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big picture battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, speed. Core technology: VR-Vantage.
  • Analysis & After Action Review – The MAK IOS supports pre-mission briefing and AAR / debriefing. It can record exercises and play them back. The instructor can annotate key events in real-time or post exercise, assess trainee performance, and generate debrief presentations and reports. The logged data can be exported to a variety of databases and analysis tools for data mining and performance assessment. Core technology: MAK Data Logger
  • Open Standards Compliance – The MAK IOS supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Core technology: VR-Link networking toolkit.
  • Simulated Voice Radios – Optionally includes services to communicate with trainees using real or simulated radios, VOIP, or text chat, as appropriate for the training environment.
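In practice, toolkits such as VR-Link hide the wire format of these open standards, but it helps to see what DIS traffic looks like on the network. The sketch below packs and unpacks the standard 12-byte DIS PDU header defined by IEEE 1278.1; it is an illustration of the wire format, not MAK library code.

```python
import struct

# DIS PDU header layout (IEEE 1278.1), big-endian network order:
# protocol version, exercise ID, PDU type, protocol family (one byte each),
# timestamp (uint32), PDU length (uint16), padding (uint16).
HEADER = struct.Struct(">BBBBIHH")

ENTITY_STATE = 1        # PDU type code for Entity State
ENTITY_INFO_FAMILY = 1  # protocol family: Entity Information/Interaction

def pack_header(exercise_id, timestamp, pdu_length, version=7):
    """Build an Entity State PDU header (version 7 = IEEE 1278.1-2012)."""
    return HEADER.pack(version, exercise_id, ENTITY_STATE,
                       ENTITY_INFO_FAMILY, timestamp, pdu_length, 0)

raw = pack_header(exercise_id=1, timestamp=123456, pdu_length=144)
version, ex_id, pdu_type, family, ts, length, _pad = HEADER.unpack(raw)
print(pdu_type, length)  # 1 144
```

The entity state body (entity ID, position, orientation, velocity, and so on) follows this header in a real PDU; interoperability toolkits generate and parse the whole structure for you.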

Product Legend

Scenario/Threat-Generation Stations


Where does the Scenario/Threat Generator fit within the system architecture?

Your job is to place a trainee or analyst in a realistic virtual environment in which they can train or experiment. It could be a hardware simulator, a battle lab, or even the actual equipment, configured for simulated input. Then you need to stimulate that environment with realistic events for the trainee to respond to. The stimulation usually comes from a scenario generator, also known as a threat generator. A scenario generator typically simulates the opposing force entities and complementary friendly force entities that the trainees need to interact with.

Trends

A scenario generator should allow training staff to quickly and easily design and develop scenarios that place trainees in a realistic situation. The application should use the proper terminology and concepts for the trainees’ knowledge domain. It should be flexible enough to handle the entire spectrum of simulation needs. The entities simulated by the scenario generator should be able to operate with enough autonomy that once the simulation starts they do not need constant attention by an instructor / operator, but could be managed dynamically if necessary.

In addition to its basic capabilities, a scenario generator needs to be able to communicate with the simulator and other exercise participants using standard simulation protocols. It needs to be able to load the terrain databases and entity models that you need without forcing you to use some narrowly defined or proprietary set of formats. Finally, a scenario generator needs to work well with the visualization and data logging tools that are often used in simulation settings.

How does MAK software fit within the Scenario/Threat generator?

MAK will work with you so that you have the Scenario Generator you need: a powerful and flexible simulation component for generating and executing battlefield scenarios, customized to meet the particular needs of your simulation domain. For example, higher-fidelity models for a particular platform can be added, new tasks can be implemented, or the graphical user interface can be customized to create the correct level of realism. Features include:

  • Scenario development – Staff can rapidly create and modify scenarios. Entities can be controlled directly as the scenario runs. Core Technology: VR-Forces
  • Situational Awareness – The visualization system includes a 2D tactical map display, a realistic 3D view, and an eXaggerated Reality (XR) 3D view. All views support scenario creation and mission planning. The 3D view provides situational awareness and an immersive experience. The 2D and XR views provide the big-picture battlefield-level view and allow the instructor to monitor overall performance during the exercise. To further the instructor’s understanding of the exercise, the displays include tactical graphics such as points and roads, entity effects such as trajectory histories and attacker-target lines, and entity details such as name, heading, and speed. Core technology: VR-Vantage.
  • Network modeling – The lab can simulate communications networks and the difficulties of real-world communications. Core technologies: Qualnet eXata and AGI SMART.
  • Correlated Terrain – VT MAK’s approach to terrain, terrain agility, ensures that you can use the terrain formats you need, when you need them. We can also help you develop custom terrains and can integrate them with correlated terrain solutions to ensure interoperability with other exercise participants. Core technologies: VR-TheWorld Server, VR-inTerra.
  • Sensor modeling – The visualization component can model visuals through different sensor spectrums, such as infrared and night vision. Core technology: JRM SensorFX.
  • Open Standards Compliance – VR-Forces supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.

Product Legend

Virtual Simulators


Where do Virtual Simulators fit within the system architecture?

Virtual simulators are used for many different roles within Training, Experimentation, and R&D systems. 

  • Trainees – The primary interface for training vehicle and weapons operations, techniques, tactics, and procedures is often a Virtual Simulator. Pilots use flight simulators, ground forces use driving and gunnery trainers, soldiers use first-person shooter simulators, and so on.
  • Test Subjects – Virtual Simulators are used to test and evaluate virtual prototypes, to study system designs, or to analyze operator behavior.
  • Role Players – Scenario Generators, Computer Generated Forces, Threat Generators, or any other form of artificial intelligence (AI) can add entities to bring the simulated environment to life. But, as good as AI can be, some exercises need real people to control supporting entities to make the training or analysis accurate. In these cases, Virtual Simulators can be used to add entities into the scenarios.

In each case, the Virtual Simulator is connected to other simulators, instructor operator stations, and analysis tools using a distributed simulation network. Collectively, these systems present a rich synthetic environment for the benefit of the trainee or researcher.

The fidelity of Virtual Simulators can vary widely to support the objectives within available budgets. The Tech Savvy Guide to Virtual Simulation goes into great detail about the fidelity of Virtual Simulators. Download the Tech Savvy Guide here.


How does MAK software fit within a Virtual Simulator?

  • Multi-role Virtual Simulator – MAK's VR-Engage is a great place to start. We've done the work of integrating high-fidelity vehicle physics, sensors, weapons, first-person controls, and high performance game-quality graphics. VR-Engage lets users play the role of a first person human character; a vehicle driver, gunner or commander; or the pilot of an airplane or helicopter. Use it as is, or customize it to the specifications of your training or experimentation. As with VR-Vantage and VR-Forces, VR-Engage is terrain agile so you can use the terrain you have or take advantage of innovative streaming and procedural terrain techniques.
  • VR-Engage can run standalone, without requiring any other MAK products, and is fully interoperable with third-party CGFs and other simulators through open standards. But many additional benefits apply when VR-Engage is used together with MAK’s VR-Forces: immediate sharing and reuse of existing terrain, models, configurations, and other content across VR-Forces, VR-Vantage, and VR-Engage, with full correlation; unified scenario authoring and management; and run-time switching between player control and AI control of an entity.
  • Visual System – If you have your own vehicle simulation and need immersive 3D graphics, then VR-Vantage IG is the tool of choice for high-performance visual scenes for out-the-window visuals, sensor channels, and simulated UAS video feeds. VR-Vantage can be controlled through the Computer Image Generator Interface (CIGI) standard as well as the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.
  • SAR Simulation – If your simulator is all set but you need to simulate and send Synthetic Aperture Radar images to the cockpit, RadarFX SAR Server can generate realistic SAR and ISAR scenes and send them over the network for you to integrate into your cockpit displays.
  • Network Interoperability – Developers who build the Virtual Simulator from scratch can take advantage of VR‑Link and the MAK RTI to connect the simulation to the network for interoperation with other simulation applications using the High Level Architecture (HLA), and Distributed Interactive Simulation (DIS) protocols.

 

 

Product Legend

Image Generators


Where does an Image Generator fit within the system architecture?

Image generators provide visual scenes of the simulation environment from the perspective of the participants. These can be displayed on hardware as simple as a desktop monitor or as complex as a multiple-projector dome display. The scenes can be rendered in the visible spectrum for "out-the-window" views or in other wavelengths to simulate optical sensors. In any case, the image generator must render scenes quickly enough to maintain a realistic sense of motion.

The Image Generation section of The Tech Savvy Guide to Virtual Simulation goes into great detail about the Image Generator and other components of a simulation system. Download the Tech Savvy Guide here.


How does MAK software fit within an Image Generator?

VR-Vantage is the core of MAK's image generation solution. It uses best-of-breed technologies to tailor high-quality visual scenes of the synthetic environment to your specific simulation needs. VR-Vantage IG can be used as a standalone image generator, connected to a host simulation via the CIGI, DIS, or HLA protocols. VR-Vantage is also the rendering engine for all of MAK's graphical applications, including VR-Engage (multi-role virtual simulator), VR-Forces (computer generated forces), and VR-Vantage Stealth (battlefield visualization).

  • Multi-channel distributed rendering – VR-Vantage IG takes advantage of the power of the latest NVIDIA graphics processing units (GPUs) to render to one or more displays per computer. When very large fields of view are needed, VR-Vantage can be set up to distribute the rendering task across multiple computers, each running the VR-Vantage Display Engine. When tight frame synchronization is needed between channels, VR-Vantage can support NVIDIA's professional graphics cards that perform hardware synchronization to an external synchronization source (also known as G-Sync).
  • Distortion Correction – When rendering to complex surface shapes, like those used to surround a cockpit with "out-the-window" video, VR-Vantage IG includes a plugin made by Scalable Display Technologies that warps the image to match the shape of the display surface.
  • High Fidelity Sensor Visualization – VR-Vantage IG can, optionally, be configured with SensorFX to convert the visual scene into an accurately rendered sensor scene to simulate night vision goggles, infra-red, and other optical wavelengths. 
  • Correlated Terrain – VT MAK’s approach to terrain – terrain agility – ensures that you can use the terrain formats you need, when you need them. VR-Vantage, VR-Engage, and VR-Forces all support many of the terrain technologies used throughout the industry. We can also help you develop custom terrains and integrate them with correlated terrain solutions to ensure interoperability with other exercise participants.
  • Open Standards Compliance – VR-Vantage supports the Computer Image Generator Interface (CIGI) standard as well as the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols.

Product Legend

Unmanned Vehicle System (UVS) Simulation


Where does Unmanned Vehicle System (UVS) Simulation fit within the system architecture?

Since unmanned vehicles are by their nature controlled remotely, simulation systems can be designed and built to closely model the actual control stations used in live operation. An Unmanned Vehicle System (UVS) simulation, also known as a Remotely Piloted Vehicle (RPV) simulation, can run as a stand-alone system or be integrated with or embedded into a UVS simulator or ground control station.

Because unmanned vehicle systems have become more and more prevalent in all aspects of defense and homeland security, there is an increasing need for simulations to support every phase of the development life cycle including:  

  • Demonstration – Visualization of new designs helps to confirm their value. UVS systems help to demonstrate new vehicle designs or concepts within a synthetic environment.
  • Experimentation – Simulations help prove and refine new concepts or Tactics, Techniques, and Procedures (TTPs). Simulated UVSs are used in complex scenarios as part of realistic simulations that are linked to real systems, hardware, and other human-in-the-loop simulators.
  • Research & Development – Unmanned Vehicle System simulations can be used to test guidance, navigation, and control functions of a new or modified UVS without the risk of harming people or property that is inherent in live testing. UVS simulations can provide realistic avionics models, sensor models, and visuals, and emulate real-world controls and communication systems. 
  • Training – Simulations allow pilots, sensor/payload operators, mission commanders, and visual intelligence analysts to train, practice, and analyze decision-making and communication processes.

How does MAK software fit within Unmanned Vehicle System (UVS) Simulation?

  • Control Station – VR-Forces can form the basis of the UVS control station by providing 2D tactical map displays for operator planning and simulated remote operation of the vehicle. 
  • Vehicle Simulation – VR-Forces can also be the simulation platform for the unmanned vehicle, either by treating the vehicle as a standard Computer Generated Forces (CGF) entity or by plugging in third-party modules with higher-fidelity vehicle dynamics models. Developers can also extend VR-Forces to use their own vehicle dynamics and control system models.
  • Scenario Generation – Regardless of how the unmanned vehicle is simulated, VR-Forces can provide a powerful and easy to use scenario generation capability to create operationally relevant scenarios.
  • Realistic Visuals – VR-Vantage IG and SensorFX provide realistic and accurate 3D perspective scenes to model electro-optical (EO), night vision (NV), or infrared (IR) sensors.
  • Networking – Developers who build the UAV simulation from scratch can take advantage of VR-Link and the MAK RTI to connect the simulation to the network for interoperation with other simulation applications using the High Level Architecture (HLA), and Distributed Interactive Simulation (DIS) protocols. 
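The plug-in seam for vehicle dynamics described above can be sketched as a simple interface: a default low-fidelity model that can be swapped for a higher-fidelity third-party one. All names here (VehicleDynamics, step) are hypothetical illustrations, not the VR-Forces API.

```python
from abc import ABC, abstractmethod

class VehicleDynamics(ABC):
    """Hypothetical plug-in interface for a vehicle dynamics model."""
    @abstractmethod
    def step(self, state: dict, controls: dict, dt: float) -> dict:
        """Advance the vehicle state by dt seconds under the given controls."""

class PointMassDynamics(VehicleDynamics):
    """Low-fidelity default: velocity is commanded directly."""
    def step(self, state, controls, dt):
        return {
            "x": state["x"] + controls["vx"] * dt,
            "y": state["y"] + controls["vy"] * dt,
        }

# A higher-fidelity third-party model would subclass VehicleDynamics
# and be registered in place of the default.
model = PointMassDynamics()
state = model.step({"x": 0.0, "y": 0.0}, {"vx": 10.0, "vy": 0.0}, dt=0.5)
print(state["x"])  # 5.0
```

The benefit of this shape is that the scenario engine, control station, and visuals never need to know which dynamics model is driving a given vehicle.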

Product Legend

After Action Review / Debrief Systems


Where does the AAR/Debrief component fit within the system architecture?

After Action Review (AAR) systems provide feedback on mission and task performance during training exercises. AAR is a critical part of the training process – it is a structured review, or debrief, for analyzing what happened during an exercise and why it happened. By comparing the actual events with the expected outcomes, instructors and students can identify strengths and weaknesses and decide how to improve performance.

As training exercises become larger, the need grows for automated tools that capture the actions of all training participants, evaluate performance against standard metrics, and provide the information necessary to support a structured debrief of the training event. These tools should be flexible enough to support chronological reviews of an entire exercise or tightly focused reviews highlighting a few key issues.

How does MAK software fit within the AAR/Debrief component?

  • Simulation Recording and Playback – Simulation data can be recorded and played back using the MAK Data Logger. This includes simulated entities and interactions, audio communications, video streams, and other events injected into the simulation. The Data Logger user can pause, slow down, or speed up the playback, and can jump to specific moments during the exercise via a simple, easy-to-use, DVR-like control panel.

  • Annotation – The embedded annotation system enables instructors to bookmark key events during the action and then jump to those events during playback to review specific actions.
  • 2D map and 3D out-the-window visualization – VR-Vantage Stealth provides 2D tactical map displays for situational awareness, and 3D 'god's-eye views' create immersive scenes of the simulated environment from the perspective of any trainee. An exaggerated reality mode combines the best of 2D and 3D techniques into a compelling visual analysis of the training exercise.
  • Analytical Information – Users can track a specific trainee or groups of trainees and view the scene from various perspectives. Informational overlays including trajectory histories, unit status, attacker-target lines, and other tactical graphics further aid in understanding the training scenario as it plays out.
  • Data Mining - The recorded training events can be exported to standard database and spreadsheet applications, which can be used for data mining, in-depth analysis, and charting or graphing of specific datasets.
  • Open Standards Compliance – MAK software applications are all built with the VR-Link networking toolkit, which supports the High Level Architecture (HLA) and Distributed Interactive Simulation (DIS) protocols. Developers can use or extend VR-Link to present and capture simulation-specific information.

Product Legend

Terrain Data Server

 


Where does the Terrain Data Server component fit within the system architecture?

Distributed simulation designs are constructed by combining loosely coupled and interoperable components. However, while the simulation applications are distributed, terrain data usually is not. Each simulation application usually accesses terrain data stored on the local machine. This approach limits terrain interoperability and reuse. It constrains the amount of data that can be accessed to the amount that can be stored locally and necessitates redundant copying of data from one machine to another. Terrain servers solve this problem by storing large amounts of terrain data and supplying it over the network to client applications.    

In addition to the problem of terrain availability, simulations that aspire to high levels of realism must deal with the problem of static terrain. The world is not static. Actions of simulated objects can affect aspects of the environment and these changes need to be available to all participants. Bombs can create craters; vehicles can form berms to provide fortified positions; objects can be destroyed and produce debris. When terrain data is stored locally, there is no good way to propagate terrain changes to exercise participants. A dynamic terrain server solves this issue.
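The propagation problem a dynamic terrain server solves is essentially publish/subscribe: the server holds the authoritative state of terrain objects and notifies every connected client when a state changes. This minimal sketch illustrates the pattern in-process; a real server would do the same over the network.

```python
class TerrainStateServer:
    """Toy model of dynamic terrain state propagation (illustrative only)."""
    def __init__(self):
        self._states = {}       # object id -> state string
        self._subscribers = []  # callbacks invoked on every change

    def subscribe(self, callback):
        """Register a client callback for state-change notifications."""
        self._subscribers.append(callback)

    def set_state(self, obj_id, state):
        """Record the authoritative state and notify all subscribers."""
        self._states[obj_id] = state
        for notify in self._subscribers:
            notify(obj_id, state)

# A bomb destroys a bridge; every subscribed simulation sees the change.
received = []
server = TerrainStateServer()
server.subscribe(lambda obj, st: received.append((obj, st)))
server.set_state("bridge_07", "destroyed")
print(received)  # [('bridge_07', 'destroyed')]
```

Because the server owns the state, a client that joins late can also query the current picture rather than replaying every change.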

How does MAK software fit within the Terrain Data Server component?

MAK's dynamic terrain server, VR-TheWorld, enables data to be accessed as a shared service that can be seamlessly and interactively viewed from anywhere, at any level, in a variety of formats. The server provides the foundation for current simulations, facilitates centralized management and future scalability, and enables dynamic terrain changes to be propagated to simulation applications; it also lets you interactively stream and view on-demand data in real-time simulations. Simulation users can spin the globe, zoom into a location, drop units, and start simulating.

The server supports the concept of “stateful” objects within the terrain, such as destructible buildings, doors that can be opened or breached, and trees that can fall and block movement along roads. When the state of a terrain object changes, that change is immediately available to all applications using the terrain server, allowing entities to interact appropriately. Furthermore, the terrain skin can be deformable, enabling simulation-specific changes such as craters to be seamlessly “stitched” into the terrain.  

All the terrain details are accessible via open standards.
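Terrain servers of this kind typically expose OGC web services such as WMS for streaming imagery and elevation. The sketch below builds a standard WMS 1.1.1 GetMap request; the host name and layer are hypothetical, but the parameter set is defined by the OGC WMS specification and works against any compliant server.

```python
from urllib.parse import urlencode

def getmap_url(base, layer, bbox, width=512, height=512):
    """Build an OGC WMS 1.1.1 GetMap request URL.
    bbox is (minx, miny, maxx, maxy) in the coordinates of the SRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return base + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = getmap_url("http://terrain.example.com/wms", "imagery",
                 (-71.2, 42.3, -71.0, 42.5))
print("REQUEST=GetMap" in url)  # True
```

A client fetches tiles like this on demand instead of copying whole terrain databases to every machine, which is exactly the availability problem described above.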

Product Legend

Dynamic Terrain Server