2020 transformed us. The way we did business came to a halt, and we were all forced to navigate a world under lockdown – we experienced an immediate shift to all things virtual, and there was a steep learning curve. (See below for a roundup of our articles that outline our approach to make virtual events and meetings more engaging, more personal, and more human.)
This year, we’re taking advantage of our lessons learned to bring you a richer, better MAK experience. We’ve heard from many customers and friends that they’re ready to re-engage personally with us - we are excited for this, though we understand that the definition of “personal” will be unique to every company. As we expect to see the world start to emerge from complete lockdowns, we are modulating our approach to meetings so that we can connect more deeply and personally with you where you are, both physically and virtually, through a hybrid seminar session approach.
Here’s how it works...
Update as of 1/27/2021: MAK Legion won the DisTec People's Choice Award! We are so grateful for your votes and can't wait to see how Legion disrupts technology!
We're proud that MAK has been shortlisted in ITEC's Disruptive Technology (DisTec) Challenge, a competition showcasing solutions that have the potential to disrupt training and simulation as we know it. Our submission highlights our next-generation scalability and communications framework, MAK Legion, to manage and deliver millions of entities. Take a look at our submission video on the DisTec Challenge site or check out the transcript of my interview below with Len Granowetter, MAK's CTO, as he outlines the hows, whys, and so-whats of our new Legion technology. (And while you're learning about MAK Legion, vote for us to win the DisTec People's Choice Award!)
Dan: Len, tell us about the Legion Scalability framework. Is it a disruptive technology?
This article was originally written and posted for publication in ST Engineering's Agile Blog.
At the dawn of the new millennium, two of the biggest aircraft manufacturers were vying for a $200 billion contract to build America’s next-generation fighter jet – the F-35 Joint Strike Fighter.
The Air Force demanded a fighter jet that would be faster and more maneuverable, while the Navy needed a version with longer wings to land on its aircraft carriers. But among the biggest challenges was to build a third variation which would be a world-first – one that could land vertically on shortened runways for the Marine Corps.
When we create a virtual world for modeling, simulation & training, does it matter if we use a “round-earth” coordinate system and 64-bit precision in our coordinates? Yes, but much more so in some circumstances than others.
Let’s start with the basic concepts. We all know that the world is round – well, round-ish – but most of the time we can’t see the effect. When you stand on a rise and look into the distance, it’s the shape of the topography that dominates your view. You can’t really see the curvature of the earth.
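To see why 64-bit precision matters at whole-earth scale, consider a geocentric coordinate: its magnitude is around Earth's radius, roughly 6,378,137 meters. A 32-bit float has about seven significant digits, so adjacent representable values at that magnitude are half a meter apart – sub-meter detail simply vanishes. A small Python sketch (standard library only) makes the point:

```python
import struct

R = 6378137.0  # WGS 84 equatorial radius, in meters

def to_float32(x):
    """Round-trip a Python double through 32-bit float storage."""
    return struct.unpack("f", struct.pack("f", x))[0]

x = R          # a point on the surface: geocentric X, in meters
y = R + 0.10   # the same point, nudged 10 centimeters

print(x == y)                          # False: 64-bit doubles keep the offset
print(to_float32(x) == to_float32(y))  # True: float32 loses the 10 cm entirely
```

At float32 precision, positions near the earth's surface in a geocentric frame can only land on a roughly half-meter grid, which shows up as visible jitter in a rendered scene; 64-bit coordinates retain millimeter-scale resolution.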
We sat down with Bill Cole, MAK’s President and CEO, to ask him a few questions as he nears his two-year anniversary with MAK. Read about his impact on MAK over the past two years, the evolution and direction of the industry and MAK’s corresponding trajectory, as well as a few thoughts on how MAK is handling the new world in the times of COVID.
The team was tremendously talented before I arrived, so I can’t take all the credit for the past two years of success! MAK already had all the right ingredients - great people making great products supporting great customers. The fact that we’ve been able to grow during a pandemic while keeping our customer-obsessed attitude is something that I am very proud of, and I think it speaks volumes about this team.
My role has been to encourage and support the team as they reach for bigger and more challenging opportunities - we can never be afraid to grow the company. We should always be thinking of new and better ways to approach challenges and try for bigger opportunities, and I’m here to help pave the way for that.
A US soldier is trapped under rubble from a damaged building in hostile territory. As a Pararescuer, your team must get in, stabilize the situation, and get out – skins intact.
The rescue mission begins with a helicopter ride over to the site - the ride is bumpy and loud as combat zones dot the geography below. The war-worn building comes into view and when you arrive, you fast rope out of the helo and into the rubble. You navigate to the trapped soldier and as you begin to address the situation and tend to the rock pinning him down, there’s an explosion. Even more smoke, debris, and confusion fill the area; when the dust settles, you learn that more soldiers are injured, and even a civilian is hurt.
What do you do? How do you react?
Your squad has been tasked with a convoy mission through a town with suspected insurgent activity. As a surveillance operator, you need to spot the threats and alert your team before it’s too late.
You peer down from a UAV through an infrared camera analyzing and scrutinizing the happenings of a seemingly ordinary town. You see farmers in fields, children coming from and going to school, families en route to and from the marketplace, and religious services – everything seems normal but your training tells you that you need to look ahead. That’s when you notice signs of suspicious behavior: people moving to rooftops looking to the sky for incoming aircraft, armed civilians lurking behind corners, and most dangerous of all, a child wearing a heavily laden vest. You use your comms channels and report the potential threat to your squad leader.
At MÄK, we help our customers simulate unmanned vehicles in a lot of ways, depending on what part of the system architecture the customer is addressing. Some use VR-Forces to simulate the UAV’s mission plans and flight dynamics. Some use VR-Vantage to simulate the EO/IR sensor video. Of those, some use VR-Vantage as the basis of their payload simulation and others stream video into their ground control station (GCS) from a VR-Vantage streaming video server.
All of our customers now have the opportunity to add a Synthetic Aperture Radar (SAR) to their UAV simulations — and here’s how to do it. SensorFx SAR Server comes as two parts: a client and a server. The server runs on a machine on your network and connects to one or more clients. Whenever a client requests a SAR image, it sends a message to the server, providing the flight information of the UAV and the target location where the SAR image should be taken. The server, built with VR-Vantage, then uses the JRM Technologies radar simulation technology to generate a synthetic radar image and return it to the client.
The SAR Server renders SAR images taking into account the specified radar properties, the terrain database, and knowledge of all the simulated entities. The radar parameters are configured on the server in advance of the simulation. The terrain database uses the same material classification data that is used by SensorFX for rendering infrared camera video, so your sensor package will have the best possible correlation. The server connects to the simulation exercise network using DIS or HLA so that it has knowledge of all the entities. It uses this knowledge to include targets in the SAR scenes and to let you host the SAR sensor on a simulated entity.
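The client/server exchange described above can be pictured with a small sketch. The message structure and field names here are purely illustrative – this is not the actual SensorFx SAR Server wire format, which isn't shown in this post:

```python
import json

def make_sar_request(uav_lat, uav_lon, uav_alt_m, heading_deg,
                     target_lat, target_lon):
    """Build a request carrying the UAV's flight information and the
    target location for the SAR image. All field names are hypothetical."""
    return json.dumps({
        "type": "sar_image_request",
        "platform": {
            "latitude_deg": uav_lat,
            "longitude_deg": uav_lon,
            "altitude_m": uav_alt_m,
            "heading_deg": heading_deg,
        },
        "target": {"latitude_deg": target_lat, "longitude_deg": target_lon},
    })

# The server parses a request like this, renders the radar image using the
# terrain and entity data it already holds, and returns the image to the client.
request = make_sar_request(42.36, -71.06, 3000.0, 90.0, 42.40, -71.00)
decoded = json.loads(request)
print(decoded["type"])  # sar_image_request
```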
At MÄK, we are constantly seeking ways to improve our products by diligently researching the latest technologies that will elevate our fidelity and performance. In this blog, we’ll tell you how we’re doing exactly that by integrating the photogrammetry process into our human content pipeline.
Photogrammetry is the science of making measurements from photographs; we’re using it to make high-resolution 3D meshes. We capture photos of a subject, then use specialized processing software – and post-processing by our team of 3D artists – to make hyper-realistic, high-performing humans for DI-Guy, our Human Simulation software. DI-Guy’s ability to support multi-texturing via albedo, bump, specular, gloss, and ambient occlusion allows us to retain the minute detail of these captures while delivering them in low-polygonal, high-performing models. The DI-Guy artists use industry-leading tools such as ZBrush, 3D Studio Max, Maya, and Photoshop to translate these models from reality to virtual reality. As you can see from the photos and videos, the results are impressive.
While the results of this method are arguably the best quality humans in the simulation market, the benefits do not end there. In the design and implementation of this technology, we created what our 3D artists call the PortaScan Studio, a portable photogrammetry studio that allows them to travel anywhere to create custom human content for simulations. Imagine this: Private First Class (PFC) John Miller shows up for training where he is scanned and added to the DI-Guy library, along with his fellow trainees and comrades. He then dons an Oculus Head-Mounted Display and enters the training scenario. As he looks to his left he sees PFC Mike Marshall (not a stock avatar mind you, but Mike’s actual visage!) and to his right, his Captain. Talk about full immersion.
In version 4.3, VR-Forces introduces the notion of aggregate-level simulation. Okay. What exactly is the difference between aggregate-level simulation (ALS) and entity-level simulation (ELS)?
At the core, aggregate-level simulation is a more abstract level of modeling and therefore is more suitable for representing higher echelons of a force structure – units like companies, battalions, and brigades. Entity-level modeling has the fidelity appropriate for individual entities, like vehicles and human characters.
Let’s look at maneuver modeling as an example. In ALS, units have to slow down to move through a forested area, whereas entities in ELS have to maneuver around individual trees. This higher level of abstraction applies to all the types of models. Combat in ELS happens when an entity has line of sight with another entity. When one entity fires, a hit/miss calculation is performed between the detonated ordnance and the nearby entities. Damage is assessed only for the entities that are actually hit. In ALS, units, which cover an area, must have line of sight to the "area" of the other unit. Combat then proceeds as rates of change in the resources and status of the units. For example, a large, well-equipped unit will more quickly deplete the resources and status of a smaller, less-equipped unit.
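To make "combat as rates of change" concrete, here is a toy attrition model in the spirit of the classic Lanchester square law, where each unit's strength erodes at a rate proportional to the opposing unit's strength. This is our own illustrative sketch, not VR-Forces' actual attrition algorithm:

```python
def attrition(blue, red, blue_eff, red_eff, dt, steps):
    """Toy aggregate combat: each side's strength decays at a rate
    proportional to the other side's strength (Lanchester square law),
    integrated with simple Euler steps. Strengths clamp at zero."""
    for _ in range(steps):
        blue, red = (max(blue - red_eff * red * dt, 0.0),
                     max(red - blue_eff * blue * dt, 0.0))
    return blue, red

# A large, well-equipped unit (strength 100) against a smaller one (40):
blue, red = attrition(100.0, 40.0, blue_eff=0.05, red_eff=0.05,
                      dt=0.1, steps=200)
print(round(blue, 1), round(red, 1))  # the larger unit wins decisively
```

Note the square-law effect: the larger unit not only wins, it finishes with most of its strength intact, which is exactly the kind of outcome an aggregate model delivers without simulating a single shot.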
Simulation has become an accepted, routine, and critical method of training militaries worldwide. Many nations have invested heavily in large simulations for wargaming; however, there is no "one size fits all" training simulation. Software that may be appropriate for one nation may be too cumbersome, resource intensive, and unmanageable for others. A low-overhead simulation system will address a nation’s wargaming and constructive simulation requirements, while also being much more economical in terms of procurement, training, and sustainment.
MÄK CST fills the Command & Staff training capability gap. It combines the user-friendly features of a game with capabilities of the larger, more complex simulations to help trainees learn how to make stronger battlefield decisions. Because of its flexibility and ease-of-use, MÄK CST can be used in the classroom, in the simulation center, on deployment, and at home stations.
The Cost-Effective Solution
VR-Vantage IG delivers game-like visual quality in a high-performance image generator – designed with the flexibility, scalability, and deliverability required for simulation and training.
With VR-Vantage IG, immerse your trainees in stunning virtual environments. Experience 60 Hz frame rates for smooth motion, engaging action to stimulate trainees, and beautiful effects for immersive realism; all this, inside world-wide geo-specific databases.
We use the latest shader-based rendering techniques – just like the AAA games do – to take full advantage of today’s powerful GPUs. In your scenes, you’ll see dynamic light sources that cast light on scene geometry, full-scene dynamic shadows, ambient occlusion, reflections, bump maps, depth of field, zoom, and other camera effects – and a whole lot more.
Many IGs are targeted to one environment. IGs designed specifically to provide the correct cues to high-flying-fast-jets don’t do so well in first-person-shootouts. Truck driving simulators don’t generally render the water well enough for maritime operations. Part of this is due to the choices in the content and part is the tuning of the IG and the graphics processing unit (GPU).
We’ve designed VR-Vantage IG to render beautiful scenes in any domain – air, land, and sea – and to fit into your simulation architectures. Version 2.0 has concentrated on both beauty and performance so you can get the most out of the graphics card.
Graphics cards these days are awesome. They take a steady stream of data and turn it into beautiful pictures rendered at upwards of 60 times each second (60Hz). To pull it off, the GPU computes color values for each pixel on your display. A 1920x1200 desktop monitor has over 2.3 million pixels; at 60Hz, that’s more than 138 million color values every second. A lot of processing goes into each pixel so that collectively they form a beautiful picture. AAA game development houses do the work to configure the graphics card for all their target platforms; you, as a system integrator, have to do the same thing for your training customer.
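The back-of-the-envelope arithmetic works out like this:

```python
width, height, refresh_hz = 1920, 1200, 60

pixels_per_frame = width * height                  # 2,304,000 pixels
values_per_second = pixels_per_frame * refresh_hz  # 138,240,000 color values

print(f"{pixels_per_frame:,} pixels/frame, {values_per_second:,} values/s")
```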
We’ve been demonstrating our new VR-Vantage IG image generation capability by building five first-person player stations " each representing a different type of player. One of these stations was a Light Armored Vehicle (LAV) player where we collaborated with Simthetiq for the terrain database, with CM Labs for the vehicle physics, and with MAK’s own DI-Guy human character simulation to populate the environment. Watch the video below as Bob Holcomb explains (with the help of Gedalia as the driver) one of our most popular I/ITSEC 2014 demos.
WebLVC is an architecture for developing and deploying interoperable web and mobile applications in simulation environments, and for connecting these applications with existing, native modeling and simulation federations (which may use HLA, DIS, or other native interoperability protocols). Watch Matt Figueroa, one of our highly esteemed Link team engineers here at MÄK, explain the basics about WebLVC and how you can use it to see and interact with your simulation over the web in the video below.
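At the wire level, WebLVC messages are JSON documents, typically carried over a WebSocket, so a browser-based client can consume them directly. The sketch below shows the general shape of an entity update; the exact field names and required attributes should be checked against the WebLVC specification your server implements:

```python
import json

# An illustrative WebLVC-style attribute update for one simulated entity.
update = {
    "MessageKind": "AttributeUpdate",
    "ObjectName": "tank-01",
    "ObjectType": "WebLVC:PhysicalEntity",
    "WorldLocation": [1463931.5, -4459681.0, 4283259.0],  # geocentric meters
}

message = json.dumps(update)   # what travels over the WebSocket
decoded = json.loads(message)  # what a browser client would parse
print(decoded["MessageKind"])  # AttributeUpdate
```

A gateway such as the MÄK WebLVC Server sits between messages like this and the native federation, translating them to and from HLA objects or DIS PDUs so the web client never has to speak those protocols itself.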
Our goal is always to make it easier for our customers to create and use simulations. At both I/ITSEC 2013 and 2014, we showcased the MÄK Training System Demonstrator to show how to reduce operator workload and increase development productivity.
In the short demo below, Dan walks you through how the TSD uses the advantages of MÄK’s entire product line to create both a student and instructor maritime training environment. Watch as air, land, and sea entities start off behaving according to their plans; through our training interfaces, CGF, and web-apps, users can manipulate the simulation to achieve training in their techniques, tactics, and procedures.
Enhanced Company Operations Simulation (ECOSim) is known for its ease-of-use, rapid scenario generation, runtime operator control, and realistic & reactive human simulation. The short video below explains how easy it is to set up a scenario with DI-Guy humans in ECOSim, MÄK’s company-level training simulation that teaches leaders how best to deploy troops, UAVs, convoys, and other assets. Watch how easy it is to place hostile or friendly squads into the scenario and see how the civilians and townspeople react through the Small Unit Leader Interface (SULI) and the Unmanned Aerial Vehicle (UAV) feed.
Whether you’re wargaming or managing a local crisis, simulation plays an important role in command staff training. Its job is to model the situation to provide learning opportunities for the trainees and to stimulate the command and control (C2), or Mission Command systems, they use. Simulation helps trainees and instructors plan the battle, fight the battle, and review the battle.
Brian Spaulding spent his days at I/ITSEC 2014 showing our visitors how MÄK tools are specialized for Command Staff Training. He explains how our most recent version of VR-Forces highlights aggregate-level simulation (with a new "thunder run" demonstration) and how our WebLVC-based web app helps decision-makers accomplish specific training objectives in a light-weight, interoperable way. Check out our demos with Brian below.
Here’s more about our Command Staff Training approach.
At I/ITSEC 2014, I demonstrated another integration of VR-Vantage with the Oculus Rift. My demonstration has come a long way since the one I showed at I/ITSEC 2013. Most importantly it’s been updated to use the Development Kit 2 (DK2) Oculus Rift prototype and the latest OVR SDK. I also incorporated VR-Forces in order to turn it into an F-35 flight simulator which can be controlled via a gamepad. In this post I’ve included a complete description of how the demo was put together, a system diagram, and also a photo of the demo at our booth.
I also have some exciting news for VR-Vantage users; this isn’t something you’ll only see at trade shows - I’m currently working on integrating the Oculus with the core product and you’ll be able to use it with the upcoming VR-Vantage 2.0 release! (Stay tuned to this blog for more info!)
The Details about VR-Vantage and Oculus
We know that budgets are tight and that many of you weren’t able to make it to I/ITSEC 2014 in December. Well, good news: MÄK is on your side. In the coming days and weeks, we’ll be posting videos of our most popular demos at I/ITSEC to give you a taste of what you missed. If you see something that grips your curiosity, imagination, or interest, get in touch - we would love to pack up our demos and bring them to you in your facilities. Catch a sneak peek below of the videos to come!
NADS miniSim driving simulator uses DI-Guy to inject realism into its driving environment
The recent holiday season marked the one-year anniversary of DI-Guy joining the MÄK team – and what a year it has been! From increasing DI-Guy performance and ease-of-use, to developing new ways to control characters, to building more realistic character simulations, and to creating much more content out-of-the-box, 2014 has been the year of DI-Guy.
With such a strong year in the records and such a strong product on the shelf, it makes sense that the National Advanced Driving Simulator (NADS) trusts DI-Guy’s human character simulation in its NADS miniSim driving simulator.
VT MÄK, Antycip Simulation, and Thales have entered into a multi-year corporate-wide agreement to provide the MÄK RTI to Thales. Using the MÄK RTI, Thales will provide High Level Architecture (HLA) Evolved and HLA 1.3 compatibility to their range of simulations for training, experimentation, and demonstration.
The MÄK RTI is a proven solution that enables HLA federations to rapidly and efficiently communicate. It has been chosen for both large and small federations because of its support for a wide variety of network topologies and architectures, ease of configuration, high performance, and its range of supported platforms.
MÄK’s first HLA certification came in 1998 and since then, the company has been on the leading edge of developing and implementing the standard. MÄK’s tools and services have helped hundreds of organizations around the world comply with multiple standards including HLA, DIS, and DDS.
Commanders, like all good leaders, are responsible for the people below them. But they can’t do it alone. A commander’s staff exists to support the commander, work as a team, and deliver information to help make good, informed decisions. Training and preparation enable the command staff to function efficiently and properly in challenging situations; training allows the commander and his team to assess the situation, make decisions, and communicate those decisions.
Simulation plays an important role in command staff training; its job is to stimulate those situations where learning takes place. The simulation content depends on the echelon (level) and the missions the staff is being trained for. Marine Captains need entity-level simulation to train look-ahead surveillance for convoy protection missions, while General Officers need aggregate-level simulation to model wargames for course of action analysis. (And there are countless more examples of both.)
Modeling all of the elements needed to stimulate a command staff – all the activity in a training scenario – is a huge endeavor, especially when it includes the behavior of opposing forces, the background civilian population, and the political and social environment, as well as the friendly force operations. To make it happen, commanders either need role players acting out the parts of each unit/entity/vehicle/person or a very powerful, believable, and capable artificial intelligence (AI) solution. Since full-scale operations are time consuming and expensive to set up and run, many training tasks use the divide-and-conquer approach of focusing lessons on tasks that are manageable subsets of a complete environment.
When a patient comes into a medical clinic or hospital, the nurses have to assess what’s wrong. The same thing is true when an EMT shows up to the scene of a car accident. Sometimes the patients are clear and correct with their complaints, but often the patient does not know what’s wrong, has multiple seemingly unrelated issues, or is just delirious. Medical professionals have to figure it out. They must poke, prompt, and gauge the expressions on the patient’s face to determine the appropriate course of treatment. This is a skill that comes with exposure to many cases – a skill that benefits from experiential learning. This is where virtual patient simulations come in.
Virtual training made its debut into medical professions with the advent of the CD-ROM and other interactive programs. Technology has since matured and medical professionals are increasingly taking advantage of virtual patient simulations to create much more responsive, interactive, and intelligent training situations.
Database correlation between different systems is a difficult issue, but sometimes we make it harder than it has to be. For example, imagine someone has a large terrain database built with TerraVista. You want that terrain and since your system can handle OpenFlight, you think, "Great! Let’s try it out. Send me that database." What you get is hundreds of OpenFlight files and one master.flt file that references the hundreds of individual tiles of terrain. When you try to load the master.flt file, your system runs out of memory and crashes. Bummer, that didn’t work. It’s like trying to eat a bag of popcorn without opening the bag first.
To handle this problem, you could choose an approach that would be optimal for your system, but also the hardest and most time-consuming to implement; you would have to reprocess the terrain database into a structure that better suits your system architecture. But many times you don’t have the skills, time, or energy to do that. You just want to load the thing and see if it is a useful database before committing to optimizing it.
Here’s what I recommend: try MetaFlight. Lots of people think MetaFlight is a different kind of database but it’s not. It’s just an XML-based way to reference the many tiles of a terrain. MetaFlight describes the grid of tiles using your database’s coordinate system so that the simulation or visual system can fetch the tiles that it needs and ignore the ones that are not needed or in view. When you use MetaFlight, it’s like reaching into the bowl of popcorn and getting the handful that you want.
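Here is a sketch of the idea in Python, with a made-up, minimal tile index. Real MetaFlight files are richer than this, but the principle of fetching only the tiles you need is the same:

```python
import xml.etree.ElementTree as ET

# A minimal MetaFlight-style index: an XML document listing the OpenFlight
# tiles of a terrain with their grid extents. This schema is illustrative
# only -- actual MetaFlight files carry far more metadata.
index_xml = """
<terrain>
  <tile file="tile_0_0.flt" x0="0" y0="0" x1="1000" y1="1000"/>
  <tile file="tile_1_0.flt" x0="1000" y0="0" x1="2000" y1="1000"/>
  <tile file="tile_0_1.flt" x0="0" y0="1000" x1="1000" y1="2000"/>
</terrain>
"""

def tiles_in_view(xml_text, x, y, radius):
    """Return only the tile files whose extent overlaps the region of
    interest -- the handful of popcorn instead of the whole bag."""
    root = ET.fromstring(xml_text)
    hits = []
    for t in root.iter("tile"):
        x0, y0 = float(t.get("x0")), float(t.get("y0"))
        x1, y1 = float(t.get("x1")), float(t.get("y1"))
        if x0 <= x + radius and x1 >= x - radius and \
           y0 <= y + radius and y1 >= y - radius:
            hits.append(t.get("file"))
    return hits

print(tiles_in_view(index_xml, 500, 500, 100))  # ['tile_0_0.flt']
```

Because the loader only ever touches the tiles that pass this test, the master index stays tiny and memory usage scales with the view, not with the size of the whole database.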
Some wargaming simulations are so large that setting them up requires the organizational skills of a multi-echelon military structure. They’re too big to stop and restart when something goes wrong. So how do you handle it? How do you make sure that all the training participants learn what they are supposed to learn? This is where the Gamemakers come in.
Maybe you’ve seen the movie "The Hunger Games". The Gamemakers control the contest, adding distractions, challenges, even new opponents to steer contestants toward the conflict. Military officers acting as instructors do something similar for large wargames. They steer the conflict by adding supporting elements and opponents – changing entities’ behavior and capabilities. They set up situations so that learners must use the tactics, techniques, and procedures in their curriculum.
You, too, can be a Gamemaker. To be successful, you’ll need simulation tools that can be used while the simulation is running. Tools so easy to manage that you can detect problems and effect changes immediately. For this, you should try VR-Forces and DI-Guy Scenario, tools that enable you, the instructor, to dynamically inject events to stimulate trainee responses or guide a trainee’s actions during a training exercise. You can make your game a winning simulation and MÄK can help. The odds will always be in your favor.
Why simulate? Because you can learn and gain insight into problems that are too difficult, expensive, or risky to explore any other way. In the case of unmanned vehicle systems (UVS), there has never been a better time to invest in simulation tools to help bring your UVS goals to life. Whether you want to demonstrate new vehicle concepts within a synthetic environment, prove and refine new Tactics, Techniques, and Procedures (TTPs), or provide a way for pilots, sensor/payload operators, and mission commanders to practice and analyze decision-making and communication processes, VT MÄK has the tools to make it happen.
MÄK is proud to help system integrators experiment and research entire UAS environments from the ground up - from ground control stations, to the unmanned vehicle, to sensors on the UAV, and to the human-in-the-loop. VR-Forces, MÄK’s scenario generation software, models everything going on in the (virtual) world and provides an intuitive 2D/3D user interface to create dynamic, interactive scenarios for military and civilian applications. With VR-Forces, you can build scenarios to include both the sensor platforms and the target entities and their semi-automated interactions. Experience the view from your virtual unmanned vehicle by attaching simulated electro-optic (EO), infrared (IR), night vision (NVG) sensors to VR-Forces sensor platforms.
During live missions, soldiers and marines interact with the rich complexity of human behavior. Fellow blue forces and command structures behave, or at least are supposed to behave, according to doctrine. Civilians are busy going about their business, which can be as simple as hanging around, or as complex as searching for and gathering with friends, avoiding traffic, playing games, shopping, or engaging in religious or civic causes. Opposing forces may also be following doctrine or, all too often these days, be very unorthodox in their behavior. It’s hard to tell the normal, peaceful activities from the malicious ones.
Here’s a suggestion on how to model such a complex environment: use artificial intelligence (AI) for the simulated entities, then group them together and add AI to the groups. There, we made it sound easy. But seriously, the U.S. Marine Corps Tactical Operations Group (MCTOG) had exactly this issue. They needed to train captains to learn their role as commanders through realistic search, patrol, and intelligence gathering exercises. And to do that they needed a way to model a complex network of human behavior. Working together with our DI-Guy team, we developed a new level of interactive scenario generation capability. We call it Enhanced Company Operations Simulation, ECOSim for short.
ECOSim builds upon DI-Guy AI, which provides "Lua-brains" to individual characters, by applying Lua intelligence to collections of characters to form sophisticated human networks. Opposing forces intent on deploying improvised explosive devices (IED) are modeled with financiers, bomb makers, safe houses, leaders, and emplacers. These IED networks operate within a larger backdrop of ambient civilian behavioral patterns of life: farmers in fields, children attending school, families going to marketplaces, and religious services.
Imagine that you are responsible for modernizing a large and complex simulation system; you must bring its training capabilities up-to-date and leverage new technological innovations. Do you think it’s possible to manage this transformation while maintaining interoperability with existing systems? It’s entirely possible – and it’s been done with MÄK.
When the US Air Force needed to modernize their Air Warfare Simulation (AWSIM) system, they needed to improve interoperability among their own applications and maintain interoperability with the broader joint forces’ war gaming systems. They chose HLA Evolved as the interoperability architecture in part because of its more flexible approach to managing federation object model (FOM) extensions. HLA Evolved enables federation designers to agree on the common core of a FOM for broad interoperability, and use FOM modules to address specific communication needs within the Air Force systems. (continued...)
To ensure successful training of Intelligence, Surveillance, and Reconnaissance (ISR), the simulation scenario must be believable; to deliver that training scenario to many students, it must be repeatable. To be believable, the scenario needs accurate background activity to clutter the scene and make it difficult to identify and track the suspected high valued individual (HVI). The HVI needs to exhibit subtle behavioral clues that expose him. To be repeatable, the HVI must perform his predetermined task regardless of what the background entities are doing - any unpredictable actions lead to inconsistencies that detract from the training. (continued...)
If you stopped by the MÄK booth at I/ITSEC 2013, it’s likely that you walked away with some bright red stress balls, one-of-a-kind red and white chocolate mints, and ideas about how MÄK can be your partner in all things simulation. This is because we’ve refreshed our branding to focus on our core technologies.
We had a great time at the show this year and a successful week of demonstrating our re-energized simulation behaviors, amped up visualization capabilities, interactive training system demo, and our low-overhead command staff trainer. And we were happy to showcase all of these incredible demos in our brand new booth. (continued...)
In case you missed it, here’s a recent customer win highlighting the Air Force’s choice to go with MÄK’s interoperability tools - enjoy!
The US Air Force chose several MÄK interoperability products for the Air Force Modeling and Simulation Training Toolkit (AFMSTT) program, including the new MÄK WebLVC Server, VR-Exchange, and MÄK Data Logger.
Based on the Air Force’s Air Warfare Simulation (AWSIM) model, the AFMSTT system enables training of senior commanders and staff for joint air warfare and operations. MÄK’s tools will be used to help migrate the AFMSTT system to a service-oriented architecture based on High Level Architecture (HLA) interoperability and web technologies. The program uses MÄK’s WebLVC Server to help monitor, control, and interact with core AWSIM and AFMSTT components through lightweight, thin clients running in a browser.
We’re here at the Transportation Research Board’s (TRB) annual conference in Washington D.C. with transportation professionals from around the world; we’ve seen members of federal, state, and local governments, along with plenty of researchers from universities.
In this exposition of highly specialized, robust traffic simulations, MÄK’s web-based traffic simulation, TurboTraffic, is making quite a splash. The ability to quickly define traffic flows on a road network (served from the cloud via OpenStreetMap), assign a volume of traffic, and then immediately see cars flowing through intersections is getting people thinking of new applications. This "quick sketch" style lets non-experts create traffic where they previously would have hired a traffic consultant or simply gone without.
The Africa Aerospace and Defence 2012 air show was recently held at Waterkloof Air Force Base in Centurion, City of Tshwane, South Africa. The Council for Scientific & Industrial Research (CSIR), a MAK customer, exhibited and presented VR-Vantage and the Hawaiian database served by VR-TheWorld Server as part of the helicopter simulator in the CSIR booth.
Steve Haselum, CSIR, Systems Engineering Manager, commented on the event: "Back in the office now but after a successful Africa Aerospace and Defence 2012. The helicopter simulator system was certainly a crowd puller...there was quite a lot of interest in both the visual and database."
The helicopter visual system spanned three 65-inch LCD panels, providing a 150-degree field of view. The general exhibitor and visitor response: "the visuals and Hawaiian database looked really good."
Help spread the word about modeling and simulation!
This week, MÄK is in sunny Las Vegas for the AUVSI Unmanned Systems 2012 show. Sitting in the middle of one of the largest unmanned vehicle trade shows is like sitting on the set of Wall-E, but with more robots. We’re showing our simulated video solutions and how we can support experimentation and training for these systems. We teamed up with DiSTI this week to develop a ground control station that interacts with the simulation, controls the UAV, and receives data streams from the simulated UAV over MISB-compliant protocols, just like a real UAV would.
If you’re looking for a winning bet, come by the VT MÄK booth (#2911) and see how we can help you produce reliable training and experimentation platforms for your unmanned systems.
Esri, the biggest player in the Geographic Information System market, held its international users conference last week in San Diego. As usual, it was a very impressive event.
Many of us in the modeling, simulation & training industries use, or have used, Esri’s ArcGIS tools to prepare geographic information as source data for our terrain database generation workflows. Well, this year Esri stepped into the 3D site model generation business by acquiring Procedural and their City Engine technology for building 3D urban environments. Those of you who know me know I’ve been a proponent of procedural terrain generation for years; I even authored an I/ITSEC paper on the subject in 2004. So, on the flight to San Diego, I took the opportunity to give City Engine a try.
I found the design approach to be just what you’d expect from a procedural tool: fast and creative. The product comes with several sample projects that let you experience the scope of the rule-driven approach. After playing with the sample projects for a while, I felt like I understood the approach and wanted to try it out with my own data. So, I loaded a shapefile of road centerlines and was pleased to find that it automatically found all my intersections, buffered the roads, and created sidewalks, blocks between the roads, and lots within the blocks - all from the default rules. I then tweaked the parameters to make the lots the sizes I wanted, and I was off. There are lots more gems to be found in the rules set up in the sample projects.
The next major release of VR-Vantage (1.5, coming out Q3 2012) will let users visualize radio communications. Users of VR-Vantage Stealth, VR-Vantage PVD, and VR-Vantage XR (and eventually VR-Forces and SimMetrics) can tell who’s sending radio messages by their “Squawks”. You’ll also be able to see who they’re communicating with via “Radio Communication Lines”.
You have to hand it to Singapore. They sure do know how to put on an air show. There’s nothing like holding a pleasant conversation with customers or colleagues only to be interrupted by the roar of an F-15 passing right in front of the door, then ascending like a rocket straight into the clouds. Very impressive.
MÄK is presenting in two locations at the Singapore Airshow. Our COTS products are on display at the TME Systems booth and our Battle Lab (a.k.a. ISR Lab) is in the Ideas section of the ST Engineering booth.
The Battle Lab is getting a lot of attention. Some are attracted by the analysis graphics shown in the AGI SimMetrics display and others by the Simulated Video streaming from the UAV’s sensor IG through the comms model to the Ground Control Station.
Having released VR-Vantage 1.4, it’s time to move on to the next version. We showed several technology demonstrations at I/ITSEC this year and are in the process of productizing them for VR-Vantage 1.5.
Effects-based sensors let users visualize NVG, FLIR, and other sensor views without materially classified data. While these sensor visualizations are not physically accurate, they are quite good, and since they don’t require any changes to models or databases, they are super easy to use. They still utilize JRM’s world-class sensor visualization technology, just without the high-fidelity physics-based modeling (which can be enabled via a drop-in add-on module).
Video streaming is also being added. You can stream simulated video from a VR-Vantage channel to a client application in real time. This is useful for applications like UAS ground operator stations, where the simulated video is streamed from the UAS to the operator station.
It’s great to be back at I/ITSEC for another year: seeing those familiar faces, a few that I haven’t seen in over 10 years, is a wonderful thing that makes being here a fun and special time.
As far as what’s happening on the show floor, I’ve seen some really innovative visual technologies around. Something that I’m sure fellow MÄK bloggers have commented on is how well our streaming terrain is being received. It seems like every time I turn around there’s a new crowd asking for a terrain demo. I haven’t seen anything yet that is comparable to our streaming terrain - perhaps this explains our popularity.
Last night was a fun night for MÄK as well: instead of putting on the ChowdahFest as we do every year, MÄK decided to put on a series of events - the MÄK Fest. Last night’s event included lots of fun at “Howl at the Moon”, a renowned dueling piano bar in Orlando. It was fun to see all of our MÄK staff, customers, and I/ITSEC friends hanging out, wearing cowboy hats (the funny hat trend continues), and having a great time. I’m looking forward to seeing fireworks tonight at Epcot and doing some karaoke tomorrow night at Orlando CityWalk! Fun times ahead.
Well, folks, it’s here again. That long awaited and long prepared for season that comes only once a year: I/ITSEC season. MÄK is busy building and preparing our booth for the show, which starts this Monday at 2:00 pm sharp.
If you’re planning to attend the tradeshow, we invite you to stop by our booth (#2549) and introduce yourself! We’d love to meet you and answer any questions you may have about our product capabilities or customer-oriented solutions. If you can’t make it to the show, we hope you’ll follow our blog. We plan on posting several entries a day about the happenings at I/ITSEC, all written by our attending MÄK staff.
Whether you’re here in Orlando or joining us virtually on the blog, we look forward to sharing our I/ITSEC experience with you!
I recently attended the Fall 2011 Simulation Interoperability Workshop in Orlando (with Aaron DuBois -- check out his account of SIW here), which I have been attending since 1990 or so. One of the focus areas for this meeting was the NASA Smackdown, a lunar lander and rover simulation event that took place at the Spring 2011 SIW and will be shown again at the Spring 2012 SIW. It is organized by NASA with participation from a dozen colleges and universities around the world. A few more universities plan to attend next year, including Arizona State University. We are providing the RTI and technical support for this event again next year. It is a great way for future engineers and scientists to get familiar with distributed simulation, and MÄK is pleased to continue to support it.

I also spent some time in the MSDL and CBML product development group meetings. Phase II of MSDL is getting started, and the CBML group is just finishing up their Trial Use period for Phase I. We are keeping our eye on these developing standards and may support them in future product versions. If you are at all interested in using these standards in MÄK products, please let us know.
This week MÄK is presenting at the Air Traffic Control Association Conference Exhibition in National Harbor (across the river from Washington, DC).
We’re demonstrating the simulation technologies that are helping our customers at the FAA Tech Center study system concepts to improve pilots’ ability to make decisions in bad weather.
By Aaron DuBois - The MAK RTI version 4.0 was released on the same day that IEEE officially released the IEEE 1516-2010 standard, otherwise known as HLA Evolved. We were very excited to fully support the new version of HLA from the very first day the standard was out. The downside, however, is that we did all of our development for RTI 4.0 before the standard was finalized, and even at the very end there were minor tweaks happening. Unfortunately, we failed to capture the very last change made to the C++ API. As a result, versions 4.0-4.0.3 of the MAK RTI were built against a nearly-final version of the C++ headers, which means that those versions are not quite compatible with the final version of the specification. The new release, RTI 4.0.4, fixes this and is built against the final version of the header files.
The final change that was not included in the previous RTI versions was related to a defect in one of the final draft versions of the specification. We actually wrote about this defect in a previous blog post. The problem was with the createFederationExecution RTIambassador methods. There were three variations of this method, each with different input parameters. Some of these parameters contained default values, and as a result there was an ambiguity between two of the variations. We mistakenly thought that there hadn’t been time to get a fix for this ambiguity into the spec, but apparently it did make it in after all. The third variation was renamed to createFederationExecutionWithMIM.
So what does this mean? If you are an RTI customer, but are currently using HLA 1.3 or 1516-2000, this doesn’t affect you at all. The new version of the RTI contains a few bug fixes, so you may want to upgrade anyway, but the HLA Evolved API change won’t be a problem unless you decide to move to the new standard. If you are using HLA Evolved, however, we strongly recommend that you upgrade to the new release and recompile your federate against the new header files. If you were using the third variation of createFederationExecution you will also need to edit your code to use the renamed method. Otherwise, no code changes are necessary. Once you recompile your federate, it will then be truly compatible with the final version of the HLA Evolved specification.
By Brett Wiesner - Recently I gave a presentation at IMAGE 2011 in Scottsdale, Arizona and at an NDIA meeting in Fairfax, Virginia on the benefits of Open Streaming Terrain (OST). I thought I’d share just a brief synopsis of that here.
Terrain databases are an important part of any simulation, and there are four main approaches to building them. You have hand-modeled terrains that are built by artists and 3D modelers. There are tool-generated terrain databases that are built by terrain generation tools. You have direct-from-source terrains that are constructed on the fly from source data in the client application. And finally you have streaming terrain, where content is streamed from a server directly to a client. Each of these approaches has its advantages and drawbacks.
Open Streaming Terrain (OST) is a kind of streaming terrain where the data (elevation, imagery, and feature data like roads and building footprints) is streamed from a server to a client using open standards. It’s the open standards part that matters. By using WMS, WFS, TMS, or any of the open standards looked after by an open governing body like the Open Geospatial Consortium (OGC) or the Open Source Geospatial Foundation (OSGeo), you can build an application that “talks” to other compliant applications and can take advantage of the petabytes of free (or fee) source data that’s out there on the internet right now.
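To make that concrete, here’s a rough sketch (not MÄK code; the endpoint and layer name are made up) of how a client might form an OGC WMS 1.1.1 GetMap request - it’s just an HTTP URL with standardized parameters, which is why any compliant server can answer it:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png"):
    """Build an OGC WMS 1.1.1 GetMap request URL.

    bbox is (min_lon, min_lat, max_lon, max_lat) in the given SRS.
    """
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": layer,       # server-defined layer name
        "STYLES": "",          # default styling
        "SRS": srs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layer, purely for illustration:
url = wms_getmap_url("https://example.com/wms", "elevation",
                     (-71.3, 42.2, -71.0, 42.5), 512, 512)
```

Because the request is standardized, swapping one WMS server for another is just a change of `base_url` - exactly the kind of interoperability OST relies on.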
By Brett Wiesner - We knew we needed to get a Linux build of VR-Vantage out to customers ASAP, but we felt that sneaking in a few more features along the way was a good idea. We will release VR-Vantage 1.3.1 around the end of July with support for Red Hat Enterprise Linux 5, along with some great new capabilities like zoom and terrain scaling.
With zoom, you can magnify the view and see things that are far away without changing the observer’s location. This is useful for UAV sensor applications, ground-based binocular views, or even periscopes!
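Under the hood, this style of zoom is typically just a narrowing of the camera’s angular field of view rather than a move of the eyepoint. As a general-purpose sketch (an illustration of the optics, not VR-Vantage internals):

```python
import math

def zoomed_fov(fov_deg, zoom):
    """Field of view after applying a magnification factor.

    zoom = 1 leaves the view unchanged; zoom = 2 halves the
    tangent of the half-angle, roughly halving the visible angle.
    """
    half = math.radians(fov_deg) / 2.0
    return 2.0 * math.degrees(math.atan(math.tan(half) / zoom))

narrow = zoomed_fov(60.0, 2.0)  # a 60-degree view at 2x zoom
```

Because only the field of view changes, parallax and occlusion stay exactly as seen from the original observer location - which is what makes this appropriate for binoculars, periscopes, and sensor turrets.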
Terrain Scaling is a VR-Vantage XR capability that allows you to exaggerate the height of the terrain in order to get a different perspective of the situation. Don’t know which path is an easier climb or drive? Or maybe you want a clearer picture of the relationship between aircraft and the terrain? Exaggerate the slope with terrain scaling and you’ll find out easily!
If you missed I/ITSEC this year in Orlando, then you missed some very impressive demonstrations presented by the experts at MÄK. Our booth was bigger than ever this year, allowing us to showcase both our COTS products and higher level solutions.
On the product front, we showcased VR-TheWorld, a streaming terrain server for modeling & simulation.
Check out the video below of VT MAK’s latest product: VR-TheWorld Server! VR-TheWorld Server is a streaming terrain server for modeling and simulation. It supports the TMS and WMS-C open standards and streams imagery and elevation to your simulation and visualization applications! The easy-to-use web interface lets you upload your own data and get it into your simulation in no time!
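For the curious, tile services like TMS address the world as a pyramid of fixed-size tiles indexed by zoom level, column, and row. A small sketch of the standard spherical-Mercator tile math (an illustration of the open standards involved, not VR-TheWorld code):

```python
import math

def tile_xyz(lat_deg, lon_deg, zoom):
    """Spherical-Mercator tile indices (XYZ scheme) for a lat/lon."""
    n = 2 ** zoom                       # tiles per axis at this zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

def xyz_to_tms(x, y, zoom):
    """TMS counts tile rows from the south; XYZ counts from the north."""
    return x, (2 ** zoom) - 1 - y
```

A client turns these indices into a URL such as `{base}/{zoom}/{x}/{y}.png` and fetches only the tiles its current view needs, which is what keeps streaming terrain lightweight.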
Len Granowetter has been with MÄK since 1993. He currently serves as MÄK’s Vice President of Products with overall responsibility for MÄK’s Modeling and Simulation Product and Solutions business. From 1999-2009, Granowetter was MÄK’s Director of Product Development. Under his direction, MÄK’s product line expanded from a few interoperability toolkits to a full portfolio of applications, plug-ins, and developer’s tools, spanning the focus areas of Link, Simulate, and Visualize.
We’ve just issued a press release about our recent efforts with Alion to use Battle Command to train the Latvian Armed Forces.
You can read the release here: https://www.mak.com/company/news/news-archive
We’re making progress.
For those of you who haven’t seen what things look like on the show floor before Monday, here’s how things start, at least for MAK. It’s amazing that it all comes together in three days.
What’s I/ITSEC all about? Check out this overview provided by NTSA.