MAK Blog

Jim Kogler

Light Points

Pilots rely on visual inputs more than anything else to orient themselves in flight. Because vision is so important, night flying introduces new challenges – limited eyesight, night illusions, and light blindness. To combat these issues, pilots train to use a consistent, regulated set of lights (indicating approach, threshold, etc.) to help guide them through darkness, identify where they are, and assess how fast they are moving.

Continue reading
  9556 Hits
Jim Kogler

VT MÄK Announces the Release of RadarFX SAR Server 1.0

We've just launched RadarFX, our new Synthetic Aperture Radar (SAR) simulation and visualization product! We built it in conjunction with our partner, JRM Technologies.

In the real world, a SAR sensor is typically attached to an aircraft or satellite. A SAR system generates photograph-like still images of a target area by combining radar return data collected from multiple antenna locations along a path of flight. Requests from users on the ground define the target area to be scanned, along with other parameters used to generate and return the image.

Continue reading
  4743 Hits
Dan Brockway

Zoom, Terrain Scaling, Linux support and more coming soon in VR-Vantage 1.3.1!

By Brett Wiesner - We knew we needed to get a Linux build of VR-Vantage out to customers ASAP, but we felt that sneaking in a few more features along the way was a good idea. We will release VR-Vantage 1.3.1 around the end of July with support for Red Hat Enterprise Linux 5, along with some great new capabilities like zoom and terrain scaling.

With zoom, you can magnify the view and see things that are far away without changing the observer’s location. This is useful for UAV sensor applications, ground-based binocular views, or even periscopes!
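
Under the hood, a zoom like this is typically implemented by narrowing the camera’s field of view rather than moving the eyepoint. As a rough sketch of the math (a standard pinhole-camera model, not VR-Vantage code), the field of view needed for a given magnification is:

```python
import math

def fov_for_zoom(base_fov_deg: float, zoom: float) -> float:
    """Field of view that yields `zoom`x magnification relative to
    `base_fov_deg`, using a standard pinhole-camera model."""
    half_angle = math.radians(base_fov_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half_angle) / zoom))

# A 4x binocular-style zoom from a 60-degree baseline needs a ~16.4-degree FOV.
print(round(fov_for_zoom(60.0, 4.0), 1))
```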

Terrain Scaling is a VR-Vantage XR capability that allows you to exaggerate the height of the terrain in order to get a different perspective of the situation. Don’t know which path is an easier climb or drive? Or maybe you want a clearer picture of the relationship between aircraft and the terrain? Exaggerate the slope with terrain scaling and you’ll find out easily!

  2250 Hits
Bob Holcomb

Configuring VLC to receive video streams with low latency

Last week you read all about setting up buffers in VR-Vantage to best suit streaming video. This week I’ll talk about configuring VLC to receive video streams with low latency.

VLC is a commonly used application for viewing network video streams sent out of VR-Vantage, and we use it often in testing. (You can get VLC from www.videolan.org.) The default settings for viewing network streams in VLC include quite a bit of buffering to make sure the video plays smoothly. Sometimes you want to see the video with as little latency as possible, which requires changing a few settings.
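
If you start VLC from a script or shortcut, the relevant knob is its --network-caching option, which sets the stream buffer in milliseconds. A minimal sketch, assuming an RTP stream arriving on port 5004 (substitute your own stream address):

```python
import subprocess

# Start VLC with its network buffer reduced from the default (on the
# order of a second) down to 50 ms. The stream address is a placeholder;
# point it at whatever your VR-Vantage channel is actually streaming.
subprocess.run([
    "vlc",
    "--network-caching=50",  # network stream buffer, in milliseconds
    "rtp://@:5004",          # e.g., listen for an RTP stream on port 5004
])
```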

  10317 Hits
Dan Brockway

Tuning your GPU for VR-Vantage

Many IGs are targeted to one environment. IGs designed specifically to provide the correct cues to high-flying fast jets don’t do so well in first-person shootouts. Truck driving simulators don’t generally render the water well enough for maritime operations. Part of this is due to the choices in the content, and part is the tuning of the IG and the graphics processing unit (GPU).

We’ve designed VR-Vantage IG to render beautiful scenes in any domain – air, land, and sea – and to fit into your simulation architectures. Version 2.0 has concentrated on both beauty and performance so you can get the most out of the graphics card.

Graphics cards these days are awesome. They take a steady stream of data and turn it into beautiful pictures rendered at upwards of 60 times each second (60Hz). To pull it off, the GPU computes color values for each pixel on your display. A 1920x1200 desktop monitor has over 2 million pixels; at 60Hz, that’s well over 120 million color values every second. A lot of processing goes into each pixel so that collectively they form a beautiful picture. AAA game development houses do the work to configure the graphics card for all their target platforms; you, as a system integrator, have to do the same thing for your training customer.
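
To make that arithmetic concrete, here is the back-of-the-envelope calculation (exact figures for a 1920x1200 display at 60Hz):

```python
width, height, refresh_hz = 1920, 1200, 60

pixels_per_frame = width * height                  # 2,304,000 pixels
pixels_per_second = pixels_per_frame * refresh_hz  # 138,240,000 color values

print(f"{pixels_per_frame:,} pixels/frame, "
      f"{pixels_per_second:,} color values/second")
```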

Continue reading
  8923 Hits
Jim Kogler

Do I need a new graphics card?

Frequently we get questions about hardware requirements for customers who are trying to use VR-Vantage as an IG for a specific program. Typically, the customer is looking to achieve 60 frames per second (FPS) in VR-Vantage and their scene is rendering slower than they would like/expect. They have read the MÄK Blog about minimum hardware yet didn’t find the answer they were looking for.

Over the years, many of us have been conditioned to assume that buying newer/better hardware will yield better performance; if your performance isn’t up to snuff, just buy something newer. This often works – new GPUs are released yearly, often with phenomenal performance improvements. The cost for this new hardware is low compared to the total program cost, so upgrading can make sense. That said, most terrains used in the Modeling & Simulation community aren’t particularly complicated and so should run really fast even on old hardware. So how can you figure out if it’s your terrain that is slowing you down or if it’s your graphics card that is the culprit? This blog will try to answer that question for you.

To understand where your bottleneck is, you need to understand whether your application is CPU or GPU bound. For this blog I will use the term “CPU” to mean not just the physical processor, but also the process of organizing and passing information to the GPU. Simply put, VR-Vantage can be bottlenecked in many places: collecting information from the network, updating the scene graph, sending information to the GPU, or the GPU itself may be bottlenecked trying to render the actual scene. Of these possible bottlenecks, upgrading your video card will only help the final case. That means if your scene is slow for any reason besides the final render step, you need to optimize your scene’s content and configuration, not buy a better graphics card.
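
As a rough illustration of that reasoning – the stage names here are hypothetical, not a VR-Vantage API – a bottleneck check boils down to comparing per-frame stage timings:

```python
def likely_bottleneck(update_ms: float, cull_ms: float, draw_ms: float,
                      gpu_ms: float) -> str:
    """Rough heuristic only. If the GPU's render time dominates every
    CPU-side stage, a faster card may help; otherwise the frame is
    CPU bound and content/configuration changes are the better fix."""
    if gpu_ms > max(update_ms, cull_ms, draw_ms):
        return "GPU bound: a better graphics card could help"
    return "CPU bound: optimize scene content and configuration instead"

# Example: dispatching draw calls takes 18 ms while the GPU only needs
# 6 ms, so this frame is CPU bound.
print(likely_bottleneck(update_ms=3.0, cull_ms=5.0, draw_ms=18.0, gpu_ms=6.0))
```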

Continue reading
  6823 Hits
Jim Kogler

Do I need a new graphics card? Part 2

A week ago, I wrote a blog entitled “Do I need a new graphics card?” to answer the common question: Will I get better performance if I just upgrade my graphics card? In the blog, I discussed the difference between CPU and GPU bound scenes, and made the point that if you are CPU bound, getting a new graphics card will not help much. Typically scene performance will improve more with better terrain organization. 

While that is all true, there is one additional problem you may encounter that will spoil performance and can be addressed by upgrading hardware: running out of video memory. VR-Vantage 2.0.1 now tracks your total video memory, how much you are using, and if any of your textures have been pushed out of memory (evictions). Once you have consumed all of your video memory, the card will start swapping textures off the card and into the system memory. This is incredibly slow and will seriously affect frame rate. Scenes that were fast may all of a sudden have a 100ms draw time. 

To see how your scene is performing, turn on the Performance Statistics Overlay (found in Display Settings -> Render Settings). You want to see something below 80% memory usage. As you move around in your scene, if the memory consumption reaches 100%, or you start seeing evictions, then your performance is being seriously affected by a lack of memory.
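
If you want to cross-check the overlay from outside the application, NVIDIA’s NVML Python bindings report the same memory figures. A minimal sketch applying the ~80% rule of thumb above:

```python
# Requires NVIDIA's NVML bindings: pip install pynvml
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
info = pynvml.nvmlDeviceGetMemoryInfo(handle)
usage_pct = 100.0 * info.used / info.total
print(f"VRAM: {info.used / 2**20:.0f} MiB of {info.total / 2**20:.0f} MiB "
      f"({usage_pct:.0f}% used)")
if usage_pct > 80.0:
    print("Over the ~80% rule of thumb: texture evictions are likely soon.")
pynvml.nvmlShutdown()
```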

Continue reading
  4719 Hits
Jim Kogler

SimMetrics – It’s 10 o’clock; do you know where your children are?

In the next few blogs I want to talk about an exciting new product MÄK is working on with AGI – SimMetrics. SimMetrics is a cool product that uses AGI’s analytics and MÄK’s visualization to model sensors, sensor tracking, the GPS constellation, and GPS receivers to produce a real-time analytical capability integrated with your simulation environment. All of this is done to add high fidelity, real-time Intelligence, Surveillance, and Reconnaissance (ISR) capabilities to your simulation environment. To model a high fidelity ISR capability, you need to know where you are and where your targets are. SimMetrics helps you easily do both.

To help you understand where you are, SimMetrics models the entire GPS constellation taking into consideration both your position and the calendar time.

Accurate GPS catalogues are used, so SimMetrics knows if a satellite is offline for maintenance or is currently experiencing technical difficulty – two factors that will have a significant impact on your ability to know where you are.

Continue reading
  3830 Hits
Dan Brockway

Tentative Plans for VR-Vantage 1.5

Having released VR-Vantage 1.4, it’s time to move on to the next version. We showed several technology demonstrations at IITSEC this year and we are in the process of productizing them for VR-Vantage 1.5.

Effects-based sensors let users visualize NVG, FLIR, and other sensor views without materially classified data. While these sensor visualizations are not physically accurate, they are pretty good, and since they don’t require any changes to models or databases, they are super easy to use. They still utilize JRM’s world-class technology for sensor visualization, just without the high fidelity physics-based stuff (which can be enabled via a drop-in add-on module).

Video streaming is also being added. You can stream simulated video from a VR-Vantage channel to a client application in real time. This is useful for applications like UAS ground operator stations where the simulated video is streamed from the UAS to the operator station.

Continue reading
  2107 Hits
Dan Brockway

Visualize Radio Communications in VR-Vantage

The next major release of VR-Vantage (1.5, coming out Q3 2012) will let users visualize radio communications. Users of VR-Vantage Stealth, VR-Vantage PVD, and VR-Vantage XR (and eventually VR-Forces and SimMetrics) can tell who’s sending radio messages by their “Squawks”. You’ll also be able to see who they’re communicating with via “Radio Communication Lines”.

  2290 Hits
Jim Kogler

VR-Vantage 1.4.1 Released!

VT MÄK is pleased to announce the release of VR-Vantage 1.4.1. This release marks another milestone in our Open Streaming Terrain story by adding the visualization of streaming vector data.  VR-Vantage applications (like VR-Vantage Stealth or VR-Vantage IG) can now stream in point, linear and areal features from a compliant terrain server using the open standard Web Feature Service (WFS) protocol, and use those features to generate textured 3D geometry on-the-fly at run-time.  VR-Vantage applications can:

  • Generate 3D geometry for buildings, fences, and walls by extruding polygons from geo-specific footprints (linear or areal features), and applying geotypical textures based on feature attributes
  • Place pre-built 3D models representing trees, geospecific buildings, lampposts, etc., into the scene based on the locations and attributes of individual point features ("point feature substitution")
  • Automatically populate forests with trees, or populate roads with telephone poles, fire hydrants, etc., by randomly placing 3D objects within areal features, or along linear features.

Combined with our existing support for streaming elevation and imagery, these new capabilities allow you to very quickly visualize 3D environments that are both global in scale and visually rich: just upload your source data to a compliant streaming terrain server such as MÄK’s VR-TheWorld Server, configure your feature-to-geometry mappings using an XML-based “.earth file”, and tell VR-Vantage to connect.
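
As an illustration only – the URLs, layer names, and attribute values below are placeholders, and the sample .earth file that ships with VR-Vantage is the authoritative reference for the supported schema – a feature-to-geometry mapping in an osgEarth-style .earth file can look roughly like this:

```xml
<!-- Illustrative sketch; not a guaranteed VR-Vantage schema. -->
<map name="oahu" type="geocentric">
    <image name="imagery" driver="wms">
        <url>http://your-terrain-server/wms</url>
    </image>
    <model name="buildings" driver="feature_geom">
        <features name="footprints" driver="wfs">
            <url>http://your-terrain-server/wfs</url>
            <typename>building_footprints</typename>
        </features>
        <styles>
            <style type="text/css">
                buildings {
                    extrusion-height: 12;        /* extrude footprints into walls */
                    altitude-clamping: terrain;  /* sit geometry on the terrain */
                }
            </style>
        </styles>
    </model>
</map>
```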

To demonstrate the new capabilities, we’ve collected readily available source data for the Hawaiian island of Oahu, put this data on our VR-TheWorld Online server, and shipped a sample .earth file with VR-Vantage 1.4.1. Check it out using any VR-Vantage application (Download VR-Vantage FreeView here) or watch a video tour of Hawaii here.

Continue reading
  2443 Hits
Fred Wersan

Track History Improvements

In VR-Vantage 1.5, we will be making some small changes to the Track History feature. Track Histories are currently limited in how long they can get. This is for performance reasons because, as you can imagine, creating infinitely long track histories will cause the application to run out of memory.

In 1.5, we will allow users to specify the length of each track history segment and also the total number of segments allowed per track history. This lets you have longer track histories if you have fewer entities, for example.
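
Conceptually, this caps memory per track at the segment length times the segment count, with the oldest segment dropping off as new ones are created. A small sketch of the idea (not VR-Vantage’s implementation):

```python
from collections import deque

class TrackHistory:
    """Fixed-length segments in a ring buffer: memory is capped at
    roughly max_segments * segment_length points per entity."""

    def __init__(self, segment_length: int, max_segments: int):
        self.segment_length = segment_length
        self.segments = deque(maxlen=max_segments)  # oldest segment drops off

    def add_point(self, position) -> None:
        if not self.segments or len(self.segments[-1]) >= self.segment_length:
            self.segments.append([])  # start a new segment
        self.segments[-1].append(position)

# Ten segments of 100 points each: never more than 1,000 points per track.
history = TrackHistory(segment_length=100, max_segments=10)
```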

Whether you are generating scenarios with tools like VR-Forces or have your own distributed simulation (built with VR-Link, right?), VR-Vantage Stealth is the best 2D/3D DIS/HLA Stealth Viewer on the market. Use our Stealth Viewer because you need the most information about your simulation displayed beautifully - and you correctly realize that you’d spend a fortune building a less capable viewer on your own.

  5188 Hits
Bob Holcomb

Streaming Video Setup: How many buffers do I need in my VR-Vantage settings?

A common question I hear about video streaming setup is "how many buffers do I need?"  The answer is "it depends."  Let’s look at why and how you can determine for yourself the best setting for your system.

  3396 Hits
Jim Kogler

Behind the Scenes with VR-Vantage

MÄK is making a huge investment in our premier visual suite, VR-Vantage. Last year we made tremendous strides by adding ocean and maritime visualization. The work continues full force as we continue to improve our visual environment. The next release of VR-Vantage, 2.0, is planned for later this year and has two major directions: performance improvements and visual quality enhancements.

We are committed to improving performance in VR-Vantage. Look forward to shader optimizations that take advantage of game-based rendering techniques, an improved physics engine to enhance the visual interaction between objects (like ships that rock on the dynamic ocean), optimized loading algorithms for large terrains, and improved internal organization and grouping of geometries to maximize capabilities of the GPU. If that all sounds like techno-jargon, it is! We’re focusing on the complicated stuff so you can focus on better-looking, better-performing scenes that run at 60 frames per second (fps), the gold standard of smooth visualization.

Visually, we are concentrating on several areas: a beautiful environment, lighting effects (both day and night), improved trees and vegetation, and high fidelity sensor/camera modeling. Both the ocean and the sky in VR-Vantage have been greatly improved. The ocean supports many new features, including helicopter rotor wash, significantly faster/better wakes (both up close and from the air), and underwater crepuscular rays ("God Rays"). The sky draws faster and can be rendered with high-resolution clouds. Complex surf patterns on shorelines can now be configured through shape files, allowing surf to roll onto beaches and inlets accurately.

Continue reading
  6957 Hits
Dan Brockway

Get Bumpier, Shinier, and Shadier Graphics in VR-Vantage

What’s the difference between a dull, old model and a bright shiny, new model?

Turns out, it’s just texture maps. Oh yeah, and the VR-Vantage rendering engine. With VR-Vantage 1.6 all you have to do to get bumpy, shiny, and shady effects in your models is add normal, specular, and occlusion maps. That might sound pretty complicated. But really these are all textures that you can create with tools like Crazybump, Blender, and Photoshop.

Crazybump will take your texture map and guess what shape it is and then use that shape to generate (bake) specular, normal, and occlusion maps. But it’s just guessing. If you have a high-polygon count 3D model, then you can use tools like Blender to bake specular, normal, and occlusion maps from that model. And in Photoshop, you can paint specular maps by highlighting the shiny spots of your original texture.

Continue reading
  3569 Hits
Jim Kogler

Deep in the Heart of Texas: Configuring Stars in VR-Vantage

While MÄK is based in Massachusetts, we have some very good friends down in Texas. If you are in Texas, or if you’re just simulating it, you know that the stars at night need to shine really bright. VR-Vantage can help with that. VR-Vantage uses a real star map to calculate thousands of star positions for every day of every year. The stars are accurate, and if you look closely enough, you can pick out some of the planets as well.

When you are simulating at night, you may want to make some of the stars brighter, or perhaps play with the luminosity of the moon. Here’s how you can do that: while some of the details of sky configuration can be found in the GUI, some of the more obscure and advanced settings live in the file vrvantage/data/Environment/Sky/SilverLining.config. If you look through this file, you will see lots of ways to configure the Sun, Moon, Clouds, Stars, and the Atmosphere.

  3774 Hits
Dan Brockway

Putting the VR into VR-Vantage at I/ITSEC 2014

At I/ITSEC 2014, I demonstrated another integration of VR-Vantage with the Oculus Rift. My demonstration has come a long way since the one I showed at I/ITSEC 2013. Most importantly it’s been updated to use the Development Kit 2 (DK2) Oculus Rift prototype and the latest OVR SDK. I also incorporated VR-Forces in order to turn it into an F-35 flight simulator which can be controlled via a gamepad. In this post I’ve included a complete description of how the demo was put together, a system diagram, and also a photo of the demo at our booth.

I also have some exciting news for VR-Vantage users; this isn’t something you’ll only see at trade shows - I’m currently working on integrating the Oculus with the core product and you’ll be able to use it with the upcoming VR-Vantage 2.0 release! (Stay tuned to this blog for more info!)

The Details about VR-Vantage and Oculus

Continue reading
  10797 Hits
Dan Brockway

Light Armored Vehicle (LAV) IG Demonstration

We’ve been demonstrating our new VR-Vantage IG image generation capability by building five first-person player stations – each representing a different type of player. One of these stations was a Light Armored Vehicle (LAV) player where we collaborated with Simthetiq for the terrain database, with CM Labs for the vehicle physics, and with MÄK’s own DI-Guy human character simulation to populate the environment. Watch the video below as Bob Holcomb explains (with the help of Gedalia as the driver) one of our most popular I/ITSEC 2014 demos.

  3550 Hits
Dan Brockway

VT MÄK’s First-Person Driving Simulation Using Simthetiq’s SUROBI Virtual Terrain Environment

This blog focuses on the benefits of using highly accurate and immersive training environments – a critical part of making any simulation a success.

At I/ITSEC 2014, we demonstrated our new VR-Vantage IG image generation capabilities by building five first-person player stations – each representing a different type of player. One of these stations was a Light Armoured Vehicle (LAV) player where we collaborated with Simthetiq for the terrain database, CM Labs for the vehicle physics, and MÄK’s own DI-Guy human character simulation to populate the environment.

Typically, when building a competitive simulation solution, the biggest proportion of investment goes into the hardware and software, to the detriment of the visual database. Everyone agrees that IG features and hardware performance are vital for any virtual training exercise – but all that action happens in the context of the virtual terrain. A poor visual database will make any investment much less effective. Simthetiq specializes in building cost-effective, immersive training environments that reach the new level of realism wanted by today’s demanding customers.

  4478 Hits
Dan Brockway

How the visual sub-system fits into training system architectures

A great visual scene is a key aspect of any virtual training system. It provides the geographic context for the simulation and immerses the trainees in a virtual world where they can play out their training objectives.

Virtual training systems come in many shapes and sizes depending on the tasks being trained and the fidelity requirements. This blog outlines several architectures for integrating the visual sub-system into the training system architecture. Keep reading...

  3647 Hits
Jim Kogler

Hot off the Press: VR-Vantage 2.0

VT MÄK is pleased to announce the release of VR-Vantage 2.0! This major release represents a huge leap forward in both the performance and visual quality of VR-Vantage IG - with upgrades to nearly every one of the product’s main components. VR-Vantage 2.0 includes a brand new shader infrastructure, dynamic lighting engine, real-time full-scene shadows, upgraded vegetation, environment, and dynamic ocean models, a robust CIGI implementation, and much more. With VR-Vantage 2.0, we’ve achieved our goal of delivering game-like visual quality in a high-performance, 60Hz immersive environment.

  8540 Hits
Dan Brockway

VR-Vantage IG: A New Vision for Modeling & Simulation

VR-Vantage IG delivers game-like visual quality in a high-performance image generator – designed with the flexibility, scalability, and deliverability required for simulation and training.

With VR-Vantage IG, immerse your trainees in stunning virtual environments. Experience 60 Hz frame rates for smooth motion, engaging action to stimulate trainees, and beautiful effects for immersive realism; all this, inside world-wide geo-specific databases.

We use the latest shader-based rendering techniques – just like the triple-A games do – to take full advantage of today’s powerful GPUs. In your scenes, you’ll see dynamic light sources that cast light on scene geometry, full-scene dynamic shadows, ambient occlusion, reflections, and bump maps; depth of field, zoom, and other camera effects – and a whole lot more.

Continue reading
  9377 Hits
Jim Kogler

Understanding VR-Vantage Scene Performance – Part 1: The Drawable

To draw a frame, VR-Vantage needs to a) build/update the scene graph, b) organize the scene graph and send it to the GPU, and c) have the GPU render the scene. There are a bunch of other steps, like loading terrain from disk (or across the network) and processing network packets (DIS/HLA or CIGI), but for the most part those occur in other threads. I will address each of these steps in future posts. For the moment, let’s just focus on static scenes devoid of entities. Let’s look at *this* scene:

Continue reading
  22333 Hits
Petr Kyn

Tips for artists building 3D models for VR-Vantage

If you have a deep technical understanding of building 3D models, here are some quick tips to make sure they perform optimally in your VR-Vantage application.

To make your models as fast as possible, you need to minimize drawables. Each drawable is a collection of polygons with the same state set and the same primitive set. Every time the information about a drawable is sent to the graphics card, your draw time goes up significantly, and having hundreds if not thousands of drawables in your scene will kill your performance. The rules for reducing drawables follow after the break; the core idea – merging geometry that shares both a state set and a primitive set – is sketched below.
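
Here is a toy model of that bookkeeping – the names are invented, purely to show why sharing state sets and primitive sets lets geometry collapse into fewer drawables:

```python
from collections import defaultdict

def drawables_after_merging(drawables):
    """Toy model: a drawable is (state_set, primitive_type, polygon_count).
    Geometry that shares both a state set and a primitive type can be
    merged into a single submission to the graphics card."""
    merged = defaultdict(int)
    for state_set, primitive_type, polygons in drawables:
        merged[(state_set, primitive_type)] += polygons
    return len(merged)

scene = [("brick", "triangles", 200),
         ("brick", "triangles", 450),   # merges with the first brick mesh
         ("glass", "triangles", 120)]
print(drawables_after_merging(scene))   # 3 drawables collapse to 2
```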

Continue reading
  14901 Hits
Dan Brockway

Adding Synthetic Aperture Radar (SAR) to Your UVS Simulation’s Intelligence, Surveillance, and Reconnaissance (ISR) Capability

At MÄK, we help our customers simulate unmanned vehicles in a lot of ways, depending on what part of the system architecture the customer is addressing. Some use VR-Forces to simulate the UAV’s mission plans and flight dynamics. Some use VR-Vantage to simulate the EO/IR sensor video. Of those, some use VR-Vantage as the basis of their payload simulation and others stream video into their ground control station (GCS) from a VR-Vantage streaming video server. 

All of our customers now have the opportunity to add a Synthetic Aperture Radar (SAR) to their UAV simulations — and here’s how to do it. SensorFx SAR Server comes in two parts: a client and a server. The server runs on a machine on your network and connects to one or more clients. Whenever a client requests a SAR image, it sends a message to the server, providing the flight information of the UAV and the target location where the SAR image should be taken. The server, built with VR-Vantage, then uses JRM Technologies’ radar simulation technology to generate a synthetic radar image and return it to the client.
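
To make the request/response flow concrete, here is a hedged sketch of what a client request message could carry. The field names and JSON encoding are invented for illustration; they are not the actual SensorFx client API:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class SarImageRequest:
    """Hypothetical request shape mirroring the flow described above."""
    platform_lat: float     # UAV flight information
    platform_lon: float
    platform_alt_m: float
    heading_deg: float
    target_lat: float       # target area for the SAR image
    target_lon: float
    swath_width_m: float    # one of the image-generation parameters

def encode_request(request: SarImageRequest) -> bytes:
    # Serialize the request for transport to the server, which renders
    # the synthetic radar image and returns it to the client.
    return json.dumps(asdict(request)).encode("utf-8")

payload = encode_request(SarImageRequest(
    platform_lat=21.46, platform_lon=-158.05, platform_alt_m=6000.0,
    heading_deg=90.0, target_lat=21.33, target_lon=-157.94,
    swath_width_m=1000.0))
```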

The SAR Server renders SAR images taking into account the specified radar properties, the terrain database, and knowledge of all the simulated entities. The radar parameters are configured on the server in advance of the simulation. The terrain database uses the same material classification data that SensorFX uses for rendering infrared camera video, so your sensor package will have the best possible correlation. The server connects to the simulation exercise network using DIS or HLA so that it has knowledge of all the entities. It uses this knowledge to include targets in the SAR scenes, and it lets you host the SAR sensor on a simulated entity.

Continue reading
  8897 Hits
Dan Brockway

VR-Vantage At IITSEC: L-3

Also at IITSEC 2010, L-3 built a great solution using VR-Vantage XR and VR-Forces.
  2465 Hits