
At MÄK, we help our customers simulate unmanned vehicles in a lot of ways, depending on what part of the system architecture the customer is addressing. Some use VR-Forces to simulate the UAV’s mission plans and flight dynamics. Some use VR-Vantage to simulate the EO/IR sensor video. Of those, some use VR-Vantage as the basis of their payload simulation and others stream video into their ground control station (GCS) from a VR-Vantage streaming video server. 

All of our customers now have the opportunity to add a Synthetic Aperture Radar (SAR) to their UAV simulations, and here's how it works. SensorFX SAR Server comes in two parts: a client and a server. The server runs on a machine on your network and connects to one or more clients. Whenever a client requests a SAR image, it sends a message to the server with the UAV's flight information and the target location where the image should be taken. The server, built with VR-Vantage, then uses JRM Technologies' radar simulation technology to generate a synthetic radar image and return it to the client.
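To make that request/response exchange concrete, here is a minimal sketch of what a client-side request could carry. The SarImageRequest structure, its field names, and the requestSarImage call are illustrative assumptions, not the actual SensorFX SAR Server client API; consult the product documentation for the real interface and message format.

```cpp
#include <iostream>
#include <string>

// Hypothetical request payload: the real SensorFX SAR Server message format
// is defined by the product's client API and is not shown here.
struct SarImageRequest
{
    // UAV flight information at the moment of image capture.
    double uavLatitude;      // degrees
    double uavLongitude;     // degrees
    double uavAltitude;      // meters above mean sea level
    double uavHeading;       // degrees from true north
    double uavSpeed;         // meters per second

    // Ground location the radar should image.
    double targetLatitude;   // degrees
    double targetLongitude;  // degrees
};

// Hypothetical stand-in for sending the request to the SAR Server and
// receiving the rendered image; a real client would do this over the network.
std::string requestSarImage(const SarImageRequest& req)
{
    std::cout << "Requesting SAR image of ("
              << req.targetLatitude << ", " << req.targetLongitude
              << ") from a UAV at " << req.uavAltitude << " m\n";
    return "sar_image.png"; // placeholder for the image returned by the server
}

int main()
{
    SarImageRequest req{35.05, -106.60, 3000.0, 90.0, 60.0, 35.10, -106.55};
    std::string image = requestSarImage(req);
    std::cout << "Received " << image << " from the SAR Server\n";
    return 0;
}
```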

The SAR Server renders SAR images, taking into account the specified radar properties, the terrain database, and knowledge of all the simulated entities. The radar parameters are configured on the server in advance of the simulation. The terrain database uses the same material classification data that SensorFX uses for rendering infrared camera video, so your sensor package will have the best possible correlation. The server connects to the simulation exercise network using DIS or HLA so that it has knowledge of all the entities. It uses this knowledge to include targets in the SAR scenes and to let you host the SAR sensor on a simulated entity.
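As a rough illustration of what "radar parameters configured in advance" means, the sketch below lists the kinds of properties a SAR sensor model typically needs. The RadarProperties structure, its names, and its values are assumptions made for illustration only; the actual parameter set and configuration mechanism are defined by the SAR Server itself.

```cpp
#include <iostream>

// Hypothetical radar properties set on the server before the exercise starts;
// the real SensorFX SAR Server exposes its own configurable parameter set.
struct RadarProperties
{
    double centerFrequencyGHz;    // e.g. X-band radars operate near 10 GHz
    double bandwidthMHz;          // wider bandwidth gives finer range resolution
    double rangeResolutionM;      // resolution along the line of sight
    double crossRangeResolutionM; // resolution across the line of sight
    double swathWidthM;           // size of the imaged patch on the ground
};

int main()
{
    // Illustrative values chosen only to show the shape of the configuration,
    // not to match any particular radar.
    RadarProperties radar{10.0, 300.0, 0.5, 0.5, 2000.0};

    std::cout << "SAR configured at " << radar.centerFrequencyGHz
              << " GHz, imaging a " << radar.swathWidthM
              << " m swath at " << radar.rangeResolutionM
              << " m range resolution\n";
    return 0;
}
```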
