Projecting Sensor Video on Terrain
For a recent R&D project, our IRIS UxS product team was asked to quickly add a feature to our UAS navigation application that would allow live video data to be projected onto map terrain.
In applications where the UAS platform payload camera provides STANAG 4609 metadata with frame coordinates, it should be relatively straightforward for developers to render the camera imagery on the map.
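When the metadata supplies the geographic coordinates of the four frame corners, the projection reduces to finding the perspective transform (homography) from pixel space to ground space. Below is a minimal sketch of that step using a direct linear transform; the frame size and corner coordinates are hypothetical stand-ins for real STANAG 4609 corner points.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Solve for the 3x3 projective transform mapping the four src
    points onto the four dst points via direct linear transform (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A (smallest singular value).
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)

def project(H, x, y):
    """Map a pixel (x, y) through H, normalizing the homogeneous result."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Video frame corners (pixels) and their geolocations from the metadata
# (coordinates here are hypothetical, not taken from a real mission).
frame = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
ground = [(-114.05, 51.06), (-114.03, 51.06),
          (-114.02, 51.04), (-114.06, 51.05)]

H = homography_from_corners(frame, ground)
lon, lat = project(H, 960, 540)  # ground point under the frame center
```

With `H` in hand, every pixel of the frame can be warped onto the map, which is essentially what the rendering engine does per frame on the GPU.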
With a little more effort, this could be extended into a system that renders the projected imagery into a map layer and saves it as a near real-time operational map for applications that need up-to-date imagery, such as ongoing ground search-and-rescue (GSAR) missions, major events, or infrastructure surveys.
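The layer-accumulation idea can be sketched as a fixed geographic grid where each new projected frame overwrites the cells it covers, keeping the most recent pixel per cell. This is illustrative only; a production version would write into the map engine's own raster layer rather than a NumPy array, and would rasterize the exact projected quad instead of its bounding box.

```python
import numpy as np

class TerrainMosaic:
    """Toy near real-time imagery layer: a lat/lon grid where each
    incoming frame stamps its footprint, newest pixels winning."""

    def __init__(self, lon_min, lat_min, lon_max, lat_max, cols, rows):
        self.lon_min, self.lat_min = lon_min, lat_min
        self.dx = (lon_max - lon_min) / cols   # degrees per column
        self.dy = (lat_max - lat_min) / rows   # degrees per row
        self.pixels = np.zeros((rows, cols), dtype=np.uint8)
        self.stamp = np.full((rows, cols), -1.0)  # last-update time per cell

    def update(self, corner_lons, corner_lats, value, t):
        # Stamp the frame's bounding box into the grid (a refinement
        # would fill only the projected quad).
        c0 = max(int((min(corner_lons) - self.lon_min) / self.dx), 0)
        c1 = min(int((max(corner_lons) - self.lon_min) / self.dx),
                 self.pixels.shape[1] - 1)
        r0 = max(int((min(corner_lats) - self.lat_min) / self.dy), 0)
        r1 = min(int((max(corner_lats) - self.lat_min) / self.dy),
                 self.pixels.shape[0] - 1)
        self.pixels[r0:r1 + 1, c0:c1 + 1] = value
        self.stamp[r0:r1 + 1, c0:c1 + 1] = t

# One frame's footprint stamped into the layer (hypothetical values).
layer = TerrainMosaic(-115.0, 51.0, -114.0, 52.0, cols=100, rows=100)
layer.update([-114.6, -114.5], [51.4, 51.5], value=200, t=0.0)
```

Tracking a per-cell timestamp also makes it easy to age out stale imagery or to visualize how recently each part of the map was observed.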
This capability is generally available only in environments that support post-processing. The TerraLens SDK, however, supports on-demand updates to map layers at a refresh rate fast enough for this type of operation, while still allowing the product to run on a consumer-grade laptop.
Our team implemented the feature fairly quickly and built a prototype for the project. The screenshot above shows live video from the UAS projected onto the map in real time. It also demonstrates one of the challenges of projecting video on terrain: if the payload camera isn't perpendicular to the terrain, the projected imagery is distorted when applied to the map. This can, however, be mitigated to a degree through the choice of camera angle, lens, and field of view.
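To get a feel for how quickly that distortion grows, consider the along-track ground footprint of the camera over flat terrain. This small sketch (altitude, field of view, and tilt values are hypothetical) shows the footprint roughly doubling when the camera tilts 45° off nadir:

```python
import math

def footprint_length(altitude_m, vfov_deg, tilt_deg):
    """Along-track ground footprint of a camera at the given altitude,
    vertical field of view, and tilt from nadir (0 = straight down).
    Flat-terrain approximation; assumes tilt + vfov/2 < 90 degrees."""
    half = math.radians(vfov_deg) / 2.0
    tilt = math.radians(tilt_deg)
    near = altitude_m * math.tan(tilt - half)  # ground offset of near edge
    far = altitude_m * math.tan(tilt + half)   # ground offset of far edge
    return far - near

nadir = footprint_length(100, 30, 0)     # camera looking straight down
oblique = footprint_length(100, 30, 45)  # same camera tilted 45 degrees
```

The same number of video pixels is stretched over the larger oblique footprint, which is exactly the loss of resolution visible in the projected imagery; keeping the camera closer to nadir, or narrowing the field of view, limits the effect.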