The ATLAS Lab at Florida Tech submitted a challenge problem: design a desktop and Virtual Reality (VR) small Unmanned Aircraft Systems (sUAS) Synthetic Task Environment (STE).
The ATLAS Lab needed a high-cognitive-fidelity simulator to facilitate training transfer to real-world operations, incorporating a user interface that mimics popular commercial sUAS interfaces and map data from actual sUAS training sites.
The resulting simulator, UAVSIM, needed to mimic a modern UAS-led rescue mission in VR and to allow testing of different interface configurations.
It also needed the ability to automatically collect situation awareness and performance data.
Students designed the VR and desktop STE using Unreal Engine 4.
Blender was used for 3D modeling of the UAV, GIMP for creating icons, and satellite imagery as a blueprint for recreating real-world locations.
UAVSIM can be used in single-monitor, dual-monitor, and VR headset configurations, each supporting two interface modes:
Traditional operation, in which the environmental view is presented separately from a heads-down display of the vehicle parameters (as with a tablet-based controller).
Heads-up operation, in which the vehicle parameters are overlaid on the environmental view.
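The interface options above amount to a small configuration space (monitor count, VR on/off, heads-up vs. heads-down). UAVSIM itself is built in Unreal Engine 4, so its actual settings live in the engine; the following Python sketch only illustrates the shape of that configuration space, with all names hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class DisplayMode(Enum):
    # Heads-down: environment view and instrument panel shown separately,
    # as with a tablet-based controller.
    HEADS_DOWN = "heads_down"
    # Heads-up: vehicle parameters overlaid on the environmental view.
    HEADS_UP = "heads_up"

@dataclass
class InterfaceConfig:
    """Hypothetical interface configuration; not UAVSIM's actual API."""
    monitors: int = 1            # 1 or 2 desktop monitors
    vr_headset: bool = False     # desktop vs. VR headset configuration
    display_mode: DisplayMode = DisplayMode.HEADS_DOWN

# Example: a dual-monitor desktop setup in heads-up mode.
cfg = InterfaceConfig(monitors=2, display_mode=DisplayMode.HEADS_UP)
```

Enumerating the conditions this way is convenient for experiments that compare display configurations, since each condition is a single config object.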
UAVSIM was developed to be flexible and reconfigurable and to support a variety of training and research needs. All vehicle and target parameters can be easily manipulated to vary mission type and difficulty. UAVSIM integrates data collection tools such as performance tracking, event timestamping, and situation awareness assessment queries.
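The core of such data collection is timestamping mission events and experimenter queries against a common session clock. UAVSIM's internal implementation is not published; a minimal illustrative sketch (all names and structure hypothetical) could look like:

```python
import json
import time

class EventLogger:
    """Minimal event-timestamping logger for a simulation session.

    Illustrative only; this is not UAVSIM's actual data collection code.
    """

    def __init__(self):
        # monotonic() avoids jumps if the system clock is adjusted mid-session.
        self.start = time.monotonic()
        self.events = []

    def log(self, event_type, **details):
        # Record the event with a timestamp relative to session start.
        self.events.append({
            "t": round(time.monotonic() - self.start, 3),
            "type": event_type,
            **details,
        })

    def export(self, path):
        # Dump the session log as JSON for offline analysis.
        with open(path, "w") as f:
            json.dump(self.events, f, indent=2)

# Example session: a target detection followed by a situation awareness query.
log = EventLogger()
log.log("target_detected", target_id=3)
log.log("sa_query", question="nearest hazard?", correct=True)
log.export("session_log.json")
```

Logging every event with a relative timestamp makes it straightforward to compute performance measures such as detection latency or time between a query and its response.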
UAVSIM has been tested with local first responders and used in an experiment examining the impact of heads-up vs. heads-down display configuration on situation awareness.
Future work will recreate the simulated task in real-world operations using AR glasses to validate UAVSIM as a testbed for sUAS research and training.
Future development will include terrain and mission parameters to support ocean rescue and fire inspection scenarios.