SensorFusionAI
Sensor-Agnostic, 3D Data Fusion Engine for Complex Environments
Detection of drones, or Unmanned Aerial Systems (UAS), is moving towards a multi-sensor approach for fixed sites (and, where space and budget allow, for vehicle and ship systems), because multiple sensor modalities, such as Radio Frequency (RF), radar, acoustic, and camera systems, deliver better detection results. These sensors can be deployed on a single node or across multiple nodes.
However, the multi-sensor approach only generates better results when an intelligent software engine fuses the sensor outputs into a coherent set of outputs; otherwise, adding more sensors is counterproductive, as it creates more data without a clear way to manage it.
DroneShield has developed a true AI-based SensorFusion engine, SensorFusionAI (SFAI), initially for its own DroneSentry-C2 Command-and-Control system, supporting all common drone detection modalities (RF, radar, acoustics, camera). It is also available for third-party C2 manufacturers to add to their C2 systems, improving the performance of their C2 offerings.
Features
Behaviour Analysis: Track an object to determine classification and predict trajectory
Threat Assessment: Intelligently determine threat level based on a wide range of data types
Confidence Levels: Designed for complex, high noise environments, with inconsistent data inputs
After-Action Reporting: Sophisticated analytics presented in easy-to-interpret graphical dashboards
Edge Processing: Utilizes an edge processing device (SmartHub) for reduced network load and high scalability
Versatile Adaptable Inputs: New sensors use existing software adaptors to reduce integration time
Output to Any Platform: Visualization on DroneSentry-C2 or third-party C2 platforms, data analysis, alert systems, or security management software
Data Flow
1 - Observe:
Ingests data from all sensor modalities and manufacturers. APIs are available for ingesting multi-sensor technology data outputs. Random Finite Sets analysis determines emitter presence (Object Existence Estimation) and tracks (Object Count Estimation)
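The existence-estimation idea in the Observe step can be illustrated with a minimal single-object Bayesian update — a simplified stand-in for the Random Finite Sets machinery named above, not DroneShield's implementation. The detection and false-alarm probabilities are illustrative assumptions:

```python
def update_existence(p_exist, detected, p_d=0.9, p_fa=0.05):
    """Bayesian update of the probability that an emitter exists.

    p_exist:  prior existence probability
    detected: whether a sensor reported a hit this scan
    p_d:      assumed probability a real object is detected per scan
    p_fa:     assumed probability of a false alarm when nothing is there
    """
    if detected:
        num = p_d * p_exist
        den = p_d * p_exist + p_fa * (1.0 - p_exist)
    else:
        num = (1.0 - p_d) * p_exist
        den = (1.0 - p_d) * p_exist + (1.0 - p_fa) * (1.0 - p_exist)
    return num / den
```

Repeated detections drive the existence probability towards 1, while misses pull it down — which is how a fusion engine can confirm an object without a single hard detection threshold.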
2 - Orient:
Predicts the track of detected objects that have disappeared from view. Single-object tracking via non-linear Multi-Hypothesis State Estimation, with Gaussian Sum Filters for measurement updates and motion models for prediction
3 - Decide:
Fuses multiple sensor ‘hits’ when they originate from the same UAS. Multiple-object distinction and tracking via density clustering models and the Joint Probabilistic Data Association Filter (JPDAF)
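Grouping hits that belong to the same UAS can be sketched with a simple distance-based clustering pass — a single-linkage simplification of the density clustering named above, with an assumed merge radius:

```python
def cluster_hits(hits, eps=25.0):
    """Group sensor 'hits' (x, y) so that hits within `eps` metres of any
    existing cluster member are treated as returns from the same object.

    `eps` is an illustrative gate distance, not a tuned SFAI parameter.
    """
    clusters = []
    for hit in hits:
        # Find every cluster this hit falls within range of
        near = [c for c in clusters
                if any((hit[0] - m[0])**2 + (hit[1] - m[1])**2 <= eps**2
                       for m in c)]
        if not near:
            clusters.append([hit])
        else:
            merged = near[0]
            merged.append(hit)
            # A hit bridging two clusters merges them into one object
            for c in near[1:]:
                merged.extend(c)
                clusters.remove(c)
    return clusters
```

In a real fusion engine the association step (e.g. JPDAF) would additionally weight each hit by measurement likelihood rather than using a hard distance gate.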
4 - Act:
Generates UAS tracks rather than isolated points. Actionable intelligence with rich information, threat level, and confidence level. Track-Oriented Multi-Hypothesis Tracker (TO-MHT) for data association and multiple-object tracking
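The shape of a track-level output (as opposed to raw detection points) can be illustrated with a small data structure. The field names, confidence accumulation rule, and threat thresholds are assumptions for illustration, not the SFAI output schema:

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """A UAS track: a point history plus fused confidence and threat level."""
    track_id: int
    points: list = field(default_factory=list)  # (t, x, y) history
    confidence: float = 0.0                     # fused confidence, 0..1

    def add(self, t, x, y, sensor_conf):
        """Fold in one associated hit with its per-sensor confidence."""
        self.points.append((t, x, y))
        # Simple evidence accumulation: each hit reduces the remaining doubt
        self.confidence = 1.0 - (1.0 - self.confidence) * (1.0 - sensor_conf)

    @property
    def threat_level(self):
        if self.confidence >= 0.8:
            return "HIGH"
        if self.confidence >= 0.5:
            return "MEDIUM"
        return "LOW"
```

Emitting a `Track` rather than individual hits is what turns sensor data into actionable intelligence: the consumer sees one object with a trajectory, confidence, and threat level.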
Correlation vs SensorFusionAI
Direct Correlation
Direct Correlation: Utilizes methods such as temporal and spatial correlation for object confirmation.
The system attempts to correlate raw sensor data in isolated incidents; data is not weighted based on the environment
Hard sensitivity selection generates high numbers of either false positives or false negatives
Results in inconsistent paths or dropped detections
Conflicting or bad sensor data can produce inaccurate results, breaking trust in the system
Cluttered display and need for human analysis increases operator cognitive load
SensorFusionAI
SensorFusion: Sensors are utilized for their strengths with their weaknesses offset by the strengths of sensor types.
System intelligently builds a model informed by all inputs over time
Confidence values allow for soft sensitivity selection, reducing both false positives and false negatives
Prediction model can interpolate paths for consistent tracking even with sparse data
Incomplete or contradictory data is mediated by a comprehensive object model
All sensor data fused into one consistent intelligence packet
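The contrast between hard and soft sensitivity selection above can be sketched in a few lines. The thresholds and environment weights are illustrative assumptions; the point is that weighting sensors by trust suppresses a single noisy sensor that would otherwise trigger a false positive:

```python
def hard_alert(sensor_scores, threshold=0.7):
    """Direct-correlation style: alert if any single sensor crosses a hard threshold."""
    return any(s >= threshold for s in sensor_scores)

def soft_alert(sensor_scores, weights, threshold=0.7):
    """Fusion style: weight each sensor by environment-dependent trust
    (e.g. down-weight RF in an RF-noisy urban site) and alert on the
    combined confidence. Weights here are illustrative, not SFAI's."""
    fused = sum(w * s for w, s in zip(weights, sensor_scores)) / sum(weights)
    return fused >= threshold
```

For example, with scores `[0.9, 0.2, 0.15]` where the first sensor is known to be unreliable in the current environment (weight 0.2 vs 1.0), the hard scheme raises a false alarm while the weighted fusion correctly stays quiet.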
How to Buy
See the How to Buy page for more information on purchasing DroneShield products.