Slew-to-Cue Electro-Optical and Infrared Sensor Network for Small UAS Detection, Tracking, and Identification
Tilt/Pan Tracking EO/IR Camera System
Sam Siewert, ICARUS Group
AIAA SciTech 2019, San Diego, January 11, 2019


Page 1:


Page 2:

www.Bloomberg.com

Drone Net - Challenge & Significance
Motivation – Growth of sUAS, NASA UTM and FAA UPP, IPP
Problem – Sharing Airspace
Interim solution
– Part 107, Restrictions, ADS-B Rx
– sUAS ADS-B Tx/Rx insufficient

Research and Development
– Explore sensor fusion of ground camera (visible and infrared) and acoustic passive sensing
– Compare to ground RADAR and flight LIDAR
– Address Public Opportunity and Concern
– Enable Safe Urban UAS Operations

Sam Siewert, ICARUS Group AIAA SciTech 2019 - Drone Net 2

www.citylab.com/transportation

Page 3:

Multi-Node Information Fusion Concept
1. EO/IR Multi-spectral camera system
– Visible, NIR, LWIR pixel-level fusion, narrow FoV
2. All-sky camera - hemispherical, high resolution (2 MP)
– Azimuth and Elevation (AZ, EL cue to slew EO/IR)
3. Microphone arrays
– Sound Intensity Probe (beam-forming) microphone array (AZ, EL)
4. K-band RADAR system (all sky or tracking)
– Echodyne sUAS tracking to 1 km+ (installing spring 2019)
5. Flight LIDAR and EO/IR - last 50 foot navigation

Link Drone Net Nodes
– Detect and Track Compliant and Non-compliant sUAS
– Wireless to Acoustic arrays, wired All-sky, EO/IR and RADAR

ADS-B aggregation and improvement
– Comparing ADS-B (e.g. ping2020) to OEM and Custom IMU+GPS


Page 4:

Goal and Objectives for Paper
1. All-sky camera - hemispherical, high resolution
– Build 6 x 2 MP camera array to determine feasibility of concept
– Azimuth and Elevation (AZ, EL cue to slew EO/IR)
– AZ, EL estimation to narrow EO/IR search space (coarse)
– Cues for multiple targets that EO/IR can prioritize (drone swarm)
2. Microphone arrays
– Sound Intensity Probe (beam-forming) microphone array (AZ, EL)
– Correspondence to coarse AZ, EL determined by All-sky
– Show that All-sky assists to confirm drone detection (compared to plane, bird, bug, false positive)
3. EO/IR Multi-spectral camera system
– Can re-detect target of interest from cue to track for detailed MV/ML
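Objective 2's correspondence check between the acoustic and All-sky coarse cues could be sketched as an angular-distance test; a minimal sketch, where the tolerance value and function names are assumed placeholders, not from the slides:

```python
import math

def angular_separation(az1, el1, az2, el2):
    """Great-circle angle (degrees) between two (azimuth, elevation) directions."""
    a1, e1, a2, e2 = map(math.radians, (az1, el1, az2, el2))
    cos_sep = (math.sin(e1) * math.sin(e2) +
               math.cos(e1) * math.cos(e2) * math.cos(a1 - a2))
    # Clamp to guard against floating-point drift outside [-1, 1]
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_sep))))

def cues_correspond(allsky_cue, acoustic_cue, tol_deg=10.0):
    """True if two coarse (AZ, EL) cues point at approximately the same target."""
    az1, el1 = allsky_cue
    az2, el2 = acoustic_cue
    return angular_separation(az1, el1, az2, el2) <= tol_deg
```

A matched pair of cues within tolerance would confirm a detection; widely separated cues suggest two targets or a false positive.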


Page 5:

Method - Slew to Cue Scaling for Campus


– All-sky camera coarse AZ, EL estimate narrows search space for EO/IR cameras
– Acoustic array (sound intensity probe) serves the same purpose
– Simple motion detect with classification (Plane, Drone, Bug/Bird, Other)
– Operates in real time; wireless connections between acoustic nodes, all-sky, EO/IR

Page 6:

All-Sky Detection For Coarse AZ, EL
– Feasibility with 12 MP, 6-camera array shown
– Motion detect pixel registration for coarse AZ, EL
– Correspondence to Acoustic coarse AZ, EL


Step 1 - Camera that registers drone target (classified) over time
– Camera Registering Drone or Plane
– All-sky camera coverage and FoV Model

Specific Camera (C1…C6) Pixel Registration
Azimuth: compute from X, Y registered pixel COM (Xbar, Ybar)
Angle-off-CamAzimuth = (Xbar - [Xres/2]) / Xpixels-per-degree
Azimuth = CamAzimuth + Angle-off-CamAzimuth = 41.36 degrees
Elevation = Camtilt + [(Yres - Ybar) / Ypixels-per-degree] = 55.45 degrees
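The pixel-registration formulas above translate directly into code; a sketch where argument names are mine, and the pixels-per-degree values would come from each camera's calibration (the slide's 41.36/55.45 degree example values depend on camera parameters not given here):

```python
def coarse_az_el(xbar, ybar, cam_azimuth_deg, cam_tilt_deg,
                 xres, yres, xpix_per_deg, ypix_per_deg):
    """Coarse (azimuth, elevation) in degrees from the registered pixel
    center of mass (xbar, ybar), per the slide's formulas."""
    # Offset of the target COM from the camera boresight, in degrees
    angle_off = (xbar - xres / 2.0) / xpix_per_deg
    azimuth = cam_azimuth_deg + angle_off
    # Elevation measured up from the camera's tilt reference
    elevation = cam_tilt_deg + (yres - ybar) / ypix_per_deg
    return azimuth, elevation
```

A target registered exactly at image center returns the camera's own pointing azimuth, as expected.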

Page 7:

Acoustic Detection For Coarse AZ, EL
– Distributed processing microphone array
– Sound intensity probe feasibility for coarse AZ, EL
– Correspondence to All-sky visual coarse AZ, EL


Page 8:

EO/IR Identification Method – MV/ML

Machine Vision using SoC Linux built into each EO/IR Tracker
– Salient Object Detection (R-CNN sensor fusion input pre-processing)
– Shape, Behavior and Contrast/Color/Texture in Multiple Bands
– Performance [ROC, PR, F-measure, confusion matrices]
– Real-Time Detection, Segmentation, Tracking, Classification, Identification

Comparing R-CNN and Deep Learning Methods to Traditional Machine Learning
– Expert systems
– Bayesian inference, Deep Belief Net
– PCA [Principal Component Analysis]
– SVM [Support Vector Machines]
– Clustering [e.g. K-means]
– GPU Accelerated DNN (cuDNN)
– Supervised, Unsupervised learning

Leverage Open Source: ROS, OpenCV, PyBrain, PyML, MLpack, cuDNN, Caffe, TensorFlow
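As a toy illustration of the motion-detect front end, the following is a simplified frame-differencing sketch, a stand-in for the deck's OpenCV/R-CNN salient object detection rather than the project's actual pipeline; thresholds are arbitrary:

```python
import numpy as np

def detect_motion(prev_frame, frame, thresh=25, min_pixels=20):
    """Frame-difference motion detector: returns the (row, col) center of
    mass of changed pixels, or None if too few pixels changed."""
    # Signed difference so uint8 frames don't wrap around
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > thresh
    if mask.sum() < min_pixels:
        return None            # not enough motion to register a target
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()
```

The returned center of mass is the (Xbar, Ybar) pixel registration that feeds the coarse AZ, EL estimate on the previous slide.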


Page 9:

Figure 1. Visible Detection Example @ 1Km

Hpixels = (Hsection / G_Hfov) × R_Hpixels
Vpixels = (Vsection / G_Vfov) × R_Vpixels
AR = H_D / V_D
G = (g / f') × B

AR is aspect ratio, B is the object image size on the detector, f' is the focal length, g is the working distance, and G is the physical extent of an observable object.

E.g. the ALTA6 sUAS has Hsection = 1126 mm; at g = 617.5 meters using an LWIR 6.0 degree Hfov, G = 64.96 meters, so the horizontal pixel extent for 640 line-scan resolution would be 11 pixels.

Figure 1 shows a test image from a 55 mm focal length visible camera with a 24 mm detector, such that G = 269.43 meters at 6K line-scan resolution, and therefore 25 pixels for an Hsection of 1126 mm.

EO/IR Tracker - Pixel Extents at Working Distance
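The pixel-extent relations can be checked numerically; a short sketch reproducing the slide's two worked examples (function names are mine):

```python
def ground_extent(g_m, focal_mm, detector_mm):
    """Physical field extent G (meters) at working distance g: G = (g / f') * B."""
    return g_m * detector_mm / focal_mm

def pixel_extent(section_m, extent_m, resolution_pixels):
    """Pixels subtended by an object cross-section within the field extent G."""
    return section_m / extent_m * resolution_pixels

# Visible example: 55 mm lens, 24 mm detector, g = 617.5 m -> G ~ 269.4 m
G_vis = ground_extent(617.5, 55.0, 24.0)
px_vis = pixel_extent(1.126, G_vis, 6000)    # ALTA6 Hsection = 1126 mm, ~25 pixels
# LWIR example: G = 64.96 m at 640 line-scan resolution, ~11 pixels
px_lwir = pixel_extent(1.126, 64.96, 640)
```

Both examples round to the slide's 25- and 11-pixel extents, small enough that detection (not identification) is the realistic goal at these working distances.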


Page 10:

[Block diagram: verification data flow. MV/ML Ground EO/IR frames (detection, classification and identification on a subset of frames from a continuous 10 Hz baseline) and MV/ML Flight EO/IR frames (OEM snapshot for prototype; MV/ML a future enhancement) feed time-correlated frame retrieval and frame comparison. OEM navigation log data, HF navigation log data (future enhancement), and ADS-B log data (sUAS, GA compliant identification) feed MATLAB geometric analysis and re-simulation, producing simulated HFOV, VFOV and cross section of the tracked sUAS, plus synthetic frame generation. Outputs: MV/ML detection performance (HRV ROC, PR, F-measure) scored against human review of detection, classification, and identification {TP, FP, TN, FN}, and localization error with ADS-B identification and detection {TP, FP, TN, FN}. Truth sources: HF truth, OEM truth, optical navigation truth, ADS-B truth.]

Predicted \ Actual    sUAS      Not sUAS
sUAS                  250 TP    43 FP
Not sUAS              0 FN      3 TN
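The confusion-matrix counts above map directly to the deck's PR and F-measure metrics; a minimal sketch (function name is mine):

```python
def pr_f1(tp, fp, fn):
    """Precision, recall, and F-measure from confusion-matrix counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Counts from the slide: 250 TP, 43 FP, 0 FN (3 TN do not enter PR/F-measure)
p, r, f = pr_f1(250, 43, 0)
```

With zero false negatives, recall is 1.0 and precision (~0.85) is the limiting metric, consistent with the slide's emphasis on false-positive review.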


Page 11:

Future Work on Drone Net Aerial Node
Project at ERAU Prescott
– HF Navigation compared to ADS-B
– LIDAR + LWIR Fusion
– Last 50 foot Urban Navigation

Goal - Determine safe urban operation, GPS-denied, for parcel delivery scenarios with Sense-and-Avoid

NASA UTM Challenge (2020)


LIDAR Point Cloud from Lab Bench Test with LWIR image fusion

Page 12:

Future Work on Data Management and MV/ML Analytics

Modest GP-GPU on-site processing

MV/ML on Workstation ground nodes (Lambda DevBox, 20 TB RAID)
– R-DBMS for Aerial Catalog
– File management of raw images
– Automated human review (AutoIt)
– Real-Time ATC NOTAMs
– Forensic browsing

Goal - Human Review Truth model and secure sharing of aerial catalog for registered and non-compliant sUAS


ERAU ICARUS STEM 125 Lab

Compliance Description | Flight Plan | ADS-B | Compliance | ATC Notification
Registered sUAS, flight plan filed, following flight plan, ADS-B, safe navigation. | X | X | Full | None
No ADS-B, unknown navigation equipment, standing waiver with Part 101 registered drone (e.g. hobby) | X | | Full | None
Registered sUAS, ADS-B, but not on filed flight plan. | | X | Partial | Warning
No ADS-B, unknown navigation equipment, no standing waiver or filed flight plan | | | Partial | Warning
No ADS-B, large visual size, no standing waiver or filed flight plan, not classified or identified as hobby drone, unexpected track, shape, texture, and color in visible and LWIR. | | | None | Safety Alert
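One illustrative encoding of the compliance table's decision logic; parameter names, defaults, and rule ordering are my own reading of the table, not the project's implementation:

```python
def classify_compliance(flight_plan, adsb, on_plan=True,
                        waiver=False, anomalous=False):
    """Map observed compliance flags to a (compliance level, ATC
    notification) pair, following the rows of the compliance table."""
    if anomalous:
        # Large visual size, unexpected track/shape/texture, not a hobby drone
        return "None", "Safety Alert"
    if adsb and flight_plan and on_plan:
        return "Full", "None"
    if waiver and not adsb:
        # Part 101 registered hobby drone flying under a standing waiver
        return "Full", "None"
    if adsb and not on_plan:
        return "Partial", "Warning"
    # No ADS-B, unknown navigation, no waiver or filed flight plan
    return "Partial", "Warning"
```

Rule order matters here: the safety-alert row is checked first so anomalous tracks are never downgraded by a later, more permissive rule.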

Page 13:

On-Going Verification with Re-simulation
Work in Progress

Presented at IEEE Aerospace
– MATLAB simulation to verify detection in HFOV, VFOV
– Track history and geometric observability
– Add virtual cameras to explore potential improvements

Goal - Geometric Truth model for compliant HF Nav, OEM Nav, ADS-B tracked drone


Track segment shown in “a”

Page 14:

Summary
Drone Net Architecture Defined and Shown Feasible
– All-Sky Camera Feasibility
– Acoustic Sound Intensity Probe Feasibility
– Extends work done on EO/IR previously
– Summer Experiment Planning in progress with tilt/pan EO/IR and Flight Node

Promise to match or enhance RADAR at low cost
– Integration of Echodyne RADAR spring 2019
– Active sensing
– Another truth model - 1) GPS/IMU, 2) ADS-B, 3) Human Review of MV/ML, 4) RADAR track

Forms Reference Design for UTM collaboration research

Next Steps …
– RADAR data fusion with passive sensing data
– Further exploration of acoustic (beam forming as well as SIP)
– Higher resolution All-sky camera
– Expand test grid to ¼ of ERAU Prescott Campus

Questions?


Page 15:

2018-19 Team – ERAU Sponsored
ERAU – Drone Net
– Dr. Sam Siewert, PI, Assistant Prof.
– Dr. Stephen Bruder, Co-I, ICARUS Director
– Dr. Mehran Andalibi, Co-I
– Dr. Iacopo Gentilini, Co-I
– Jonathan Buchholz - ME Robotics (graduate), MS UASE
– Dakota Burklund - AE Student (graduated)
– Garrison Bybee - SE Student

CU Boulder – Embedded Systems Engineering
– Steve Rizor - MS, ESE (acoustic design and analysis)
– Aasheesh Dandupally – MS, ESE
– Omkar Prabhu – MS, ESE
– Soumyatha Gavvala – MS, ESE (graduated)


Page 16:

Research References


Page 17:

References


S. Siewert, M. Andalibi, S. Bruder, I. Gentilini, A. Dandupally, S. Gavvala, O. Prabhu, J. Buchholz, D. Burklund, “Drone Net, a passive instrument network driven by machine vision and machine learning to automate UAS traffic management”, AUVSI Xponential poster, Denver, Colorado, May 2018.

S. Siewert, M. Andalibi, S. Bruder, I. Gentilini, J. Buchholz, “Drone Net Architecture for UAS Traffic Management Multi-modal Sensor Networking Experiments”, IEEE Aerospace Conference [presentation], Big Sky, Montana, March 2018.

S. Siewert, M. Vis, R. Claus, R. Krishnamurthy, S. B. Singh, A. K. Singh, S. Gunasekaran, “Image and Information Fusion Experiments with a Software-Defined Multi-Spectral Imaging System for Aviation and Marine Sensor Networks”, AIAA SciTech 2017, Grapevine, Texas, January 2017.

S. Siewert, V. Angoth, R. Krishnamurthy, K. Mani, K. Mock, S. B. Singh, S. Srivistava, C. Wagner, R. Claus, M. Demi Vis, “Software Defined Multi-Spectral Imaging for Arctic Sensor Networks”, SPIE Algorithms and Technologies for Multispectral, Hyperspectral, and Ultraspectral Imagery XXII, Baltimore, Maryland, April 2016.

S. Siewert, J. Shihadeh, Randall Myers, Jay Khandhar, Vitaly Ivanov, “Low Cost, High Performance and Efficiency Computational Photometer Design”, SPIE Sensing Technology and Applications, SPIE Proceedings, Volume 9121, Baltimore, Maryland, May 2014.

Piella, G. (2003). A general framework for multiresolution image fusion: from pixels to regions. Information fusion, 4(4), 259-280.

Blum, R. S., & Liu, Z. (Eds.). (2005). Multi-sensor image fusion and its applications. CRC press.

Liu, Z., Blasch, E., Xue, Z., Zhao, J., Laganiere, R., & Wu, W. (2012). Objective assessment of multiresolution image fusion algorithms for context enhancement in night vision: a comparative study. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 34(1), 94-109.

Fiott, Daniel. "Europe and the Pentagon’s third offset strategy." The RUSI journal 161.1 (2016): 26-31.

Page 18:

References


Simone, G., Farina, A., Morabito, F. C., Serpico, S. B., & Bruzzone, L. (2002). Image fusion techniques for remote sensing applications. Information fusion, 3(1), 3-15.

Mitchell, H. B. (2010). Image fusion: theories, techniques and applications. Springer Science & Business Media.

Szeliski, R. (2010). Computer vision: algorithms and applications. Springer Science & Business Media.

Sharma, G., Jurie, F., & Schmid, C. (2012, June). Discriminative spatial saliency for image classification. In Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on (pp. 3506-3513). IEEE.

Toet, A. (2011). Computational versus psychophysical bottom-up image saliency: A comparative evaluation study. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 33(11), 2131-2146.

Valenti, R., Sebe, N., & Gevers, T. (2009, September). Image saliency by isocentric curvedness and color. In Computer Vision, 2009 IEEE 12th International Conference on (pp. 2185-2192). IEEE.

Wang, M., Konrad, J., Ishwar, P., Jing, K., & Rowley, H. (2011, June). Image saliency: From intrinsic to extrinsic context. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on (pp. 417-424). IEEE.

Liu, F., & Gleicher, M. (2006, July). Region enhanced scale-invariant saliency detection. In Multimedia and Expo, 2006 IEEE International Conference on (pp. 1477-1480). IEEE.

Cheng, M. M., Mitra, N. J., Huang, X., & Hu, S. M. (2014). Salientshape: Group saliency in image collections. The Visual Computer, 30(4), 443-453.

http://global.digitalglobe.com/sites/default/files/DG_WorldView2_DS_PROD.pdf

http://www.spaceimagingme.com/downloads/sensors/datasheets/DG_WorldView3_DS_2014.pdf

Richards, Mark A., James A. Scheer, and William A. Holm. Principles of modern radar. SciTech Pub., 2010.

Page 19:

References


Brown, Christopher D., and Herbert T. Davis. "Receiver operating characteristics curves and related decision measures: A tutorial." Chemometrics and Intelligent Laboratory Systems 80.1 (2006): 24-38.

Wang, Bin, and Piotr Dudek. "A fast self-tuning background subtraction algorithm." Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops. 2014.

Panagiotakis, Costas, et al. "Segmentation and sampling of moving object trajectories based on representativeness." IEEE Transactions on Knowledge and Data Engineering 24.7 (2012): 1328-1343.

Public SDMSI shared data web site for video sequences captured and used in two experiments presented in this paper - http://mercury.pr.erau.edu/~siewerts/extra/papers/AIAA-SDMSI-data-2017/

Perazzi, Federico, et al. "Saliency filters: Contrast based filtering for salient region detection." Computer Vision and Pattern Recognition (CVPR), 2012 IEEE Conference on. IEEE, 2012.

Achanta, Radhakrishna, et al. "SLIC superpixels compared to state-of-the-art superpixel methods." IEEE Transactions on Pattern Analysis and Machine Intelligence 34.11 (2012): 2274-2282.

Hou, Xiaodi, and Liqing Zhang. "Saliency detection: A spectral residual approach." 2007 IEEE Conference on Computer Vision and Pattern Recognition. IEEE, 2007.

Cheng, Ming-Ming, Niloy J. Mitra, Xiaolei Huang, Philip H. S. Torr, and Shi-Min Hu. "Global Contrast based Salient Region Detection." IEEE Transactions on Pattern Analysis and Machine Intelligence 37.3 (2015): 569-582.

flightradar24.com, ADS-B, primary/secondary RADAR flight localization and aggregation services.

Birch, Gabriel Carisle, John Clark Griffin, and Matthew Kelly Erdman. UAS Detection Classification and Neutralization: Market Survey 2015. No. SAND2015-6365. Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States), 2015.

Page 20:

Drone Detection and Neutralization Companies

Leading Drone Detection Companies
• Rohde & Schwarz Ardronis - Ardronis I
• https://www.blacksagetech.com/
• https://www.droneshield.com/
• http://www.dedrone.com/en/
• https://www.kongsberggeospatial.com/applications/argus-cuas
• https://fortemtech.com/ - DroneHunter

List of Drone Detection and Counter UAS Products and Experiments
• DJI Aeroscope
• Drone Capture with Nets
• Test at JFK by FBI
• Dynetics - Counter UAS
• SRC Gryphon Sensors Counter UAS
• SPI Infrared Drone Detection
• Industrial Camera Drone Detection
• HGH Infrared Drone Detection
• http://www.cerbair.com
• Israel Defense Counter UAS
• AARONIA Drone Detection
