Sensor Fusion and Navigation for Autonomous Systems Using MATLAB & Simulink
Abhishek Tiwari, Application Engineering


Page 1

Sensor Fusion and Navigation for Autonomous Systems Using MATLAB & Simulink

Abhishek Tiwari, Application Engineering

Page 2

Smart autonomous package delivery

① Autonomous Driving
② Warehouse Automation
③ Last Mile Delivery

Manufacturer → Consumer

Page 3

Capabilities of an Autonomous System: Perception

Some common perception tasks

▪ Design localization algorithms

▪ Design environment mapping algorithms

▪ Design SLAM algorithms

▪ Design fusion and tracking algorithms

▪ Label sensor data

▪ Design deep learning networks

▪ Design radar algorithms

▪ Design vision algorithms

▪ Design lidar algorithms

▪ Generate C/C++ code


Page 4

Capabilities of an Autonomous System: Planning (with learning algorithms and optimization)

Some common planning tasks

▪ Visualize street maps

▪ Connect to HERE HD Live Map

▪ Design local and global path planners

▪ Design vehicle motion behavior planners

▪ Design trajectory generation algorithms

▪ Generate C/C++ code


Page 5

Capabilities of an Autonomous System: Control

Some common control tasks

▪ Connect to recorded and live CAN data

▪ Design reinforcement learning networks

▪ Model vehicle dynamics

▪ Automate regression testing

▪ Prototype on real-time hardware

▪ Design path tracking controllers

▪ Design model-predictive controllers

▪ Generate production C/C++ code

▪ Generate AUTOSAR code

▪ Certify for ISO 26262


Page 6

In this talk, you will learn:

Reference workflow for autonomous navigation systems development

MATLAB and Simulink capabilities to design, simulate, test, and deploy algorithms for sensor fusion and navigation:

• Perception algorithm design
• Fuse sensor data to maintain situational awareness
• Mapping and localization
• Path planning and path-following control

Page 7

Sensor outputs vary along the processing chain

Real World → Analog front end + ADC → Processing → Detector → Tracker

Page 8

Sensor outputs vary along the processing chain

Real World or Virtual Scenario (Camera, Radar, Lidar)

Analog front end + ADC → Processing → Detector → Tracker
Signals/Pixels → Processed Signals → Object Detections → Tracks


Page 13

Many options to bring sensor data to perception algorithms

Scenario Definition and Sensor Simulation:
• Ownship trajectory generation
• Actors/platforms
• INS (IMU, GPS) sensor simulation, with fusion for orientation and position
• Lidar, radar, IR, and sonar sensor simulation, feeding multi-object trackers
• rosbag data

Sensor data flows into Perception (localization, mapping, tracking, SLAM), then Planning and Control, with visualization and metrics throughout.
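The sensor-simulation blocks above can be approximated outside Simulink as well. As a language-neutral illustration, here is a minimal Python sketch of a noisy GPS position model; the function name and noise level are illustrative assumptions, not any toolbox API:

```python
import random

def simulate_gps(true_positions, sigma=0.5, seed=0):
    """Corrupt a list of true (x, y) positions with zero-mean Gaussian
    noise, mimicking the simplest possible GPS sensor model."""
    rng = random.Random(seed)
    return [(x + rng.gauss(0.0, sigma), y + rng.gauss(0.0, sigma))
            for x, y in true_positions]

truth = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
meas = simulate_gps(truth)   # same trajectory, now with measurement noise
```

A real INS model would also include bias, drift, and sample-rate effects, which the MATLAB sensor objects simulate for you.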

Page 14

Live data can be augmented for a more robust testbench

Define scenarios → Simulate sensors → Perception (localization, mapping, tracking) → Planning → Control


Page 16

Estimate the pose using Monte Carlo Localization

Particle filter inputs:
• Motion model (odometry readings)
• Sensor model (lidar scan)
• Known map
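The three inputs above combine in one predict-update-resample cycle of a particle filter. A deliberately tiny 1-D Python sketch of that cycle (the single-landmark "map", noise values, and function name are simplifying assumptions; the talk itself uses the MATLAB Monte Carlo Localization workflow):

```python
import math
import random

def monte_carlo_localization(particles, control, z, landmark, rng,
                             motion_noise=0.1, meas_noise=0.5):
    """One predict-update-resample cycle of a 1-D particle filter."""
    # Predict: propagate each particle through the motion model (odometry).
    moved = [p + control + rng.gauss(0.0, motion_noise) for p in particles]
    # Update: weight particles by the sensor-model likelihood
    # (expected range to the known landmark vs. the measured range z).
    weights = [math.exp(-((landmark - p) - z) ** 2 / (2 * meas_noise ** 2))
               for p in moved]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample particles in proportion to their weights.
    return rng.choices(moved, weights=weights, k=len(moved))

rng = random.Random(1)
particles = [rng.uniform(0.0, 10.0) for _ in range(500)]
# Robot truly near x = 2 moves +1; landmark at x = 8 -> measured range ~5.
particles = monte_carlo_localization(particles, 1.0, 5.0, 8.0, rng)
estimate = sum(particles) / len(particles)   # concentrates near x = 3
```

After one cycle the initially uniform particle cloud collapses around the pose consistent with both the odometry and the lidar-style range measurement.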

Page 17

What is the world around me? Egocentric occupancy maps

– Support dynamic environment changes
– Synchronization between global and local maps
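Occupancy maps typically store per-cell log-odds that are updated as observations arrive, which is what makes dynamic-environment updates cheap. A minimal single-cell Python sketch (the log-odds constants are illustrative assumptions):

```python
import math

def update_cell(logodds, occupied, l_occ=0.85, l_free=-0.4):
    """Log-odds update for one occupancy-grid cell given an observation:
    just add the (assumed) sensor-model log-odds for hit or miss."""
    return logodds + (l_occ if occupied else l_free)

def probability(logodds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(logodds))

l = 0.0                      # prior log-odds 0 -> p = 0.5 (unknown)
for _ in range(3):           # three consistent "occupied" observations
    l = update_cell(l, True)
p = probability(l)           # confidence rises toward 1
```

Because the update is a single addition per cell, a moving obstacle is forgotten just as easily: subsequent "free" observations drive the log-odds back down.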

Page 18

What is the world around me? 3D occupancy maps

3D map for UAV motion planning
3D map for autonomous driving

Page 19

Where am I in the unknown environment? Simultaneous Localization and Mapping (SLAM)

Build a map of an unknown environment while simultaneously keeping track of the robot's pose.

2D Lidar SLAM

Page 20

Simultaneous Localization and Mapping: SLAM Map Builder App (2D only)

The app enables a more interactive and user-friendly workflow

Page 21

Simultaneous Localization and Mapping: 3D Lidar SLAM

Point Cloud Processing → Local Mapping → Loop Closure Detection → Pose Graph Optimization → Map Representation
(Computer Vision, Navigation)
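The pose graph optimization stage adjusts poses so that odometry and loop-closure constraints agree. A deliberately tiny 1-D Python sketch using gradient descent (real SLAM back ends solve the 2-D/3-D nonlinear problem with far better solvers; all names here are illustrative):

```python
def optimize_pose_graph(constraints, n_poses, iters=2000, lr=0.05):
    """Gradient descent on the sum of squared 1-D relative-pose residuals
    (x[j] - x[i] - d)^2 for constraints (i, j, d); pose 0 is held fixed
    as the anchor, much as real back ends fix a gauge."""
    x = [0.0] * n_poses
    for _ in range(iters):
        grad = [0.0] * n_poses
        for i, j, d in constraints:
            r = x[j] - x[i] - d          # residual of this edge
            grad[j] += 2.0 * r
            grad[i] -= 2.0 * r
        for k in range(1, n_poses):      # skip the anchored pose 0
            x[k] -= lr * grad[k]
    return x

# Two odometry edges and one loop closure that slightly disagrees:
edges = [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.8)]
poses = optimize_pose_graph(edges, 3)    # the error is spread over all edges
```

The loop closure says the total displacement is 1.8, the odometry says 2.0; the optimizer settles on poses that split the disagreement, which is exactly why loop closure detection reduces accumulated drift.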

Page 22

Autonomous systems can track objects from lidar point clouds

Track Objects Using Lidar: From Point Cloud to Track List
Track surrounding objects during an automated lane change

Page 23

2D radar can be used to track position, size, and orientation

Page 24

Fusing multiple sensor modalities provides a better result
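The core reason fusion helps: combining independent measurements weighted by their variances always yields a lower variance than either sensor alone. A minimal Python sketch of inverse-variance fusion of one radar and one lidar range reading (the values are illustrative; this is the scalar case of the Kalman measurement update):

```python
def fuse(z1, var1, z2, var2):
    """Optimal (inverse-variance weighted) fusion of two independent
    measurements of the same quantity; returns the estimate and its
    variance, which is always below min(var1, var2)."""
    w1 = var2 / (var1 + var2)            # weight toward the less noisy sensor
    fused = w1 * z1 + (1.0 - w1) * z2
    fused_var = var1 * var2 / (var1 + var2)
    return fused, fused_var

# e.g. a noisy radar range and a more precise lidar range of one object:
z, v = fuse(10.4, 1.0, 10.1, 0.25)
```

The fused estimate sits closer to the more trustworthy lidar reading, and its variance (0.2) is smaller than either sensor's own variance.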

Page 25

Radar and lidar fusion can increase tracking performance

Page 26: Sensor Fusion and Navigation for Autonomous Systems Using

▪ Assess missed tracks

▪ Assess false tracks

▪ Assess Generalized Optimal

Sub Pattern Assignment

Metric (GOSPA)

GOSPAMissed Targets

False Tracks

Fuse lidar point cloud with radar detections
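GOSPA combines assignment error with penalties for missed and false tracks in a single score. A simplified 1-D, brute-force Python sketch (alpha = 2, tiny track counts only; the real metric handles multi-dimensional states and uses an efficient assignment solver):

```python
import itertools
import math

def gospa(truths, tracks, c=2.0, p=2):
    """Simplified 1-D GOSPA (alpha = 2): best-assignment cost with
    cut-off c, plus a c**p / 2 penalty per missed or false track."""
    m, n = len(truths), len(tracks)
    if m > n:                            # make truths the shorter list
        return gospa(tracks, truths, c, p)
    best = math.inf
    for perm in itertools.permutations(range(n), m):
        cost = sum(min(abs(truths[i] - tracks[j]), c) ** p
                   for i, j in zip(range(m), perm))
        best = min(best, cost)
    best += (c ** p / 2.0) * (n - m)     # unassigned tracks are penalized
    return best ** (1.0 / p)

# Two true objects, three tracks: one track (at 9.0) is false.
score = gospa([1.0, 5.0], [1.1, 5.2, 9.0])
```

Small localization errors contribute a little; the extra track at 9.0 contributes the full false-track penalty, which dominates the score here.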

Page 27: Sensor Fusion and Navigation for Autonomous Systems Using

Plan a path from start to destination

Planning

Control

Perception

• Localization

• Mapping

• TrackingInitial

LocationGoal

Location

Global

Planning

Local Re-

planning

Page 28

Plan a path from start to destination

Global Planning (path planning algorithms):
• mobileRobotPRM
• plannerRRT
• plannerRRTStar
• plannerHybridAStar

Behavior Planning (high-level decision making):
• Lane keeping/changing
• Intersection handling
• Traffic light handling
• Adaptive cruise control

Local Re-planning (trajectory generation):
• trajectoryOptimalFrenet
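To make the planner family concrete, here is a minimal grid-based A* search in Python, in the spirit of (but far simpler than) the graph-search planners listed above; the grid, unit step costs, and Manhattan heuristic are illustrative assumptions:

```python
import heapq
import itertools

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = occupied).
    Returns the list of cells from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])  # Manhattan
    tie = itertools.count()              # tiebreaker keeps heap entries comparable
    frontier = [(h(start), 0, next(tie), start, None)]
    came_from = {}
    while frontier:
        _, g, _, cur, parent = heapq.heappop(frontier)
        if cur in came_from:             # lazy deletion of stale entries
            continue
        came_from[cur] = parent
        if cur == goal:                  # walk parents back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came_from):
                heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, next(tie), nxt, cur))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))       # routes around the wall
```

Sampling-based planners like RRT trade this grid discretization for random sampling of the continuous space, which scales better to high-dimensional vehicle states.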

Page 29

Urban driving needs planning on multiple levels: global, behavior, and local planners

Generate optimal trajectories for local re-planning and merge back with the global plan

Page 30

Simulate the shortest path to change lanes on a highway

Simulate trajectory generation and the lane change maneuver

Page 31

Mission planning for UAVs enables last-mile delivery

Page 32

Choose a path planner based on your application

• Sampling-based planners such as RRT*
• Use path metrics to compare different paths
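Two simple metrics already let you rank candidate paths: total length and worst-case turn. A Python sketch (angle wrapping is omitted for brevity, and the metric names are illustrative, not a toolbox API):

```python
import math

def path_length(path):
    """Total Euclidean length of a piecewise-linear path [(x, y), ...]."""
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def max_heading_change(path):
    """Largest turn (radians) between consecutive segments: a rough
    smoothness metric for comparing candidate paths."""
    headings = [math.atan2(b[1] - a[1], b[0] - a[0])
                for a, b in zip(path, path[1:])]
    return max((abs(h2 - h1) for h1, h2 in zip(headings, headings[1:])),
               default=0.0)

zigzag = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.0)]
straight = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
```

Between two paths reaching the same goal, the straight one wins on both metrics: it is shorter and requires no turning.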

Page 33

Compute control commands for ground vehicles

Compute linear and angular velocity commands for a mobile robot; the controller computes angular velocity from the desired path curvature
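The curvature-to-angular-velocity step fits in a few lines: for a look-ahead point (dx, dy) expressed in the robot frame, the arc through it has curvature k = 2*dy / Ld^2, and the command is w = v*k. A Python sketch of that computation (a hypothetical helper, not the toolbox controller):

```python
def pure_pursuit(dx, dy, v):
    """Angular-velocity command for a differential-drive robot from a
    look-ahead point (dx, dy) in the robot frame at linear speed v.
    The arc through the point has curvature k = 2*dy / Ld^2."""
    ld2 = dx * dx + dy * dy              # squared look-ahead distance Ld^2
    curvature = 2.0 * dy / ld2
    return v * curvature                 # w = v * k

w = pure_pursuit(1.0, 1.0, 0.5)          # point to the left -> turn left
```

A point straight ahead (dy = 0) yields zero curvature, so the robot drives straight; the farther the look-ahead point sits to one side, the sharper the commanded turn.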

Page 34

Use a Pure Pursuit controller with supervisory logic

Path controller = Pure Pursuit + supervisory logic

Page 35

Send control commands to the vehicle to follow the planned path

Calculate the steering angle and vehicle velocities to track the trajectories

Page 36

Control the lane change maneuver for highway driving

Longitudinal and lateral controllers adjust the acceleration and steering

Page 37

Simulate a high-fidelity UAV model with waypoint following

Approximate the high-fidelity model with a low-fidelity model (roll angle, air speed, height)

Simulate GPS and IMU sensor models; waypoint-following controller
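A waypoint-following controller at its simplest steers the vehicle's heading toward the next waypoint. A minimal proportional Python sketch (the gain and names are illustrative assumptions, far below the fidelity of the Simulink model above):

```python
import math

def waypoint_heading_command(pose, waypoint, k_heading=1.5):
    """Proportional heading controller: return a yaw-rate command that
    steers pose = (x, y, yaw) toward waypoint = (wx, wy)."""
    x, y, yaw = pose
    desired = math.atan2(waypoint[1] - y, waypoint[0] - x)
    # Wrap the heading error into [-pi, pi) so the turn takes the short way.
    err = (desired - yaw + math.pi) % (2 * math.pi) - math.pi
    return k_heading * err

cmd = waypoint_heading_command((0.0, 0.0, 0.0), (1.0, 1.0))  # turn left
```

Once the vehicle points at the waypoint the error, and therefore the command, goes to zero; a full autopilot would add altitude and airspeed loops around this inner heading loop.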

Page 38

Bridging ROS with MATLAB and Simulink: from algorithm to deployment

ROS Toolbox connects MATLAB and Simulink to:
• ROS bags (log files): data analysis and playback
• ROS nodes (software): standalone node deployment
• Simulators and hardware: desktop prototyping

Algorithm areas: controls, perception, planning and decision making

Page 39

Deploy and test sensor fusion and navigation algorithms on hardware

Perception, planning, and control algorithms can generate:
• C/C++ code for processors
• GPU code for GPUs
• HDL code for FPGAs

Page 40

Full Model-Based Design Workflow for Autonomous Systems

Autonomous algorithms (Sense → Perceive → Plan & Decide → Control):
• Sensor Fusion and Tracking Toolbox
• Automated Driving Toolbox
• Computer Vision Toolbox
• Image Acquisition Toolbox
• Navigation Toolbox
• Robotics System Toolbox
• Model Predictive Control Toolbox
• Reinforcement Learning Toolbox
• Stateflow

Platform: MATLAB, Simulink (Verification & Validation)

Connect / Deploy: Code Generation, AUTOSAR Blockset, ROS Toolbox

Page 41

You can lower risk in your autonomous navigation development

Perception: tracking, mapping, localization
Planning: motion planning, path metrics
Control: path/waypoint following

Page 42

There are many resources to get started with:
• Tech Talks
• Quick Start Guide
• Contact for Training & Consulting Services to exploit the full potential of MathWorks products