Deep Learning for Autonomous and Networked Vehicles
Dragan Samardzija
January 2020
This lecture is compiled from numerous course materials, books, conference papers and presentations.
1. CS231n Convolutional Neural Networks for Visual Recognition, Stanford University, Spring 2017
2. ECBM E4040 Neural Networks and Deep Learning, Columbia University, 2017
3. MIT 6.S094: Deep Learning for Self-Driving Cars, MIT, 2017
4. Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville
Course Organization and Schedule

Week 1 (Monday–Friday) and week 2 (Monday–Friday):

• Lectures
  • Machine Learning: Regression, Classification, Learning
  • Deep Neural Networks: Feed-forward, Back-propagation, Convolutional Networks
• Computer-lab Work
  • TensorFlow: Classification of Hand-written Numerals
  • Convolutional Deep Neural Networks: Case Studies, Optimization, Vehicular Applications
Course Organization and Schedule, contd.

Week 3 (Monday–Friday) and week 4 (off Monday, Feb. 17; Tuesday–Thursday):

• Lectures and Computer-lab Work
  • Implementation on Embedded Platform: Alpha Board
  • Reinforcement Learning
  • Transportation System Applications
  • Summary
• Exam (Feb. 21)
Deep Learning Basics

• Perceptron
• Artificial Neural Networks
  • Multilayer Networks
  • Training and Inference
• History
• General Comments
• Universality of Neural Networks
• Activation Functions
  • Types
  • Issues
Perceptron: Basic Unit of Artificial Neural Networks
Biological Neuron and Perceptron
Artificial Neural Network (ANN)

• An ANN consists of many simple connected neurons (i.e., perceptrons):
  • each neuron has an activation function,
  • input neurons get activated through sensors perceiving the environment,
  • other neurons are activated through weighted connections from previously activated neurons.
• ANNs are built out of linear building blocks, followed by nonlinear activation functions.
  • These functions are differentiable.
• ANNs are function approximators.
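The structure above can be sketched as a single artificial neuron in plain Python: a weighted sum of inputs followed by a nonlinear, differentiable activation. The sigmoid activation and the example weights are illustrative choices, not values from the lecture.

```python
import math

def sigmoid(z):
    # differentiable nonlinear activation, output in (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, w, b):
    # linear building block w . x + b, followed by the nonlinearity
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return sigmoid(z)

# example activation for illustrative inputs and weights
y = neuron(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1)
```

A neuron with zero weights and bias outputs exactly 0.5, the sigmoid's midpoint.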
Multilayer Neural Networks: Deep Neural Networks (DNNs)
• 2-layer neural network = 1-hidden-layer network
• 3-layer neural network = 2-hidden-layer network
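The counting convention above (weight layers are counted, the input layer is not) can be sketched as a 2-layer network, i.e. one hidden layer plus an output layer. Sizes, weights, and the ReLU activation are illustrative assumptions.

```python
def relu(z):
    return max(0.0, z)

def dense(x, W, b, act):
    # one fully connected layer: each row of W holds one unit's weights
    return [act(sum(wij * xj for wij, xj in zip(row, x)) + bi)
            for row, bi in zip(W, b)]

x = [1.0, -1.0]
# weight layer 1: input -> hidden (the single hidden layer)
h = dense(x, W=[[0.5, 0.5], [1.0, -1.0]], b=[0.0, 0.0], act=relu)
# weight layer 2: hidden -> output (identity activation)
y = dense(h, W=[[1.0, 1.0]], b=[0.0], act=lambda z: z)
```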
Fully Connected Layer: Example – Image Processing
Training and Inference in DNN
Reminder: Linear Classifier and Logistic Regression – Training and Inference
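As a refresher on this reminder slide, the standard logistic-regression recipe can be sketched on a toy 1-D dataset: training is gradient descent on the cross-entropy loss, inference is thresholding the predicted probability. The data, learning rate, and iteration count are arbitrary illustrative choices, not from the course.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# toy data: class 0 for negative x, class 1 for positive x
data = [(-2.0, 0), (-1.0, 0), (1.0, 1), (2.0, 1)]
w, b, lr = 0.0, 0.0, 0.5

# training: gradient descent on the average cross-entropy loss
for _ in range(200):
    gw = gb = 0.0
    for x, y in data:
        p = sigmoid(w * x + b)
        gw += (p - y) * x   # d(loss)/dw for sigmoid + cross-entropy
        gb += (p - y)       # d(loss)/db
    w -= lr * gw / len(data)
    b -= lr * gb / len(data)

# inference: threshold the predicted probability at 0.5
def predict(x):
    return 1 if sigmoid(w * x + b) >= 0.5 else 0
```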
History

• Cybernetics (1940–1960)
  • McCulloch-Pitts neuron (1943)
  • Perceptron (1957)
• Connectionism (1980s–1990s)
  • Back-propagation algorithm for training deep networks (1980s)
• Deep learning (2006–)
  • Deeper networks than before can be trained, with record-breaking results emphasizing the importance of depth.
Perceptron
• Built at Cornell in 1957.
• Single-layer feedforward network.
• Used for letter and shape recognition.
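The classic perceptron learning rule for such a single-layer network can be sketched as follows, here learning the linearly separable OR function. The step activation and error-driven update are the standard ones; the learning rate and epoch count are arbitrary choices.

```python
def predict(x, w, b):
    # step activation: fire if the weighted sum exceeds zero
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# the OR function as training data
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b, lr = [0.0, 0.0], 0.0, 0.1

for _ in range(20):                      # a few passes over the data
    for x, target in data:
        err = target - predict(x, w, b)  # -1, 0 or +1
        w = [wi + lr * err * xi for wi, xi in zip(w, x)]
        b += lr * err
```

Because OR is linearly separable, the rule converges to weights that classify all four inputs correctly.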
Deep Learning: General Comments
• The true challenge to AI proved to be solving the tasks that are easy for people to perform but hard for people to describe formally, i.e., problems that we solve intuitively.
• Although deep learning has historical roots going back decades, the field was reignited by papers such as Krizhevsky, Sutskever and Hinton's 2012 work on image classification.
• The technique excels at classification problems, in which a wide range of potential inputs must be mapped onto a limited number of categories, provided that enough data is available and the test set closely resembles the training set.
• Deep learning is essentially a statistical technique for classifying patterns, based on sample data, using neural networks with multiple layers.
• It is a particular kind of machine learning that achieves great power and flexibility by learning to represent the world as a nested hierarchy of concepts, breaking the desired complicated mapping into a series of nested simple mappings, each described by a different layer of the model.
Image Classification
• It rapidly became the best-known technique in artificial intelligence.
• Deep learning is a 'big hammer' applicable in many use cases.
• Andrew Ng suggested in 2016 that "if a typical person can do a mental task with less than one second of thought, we can probably automate it using AI either now or in the near future."
• AI is the new electricity: AI today is beginning to transform every major industry.
• In principle, given infinite data, deep learning systems are powerful enough to represent any finite deterministic mapping between any given set of inputs and a set of corresponding outputs.
• DNNs remain something of a black box.

Deep Learning: General Comments, contd.
Is driving hard?
Deep Learning Basics

• Perceptron
• Artificial Neural Networks
  • Multilayer Networks
  • Training and Inference
• History
• General Comments
• Universality of Neural Networks
• Activation Functions
  • Types
  • Issues
DNN - Functional View
• The goal of a DNN is to approximate a function y = f(x; θ).
• Through the learning process the parameters θ are determined so that the best approximation f*(x; θ) is achieved.
• The function is represented by composing together many different functions (i.e., layers):

y ≈ f*(x; θ) = f^(4)(f^(3)(f^(2)(f^(1)(x))))
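The composition of layer functions can be sketched directly in code; the toy stand-ins for each layer below are assumptions for illustration only.

```python
from functools import reduce

def compose(*fs):
    # compose(f1, f2, f3) returns x -> f3(f2(f1(x))),
    # mirroring how a DNN applies its layers in sequence
    return lambda x: reduce(lambda acc, f: f(acc), fs, x)

f1 = lambda x: 2.0 * x       # toy stand-in for a linear layer
f2 = lambda x: x + 1.0       # toy stand-in for a bias term
f3 = lambda x: max(0.0, x)   # toy stand-in for a ReLU activation
net = compose(f1, f2, f3)    # net(x) = f3(f2(f1(x)))
```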
Universality of Neural Networks

• A feedforward network with a single hidden layer is sufficient to represent any function, but the layer may be infeasibly large and may fail to learn and generalize correctly.
  • Universality holds even for a network with a single hidden layer.
• Deeper networks are more efficient: they can represent functions that would require an exponential number of hidden units in a shallow neural network.
• Deep networks using many hidden layers with rectified units are good at approximating functions that can be composed from simpler functions.*

*Given a good algorithm for training those networks.
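A tiny concrete instance of representing a function with rectified units: |x| = ReLU(x) + ReLU(-x), so a single hidden layer of two ReLU units with fixed weights +1 and -1 reproduces the absolute-value function exactly. This example is an illustration chosen here, not one from the slides.

```python
def relu(z):
    return max(0.0, z)

def abs_net(x):
    # hidden layer: two ReLU units with input weights +1 and -1;
    # output layer: sum of the two hidden activations
    return relu(x) + relu(-x)
```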
Activation Functions
• Exponential Linear Unit (ELU)
Rectified Linear Unit (ReLU)
• ReLU works really well, although there is no complete explanation of why.
• ReLU was a key improvement to training stability.
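The two activations named on these slides have simple closed forms, sketched below; the ELU scale parameter alpha = 1.0 is a common default, assumed here rather than taken from the lecture.

```python
import math

def relu(z):
    # Rectified Linear Unit: max(0, z)
    return max(0.0, z)

def elu(z, alpha=1.0):
    # Exponential Linear Unit: z for z > 0, alpha * (e^z - 1) otherwise
    return z if z > 0 else alpha * (math.exp(z) - 1.0)
```

Unlike ReLU, ELU is smooth at negative inputs and saturates toward -alpha rather than outputting exactly zero.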
Example of Activation Function Issues

• Active ReLU: receives gradient and is updated during training.
• Inactive ("dead") ReLU: outputs zero, receives zero gradient, and is never updated.
• Mitigation: initialize neurons with small positive biases (e.g., 0.01).
Activation Functions and Derivatives

• Small derivatives lead to slow learning.
• If neurons are initialized poorly, the network may not be active for the entire training set.
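The small-derivative problem can be seen directly from the sigmoid: its derivative sigma'(z) = sigma(z) * (1 - sigma(z)) peaks at 0.25 and vanishes for large |z|, so gradient updates through saturated units become tiny. This is a standard illustration, not code from the course.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_grad(z):
    # derivative of the sigmoid: s * (1 - s)
    s = sigmoid(z)
    return s * (1.0 - s)

peak = sigmoid_grad(0.0)        # maximum value of the derivative
saturated = sigmoid_grad(10.0)  # derivative deep in the saturated region
```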
Leaky ReLU
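A sketch of the Leaky ReLU: a small slope for negative inputs keeps the gradient nonzero everywhere, avoiding dead units. The slope 0.01 is a common default, assumed here rather than taken from the slides.

```python
def leaky_relu(z, slope=0.01):
    # identity for positive inputs, small linear leak for negative ones
    return z if z > 0 else slope * z
```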