

Computer Vision with Deep Learning: Classification and TensorFlow

J. Irving Vasquez

Centro de Innovacion y Desarrollo Tecnologico en Computo

October 3, 2018


Classification

I Classification consists of assigning classes or labels to objects [Sucar 15]


Classification

Unsupervised In this case the classes are unknown. The problem consists in dividing a set of objects into n groups or clusters, so that a class is assigned to each different group (clustering).

Supervised The possible classes or labels are known a priori, and the problem consists in finding a function or rule that assigns each object to one of the classes.


Supervised Classification


Logistic classifier

I WX + B = Y

I Implemented by a neural network with multiple outputs


Softmax function

I Converts a vector into “probabilities”

I Inputs (y) are usually called logits
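The conversion can be sketched in a few lines of NumPy (an illustration only, not TensorFlow's implementation):

```python
import numpy as np

def softmax(y):
    # Shift by the max logit for numerical stability; the output is unchanged
    e = np.exp(y - np.max(y))
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
# probs sums to 1 and preserves the ordering of the logits
```

Subtracting the maximum logit before exponentiating avoids overflow for large inputs without changing the result.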


One-hot encoding

I Facilitates the representation of classes
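A minimal sketch of the encoding (the helper name is mine):

```python
import numpy as np

def one_hot(label, num_classes):
    # A vector of zeros with a single 1 at the index of the class
    v = np.zeros(num_classes)
    v[label] = 1.0
    return v

encoded = one_hot(2, 4)  # class 2 of 4 -> [0, 0, 1, 0]
```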


One-hot encoding - quiz


Cross entropy

I Measures the distance between two probability vectors

D(S, L) = −∑_i l_i log(s_i)    (1)

I D(S, L) ≠ D(L, S)
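A NumPy sketch of Equation (1), including a check that the distance is not symmetric (the distributions are arbitrary illustrative values):

```python
import numpy as np

def cross_entropy(S, L):
    # D(S, L) = -sum_i l_i * log(s_i)
    return -np.sum(L * np.log(S))

S = np.array([0.7, 0.2, 0.1])  # softmax output
L = np.array([1.0, 0.0, 0.0])  # one-hot label
d = cross_entropy(S, L)        # equals -log(0.7)

# D is not symmetric: swapping the arguments changes the value
P = np.array([0.7, 0.2, 0.1])
Q = np.array([0.5, 0.3, 0.2])
```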


Multinomial logistic classification

I Obtain the outputs

Y = WX + B

where Y is the logit vector

I Convert to probabilities

S(y_i) = e^{y_i} / ∑_j e^{y_j}

I Measure the distance with respect to the one-hot vectors

D(S, L) = −∑_i l_i log(s_i)


Multinomial logistic classification

I D(S(WX + B), L)

I How do we compute the weights?


Minimizing the cross entropy

I What do we need to do?

D(S(WX + B), L)

I Decrease the distance to the correct class and increase the distance to the incorrect ones

↓ D(A, a)

↑ D(A, ¬a)

I Loss = average cross entropy

Loss = (1/N) ∑_i D(S(W x_i + B), L_i)
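The whole chain (logits, softmax, average cross entropy) can be exercised on a toy NumPy example; the weights, biases, and inputs below are arbitrary illustrative values:

```python
import numpy as np

def softmax(y):
    e = np.exp(y - np.max(y))
    return e / e.sum()

def cross_entropy(S, L):
    return -np.sum(L * np.log(S))

# Toy problem: 3-dimensional inputs, 2 classes
W = np.array([[0.2, -0.1, 0.4],
              [-0.3, 0.5, 0.1]])
B = np.array([0.1, -0.2])
X = np.array([[1.0, 0.0, 2.0],
              [0.5, 1.5, -1.0]])
Labels = np.array([[1.0, 0.0],
                   [0.0, 1.0]])  # one-hot labels

# Loss = (1/N) * sum_i D(S(W x_i + B), L_i)
N = len(X)
loss = sum(cross_entropy(softmax(W @ x + B), l)
           for x, l in zip(X, Labels)) / N
```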


Minimizing the cross entropy


Stochastic Gradient Descent (SGD)

I We compute the loss on some random data


Stochastic Gradient Descent (SGD)

I Some tips

I Inputs: mean 0 and equal variance

I Initial weights: random, with mean 0 and small variance
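Both tips can be sketched in NumPy; the data shapes and sigma = 0.1 are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tip 1: normalize inputs to mean 0 and equal (unit) variance per feature
X = rng.uniform(0, 255, size=(100, 3))
Xn = (X - X.mean(axis=0)) / X.std(axis=0)

# Tip 2: initialize weights randomly with mean 0 and small variance
sigma = 0.1
W = rng.normal(loc=0.0, scale=sigma, size=(10, 3))
```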


SGD - Momentum


Learning Rate Decay


Hyperparameters


Performance measurement

I Let's suppose a dataset, X_Ω = {x_1, . . . , x_m}, with positive examples.

I We could use a subset of our set of observations as an estimate of the performance.

X = {x_1, . . . , x_n}

X′ = {x_{n+1}, . . . , x_m}

where X is the training set and X′ is usually called the validation set [Smola].

I Typically, one uses about 80% of the training data for training and 20% for validation [GoodFellow]
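The 80/20 split can be sketched as follows (shuffling before splitting; the dataset size is illustrative):

```python
import numpy as np

m = 1000            # total number of observations
n = int(0.8 * m)    # about 80% for training

# Shuffle indices so the split is not biased by the order of the data
indices = np.random.default_rng(0).permutation(m)
train_idx = indices[:n]   # indices of the training set X
val_idx = indices[n:]     # indices of the validation set X'
```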


Performance Measurements

Common measurements for categorical outputs

Accuracy

Acc(f) = (1/T) ∑_{i=1}^{T} 1{ŷ_i = y_i}

Error (0-1 loss)

Err(f) = (1/T) ∑_{i=1}^{T} 1{ŷ_i ≠ y_i}
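Both measurements reduce to counting matches, as in this NumPy sketch (the labels are arbitrary examples):

```python
import numpy as np

def accuracy(y_pred, y_true):
    # Acc(f) = (1/T) * sum_i 1{y_pred_i == y_true_i}
    return np.mean(y_pred == y_true)

def zero_one_error(y_pred, y_true):
    # Err(f) = (1/T) * sum_i 1{y_pred_i != y_true_i} = 1 - Acc(f)
    return np.mean(y_pred != y_true)

y_true = np.array([0, 1, 2, 1])
y_pred = np.array([0, 1, 1, 1])
acc = accuracy(y_pred, y_true)        # 0.75
err = zero_one_error(y_pred, y_true)  # 0.25
```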


Overfitting


Implementation details


Implementation details - Images

I Translate the RGB values to normalized values around 0, e.g.

R_n = (R − 128) / 128    (2)
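Equation (2) applied to a channel, sketched in NumPy (using floats to avoid integer division):

```python
import numpy as np

def normalize_channel(c):
    # R_n = (R - 128) / 128, mapping 0..255 to roughly [-1, 1)
    return (c - 128.0) / 128.0

R = np.array([0.0, 128.0, 255.0])
Rn = normalize_channel(R)  # [-1.0, 0.0, 127/128]
```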


Significance


Introduction

I TensorFlow is an open source software library for numerical computation using data flow graphs.

I Nodes in the graph represent mathematical operations

I Edges represent the multidimensional data arrays (tensors) communicated between them.


Resources

I https://www.tensorflow.org

I Deep Learning by Google, on Udacity (free)


I We will use TensorFlow to classify images from the notMNIST dataset



Installation with anaconda

I Run the following commands to setup your environment:

conda create --name=IntroToTensorFlow python=3 anaconda

source activate IntroToTensorFlow

conda install -c conda-forge tensorflow

I That’s it!


Hello World

import tensorflow as tf

# Create TensorFlow object called tensor

hello_constant = tf.constant('Hello World!')

with tf.Session() as sess:
    # Run the tf.constant operation in the session
    output = sess.run(hello_constant)
    print(output)


Hello World

I Tensor: it stores the values. Data is not stored as plain integers, floats, or strings; these values are encapsulated in an object called a tensor.

hello_constant = tf.constant('Hello World!')

hello_constant is a 0-dimensional string tensor

I Examples

# A is a 0-dimensional int32 tensor

A = tf.constant(1234)

# B is a 1-dimensional int32 tensor

B = tf.constant([123,456,789])

# C is a 2-dimensional int32 tensor

C = tf.constant([ [123,456,789], [222,333,444] ])


Session

I TensorFlow's API is built around the idea of a computational graph

I A "TensorFlow Session", as shown above, is an environment for running a graph.


Exercises

I Repository: https://github.com/irvingvasquez/cv2course_04tf

I Exercises 01 and 02


Mini project

Clone the lab repository and run the notebook: https://github.com/irvingvasquez/CarND-TensorFlow-Lab

3 problems for you to solve:

1. Normalize the features

2. Use TensorFlow operations to create features, labels, weights, and biases tensors

3. Tune the learning rate, number of steps, and batch size for the best accuracy

Make a report with your accuracy and the parameters you used.


References

Sucar 15 Sucar, Luis Enrique. Probabilistic Graphical Models: Principles and Applications. Springer, 2015.

Smola Alex Smola and S.V.N. Vishwanathan. Introduction to Machine Learning. Cambridge University Press.

GoodFellow Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning. MIT Press, 2016.

Udacity Self-Driving Car Nanodegree
