Machine Learning Introduction


Study on Coursera. All Rights Reserved: Andrew Ng

Lecturer: Much (Database Lab of Xiamen University)

Aug 12, 2014

• Examples:
  - Database mining: large datasets from the growth of automation and the web, e.g. web click data, medical records, biology, engineering.
  - Applications that can't be programmed by hand: handwriting recognition, most of Natural Language Processing (NLP), Computer Vision.

• Machine Learning grew out of work in AI (Artificial Intelligence) and provides a new capability for computers.

Machine Learning Definition

• Tom Mitchell (1998) Well-posed Learning Problem:

A computer program is said to learn from experience E with respect to some task T and some performance measure P, if its performance on T, as measured by P, improves with experience E.

• Suppose your email program watches which emails you do or do not mark as spam, and based on that learns how to better filter spam. What is the task T in this setting?

T: Classifying emails as spam or not spam

E: Watching you label emails as spam or not spam

P: The number of emails correctly classified as spam/not spam


Machine Learning Algorithms

- Supervised learning
- Unsupervised learning
- Others: Reinforcement learning, Recommender systems

[Figure: data plotted on axes x1 and x2]

Supervised Learning & Unsupervised Learning


Linear Regression with one Variable

[Figure: Housing Prices (Portland, OR): Price (in 1000s of dollars) plotted against Size (feet²)]

Supervised Learning: given the "right answer" for each example in the data.

Regression Problem: predict real-valued output.

Notation:

m = Number of training examples

x’s = “input” variable / features

y’s = “output” variable / “target” variable

Size in feet² (x)   Price ($) in 1000's (y)
2104                460
1416                232
1534                315
852                 178
…                   …

Training set of housing prices
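To make the notation concrete, here is a minimal sketch in Python with NumPy (the names x, y, and m mirror the notation above; everything else is my own choice, not part of the original slides):

```python
import numpy as np

# Training set of housing prices (Portland, OR), from the table above
x = np.array([2104, 1416, 1534, 852])   # "input" variable / feature: size in feet^2
y = np.array([460, 232, 315, 178])      # "output" / "target" variable: price in $1000's

m = len(x)  # number of training examples
print(m)    # 4
```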

Training Set → Learning Algorithm → h (hypothesis)

Size of house → h → Estimated price

Question: How to describe h?

How to choose the $\theta$'s?

Training Set

Hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x$

$\theta_0, \theta_1$: Parameters


Idea: Choose $\theta_0, \theta_1$ so that $h_\theta(x)$ is close to $y$ for our training examples $(x, y)$.
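As a quick illustration (a sketch of my own, not from the slides; the parameter values are arbitrary), the hypothesis is just a line evaluated at each input:

```python
import numpy as np

def h(theta0, theta1, x):
    """Hypothesis for linear regression with one variable: h_theta(x) = theta0 + theta1 * x."""
    return theta0 + theta1 * x

# Arbitrary parameter values, evaluated on the four training-set sizes (in feet^2)
sizes = np.array([2104, 1416, 1534, 852])
print(h(50.0, 0.12, sizes))   # predicted prices in $1000's for these parameters
```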

Cost Function

Hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x$

Parameters: $\theta_0, \theta_1$

Cost Function: $J(\theta_0, \theta_1) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$

Goal: $\min_{\theta_0, \theta_1} J(\theta_0, \theta_1)$

Simplified: fix $\theta_0 = 0$, so $h_\theta(x) = \theta_1 x$ and the cost becomes a function of a single parameter, $J(\theta_1)$.

[Plot: Price ($) in 1000's vs. Size in feet² (x)]

Question: How to minimize J?
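Before answering that with gradient descent, here is a minimal sketch (my own illustration, reusing the training set above) of evaluating the cost $J(\theta_0, \theta_1)$:

```python
import numpy as np

x = np.array([2104, 1416, 1534, 852], dtype=float)   # size in feet^2
y = np.array([460, 232, 315, 178], dtype=float)      # price in $1000's
m = len(x)

def cost(theta0, theta1):
    """Squared-error cost J(theta0, theta1) = (1 / 2m) * sum_i (h_theta(x_i) - y_i)^2."""
    predictions = theta0 + theta1 * x
    return np.sum((predictions - y) ** 2) / (2 * m)

# Lower cost means the line fits the training data better
print(cost(0.0, 0.0))
print(cost(0.0, 0.2))
```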

Gradient Descent

Have some function $J(\theta_0, \theta_1)$

Want $\min_{\theta_0, \theta_1} J(\theta_0, \theta_1)$

Outline:

• Start with some $\theta_0, \theta_1$ (say $\theta_0 = 0$, $\theta_1 = 0$)

• Keep changing $\theta_0, \theta_1$ to reduce $J(\theta_0, \theta_1)$ until we hopefully end up at a minimum

Gradient descent algorithm:

repeat until convergence {
    $\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \theta_1)$   (for $j = 0$ and $j = 1$)
}

Correct (simultaneous update): compute the new values of both $\theta_0$ and $\theta_1$ from the current parameters (e.g. into temp0 and temp1), then assign both.

Incorrect: assigning the new $\theta_0$ before computing $\theta_1$'s update, so that the update of $\theta_1$ uses the already-changed $\theta_0$.
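A minimal sketch of one such step (my own illustration; dJ_dtheta0 and dJ_dtheta1 are hypothetical callables for the two partial derivatives, which are written out for linear regression further below):

```python
def gradient_descent_step(theta0, theta1, alpha, dJ_dtheta0, dJ_dtheta1):
    """One *simultaneous* gradient descent update for a two-parameter cost J."""
    # Evaluate both partial derivatives at the CURRENT (theta0, theta1)
    # before assigning either parameter.
    temp0 = theta0 - alpha * dJ_dtheta0(theta0, theta1)
    temp1 = theta1 - alpha * dJ_dtheta1(theta0, theta1)
    return temp0, temp1

# Incorrect (non-simultaneous) version, for contrast:
#   theta0 = theta0 - alpha * dJ_dtheta0(theta0, theta1)
#   theta1 = theta1 - alpha * dJ_dtheta1(theta0, theta1)   # uses the already-updated theta0
```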

Gradient descent algorithm

Note: α is the learning rate.

If α is too small, gradient descent can be slow.

If α is too large, gradient descent can overshoot the minimum. It may fail to converge, or even diverge.

At a local optimum the derivative term is zero, so the update $\theta_1 := \theta_1 - \alpha \cdot 0$ leaves the current value of $\theta_1$ unchanged.

Gradient descent can converge to a local minimum, even with the learning rate α fixed.

As we approach a local minimum, gradient descent will automatically take smaller steps. So, no need to decrease α over time.
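To see the effect of α described above, here is a tiny sketch of my own (minimizing the one-parameter cost $J(\theta_1) = \theta_1^2$, whose minimum is at 0 and whose derivative is $2\theta_1$), showing slow progress for a small α and divergence for a too-large α:

```python
def run_gradient_descent(alpha, steps=20, theta=5.0):
    """Gradient descent on J(theta) = theta^2 with a fixed learning rate alpha."""
    for _ in range(steps):
        theta = theta - alpha * 2 * theta   # derivative of theta^2 is 2 * theta
    return theta

print(run_gradient_descent(alpha=0.01))   # too small: after 20 steps still far from 0
print(run_gradient_descent(alpha=0.3))    # reasonable: close to the minimum at 0
print(run_gradient_descent(alpha=1.1))    # too large: overshoots and diverges
```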

Gradient Descent for Linear Regression

Applying the gradient descent algorithm to the Linear Regression Model ($h_\theta(x) = \theta_0 + \theta_1 x$ with cost $J(\theta_0, \theta_1)$ as above), the derivatives work out to the following algorithm:

repeat until convergence {
    $\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)$
    $\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x^{(i)}$
}

(update $\theta_0$ and $\theta_1$ simultaneously)

[Figure sequence: for fixed $\theta_0, \theta_1$, $h_\theta(x)$ plotted as a function of $x$, alongside $J(\theta_0, \theta_1)$ plotted as a function of the parameters, over successive gradient descent steps.]
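Putting the pieces together, here is a minimal sketch (my own implementation of the two updates above on the single-feature housing data; the size feature is rescaled to 1000s of square feet purely so that a simple fixed learning rate behaves well):

```python
import numpy as np

# Training data from the table above; sizes expressed in 1000s of feet^2
x = np.array([2104, 1416, 1534, 852], dtype=float) / 1000.0
y = np.array([460, 232, 315, 178], dtype=float)
m = len(x)

alpha = 0.1
theta0, theta1 = 0.0, 0.0

for _ in range(5000):
    errors = (theta0 + theta1 * x) - y         # h_theta(x^(i)) - y^(i) for every i
    grad0 = errors.mean()                      # (1/m) * sum of errors
    grad1 = (errors * x).mean()                # (1/m) * sum of errors * x^(i)
    # Simultaneous update: both gradients were computed from the current parameters
    theta0, theta1 = theta0 - alpha * grad0, theta1 - alpha * grad1

print(theta0, theta1)            # fitted intercept and slope
print(theta0 + theta1 * 1.416)   # predicted price ($1000's) for a 1416 ft^2 house
```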

Linear Regression with multiple variables

Hypothesis: $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + \dots + \theta_n x_n$

Cost function: $J(\theta_0, \theta_1, \dots, \theta_n) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$

Parameters: $\theta_0, \theta_1, \dots, \theta_n$

Gradient descent: Repeat {
    $\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta_0, \dots, \theta_n)$
}
(simultaneously update $\theta_j$ for every $j = 0, \dots, n$)

Gradient Descent

Previously (n = 1): Repeat {
    $\theta_0 := \theta_0 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)$
    $\theta_1 := \theta_1 - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x^{(i)}$
}

New algorithm (n ≥ 1): Repeat {
    $\theta_j := \theta_j - \alpha \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}$
}
(simultaneously update $\theta_j$ for $j = 0, \dots, n$)
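A vectorized sketch of this multi-variable update (my own illustration; X is assumed to be the m × (n+1) design matrix whose first column is all ones, i.e. $x_0 = 1$):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, iterations=1000):
    """Batch gradient descent for linear regression with multiple variables.

    X: (m, n+1) design matrix with a leading column of ones (x_0 = 1).
    y: (m,) vector of targets.
    Returns theta, the (n+1,) parameter vector.
    """
    m = X.shape[0]
    theta = np.zeros(X.shape[1])
    for _ in range(iterations):
        errors = X @ theta - y            # h_theta(x^(i)) - y^(i) for all i
        gradient = X.T @ errors / m       # (1/m) * sum_i (error_i * x_j^(i)) for each j
        theta = theta - alpha * gradient  # simultaneously updates every theta_j
    return theta
```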

x₀   Size (feet²)   Number of bedrooms   Number of floors   Age of home (years)   Price ($1000)
1    2104           5                    1                  45                    460
1    1416           3                    2                  40                    232
1    1534           3                    2                  30                    315
1    852            2                    1                  36                    178


Examples (simultaneously update all $\theta_j$):

Size (feet²)   Number of bedrooms   Number of floors   Age of home (years)   Price ($1000)
2104           5                    1                  45                    460
1416           3                    2                  40                    232
1534           3                    2                  30                    315
852            2                    1                  36                    178
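For completeness, a short sketch of my own showing how this table becomes the design matrix X (with the $x_0 = 1$ column) and the target vector y that the gradient_descent sketch above expects:

```python
import numpy as np

# Features: size (feet^2), bedrooms, floors, age (years); target: price ($1000)
features = np.array([
    [2104, 5, 1, 45],
    [1416, 3, 2, 40],
    [1534, 3, 2, 30],
    [ 852, 2, 1, 36],
], dtype=float)
y = np.array([460, 232, 315, 178], dtype=float)

# Prepend the x_0 = 1 column to form the (m, n+1) design matrix
X = np.column_stack([np.ones(len(features)), features])

theta = np.zeros(X.shape[1])   # starting parameters (all zeros)
print(X @ theta)               # h_theta(x^(i)) for all four houses at this starting point
```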

Summarize

• This is a brief introduction to Supervised Learning (Regression) in Machine Learning.

• There is still a lot more in this subject, such as Clustering, Support Vector Machines (SVM), Dimensionality Reduction, etc. The core ideas of ML are very similar, and I hope you will become fond of Machine Learning!

Thanks for Listening!
