
Page 1: Transfer for Supervised Learning Tasks

HAITHAM BOU AMMAR

MAASTRICHT UNIVERSITY

Transfer for Supervised Learning Tasks

Page 2: Transfer for Supervised Learning Tasks

Motivation

[Diagram: a model is trained to map inputs x to labels y on the training data; for the test data x′, the prediction is unknown (?)]

Assumptions:
1. Training and test data are from the same distribution
2. Training and test data are in the same feature space

Not true in many real-world applications!

Page 3: Transfer for Supervised Learning Tasks

Examples: Web-document Classification

[Diagram: a model classifies web documents into Physics, Machine Learning, or Life Science; a new document arrives with unknown label (?)]

Page 4: Transfer for Supervised Learning Tasks

Examples: Web-document Classification

[Same diagram, but the content of the web documents has changed]

Content change!

Assumption violated!

Learn a new model

Page 5: Transfer for Supervised Learning Tasks

Learn a new model:

1. Collect new labeled data
2. Build a new model

Instead: reuse & adapt the already learned model!

Page 6: Transfer for Supervised Learning Tasks

Examples: Image Classification

[Diagram: Task One data → Features (Task One) → Model One]

Page 7: Transfer for Supervised Learning Tasks

Examples: Image Classification

[Diagram: Task Two (cars vs. motorcycles) reuses the Task One features to obtain the Task Two features and build Model Two]

Page 8: Transfer for Supervised Learning Tasks

Traditional Machine Learning vs. Transfer

[Diagram: in traditional machine learning, each of the different tasks gets its own learning system; in transfer learning, knowledge extracted from the source task is passed to the learning system for the target task]

Page 9: Transfer for Supervised Learning Tasks

Questions ?

Page 10: Transfer for Supervised Learning Tasks

Transfer Learning Definition

Notation: A domain D = {X, P(X)} consists of:

1. A feature space X
2. A marginal probability distribution P(X), where X = {x_1, ..., x_n} with each x_i ∈ X

Given a domain D = {X, P(X)}, a task is T = {Y, P(Y|X)}, consisting of:

1. A label space Y
2. A conditional probability distribution P(Y|X)

Page 11: Transfer for Supervised Learning Tasks

Transfer Learning Definition

Given a source domain D_S with source learning task T_S, and a target domain D_T with target learning task T_T, transfer learning aims to help improve the learning of the target predictive function using the source knowledge, where D_S ≠ D_T or T_S ≠ T_T.

Page 12: Transfer for Supervised Learning Tasks

Transfer Definition

Therefore, transfer learning applies if either:

Domain differences: X_S ≠ X_T or P_S(X) ≠ P_T(X)

Task differences: Y_S ≠ Y_T or P_S(Y|X) ≠ P_T(Y|X)

Page 13: Transfer for Supervised Learning Tasks

Examples: Cancer Data

One data set has the features (Age, Smoking); the other has (Age, Height, Smoking): the feature spaces differ.

Page 14: Transfer for Supervised Learning Tasks

Examples: Cancer Data

Task source: classify into cancer or no cancer.

Task target: classify into cancer level one, cancer level two, or cancer level three.

Page 15: Transfer for Supervised Learning Tasks

Quiz

When doesn't transfer help? (Hint: on what should you condition?)

Page 16: Transfer for Supervised Learning Tasks

Questions ?

Page 17: Transfer for Supervised Learning Tasks

Questions to answer when transferring:

1. What to transfer? Instances? Model? Features?
2. How to transfer? Map the model? Unify features? Weight instances?
3. When to transfer? In which situations?

Page 18: Transfer for Supervised Learning Tasks

Algorithms: TrAdaBoost

Assumptions:

1. Source and target tasks have the same feature space: X_S = X_T
2. The marginal distributions are different: P_S(X) ≠ P_T(X)

Not all source data might be helpful!

Page 19: Transfer for Supervised Learning Tasks

Algorithms: TrAdaBoost (Quiz)

What to transfer? Instances.

How to transfer? Weight instances.

Page 20: Transfer for Supervised Learning Tasks

Algorithm: TrAdaBoost

Idea: Iteratively re-weight the source samples so as to:

1. Reduce the effect of "bad" source instances
2. Encourage the effect of "good" source instances

Requires:

1. A labeled source-task data set
2. A very small labeled target-task data set
3. An unlabeled target data set
4. A base learner

Page 21: Transfer for Supervised Learning Tasks

Algorithm: TrAdaBoost

1. Weights initialization
2. Hypothesis learning and error calculation
3. Weights update
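A minimal sketch of these three steps, following the TrAdaBoost procedure of Dai et al. (2007). It assumes binary 0/1 labels, NumPy arrays, and a scikit-learn decision stump as the base learner; the function name `tradaboost` and all variable names are illustrative, not from the slides.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def tradaboost(X_src, y_src, X_tgt, y_tgt, n_iters=10):
    """Sketch of TrAdaBoost: returns per-round hypotheses and target betas."""
    n, m = len(X_src), len(X_tgt)
    X = np.vstack([X_src, X_tgt])
    y = np.concatenate([y_src, y_tgt])
    w = np.ones(n + m)                                   # 1. weights initialization
    beta_src = 1.0 / (1.0 + np.sqrt(2.0 * np.log(n) / n_iters))
    hypotheses, betas = [], []
    for _ in range(n_iters):
        p = w / w.sum()                                  # normalize weights
        h = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=p)
        pred = h.predict(X)                              # 2. hypothesis learning ...
        err_tgt = np.abs(pred[n:] - y_tgt)
        eps = float(np.sum(p[n:] * err_tgt) / np.sum(p[n:]))  # ... error on target data only
        eps = min(max(eps, 1e-10), 0.499)                # keep the update well defined
        beta_tgt = eps / (1.0 - eps)
        err_src = np.abs(pred[:n] - y_src)
        w[:n] *= beta_src ** err_src                     # 3. weights update: damp "bad" source instances
        w[n:] *= beta_tgt ** (-err_tgt)                  #    and boost misclassified target instances
        hypotheses.append(h)
        betas.append(beta_tgt)
    return hypotheses, betas
```

In the original formulation, final predictions are then a weighted majority vote over the hypotheses of the last half of the rounds, using the stored beta values.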

Page 22: Transfer for Supervised Learning Tasks

Questions ?

Page 23: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

The problem targeted:

1. Little labeled data
2. A lot of unlabeled data

Build a model on the labeled data.

[Figure: many unlabeled natural scenes; a few labeled images of a car and a motorcycle]

Page 24: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

Assumptions:

1. Source and target tasks have different feature spaces: X_S ≠ X_T
2. The marginal distributions are different: P_S(X) ≠ P_T(X)
3. The label spaces are different: Y_S ≠ Y_T

Page 25: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning (Quiz)

What to transfer? Features.

How to transfer? 1. Discover features. 2. Unify features.

Page 26: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

Framework:

1. Source unlabeled data set: natural scenes
2. Target labeled data set: cars and motorbikes

Goal: build a classifier for cars and motorbikes.

Page 27: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

Step One: Discover high-level features from the source data by solving the sparse coding problem

  minimize over bases b and activations a:  Σ_i || x_i − Σ_j a_i^j b_j ||²  (re-construction error)  +  β Σ_i || a_i ||_1  (regularization term)
  subject to  || b_j ||² ≤ 1  for all j  (constraint on the bases)
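A hedged sketch of this step, using scikit-learn's DictionaryLearning, which optimizes an objective of the same form (squared reconstruction error plus an L1 penalty on the activations, with unit-norm bases). The data shape, number of bases, and regularization weight below are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

# Assumed source data: e.g. 500 unlabeled 8x8 natural-scene patches, flattened.
X_unlabeled = np.random.rand(500, 64)

coder = DictionaryLearning(
    n_components=100,                  # number of high-level bases b_j to discover
    alpha=1.0,                         # beta on the slide: weight of the L1 regularization term
    max_iter=50,
    transform_algorithm="lasso_lars",  # so Step Two's projection also uses the L1 objective
    transform_alpha=1.0,
)
coder.fit(X_unlabeled)                 # learns the bases from the unlabeled source data
bases = coder.components_              # each row is one learned basis b_j
```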

Page 28: Transfer for Supervised Learning Tasks

Algorithm: Self-Taught Learning

[Figure: the unlabeled data set is mapped to a set of high-level features (the learned bases)]

Page 29: Transfer for Supervised Learning Tasks

Algorithm: Self-Taught Learning

Step Two: Project the target data onto the attained features by solving

  â(x) = argmin_a || x − Σ_j a_j b_j ||² + β || a ||_1

Informally, find the activations in the attained bases such that:
1. The re-construction error is minimized
2. The attained activation vector is sparse
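Continuing the Step One sketch, `coder.transform` solves this lasso problem for each example with the bases held fixed; `X_target` below is a placeholder for the labeled target inputs (cars and motorbikes).

```python
# Reuses `coder` from the Step One sketch; X_target is assumed target data
# with the same raw feature dimension as the source patches.
X_target = np.random.rand(200, 64)
A_target = coder.transform(X_target)   # sparse activations a-hat, one row per target example
```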

Page 30: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

[Figure: high-level features]

Page 31: Transfer for Supervised Learning Tasks

Algorithms: Self-Taught Learning

Step Three: Learn a classifier with the new features.

Summary: on the source task, learn new features (Step 1); on the target task, project the target data (Step 2) and learn the model (Step 3).
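Continuing the same sketch, Step Three is ordinary supervised learning on the projected data. Logistic regression is an arbitrary choice here, and `y_target` stands in for the real car/motorbike labels.

```python
from sklearn.linear_model import LogisticRegression

y_target = np.random.randint(0, 2, size=len(X_target))        # placeholder labels: 0 = car, 1 = motorbike
clf = LogisticRegression(max_iter=1000).fit(A_target, y_target)
print(clf.predict(coder.transform(X_target[:5])))              # classify examples in the new feature space
```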

Page 32: Transfer for Supervised Learning Tasks

Conclusions

Transfer learning re-uses source knowledge to help a target learner.

Transfer learning is not generalization.

TrAdaBoost transfers instances.

Self-Taught Learning transfers features learned from unlabeled data.
