
Sta 414/2104 S

http://www.utstat.utoronto.ca/reid/414S10.html 

STA 414S/2104S: Statistical Methods for Data Mining and Machine Learning

January - April 2010; Tuesday 12-2, Thursday 12-1, SS 2105

 Course Information  

This course will consider topics in statistics that have played a role in the development of techniques for data mining and machine learning. We will cover linear methods for regression and classification, nonparametric regression and classification methods, generalized additive models, aspects of model inference and model selection, model averaging and tree-based methods.

Prerequisite: Either STA 302H (regression) or CSC 411H (machine learning). CSC 108H was recently added; this is not an urgent requirement, but you must be willing to use a statistical computing environment such as R or Matlab.

Office Hours: Tuesdays 3-4, Thursdays 2-3, or by appointment.

Textbook: Hastie, Tibshirani and Friedman. The Elements of Statistical Learning. Springer-Verlag. http://www-stat.stanford.edu/~tibs/ElemStatLearn/index.html

Course evaluation:
• Homework 1, due February 11: 20%
• Homework 2, due March 4: 20%
• Midterm exam, March 16: 20%
• Final project, due April 16: 40%

Grades and class emails will be managed through Blackboard.

Computing: I will refer to, and provide explanations for, the R computing environment. You are welcome to use some other package if you prefer. There are many online resources for R, including:

• Jeff Rosenthal's introduction for the course STA 410/2102: http://probability.ca/jeff/teaching/0708/sta410/Rinfo.html
• the documentation provided on the R project web site, http://probability.ca/cran/manuals.html, especially the Introduction to R: http://cran.r-project.org/doc/manuals/R-intro.html
• John Verzani's online book simpleR: http://www.math.csi.cuny.edu/Statistics/R/simpleR/
• Radford Neal's notes for STA 410: http://www.utstat.utoronto.ca/~radford/sta2102.S04/

1 / 20


Tentative Syllabus
• Regression: linear, ridge, lasso, logistic, polynomial splines, smoothing splines, kernel methods, additive models, regression trees, projection pursuit, neural networks: Chapters 3, 5, 9, 11
• Classification: logistic regression, linear discriminant analysis, generalized additive models, kernel methods, naive Bayes, classification trees, support vector machines, neural networks, K-means, k-nearest neighbours, random forests: Chapters 4, 6, 9, 11, 12, 15
• Model Selection and Averaging: AIC, cross-validation, test error, training error, bootstrap aggregation: Chapters 7, 8.7
• Unsupervised learning: K-means clustering, k-nearest neighbours, hierarchical clustering: Chapter 14

2 / 20


Some references
• Venables, W.N. and Ripley, B.D. (2002). Modern Applied Statistics with S (4th ed.). Springer-Verlag. Detailed reference for computing with R.
• Maindonald, J. and Braun, J. Data Analysis and Graphics Using R. Cambridge University Press. Gentler source for R information.
• Hand, D., Mannila, H. and Smyth, P. (2001). Principles of Data Mining. MIT Press. Nice blend of computer science and statistical methods.
• Ripley, B.D. (1996). Pattern Recognition and Neural Networks. Cambridge University Press. Excellent, but concise, discussion of many machine learning methods.

3 / 20


Some Courses
• Columbia (with links to other courses): http://www.stat.columbia.edu/~madigan/DM08
• Stanford: http://www-stat.stanford.edu/~tibs/stat315a.html
• Carnegie Mellon: http://www.stat.cmu.edu/~larry/=sml2008/

4 / 20


Data mining/machine learning
• large data sets
• high dimensional spaces
• potentially little information on structure
• computationally intensive
• plots are essential, but require considerable pre-processing
• emphasis on means and variances
• emphasis on prediction
• prediction rules are often automated: e.g. spam filters
• Hand, Mannila, Smyth: "Data mining is the analysis of (often large) observational data sets to find unsuspected relationships and to summarize the data in novel ways that are both understandable and useful to the data owner"

5 / 20


Applied Statistics (Sta 302/437/442)
• small data sets
• low dimension
• lots of information on the structure
• plots are useful, and easy to use
• emphasis on likelihood using probability distributions
• emphasis on inference: often maximum likelihood estimators (or least squares estimators)
• inference results are highly 'user-intensive': focus on scientific understanding

6 / 20


Some common methods
• regression (Sta 302)
• classification (Sta 437: applied multivariate)
• kernel smoothing (Sta 414, App Stat 2)
• neural networks
• support vector machines
• kernel methods
• nearest neighbours
• classification and regression trees
• random forests
• ensemble learning

7 / 20


Some quotes
• HMS [1]: "Data mining is often set in the broader context of knowledge discovery in databases, or KDD" (http://www.kdnuggets.com/)
• Hand et al.: Data mining tasks are: exploratory data analysis; descriptive modelling (cluster analysis, segmentation); predictive modelling (classification, regression); pattern detection; retrieval by content; recommender systems
• Data mining algorithms include: model or pattern structure, score function for model quality, optimization and search, data management
• Ripley (pattern recognition): "Given some examples of complex signals and the correct decision for them, make decisions automatically for a stream of future examples"

[1] Haughton, D. and 5 others (2003). A review of software packages for data mining. American Statistician, 57, 290-309.

8 / 20


Software
• Haughton et al. review software for data mining: Enterprise Miner, XLMiner, Quadstone, GhostMiner, Clementine
• Approximate prices:
    Enterprise Miner   $40,000 - $100,000
    XLMiner            $1,499 (educ)
    Quadstone          $200,000 - $1,000,000
    GhostMiner         $2,500 - $30,000 + annual fees
• R (http://probability.ca/cran/): free!!
• You will want to install the package MASS; other packages will be mentioned as needed (see the sketch below)
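For example, a minimal first R session might look like the following sketch (illustrative, not from the course notes; MASS ships with standard R installations):

    library(MASS)     # Venables & Ripley's package of functions and data sets
    data(Boston)      # one of many example data sets included in MASS
    str(Boston)       # inspect the variables
    # MASS also provides lda() and qda(), useful later for classification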

9 / 20


Examples from text
• email spam: 4601 messages, frequencies of 57 commonly occurring words
• prostate cancer: 97 patients, 9 covariates
• handwriting recognition data
• DNA microarray data: 64 samples, 6830 genes
• "Learning" = building a prediction model to predict an outcome for new observations
• "Supervised learning" = regression or classification: a sample of the outcome (y) is available
• "Unsupervised learning" = clustering: searching for structure in the data

Book Figures (two of these data sets are loaded in the sketch below)
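A sketch of loading the book's data in R (an assumption: this uses the companion ElemStatLearn package, which bundles the text's data sets but is not mentioned on the slide):

    # install.packages("ElemStatLearn")   # companion data package for the textbook
    library(ElemStatLearn)
    data(prostate)       # 97 patients; covariates plus the response lpsa
    dim(prostate)
    data(spam)           # 4601 messages, word/character frequencies
    table(spam$spam)     # counts of ordinary email vs. spam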

10 / 20


Elements of Statistical Learning, (c) Hastie, Tibshirani & Friedman 2001, Chapter 1

[Figure 1.1: Scatterplot matrix of the prostate cancer data (variables lpsa, lcavol, lweight, age, lbph, svi, lcp, gleason and pgg45). The first row shows the response, lpsa, against each of the predictors in turn. Two of the predictors, svi and gleason, are categorical.]

11 / 20


Some of the handwriting data

12 / 20


Some basic definitions
• input $X$ (features), output $Y$ (response)
• data $(x_i, y_i)$, $i = 1, \ldots, N$
• use the data to assign a rule (function) taking $X \to Y$
• goal is to predict a new value of $Y$, given $X$: $\hat Y(X)$
• regression if $Y$ is continuous; classification if $Y$ is discrete
• Examples...

13 / 20


How to know if the rule works?
• compare $\hat y_i$ to $y_i$ on the training data: training error
• compare $\hat Y$ to $Y$ on new data: test error
• In ordinary linear regression, the least squares estimates minimize $\sum_i (y_i - x_i^T \beta)^2$
• and the minimized value is the sum of the squared residuals: $\sum_i (y_i - \hat y_i)^2 = \sum_i (y_i - x_i^T \hat\beta)^2$
• test error is $\sum_j (Y_j - x_j^T \hat\beta)^2$ (illustrated in the sketch below)
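A minimal R sketch of this comparison, using simulated data (the sample sizes, coefficients and noise level are illustrative assumptions):

    set.seed(1)
    n <- 100
    x <- runif(n)
    y <- 1 + 2 * x + rnorm(n, sd = 0.5)   # true relationship is linear
    train <- sample(n, 50)                # half for training, half for testing

    fit <- lm(y ~ x, subset = train)      # least squares on the training set
    sum(resid(fit)^2)                     # training error: sum of squared residuals
    yhat <- predict(fit, newdata = data.frame(x = x[-train]))
    sum((y[-train] - yhat)^2)             # test error, on data not used in fitting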

14 / 20


... training and test error
• In classification: use training data $(x_i, y_i)$ to estimate a classification rule $\hat G(\cdot)$
• $\hat G(x)$ takes a new observation $x$ and assigns a class/category/target $g \in \mathcal{G}$ for the corresponding new $y$
• for example, the simplest version of linear discriminant analysis says $\hat y_j = 1$ if $x_j^T \hat\beta > c$, else $\hat y_j = 0$ (see the sketch below)
• what criterion function does this minimize?
• test error is the proportion of misclassifications in the test data set
• in regression or in linear discriminant analysis, training error is always smaller than test error: why?
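A small sketch of such a threshold rule in R, built from a least squares fit to a 0/1 response (simulated two-class data; the cutoff c = 0.5 is an illustrative assumption):

    set.seed(2)
    n <- 200
    x <- c(rnorm(n/2, mean = 0), rnorm(n/2, mean = 2))   # one feature, two classes
    y <- rep(0:1, each = n/2)
    train <- sample(n, n/2)

    fit <- lm(y ~ x, subset = train)      # regress the 0/1 response on x
    rule <- function(xnew, c = 0.5)       # yhat = 1 if the fitted value exceeds c
      as.numeric(predict(fit, data.frame(x = xnew)) > c)
    mean(rule(x[train]) != y[train])      # training misclassification rate
    mean(rule(x[-train]) != y[-train])    # test misclassification rate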

15 / 20


Linear regression with polynomials
• model $y_i = \beta_0 + \beta_1 x_i + \beta_2 x_i^2 + \beta_3 x_i^3 + \cdots + \varepsilon_i$
• assume $\varepsilon_i \sim N(0, \sigma^2)$ (independent)
• how to choose the degree of the polynomial?
• for fixed degree, find $\hat\beta$ by least squares

[Figure: scatterplot of y against x; x ranges over (0, 1) and y over (-1, 1), illustrating the polynomial fits.]

go to R

16 / 20


The Netflix Prize
http://www.netflixprize.com//index

17 / 20


Digital Doctoring
• "Digital doctoring: how to tell the real from the fake", H. Farid, Significance, 2006
• "Exposing Digital Forgeries by Detecting Inconsistencies in Lighting", M.K. Johnson and H. Farid, ACM, 2005
• http://www.cs.dartmouth.edu/farid/

18 / 20


Automated Vision Systems
• http://www.cs.ubc.ca/~murphyk/Vision/placeRecognition.html
• "Context-based vision system for place and object recognition", Antonio Torralba, Kevin P. Murphy, William T. Freeman and Mark Rubin, ICCV 2003

19 / 20


For next class/week
• Thursday: Make sure you can start R
• Thursday: Overview of linear regression in R
• Tuesday: Read Chapter 1 and Sections 2.1-2.3 of HTF
• Tuesday: We will cover Ch. 3.1-3.3/4 of HTF

20 / 20