Machine Learning - Wuwei Lan · 2020-05-06


Page 1:

Machine Learning

§ Machine learning: how to acquire a model from data / experience
§ Learning parameters (e.g. probabilities)
§ Learning structure (e.g. BN graphs)
§ Learning hidden concepts (e.g. clustering, neural nets)

§ Today: model-based classification with Naive Bayes

Page 2:

Classification

Page 3:

Example: Spam Filter

§ Input: an email
§ Output: spam / ham

§ Setup:
§ Get a large collection of example emails, each labeled "spam" or "ham"
§ Note: someone has to hand label all this data!
§ Want to learn to predict labels of new, future emails

§ Features: the attributes used to make the ham/spam decision (see the feature-extraction sketch after the example emails below)
§ Words: FREE!
§ Text patterns: $dd, CAPS
§ Non-text: SenderInContacts, WidelyBroadcast
§ …

Dear Sir.

First, I must solicit your confidence in this transaction, this is by virture of its nature as being utterly confidencial and top secret. …

TO BE REMOVED FROM FUTURE MAILINGS, SIMPLY REPLY TO THIS MESSAGE AND PUT "REMOVE" IN THE SUBJECT.

99 MILLION EMAIL ADDRESSES FOR ONLY $99

Ok, I know this is blatantly OT but I'm beginning to go insane. Had an old Dell Dimension XPS sitting in the corner and decided to put it to use, I know it was working pre being stuck in the corner, but when I plugged it in, hit the power nothing happened.
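The feature types listed on this slide can be made concrete with a small sketch (the function name, the specific features, and the contacts argument are illustrative assumptions, not the course's feature set):

import re

def extract_features(email_text, sender, contacts):
    """Hypothetical feature extractor for the ham/spam decision."""
    features = {}
    # Word feature: presence of a specific trigger word
    features["contains_FREE!"] = "FREE!" in email_text
    # Text-pattern features: dollar amounts like $99, and runs of CAPS
    features["has_dollar_amount"] = bool(re.search(r"\$\d+", email_text))
    features["has_caps_run"] = bool(re.search(r"\b[A-Z]{4,}\b", email_text))
    # Non-text feature: is the sender already in the recipient's contacts?
    features["sender_in_contacts"] = sender in contacts
    return features

# Example on the third spam snippet above
print(extract_features("99 MILLION EMAIL ADDRESSES FOR ONLY $99", "x@y.com", set()))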

Page 4:

Example: Digit Recognition

§ Input: images / pixel grids
§ Output: a digit 0-9

§ Setup:
§ Get a large collection of example images, each labeled with a digit
§ Note: someone has to hand label all this data!
§ Want to learn to predict labels of new, future digit images

§ Features: the attributes used to make the digit decision
§ Pixels: (6,8) = ON
§ Shape patterns: NumComponents, AspectRatio, NumLoops
§ …
§ Features are increasingly induced rather than crafted

[Example digit images, labeled 0, 1, 2, 1, and ?? for the unlabeled test image]

Page 5:

Other Classification Tasks

§ Classification: given inputs x, predict labels (classes) y

§ Examples:
§ Medical diagnosis (input: symptoms, classes: diseases)
§ Fraud detection (input: account activity, classes: fraud / no fraud)
§ Automatic essay grading (input: document, classes: grades)
§ Customer service email routing
§ Review sentiment
§ Language ID
§ … many more

§ Classification is an important commercial technology!

Page 6:

Model-Based Classification

Page 7:

Model-Based Classification

§ Model-based approach
§ Build a model (e.g. Bayes' net) where both the output label and input features are random variables
§ Instantiate any observed features
§ Query for the distribution of the label conditioned on the features

§ Challenges
§ What structure should the BN have?
§ How should we learn its parameters?

Page 8:

Naïve Bayes for Digits

§ Naïve Bayes: assume all features are independent effects of the label

§ Simple digit recognition version:
§ One feature (variable) F_ij for each grid position <i,j>
§ Feature values are on/off, based on whether intensity is more or less than 0.5 in the underlying image
§ Each input maps to a feature vector, e.g. [example image and its on/off feature vector shown on the slide]

§ Here: lots of features, each is binary valued

§ Naïve Bayes model (via Bayes' theorem): P(Y | F1 … Fn) ∝ P(Y) ∏_i P(Fi | Y)

§ What do we need to learn?

[Bayes' net: label Y with feature children F1, F2, …, Fn]
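A minimal sketch of this model in Python (the helper names binarize, prior, and cpt are placeholders, not course code): threshold the pixel grid at 0.5 to get the binary features, then multiply P(Y) by P(Fi | Y) for every feature:

import numpy as np

def binarize(image, threshold=0.5):
    """Map a grayscale pixel grid to the flat binary on/off feature vector F_ij."""
    return (np.asarray(image) > threshold).astype(int).ravel()

def joint(y, features, prior, cpt):
    """P(y, f_1, ..., f_n) = P(y) * prod_i P(f_i | y).

    prior[y]  : P(Y = y)
    cpt[y][i] : P(F_i = on | Y = y)   (placeholder tables, estimated later)
    """
    p = prior[y]
    for i, f in enumerate(features):
        p *= cpt[y][i] if f == 1 else 1.0 - cpt[y][i]
    return p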

Page 9:

General Naïve Bayes

§ A general Naive Bayes model: P(Y, F1 … Fn) = P(Y) ∏_i P(Fi | Y)

§ We only have to specify how each feature depends on the class
§ Total number of parameters is linear in n
§ Model is very simplistic, but often works anyway

[Bayes' net: label Y with feature children F1, F2, …, Fn]

P(Y): |Y| parameters
P(Fi | Y), for all i: n x |F| x |Y| parameters
(compare: the full joint over Y, F1 … Fn has |Y| x |F|^n values)
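To make "linear in n" concrete, here is the arithmetic for the digit model, assuming a 28x28 binary pixel grid and 10 classes (the 28x28 size is an assumption; the slides only say "lots of features"):

# Assumed sizes: 28x28 binary pixel features, 10 digit classes
n, F, Y = 28 * 28, 2, 10

naive_bayes_params = Y + n * F * Y   # |Y| for P(Y) plus n*|F|*|Y| for the P(Fi|Y) tables
full_joint_values = Y * F ** n       # |Y| * |F|^n entries in the full joint, for comparison

print(naive_bayes_params)            # 15,690: linear in n
print(full_joint_values)             # 10 * 2^784, a number with over 230 digits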

Page 10:

Inference for Naïve Bayes

§ Goal: compute posterior distribution over label variable Y

§ Step 1: get joint probability of label and evidence for each label: P(Y, f1 … fn) = P(Y) ∏_i P(fi | Y)

§ Step 2: sum to get probability of evidence: P(f1 … fn) = Σ_y P(y, f1 … fn)

§ Step 3: normalize by dividing Step 1 by Step 2: P(Y | f1 … fn) = P(Y, f1 … fn) / P(f1 … fn)
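A direct sketch of these three steps, assuming the prior and conditional tables are already given (the names prior and cond, and the table layout, are placeholders):

def naive_bayes_posterior(features, prior, cond):
    """Posterior P(Y | f_1, ..., f_n) via the three steps above.

    prior[y]      : P(Y = y)
    cond[y][i][f] : P(F_i = f | Y = y)
    """
    # Step 1: joint probability of label and evidence, for each label
    joint = {}
    for y in prior:
        p = prior[y]
        for i, f in enumerate(features):
            p *= cond[y][i][f]
        joint[y] = p

    # Step 2: probability of the evidence (sum over labels)
    evidence = sum(joint.values())

    # Step 3: normalize Step 1 by Step 2
    return {y: joint[y] / evidence for y in joint}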

Page 11:

General Naïve Bayes

§ What do we need in order to use Naïve Bayes?

§ Inference method (we just saw this part)
§ Start with a bunch of probabilities: P(Y) and the P(Fi|Y) tables
§ Use standard inference to compute P(Y | F1 … Fn)
§ Nothing new here

§ Estimates of local conditional probability tables
§ P(Y), the prior over labels
§ P(Fi|Y) for each feature (evidence variable)
§ These probabilities are collectively called the parameters of the model and denoted by θ

§ Up until now, we assumed these appeared by magic, but…
§ …they typically come from training data counts: we'll look at this soon

Page 12:

Example: Conditional Probabilities

P(Y):
  Y:  1    2    3    4    5    6    7    8    9    0
      0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1

P(Fa = on | Y), for one example pixel feature Fa:
  Y:  1    2    3    4    5    6    7    8    9    0
      0.01 0.05 0.05 0.30 0.80 0.90 0.05 0.60 0.50 0.80

P(Fb = on | Y), for another example pixel feature Fb:
  Y:  1    2    3    4    5    6    7    8    9    0
      0.05 0.01 0.90 0.80 0.90 0.90 0.25 0.85 0.60 0.80
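A worked sketch of how these tables get used (Fa and Fb are the placeholder names introduced in the reconstruction above): if both pixels are observed on, each digit's unnormalized score is P(Y = y) * P(Fa = on | y) * P(Fb = on | y), for example:

prior = 0.1                      # P(Y = y) is 0.1 for every digit
p_a_on = {2: 0.05, 6: 0.90}      # P(Fa = on | Y), read off the first table
p_b_on = {2: 0.01, 6: 0.90}      # P(Fb = on | Y), read off the second table

for digit in (2, 6):
    print(digit, prior * p_a_on[digit] * p_b_on[digit])
# digit 2 scores 0.1 * 0.05 * 0.01 = 0.00005; digit 6 scores 0.1 * 0.90 * 0.90 = 0.081,
# so this pair of observations favors 6 over 2 (before normalizing).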

Page 13:

Naïve Bayes for Text

§ Bag-of-words Naïve Bayes:
§ Features: Wi is the word at position i
§ As before: predict label conditioned on feature variables (spam vs. ham)
§ As before: assume features are conditionally independent given label
§ New: each Wi is identically distributed

§ Generative model: P(Y, W1 … Wn) = P(Y) ∏_i P(Wi | Y)

§ "Tied" distributions and bag-of-words
§ Usually, each variable gets its own conditional probability distribution P(F|Y)
§ In a bag-of-words model
§ Each position is identically distributed
§ All positions share the same conditional probs P(W|Y)
§ Why make this assumption?

§ Called "bag-of-words" because the model is insensitive to word order or reordering

Note: Wi is the word at position i, not the i-th word in the dictionary!
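A minimal sketch of scoring a document under this generative model, with one tied word table per class (the names prior and word_probs are assumptions, and logs are used for numerical stability):

import math

def log_score(words, label, prior, word_probs):
    """Unnormalized log P(Y = label, W_1, ..., W_n) for bag-of-words Naive Bayes.

    prior[label]            : P(Y = label)
    word_probs[label][word] : P(W = word | Y = label), shared by all positions
    """
    score = math.log(prior[label])
    for w in words:
        # A word absent from word_probs[label] would raise KeyError here, which is
        # exactly the unseen-word problem discussed under overfitting below.
        score += math.log(word_probs[label][w])
    return score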

Page 14:

Example: Spam Filtering

§ Model: P(Y, W1 … Wn) = P(Y) ∏_i P(Wi | Y)

§ What are the parameters? (the tables below: P(Y) and a word table P(W|Y) for each class)

§ Where do these tables come from?

Word distribution P(W | Y) for one class:
the  : 0.0156
to   : 0.0153
and  : 0.0115
of   : 0.0095
you  : 0.0093
a    : 0.0086
with : 0.0080
from : 0.0075
...

Word distribution P(W | Y) for the other class:
the  : 0.0210
to   : 0.0133
of   : 0.0119
2002 : 0.0110
with : 0.0108
from : 0.0107
and  : 0.0105
a    : 0.0100
...

Prior P(Y):
ham  : 0.66
spam : 0.33
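They come from training counts, as the following slides discuss; a hedged sketch of the relative-frequency estimates (the function name and input format are assumptions, and smoothing is deliberately left out):

from collections import Counter

def estimate_tables(labeled_emails):
    """Relative-frequency estimates of P(Y) and P(W | Y) from (word list, label) pairs."""
    label_counts = Counter()
    word_counts = {}
    for words, label in labeled_emails:
        label_counts[label] += 1
        word_counts.setdefault(label, Counter()).update(words)

    total = sum(label_counts.values())
    prior = {y: c / total for y, c in label_counts.items()}
    word_probs = {y: {w: c / sum(wc.values()) for w, c in wc.items()}
                  for y, wc in word_counts.items()}
    return prior, word_probs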

Page 15:

Training and Testing

Page 16:

Empirical Risk Minimization

§ Empirical risk minimization
§ Basic principle of machine learning
§ We want the model (classifier, etc.) that does best on the true test distribution
§ Don't know the true distribution, so pick the best model on our actual training set
§ Finding "the best" model on the training set is phrased as an optimization problem

§ Main worry: overfitting to the training set
§ Better with more training data (less sampling variance, training more like test)
§ Better if we limit the complexity of our hypotheses (regularization and/or small hypothesis spaces)

Page 17:

Important Concepts

§ Data: labeled instances (e.g. emails marked spam/ham)
§ Training set
§ Held-out set
§ Test set

§ Features: attribute-value pairs which characterize each x

§ Experimentation cycle
§ Learn parameters (e.g. model probabilities) on training set
§ (Tune hyperparameters on held-out set)
§ Compute accuracy on test set
§ Very important: never "peek" at the test set!

§ Evaluation (many metrics possible, e.g. accuracy)
§ Accuracy: fraction of instances predicted correctly

§ Overfitting and generalization
§ Want a classifier which does well on test data
§ Overfitting: fitting the training data very closely, but not generalizing well
§ We'll investigate overfitting and generalization formally in a few lectures

[Figure: the labeled data divided into Training Data, Held-Out Data, and Test Data]
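A small sketch of the data split behind this experimentation cycle, using scikit-learn (which the homework snippet later in the deck also uses); the 60/20/20 proportions and the dummy data are illustrative assumptions:

import numpy as np
from sklearn.model_selection import train_test_split

# Dummy labeled data standing in for the real feature matrix and labels
X = np.random.rand(1000, 20)
y = np.random.randint(0, 2, size=1000)

# Carve it into training / held-out / test sets (60/20/20 here)
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.20)
X_train, X_held, y_train, y_held = train_test_split(X_rest, y_rest, test_size=0.25)

# Learn parameters on the training set, tune hyperparameters on the held-out set,
# and compute accuracy on the test set exactly once: never "peek" at it earlier.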

Page 18:

Generalization and Overfitting

Page 19:

Overfitting

Page 20:

Example: Overfitting

2 wins!!

Page 21:

Example: Overfitting

§ Posteriors determined by relative probabilities (odds ratios):

Words with P(W|ham)/P(W|spam) = inf:
south-west : inf
nation     : inf
morally    : inf
nicely     : inf
extent     : inf
seriously  : inf
...

Words with P(W|spam)/P(W|ham) = inf:
screens    : inf
minute     : inf
guaranteed : inf
$205.00    : inf
delivery   : inf
signature  : inf
...

What went wrong here?

Page 22:

Generalization and Overfitting

§ Relative frequency parameters will overfit the training data!
§ Just because we never saw a 3 with pixel (15,15) on during training doesn't mean we won't see it at test time
§ Unlikely that every occurrence of "minute" is 100% spam
§ Unlikely that every occurrence of "seriously" is 100% ham
§ What about all the words that don't occur in the training set at all?
§ In general, we can't go around giving unseen events zero probability

§ As an extreme case, imagine using the entire email as the only feature (e.g. document ID)
§ Would get the training data perfect (if deterministic labeling)
§ Wouldn't generalize at all
§ Just making the bag-of-words assumption gives us some generalization, but isn't enough

§ To generalize better: we need to smooth or regularize the estimates

Page 23:

Parameter Estimation

Page 24:

Parameter Estimation

§ Estimating the distribution of a random variable

§ Elicitation: ask a human (why is this hard?)

§ Empirically: use training data (learning!)
§ E.g.: for each outcome x, look at the empirical rate of that value: P_ML(x) = count(x) / total samples

§ This is the estimate that maximizes the likelihood of the data

[Figure: samples of red (r) and blue (b) outcomes, e.g. the sample r r b]
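For the small red/blue sample above, the relative-frequency (maximum likelihood) estimate is just count(x) / total samples:

from collections import Counter

samples = ["r", "r", "b"]            # the small red/blue sample from the figure above
counts = Counter(samples)

p_ml = {x: c / len(samples) for x, c in counts.items()}
print(p_ml)                          # {'r': 0.667, 'b': 0.333} (rounded): count(x) / total samples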

Page 25:

Implementation

§ Two lines of code in scikit-learn (homework 2)

from sklearn.naive_bayes import GaussianNB

gnb = GaussianNB()
y_pred = gnb.fit(train_data, train_label).predict(test_data)
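One side note, not part of the homework instructions: GaussianNB models each feature as class-conditionally Gaussian, whereas this lecture's model uses discrete tables; for binary on/off pixel features, scikit-learn's BernoulliNB is the closer match, and MultinomialNB is the usual choice for bag-of-words counts. For example:

from sklearn.naive_bayes import BernoulliNB

# Same fit/predict pattern as above, but with a model that matches the binary
# on/off pixel features described earlier in the lecture.
y_pred_bernoulli = BernoulliNB().fit(train_data, train_label).predict(test_data)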

Page 26:

Next Time: Smoothing