Machine Learning
§ Machine learning: how to acquire a model from data/experience
  § Learning parameters (e.g. probabilities)
  § Learning structure (e.g. BN graphs)
  § Learning hidden concepts (e.g. clustering, neural nets)
§ Today: model-based classification with Naive Bayes
Classification
Example: Spam Filter
§ Input: an email
§ Output: spam/ham
§ Setup:
  § Get a large collection of example emails, each labeled "spam" or "ham"
  § Note: someone has to hand-label all this data!
  § Want to learn to predict labels of new, future emails
§ Features: the attributes used to make the ham/spam decision
  § Words: FREE!
  § Text patterns: $dd, CAPS
  § Non-text: SenderInContacts, WidelyBroadcast
  § …
Dear Sir.

First, I must solicit your confidence in this transaction, this is by virture of its nature as being utterly confidencial and top secret. …

TO BE REMOVED FROM FUTURE MAILINGS, SIMPLY REPLY TO THIS MESSAGE AND PUT "REMOVE" IN THE SUBJECT.

99 MILLION EMAIL ADDRESSES FOR ONLY $99

Ok, I know this is blatantly OT but I'm beginning to go insane. Had an old Dell Dimension XPS sitting in the corner and decided to put it to use, I know it was working pre being stuck in the corner, but when I plugged it in, hit the power nothing happened.
Example: Digit Recognition
§ Input: images / pixel grids
§ Output: a digit 0-9
§ Setup:
  § Get a large collection of example images, each labeled with a digit
  § Note: someone has to hand-label all this data!
  § Want to learn to predict labels of new, future digit images
§ Features: the attributes used to make the digit decision
  § Pixels: (6,8) = ON
  § Shape patterns: NumComponents, AspectRatio, NumLoops
  § …
  § Features are increasingly induced rather than crafted
(Example digit images with labels: 0, 1, 2, 1, ??)
Other Classification Tasks
§ Classification: given inputs x, predict labels (classes) y
§ Examples:
  § Medical diagnosis (input: symptoms, classes: diseases)
  § Fraud detection (input: account activity, classes: fraud / no fraud)
  § Automatic essay grading (input: document, classes: grades)
  § Customer service email routing
  § Review sentiment
  § Language ID
  § … many more
§ Classification is an important commercial technology!
Model-Based Classification
§ Model-based approach
  § Build a model (e.g. Bayes' net) where both the output label and input features are random variables
  § Instantiate any observed features
  § Query for the distribution of the label conditioned on the features
§ Challenges
  § What structure should the BN have?
  § How should we learn its parameters?
Naïve Bayes for Digits
§ Naïve Bayes: assume all features are independent effects of the label
§ Simple digit recognition version:
  § One feature (variable) F_ij for each grid position <i,j>
  § Feature values are on/off, based on whether intensity is more or less than 0.5 in the underlying image
  § Each input maps to a feature vector of on/off values, one per grid position
  § Here: lots of features, each is binary valued
§ Naïve Bayes model:
  P(Y, F_0,0, …, F_15,15) = P(Y) ∏_{i,j} P(F_i,j | Y)
§ What do we need to learn?
(Model structure: label Y with feature children F1, F2, …, Fn)
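As a concrete illustration of the on/off feature mapping above, here is a minimal sketch in Python/NumPy, assuming images arrive as 2-D arrays of intensities in [0, 1] (the function name and threshold argument are illustrative, not from the slides):

import numpy as np

def pixel_features(image, threshold=0.5):
    # One binary feature F_ij per grid position <i,j>:
    # on (1) if intensity exceeds the threshold, off (0) otherwise.
    return (np.asarray(image) > threshold).astype(int).ravel()

# e.g. a 16x16 grid of intensities becomes a vector of 256 on/off features
features = pixel_features(np.random.rand(16, 16))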
General Naïve Bayes
§ A general Naive Bayes model:
  P(Y, F_1, …, F_n) = P(Y) ∏_i P(F_i | Y)
§ We only have to specify how each feature depends on the class
§ Total number of parameters is linear in n
§ Model is very simplistic, but often works anyway

(Model structure: Y with children F_1, F_2, …, F_n)
  § P(Y): |Y| parameters
  § P(F_i | Y) tables: n × |F| × |Y| parameters
  § compared to |Y| × |F|^n values in the full joint distribution
Inference for Naïve Bayes
§ Goal: compute posterior distribution over label variable Y
§ Step 1: get joint probability of label and evidence for each label:
  P(Y, f_1, …, f_n) = P(Y) ∏_i P(f_i | Y)
§ Step 2: sum to get probability of evidence:
  P(f_1, …, f_n) = Σ_y P(y, f_1, …, f_n)
§ Step 3: normalize by dividing Step 1 by Step 2:
  P(Y | f_1, …, f_n) = P(Y, f_1, …, f_n) / P(f_1, …, f_n)
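A minimal sketch of these three steps in Python; the function name and the toy numbers are illustrative, and likelihoods[y] is assumed to hold the values P(f_i | Y=y) for the observed feature values:

import math

def naive_bayes_posterior(prior, likelihoods):
    # Step 1: joint probability of label and evidence for each label y:
    # P(y, f_1..f_n) = P(y) * prod_i P(f_i | y)
    joint = {y: prior[y] * math.prod(likelihoods[y]) for y in prior}
    # Step 2: probability of evidence: P(f_1..f_n) = sum_y P(y, f_1..f_n)
    evidence = sum(joint.values())
    # Step 3: normalize the joint by the evidence
    return {y: joint[y] / evidence for y in joint}

# Toy example with two labels and three observed features (made-up numbers):
print(naive_bayes_posterior(
    prior={"spam": 0.33, "ham": 0.66},
    likelihoods={"spam": [0.8, 0.1, 0.4], "ham": [0.2, 0.5, 0.3]},
))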
General Naïve Bayes
§ What do we need in order to use Naïve Bayes?
§ Inference method (we just saw this part)
  § Start with a bunch of probabilities: P(Y) and the P(F_i | Y) tables
  § Use standard inference to compute P(Y | F_1 … F_n)
  § Nothing new here
§ Estimates of local conditional probability tables
  § P(Y), the prior over labels
  § P(F_i | Y) for each feature (evidence variable)
  § These probabilities are collectively called the parameters of the model and denoted by θ
  § Up until now, we assumed these appeared by magic, but…
  § …they typically come from training data counts: we'll look at this soon
Example: Conditional Probabilities
P(Y):
  Y:   1    2    3    4    5    6    7    8    9    0
  P:  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1  0.1

P(F = on | Y) for one example pixel:
  Y:   1    2    3    4    5    6    7    8    9    0
  P:  0.01 0.05 0.05 0.30 0.80 0.90 0.05 0.60 0.50 0.80

P(F = on | Y) for another example pixel:
  Y:   1    2    3    4    5    6    7    8    9    0
  P:  0.05 0.01 0.90 0.80 0.90 0.90 0.25 0.85 0.60 0.80
Naïve Bayes for Text
§ Bag-of-words Naïve Bayes:
  § Features: W_i is the word at position i
  § As before: predict label conditioned on feature variables (spam vs. ham)
  § As before: assume features are conditionally independent given label
  § New: each W_i is identically distributed
§ Generative model:
  P(Y, W_1, …, W_n) = P(Y) ∏_i P(W_i | Y)
§ "Tied" distributions and bag-of-words
  § Usually, each variable gets its own conditional probability distribution P(F | Y)
  § In a bag-of-words model:
    § Each position is identically distributed
    § All positions share the same conditional probs P(W | Y) (sketched below)
    § Why make this assumption?
  § Called "bag-of-words" because the model is insensitive to word order or reordering
§ Note: W_i is the word at position i, not the i-th word in the dictionary!
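A minimal sketch of scoring with tied distributions: a single shared table P(W|Y) per class, applied to every position. It assumes every word in the document appears in that table (no smoothing yet; that is next lecture), and all names and numbers are illustrative:

import math

def class_log_scores(words, prior, word_probs):
    # All positions share the same conditional table P(W|Y), so the
    # log-joint is log P(y) + sum over word tokens of log P(w | y).
    return {y: math.log(prior[y]) + sum(math.log(word_probs[y][w]) for w in words)
            for y in prior}

# Toy tied tables (made-up probabilities):
prior = {"ham": 0.66, "spam": 0.33}
word_probs = {"ham": {"the": 0.0156, "free": 0.0001},
              "spam": {"the": 0.0210, "free": 0.0090}}
print(class_log_scores(["free", "the", "free"], prior, word_probs))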
Example: Spam Filtering
§ Model:
  P(Y, W_1, …, W_n) = P(Y) ∏_i P(W_i | Y)
§ What are the parameters?

  P(Y):
    ham : 0.66
    spam: 0.33

  P(W | Y), one tied word table per class:

    the : 0.0156      the : 0.0210
    to  : 0.0153      to  : 0.0133
    and : 0.0115      of  : 0.0119
    of  : 0.0095      2002: 0.0110
    you : 0.0093      with: 0.0108
    a   : 0.0086      from: 0.0107
    with: 0.0080      and : 0.0105
    from: 0.0075      a   : 0.0100
    ...               ...

§ Where do these tables come from?
Training and Testing
Empirical Risk Minimization

§ Empirical risk minimization
  § Basic principle of machine learning
  § We want the model (classifier, etc.) that does best on the true test distribution
  § Don't know the true distribution, so pick the best model on our actual training set
  § Finding "the best" model on the training set is phrased as an optimization problem
§ Main worry: overfitting to the training set
  § Better with more training data (less sampling variance, training more like test)
  § Better if we limit the complexity of our hypotheses (regularization and/or small hypothesis spaces)
Important Concepts

§ Data: labeled instances (e.g. emails marked spam/ham)
  § Training set
  § Held-out set
  § Test set
§ Features: attribute-value pairs which characterize each x
§ Experimentation cycle
  § Learn parameters (e.g. model probabilities) on training set
  § (Tune hyperparameters on held-out set)
  § Compute accuracy on test set
  § Very important: never "peek" at the test set!
§ Evaluation (many metrics possible, e.g. accuracy)
  § Accuracy: fraction of instances predicted correctly
§ Overfitting and generalization
  § Want a classifier which does well on test data
  § Overfitting: fitting the training data very closely, but not generalizing well
  § We'll investigate overfitting and generalization formally in a few lectures
(Diagram: data split into Training Data, Held-Out Data, Test Data)
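One way to carve a labeled dataset into these three sets, sketched with scikit-learn's train_test_split; the split fractions and the dummy data are arbitrary choices for illustration:

import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.rand(100, 5)            # 100 dummy instances, 5 features each
y = np.random.randint(2, size=100)    # dummy binary labels

# 60% training, then split the remainder evenly into held-out and test
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4)
X_heldout, X_test, y_heldout, y_test = train_test_split(X_rest, y_rest, test_size=0.5)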
Generalization and Overfitting
Overfitting
Example: Overfitting

(Figure: classifier posteriors on a test digit image: "2 wins!!")
Example: Overfitting
§ Posteriors determined by relative probabilities (odds ratios):

  P(W | ham) / P(W | spam):      P(W | spam) / P(W | ham):
    south-west : inf               screens    : inf
    nation     : inf               minute     : inf
    morally    : inf               guaranteed : inf
    nicely     : inf               $205.00    : inf
    extent     : inf               delivery   : inf
    seriously  : inf               signature  : inf
    ...                            ...

What went wrong here?
Generalization and Overfitting
§ Relative frequency parameters will overfit the training data!
  § Just because we never saw a 3 with pixel (15,15) on during training doesn't mean we won't see it at test time
  § Unlikely that every occurrence of "minute" is 100% spam
  § Unlikely that every occurrence of "seriously" is 100% ham
  § What about all the words that don't occur in the training set at all?
  § In general, we can't go around giving unseen events zero probability (see the sketch below)
§ As an extreme case, imagine using the entire email as the only feature (e.g. document ID)
  § Would get the training data perfect (if deterministic labeling)
  § Wouldn't generalize at all
  § Just making the bag-of-words assumption gives us some generalization, but isn't enough
§ To generalize better: we need to smooth or regularize the estimates
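A tiny demonstration of the zero-count problem, with made-up counts: a word seen only in spam gets P(w | ham) = 0, so its odds ratio is infinite and a single token can swamp the whole posterior:

# Toy counts: "minute" appeared 3 times in spam, never in ham (made-up data)
spam_word_count, spam_total = 3, 3
ham_word_count, ham_total = 0, 2

p_w_spam = spam_word_count / spam_total   # 1.0
p_w_ham = ham_word_count / ham_total      # 0.0 -- unseen in the ham training data
odds = p_w_spam / p_w_ham if p_w_ham > 0 else float("inf")
print(odds)  # inf: relative-frequency estimates give unseen events zero probability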
Parameter Estimation
§ Estimating the distribution of a random variable
§ Elicitation: ask a human (why is this hard?)
§ Empirically: use training data (learning!)
  § E.g.: for each outcome x, look at the empirical rate of that value:
    P_ML(x) = count(x) / total samples
§ This is the estimate that maximizes the likelihood of the data
(Example: observed draws r, r, b give the empirical estimates P_ML(r) = 2/3, P_ML(b) = 1/3)
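A minimal sketch of this maximum-likelihood estimate from counts (the function name is illustrative):

from collections import Counter

def ml_estimate(samples):
    # Empirical rate of each outcome: P_ML(x) = count(x) / total samples
    counts = Counter(samples)
    return {x: c / len(samples) for x, c in counts.items()}

print(ml_estimate(["r", "r", "b"]))  # {'r': 0.666..., 'b': 0.333...}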
Implementation
§ Two lines of code in scikit-learn (homework 2):

from sklearn.naive_bayes import GaussianNB

gnb = GaussianNB()
y_pred = gnb.fit(train_data, train_label).predict(test_data)
Next Time: Smoothing