
Machine Learning: Step-by-Step Guide To Implement Machine Learning Algorithms with Python



Machine Learning

Step-by-Step Guide To Implement Machine Learning Algorithms with Python

Author

Rudolph Russell


© Copyright 2018 - All rights reserved.

If you would like to share this book with another person, please purchase an additional copy for each recipient. Thank you for respecting the hard work of this author. Otherwise, the transmission, duplication or reproduction of any of the following work, including specific information, will be considered an illegal act, irrespective of whether it is done electronically or in print. This extends to creating a secondary or tertiary copy of the work or a recorded copy, and is only allowed with express written consent from the Publisher. All additional rights reserved.


Table of Contents

CHAPTER 1: INTRODUCTION TO MACHINE LEARNING
  Theory
  What is machine learning?
  Why machine learning?
  When should you use machine learning?
  Types of Systems of Machine Learning
  Supervised and unsupervised learning
    Supervised Learning
    The most important supervised algorithms
    Unsupervised Learning
    The most important unsupervised algorithms
  Reinforcement Learning
  Batch Learning
  Online Learning
  Instance-based learning
  Model-based learning
  Bad and Insufficient Quantity of Training Data
  Poor-Quality Data
  Irrelevant Features
  Feature Engineering
  Testing
  Overfitting the Data
    Solutions
  Underfitting the Data
    Solutions
  EXERCISES
  SUMMARY
  REFERENCES

CHAPTER 2: CLASSIFICATION
  Installation
  The MNIST
  Measures of Performance
  Confusion Matrix
  Recall
  Recall Tradeoff
  ROC
  Multi-class Classification
  Training a Random Forest Classifier
  Error Analysis
  Multi-label Classifications
  Multi-output Classification
  EXERCISES
  REFERENCES

CHAPTER 3: HOW TO TRAIN A MODEL
  Linear Regression
  Computational Complexity
  Gradient Descent
  Batch Gradient Descent
  Stochastic Gradient Descent
  Mini-Batch Gradient Descent
  Polynomial Regression
  Learning Curves
  Regularized Linear Models
  Ridge Regression
  Lasso Regression
  EXERCISES
  SUMMARY
  REFERENCES

CHAPTER 4: DIFFERENT MODELS COMBINATIONS
  Implementing a simple majority classifier
  Combining different algorithms for classification with majority vote
  Questions


CHAPTER 1

INTRODUCTION TO MACHINE LEARNING


Theory

If I ask you about "machine learning," you'll probably imagine a robot or something like the Terminator. In reality, machine learning is involved not only in robotics, but also in many other applications. You can also think of something like a spam filter as one of the first applications of machine learning, one which helps improve the lives of millions of people. In this chapter, I'll introduce you to what machine learning is and how it works.


What is machine learning?

Machine learning is the practice of programming computers to learn from data. In the spam-filter example above, the program will easily be able to determine whether given emails are important or are "spam." In machine learning, the data is referred to as training sets or examples.


Why machine learning?

Let's assume that you'd like to write the spam-filter program without using machine learning methods. In this case, you would have to carry out the following steps:

- In the beginning, you'd take a look at what spam e-mails look like. You might select them by the words or phrases they use, like "debit card," "free," and so on, and also by patterns used in the sender's name or in the body of the email.
- Second, you'd write an algorithm to detect the patterns that you've seen, and then the software would flag emails as spam if a certain number of those patterns are detected.
- Finally, you'd test the program, and then redo the first two steps again until the results are good enough.

Because the program is not learning software, it contains a very long list of rules that are difficult to maintain. But if you developed the same software using ML, you'd be able to maintain it properly.


In addition, email senders can change their e-mail templates, so that a word like "4U" becomes "for you," once their emails have been determined to be spam. A program using traditional techniques would need to be updated, which means that, if there were any other changes, you would need to update your code again and again and again.

On the other hand, a program that uses ML techniques will automatically detect this change by spammers, and it will start to flag such emails without you manually telling it to.

Also, we can use machine learning to solve problems that are too complex for non-machine-learning software. For example, speech recognition: when you say "one" or "two," the program should be able to distinguish the difference. So, for this task, you'll need to develop an algorithm that measures sound.

In the end, machine learning will help us to learn, and machine-learning algorithms can help us see what we have learned.


When should you use machine learning?

- When you have a problem that requires many long lists of rules to find the solution. In this case, machine-learning techniques can simplify your code and improve performance.
- For very complex problems for which there is no solution with a traditional approach.
- For non-stable environments: machine-learning software can adapt to new data.


Types of Systems of Machine Learning

There are different types of machine-learning systems. We can divide them into categories, depending on whether:

- They have been trained with human supervision or not:
  - Supervised
  - Unsupervised
  - Semi-supervised
  - Reinforcement Learning
- They can learn incrementally or not.
- They work simply by comparing new data points to known data points, or instead detect patterns in the training data and build a model.


Supervised and unsupervised learning

We can classify machine-learning systems according to the type and amount of human supervision during training. There are four major categories, as we explained before:

- Supervised learning
- Unsupervised learning
- Semi-supervised learning
- Reinforcement learning

Supervised Learning

In this type of machine-learning system, the data that you feed into the algorithm includes the desired solutions, which are referred to as "labels."

- Supervised learning groups together tasks of classification. The spam-filter program above is a good example of this, because it's been trained with many emails together with their classes.

Another example is predicting a numeric value like the price of a flat, given a set of features (location, number of rooms, facilities) called predictors; this type of task is called regression.


You should keep in mind that some regression algorithms can be used for classification as well, and vice versa.

The most important supervised algorithms:

- K-nearest neighbors
- Linear regression
- Neural networks
- Support vector machines
- Logistic regression
- Decision trees and random forests
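To make the idea concrete, here is a minimal sketch of supervised learning with one of the algorithms listed above (k-nearest neighbors); the toy measurements and labels are made up purely for illustration.

```python
# A minimal sketch of supervised learning: the algorithm sees labeled
# examples (features X, labels y) and learns to predict labels for new,
# unseen points. The toy data below is made up for illustration.
from sklearn.neighbors import KNeighborsClassifier

# Training set: each row is one labeled example.
X = [[1.0], [1.2], [0.9], [5.0], [5.2], [4.8]]
y = ["small", "small", "small", "big", "big", "big"]

model = KNeighborsClassifier(n_neighbors=3)
model.fit(X, y)                       # learn from the labeled examples

print(model.predict([[1.1], [5.1]]))  # labels for new, unseen points
```

The model simply looks at the three nearest training examples of each new point and takes a majority vote among their labels.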

Unsupervised Learning

In this type of machine-learning system, you can guess that the data is unlabeled.

The most important unsupervised algorithms:

- Clustering: k-means, hierarchical cluster analysis
- Association rule learning: Eclat, Apriori
- Visualization and dimensionality reduction: kernel PCA, t-distributed stochastic neighbor embedding (t-SNE), PCA

As an example, suppose you've got a lot of data on the visitors of your website. You can use a clustering algorithm for detecting groups with similar visitors. It may find that 65% of your visitors are males who love watching movies in the evening, while 30% watch plays in the evening; in this case, the clustering algorithm can also divide every group into smaller sub-groups.

There are some very important algorithms, like visualization algorithms; these are unsupervised learning algorithms. You'll need to give them a lot of unlabeled data as input, and then you'll get a 2D or 3D visualization as output.

The goal here is to make the output as simple as possible without losing any of the information. To handle this problem, it will combine several related features into one feature: for example, combining a car's make with its model. This is called feature extraction.
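A small sketch of this idea, using PCA (one of the dimensionality-reduction algorithms listed above); the two correlated features here are made up, and PCA combines them into a single one.

```python
# Feature extraction by dimensionality reduction: two features that
# carry almost the same information are combined into one. The data
# below is synthetic, made up for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.RandomState(0)
base = rng.rand(100, 1)
# The second feature is almost a copy of the first, plus tiny noise:
X = np.hstack([base, 2 * base + 0.01 * rng.rand(100, 1)])

pca = PCA(n_components=1)
X_reduced = pca.fit_transform(X)   # the two features become one
print(X.shape, "->", X_reduced.shape)  # (100, 2) -> (100, 1)
```

Because the two columns are almost perfectly correlated, dropping down to one component loses very little information.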


Reinforcement Learning

Reinforcement learning is another type of machine-learning system. An agent (the "AI system") will observe the environment, perform given actions, and then receive rewards in return. With this type, the agent must learn by itself the best strategy to get the most reward over time. This strategy is called a policy.

You can find this type of learning in many robotics applications that learn how to walk.


Batch Learning

In this kind of machine-learning system, the system can't learn incrementally: it must be trained using all the available data. That means training requires a lot of resources and a huge amount of time, so it's always done offline. So, to work with this type of learning, the first thing to do is to train the system, and then launch it without any further learning.


Online Learning

This kind of learning is the opposite of batch learning. I mean that, here, the system can learn incrementally: you provide the system with the available data as instances (in groups or individually), and the system learns on the fly.

You can use this type of system for problems that involve a continuous flow of data and that need to adapt quickly to any changes. You can also use this type of system to work with very large datasets.

You should know how fast your system can adapt to any changes in the data; this is the "learning rate." If that speed is high, the system will learn quite quickly, but it will also forget old data quickly.
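Online learning can be sketched with scikit-learn's partial_fit API, which updates a model one mini-batch at a time; the simulated "stream" of mini-batches below is made-up data (the label is simply whether the first feature exceeds 0.5).

```python
# Incremental (online) learning: the model is updated batch by batch
# instead of seeing all the data at once. The data stream is synthetic.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)
model = SGDClassifier(random_state=42)
all_classes = np.array([0, 1])   # must be declared up front for partial_fit

for _ in range(10):              # ten incoming mini-batches
    X_batch = rng.rand(50, 2)
    y_batch = (X_batch[:, 0] > 0.5).astype(int)
    model.partial_fit(X_batch, y_batch, classes=all_classes)

# The model is usable after every batch; no full retraining is needed.
print(model.predict([[0.9, 0.5], [0.1, 0.5]]))
```

Contrast this with batch learning, where adding new data means retraining on the entire dataset from scratch.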


Instance-based learning

This is the simplest type of learning: the system learns the examples by heart, then generalizes to new cases by comparing them to the examples it has seen. Using this type of learning in our email program, it would flag all of the emails that are similar to emails that were flagged by users.


Model-based learning

There is another type of learning in which you build a model from the examples, and then use that model to make predictions.


Bad and Insufficient Quantity of Training Data

Machine-learning systems are not like children, who can distinguish apples and oranges in all sorts of colors and shapes; they require a lot of data to work effectively, whether you're working with very simple programs and problems or with complex applications like image processing and speech recognition. A good example of the unreasonable effectiveness of data is the Microsoft research project on a complex natural-language processing problem, where even simple models performed remarkably well once given enough data.


Poor-Quality Data

If you're working with training data that is full of errors and outliers, it will be very hard for the system to detect patterns, so it won't work properly. So, if you want your program to work well, you must spend more time cleaning up your training data.


Irrelevant Features

The system will only be able to learn if the training data contains enough relevant features and not too many irrelevant ones. The most important part of any ML project is to develop good features, by "feature engineering."

Feature Engineering

The process of feature engineering goes like this:

- Selection of features: selecting the most useful features.
- Extraction of features: combining existing features to produce more useful ones.
- Creation of new features: creating new features, based on new data.
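The first step above (selection of features) can be sketched with scikit-learn's SelectKBest; the digits dataset and the choice of k=20 here are assumptions made purely for illustration.

```python
# Feature selection: keep only the features that are most useful for
# predicting the label, and drop the rest. Dataset and k are arbitrary.
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, f_classif

X, y = load_digits(return_X_y=True)
selector = SelectKBest(f_classif, k=20)   # keep the 20 most useful of 64
X_new = selector.fit_transform(X, y)
print(X.shape, "->", X_new.shape)  # (1797, 64) -> (1797, 20)
```

A model trained on the reduced feature set is often simpler and less prone to picking up noise from irrelevant columns.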


Testing

If you'd like to make sure that your model is working well and can generalize to new cases, you can try out new cases with it by putting the model into production and then monitoring how it performs. This works, but if your model is inadequate, your users will complain.

Instead, you should divide your data into two sets, one set for training and a second one for testing, so that you can train your model using the first set and test it using the second. The generalization error is the error rate obtained by evaluating your model on the test set. This value tells you whether your model is good enough and will work properly.

If the error rate is low, the model is good and will perform properly. In contrast, if the rate is high, your model will perform badly and not work properly. My advice to you is to use 80% of the data for training and 20% for testing purposes, so that it's very simple to test or evaluate a model.
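The 80/20 split advice above can be sketched with scikit-learn's train_test_split helper; the bundled digits dataset stands in for your own data here.

```python
# An 80/20 train/test split: the model is trained on the first set and
# evaluated on the held-out second set.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.2, random_state=42)

print(len(X_train), len(X_test))  # about 80% and 20% of the 1797 samples
```

Fixing random_state makes the split reproducible, so your evaluation numbers don't change from run to run.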


Overfitting the Data

If you're in a foreign country and someone steals something of yours, you might say that everyone there is a thief. This is an overgeneralization, and, in machine learning, it is called "overfitting." Machines can do the same thing: they can perform well when they're working with the training data, but fail to generalize properly to new data. For example, in the following figure you'd find a high-degree life-satisfaction model that overfits the data, even though it works well on the training data.

When does this occur?

Overfitting occurs when the model is too complex for the amount of training data given.

Solutions

To solve the overfitting problem, you should do the following:

- Gather more training data.
- Reduce the noise level in the training data.
- Select a model with fewer parameters.
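A small numeric illustration of overfitting, using NumPy polynomial fits on made-up noisy data: the over-complex model matches the training data almost perfectly, which is exactly the trap described above.

```python
# Overfitting in miniature: a degree-9 polynomial has enough parameters
# to pass through all 10 noisy training points, while a straight line
# keeps the noise as residual error. The data is synthetic.
import numpy as np

rng = np.random.RandomState(0)
x_train = np.linspace(0, 1, 10)
y_train = x_train + rng.randn(10) * 0.1        # underlying truth: y = x

simple = np.poly1d(np.polyfit(x_train, y_train, 1))         # 2 parameters
complex_model = np.poly1d(np.polyfit(x_train, y_train, 9))  # 10 parameters

def train_error(model):
    return np.mean((model(x_train) - y_train) ** 2)

# The complex model's training error is near zero, but that says
# nothing about how it behaves on new data: between the training
# points it typically oscillates wildly, chasing the noise.
print(train_error(simple), train_error(complex_model))
```

This is why a low training error alone is never evidence of a good model; only the error on held-out data is.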


Underfitting the Data

As its name suggests, underfitting is the opposite of overfitting; you'll encounter it when the model is too simple to learn the data. For example, using the example of quality of life, real life is more complex than a simple model, so the predictions will be poor, even on the training examples.

Solutions

To fix this problem:

- Select a more powerful model, with more parameters.
- Feed better features to your algorithm. Here, I'm referring to feature engineering.
- Reduce the constraints on your model.


EXERCISES

In this chapter, we have covered many concepts of machine learning. The following chapters will be very practical and you'll write code, but you should answer the following questions just to make sure you're on the right track.

1. Define machine learning.
2. Describe the four types of machine-learning systems.
3. What is the difference between supervised and unsupervised learning?
4. Name the unsupervised tasks.
5. Why are testing and validation important?
6. In one sentence, describe what online learning is.
7. What is the difference between batch and online learning?
8. Which type of machine-learning system should you use to make a robot learn how to walk?


SUMMARY

In this chapter, you've learned many useful concepts, so let's review some that you may feel a bit lost with.

- Machine learning: ML refers to making machines get better at some task, using given data.
- Machine learning comes in many types, such as supervised, batch, unsupervised, and online learning.
- To perform an ML project, you need to gather data in a training set, and then feed that set to a learning algorithm to get an output: "predictions."
- If you want to get the right output, your system should use clean data, which is not too small and which does not have irrelevant features.


REFERENCES

http://static.googleusercontent.com/media/research.google.com/fr//pubs/archive/35179.pdf


CHAPTER 2

CLASSIFICATION


Installation

You'll need to install Python, Matplotlib, and Scikit-learn for this chapter. Just go to the references section and follow the steps indicated there.


The MNIST

In this chapter, you'll go deeper into classification systems and work with the MNIST dataset. This is a set of 70,000 images of digits handwritten by students and employees. Each image has a label with the digit it represents. This project is like the "Hello, world" example of traditional programming, so every beginner in machine learning should start with it to learn about classification algorithms. Scikit-Learn has many helper functions, including one for downloading the MNIST. Let's take a look at the code:

>>> from sklearn.datasets import fetch_mldata
>>> mn = fetch_mldata('MNIST original')
>>> mn
{'COL_NAMES': ['label', 'data'],
 'DESCR': 'mldata.org dataset: mnist-original',
 'data': array([[0, 0, 0, ..., 0, 0, 0],
        [0, 0, 0, ..., 0, 0, 0],
        [0, 0, 0, ..., 0, 0, 0],
        ...,
        [0, 0, 0, ..., 0, 0, 0],
        [0, 0, 0, ..., 0, 0, 0],
        [0, 0, 0, ..., 0, 0, 0]], dtype=uint8),
 'target': array([0., 0., 0., ..., 9., 9., 9.])}

- DESCR is a key that describes the dataset.
- The data key contains an array with one row per instance and one column for every feature.
- The target key contains an array with the labels.

Let's look at these arrays:

>>> X, y = mn["data"], mn["target"]
>>> X.shape
(70000, 784)
>>> y.shape
(70000,)

- 70000 means that there are 70,000 images, and every image has 784 features. Because every image is 28x28 pixels, you can imagine that every pixel is one feature.

Let's take a look at one digit from the dataset. You'll only need to grab an instance's feature vector, reshape it to a 28x28 array, and then display it using the imshow function:

%matplotlib inline
import matplotlib
import matplotlib.pyplot as plt

your_digit = X[36000]
your_image = your_digit.reshape(28, 28)

plt.imshow(your_image, cmap=matplotlib.cm.binary, interpolation="nearest")
plt.axis("off")
plt.show()

As you can see in the resulting image, it looks like the number five, and we can give it a label that tells us it's five.

In the following figure, you can see more complex classification tasks from the MNIST dataset.


Also, you should create a test set, and set it aside before your data is inspected.

The MNIST dataset is already divided into two sets, one for training and one for testing:

x_tr, x_tes, y_tr, y_tes = X[:60000], X[60000:], y[:60000], y[60000:]

Let's also shuffle the training set, so that the cross-validation folds will all be similar (without any missing digits):

import numpy as np

myData = np.random.permutation(60000)
x_tr, y_tr = x_tr[myData], y_tr[myData]

Now, to keep things simple enough, we'll try to identify just one digit, e.g. the number 6. This "6-detector" will be an example of a binary classifier that distinguishes between 6 and not-6, so we'll create the target vectors for this task:

y_tr_6 = (y_tr == 6)     # true for 6s, and false for any other number
y_tes_6 = (y_tes == 6)

After that, we can choose a classifier and train it. Begin with the SGD (Stochastic Gradient Descent) classifier.

The Scikit-Learn SGDClassifier class has the advantage of handling very large datasets efficiently, because it deals with training instances separately, as follows.


from sklearn.linear_model import SGDClassifier

mycl = SGDClassifier(random_state=42)
mycl.fit(x_tr, y_tr_6)

To use it to detect 6s:

>>> mycl.predict([any_digit])


Measures of Performance

Evaluating a classifier is more difficult than evaluating a regressor, so let's explain how to evaluate one.

In this example, we'll use cross-validation to evaluate our model.

from sklearn.model_selection import StratifiedKFold
from sklearn.base import clone

sf = StratifiedKFold(n_splits=2, random_state=40)

for train_index, test_index in sf.split(x_tr, y_tr_6):
    cl = clone(mycl)
    x_tr_fd = x_tr[train_index]
    y_tr_fd = y_tr_6[train_index]
    x_tes_fd = x_tr[test_index]
    y_tes_fd = y_tr_6[test_index]
    cl.fit(x_tr_fd, y_tr_fd)
    y_p = cl.predict(x_tes_fd)
    n_correct = sum(y_p == y_tes_fd)
    print(n_correct / len(y_p))

- We use the StratifiedKFold class to perform stratified sampling that produces folds containing a representative ratio of every class. On each iteration, the code creates a clone of the classifier, trains it on the training folds, and makes predictions on the test fold. Finally, it counts the number of correct predictions and prints their ratio.

- Now we'll use the cross_val_score function to evaluate the SGDClassifier by K-fold cross-validation. Here, k-fold cross-validation will divide the training set into 3 folds, and then make predictions and evaluate them on each fold:

from sklearn.model_selection import cross_val_score

cross_val_score(mycl, x_tr, y_tr_6, cv=3, scoring="accuracy")

You'll get the accuracy, the ratio of correct predictions, on all folds.


Let’sclassifyeveryclassifierateverysingleimageinthenot-6

fromsklearn.baseimportBaseEstimator

classnever6Classifier(BaseEstimator):

deffit(self,X,y=None):

pass

defpredict(self,x):

returnnp.zeros((len(X),1),dtype=bool)

Let’sexaminetheaccuracyofthismodelwiththefollowingcode:

>>>never_6_cl=Never6Classifier()

>>>cross_val_score(never_6_cl,x_tr,y_tr_6,cv=3,scoring=“accuracy”)

Output:array([“num”,“num”,“num”])

Fortheoutput,you'llgetnolessthan90%:only10%oftheimagesare6s,sowecanalwaysimaginethatanimageisnota6.We'llberightabout90%ofthetime.

Bearinmindthataccuracyisnotthebestperformancemeasureforclassifiers,ifyou'reworkingwithskeweddatasets.


Confusion Matrix

There is a better method to evaluate the performance of your classifier: the confusion matrix.

It's easy to measure performance with the confusion matrix, just by counting the number of times instances of class X are classified as class Y. For example, to get the number of times 6s were classified as 2s, you would look in the 6th row and 2nd column of the confusion matrix.

Let's calculate the predictions for the confusion matrix using the cross_val_predict() function:

from sklearn.model_selection import cross_val_predict

y_tr_pre = cross_val_predict(mycl, x_tr, y_tr_6, cv=3)

This function, like the cross_val_score() function, performs k-fold cross-validation, but instead of scores it returns the predictions made on each test fold. So it returns a clean prediction for every instance in your training set.

Now we're ready to get the matrix, using the following code:

from sklearn.metrics import confusion_matrix

confusion_matrix(y_tr_6, y_tr_pre)

You'll get an array of 4 values, the "numbers" of the matrix.

Every row represents an actual class in the matrix, and every column represents a predicted class.

The first row is the negative one: it contains the non-6 images. You can learn a lot from the matrix.

There is also an interesting metric to work with if you'd like to get the accuracy of the positive predictions: the precision of the classifier, given by this equation:

Precision = TP / (TP + FP)

TP: number of true positives
FP: number of false positives


Recall = TP / (TP + FN), also called "sensitivity": it measures the ratio of positive instances that are correctly detected. FN is the number of false negatives.


Recall

>>> from sklearn.metrics import precision_score, recall_score
>>> precision_score(y_tr_6, y_tr_pre)
>>> recall_score(y_tr_6, y_tr_pre)

It's very common to combine precision and recall into just one metric, the F1 score.

F1 is the harmonic mean of precision and recall. We can calculate the F1 score with the following equation:

F1 = 2 / ((1/precision) + (1/recall)) = 2 * (precision * recall) / (precision + recall) = TP / (TP + (FN + FP)/2)

To calculate the F1 score, simply use the following function:

>>> from sklearn.metrics import f1_score
>>> f1_score(y_tr_6, y_tr_pre)
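The F1 equation above can be checked numerically; the counts of true positives, false positives, and false negatives below are made up, and both forms of the formula give the same value.

```python
# Verifying that the harmonic-mean form and the TP/FP/FN form of the
# F1 score agree. The counts are invented for illustration.
tp, fp, fn = 80, 20, 40

precision = tp / (tp + fp)                  # 0.8
recall = tp / (tp + fn)                     # 2/3
f1_harmonic = 2 * precision * recall / (precision + recall)
f1_counts = tp / (tp + (fn + fp) / 2)

print(round(f1_harmonic, 4), round(f1_counts, 4))  # 0.7273 0.7273
```

Because F1 is a harmonic mean, it stays high only when precision and recall are both high; one low value drags the whole score down.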


Recall Tradeoff

To understand this tradeoff, you should take a look at how the SGDClassifier makes its classification decisions. For each instance, it calculates a score based on its decision function, and then it compares that score with a threshold. If the score is greater than the threshold, it assigns the instance to the positive class; otherwise, to the negative class.

For example, if the decision threshold is at the center, you might find 4 true positives to the right of the threshold and only one false positive, so the precision would be 80%.

In Scikit-Learn, you can't set the threshold directly. You'll need to access the decision scores, which it uses to make predictions, by calling the decision_function() method:

>>> y_sco = mycl.decision_function([any_digit])
>>> y_sco
>>> threshold = 0
>>> y_any_digit_pre = (y_sco > threshold)

In this code, the SGDClassifier uses a threshold of 0, so it returns the same result as the predict() function.

>>> threshold = 20000
>>> y_any_digit_pre = (y_sco > threshold)
>>> y_any_digit_pre


This code confirms that, when the threshold increases, the recall decreases. To choose a good threshold, first get the decision scores of all instances:

y_sco = cross_val_predict(mycl, x_tr, y_tr_6, cv=3, method="decision_function")

It's time to calculate the precision and recall for all possible thresholds, by calling the precision_recall_curve() function:

from sklearn.metrics import precision_recall_curve

precisions, recalls, thresholds = precision_recall_curve(y_tr_6, y_sco)

And now let's plot the precision and the recall using Matplotlib:

def plot_pre_re(pre, re, thr):
    plt.plot(thr, pre[:-1], "b--", label="Precision")
    plt.plot(thr, re[:-1], "g-", label="Recall")
    plt.xlabel("Threshold")
    plt.legend(loc="center left")
    plt.ylim([0, 1])

plot_pre_re(precisions, recalls, thresholds)
plt.show()


ROC

ROC stands for receiver operating characteristic, and it's a tool used with binary classifiers.

This tool is similar to the precision/recall curve, but it doesn't plot precision versus recall: it plots the true positive rate (TPR, another name for recall) against the false positive rate (FPR). The FPR is the ratio of negative instances that are incorrectly classified as positive; it is equal to 1 minus the true negative rate (TNR). The TNR is also called specificity, so FPR = 1 - specificity.
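These rates can be checked with made-up counts: among 100 actual negatives, suppose 10 are wrongly flagged as positive (false positives) and 90 are correctly rejected (true negatives).

```python
# Checking FPR = FP / (FP + TN), specificity (TNR) = TN / (TN + FP),
# and the identity FPR = 1 - specificity. The counts are invented.
tn, fp = 90, 10

fpr = fp / (fp + tn)
specificity = tn / (tn + fp)

print(fpr, specificity)  # 0.1 0.9
```

So a classifier that flags 10% of true negatives as positive has 90% specificity, and vice versa: the two numbers always sum to 1.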

Let’splaywiththeROCCurve.First,we'llneedtocalculatetheTPRandtheFPR,justbycallingtheroc-curve()function,

fromsklearn.metricsimportroc_curve

fp,tp,thers=roc_curve(y_tr_6,y_sco)

Afterthat,you'llplottheFPRandTPRwithMatplotlibaccordingtothefollowinginstructions.

def_roc_plot(fp,tp,label=none):

plt.plot(fp,tp,linewidth=2,label=label)

plt.plot([0,1)],[0,1],“k--”)

plt.axis([0,1,0,1])

plt.xlabel(‘Thisisthefalserate’)

plt.ylabel(‘Thisisthetruerate’)

roc_plot(fp,tp)

plt.show


Multi-class Classification

We use binary classifiers to distinguish between any two classes, but what if you'd like to distinguish between more than two?

You can use algorithms like random forest classifiers or naive Bayes classifiers, which can handle more than two classes directly. On the other hand, SVMs (Support Vector Machines) and linear classifiers function as binary classifiers.

If you'd like to develop a system that classifies images of digits into the 10 classes (from 0 to 9), you'll need to train 10 binary classifiers, one for every class (such as a 4-detector, a 5-detector, a 6-detector, and so on), and then you'll need to get the DS, the "decision score," of every classifier for the image. Then you'll choose the classifier with the highest score. We call this the OvA strategy: "one-versus-all."

The other method is to train a binary classifier for each pair of digits; for example, one for 5s and 6s and another one for 5s and 7s. We call this method OvO, "one-versus-one." To count how many classifiers you'll need, based on the number of classes N, use the following equation: N * (N - 1) / 2. If you'd like to use this technique with the MNIST, 10 * (10 - 1) / 2 = 45 binary classifiers.
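The OvO count from the equation above can be computed directly; the helper function name below is mine, for illustration.

```python
# One binary classifier per unordered pair of classes: N * (N - 1) / 2.
def ovo_classifier_count(n_classes):
    return n_classes * (n_classes - 1) // 2

print(ovo_classifier_count(10))  # 45 binary classifiers for 10 digit classes
```

OvO trains many more classifiers than OvA, but each one is trained on only the two classes it separates, so each training set is much smaller.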

In Scikit-Learn, OvA is executed automatically when you use a binary classification algorithm for a multi-class task:

>>> mycl.fit(x_tr, y_tr)
>>> mycl.predict([any_digit])

Additionally, you can call decision_function() to return the scores: 10 scores, one per class:

>>> any_digit_scores = mycl.decision_function([any_digit])
>>> any_digit_scores
array([["num", "num", "num", "num", "num", "num", "num", "num", "num", "num"]])


Training a Random Forest Classifier

>>> forest_cl.fit(x_tr, y_tr)
>>> forest_cl.predict([any_digit])
array([num])

As you can see, training a random forest classifier with only two lines of code is very easy.

Scikit-Learn didn't need to execute any OvA or OvO functions here, because this kind of algorithm (random forest classifiers) can work with multiple classes automatically. If you'd like to take a look at the probabilities the classifier assigned to each class, you can call the predict_proba() function:

>>> forest_cl.predict_proba([any_digit])
array([[0.1, 0, 0, 0.1, 0, 0.8, 0, 0, 0, 0]])

The classifier is quite confident in its prediction, as you can see in the output: there is 0.8 at index number 5.

Let's evaluate the classifier using the cross_val_score() function:

>>> cross_val_score(mycl, x_tr, y_tr, cv=3, scoring="accuracy")
array([0.84463177, 0.859668, 0.8662669])

You'll get more than 84% on the folds. A purely random classifier would, in this case, give you only 10% accuracy. Keep in mind that the higher this value is, the better.


Error Analysis

First of all, recall the steps of developing a machine-learning project:

1. Determine the problem;
2. Collect your data;
3. Work on your data and explore it;
4. Clean the data;
5. Work with several models and choose the best one;
6. Combine your models into the solution;
7. Present your solution;
8. Execute and test your system.

Here, we'll work on the error-analysis part. First, make predictions with the cross_val_predict() function, and then call the confusion_matrix function:

>>> y_tr_pre = cross_val_predict(mycl, x_tr_scaled, y_tr, cv=3)
>>> cn_mx = confusion_matrix(y_tr, y_tr_pre)
>>> cn_mx

array([[5625, 2, 25, 8, 11, 44, 52, 12, 34, 6],
       [2, 2415, 41, 22, 8, 45, 10, 10, 9],
       [52, 43, 7443, 104, 89, 26, 87, 60, 166, 13],
       [47, 46, 141, 5342, 1, 231, 40, 50, 141, 92],
       [19, 29, 41, 10, 5366, 9, 56, 37, 86, 189],
       [73, 45, 36, 193, 64, 4582, 111, 30, 193, 94],
       [29, 34, 44, 2, 42, 85, 5627, 10, 45, 0],
       [25, 24, 74, 32, 54, 12, 6, 5787, 15, 236],
       [52, 161, 73, 156, 10, 163, 61, 25, 5027, 123],
       [50, 24, 32, 81, 170, 38, 5, 433, 80, 4250]])


Now visualize the confusion matrix:

plt.matshow(cn_mx, cmap=plt.cm.gray)
plt.show()

Next, you should divide every value in the matrix by the number of images in the corresponding class, so that you can compare error rates:

rw_sm = cn_mx.sum(axis=1, keepdims=True)
nm_cn_mx = cn_mx / rw_sm

The next step is to fill the diagonal with zeros, so that only the errors remain:

np.fill_diagonal(nm_cn_mx, 0)
plt.matshow(nm_cn_mx, cmap=plt.cm.gray)
plt.show()


The errors are easy to spot in the resulting plot. One thing to keep in mind is that the rows represent actual classes and the columns represent the predicted classes.


Multi-label Classifications

In the above examples, every instance was assigned to just one class. But what if we want to assign instances to multiple classes? Face recognition, for example: suppose that you'd like to find more than one face in the same photo. There will be one label for each face. Let's practice with a simple example:

from sklearn.neighbors import KNeighborsClassifier

y_tr_big = (y_tr >= 7)
y_tr_odd = (y_tr % 2 == 1)
y_multi = np.c_[y_tr_big, y_tr_odd]

kng_cl = KNeighborsClassifier()
kng_cl.fit(x_tr, y_multi)

In these instructions, we have created a y_multi array that contains two labels for every image.

The first one contains information on whether the digit is "big" (7, 8, or 9), and the second one checks whether it's odd or not.

Next, we'll make a prediction using the following instructions:

>>> kng_cl.predict([any_digit])
array([[False, True]], dtype=bool)

False here means that the digit is not big, and True that it's odd.


Multi-output Classification

At this point, we can cover the final type of classification task: multi-output classification.

It's just a generalization of multi-label classification, where every label is multiclass. In other words, each label can have more than two possible values.

Let's make this clear with an example, using the MNIST images and adding some noise to each image with NumPy functions. The input is a noisy image, and the target is the clean image, so each output label is one pixel value:

import numpy.random as rnd

no_tr = rnd.randint(0, 101, (len(x_tr), 784))
no_tes = rnd.randint(0, 101, (len(x_tes), 784))

x_tr_mo = x_tr + no_tr
x_tes_mo = x_tes + no_tes

y_tr_mo = x_tr
y_tes_mo = x_tes

kng_cl.fit(x_tr_mo, y_tr_mo)
cl_digit = kng_cl.predict([x_tes_mo[any_index]])
plot_digit(cl_digit)


EXERCISES

1. Construct a classifier for the MNIST dataset. Try to get more than 96% accuracy on your test set.

2. Write a method to shift an image from the MNIST dataset (right or left) by 2 pixels.

3. Develop your own anti-spam program or classifier.

- Download examples of spam from Google.

- Extract the dataset.

- Divide the dataset into a training set and a test set.

- Write a program to convert every email to a feature vector.

- Play with the classifiers, and try to construct the best one possible, with high values for recall and precision.


SUMMARY

In this chapter, you've learned useful new concepts and implemented many types of classification algorithms. You've also worked with new concepts, like:

- ROC: the receiver operating characteristic, a tool used with binary classifiers.

- Error analysis: optimizing your algorithms.

- How to train a random forest classifier using Scikit-Learn.

- Understanding multi-output classification.

- Understanding multi-label classification.


REFERENCES

http://scikit-learn.org/stable/install.html

https://www.python.org

https://matplotlib.org/2.1.0/users/installing.html

http://yann.lecun.com/exdb/mnist/


CHAPTER3

HOW TO TRAIN A MODEL

After working with many machine learning models and training algorithms, which can seem like unfathomable black boxes, we were able to optimize a regression system, and we have also worked with image classifiers. But we developed these systems without understanding what's inside or how they work, so now we need to go deeper, so that we can grasp how they work and understand the details of implementation.

Gaining a deep understanding of these details will help you choose the right model and the best training algorithm. It will also help you with debugging and error analysis.

In this chapter, we'll work with polynomial regression, a more complex model that works for nonlinear datasets. In addition, we'll work with several regularization techniques that reduce the risk of overfitting during training.


Linear Regression

As an example, we'll take l_S = θ0 + θ1 × GDP_per_cap. This is a simple model: a linear function of the input feature "GDP_per_cap". θ0 and θ1 are the parameters of the model.

In general, you'll use a linear model to make a prediction by calculating a weighted sum of the input features, plus a constant "bias," as you can see in the following equation.

ŷ = θ0 + θ1·x1 + θ2·x2 + … + θn·xn

- ŷ is the predicted value.

- n is the number of features.

- xi is the value of the i-th feature.

- θj is the j-th model parameter.

Also, we can write the equation in vectorized form, as ŷ = θᵀ·x. The value of θ that minimizes the cost can be computed directly with the Normal Equation, θ̂ = (Xᵀ·X)⁻¹·Xᵀ·y, where:

- θ̂ is the value of θ that minimizes the cost.

- y is the vector of target values, containing y(1) to y(m).

Let's write some code to practice.

import numpy as np

V1_x = 2 * np.random.rand(100, 1)

V2_y = 4 + 3 * V1_x + np.random.randn(100, 1)


After that, we'll calculate the Θ value using our equation. It's time to use the inv() function from the linear algebra module of NumPy (np.linalg) to calculate the inverse of a matrix, and also the dot() function to multiply matrices.

Value1 = np.c_[np.ones((100, 1)), V1_x]

myTheta = np.linalg.inv(Value1.T.dot(Value1)).dot(Value1.T).dot(V2_y)

>>> myTheta

array([[num], [num]])

We used the equation y = 4 + 3x + Gaussian noise to generate our data.

Nowlet’smakeourpredictions.

>>> V1_new = np.array([[0], [2]])

>>> V1_new_2 = np.c_[np.ones((2, 1)), V1_new]

>>> V2_predict = V1_new_2.dot(myTheta)

>>> V2_predict

array([[4.219424], [9.74422282]])

Now, it's time to plot the model.


plt.plot(V1_new, V2_predict, "r-")

plt.plot(V1_x, V2_y, "b.")

plt.axis([0, 2, 0, 15])

plt.show()
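The same fit can be reproduced with scikit-learn's LinearRegression class, which solves the same least-squares problem in one call. This sketch regenerates its own data (the seed and names are illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.RandomState(42)
V1_x = 2 * rng.rand(100, 1)
V2_y = 4 + 3 * V1_x + rng.randn(100, 1)   # y = 4 + 3x + Gaussian noise

ln_r = LinearRegression()
ln_r.fit(V1_x, V2_y)

# The learned parameters should land close to the true values 4 and 3.
print(ln_r.intercept_, ln_r.coef_)
print(ln_r.predict(np.array([[0.0], [2.0]])))
```

LinearRegression keeps the bias term (the intercept) separate, so there is no need to add a column of ones by hand.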


Computational Complexity

With the Normal Equation, we compute the inverse of Xᵀ·X, which is an n × n matrix (n = the number of features). The complexity of this inversion is roughly O(n^2.5) to O(n^3.2), depending on the implementation. In fact, if you double the number of features, you multiply the computation time by a factor between 2^2.5 and 2^3.2.

The great news here is that the equation is linear with regard to the number of training instances. This means it can easily handle huge training sets, as long as they fit in memory.

After training your model, predictions will not be slow: the computational complexity is linear, thanks to the linear model. It's time to go deeper into methods of training a linear regression model that suit cases with a large number of features, or with too many instances to fit in memory.


Gradient Descent

This is a general optimization algorithm, capable of finding optimal solutions to a wide range of problems. The idea of this algorithm is to tweak the parameters in an iterative way to minimize the cost function.

The gradient descent algorithm calculates the gradient of the error with regard to the parameter vector theta, and it moves in the direction of the descending gradient. When the gradient is equal to zero, you've reached a minimum.

Also, you should keep in mind that the size of the steps is very important for this algorithm, because if it's very small — meaning the learning rate is low — it will take a long time to cover everything that it needs to.


But when the learning rate is high, the algorithm takes bigger steps and may jump right across the valley, so it can fail to converge to an optimal solution.

In the end, you won't always find that all cost functions are nice and regular; you'll also find irregular functions that make reaching an optimal solution very difficult. This problem occurs when the local minimum and the global minimum look as they do in the following figure.


If you pick any two points on the curve of a convex cost function, you'll find that the line segment joining them never crosses below the curve — there are no local minima. The cost function of linear regression does look like a bowl, but the bowl is elongated if the features have very different scales, as in the following image.


Batch Gradient Descent

If you'd like to implement this algorithm, you should first calculate the gradient of your cost function with regard to each theta parameter: when the value of a parameter theta_j changes a little, you need to know the resulting rate of change of your cost function. We call this change a partial derivative.

We can calculate the partial derivative using the following equation:

∂MSE(θ)/∂θj = (2/m) · Σi (θᵀ·x(i) − y(i)) · xj(i)

But we'll also use the following equation to calculate all the partial derivatives — the gradient vector — at once:

∇θ MSE(θ) = (2/m) · Xᵀ·(X·θ − y)

Let's implement the algorithm.

Lr = 0.1  # Lr for learning rate

Num_it = 1000  # number of iterations

L = 100  # number of training instances

myTheta = np.random.randn(2, 1)  # random initialization

for it in range(Num_it):

    gr = 2/L * Value1.T.dot(Value1.dot(myTheta) - V2_y)

    myTheta = myTheta - Lr * gr

>>> myTheta

array([[num], [num]])

If you try to change the learning rate value, you'll get different shapes, as in the following figure.


Stochastic Gradient Descent

You'll find a problem when you're using batch gradient descent: it needs to use the whole training set in order to calculate the gradient at each step, and that hurts performance ("speed").

When using stochastic gradient descent instead, the algorithm randomly chooses one instance from your training set at each step, and then calculates the gradient from it. In this way, the algorithm is faster than batch gradient descent, since it doesn't need to use the whole set at each step. On the other hand, because of the randomness of this method, its path is irregular when compared to the batch algorithm.

Let's implement the algorithm.

Nums = 50  # number of epochs

L1, L2 = 5, 50  # learning schedule hyperparameters

def lr_sc(s):

    return L1 / (s + L2)

myTheta = np.random.randn(2, 1)

for Num in range(Nums):

    for i in range(L):

        myIndex = np.random.randint(L)

        V1_xi = Value1[myIndex:myIndex+1]

        V2_yi = V2_y[myIndex:myIndex+1]

        gr = 2 * V1_xi.T.dot(V1_xi.dot(myTheta) - V2_yi)

        Lr = lr_sc(Num * L + i)

        myTheta = myTheta - Lr * gr

>>> myTheta

array([[num], [num]])
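scikit-learn also ships a ready-made stochastic gradient descent regressor, SGDRegressor, so you rarely need to hand-roll the loop above. A minimal sketch with its own generated data (the hyperparameter values are illustrative choices):

```python
import numpy as np
from sklearn.linear_model import SGDRegressor

rng = np.random.RandomState(42)
V1_x = 2 * rng.rand(100, 1)
V2_y = (4 + 3 * V1_x + rng.randn(100, 1)).ravel()

# Up to 50 epochs; the learning-rate schedule is handled internally.
sgd = SGDRegressor(max_iter=50, eta0=0.1, random_state=42)
sgd.fit(V1_x, V2_y)

print(sgd.intercept_, sgd.coef_)   # both should land near 4 and 3
```

The default l2 penalty (alpha=0.0001) is tiny here, so the result is essentially the unregularized fit.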


Mini-Batch Gradient Descent

Because you already know the batch and the stochastic algorithms, this kind of algorithm is very easy to understand and work with. As you know, those two algorithms calculate the gradients based on the whole training set or on just one instance, respectively. The mini-batch variant instead calculates its gradients on small, random subsets of the training set, and it often performs better than the other two algorithms.
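A minimal mini-batch version of the earlier gradient descent loop might look like the following sketch; the batch size, learning rate, and data generation are illustrative choices, not values prescribed by the book:

```python
import numpy as np

rng = np.random.RandomState(42)
m = 100
V1_x = 2 * rng.rand(m, 1)
V2_y = 4 + 3 * V1_x + rng.randn(m, 1)
Value1 = np.c_[np.ones((m, 1)), V1_x]   # add the bias column

n_epochs, batch_size, lr = 50, 20, 0.05
theta = rng.randn(2, 1)                 # random initialization

for epoch in range(n_epochs):
    indices = rng.permutation(m)        # shuffle once per epoch
    for start in range(0, m, batch_size):
        batch = indices[start:start + batch_size]
        xi, yi = Value1[batch], V2_y[batch]
        # gradient of the MSE computed on the mini-batch only
        gradient = 2 / batch_size * xi.T.dot(xi.dot(theta) - yi)
        theta = theta - lr * gradient

print(theta.ravel())   # should be close to the true parameters (4, 3)
```

Each update uses 20 instances instead of all 100 (batch) or just 1 (stochastic), which smooths the path toward the minimum while staying cheap per step.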


Polynomial Regression

We'll use this technique when working with more complex data, especially nonlinear data. After we've added the powers of every feature as new features, we can train the model on this extended feature set. This is known as polynomial regression.

Now, let's write some code.

L = 100

V1 = 6 * np.random.rand(L, 1) - 3

V2 = 0.5 * V1**2 + V1 + 2 + np.random.randn(L, 1)

As you can see, a straight line will never represent this data in an efficient way. So we'll use the polynomial method to work on this problem.

>>> from sklearn.preprocessing import PolynomialFeatures

>>> P_F = PolynomialFeatures(degree=2, include_bias=False)

>>> V1_P = P_F.fit_transform(V1)


>>> V1[0]

array([num])

>>> V1_P[0]

Now, let's fit this model properly to our data, replacing the straight line.

>>> ln_r = LinearRegression()

>>> ln_r.fit(V1_P, V2)

>>> ln_r.intercept_, ln_r.coef_
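Putting the whole polynomial regression pipeline together in one runnable sketch (the data is regenerated here; the true curve is y = 0.5x² + x + 2, so we can check the recovered parameters):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.RandomState(42)
V1 = 6 * rng.rand(100, 1) - 3
V2 = 0.5 * V1**2 + V1 + 2 + rng.randn(100, 1)

P_F = PolynomialFeatures(degree=2, include_bias=False)
V1_P = P_F.fit_transform(V1)     # columns: [x, x**2]

ln_r = LinearRegression()
ln_r.fit(V1_P, V2)

# Expect an intercept near 2 and coefficients near [1, 0.5]
print(ln_r.intercept_, ln_r.coef_)
```

The linear model trained on the extended features [x, x²] recovers the quadratic relationship that a plain straight line cannot.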


Learning Curves

Assume that you're working with polynomial regression, and you want it to fit the data better than the linear model does. In the following image, you'll find a 300-degree model; we can also compare the final result with the other kind of regression, the normal linear one.

In the above figure, you can see that the polynomial overfits the data, while the linear model obviously underfits it.
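One standard way to see under- or overfitting directly is to plot learning curves: train the model on progressively larger subsets of the training data and track the error on both the training subset and a held-out validation set. The helper below is a sketch (function and variable names are illustrative); an underfitting model shows both curves plateauing at a relatively high error:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

def learning_curve_errors(model, X, y):
    """Train on the first m samples for growing m; record both errors."""
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2,
                                              random_state=0)
    train_err, val_err = [], []
    for m in range(2, len(X_tr)):
        model.fit(X_tr[:m], y_tr[:m])
        train_err.append(mean_squared_error(y_tr[:m],
                                            model.predict(X_tr[:m])))
        val_err.append(mean_squared_error(y_va, model.predict(X_va)))
    return train_err, val_err

rng = np.random.RandomState(42)
X = 6 * rng.rand(100, 1) - 3
y = 0.5 * X**2 + X + 2 + rng.randn(100, 1)   # quadratic ground truth

# A straight line on quadratic data underfits: both errors stay high.
train_err, val_err = learning_curve_errors(LinearRegression(), X, y)
print(train_err[-1], val_err[-1])
```

Plotting train_err and val_err against m gives the learning curves; for an overfitting model the gap between the two curves stays wide instead.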


Regularized Linear Models

We saw, in the first and second chapters, how to reduce overfitting by regularizing the model a little. For example, if you'd like to regularize a polynomial model, you can simply decrease the number of degrees.

Ridge Regression

Ridge regression is a regularized version of linear regression: a regularization term is added to the cost function. This forces the algorithm to fit the data while also keeping the model weights as small as possible. Here is the cost function of ridge regression:

J(θ) = MSE(θ) + α · (1/2) · Σj θj²

As an example of ridge regression, just take a look at the following figures.

Lasso Regression


"Lasso" regression stands for "Least Absolute Shrinkage and Selection Operator" regression. This is another regularized version of linear regression.

It looks like ridge regression, but with a small difference in the equation, as in the following figures.

The cost function of lasso regression:

J(θ) = MSE(θ) + α · Σj |θj|

As you can see in the following figure, lasso regression uses smaller weight values than ridge.
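Both regularized models are available directly in scikit-learn; the short sketch below (the alpha values and data are illustrative) also lets you check the claim that lasso shrinks the weight more aggressively than ridge at these settings:

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.RandomState(42)
X = 2 * rng.rand(50, 1)
y = (4 + 3 * X + rng.randn(50, 1)).ravel()   # true slope: 3

ridge = Ridge(alpha=1.0)   # penalizes the sum of squared weights
ridge.fit(X, y)

lasso = Lasso(alpha=0.1)   # penalizes the sum of absolute weights
lasso.fit(X, y)

# Both slopes are shrunk below the least-squares fit; lasso more so here.
print(ridge.coef_, lasso.coef_)
```

With a larger alpha, lasso would eventually drive the weight all the way to zero, which is why it is also used for feature selection.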


EXERCISES

1. If you have a set that contains a huge number of features (millions of them), which regression algorithm should you use, and why?

2. If you use batch gradient descent and plot the error at each epoch, and suddenly the rate of errors increases, how would you fix this problem?

3. What should you do if you notice that errors become larger when you're using the mini-batch method? Why?

4. From these pairs, which method is better, and why?

- Ridge regression or linear regression?

- Lasso regression or ridge regression?

5. Write the batch gradient descent algorithm.


SUMMARY

In this chapter, you've learned new concepts and have learned how to train a model using different types of algorithms. You've also learned when to use each algorithm, including the following:

- Batch gradient descent

- Mini-batch gradient descent

- Polynomial regression

- Regularized linear models: ridge regression and lasso regression

In addition, you now know the meaning of certain terms: linear regression, computational complexity, and gradient descent.


REFERENCES

https://docs.scipy.org/doc/numpy-dev/user/quickstart.html

http://scikit-learn.org/stable/auto_examples/linear_model/plot_polynomial_interpolation.html


CHAPTER 4

DIFFERENT MODELS COMBINATIONS

Tree classifiers

The next image illustrates the general goal of ensemble methods: to merge different classifiers into one classifier that has better generalization performance than each individual classifier alone. As an example, assume that you collected predictions from many experts. Ensemble methods would allow us to merge the predictions of these experts to get a prediction that is more accurate and robust than the prediction of each individual expert. As you can see later in this part, there are many different methods to create an ensemble of classifiers. In this part, we will introduce a basic perception of how ensembles work and why they are typically recognized for yielding good generalization performance.

In this part, we will work with the most popular ensemble method, which uses the majority voting principle. Majority voting simply means that we choose the label that has been predicted by the majority of classifiers; that is, the label that received more than 50 percent of the votes. Strictly speaking, the term majority vote refers to binary class settings only. However, it is easy to generalize the majority voting principle to multi-class settings; this is called plurality voting, and there we choose the class label that received the most votes. The following diagram illustrates the concept of majority and plurality voting for an ensemble of 10 classifiers, where each unique symbol (triangle, square, and circle) represents a unique class label:

Using the training set, we start by training m different classifiers (C1, …, Cm). Depending on the method, the ensemble can be built from many different classification algorithms; for example, decision trees, support vector machines, logistic regression classifiers, and so on. In fact, you can also use the same base classification algorithm, fitting different subsets of the training set. An example of this method is the random forest algorithm, which merges many decision tree classifiers using majority voting.


To predict a class label via simple majority or plurality voting, we combine the predicted class labels of each individual classifier Cj and select the class label ŷ that received the most votes:

ŷ = mode{C1(x), C2(x), …, Cm(x)}

For example, in a binary classification task where class1 = −1 and class2 = +1, we can write the majority vote prediction as the sign of the sum of the individual predictions.

To illustrate why ensemble methods can work better than individual classifiers alone, let's apply some simple concepts of combinatorics. For the following example, we make the assumption that all n base classifiers for a binary classification task have an equal error rate, ε. Additionally, we assume that the classifiers are independent and the error rates are not correlated. Under these assumptions, we can simply express the error probability of an ensemble of base classifiers as a probability.


The mass function of a binomial distribution gives the probability that the ensemble prediction is wrong:

ε_ensemble = Σ (k = ⌈n/2⌉ to n) (n choose k) · εᵏ · (1 − ε)ⁿ⁻ᵏ

Here, (n choose k) is the binomial coefficient "n choose k". As you can see, this lets you calculate the probability that the prediction of the ensemble is wrong. Now, let's take a look at a more concrete example of 11 base classifiers (n = 11), each with an error rate of 0.25 (ε = 0.25):

You can notice that the error rate of the ensemble (0.034) is much smaller than the error rate of each individual classifier (0.25), provided all the assumptions are met. Note that in this simplified scenario, a 50-50 split by an even number of classifiers n is counted as an error, whereas this is only true half of the time. To compare such an idealistic ensemble classifier to a base classifier over a range of different base error rates, let's implement the probability mass function in Python:

>>> import math

>>> from scipy.special import comb

>>> def ensemble_error(n_classifier, error):

...     q_start = math.ceil(n_classifier / 2.0)

...     probability = [comb(n_classifier, q) *

...                    error**q *

...                    (1 - error)**(n_classifier - q)

...                    for q in range(q_start, n_classifier + 1)]

...     return sum(probability)

>>> ensemble_error(n_classifier=11, error=0.25)

0.034327507019042969

Let's write some code to compute the ensemble error rates for a range of different base errors, so we can visualize the relationship between ensemble and base errors in a line graph:

>>> import numpy as np

>>> error_range = np.arange(0.0, 1.01, 0.01)


>>> en_error = [ensemble_error(n_classifier=11, error=er)

...             for er in error_range]

>>> import matplotlib.pyplot as plt

>>> plt.plot(error_range, en_error,

...          label='Ensemble error',

...          linewidth=2)

>>> plt.plot(error_range, error_range,

...          ls='--', label='Base error',

...          linewidth=2)

>>> plt.xlabel('Base error')

>>> plt.ylabel('Base/Ensemble error')

>>> plt.legend(loc='upper left')

>>> plt.grid()

>>> plt.show()

As we can see in the resulting plot, the error probability of an ensemble is always better than the error of an individual base classifier, as long as the base classifiers perform better than random guessing (ε < 0.5). You should notice that the y-axis depicts the base error (dashed line) as well as the ensemble error (continuous line):


Implementing a simple majority classifier

Following the introduction to ensemble learning in the last section, we will now do a warm-up exercise and develop a simple classifier for majority voting in Python. Although the following algorithm also works for multi-class settings via plurality voting, we will use the term majority voting for simplicity, as is often done in the literature.

In the following program, we will develop and combine different classifiers, each associated with an individual weight for confidence. Our goal is to build a stronger meta-classifier that balances out the individual classifiers' weaknesses on a particular dataset. In more precise mathematical terms, we can write the weighted majority vote as:

ŷ = arg max_i Σj wj · 1(Cj(x) = i)

where wj is the weight associated with classifier Cj, and 1(·) is the indicator function.

To translate the concept of the weighted majority vote into Python code, we can use NumPy's convenient argmax and bincount functions:

>>> import numpy as np

>>> np.argmax(np.bincount([0, 0, 1],

...           weights=[0.2, 0.2, 0.6]))

1

Based on predicted class probabilities instead of class labels, the weighted majority vote can be written as ŷ = arg max_i Σj wj · pij. Here, pij is the predicted probability of the jth classifier for class label i. To continue with our previous example, let's assume that we have a binary classification problem with class labels i ∈ {0, 1} and an ensemble of three classifiers Cj (j ∈ {1, 2, 3}). Let's assume that the classifiers Cj return the following class membership probabilities for a particular sample x:

C1(x) → [0.9, 0.1], C2(x) → [0.8, 0.2], C3(x) → [0.4, 0.6]

To implement the weighted majority vote based on class probabilities, we can again make use of NumPy, using numpy.average and np.argmax:


>>> ex = np.array([[0.9, 0.1],

...                [0.8, 0.2],

...                [0.4, 0.6]])

>>> p = np.average(ex, axis=0, weights=[0.2, 0.2, 0.6])

>>> p

array([0.58, 0.42])

>>> np.argmax(p)

0

Putting everything together, let's now implement a majority vote classifier (MVClassifier) in Python:

from sklearn.base import BaseEstimator

from sklearn.base import ClassifierMixin

from sklearn.preprocessing import LabelEncoder

from sklearn.externals import six

from sklearn.base import clone

from sklearn.pipeline import _name_estimators

import numpy as np

import operator

class MVClassifier(BaseEstimator, ClassifierMixin):

"""Amajorityvoteensembleclassifier

Parameters

----------

cl:array-like,shape=[n_classifiers]

Page 85: Machine Learning: Step-by-Step Guide To Implement Machine Learning Algorithms with Python

Differentclassifiersfortheensemblevote:str,{'cl_label','prob'}

Default:'cl_label'

If'cl_label'thepredictionisbasedontheargmaxofclasslabels.Elif'prob',theargofthetotalofprobsisusedtopredicttheclasslabel(recommendedforcalibratedclassifiers).

w:arr-like,s=[n_cl]

Optional,default:None

Ifalistof`int`or`float`valuesareprovided,theclassifiersareweightedby"""

    def __init__(s, cl,

                 v='cl_label', w=None):

        s.cl = cl

        s.named_cl = {key: value for

                      key, value in

                      _name_estimators(cl)}

        s.v = v

        s.w = w

    def fit(s, X, y):

        """Fit the classifiers.

        Parameters

        ----------

        X : {array-like, sparse matrix},

            shape = [n_samples, n_features]

            Matrix of training samples.

        y : array-like, shape = [n_samples]

            Vector of target class labels.

        Returns

        -------

        s : object

        """

        # Use LabelEncoder to ensure class labels start

        # with 0, which is important for the np.argmax

        # call in s.predict

        s.l_ = LabelEncoder()

        s.l_.fit(y)

        s.classes_ = s.l_.classes_

        s.cl_ = []

        for cl in s.cl:

            fit_cl = clone(cl).fit(X,

                                   s.l_.transform(y))

            s.cl_.append(fit_cl)

        return s

I added a lot of comments to the code to make the individual parts easier to understand. However, before we implement the remaining methods, let's take a quick break and discuss some of the code that may look confusing at first. We used the parent classes BaseEstimator and ClassifierMixin to get some base functionality for free, including the get_params and set_params methods to set and return the classifier's parameters, as well as the score method to calculate the prediction accuracy. Also, note that we imported six to make the MVClassifier compatible with Python 2.7.

Next, we will add the predict method, to predict the class label via majority vote based on the class labels if we initialize a new MVClassifier object with v='cl_label'. Alternatively, we will be able to initialize the ensemble classifier with v='prob' to predict the class label based on the class membership probabilities. Furthermore, we will also add a predict_proba method to return the averaged probabilities, which is useful for computing the Receiver Operating Characteristic area under the curve (ROC AUC).

    def predict(s, X):

        """Predict class labels for X.

        Parameters

        ----------

        X : {array-like, sparse matrix},

            shape = [n_samples, n_features]

            Matrix of training samples.

        Returns

        ----------

        ma_v : array-like, shape = [n_samples]

            Predicted class labels.

        """

        if s.v == 'prob':

            ma_v = np.argmax(s.predict_proba(X),

                             axis=1)

        else:  # 'cl_label' vote

            predictions = np.asarray([cl.predict(X)

                                      for cl in

                                      s.cl_]).T

            ma_v = np.apply_along_axis(

                lambda x:

                np.argmax(np.bincount(x, weights=s.w)),

                axis=1,

                arr=predictions)

        ma_v = s.l_.inverse_transform(ma_v)

        return ma_v

    def predict_proba(s, X):

        """Predict class probabilities for X.

        Parameters

        ----------

        X : {array-like, sparse matrix},

            shape = [n_samples, n_features]

            Training vectors, where n_samples is the number of samples and n_features is the number of features.

        Returns

        ----------

        av_prob : array-like,

            shape = [n_samples, n_classes]

            Weighted average probability for each class per sample.

        """

        probs = np.asarray([cl.predict_proba(X)

                            for cl in s.cl_])

        av_prob = np.average(probs,

                             axis=0, weights=s.w)

        return av_prob

    def get_params(s, deep=True):

        """Get classifier parameter names for GridSearch"""

        if not deep:

            return super(MVClassifier,

                         s).get_params(deep=False)

        else:

            ou = s.named_cl.copy()

            for n, step in \

                    six.iteritems(s.named_cl):

                for k, value in six.iteritems(

                        step.get_params(deep=True)):

                    ou['%s__%s' % (n, k)] = value

            return ou


Combining different algorithms for classification with majority vote

Now, it is about time to put the MVClassifier that we implemented in the previous section into action. You should first prepare a dataset that you can test it on. Since we are already familiar with techniques to load datasets from CSV files, we will take a shortcut and load the Iris dataset from scikit-learn's datasets module.

Furthermore, we will only select two features, sepal width and petal length, to make the classification task more challenging. Although our MVClassifier generalizes to multiclass problems, we will only classify flower samples from the two classes, Iris-Versicolor and Iris-Virginica, to compute the ROC AUC. The code is as follows:

>>> from sklearn import datasets

>>> from sklearn.cross_validation import train_test_split

>>> from sklearn.preprocessing import LabelEncoder

>>> ir = datasets.load_iris()

>>> X, y = ir.data[50:, [1, 2]], ir.target[50:]

>>> le = LabelEncoder()

>>> y = le.fit_transform(y)

Next, we split the Iris samples into 50 percent training and 50 percent test data:

>>> X_train, X_test, y_train, y_test = \

...     train_test_split(X, y,

...                      test_size=0.5,

...                      random_state=1)

Using the training dataset, we will now train three different classifiers — a logistic regression classifier, a decision tree classifier, and a k-nearest neighbors classifier — and look at their individual performances via 10-fold cross-validation on the training dataset before we merge them into an ensemble:

>>> from sklearn.cross_validation import cross_val_score

>>> from sklearn.linear_model import LogisticRegression

>>> from sklearn.tree import DecisionTreeClassifier

>>> from sklearn.neighbors import KNeighborsClassifier

>>> from sklearn.pipeline import Pipeline

>>> from sklearn.preprocessing import StandardScaler

>>> import numpy as np

>>> clf1 = LogisticRegression(penalty='l2',

...                           C=0.001,

...                           random_state=0)

>>> clf2 = DecisionTreeClassifier(max_depth=1,

...                               criterion='entropy',

...                               random_state=0)

>>> clf3 = KNeighborsClassifier(n_neighbors=1,

...                             p=2,

...                             metric='minkowski')

>>> pipe1 = Pipeline([['sc', StandardScaler()],

...                   ['clf', clf1]])

>>> pipe3 = Pipeline([['sc', StandardScaler()],

...                   ['clf', clf3]])


>>> clf_labels = ['Logistic Regression', 'Decision Tree', 'KNN']

>>> print('10-fold cross validation:\n')

>>> for clf, label in zip([pipe1, clf2, pipe3], clf_labels):

...     scores = cross_val_score(estimator=clf,

...                              X=X_train,

...                              y=y_train,

...                              cv=10,

...                              scoring='roc_auc')

...     print("ROC AUC: %0.2f (+/- %0.2f) [%s]"

...           % (scores.mean(), scores.std(), label))

The output that we receive, as shown in the following snippet, shows that the predictive performances of the individual classifiers are almost equal:

10-fold cross validation:

ROC AUC: 0.92 (+/- 0.20) [Logistic Regression]

ROC AUC: 0.92 (+/- 0.15) [Decision Tree]

ROC AUC: 0.93 (+/- 0.10) [KNN]

You may be wondering why we trained the logistic regression and k-nearest neighbors classifiers as part of a pipeline. The reason is that, as we said, the logistic regression and k-nearest neighbors algorithms (using the Euclidean distance metric) are not scale-invariant, in contrast with decision trees. Although the Iris features are all measured on the same scale, it is a good habit to work with standardized features.

Now, let's move on to the more exciting part and combine the individual classifiers for majority rule voting in our MVClassifier:


>>> mv_cl = MVClassifier(

...             cl=[pipe1, clf2, pipe3])

>>> clf_labels += ['Majority Voting']

>>> all_clf = [pipe1, clf2, pipe3, mv_cl]

>>> for clf, label in zip(all_clf, clf_labels):

...     scores = cross_val_score(estimator=clf,

...                              X=X_train,

...                              y=y_train,

...                              cv=10,

...                              scoring='roc_auc')

...     print("ROC AUC: %0.2f (+/- %0.2f) [%s]"

...           % (scores.mean(), scores.std(), label))

ROC AUC: 0.92 (+/- 0.20) [Logistic Regression]

ROC AUC: 0.92 (+/- 0.15) [Decision Tree]

ROC AUC: 0.93 (+/- 0.10) [KNN]

ROC AUC: 0.97 (+/- 0.10) [Majority Voting]

As you can see, the performance of the MVClassifier has substantially improved over the individual classifiers in the 10-fold cross-validation evaluation.
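For comparison, scikit-learn provides a built-in implementation of the same idea, VotingClassifier. The sketch below mirrors the two-class Iris setup above with soft (probability-averaging) voting; the hyperparameters are illustrative:

```python
from sklearn import datasets
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import LabelEncoder
from sklearn.tree import DecisionTreeClassifier

iris = datasets.load_iris()
X, y = iris.data[50:, [1, 2]], iris.target[50:]
y = LabelEncoder().fit_transform(y)   # encode the two classes as 0/1

voting = VotingClassifier(
    estimators=[('lr', LogisticRegression()),
                ('dt', DecisionTreeClassifier(max_depth=1)),
                ('knn', KNeighborsClassifier(n_neighbors=1))],
    voting='soft')   # average the predicted class probabilities

cv_scores = cross_val_score(voting, X, y, cv=10, scoring='roc_auc')
print(cv_scores.mean())
```

voting='soft' corresponds to our v='prob' mode (averaging predict_proba outputs), while voting='hard' corresponds to the class-label vote.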

Evaluating and tuning the ensemble classifier

In this part, you are going to compute the ROC curves from the test set to check that the MVClassifier generalizes well to unseen data. We should remember that the test set is not to be used for model selection; its only goal is to report an unbiased estimate of the accuracy of a classifier system. The code is as follows:


>>> from sklearn.metrics import roc_curve

>>> from sklearn.metrics import auc

>>> cls = ['black', 'orange', 'blue', 'green']

>>> ls = [':', '--', '-.', '-']

>>> for clf, label, clr, l \

...         in zip(all_clf, clf_labels, cls, ls):

...     # assuming the label of the positive class is 1

...     y_pred = clf.fit(X_train,

...                      y_train).predict_proba(X_test)[:, 1]

...     fpr, tpr, thresholds = roc_curve(y_true=y_test,

...                                      y_score=y_pred)

...     roc_auc = auc(x=fpr, y=tpr)

...     plt.plot(fpr, tpr,

...              color=clr,

...              linestyle=l,

...              label='%s (auc = %0.2f)' % (label, roc_auc))

>>> plt.legend(loc='lower right')

>>> plt.plot([0, 1], [0, 1],

...          linestyle='--',

...          color='gray',

...          linewidth=2)

>>> plt.xlim([-0.1, 1.1])

>>> plt.ylim([-0.1, 1.1])

>>> plt.grid()

>>> plt.xlabel('False Positive Rate')

>>> plt.ylabel('True Positive Rate')

>>> plt.show()

As we can see in the resulting ROC plot, the ensemble classifier also performs well on the test set (ROC AUC = 0.95), whereas the k-nearest neighbors classifier seems to be overfitting the training data (training ROC AUC = 0.93, test ROC AUC = 0.86):

Since we only chose two features for the classification task, it will be interesting to see what the decision region of the ensemble classifier actually looks like. Although it is not necessary to standardize the training features prior to model fitting, because our logistic regression and k-nearest neighbors pipelines will automatically take care of this, we will standardize the training set so that the decision regions of the decision tree will be on the same scale for visual purposes.

Let's take a look:

>>> from itertools import product

>>> sc = StandardScaler()

>>> X_tra_std = sc.fit_transform(X_train)

>>> x_mi = X_tra_std[:, 0].min() - 1

>>> x_ma = X_tra_std[:, 0].max() + 1

>>> y_mi = X_tra_std[:, 1].min() - 1

>>> y_ma = X_tra_std[:, 1].max() + 1

>>> xx, yy = np.meshgrid(np.arange(x_mi, x_ma, 0.1),

...                      np.arange(y_mi, y_ma, 0.1))

>>> f, axarr = plt.subplots(nrows=2, ncols=2,

...                         sharex='col',

...                         sharey='row',

...                         figsize=(7, 5))

>>> for idx, clf, tt in zip(product([0, 1], [0, 1]),

...                         all_clf, clf_labels):

...     clf.fit(X_tra_std, y_train)

...     Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])

...     Z = Z.reshape(xx.shape)

...     axarr[idx[0], idx[1]].contourf(xx, yy, Z, alpha=0.3)

...     axarr[idx[0], idx[1]].scatter(X_tra_std[y_train == 0, 0],

...                                   X_tra_std[y_train == 0, 1],

...                                   c='blue',

...                                   marker='^',

...                                   s=50)

...     axarr[idx[0], idx[1]].scatter(X_tra_std[y_train == 1, 0],

...                                   X_tra_std[y_train == 1, 1],

...                                   c='red',

...                                   marker='o',

...                                   s=50)

...     axarr[idx[0], idx[1]].set_title(tt)

>>> plt.text(-3.5, -4.5,

...          s='Sepal width [standardized]',

...          ha='center', va='center', fontsize=12)

>>> plt.text(-10.5, 4.5,

...          s='Petal length [standardized]',

...          ha='center', va='center',

...          fontsize=12, rotation=90)

>>> plt.show()

Interestingly, but also as expected, the decision regions of the ensemble classifier seem to be a hybrid of the decision regions from the individual classifiers. At first glance, the majority vote decision boundary looks a lot like the decision boundary of the k-nearest neighbors classifier. However, we can see that it is orthogonal to the y axis for sepal width ≥ 1, just like the decision tree stump:


Before we learn how to tune the individual classifier parameters for ensemble classification, let's call the get_params method to get a basic idea of how we can access the individual parameters inside a GridSearchCV object:

>>> mv_cl.get_params()

{'decisiontreeclassifier': DecisionTreeClassifier(class_weight=None,
          criterion='entropy', max_depth=1,
          max_features=None, max_leaf_nodes=None, min_samples_leaf=1,
          min_samples_split=2, min_weight_fraction_leaf=0.0,
          random_state=0, splitter='best'),

 'decisiontreeclassifier__class_weight': None,

 'decisiontreeclassifier__criterion': 'entropy',

 [...]

 'decisiontreeclassifier__random_state': 0,

 'decisiontreeclassifier__splitter': 'best',

 'pipeline-1': Pipeline(steps=[('sc', StandardScaler(copy=True,
          with_mean=True, with_std=True)),
          ('clf', LogisticRegression(C=0.001, class_weight=None,
          dual=False, fit_intercept=True,
          intercept_scaling=1, max_iter=100, multi_class='ovr',
          penalty='l2', random_state=0, solver='liblinear',
          tol=0.0001, verbose=0))]),

 'pipeline-1__clf': LogisticRegression(C=0.001, class_weight=None,
          dual=False, fit_intercept=True,
          intercept_scaling=1, max_iter=100, multi_class='ovr',
          penalty='l2', random_state=0, solver='liblinear',
          tol=0.0001, verbose=0),

 'pipeline-1__clf__C': 0.001,

 'pipeline-1__clf__class_weight': None,

 'pipeline-1__clf__dual': False,

 [...]

 'pipeline-1__sc__with_std': True,

 'pipeline-2': Pipeline(steps=[('sc', StandardScaler(copy=True,
          with_mean=True, with_std=True)),
          ('clf', KNeighborsClassifier(algorithm='auto',
          leaf_size=30, metric='minkowski',
          metric_params=None, n_neighbors=1, p=2,
          weights='uniform'))]),

 'pipeline-2__clf': KNeighborsClassifier(algorithm='auto',
          leaf_size=30, metric='minkowski',
          metric_params=None, n_neighbors=1, p=2,
          weights='uniform'),

 'pipeline-2__clf__algorithm': 'auto',

 [...]

 'pipeline-2__sc__with_std': True}

Based on the values returned by the get_params method, we now know how to access the individual classifiers' attributes. Let's now tune the inverse regularization parameter C of the logistic regression classifier and the decision tree depth via a grid search, for demonstration purposes:

>>> from sklearn.grid_search import GridSearchCV

>>> params = {'decisiontreeclassifier__max_depth': [1, 2],

...           'pipeline-1__clf__C': [0.001, 0.1, 100.0]}

>>> gd = GridSearchCV(estimator=mv_cl,

...                   param_grid=params,

...                   cv=10,

...                   scoring='roc_auc')

>>> gd.fit(X_train, y_train)


After the grid search has completed, we can print the different hyperparameter value combinations and the average ROC AUC scores computed through 10-fold cross-validation. The code is as follows:

>>> for params, mean_sc, scores in gd.grid_scores_:

...     print("%0.3f +/- %0.2f %r"

...           % (mean_sc, scores.std() / 2, params))

0.967 +/- 0.05 {'pipeline-1__clf__C': 0.001, 'decisiontreeclassifier__max_depth': 1}

0.967 +/- 0.05 {'pipeline-1__clf__C': 0.1, 'decisiontreeclassifier__max_depth': 1}

1.000 +/- 0.00 {'pipeline-1__clf__C': 100.0, 'decisiontreeclassifier__max_depth': 1}

0.967 +/- 0.05 {'pipeline-1__clf__C': 0.001, 'decisiontreeclassifier__max_depth': 2}

0.967 +/- 0.05 {'pipeline-1__clf__C': 0.1, 'decisiontreeclassifier__max_depth': 2}

1.000 +/- 0.00 {'pipeline-1__clf__C': 100.0, 'decisiontreeclassifier__max_depth': 2}

>>> print('Best parameters: %s' % gd.best_params_)

Best parameters: {'pipeline-1__clf__C': 100.0, 'decisiontreeclassifier__max_depth': 1}

>>> print('Accuracy: %.2f' % gd.best_score_)

Accuracy: 1.00


QUESTIONS

1. Explain how to combine different models in detail.

2. What are the goals and benefits of combining models?