Learning from Shadows
Dimensionality Reduction and its Application in Artificial Intelligence, Signal Processing and Robotics
Ali Ghodsi
Department of Statistics and Actuarial Science
University of Waterloo
October 2006
Dimensionality Reduction
Manifold and Hidden Variables
Data Representation
An image represented as a matrix of pixel values:

1  1    1    1    1
1  0    1    0    1
1  1    1    1    1
1  0.5  0.5  0.5  1
1  1    1    1    1
103 data points, each a 23 by 28 image (644 pixels), form a 644 by 103 matrix. Dimensionality reduction factors this into a 644 by 2 matrix times a 2 by 103 matrix, so each image is summarized by a 2 by 1 code such as [-2.19, -0.02] or [-3.19, 1.02].
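The factorization above can be sketched with a truncated SVD. The matrix below is random and stands in only for the shapes; the real 644 by 103 image data is not reproduced here.

```python
import numpy as np

# Stand-in data: 103 points of 23 * 28 = 644 pixels each,
# stacked as columns of a 644 by 103 matrix (random values, shapes only).
rng = np.random.default_rng(0)
X = rng.standard_normal((644, 103))

# Truncated SVD: X is approximated by U2 @ C, where U2 is a 644 by 2 basis
# and C is a 2 by 103 coefficient matrix. Each column of C is one point's
# 2 by 1 low-dimensional code.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
U2 = U[:, :2]                  # 644 by 2
C = np.diag(s[:2]) @ Vt[:2]    # 2 by 103

print(U2.shape, C.shape)
```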
Hastie, Tibshirani, Friedman 2001
The Big Picture
Uses of Dimensionality Reduction (Manifold Learning)
Denoising
Mika et al., 1999
Zhu and Ghodsi, 2005
Tenenbaum, V. de Silva, Langford 2001
Roweis and Saul, 2000
Arranging words: each word was initially represented by a high-dimensional vector counting how many times it appeared in different encyclopedia articles. Words with similar contexts end up collocated in the embedding.
Roweis and Saul 2000
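The count-vector representation described above can be sketched with made-up numbers; the words and counts below are hypothetical, not from the original experiment.

```python
import numpy as np

# Hypothetical counts: how often each word appears in four encyclopedia
# articles. Words used in similar articles get similar vectors.
counts = {
    "piano":  np.array([9.0, 0.0, 1.0, 0.0]),
    "violin": np.array([8.0, 1.0, 0.0, 0.0]),
    "planet": np.array([0.0, 7.0, 0.0, 9.0]),
}

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# "piano" and "violin" share contexts, so their similarity is high;
# "piano" and "planet" share none.
print(cosine(counts["piano"], counts["violin"]))
print(cosine(counts["piano"], counts["planet"]))
```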
Hinton and Roweis 2002
Embedding of Sparse Music Similarity Graph
Platt, 2004
Pattern Recognition
Ghodsi, Huang, Schuurmans 2004
Clustering
Glasses vs. No Glasses
Beard vs. No Beard
Beard Distinction
Ghodsi, Wilkinson, Southey 2006
Glasses Distinction
Multiple-Attribute Metric
Reinforcement Learning
Mahadevan and Maggioni, 2005
Semi-supervised Learning
Use graph-based discretization of manifold to infer missing labels.
Build classifiers from bottom eigenvectors of graph Laplacian.
Belkin & Niyogi, 2004; Zien et al., Eds., 2005
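A minimal sketch of the graph-Laplacian idea, in the spirit of Belkin & Niyogi: build a weighted graph over the points and read cluster structure off a bottom eigenvector. The toy one-dimensional data below is my own, not from the cited work.

```python
import numpy as np

# Two clusters on a line; imagine only one label per cluster is known.
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
n = len(x)

# Gaussian-weighted adjacency matrix and graph Laplacian L = D - W.
W = np.exp(-(x[:, None] - x[None, :]) ** 2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W

# Bottom eigenvectors of L vary smoothly over the graph; the second one
# separates the two clusters by sign, so labels propagate from the few
# known points to the rest.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]
labels = (fiedler > fiedler.mean()).astype(int)
print(labels)
```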
Learning Correspondences
How can we learn manifold structure that is shared across multiple data sets?
Ham et al., 2003, 2005
Mapping and Robot Localization
Bowling, Ghodsi, Wilkinson 2005
Ham, Lin, D.D. 2005
Action Respecting Embedding
Joint work with Michael Bowling and Dana Wilkinson
Modelling Temporal Data and Actions
Outline
• Background
  – PCA
  – Kernel PCA
• Action Respecting Embedding (ARE)
  – Prediction and Planning
  – Probabilistic Actions
• Future Work
Principal Component Analysis (PCA)
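A minimal PCA sketch on toy data of my own: centre the data, eigendecompose the covariance matrix, and project onto the leading eigenvector.

```python
import numpy as np

# Toy data: points near a line in 3-D, so one direction carries
# almost all of the variance.
rng = np.random.default_rng(1)
t = rng.standard_normal(100)
X = np.column_stack([t, 2 * t, 0.01 * rng.standard_normal(100)])

# PCA: centre, eigendecompose the covariance, project onto the
# top eigenvector (the first principal component).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(X) - 1)
vals, vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
top = vecs[:, ::-1][:, :1]           # leading principal direction
Y = Xc @ top                         # 1-D embedding

# Fraction of variance explained by each component, largest first.
print(vals[::-1] / vals.sum())
```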
Kernel Methods
Kernel Trick
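The kernel trick in one line: a kernel evaluates an inner product in a feature space without ever constructing the feature vectors. A sketch for the degree-2 polynomial kernel k(x, z) = (xᵀz)², whose feature map can be written out explicitly:

```python
import numpy as np

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Explicit feature map for the degree-2 polynomial kernel:
# phi(v) = (v1^2, v2^2, sqrt(2) * v1 * v2).
def phi(v):
    return np.array([v[0] ** 2, v[1] ** 2, np.sqrt(2) * v[0] * v[1]])

# Kernel trick: k(x, z) = (x . z)^2 equals <phi(x), phi(z)>,
# computed without forming phi at all.
k = (x @ z) ** 2
print(k, phi(x) @ phi(z))
```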
Observed, Feature and Embedded Spaces
Kernel PCA
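A minimal kernel PCA sketch: form a kernel matrix, double-centre it, and eigendecompose. The two-ring toy data and the RBF kernel width are my own choices, not from the slides.

```python
import numpy as np

# Toy data: two concentric rings, which linear PCA cannot separate.
rng = np.random.default_rng(2)
n = 40
theta = rng.uniform(0, 2 * np.pi, n)
r = np.repeat([1.0, 3.0], n // 2)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

# RBF kernel matrix (width chosen arbitrarily).
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)

# Kernel PCA: double-centre K with H = I - 11^T/n, eigendecompose,
# and scale the top eigenvectors to get embedding coordinates.
H = np.eye(n) - np.ones((n, n)) / n
Kc = H @ K @ H
vals, vecs = np.linalg.eigh(Kc)
Y = vecs[:, ::-1][:, :2] * np.sqrt(np.maximum(vals[::-1][:2], 0))

print(Y.shape)
```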
Problem
Idea
Action Respecting Embedding (ARE)
Action Respecting Constraint
Local Distances Constraint
Preserve distances between each point and its k nearest neighbors.
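Collecting the constrained pairs is straightforward; a sketch of finding each point's k nearest neighbors with plain numpy (toy data, and k = 2 chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((10, 3))
k = 2

# Pairwise squared distances between all points.
D = ((X[:, None] - X[None, :]) ** 2).sum(-1)

# For each point, indices of its k nearest neighbors (column 0 of the
# argsort is the point itself, at distance zero, so skip it).
order = np.argsort(D, axis=1)
neighbors = order[:, 1:k + 1]

# These are the (i, j) pairs whose distances the embedding must preserve:
# ||y_i - y_j||^2 = ||x_i - x_j||^2 for each j in neighbors[i].
pairs = [(i, j) for i in range(len(X)) for j in neighbors[i]]
print(len(pairs))
```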
Semidefinite Programming
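As a sketch (my notation, following the maximum-variance-unfolding style of program these slides build on), the embedding Gram matrix K can be found by a semidefinite program:

```latex
\begin{aligned}
\max_{K}\ & \operatorname{tr}(K) \\
\text{s.t.}\ & K \succeq 0, \qquad \textstyle\sum_{ij} K_{ij} = 0, \\
& K_{ii} - 2K_{ij} + K_{jj} = \|x_i - x_j\|^2
    \quad \text{for each neighbor pair } (i,j), \\
& K_{t+1,t+1} - 2K_{t+1,t'+1} + K_{t'+1,t'+1}
    = K_{t,t} - 2K_{t,t'} + K_{t',t'}
    \quad \text{when steps } t, t' \text{ take the same action.}
\end{aligned}
```

The last family of equalities is the action-respecting part: each action must behave as a distance-preserving map in the embedding. The embedding itself is then read off the top eigenvectors of the optimal K, as in kernel PCA.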
Experiment
Experiment 1
Experiment 2
Experiment 3
Experiment 4
Experiment 5
Planning
Experiment
Probabilistic Actions
Future Work
Related Papers