
Page 1:

Machine Learning Lab, University of Freiburg

Clustering
Machine Learning Summer 2015

Dr. Joschka Boedecker

Slides courtesy of Manuel Blum

Page 2:

Supervised vs. Unsupervised Learning

Supervised Learning: training set of labeled patterns
(x(1), y(1)), (x(2), y(2)), ..., (x(m), y(m))

Unsupervised Learning: training set of unlabeled patterns
x(1), x(2), ..., x(m)
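As a small illustration of this notation (a minimal sketch; the arrays and values below are made up, not from the slides), a supervised training set pairs each pattern with a target y(i), while an unsupervised training set contains only the patterns:

    import numpy as np

    # supervised learning: m = 3 labeled training patterns (x(i), y(i)) with x(i) in R^2
    X = np.array([[1.0, 2.0],
                  [1.5, 1.8],
                  [8.0, 8.5]])
    y = np.array([0, 0, 1])   # one target per pattern

    # unsupervised learning: the same patterns, but without any labels;
    # clustering has to find structure in X alone
    X_unlabeled = X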

Page 3:

K-means Algorithm Example

[Figure: training patterns and cluster prototypes in the (x1, x2) plane]

Initialization of cluster centroids

Page 4:

K-means Algorithm Example

[Figure: training patterns and cluster prototypes in the (x1, x2) plane]

Iteration 1: compute closest centroids

Page 5:

K-means Algorithm Example

[Figure: training patterns and cluster prototypes in the (x1, x2) plane]

Iteration 1: compute closest centroids, move centroids to mean of assigned points
Iteration 2: compute closest centroids

Page 6:

K-means Algorithm Example

[Figure: training patterns and cluster prototypes in the (x1, x2) plane]

Iteration 1: compute closest centroids, move centroids to mean of assigned points
Iteration 2: compute closest centroids, move centroids
Iteration 3: compute closest centroids

Page 7:

K-means Algorithm Example

[Figure: training patterns and cluster prototypes in the (x1, x2) plane]

Iteration 1: compute closest centroids, move centroids to mean of assigned training patterns
Iteration 2: compute closest centroids, move centroids
Iteration 3: compute closest centroids, move centroids
Iteration 4: algorithm converges

Page 8:

K-means Algorithm

input: K, {x(1), x(2), ..., x(m)}, x(i) ∊ Rn

randomly initialize K cluster centroids μ1, μ2, ..., μK ∊ Rn

repeat
    for i = 1 to m
        c(i) := index of the cluster centroid closest to x(i)
    for k = 1 to K
        μk := mean of the training patterns assigned to cluster k
until convergence
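A minimal NumPy sketch of the algorithm above (the function and variable names are mine, not from the slides); it initializes the centroids with K randomly picked training patterns and stops when the centroids no longer move:

    import numpy as np

    def kmeans(X, K, max_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        # randomly initialize the K centroids to K distinct training patterns
        mu = X[rng.choice(len(X), size=K, replace=False)].copy()
        for _ in range(max_iter):
            # cluster assignment step: c[i] = index of the centroid closest to X[i]
            dists = np.linalg.norm(X[:, None, :] - mu[None, :, :], axis=2)
            c = dists.argmin(axis=1)
            # move step: centroid k becomes the mean of the patterns assigned to it
            # (empty clusters keep their previous position)
            new_mu = np.array([X[c == k].mean(axis=0) if np.any(c == k) else mu[k]
                               for k in range(K)])
            if np.allclose(new_mu, mu):
                break   # centroids stopped moving: the algorithm has converged
            mu = new_mu
        return c, mu

Here c holds one cluster index per training pattern and mu holds the K learned prototypes.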

Page 9:

K-means Optimization Objective

J(c(1), ..., c(m), μ1, ..., μK) = (1/m) · Σ_{i=1}^{m} ‖x(i) − μ_c(i)‖²

- cluster assignment step minimizes J w.r.t. c(1), ..., c(m)

- moving the centroids minimizes J w.r.t. μ1,..., μK

- the objective decreases monotonically towards a local minimum of J

- K-means always converges in finite time
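As a small complement (a sketch with names of my own choosing), the value of this objective for a given clustering can be computed directly from the assignments c and centroids mu returned by a k-means run:

    import numpy as np

    def distortion(X, c, mu):
        # J(c, mu) = (1/m) * sum_i || x(i) - mu_c(i) ||^2
        return np.mean(np.sum((X - mu[c]) ** 2, axis=1))

Evaluating J after each assignment step and each move step is a simple way to check that it never increases.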

Page 10:

Application: Color Quantization

- reduce the number of distinct colors in an image by clustering the pixels
- the pixels of the original image are used as training patterns x(i)
- K controls the number of colors in the output image
- K-means will learn the K most typical colors in the image
- the original pixels can be replaced by the closest prototypes after training (see the sketch below)

[Figure: original pixels and learned cluster centers plotted in the R-B color plane]
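A hedged sketch of this application using scikit-learn's KMeans (the file names and the choice of K = 16 are placeholders, not from the slides):

    import numpy as np
    from PIL import Image
    from sklearn.cluster import KMeans

    img = np.asarray(Image.open("photo.png").convert("RGB"), dtype=np.float64)
    h, w, _ = img.shape
    pixels = img.reshape(-1, 3)               # every pixel is a training pattern in R^3

    km = KMeans(n_clusters=16, n_init=10).fit(pixels)    # learn the 16 most typical colors
    quantized = km.cluster_centers_[km.labels_]          # replace each pixel by its closest prototype
    Image.fromarray(quantized.reshape(h, w, 3).astype(np.uint8)).save("photo_16colors.png")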

Page 11:

Random Initialization

- problem: the performance of K-means heavily depends on the initial cluster centers

- simple solution (see the sketch below):
  - run K-means multiple times using different random initializations
  - choose the clustering that minimizes the cost function J

- Forgy's method:
  - initialize centroids to K randomly picked training patterns

- the random partition method:
  - randomly assign a cluster to each training pattern
  - move centroids to the means of the randomly assigned points
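A sketch of the multiple-restart strategy, built on the kmeans and distortion helpers sketched above (both are illustrative names, not part of the slides); X is assumed to be an (m, n) array of training patterns:

    best = None
    for seed in range(10):                   # 10 different random initializations
        c, mu = kmeans(X, K=3, seed=seed)
        J = distortion(X, c, mu)
        if best is None or J < best[0]:
            best = (J, c, mu)                # keep the clustering with the lowest cost
    J, c, mu = best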

Page 12:

How to choose K

- the correct choice of K is often ambiguous
- most often K is specified by hand
- the elbow method: run K-means for a range of K values and pick the K where the cost curve bends (see the sketch below)

[Figure: cost function J plotted against the number of clusters K = 1, ..., 8]
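A sketch of the elbow method using the same illustrative helpers (assuming matplotlib is available): compute J for a range of K values and look for the bend in the curve:

    import matplotlib.pyplot as plt

    Ks = range(1, 9)
    costs = []
    for K in Ks:
        c, mu = kmeans(X, K)                 # X: (m, n) array of training patterns
        costs.append(distortion(X, c, mu))

    plt.plot(list(Ks), costs, marker="o")
    plt.xlabel("Number of clusters K")
    plt.ylabel("Cost function J")
    plt.show()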

Page 13:

Final Remarks

- results of K-means heavily depend on the scaling of the data (see the sketch below)

- Euclidean distance must be a meaningful measure of similarity for the dataset

- K-means will rarely work for high-dimensional data (d > 20)

- cluster centroids are also called prototypes or codebook vectors

- the set of prototypes is called the codebook
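Because the results depend so heavily on the scaling of the data, a common precaution (a sketch assuming scikit-learn; the choice of 3 clusters is a placeholder) is to standardize the features before clustering:

    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    X_scaled = StandardScaler().fit_transform(X)   # zero mean, unit variance per feature
    labels = KMeans(n_clusters=3, n_init=10).fit_predict(X_scaled)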