



Technique

[Figure: labeling diagram — HTML pages retrieved for an artist name are parsed and vocabulary words are counted per page (e.g. 2x Rock, 1x Punk; 4x Guitar, 1x Beats) and aggregated per artist (5x Berlin, 4x Guitar, 2x Rock, 1x Punk, 1x Beats).]

[Figure: distance diagram — each song of artist X (Song A, Song B) is linked to the closest song of artist Y, and vice versa.]

MusicRainbow is a simple user interface for browsing music collections. The colors of the rainbow encode different styles of music. Similar artists are located near each other on the circular rainbow. Labels are extracted from web pages to summarize the contents. The user controls the interface with a knob which can be turned (to select a different artist) and pushed (to listen to music from the selected artist).

Inside / Outside

Summaries

Elias Pampalk and Masataka Goto
National Institute of Advanced Industrial Science and Technology (AIST)
IT, AIST, 1-1-1 Umezono, Tsukuba, Ibaraki 305-8568, Japan

MusicRainbow: A New User Interface to Discover Artists Using Audio-based Similarity and Web-based Labeling

Presented at the ISMIR International Conference on Music Information Retrieval, 8-12 October 2006, Victoria, Canada. This work was supported by CrestMuse, CREST, JST.

To compute the similarity of artists we compare their songs using audio-based techniques which analyze loudness fluctuations and spectral shapes [1].

The distance between artists X and Y is computed by finding the closest song to each song (red & blue lines), computing the mean of the red lines and the mean of the blue lines, and taking the maximum of the two values.
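The distance described above can be sketched in a few lines. The song-level distance function here is a placeholder argument; the poster's actual song distance is the audio-based measure of [1].

```python
import numpy as np

def artist_distance(songs_x, songs_y, song_dist):
    """Distance between two artists as described above: for each song of X
    find the closest song of Y ("red lines") and vice versa ("blue lines"),
    average each set of minimum distances, and take the maximum.
    `song_dist(a, b)` is any song-level distance (a placeholder here)."""
    d = np.array([[song_dist(a, b) for b in songs_y] for a in songs_x])
    mean_x_to_y = d.min(axis=1).mean()  # mean of the red lines
    mean_y_to_x = d.min(axis=0).mean()  # mean of the blue lines
    return max(mean_x_to_y, mean_y_to_x)
```

Taking the maximum of the two directed means makes the distance symmetric, so it can fill a standard distance matrix over all artists.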

The distances between artists define a similarity space. For our demonstration we use a collection consisting of 558 artists.

The shortest path which connects all artists is found using a travelling salesman algorithm [2].

The path in the high-dimensional similarity space is projected onto a circle. Each artist is assigned one position on the circle. Similar artists are located close to each other.
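A rough sketch of these two steps, using a greedy nearest-neighbour tour as a simple stand-in for the travelling salesman algorithms of [2], and an arbitrary precomputed distance matrix:

```python
import math

def tsp_order(dist):
    """Greedy nearest-neighbour tour through a distance matrix — a simple
    stand-in for the travelling salesman algorithms of [2]."""
    n = len(dist)
    order, visited = [0], {0}
    while len(order) < n:
        last = order[-1]
        nxt = min((i for i in range(n) if i not in visited),
                  key=lambda i: dist[last][i])
        order.append(nxt)
        visited.add(nxt)
    return order

def circle_positions(order):
    """Project the path onto a circle: the k-th artist on the tour gets
    angle 2*pi*k/n, so tour neighbours end up next to each other."""
    n = len(order)
    return {artist: 2 * math.pi * k / n for k, artist in enumerate(order)}
```

Because consecutive artists on the tour are similar, mapping tour position to angle keeps similar artists close on the rainbow.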

For each artist/word pair, we compute a relevance weight on the basis of the word occurrences in the web pages [4].
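A simplified relevance score along the lines of the keyword selection idea in [4] (the exact weighting used on the poster is not given here, so this formula is an assumption): a word scores high when it is frequent for an artist and that frequency is concentrated on this artist rather than spread evenly over all artists.

```python
def relevance(counts):
    """Simplified relevance weight per artist/word pair (assumed formula).
    `counts[artist][word]` holds word occurrences from the artist's pages."""
    weights = {}
    for artist, wc in counts.items():
        total = sum(wc.values()) or 1
        weights[artist] = {}
        for word, c in wc.items():
            rel_freq = c / total  # how frequent the word is for this artist
            # how much of the word's total mass falls on this artist
            share = c / sum(counts[a].get(word, 0) for a in counts)
            weights[artist][word] = rel_freq * share
    return weights
```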

The weights are smoothed. This helps identify labels which are useful to describe not only a single artist but a region on the rainbow. The peaks are used by a heuristic to decide where to place the labels.
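One way to sketch this smoothing and peak picking, assuming a moving-average kernel and window size (both are assumptions; the weights are smoothed along the circular artist ordering, so the convolution wraps around):

```python
import numpy as np

def smooth_on_circle(weights, window=5):
    """Smooth per-artist weights for one label along the circular ordering,
    so a label reflects a region of the rainbow, not a single artist.
    The moving-average kernel and window size are assumptions."""
    kernel = np.ones(window) / window
    # wrap-around padding because the artists sit on a circle
    padded = np.concatenate([weights[-(window // 2):],
                             weights,
                             weights[:window // 2]])
    return np.convolve(padded, kernel, mode="valid")

def peaks(smoothed):
    """Indices where the smoothed weight exceeds both circular neighbours —
    one possible heuristic for deciding where to place a label."""
    n = len(smoothed)
    return [i for i in range(n)
            if smoothed[i] > smoothed[i - 1]
            and smoothed[i] > smoothed[(i + 1) % n]]
```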

The smoothed weights of the labels placed inside the rainbow define the color intensities. Red corresponds to the most frequent term, purple to the least frequent.
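A minimal sketch of such a red-to-purple mapping by frequency rank (the exact hue range is an assumption; the poster only fixes the two endpoints):

```python
import colorsys

def label_color(rank, n_labels):
    """Map a label's frequency rank to a rainbow color: rank 0 (most
    frequent) is red (hue 0), the last rank is purple (hue ~0.8).
    The 0.8 upper hue bound is an assumption."""
    hue = 0.8 * rank / max(n_labels - 1, 1)
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (round(r * 255), round(g * 255), round(b * 255))
```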

For each artist we retrieve related pages using Google [3]. The pages are parsed for words listed in one of three vocabularies and word occurrences are counted. Each vocabulary covers a different hierarchical level.
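The counting step can be sketched as follows. The three word lists below are illustrative stand-ins (the poster's actual vocabularies are not reproduced here), and plain text is assumed where real pages would need HTML parsing:

```python
import re
from collections import Counter

# Three vocabularies at different hierarchical levels — illustrative only.
VOCABULARIES = {
    "genre":      {"rock", "punk", "jazz"},
    "instrument": {"guitar", "beats", "drums"},
    "detail":     {"berlin", "acoustic"},
}

def count_occurrences(pages):
    """Count, per vocabulary, how often each listed word appears in the
    retrieved pages for one artist (plain text assumed)."""
    counts = {name: Counter() for name in VOCABULARIES}
    for page in pages:
        for token in re.findall(r"[a-z]+", page.lower()):
            for name, vocab in VOCABULARIES.items():
                if token in vocab:
                    counts[name][token] += 1
    return counts
```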

References

[1] E. Pampalk, Computational Models of Music Similarity and their Application in Music Information Retrieval, Doctoral Dissertation, Vienna University of Technology, Austria, 2006.
[2] T. Pohle, E. Pampalk, and G. Widmer, Generating Similarity-Based Playlists Using Travelling Salesman Algorithms, in Proceedings of DAFx, 2005.
[3] B. Whitman and S. Lawrence, Inferring Descriptions and Similarity for Music from Community Metadata, in Proceedings of ICMC, 2002.
[4] K. Lagus and S. Kaski, Keyword Selection Method for Characterizing Text Document Maps, in Proceedings of ICANN, 1999.