The future music platform
Brian Whitman (@bwhitman), co-Founder & CTO, The Echo Nest


Brian Whitman of The Echo Nest discusses the music platform and argues that new startups and app developers are the future of the music industry.


Page 1: The future music platform

The future music platform

Brian Whitman @bwhitman

co-Founder & CTO, The Echo Nest

Page 2: The future music platform
Page 3: The future music platform

No

Page 4: The future music platform

If by “fail” you mean “become the future of the industry”, OK.

Page 5: The future music platform

Hello! I “teach computers to listen to music”

Page 6: The future music platform

“Fish / Cut bait”

Handheld-music (1998-2001)

I’m an often-frustrated musician & technology person

Page 7: The future music platform

Making something from everything

Page 8: The future music platform

Hit prediction

Taste / recommendation

Automated composition

Classification, tagging

Feature extraction (key, tempo)

Search & retrieval

creepy

Music retrieval and “music intelligence” applications

Page 9: The future music platform

Retrieving Music by Rhythmic Similarity

…musical excerpt. (The effect of varying the truncated regions was not examined, and it is not unlikely that other values may result in better retrieval performance.)

4.1.1 Euclidean Distance

Three different distance measures were used. The first was a straightforward squared Euclidean distance measure, or the sum of the squares of the element-by-element differences between the values, as used in Experiment 1. For evaluation, each excerpt was used as a query. Each of the 15 corpus documents was then ranked by similarity to each of the 15 queries using the squared Euclidean distance. (For the purposes of ranking, the squared distance serves as well as the distance, as the square root function is monotonic.) Each query had 2 relevant documents in the corpus, so this was chosen as the cutoff point for measuring retrieval precision. Thus there were 30 relevant documents for this query set. For each query, documents were ranked by increasing Euclidean distance from the query. Using this measure, 24 of the 30 possible documents were relevant (i.e. from the same relevance class), giving a retrieval precision of 80%. (More sophisticated analyses, such as ROC curves, are probably not warranted due to the small corpus size.)
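The rank-by-distance evaluation described above can be sketched as follows. The corpus, relevance classes, and feature vectors here are invented toy data for illustration, not values from the experiment:

```python
# Rank corpus items by squared Euclidean distance to a query and
# measure precision at a cutoff, as in the procedure described above.

def squared_euclidean(a, b):
    # Sum of squared element-by-element differences; the square root is
    # unnecessary for ranking because sqrt is monotonic.
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Hypothetical corpus: (doc_id, relevance_class, feature_vector).
corpus = [
    ("doc1", "waltz", [1.0, 0.2, 0.1]),
    ("doc2", "waltz", [0.9, 0.3, 0.2]),
    ("doc3", "tango", [0.1, 1.0, 0.8]),
    ("doc4", "tango", [0.2, 0.9, 0.7]),
]

def precision_at_k(query_vec, query_class, k=2):
    # Sort by increasing distance, keep the top k, count same-class hits.
    ranked = sorted(corpus, key=lambda d: squared_euclidean(query_vec, d[2]))
    top = ranked[:k]
    return sum(1 for _, cls, _ in top if cls == query_class) / k

print(precision_at_k([1.0, 0.25, 0.15], "waltz"))  # → 1.0
```

With two relevant documents per query, k=2 is the natural cutoff, exactly as in the experiment above.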

4.1.2 Cosine Distance

The second measure used is a cosine metric, similar to that described in the previous section. This distance measure may be preferable because it is less sensitive to the actual magnitudes of the vectors involved. This measure proved to perform significantly better than the Euclidean distance. Using this measure, 29 of the 30 documents retrieved were relevant, giving a retrieval precision of 96.7% at this cutoff.
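A minimal sketch of the cosine distance, illustrating the magnitude-insensitivity noted above; the two example vectors are invented:

```python
import math

def cosine_distance(a, b):
    # 1 minus cosine similarity. Because both vectors are normalized by
    # their own magnitudes, only their direction (relative shape) matters.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

quiet = [0.1, 0.2, 0.4]
loud = [1.0, 2.0, 4.0]  # same shape, ten times the magnitude
print(cosine_distance(quiet, loud))  # ~0.0: direction matches exactly
```

Two beat spectra with the same rhythmic shape but different overall energy are therefore treated as near-identical, which is the property that helps here.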

4.1.3 Fourier Beat Spectral Coefficients

The final distance measure is based on the Fourier coefficients of the beat spectrum, because they can represent the rough spectral shape with many fewer parameters. A more compact representation is valuable for a number of reasons: for example, fewer elements speeds distance comparisons and also reduces the amount of data that must be stored to represent each file. To this effect, the fast Fourier transform was computed for each beat spectral vector. The log of the magnitude was then determined, and the mean subtracted from each coefficient. Because high “frequencies” in the beat spectra are not rhythmically significant, the transform results were truncated to the 25 lowest coefficients. Additionally, the zeroth coefficient was ignored, as the DC component is insignificant for zero-mean data. The cosine distance metric was computed for the 24 zero-mean Fourier coefficients, which served as the final distance metric. Experimentally, this measure performed identically to the cosine metric, yielding 29 of 30 relevant documents or 96.7% precision. Note that this performance was achieved using an order of magnitude fewer parameters.

Though this corpus is admittedly very small, there is no reason that the methods presented here could not be scaled to thousands or even millions of works. Computing the beat spectrum is computationally quite reasonable and can be done several times faster than real time, and even more rapidly if spectral parameters can be derived directly from MP3 compressed data as in [12] and [13]. Additionally, well-known database organization methods can dramatically …
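The coefficient pipeline described above (transform, log magnitude, mean subtraction, truncation, dropping the DC term) could be sketched like this; a naive O(n²) transform and a synthetic beat spectrum are used purely for illustration:

```python
import cmath
import math

def fourier_beat_coeffs(beat_spectrum, n_keep=25):
    # Naive DFT of the beat spectrum (an FFT would be used in practice;
    # the O(n^2) form is kept here for clarity).
    n = len(beat_spectrum)
    mags = []
    for k in range(n):
        s = sum(beat_spectrum[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    # Log magnitude, then subtract the mean from each coefficient.
    logs = [math.log(m + 1e-12) for m in mags]  # epsilon guards log(0)
    mean = sum(logs) / len(logs)
    zero_mean = [v - mean for v in logs]
    # Keep only the lowest n_keep coefficients, skipping the DC term
    # (index 0), which is uninformative for zero-mean data.
    return zero_mean[1:n_keep + 1]

# Synthetic beat spectrum with a strong periodicity every 8 lags.
spectrum = [1.0 if t % 8 == 0 else 0.2 for t in range(64)]
coeffs = fourier_beat_coeffs(spectrum)
print(len(coeffs))  # 25
```

The compact 25-element vector would then be compared with the cosine metric from the previous subsection.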

[Figure 5. Euclidean Distance vs. Tempo: squared Euclidean distance (y-axis, 0 to 1.5) plotted against tempo in bpm (x-axis, 110 to 130).]


4 · Elias Pampalk


Fig. 2. Islands of Music representing a 7x7 SOM with 77 pieces of music. The artists and full titles of the pieces, which are represented by the short identifiers here, can be found on the web page or in the thesis.

5. CONCLUSION

The Islands of Music have not yet reached a level which would suggest their commercial usage; however, they demonstrate the possibility of such systems and serve well as a tool for further research. Any part of the system can be modified or replaced, and the resulting effects can easily be evaluated using the graphical user interface. Furthermore, the feature extraction technique and the visualization technique developed for this thesis can both be applied separately to a broad range of applications.


Fig. IV GenreGram

5. USER INTERFACES

Two new graphical user interfaces for browsing and interacting with collections of audio signals have been developed (Figures IV, V). They are based on the extracted feature vectors and the automatic genre classification results.

• GenreGram is a dynamic real-time audio display for showing automatic genre classification results. Each genre is represented as a cylinder that moves up and down in real time based on a classification confidence measure ranging from 0.0 to 1.0. Each cylinder is texture-mapped with a representative image of each category. In addition to being a nice demonstration of automatic real-time audio classification, the GenreGram gives valuable feedback both to the user and the algorithm designer. Different classification decisions and their relative strengths are combined visually, revealing correlations and classification patterns. Since the boundaries between musical genres are fuzzy, a display like this is more informative than a single classification decision. For example, most of the time a rap song will trigger Male Voice, Sports Announcing and HipHop. This exact case is shown in Figure IV.

• GenreSpace is a tool for visualizing large sound collections for browsing. Each audio file is represented as a single point in a 3D space. Principal Component Analysis (PCA) [16] is used to reduce the dimensionality of the feature vector representing the file to the 3-dimensional feature vector corresponding to the point coordinates. Coloring of the points is based on the automatic genre classification. The user can zoom, rotate and scale the space to interact with the data. The GenreSpace also represents the relative similarity within genres by the distance between points. A principal curve [17] can be used to move sequentially through the points in a way that preserves the local clustering information.

Fig. V GenreSpace
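One way to sketch the PCA reduction behind a GenreSpace-style view, assuming nothing about the actual MARSYAS implementation: plain power iteration with deflation on invented 4-dimensional feature vectors, projected to 3 coordinates.

```python
import random

def pca_3d(vectors, iters=100, seed=0):
    # Project feature vectors onto their top 3 principal components,
    # found by power iteration on the covariance matrix with deflation.
    rng = random.Random(seed)
    n, d = len(vectors), len(vectors[0])
    means = [sum(v[j] for v in vectors) / n for j in range(d)]
    centered = [[v[j] - means[j] for j in range(d)] for v in vectors]
    cov = [[sum(row[i] * row[j] for row in centered) / n for j in range(d)]
           for i in range(d)]
    components = []
    for _ in range(3):
        w = [rng.gauss(0, 1) for _ in range(d)]
        for _ in range(iters):
            w = [sum(cov[i][j] * w[j] for j in range(d)) for i in range(d)]
            norm = sum(x * x for x in w) ** 0.5 or 1.0
            w = [x / norm for x in w]
        components.append(w)
        # Deflate: subtract the found component's contribution.
        lam = sum(w[i] * sum(cov[i][j] * w[j] for j in range(d))
                  for i in range(d))
        for i in range(d):
            for j in range(d):
                cov[i][j] -= lam * w[i] * w[j]
    # 3D coordinates of each file in the reduced space.
    return [[sum(c[j] * row[j] for j in range(d)) for c in components]
            for row in centered]

# Hypothetical 4-D feature vectors reduced to 3-D point coordinates.
features = [[float(i), 2.0 * i, 5.0 - i, float(i % 3)] for i in range(10)]
points = pca_3d(features)
print(len(points), len(points[0]))  # 10 3
```

A production system would use a library eigendecomposition instead, but the projection step (dot product of each centered vector with each component) is the same.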

6. FUTURE WORK

An obvious direction for future research is to expand the genre hierarchy both in width and depth. The combination of segmentation [20] with automatic genre classification could provide a way to browse audio to locate regions of interest. Another interesting direction is the combination of the graphical user interfaces described with automatic similarity retrieval that takes into account the automatic genre classification. In its current form the beat analysis algorithm cannot be performed in real time, as it needs to collect information from the whole signal. A real-time version of beat analysis is planned for the future. It is our belief that more rhythmic information can be extracted from audio signals, and we plan to investigate the ability of the beat analysis to detect rhythmic structure in synthetic stimuli.

7. SUMMARY

A feature set for representing music surface and rhythm information was proposed and used to build automatic genre classification algorithms. The performance of the proposed data set was evaluated by training statistical pattern recognition classifiers on real-world data sets. Two new graphical user interfaces based on the extracted feature set and the automatic genre classification were developed.

The software used for this paper is available as part of MARSYAS [19], a software framework for rapid development of computer audition applications written in C++ and Java. It is available as free software under the GNU Public License (GPL) at:

http://www.cs.princeton.edu/~gtzan/marsyas.html

8. ACKNOWLEDGEMENTS

This work was funded under NSF grant 9984087 and by gifts from Intel and the Arial Foundation. Douglas Turnbull helped with the implementation of the GenreGram.

Fig. I Beat Histograms for Classical (left) and Pop (right)

The following features based on the “beat” histogram are used:

1. Period0: Periodicity in bpm of the first peak.

2. Amplitude0: Relative amplitude (divided by the sum of amplitudes) of the first peak.

3. RatioPeriod1: Ratio of the periodicity of the second peak to the periodicity of the first peak.

4. Amplitude1: Relative amplitude of second peak.

5. RatioPeriod2, Amplitude2, RatioPeriod3, Amplitude3: defined analogously for the third and fourth peaks.

These features represent the strength of beat (“beatedness”) of the signal and the relations between the prominent periodicities of the signal. This feature vector carries more information than traditional beat tracking systems [9, 10], where a single measure of the beat corresponding to the tempo and its strength are used.
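A rough sketch of how the eight rhythm features listed above could be read off a beat histogram; the histogram bins and the simple peak-picking rule here are invented for illustration and are not the paper's exact procedure:

```python
def beat_histogram_features(histogram):
    # histogram: list of (bpm, amplitude) bins (hypothetical format).
    # Returns [Period0, Amplitude0, RatioPeriod1, Amplitude1,
    #          RatioPeriod2, Amplitude2, RatioPeriod3, Amplitude3].
    total = sum(amp for _, amp in histogram)
    # Take the four strongest bins as the "peaks" (a real implementation
    # would pick local maxima of the histogram instead).
    peaks = sorted(histogram, key=lambda p: p[1], reverse=True)[:4]
    period0, amp0 = peaks[0]
    feats = [period0, amp0 / total]
    for bpm, amp in peaks[1:]:
        feats.append(bpm / period0)  # RatioPeriodN: ratio to first peak
        feats.append(amp / total)    # AmplitudeN: relative amplitude
    return feats  # the 8-dimensional rhythm feature vector

# Invented histogram with a dominant peak at 80 bpm.
hist = [(60, 10), (80, 50), (120, 20), (160, 15), (200, 5)]
print(beat_histogram_features(hist)[:2])  # [80, 0.5]
```

The resulting 8 values are what gets concatenated with the 9 musical-surface features below.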

Figure I shows the “beat” histograms of two classical music pieces and two modern pop music pieces. The fewer and stronger peaks of the two pop music histograms indicate the strong presence of a regular beat, unlike the distributed weaker peaks of classical music. The 8-dimensional feature vector used to represent rhythmic structure and strength is combined with the 9-dimensional musical surface feature vector to form a 17-dimensional feature vector that is used for automatic genre classification.

4. CLASSIFICATION

To evaluate the performance of the proposed feature set, statistical pattern recognition classifiers were trained and evaluated using data sets collected from radio, compact disks and the Web. Figure II shows the classification hierarchy used for the experiments. For each node in the tree of Figure II, a Gaussian classifier was trained using a dataset of 50 samples (each 30 seconds long). Using the Gaussian classifier, each class is represented as a single multidimensional Gaussian distribution with parameters estimated from the training dataset [15]. The full data collection consists of 15 * 50 * 30 = 22,500 seconds = 6.25 hours of audio.
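A minimal sketch of a per-class Gaussian classifier in the spirit described above; for brevity it uses a diagonal covariance rather than the full multidimensional Gaussian, and the toy features and genre labels are invented:

```python
import math

class DiagonalGaussianClassifier:
    # One Gaussian per class, parameters estimated from training data.
    # Diagonal covariance is a simplification of the full-covariance
    # classifier described in the text.
    def fit(self, X, y):
        self.params = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            d = len(rows[0])
            mu = [sum(r[j] for r in rows) / len(rows) for j in range(d)]
            var = [max(sum((r[j] - mu[j]) ** 2 for r in rows) / len(rows),
                       1e-6)  # floor avoids zero variance
                   for j in range(d)]
            self.params[label] = (mu, var)
        return self

    def predict(self, x):
        def log_likelihood(mu, var):
            return sum(-0.5 * (math.log(2 * math.pi * v) + (xi - m) ** 2 / v)
                       for xi, m, v in zip(x, mu, var))
        return max(self.params,
                   key=lambda lab: log_likelihood(*self.params[lab]))

# Invented 2-D features for two hypothetical genres.
X = [[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]]
y = ["classical", "classical", "rock", "rock"]
clf = DiagonalGaussianClassifier().fit(X, y)
print(clf.predict([0.15, 0.15]))  # classical
```

In the experiments above, one such classifier sits at each node of the Figure II hierarchy, trained on 50 thirty-second samples per class.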

For the Musical Genres (Classical, Country, …) the combined feature set described in this paper was used. For the Classical Genres (Orchestra, Piano, …) only the Music Surface features were used, and for the Speech Genres (MaleVoice, FemaleVoice, …) mel-frequency cepstral coefficients [16] (MFCC) were used. MFCC are features commonly used in speech recognition research.

Music: Classical, Country, Disco, HipHop, Jazz, Rock (Classical subdivides into Orchestra, Piano, Choir, String Quartet)

Speech: MaleVoice, FemaleVoice, SportsAnnouncing

Fig. II Genre Classification Hierarchy

Table 1. Classification accuracy percentage results

             MusicSpeech   Genres   Voices   Classical
Random       50            16       33       25
Gaussian     14            57       74       61

Fig. III Relative importance of feature sets for genre classification

Table 1 summarizes the classification results as percentages of classification accuracy. In all cases the results are significantly better than random classification. These classification results are calculated using a 10-fold evaluation strategy where the evaluation data set is randomly partitioned so that 10% is used for testing and 90% for training. The process is iterated with different random partitions and the results are averaged (in the evaluation of Table 1, one hundred iterations were used).

Figure III shows the relative importance of the “musical surface” and “rhythm” feature sets for the automatic genre classification. As expected, both feature sets perform better than random and their combination improves the classification accuracy.
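The evaluation procedure above (repeated random 90/10 splits, results averaged) can be sketched generically; this assumes any classifier object with `fit`/`predict` and is an illustration, not the paper's harness:

```python
import numpy as np

def repeated_holdout(clf_factory, X, y, test_frac=0.1, iters=100, seed=0):
    """Average accuracy over `iters` random train/test partitions,
    holding out `test_frac` of the data each time."""
    rng = np.random.RandomState(seed)
    accs = []
    for _ in range(iters):
        idx = rng.permutation(len(y))
        n_test = max(1, int(len(y) * test_frac))
        test, train = idx[:n_test], idx[n_test:]
        clf = clf_factory().fit(X[train], y[train])
        pred = np.asarray(clf.predict(X[test]))
        accs.append(np.mean(pred == y[test]))
    return float(np.mean(accs))
```

Averaging over many random partitions reduces the variance that any single 90/10 split would leave in the reported accuracy.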

[Bar chart: classification accuracy (0–60%) for the Random, Rhythm, MusicSurface, and Full feature sets.]

Computers “helping” you discover music

Page 10: The future music platform

Columbia University, NYC

MIT Media Lab, finishing my dissertation

Academic lifestyle

Page 11: The future music platform

Always do it automatically, on everything

Page 12: The future music platform

Started a company - are you sure you want to do this?

Page 13: The future music platform

The Echo Nest 2010

Somerville, MA USA
30 people
200 computers
Lots of products
5,000,000,000 documents
1,200,000 artists
20,000,000 songs

Where we’ve gotten in the past few years

Page 14: The future music platform

Good news! We’re on the right track

Page 15: The future music platform

We are a music application development platform that knows more about music content and music consumers than anyone.

Page 16: The future music platform

The platform

Page 17: The future music platform

You’re a music fan. How do you get to music?Or inversely, you’re a musician. How do you get to your fans?

Page 18: The future music platform

Only 10 years ago - pretty terrible. Buy a thing or borrow it.

Page 19: The future music platform

5 years ago - a little better. Easier to self publish and discover.

Page 20: The future music platform

Today. The choices for distribution are immense.

Page 21: The future music platform

App developers are now the future of the music business.

Page 22: The future music platform

Good news: thousands of developers working on music.
Bad news: most have been locked out of the business.

Page 23: The future music platform

Until now!

Page 24: The future music platform

Music platform

Content Engineering Audience

What a music platform gives you

Page 25: The future music platform

Music platform

Content Engineering Audience

Who can do what

Page 26: The future music platform

The power of APIs

Page 27: The future music platform

This is the Echo Nest’s business!

Page 28: The future music platform

Similarity

Acoustic analysis
Artist metrics

Feeds

Recommendation
Search / Tags

Metadata

Predictive analytics

Content / Streaming

So many possibilities all built into the platform

Fingerprint
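All of these capabilities sit behind a plain HTTP + JSON developer API. Here is a hedged sketch of what an artist-similarity call could look like; the base URL, endpoint name, parameters, and response shape are illustrative assumptions rather than guaranteed API details, and the key is a placeholder:

```python
import json
from urllib.parse import urlencode

BASE = "http://developer.echonest.com/api/v4"  # illustrative base URL

def similar_artists_url(name, api_key, results=5):
    """Build a request URL for an artist-similarity query
    (endpoint and parameter names are assumptions)."""
    qs = urlencode({"api_key": api_key, "name": name,
                    "results": results, "format": "json"})
    return f"{BASE}/artist/similar?{qs}"

def parse_similar(body):
    """Pull artist names out of a JSON response body (shape assumed)."""
    data = json.loads(body)
    return [a["name"] for a in data["response"]["artists"]]

# a canned response stands in for a live HTTP call
canned = '{"response": {"artists": [{"name": "Radiohead"}, {"name": "Blur"}]}}'
```

The point is the shape of the thing: one GET request and a little JSON parsing is all an app needs to tap similarity, recommendation, or any of the other capabilities listed above.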

Page 29: The future music platform

vibrant, active developer community

developer API

discovery
playlisting

mobile
music games

music visualizers
remix apps
analytics

targeted marketing
branded entertainment

100s of apps to date

Not just independents-- the power is from partnerships

Page 30: The future music platform

Content / Streaming

Where can I get audio from?

Page 31: The future music platform

Probably shouldn’t play YouTube videos

Content / Streaming

Page 32: The future music platform

Or stream from blogs

Content / Streaming

Page 33: The future music platform

Content / Streaming

But the distributors & labels are getting there!

Page 34: The future music platform

Content / Streaming

They believe in this as much as we do.

Page 35: The future music platform

Artist
Mood
Key

Tempo
Popularity
Familiarity
Location

This Pandora competitor was built in 24 hours by one developer. It has 10 million songs available.
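Under the hood, a station like this is mostly attribute filtering over a big catalog. A toy sketch, where the song data is made up and the attribute names simply mirror the filters above:

```python
def build_station(songs, min_tempo=None, max_tempo=None, key=None, mood=None):
    """Return songs matching simple attribute constraints
    (a toy playlister; attribute names are illustrative)."""
    out = []
    for s in songs:
        if min_tempo is not None and s["tempo"] < min_tempo:
            continue
        if max_tempo is not None and s["tempo"] > max_tempo:
            continue
        if key is not None and s["key"] != key:
            continue
        if mood is not None and mood not in s["moods"]:
            continue
        out.append(s)
    return out

# a three-song stand-in for a ten-million-song catalog
catalog = [
    {"title": "A", "tempo": 124, "key": "C", "moods": {"upbeat"}},
    {"title": "B", "tempo": 88,  "key": "G", "moods": {"mellow"}},
    {"title": "C", "tempo": 128, "key": "C", "moods": {"upbeat", "dancey"}},
]
```

Swap the in-memory list for an indexed store and the same constraints become a station that one developer really can stand up in a day.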

Page 36: The future music platform

An employee built this algorithm during a Music Hack Day. Within a week there were 3 competing web apps for it.

Page 37: The future music platform

audiokicker is an upcoming iPhone app for social music

Page 38: The future music platform

audiokicker uses Echo Nest for recommendationand 7digital for content

Page 39: The future music platform

Echo Nest client: streaming service Thumbplay

Page 40: The future music platform

All the potential is a bit baffling. This very inspired person is building “Frasier”-themed experiences.

http://music.joshmillard.com/2010/06/04/nine-inch-niles-the-seattleward-spiral/

Page 41: The future music platform

There are hundreds of apps available on the platform already, from big companies to one-person ideas.

Page 42: The future music platform

Someone in this room is going to invent the waymusic is experienced in the future.

Page 43: The future music platform
Page 44: The future music platform