> It's the music discovery engine,

Didn't the Music Genome Project / Pandora do this half a decade before Spotify?

That's a great question - those approaches were based on acoustic similarity. Later approaches, including the Echo Nest's, added social features.

An example I used to use with Brian was that Bad Religion and NOFX sound similar. However, fans of one of those bands often weren't fans of the other. In this model, people have a need for a certain amount of that sound plus a social connection, NOT a need for more and more and more of that sound.

Another difference is that Pandora used human music experts, whereas the Echo Nest used machine listening. They have some overlap and some differences, but obviously the machine listening is offline.
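
To make the distinction concrete, here's a rough Python sketch (purely illustrative - the band features, listener IDs, and numbers are made up, and this isn't anything from Pandora or the Echo Nest): acoustic similarity between two bands can be near 1.0 while their fan overlap stays small.

  # Illustrative only: two bands can sit next to each other in acoustic
  # feature space while sharing relatively few listeners. All values are
  # made up for the example.
  from math import sqrt

  def cosine(a, b):
      # content-based similarity over acoustic feature vectors
      dot = sum(x * y for x, y in zip(a, b))
      return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

  def jaccard(a, b):
      # taste/social similarity: overlap between listener sets
      return len(a & b) / len(a | b)

  # hypothetical acoustic features: [tempo, distortion, vocal_grit, energy]
  acoustic = {
      "Bad Religion": [0.9, 0.8, 0.7, 0.9],
      "NOFX":         [0.9, 0.8, 0.6, 0.9],
  }

  # hypothetical listener IDs for each band
  fans = {
      "Bad Religion": {1, 2, 3, 4, 5, 6},
      "NOFX":         {5, 6, 7, 8, 9, 10},
  }

  print(cosine(acoustic["Bad Religion"], acoustic["NOFX"]))  # ~1.0: they sound alike
  print(jaccard(fans["Bad Religion"], fans["NOFX"]))         # 0.2: few shared fans

A pure acoustic-similarity model only looks at the first number; bringing in the social signal means the second one gets a say too.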
