AudioRadar provides a map of songs organized by their sound and similarities. Using algorithms developed by acoustics researchers over the years, it scans a music collection and measures each song's qualities: tempo, chordal shifts, volume, harmony, and so on. It then weights the songs along four key axes: fast or slow, melodic or rhythmic, turbulent or calm, and rough or clean. (Turbulence measures the abruptness of shifts; "rough" reflects the number of shifts.)
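The reduction from raw measurements to the four axes can be sketched as follows. The function and parameter names here are illustrative assumptions, not AudioRadar's published pipeline; the raw qualities are assumed to be pre-normalized to the range [0, 1].

```python
def four_axes(tempo, shift_abruptness, shift_count, harmonicity):
    """Collapse raw acoustic measurements onto four similarity axes.

    Each input is assumed normalized to [0, 1]. Each returned value is a
    position between the two poles of one axis (0 = first pole, 1 = second).
    The exact mapping is a sketch; AudioRadar's real weighting may differ.
    """
    return {
        "fast_slow": tempo,                 # higher tempo -> nearer the "fast" pole
        "melodic_rhythmic": harmonicity,    # more harmonic content -> "melodic"
        "turbulent_calm": shift_abruptness, # more abrupt shifts -> "turbulent"
        "rough_clean": shift_count,         # more shifts -> "rough"
    }
```

A song's position on the map is then determined entirely by these four numbers, so two songs with similar vectors are treated as similar regardless of artist or genre tags.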
Based on these metrics, the application creates a map in which a chosen song appears at the center of the screen, with similar songs clustered in a circle around it -- sort of like points of light on a radar screen. Users can then gauge, for instance, the "calmness" or "cleanness" of another music choice by its relative position on the map. Distances are scaled: a song at the circle's outer edge would be twice as calm as one halfway to the center. The cluster rearranges itself around each newly selected song, so users can surf their collections without needing to remember every song they own. They can build mood-based playlists or let the program select the next most similar song.
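The radar layout described above can be sketched as a simple radial placement: each song's distance from the center is proportional to its dissimilarity from the chosen song, here taken as Euclidean distance over the four axes. The even angular spacing is an assumption for illustration; the actual AudioRadar layout may assign angles differently.

```python
import math

AXES = ("fast_slow", "melodic_rhythmic", "turbulent_calm", "rough_clean")

def radar_layout(center, others, scale=100.0):
    """Place songs around a center song on a radar-style map.

    `center` and each entry of `others` are dicts mapping the four axis
    names to values in [0, 1]. Radius is proportional to dissimilarity,
    so a song twice as far from the center is twice as dissimilar.
    Returns a list of (x, y) screen coordinates, one per song in `others`.
    """
    n = max(len(others), 1)
    placed = []
    for i, song in enumerate(others):
        # Euclidean dissimilarity over the four axes (a sketch, not the
        # published metric).
        d = math.sqrt(sum((song[a] - center[a]) ** 2 for a in AXES))
        theta = 2 * math.pi * i / n  # spread songs evenly around the circle
        placed.append((d * scale * math.cos(theta), d * scale * math.sin(theta)))
    return placed
```

Re-running the layout with a newly selected song as `center` reproduces the rearranging behavior described above, and picking the song with the smallest radius gives a "next most similar song" selection.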
Collections of digitally stored music are mostly organized through playlists based on artist names and song titles. Music genres are inherently ambiguous and, to make matters worse, assigned manually by a diverse user community. People, however, tend to organize music by its similarity to other music and by its emotional qualities. Taking this into account, we have designed a music player that derives a set of criteria from the actual music data and then provides a coherent visual metaphor for similarity-based navigation of a music collection.