Analysis Categories Explained
MixMoose provides 24 categories to analyze your playlist tracks. If you're curious about what they all mean, this guide is for you.
Where Do These Categories Come From?
In 2014 Spotify acquired The Echo Nest (no relation to Amazon Echo or Google Nest), a Massachusetts-based company that provided music recommendations by analyzing distinct features of tracks. Their process converted a track into an image, cut it into slices, made determinations about each slice, and then used those values to figure out overall values for the track. I assume it still works roughly the same way today.
That information is still used today and provided to developers for free via Spotify's API. This is super cool, especially when you consider the trend of companies starting to charge for previously free APIs. (Thank you, Spotify!) MixMoose uses the API to show all the categories below, as well as to create some of its own meta-categories.
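If you're curious what this looks like under the hood, here is a minimal sketch of pulling those values from Spotify's Web API audio-features endpoint. The endpoint, fields, and value ranges are Spotify's (as documented at the time of writing); the helper function itself is just for illustration and isn't MixMoose's actual code.

```typescript
// Sketch: fetch Spotify's audio features for a batch of tracks.
// The /v1/audio-features endpoint and its fields are Spotify's; this helper
// and its error handling are illustrative only.
interface AudioFeatures {
  id: string;
  acousticness: number;     // 0.0-1.0 confidence the track is acoustic
  danceability: number;     // 0.0-1.0
  energy: number;           // 0.0-1.0
  instrumentalness: number; // 0.0-1.0
  liveness: number;         // 0.0-1.0
  loudness: number;         // overall loudness in dB
  speechiness: number;      // 0.0-1.0
  valence: number;          // 0.0-1.0
  tempo: number;            // beats per minute
  time_signature: number;   // estimated beats per bar
  key: number;              // pitch class 0-11 (0 = C), or -1 if undetected
  mode: number;             // 1 = major, 0 = minor
}

async function fetchAudioFeatures(
  accessToken: string,
  trackIds: string[]
): Promise<AudioFeatures[]> {
  const res = await fetch(
    `https://api.spotify.com/v1/audio-features?ids=${trackIds.join(",")}`,
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  if (!res.ok) throw new Error(`Spotify API returned ${res.status}`);
  const body = await res.json();
  return body.audio_features;
}
```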
Basic Categories
These are features that come directly from Spotify's analysis, properties stored directly on a track, or simple calculations derived from those values.
Category | Description |
---|---|
Acousticness | Measures the confidence that a track is acoustic. |
Added At | The date the track was added to the playlist. |
Bars per Minute | Combines Tempo and Time Signature to provide a more universal measure of tempo (see the sketch after this table). This is especially helpful when tracks mistakenly show an incorrect time signature because they've got a funky beat. |
Danceability | Measures how suitable a track is for dancing. This might take into account beat strength (a thumping beat being more danceable), tempo stability (a steady beat being more danceable), and overall tempo (something between 90 and 128 BPM being more danceable). |
Duration | The length of the track. |
Energy | Measures the perceptual energy of a track. An energetic track will be fast, loud, and noisy. |
Explicitness | Indicates whether the track is flagged as explicit. This appears to be self-reported, because I've heard a few tracks that weren't "explicit" but had plenty of "no-no" words. |
Instrumentalness | Measures whether the track contains no vocals. |
Key | This is the estimated overall key of the track. It combines two features from Spotify, mode and key. The mode indicates whether the track is major or minor and is shown as an upper-case or lower-case note: A = A Major, a = A Minor. (The conversion is sketched after this table.) |
Liveness | Measures the likelihood that a track was recorded live. |
Loudness | Measures the overall loudness of a track in decibels. |
Open Key Notation | The key of the track expressed in Open Key Notation. This notation makes it easy to find harmonically compatible tracks to mix into by checking whether the next track is within 1 step of the current track. So a track at 1 (C/a) would be compatible with 2 (G/e) and 12 (F/d), since 12 and 1 are adjacent. |
Popularity | The current popularity of a track according to Spotify. |
Release Date | The self-reported date the track was released. It can be wrong sometimes. |
Speechiness | Measures the presence of spoken words in a track. |
Tempo | Measures the speed of a track in beats per minute. Sometimes this can be off. For example, a very fast track can be reported with a very low tempo, or a track with an irregular beat might get an odd tempo along with an odd time signature. |
Time Signature | The number of beats in each bar. |
Valence | Measures the musical positiveness conveyed by a track. Tracks with a high value are more cheerful and bright, while those with a low value are more negative and dark. |
Year | This is the approximate year a track was released. It's usually right, but sometimes it can be crazy. |
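A couple of the derived categories above are easy to sketch in code. Bars per Minute is presumably just the tempo divided by the number of beats in each bar; the function below is my reading of it, not MixMoose's actual implementation.

```typescript
// Sketch: bars per minute, assuming it is simply tempo divided by beats per bar.
// 128 BPM in 4/4 gives 32 bars per minute, and 180 BPM in 3/4 gives 60 bars per minute,
// which is why it reads as a more "universal" measure of speed than raw BPM.
function barsPerMinute(tempo: number, timeSignature: number): number {
  return tempo / timeSignature;
}
```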
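Key and Open Key Notation are both derived from Spotify's raw key (a pitch class, 0 = C up to 11 = B) and mode (1 = major, 0 = minor) fields. The sketch below shows one way to do the conversion; the function names and mapping logic are mine, though the results should line up with the examples in the table.

```typescript
const NOTE_NAMES = ["C", "C♯", "D", "D♯", "E", "F", "F♯", "G", "G♯", "A", "A♯", "B"];

// Display key: upper-case for major, lower-case for minor (A = A Major, a = A Minor).
function displayKey(key: number, mode: number): string {
  if (key < 0) return "?"; // Spotify reports -1 when no key was detected
  const note = NOTE_NAMES[key];
  return mode === 1 ? note : note.toLowerCase();
}

// Open Key number 1-12: walk the circle of fifths so adjacent numbers are a fifth apart.
// A minor key shares the number of its relative major, e.g. 1 = C Major / A Minor.
function openKeyNumber(key: number, mode: number): number {
  const relativeMajor = mode === 1 ? key : (key + 3) % 12; // fold minors onto relative major
  return ((relativeMajor * 7) % 12) + 1;                   // 7 semitones = one fifth per step
}
```

As a sanity check, C Major (key 0, major) and A Minor (key 9, minor) both come out as 1, while G Major and E Minor both come out as 2, matching the 1 (C/a) and 2 (G/e) examples in the Open Key Notation row.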
Advanced Categories
These are more computationally intensive than the basic categories, and typically involve comparing tracks with each other.
Category | Description |
---|---|
Absolute Aggregate Change | The aggregate of the absolute differences in comparable analysis category values between adjacent tracks (see the sketch after this table). A higher value indicates that the tracks have greater differences. The categories compared are: Acousticness, BPM Compatibility, Danceability, Energy, Instrumentalness, Liveness, Open Key Notation, Speechiness, and Valence. |
Artist Genre Similarity | Compares adjacent tracks' artists' genres. The more similar the genres are, the closer to 100% the match will be. Honestly, this normally ends up at 0% because THERE ARE A LOT OF DIFFERENT GENRES. |
BPM Compatibility | Compares a track with adjacent tracks to determine how compatible their BPMs are. A track with a very near BPM, or double the BPM, is much more compatible with the track it's being compared to (see the sketch after this table). |
Exponential Aggregate Change | Behaves in the same way as Absolute Aggregate Change, except the differences between categories are weighted exponentially (also sketched after this table). This has the effect of drawing attention to tracks that have a single category with a sizable difference. |
Harmonic Compatibility | Uses the circle of fifths to determine whether the harmonic structure of a track is similar to that of the prior track. So a track in the key of D Major will show Yes if the next track is in A Major, B Minor, E Minor, F♯ Minor, or G Major (see the sketch below). |
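Here's a rough sketch of how the two aggregate-change categories could work, assuming every compared category value has already been normalised to a 0-1 scale. The exponent used for the exponential variant is a guess (squaring) to illustrate the idea; MixMoose's actual weighting may differ.

```typescript
// Values of the comparable categories for one track, each normalised to 0-1.
type ComparableValues = Record<string, number>;

// Absolute Aggregate Change: sum the absolute differences between adjacent tracks.
// A larger total means the two tracks differ more across the compared categories.
function absoluteAggregateChange(prev: ComparableValues, next: ComparableValues): number {
  return Object.keys(prev).reduce(
    (sum, category) => sum + Math.abs(next[category] - prev[category]),
    0
  );
}

// Exponential Aggregate Change: weight each difference super-linearly (squared here),
// so a single category with a big jump stands out more than several small drifts.
function exponentialAggregateChange(prev: ComparableValues, next: ComparableValues): number {
  return Object.keys(prev).reduce(
    (sum, category) => sum + (next[category] - prev[category]) ** 2,
    0
  );
}
```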
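BPM Compatibility has to allow for tracks that sit at roughly double (or half) each other's tempo, so a straight difference isn't enough. The scoring and tolerance below are assumptions of mine, shown only to make the idea concrete.

```typescript
// Sketch: score how mixable two tempos are, treating double and half time as equivalent.
// Returns 1 for a perfect match and falls toward 0 as the closest candidate drifts away.
function bpmCompatibility(bpmA: number, bpmB: number): number {
  const candidates = [bpmB, bpmB * 2, bpmB / 2];               // raw, double, and half time
  const closest = Math.min(...candidates.map((bpm) => Math.abs(bpm - bpmA)));
  const tolerance = 8;                                         // assumed "mixable" window in BPM
  return Math.max(0, 1 - closest / tolerance);
}
```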
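Finally, the Harmonic Compatibility check can be expressed with the same circle-of-fifths positions used for Open Key Notation: two keys are treated as compatible when they sit at the same position or one step apart on the wheel, regardless of major or minor. This is a sketch of that rule, not MixMoose's actual code.

```typescript
// Map a Spotify key (pitch class 0-11) and mode (1 = major, 0 = minor) to a position
// on the circle of fifths, folding each minor key onto its relative major.
function circlePosition(key: number, mode: number): number {
  const relativeMajor = mode === 1 ? key : (key + 3) % 12;
  return (relativeMajor * 7) % 12; // 7 semitones = one fifth per step around the circle
}

// Compatible when the two positions are identical or adjacent (a difference of 11 is one
// step too, since the circle wraps around). D Major therefore matches G Major, A Major,
// B Minor, E Minor, and F♯ Minor, as in the table above.
function harmonicallyCompatible(
  keyA: number, modeA: number,
  keyB: number, modeB: number
): boolean {
  const diff = Math.abs(circlePosition(keyA, modeA) - circlePosition(keyB, modeB));
  return diff <= 1 || diff === 11;
}
```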