
Analysis Categories Explained

By MixMoose
August 3, 2024

MixMoose provides 24 categories to analyze your playlist tracks. If you're curious about what they all mean, this guide is for you.

Where Do These Categories Come From?

In 2014 Spotify acquired The Echo Nest (no relation to Amazon Echo or Google Nest), a Massachusetts-based company that provided music recommendations by analyzing distinct features of tracks. Their process converted a track into an image, cut it into slices, made determinations about each slice, and then used those values to figure out overall values. I assume it still works this way today.

That information is still used today and is provided to developers for free via Spotify's API. This is super cool, especially when you consider the trend of companies starting to charge for previously free APIs. (Thank you, Spotify!) MixMoose uses the API to show all the categories below, as well as to create some of its own meta-categories.
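As a rough sketch of how an app can pull these values: Spotify's Web API exposes a per-track audio-features endpoint. The code below is my own illustration, not MixMoose's actual implementation; the track ID and OAuth bearer token are placeholders you'd supply yourself.

```python
import json
import urllib.request

# Spotify's audio-features endpoint returns acousticness, energy, tempo,
# key, mode, and the other per-track analysis values in one JSON object.
AUDIO_FEATURES_URL = "https://api.spotify.com/v1/audio-features/{track_id}"

def fetch_audio_features(track_id: str, token: str) -> dict:
    """Fetch the analysis values for one track using an OAuth bearer token."""
    req = urllib.request.Request(
        AUDIO_FEATURES_URL.format(track_id=track_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

In practice you'd batch requests (the API also accepts multiple track IDs per call) rather than fetching one track at a time.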

Basic Categories

These are the features that come directly from Spotify's analysis, properties stored directly on a track, or simple meta functions computed from those elements.

Acousticness: A confidence measure of whether a track is acoustic.

Added At: The date the track was added to the playlist.

Bars per Minute: Combines Tempo and Time Signature to provide a more universal measure of tempo. This is especially helpful when tracks mistakenly show an incorrect time signature because they have a funky beat.

Danceability: Measures how suitable a track is for dancing. This takes into account elements such as beat strength (a thumping beat is more danceable), tempo stability (a steady beat is more danceable), and overall tempo (something between 90 and 128 BPM is more danceable).

Duration: The length of the track.

Energy: Measures the perceptual energy of a track. An energetic track is fast, loud, and noisy.

Explicitness: An indicator of whether the track is explicit. This appears to be self-reported, because I've heard a few tracks that weren't marked "explicit" but had plenty of "no-no" words.

Instrumentalness: Measures whether the track contains no vocals.

Key: The estimated overall key of the track. This combines two features from Spotify, mode and key. The mode indicates whether the track is major or minor and is represented by an upper-case or lower-case note: A = A Major, a = A Minor.

Liveness: Measures the likelihood that a track was recorded live.

Loudness: Measures the overall loudness of a track in decibels.

Open Key Notation: The key of the track expressed in Open Key Notation. This notation makes it easy to find harmonically compatible tracks to mix into by checking whether the next track is within 1 step of the current track. So a track at 1 (C/a) would be compatible with 2 (G/e) and 12 (F/d) (since 12 and 1 are adjacent).

Popularity: The current popularity of the track according to Spotify.

Release Date: The self-reported date the track was released. It can sometimes be wrong.

Speechiness: Measures the presence of spoken words in a track.

Tempo: Measures the speed of a track in beats per minute. Sometimes this can be off. For example, a very fast track can show a very low tempo, or a track with an irregular beat might show an odd tempo along with an odd time signature.

Time Signature: The number of beats in each bar.

Valence: Measures the musical positiveness conveyed by a track. Tracks with a high value are more cheerful and bright, while those with a low value are more negative and dark.

Year: The approximate year a track was released. It's usually right, but sometimes it can be way off.

Advanced Categories

These are more computationally intensive than the basic categories, and typically involve comparing tracks with each other.

Absolute Aggregate Change: The sum of the absolute differences in comparable analysis category values between adjacent tracks. A higher value indicates that the tracks have greater differences. The categories compared are: Acousticness, BPM Compatibility, Danceability, Energy, Instrumentalness, Liveness, Open Key Notation, Speechiness, and Valence.

Artist Genre Similarity: Compares adjacent tracks' artists' genres. The more similar the genres, the closer the match is to 100%. Honestly, this normally ends up at 0% because THERE ARE A LOT OF DIFFERENT GENRES.

BPM Compatibility: Compares a track with adjacent tracks to determine how compatible their BPMs are. A track with a very close BPM, or double the BPM, is much more compatible with the track it's being compared to.

Exponential Aggregate Change: Behaves in the same way as Absolute Aggregate Change, except the differences between categories are weighted exponentially. This has the effect of drawing attention to tracks where a single category has a sizable difference.

Harmonic Compatibility: Uses the circle of fifths to determine whether the harmonic structure of a track is similar to that of the prior track. So a track in the key of D Major will show Yes if the next track is in A Major, B Minor, E Minor, F♯ Minor, or G Major.
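The two aggregate-change categories can be sketched as simple sums over a pair of adjacent tracks' category values. This is my own guess at the shape of the calculation, not MixMoose's real formula, with squaring standing in for the unspecified exponential weighting:

```python
def absolute_aggregate_change(track_a: dict, track_b: dict, categories: list) -> float:
    # Sum the absolute difference of each compared category between two adjacent tracks.
    return sum(abs(track_a[c] - track_b[c]) for c in categories)

def exponential_aggregate_change(track_a: dict, track_b: dict, categories: list,
                                 power: float = 2.0) -> float:
    # Raising each difference to a power > 1 amplifies any single large gap,
    # drawing attention to tracks where one category changes a lot.
    return sum(abs(track_a[c] - track_b[c]) ** power for c in categories)

a = {"energy": 0.8, "valence": 0.2}
b = {"energy": 0.5, "valence": 0.6}
# Absolute: 0.3 + 0.4 = 0.7; exponential (squared): 0.09 + 0.16 = 0.25.
```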
© MixMoose 2024