Spotify extension for Scratch

In this post, I want to share a new Scratch extension that I made this week, explain what it does, and suggest a few ideas for the sorts of ways that it could be used.

Overview

The extension makes some of the data from the Spotify Audio Features API available as blocks in Scratch.

This means you can bring numeric values representing different characteristics of songs directly into a Scratch project.
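Under the hood, the blocks are wrappers around Spotify's Web API: a search call to find the track, and the Audio Features endpoint for the numbers. As a rough illustration of what that raw data looks like, here is a minimal Python sketch (not the extension's actual code); it assumes you already have a Spotify OAuth access token.

```python
import requests

API = "https://api.spotify.com/v1"
HEADERS = {"Authorization": "Bearer YOUR_ACCESS_TOKEN"}  # assumption: an OAuth token you already have

# Search for a track (what the "search for" block does)
search = requests.get(API + "/search",
                      params={"q": "get lucky daft punk", "type": "track", "limit": 1},
                      headers=HEADERS).json()
track = search["tracks"]["items"][0]

# Fetch the audio features that the other blocks report
features = requests.get(API + "/audio-features/" + track["id"], headers=HEADERS).json()

print(track["name"])
print("danceability:", features["danceability"])   # 0.0 - 1.0
print("tempo:", features["tempo"])                 # beats per minute
print("key:", features["key"])                     # pitch class number, -1 if unknown
```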


The blocks

These descriptions are adapted from the Spotify API docs.

search for

This searches Spotify’s library for a track. You can put artist and title in any order, and it’s fairly tolerant of spelling mistakes.

play preview

This plays a 30-second preview of the last track that you searched for. The block will stay active until the song preview has completed, so any blocks attached to the bottom of this block will be run after the preview has finished playing.

acousticness score

This returns a confidence score for whether the track is acoustic. The score is between 0.0 and 1.0, with 1.0 representing high confidence that the track is acoustic.

danceability

This describes how suitable a track is for dancing, based on a combination of musical elements including tempo, rhythm stability, beat strength, and overall regularity. The score is between 0.0 and 1.0, with 0.0 being least danceable and 1.0 most danceable.

duration

The length of the full track (not the preview clip) in seconds.

energy

This returns a perceptual measure of intensity and activity, between 0.0 and 1.0.

Typically, energetic tracks feel fast, loud, and noisy. For example, death metal has high energy, while a Bach prelude scores low on the scale. Perceptual features contributing to this attribute include dynamic range, perceived loudness, timbre, onset rate, and general entropy.

instrumentalness

This describes whether the track has any vocals. The score is between 0.0 and 1.0.

“Ooh” and “aah” sounds are treated as instrumental in this context. Rap or spoken word tracks are clearly “vocal”. The closer the instrumentalness value is to 1.0, the greater likelihood the track contains no vocal content. Values above 0.5 are intended to represent instrumental tracks, but confidence is higher as the value approaches 1.0.

key (code)

This returns a number representing the key the track is in, using pitch class notation (0 = C, 1 = C#, 2 = D, and so on up to 11 = B). If no key was detected, the value is -1.

key (name)

This returns a string with the name of the key that the track is in. Names are described using sharps rather than flats (e.g. B♭ will be returned as A#).
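For reference, here is the same mapping written out as a minimal Python sketch (the Scratch block does this for you):

```python
# Pitch class notation: 0 = C, 1 = C#, ..., 11 = B; -1 means no key was detected.
PITCH_CLASSES = ["C", "C#", "D", "D#", "E", "F",
                 "F#", "G", "G#", "A", "A#", "B"]

def key_name(key_code):
    if key_code == -1:
        return "no key detected"
    return PITCH_CLASSES[key_code]

print(key_name(10))  # "A#" - the same note as B flat, reported with a sharp
```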

liveness

This returns a prediction of whether the track was performed live. This is done by detecting the presence of an audience in the recording. The value is between 0.0 and 1.0, with higher values representing a higher probability that the track was live.

loudness

This returns the overall loudness of the track in decibels.

Loudness values are averaged across the entire track and are useful for comparing the relative loudness of tracks. Loudness is the quality of a sound that is the primary psychological correlate of physical strength (amplitude). Values typically range between -60 and 0 dB.

mode

This returns the modality (major or minor) of the track. Major is represented by 1, minor is represented by 0.

speechiness

This detects the presence of spoken words in a track.

The more exclusively speech-like the recording (e.g. talk show, audio book, poetry), the closer to 1.0 the attribute value. Values above 0.66 describe tracks that are probably made entirely of spoken words. Values between 0.33 and 0.66 describe tracks that may contain both music and speech, either in sections or layered, including such cases as rap music. Values below 0.33 most likely represent music and other non-speech-like tracks.

beats per minute

This returns an estimate of the tempo of the track, in beats per minute (BPM).

beats per bar

This returns an estimate of the time signature of the track, i.e. how many beats there are in each bar.

happiness

This returns a number describing the musical positiveness conveyed by a track. The score is between 0.0 and 1.0, with higher scores representing more positive tracks (e.g. happy, cheerful, euphoric), and lower scores representing negative tracks (e.g. sad, depressed, angry).

Project ideas

The reason for creating the extension was to try building some machine learning projects, training a machine learning model using some or all of this data. I’ve not tried doing that yet, so in the meantime, here are a few quick ideas for the sorts of things that can be made quite simply using the extension.

Sprites that dance in time to music

You can animate a sprite by switching costumes. Using the beats per minute block, you can time those costume changes to the music, so it looks like the sprite is dancing.

You can include the danceability block as well to make your sprite refuse to dance to songs that aren’t danceable enough!
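The timing is just arithmetic: a beat lasts 60 divided by the tempo seconds, so at 120 BPM there is a beat every 0.5 seconds. Here is a rough sketch of that loop in Python (in Scratch it would be a forever loop with a wait block; the 0.5 danceability threshold is an arbitrary choice for this example):

```python
import time

def dance(tempo_bpm, danceability, costumes, beats=16):
    # Refuse to dance to songs that aren't danceable enough (arbitrary 0.5 threshold)
    if danceability < 0.5:
        print("Not danceable enough!")
        return
    seconds_per_beat = 60.0 / tempo_bpm      # e.g. 120 BPM -> 0.5 seconds per beat
    for beat in range(beats):
        print("switch costume to:", costumes[beat % len(costumes)])
        time.sleep(seconds_per_beat)

dance(tempo_bpm=120, danceability=0.8, costumes=["arms up", "arms down"])
```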


Demo video at youtu.be/8T2k2LhAgms

This is the simplest project I can think of, in terms of the amount of Scratch coding needed.

And if you skip the danceability check, it’s even simpler.

Download this Scratch project here – give it a try!

Higher or lower

There is a whole range of variations on the “higher or lower” game that you could make in Scratch, using one of the values that Spotify can give.

To make it into a game, you could choose the songs ahead of time, and get the person playing the game to guess whether the next song is higher or lower. That feels a bit restrictive, so I tried turning this on its head: the game chooses higher or lower, and the person playing has to think of a song that matches.

For example, I tried this with the happiness block.


Demo video at youtu.be/-otSUzdTM5o

Choose a song. Its happiness score is 47.
Can you think of a song that sounds more happy? You choose a song with a happiness score of 73.
Can you think of a song that sounds less happy?
And so on.
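The logic behind each round is small: remember the previous score, pick higher or lower, and compare it with the score of the song the player comes up with. A rough Python sketch of one round, with a hypothetical get_happiness() function standing in for the extension's search and happiness blocks:

```python
import random

def play_round(previous_score, get_happiness):
    """Play one round; returns (new_score, whether the player got it right)."""
    target = random.choice(["more", "less"])
    song = input("Previous score was %s. Name a song that sounds %s happy: "
                 % (previous_score, target))
    score = get_happiness(song)   # stand-in for the extension's blocks
    print("That song scores", score)
    correct = score > previous_score if target == "more" else score < previous_score
    return score, correct
```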

Download this Scratch project here – give it a try! I managed a high score of 15 when I recorded the demo video above. Can you beat that?

I think the idea could work with other values, too. For example, you could do this with the beats per minute block. Starting from one song, can you think of a song that is faster? Then can you think of a song that is slower? And so on.

Try to guess the tempo of a song

The other type of project I thought of was getting the user to guess what value Spotify will return for a song, and then showing them how close they are to the value from the Spotify API.

For example, you could make a project where the user has to estimate the tempo of a song.
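Scoring the guess only needs the difference between the player's number and the tempo Spotify returns. Something like this sketch would do it (the 5 BPM margin for a "spot on" answer is an arbitrary choice):

```python
def tempo_feedback(guess_bpm, actual_bpm, margin=5):
    """Compare the player's guess with Spotify's tempo estimate."""
    difference = abs(guess_bpm - actual_bpm)
    if difference <= margin:
        return "Spot on! Spotify says %.0f BPM." % actual_bpm
    return "Spotify says %.0f BPM, so you were out by %.0f." % (actual_bpm, difference)

print(tempo_feedback(100, 117.2))   # "Spotify says 117 BPM, so you were out by 17."
```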


Demo video at youtu.be/t8IUDJMGPK0

You could add the play preview block as well if you want to make it easier, and let them listen to the song while they’re trying to work out the tempo.

Download this Scratch project here – give it a try!

How to try this out

I’ve added this extension to the version of Scratch that I host at scratch.machinelearningforkids.co.uk, so you’ll need to go there to try it out.

To access the extensions library, click on the extensions button in the bottom left.

Other ideas

I’m sure there are loads of other things that could be made with these blocks – these are just a few early ideas. As I mentioned above, the original motivation for making it was to try using the values in a machine learning project, but it’s been fun to try a few other ideas ahead of that, too.

