Scale Degrees

This module explores how the notes in a melody can be represented as degrees of a scale. This module requires the music21 package.

All major and minor scales (known collectively as “diatonic” scales) contain seven different notes. After the seventh note, the first note repeats again. For example, the C major scale contains the notes C-D-E-F-G-A-B, then repeats C at the top.

We often refer to the notes of the scale by a number, called the scale degree, that describes its place in the scale. In C major, C is 1, D is 2, E is 3, F is 4, G is 5, A is 6, and B is 7. The C at the top is 1 again. This numbering system can be applied to any diatonic scale.

The first scale degree, also known as the tonic, always has the same name as the scale (the tonic of G major is G, the tonic of F minor is F, etc.). Melodies often exhibit a pattern where they emphasize the tonic at the beginning, move away from the tonic in the middle, and then return to the tonic at the end.
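If you'd like to see this numbering in code, here is a minimal sketch using music21's scale module (not required for the rest of the module); pitchFromDegree() and getTonic() are standard music21 methods:

from music21 import scale

c_major = scale.MajorScale('C')

# Print each scale degree of C major (1 through 7) next to its note name
for degree in range(1, 8):
    print(degree, c_major.pitchFromDegree(degree).name)

# The tonic (degree 1) always shares its name with the scale
print(scale.MajorScale('G').getTonic().name)  # G
print(scale.MinorScale('F').getTonic().name)  # F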

Let’s use the tinyNotation feature of music21 to enter a simple example:

from music21 import *

waltz = converter.parse("tinyNotation: 3/4 f a c' f'2 d'4 c'2 g4 a2 f4")

waltz.show()

(To learn more about tinyNotation, check out this page.)

Play the example on the piano, or use MIDI playback if your XML viewer supports it.

This melody is excerpted from “Grata Esperança,” a waltz written in 1886 by the Brazilian composer Francisca “Chiquinha” Gonzaga. It begins after a short, slower introduction:

It is in the key of F major and, following the pattern described above, begins and ends on the note F. It begins on the tonic, drifts away in the middle, and comes home to the tonic at the end. It might sound simple, but this basic principle holds true for quite a lot of music!

Many melodies emphasize the first, third, and fifth scale degrees. If you’re not sure what key a melody is in, you can count the number of times each note appears and see if it aligns with these scale degrees in a particular key.

We can create a simple function to count the notes for us. For example:

def num_pitch(melody):
    pitch_list = [elem.name for elem in melody.pitches]
    elem_list = list(set(pitch_list))
    count_list = [pitch_list.count(x) for x in elem_list]
    for i in range(len(elem_list)):
        print(note.Note(elem_list[i]), ":", count_list[i])

This function takes a melody as an input and outputs a list of notes in the melody, along with how many times each appears. The first line uses a list comprehension to create a list of all of the notes in the melody. The second line builds a list of the unique notes in the melody using the set() function, which removes duplicates. The third line uses another comprehension to count the number of instances of each unique note in the melody. The final loop displays the data in a readable format. Let’s run the function on our melody:

num_pitch(waltz)
> <music21.note.Note G> : 1
> <music21.note.Note C> : 2
> <music21.note.Note A> : 2
> <music21.note.Note D> : 1
> <music21.note.Note F> : 3

Sure enough, the three most prevalent notes are F, A, and C, which correspond to the first, third, and fifth scale degrees.

Checking the number of notes can be helpful, but a more sophisticated approach would take into account the total duration of each note as well. The music21 toolkit allows us to analyze the key of any melody by comparing the total duration of each pitch against the typical distribution for every possible key. It’s called the analyze() method, and it’s very easy to use:

waltz.analyze('key')
> <music21.key.Key of F major>

Unsurprisingly, we get the same result. This is a pretty clear-cut example, but if we need to, we can check how “confident” the algorithm is by asking for what’s called the correlation coefficient. In a nutshell, this value measures how closely the melody’s note distribution matches the typical distribution for the guessed key. Values range from -1 to 1, and the closer the value is to 1, the more likely it is that the key is correct:

waltz.analyze('key').correlationCoefficient
> 0.8618765596477617

The value of about 0.86 is quite solid. Generally anything over 0.7 indicates a strong correlation, meaning a good guess.

The music21 library has numerous tools for working with scale degrees. Now that we’re certain of our key, let’s identify the scale degree (number) of each note in our melody. After saving the results of our analysis as a key object, we can use the “getScaleDegreeFromPitch” function:

waltzkey = waltz.analyze('key')

waltzkey.getScaleDegreeFromPitch('f')
> 1

waltzkey.getScaleDegreeFromPitch('c')
> 5

Rather than go through each of these one by one, however, let’s label each note with its scale degree, using a for loop to iterate over each note in the melody:

for each_note in waltz.recurse().notes:
    each_note.addLyric(waltzkey.getScaleDegreeFromPitch(each_note.pitch))

(We use the addLyric method to attach the label to each note, and recurse() to iterate through the multiple hierarchical levels of the notation stream.)

Then close your notation display and re-load it:

waltz.show()

Each note in your example should now display its scale degree beneath it.
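If you would rather work with the scale degrees as plain data instead of reading them off the notation, a one-line list comprehension (a small sketch along the same lines as the loop above) collects them in order:

# Collect the scale degree of every note in the melody, in order
degrees = [waltzkey.getScaleDegreeFromPitch(n.pitch) for n in waltz.recurse().notes]

print(degrees)  # the list begins and ends with 1, the tonic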

Now let’s try a different example. We’ll look at a song by the American composer Duke Ellington called “Come Sunday,” written in 1942 as part of Ellington’s Black, Brown, and Beige suite. The vocal melody begins after a short piano introduction:

Click here to download a MusicXML file that you can load into music21 in Python.

Unzip the file and import it using the following code, replacing the path below with the location on your computer where you saved the file:

sundaychords = converter.parse('/Users/username/Desktop/sunday_chords.musicxml')

sundaychords.show()

As you’ll see, the top line gives the first two phrases of the melody, and below that is a piano part with the chords. Let’s start with the melody. We can isolate the top part by using the .parts property, with index 0 for the first part:

num_pitch(sundaychords.parts[0])
> <music21.note.Note G> : 6
> <music21.note.Note C> : 2
> <music21.note.Note A> : 2
> <music21.note.Note B-> : 4
> <music21.note.Note D> : 4
> <music21.note.Note F> : 4

This one is a little tricky. G is the most prevalent note, but then we also have a three-way tie for second place between B-flat, D, and F. Sounds like G minor, right? Let’s verify using the .analyze() method:

sundaychords.parts[0].analyze('key')
> <music21.key.Key of g minor>

sundaychords.parts[0].analyze('key').correlationCoefficient
> 0.8288232378388906

Sure enough, the algorithm predicts G minor, and fairly confidently. That’s that, right?

Well, not quite. Let’s look back at our chord changes. Normally we’d expect to find the tonic chord (G minor) somewhere in there, but not only is there no G minor chord, there’s actually a G major chord with a flatted seventh! That is very unusual to find in a short melody in G minor. Could the computer have led us astray?
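(If you’d like to check the chord changes in code rather than by eye, one rough approach is to collapse the texture into block chords with music21’s chordify() method and print a label for each; the exact labels you see will depend on how the file is notated.)

# Collapse all parts into block chords and print a rough label for each one
for ch in sundaychords.chordify().recurse().getElementsByClass('Chord'):
    print(ch.measureNumber, ch.pitchedCommonName)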

Let’s try analyzing the whole thing, melody and accompaniment:

sundaychords.analyze('key')
> <music21.key.Key of B- major>

sundaychords.analyze('key').correlationCoefficient
> 0.8228646906666668

Could we actually be in B-flat major? I think so. If we look at the last three chords, we have a classic ii-V-I (C minor-F major-B-flat major) jazz cadence, and the final chord is B-flat major. These are both pretty strong indications of the key.

So why were we wrong at first? It turns out that if we don’t know the key to begin with, we have to take the harmony into account as well as the melody.

One neat feature of music21’s analyze() function is that we can see what the closest alternative guesses were. We’ll take the slice 0:3 of the alternateInterpretations list, which gives us the three next-best guesses for the melody alone:

sundaychords.parts[0].analyze('key').alternateInterpretations[0:3]
> [<music21.key.Key of F major>, <music21.key.Key of B- major>, <music21.key.Key of E- major>]

It turns out that F major was the computer’s next guess, followed by B-flat major and E-flat major. Is that surprising? The analytical tools in music21 focus on classical music, so perhaps it’s not so surprising that a jazz melody wouldn’t follow the same rules. At the same time, “Come Sunday” does seem to have a more ambiguous sense of tonality than “Grata Esperança,” and we can largely attribute the difference to an emphasis on different scale degrees.
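If you want to compare these guesses more precisely, each Key object in alternateInterpretations should also carry its own correlation coefficient, so you can print the runner-up scores directly (a short sketch):

# Print the three next-best keys for the melody along with their scores
melody_key = sundaychords.parts[0].analyze('key')
for alt in melody_key.alternateInterpretations[0:3]:
    print(alt, alt.correlationCoefficient)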

Extensions

  1. Try playing or listening to the Ellington melody by itself. Can you hear it in G minor? What if you (or a friend) plays a G minor chord underneath? What about F major?
  2. Take a look at the details of the key-finding algorithm used by music21. Can you account for why the computer might have guessed that G minor or F major were better choices than B-flat major for the melody (without the chords)?
  3. Write a program that determines the key and labels each note with its scale degree in a single function. Refine it so that it works for different source material. How should enharmonics be treated?

Further Reading

Check out the Open Music Theory entry on scales and scale degrees.