I’m Old, Part XVII: Reputation? What Reputation?

Round about the time that Acrobat 1.0 was in the works, I took an interest in making life casts and prosthetic makeup. The internet was tiny and there weren’t many resources, so I had to figure out a lot of it myself. Trial and error led to approaches that worked well enough for making plaster positives and then subsequent negatives for making prosthetics. I did a bunch of the experimentation with Alan Wootton, who was a good egg about the whole thing.

Surgical stitches

At one point, I had a prolonged illness which left me unable to work effectively and consequently with a lot of time on my hands. Eventually, I was able to work from home, which was a godsend. I decided that a few days after I returned to the office, I would come in wearing “comedy lobotomy scars”. I figured that a lot of people at Adobe hadn’t seen me in a long time and would be easy to convince that I’d had some set of sutures put in.

I will briefly interject that during the time that I was working from home, the people in my department set up a sign-up sheet for people to go out and get lunch or dinner and bring it by and sit down with me. It was an extremely touching and classy thing to do. I felt cared for.

I had convinced Jeremy, a friend of mine from college and now a doctor, to send me some suture materials as well as instructions on how to tie them. I made some brutal-looking latex scars and stitched in some sutures at key points.

I arranged some time with Karin Jurcevich, the administrative assistant of the Acrobat group. Karin has a fabulous sense of humor. She got laser surgery to correct her vision and they did one eye at a time. After the first one was done, she came in wearing her glasses, but with one lens out and made a point of rubbing that eye right through the glasses. At one point, Karin had worked at the makeup counter of Nordstrom and still had her tester kit, so she brought that in and she helped blend the scars into my skin.

As I’ve mentioned before, John Warnock took a very hands-on approach to the Acrobat project and followed each of the products during every phase. That day, he wanted to check on the Macintosh full text search progress. It wasn’t a full-on Warnocking, just a quick catch-up. As John and I were talking, his eyes kept going from my eyes to the scars on my forehead and back again. I think I let him do that about a half dozen times before I said, “John. They’re not real. They’re makeup. Just something I was playing around with.” John leaned forward, raised his glasses and squinted at the scars. “Oh, hey yeah! They are!”

Reputation destroyed. Or created. Probably both.

Writing a PNG Decoder in F#, Part II

This entire installment is going to be about CRC. A CRC is more or less a hash of data that can be used to determine whether data has been corrupted. PNG was designed such that every chunk is followed by a CRC32 of the chunk tag and its data. Like many algorithms, CRC can be table-driven.

I used this site as a reference because it has output that includes the table that was used and I could take the table wholesale.


let crcTable = [|
        0x00000000u ; 0x04C11DB7u ; 0x09823B6Eu ; 0x0D4326D9u ; 0x130476DCu ; 0x17C56B6Bu ; 0x1A864DB2u ; 0x1E475005u ; 
        0x2608EDB8u ; 0x22C9F00Fu ; 0x2F8AD6D6u ; 0x2B4BCB61u ; 0x350C9B64u ; 0x31CD86D3u ; 0x3C8EA00Au ; 0x384FBDBDu ; 
        0x4C11DB70u ; 0x48D0C6C7u ; 0x4593E01Eu ; 0x4152FDA9u ; 0x5F15ADACu ; 0x5BD4B01Bu ; 0x569796C2u ; 0x52568B75u ; 
        0x6A1936C8u ; 0x6ED82B7Fu ; 0x639B0DA6u ; 0x675A1011u ; 0x791D4014u ; 0x7DDC5DA3u ; 0x709F7B7Au ; 0x745E66CDu ; 
        0x9823B6E0u ; 0x9CE2AB57u ; 0x91A18D8Eu ; 0x95609039u ; 0x8B27C03Cu ; 0x8FE6DD8Bu ; 0x82A5FB52u ; 0x8664E6E5u ; 
        0xBE2B5B58u ; 0xBAEA46EFu ; 0xB7A96036u ; 0xB3687D81u ; 0xAD2F2D84u ; 0xA9EE3033u ; 0xA4AD16EAu ; 0xA06C0B5Du ; 
        0xD4326D90u ; 0xD0F37027u ; 0xDDB056FEu ; 0xD9714B49u ; 0xC7361B4Cu ; 0xC3F706FBu ; 0xCEB42022u ; 0xCA753D95u ; 
        0xF23A8028u ; 0xF6FB9D9Fu ; 0xFBB8BB46u ; 0xFF79A6F1u ; 0xE13EF6F4u ; 0xE5FFEB43u ; 0xE8BCCD9Au ; 0xEC7DD02Du ; 
        0x34867077u ; 0x30476DC0u ; 0x3D044B19u ; 0x39C556AEu ; 0x278206ABu ; 0x23431B1Cu ; 0x2E003DC5u ; 0x2AC12072u ; 
        0x128E9DCFu ; 0x164F8078u ; 0x1B0CA6A1u ; 0x1FCDBB16u ; 0x018AEB13u ; 0x054BF6A4u ; 0x0808D07Du ; 0x0CC9CDCAu ; 
        0x7897AB07u ; 0x7C56B6B0u ; 0x71159069u ; 0x75D48DDEu ; 0x6B93DDDBu ; 0x6F52C06Cu ; 0x6211E6B5u ; 0x66D0FB02u ; 
        0x5E9F46BFu ; 0x5A5E5B08u ; 0x571D7DD1u ; 0x53DC6066u ; 0x4D9B3063u ; 0x495A2DD4u ; 0x44190B0Du ; 0x40D816BAu ; 
        0xACA5C697u ; 0xA864DB20u ; 0xA527FDF9u ; 0xA1E6E04Eu ; 0xBFA1B04Bu ; 0xBB60ADFCu ; 0xB6238B25u ; 0xB2E29692u ; 
        0x8AAD2B2Fu ; 0x8E6C3698u ; 0x832F1041u ; 0x87EE0DF6u ; 0x99A95DF3u ; 0x9D684044u ; 0x902B669Du ; 0x94EA7B2Au ; 
        0xE0B41DE7u ; 0xE4750050u ; 0xE9362689u ; 0xEDF73B3Eu ; 0xF3B06B3Bu ; 0xF771768Cu ; 0xFA325055u ; 0xFEF34DE2u ; 
        0xC6BCF05Fu ; 0xC27DEDE8u ; 0xCF3ECB31u ; 0xCBFFD686u ; 0xD5B88683u ; 0xD1799B34u ; 0xDC3ABDEDu ; 0xD8FBA05Au ; 
        0x690CE0EEu ; 0x6DCDFD59u ; 0x608EDB80u ; 0x644FC637u ; 0x7A089632u ; 0x7EC98B85u ; 0x738AAD5Cu ; 0x774BB0EBu ; 
        0x4F040D56u ; 0x4BC510E1u ; 0x46863638u ; 0x42472B8Fu ; 0x5C007B8Au ; 0x58C1663Du ; 0x558240E4u ; 0x51435D53u ; 
        0x251D3B9Eu ; 0x21DC2629u ; 0x2C9F00F0u ; 0x285E1D47u ; 0x36194D42u ; 0x32D850F5u ; 0x3F9B762Cu ; 0x3B5A6B9Bu ; 
        0x0315D626u ; 0x07D4CB91u ; 0x0A97ED48u ; 0x0E56F0FFu ; 0x1011A0FAu ; 0x14D0BD4Du ; 0x19939B94u ; 0x1D528623u ; 
        0xF12F560Eu ; 0xF5EE4BB9u ; 0xF8AD6D60u ; 0xFC6C70D7u ; 0xE22B20D2u ; 0xE6EA3D65u ; 0xEBA91BBCu ; 0xEF68060Bu ; 
        0xD727BBB6u ; 0xD3E6A601u ; 0xDEA580D8u ; 0xDA649D6Fu ; 0xC423CD6Au ; 0xC0E2D0DDu ; 0xCDA1F604u ; 0xC960EBB3u ; 
        0xBD3E8D7Eu ; 0xB9FF90C9u ; 0xB4BCB610u ; 0xB07DABA7u ; 0xAE3AFBA2u ; 0xAAFBE615u ; 0xA7B8C0CCu ; 0xA379DD7Bu ; 
        0x9B3660C6u ; 0x9FF77D71u ; 0x92B45BA8u ; 0x9675461Fu ; 0x8832161Au ; 0x8CF30BADu ; 0x81B02D74u ; 0x857130C3u ; 
        0x5D8A9099u ; 0x594B8D2Eu ; 0x5408ABF7u ; 0x50C9B640u ; 0x4E8EE645u ; 0x4A4FFBF2u ; 0x470CDD2Bu ; 0x43CDC09Cu ; 
        0x7B827D21u ; 0x7F436096u ; 0x7200464Fu ; 0x76C15BF8u ; 0x68860BFDu ; 0x6C47164Au ; 0x61043093u ; 0x65C52D24u ; 
        0x119B4BE9u ; 0x155A565Eu ; 0x18197087u ; 0x1CD86D30u ; 0x029F3D35u ; 0x065E2082u ; 0x0B1D065Bu ; 0x0FDC1BECu ; 
        0x3793A651u ; 0x3352BBE6u ; 0x3E119D3Fu ; 0x3AD08088u ; 0x2497D08Du ; 0x2056CD3Au ; 0x2D15EBE3u ; 0x29D4F654u ; 
        0xC5A92679u ; 0xC1683BCEu ; 0xCC2B1D17u ; 0xC8EA00A0u ; 0xD6AD50A5u ; 0xD26C4D12u ; 0xDF2F6BCBu ; 0xDBEE767Cu ; 
        0xE3A1CBC1u ; 0xE760D676u ; 0xEA23F0AFu ; 0xEEE2ED18u ; 0xF0A5BD1Du ; 0xF464A0AAu ; 0xF9278673u ; 0xFDE69BC4u ; 
        0x89B8FD09u ; 0x8D79E0BEu ; 0x803AC667u ; 0x84FBDBD0u ; 0x9ABC8BD5u ; 0x9E7D9662u ; 0x933EB0BBu ; 0x97FFAD0Cu ;     
        0xAFB010B1u ; 0xAB710D06u ; 0xA6322BDFu ; 0xA2F33668u ; 0xBCB4666Du ; 0xB8757BDAu ; 0xB5365D03u ; 0xB1F740B4u |]
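If you’d rather not transcribe 256 constants, the same table can be generated from the CRC-32 polynomial. This is a sketch of my own (the name makeCrcTable is mine, not from the reference site):

```fsharp
// Generate the 256-entry CRC32 table from the polynomial 0x04C11DB7.
// Each entry is the 8 MSB-first shift/xor steps applied to its index byte.
let makeCrcTable () =
    let poly = 0x04C11DB7u
    [| for i in 0 .. 255 ->
        let mutable c = uint32 i <<< 24
        for _ in 1 .. 8 do
            c <- if c &&& 0x80000000u <> 0u then (c <<< 1) ^^^ poly
                 else c <<< 1
        c |]
```

The first few entries (0x00000000, 0x04C11DB7, 0x09823B6E, …) should match the literal table above.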

The actual CRC function is straightforward. Many CRC implementations work on a byte array, but I felt that using an F# seq of bytes would fit better into the overall code. That said, the actual implementation of the CRC calculation is short and sweet.

let crc32 data =
    let rec bitrev count acc n =
        if count = 0 then acc
        else bitrev (count - 1) ((acc <<< 1) ||| (n &&& 1u)) (n >>> 1)
    let update acc (input:byte) =
        // tupni is input backwards
        let tupni = bitrev 8 0u (uint32 input)
        let crc1 = acc ^^^ (tupni <<< 24)
        let pos = (crc1 >>> 24) &&& 0xffu |> int32
        (crc1 <<< 8) ^^^ crcTable.[pos]
    0xFFffFFffu ^^^ (Seq.fold update 0xFFffFFffu data |> bitrev 32 0u)

The function bitrev is a tail-recursive implementation of code to reverse the low n bits of a number. If you come from a C/C++ background, you may wonder if this is wise: should I be doing heavy-duty bit-banging in a functional programming language, and should I be doing it with a tail-recursive implementation? The answer is absolutely. A few years ago at Atalasoft, I did a managed port of the low-level C++ image processing library. From my experiments, the typical cost in execution time of using C# was about 1.5x that of a C++ implementation. Most of the time, F# falls right in that range. In some cases, though, I found that F# was running faster than the equivalent C++. I never had time to do a complete analysis as to why, but so be it.
The function update is responsible for mutating an accumulator into the next accumulator for the CRC. It fits perfectly into a fold to calculate the CRC of an entire sequence of bytes, which is how it should be.
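As a sanity check (this snippet is mine and assumes crcTable and crc32 above are in scope), the function can be run against the standard CRC-32 check string and against the tag of a PNG IEND chunk, whose CRC the PNG spec fixes at AE 42 60 82:

```fsharp
// The canonical CRC-32 check value: crc32 of the ASCII bytes "123456789".
let check = "123456789" |> Seq.map byte |> crc32
printfn "%08X" check   // CBF43926

// An IEND chunk has no data, so its CRC covers just the tag bytes "IEND" --
// every valid PNG file ends with AE 42 60 82.
let iendCrc = "IEND" |> Seq.map byte |> crc32
printfn "%08X" iendCrc // AE426082
```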

I’m Old, Part XVI: The Tao of Carl

Carl Orthlieb and I worked on a number of projects in parallel. As I mentioned before, we both worked on PDF full-text search, the weblink plug-in, and so on. When I got stuck on something, I would sit in his office and we’d make each other’s lives a little more surreal. Or at the end of the day, sometimes Carl would play Warcraft (“STOP POKING ME!“) and I’d watch him wipe out some Orcs.


Around that time, for no other reason than personal growth, I had read an essay entitled “Is God a Taoist?” by Raymond Smullyan in the book “The Mind’s I“. Several years earlier, I had met Smullyan at Oberlin College where I was a student and he was a guest lecturer. I was already familiar with his work as my brothers and I went through some of his logic puzzle books when we were younger. After the lecture, Judith Underwood, a math and CS major and one of my classmates, came up with the following Smullyanesque logic problem:

Q. You are on an island of knights and knaves. Knights always tell the truth. Knaves always lie. You come to a crossroads and there is a person there, but you don’t know if it’s a knight or a knave. What single question can you ask to find the shortest route to town?

A. “Hey! Did you hear they’re giving out free beer in town?”

Then you follow him into town.

So after reading the essay, I put the source book from which Smullyan’s essay had been excerpted on order and in the intervening time, I decided to read a translation of the Tao Te Ching. I picked a translation at random from the public library and it turned out to be a decent one and it spoke to me (later I read a different translation that wasn’t nearly so good, and was grateful for the one I had first). I went on to read the Tao of Pooh and the Te of Piglet. One day, I was talking to Carl in his office about the Tao and as we did so, a new hire, David Fogel came up to the doorway and listened in. At a break, he asked a question about what Winnie the Pooh had to do with the Tao.

Carl, in true Carl form said, “Well, Pooh embodies the Tao. Pooh lives in the moment. It’s like the story when Pooh goes up to Christopher Robin and asks him for a balloon. Then Christopher Robin pulled out a shotgun, gut-shot Pooh, then tied his intestines to the back of a pickup truck and drove around the Hundred Acre Wood three times before finally pulling up at a bridge and throwing Pooh’s broken body into the river while the rest of the residents chanted ‘Poohsticks! Poohsticks! Poohsticks!'”

David had a look of abject horror and while looking down at the gray carpet of the old Mountain View offices, he repeatedly shook his head saying, “No. No. No. That didn’t happen.”  before walking away.

It’s important to have Carls and Judiths in your life.

Writing a PNG Decoder in F#, Part I

Over the next few months, I’m going to be posting a series on how to write a PNG decoder in F#. I’ve spent a fair amount of time looking at libpng, and I think we can do better.

First off, PNG files start with a header, so here is a quick and dirty check for a header matching the PNG spec:

let headerIsValid (stm:Stream) =
    let validHeader = [|  0x89uy ; 0x50uy ; 0x4euy ; 0x47uy ; 0x0duy ; 0x0auy ; 0x1auy ; 0x0auy |]
    let header = Array.zeroCreate 8
    let bytesRead = stm.Read(header, 0, header.Length)
    bytesRead = header.Length && (Array.compareWith Operators.compare validHeader header) = 0

In this case, I’m returning true if the header is valid, false otherwise.
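A quick way to exercise this without a file on disk is a MemoryStream (this example is mine, assuming headerIsValid above):

```fsharp
open System.IO

// Feed headerIsValid the PNG signature from an in-memory stream.
let signature = [| 0x89uy ; 0x50uy ; 0x4euy ; 0x47uy ; 0x0duy ; 0x0auy ; 0x1auy ; 0x0auy |]
use goodStm = new MemoryStream(signature)
printfn "%b" (headerIsValid goodStm)   // true

// The first bytes of a GIF, by contrast, should be rejected.
use badStm = new MemoryStream([| 0x47uy ; 0x49uy ; 0x46uy |])
printfn "%b" (headerIsValid badStm)    // false
```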
PNG files are made up of a series of chunks. Each chunk has the format:

length | tag | data | crc

where length is a 4 byte length of the data section, tag is a 4 byte code, data is length bytes of data, and crc is the 4 byte network order crc32 value of the tag and the data. In libpng, there is a lot of work done to ensure that the chunks are in the right order. This is all well and good, I suppose, but it’s not very forgiving and it makes their code much more complicated. Furthermore, since each chunk comes with a size, we can skip over the file and build an index of the chunks before we begin reading any actual data.
The tag will be represented as an F# record. Tags in PNG files are 4 ASCII characters, so we’ll model it that way:

type FourCC = 
    { c0: char
      c1: char
      c2: char
      c3: char }
    override this.ToString() = String.Format("{0}{1}{2}{3}", this.c0, this.c1, this.c2, this.c3)
let fourCCStr (str:string) =
    if str.Length <> 4 then raise (ArgumentOutOfRangeException("str", "str must have exactly 4 characters."))
    { c0 = str.[0]; c1 = str.[1]; c2 = str.[2]; c3 = str.[3]; }
let readFourCC (stm:Stream) =
    ensureRead stm 4 |> ignore
    { c0 = stm.ReadByte() |> char; c1 = stm.ReadByte() |> char; c2 = stm.ReadByte() |> char; c3 = stm.ReadByte() |> char; }

I also gave you a couple convenience routines for converting a string to a FourCC and for reading a FourCC from a stream, but there’s an IO routine that I slipped in there: ensureRead.

let ensureRead (stm:Stream) n =
    if stm.Length - stm.Position >= (int64)n then n
    else raise (Exception(String.Format("Unexpected end of file while trying to read {0} bytes.", n)))

which makes sure that there are at least n bytes left to read in the file, returning n if so. This is handy because any time we want to read n bytes from a file, we can wrap the desired number of bytes in a call to ensureRead and, if they’re there, everything will work as expected. And while I’m going on about I/O, here are some handy routines for reading numeric values from a stream:

let rec readCombined (stm:Stream) acc count =
    if count = 0 then acc else readCombined stm ((acc <<< 8) ||| stm.ReadByte()) (count - 1)
let readInt (stm:Stream) =
    readCombined stm 0 (ensureRead stm 4)

Finally for the representation of a chunk, I’m also using an F# record:

type Chunk =
    { name: FourCC
      length: uint32
      crc: uint32
      position: int64 }
let makeChunk name length crc position =
    { name = name; length = length; crc = crc; position = position }

Because I’m lazy, I also put in a factory function because unlike a lot of F#, the record initialization syntax is tedious.
In the record, I have all the segments of a chunk except for the raw data itself. In its place, I’ve put ‘position’ which tells me where the data starts in the file. Ultimately, our collection of Chunks will be a Dictionary<FourCC, List<Chunk>>.
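As a rough sketch of my own (the name readChunkHeader is mine, not the series’ final code), the helpers above combine to read one chunk’s bookkeeping while skipping over its data:

```fsharp
open System.IO

// Read one chunk's length, tag, and CRC, recording where the data starts
// and seeking past the data itself. Assumes readInt, readFourCC, and
// makeChunk above are in scope.
let readChunkHeader (stm:Stream) =
    let length = readInt stm |> uint32   // 4-byte big-endian data length
    let name = readFourCC stm            // 4-byte tag
    let position = stm.Position          // the data starts right here
    stm.Seek(int64 length, SeekOrigin.Current) |> ignore  // skip the data
    let crc = readInt stm |> uint32      // trailing CRC32
    makeChunk name length crc position
```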
In the next installment, I’ll show you how to read in all the chunk information from a PNG file.

I’m Old, Part XV: The Day I Fixed Bugs in Netscape Navigator

Carl Orthlieb and I worked on a number of parallel projects on the Acrobat team. After Acrobat 2.0 shipped, we started looking at how to integrate with that thing called The Web. Carl came up with a standard for a “Web Link Annotation” which was an extension to the existing link annotations that allowed a destination to be a URI.


At the time, internet connections were slow and if you were in America and connecting from home, your average connection speed was 2400 baud. We cared a lot about the experience, so we invested a lot of time in I/O cover-up, a technique where you show some friendly animation to indicate that the computer is working (and, if possible, show progress).

In our initial versions, the experience was less than stellar. You clicked on a web link in Acrobat and you waited and waited and saw nothing until the page was complete and if you clicked on a PDF link in a web page, you had a similar experience. We “fixed” that by drafting an API which allowed us to report in the Acrobat UI the progress that Netscape was making behind the scenes and vice versa. It wasn’t particularly elegant, but it worked. Well, it worked in theory.

We worked with Netscape on this API and the builds we got had some issues. Since Netscape was in Mountain View, it was easy enough to go sit down with the engineer responsible and get the issue fixed. One afternoon, I sat down with Aleks Totic and we looked over his code. He was scrolling down looking for a particular section and I was reading over his shoulder. I said, “Hey – hang on. Go back a sec.” Aleks scrolled back. “Stop. There!” Aleks focused on the code I was indicating and asked, “what is it?” “Right there – that’s an off by one error. You want to change that to […]”. “Oh! I do that all the time!” Oddly enough, so did I, which is how I spotted the pattern. It was code to display a progress bar in the UI which would paint beyond the bounds of the progress bar at 100%. The problem was that the total range of the bar was 0-100 inclusive, which is 101. His code assumed the range was 1-100 inclusive.

And that brief exchange was how I fixed a long-standing bug in Netscape Navigator.

Today, this type of work is called Pair Programming and it’s supposed to be a standard practice. In my experience, it was a good process for bringing junior engineers up to speed faster and for senior engineers to collaborate on new designs or to work on “impossible” bugs. Otherwise, I’ve felt it’s slowed me down. Still, it’s a good resource to use now and again.

Creating Medusa Tuples in Swift

One day, when I was home doing terrible things to my dog with a fork, it hit me… – Steve Martin

When I was a student at Oberlin College, one of my CS professors, Rich Salter (not that one), had come up with a cute hack to represent repeating decimal numbers in Scheme so that they lost no precision. He called them Medusa numbers because everything was well and good until you looked at them; then you’d freeze, because the printout was infinite.


Keep this in mind.

Swift is an interesting language. It is neither functional nor procedural, but a hybrid of the two. It draws the line closer to the procedural side so as to be friendlier to Objective-C programmers, whereas F# cleaves much closer to the functional side.

Before I get into my specific problem, let’s examine the Swift landscape. Swift is a language wherein it is required that everything is initialized. This is important to Swift because Swift is a reference counted language. Unfortunately, like many other memory management schemes, you have to be aware of what is happening or you’re likely to cause problems. C is like this too, but the problems are much more manifest. Swift hides them, but the problems are still there.

Let’s say that you do this in Swift:

var x:SomeStruct = SomeStruct()
x = aFunctionThatReturnsSomeStruct();

What you likely think is happening is less than what actually happens. First you might say, “x gets set to the return value of the SomeStruct() constructor.” And you’re nearly right. x, however, does not get set with that value. Instead, x gets initialized with the return value of the constructor. “Then,” you continue, “x gets set with the return value of aFunctionThatReturnsSomeStruct”. Not quite. First, Swift calls the function, holding onto the return value, then it looks up a special function in the value witness table for SomeStruct and calls a destructor method in it to destroy the contents of x. Then Swift calls a function to copy the return value from the function onto x.
This distinction is very important and it all has to do with Swift’s automatic reference counting (ARC).

One day a student came to Moon and said: “I understand how to make a better garbage collector. We must keep a reference count of the pointers to each cons.”
Moon patiently told the student the following story:
“One day a student came to Moon and said: `I understand how to make a better garbage collector…

I do not care for reference counting as a memory management scheme.

So let’s consider this little gem that you can do in C#:

T x = default(T); // T is a generic type parameter

You can’t do this in Swift. When I say that, I’m lying. You can, but the code I’ve seen that uses it scares the pants off of me. Here is just such code:

public func defaultOf<T>() -> T {
    let x = UnsafeMutablePointer<T>.alloc(1)
    let retval = x.memory   // copy out whatever garbage is there
    x.dealloc(1)
    return retval
}
And this code works. Almost. The problem arises if your T is a struct or enum that contains a reference type (either directly or indirectly or in a sub element), you have created a time bomb.

var x:SomeStruct = defaultOf() // mmmaybe ok?
x = aFunctionThatReturnsSomeStruct() // ...and you're dead

The reason is that when the destructor gets called before assignment, any reference types in it get their reference count decremented, but the value is unknown, probably null, and then you get a bus error. If you’re lucky.
Ultimately, default(T) can’t or shouldn’t exist in Swift because Swift insists on knowing that every variable is initialized and meaningful, and default(T) can’t guarantee that.

The only escape clause is in the types UnsafePointer and UnsafeMutablePointer. And this is where I finally get to my problem: I needed to interoperate with Swift from C, specifically, I had to call a function that took a Swift optional. An optional in Swift is syntactic sugar implemented in a generic enum (more or less):

public enum Optional<T> { case none, some(T) }

The problem is that in C I can neither generate nor consume Swift enums. The spec in the Swift ABI is very scant on how enums are laid out. There are 5 different cases. Some are straightforward, others not so much. What I can do is crack apart a Swift tuple, because the rules for the memory layout of tuples are much better specified. So I started with two functions in Swift:

public func toOptional<T>(optTuple: (value: T, isPresent: Bool)) -> T? {
    if optTuple.isPresent { return optTuple.value }
    else { return nil }
}
public func fromOptional<T>(value: T?, result: UnsafeMutablePointer<(T, Bool)>) {
    if value != nil {
        result.memory = (value!, true)
    } else {
        // set only the isPresent flag; the value slot is left untouched
        let x = UnsafeMutablePointer<Bool>(UnsafeMutablePointer<T>(result) + 1)
        x.memory = false
    }
}

And here you see that I’ve done a terrible thing to Swift with a fork. I’ve created a tuple that, if a nil optional comes in, will have an invalid head which is not consumable in Swift. And this is a Medusa Tuple. If you don’t take precautions before looking at the head, you’re just dead.
But what about toOptional? Does it work? Yes, because it never touches the value unless it’s valid. So even if I can’t call a Swift function that takes or returns an optional from C, I can write a wrapper around the Swift function that transforms optionals to/from tuples and then calls the desired function.

I hope that if you find this, you only find it in desperation, and that the last thing you do is use the defaultOf function above, because it is fundamentally broken in Swift semantics. Instead, take the “Unsafe” label very seriously. Values within an unsafe pointer may not be initialized, and this is one of the few places in Swift where this is allowable. Swift also cares very much about the lifetime of memory. In my own code, the memory for the UnsafeMutablePointer is allocated in C and deallocated in C, and that’s how it should be.

Measuring Trumpet Fingering Difficulty in F#

In a previous blog post, I talked about applying Gray code to trumpet fingering to see whether it might reduce the difficulty of fingering. I thought I’d take some time this evening to answer the question, “Is there even a problem here?”


(and yes, I know this is a cornet, but I just had it overhauled)

I did this by writing some code to evaluate all the common major scales based solely on difficulty of finger transitions. I chose to do this measurement in F#.

Before I get into the methodology, I want to explain my choice of F#. For the most part, the choice was arbitrary. I could have done this in a variety of different languages. If anything, my F# is a little rusty, so this was an exercise to flex those muscles. Taking time to explain the choice is due to an experience when I was in college. I sat in on a guest lecture from a researcher who did a similar study for guitar fingerings, which I saw as a fairly simple graph search problem and I was appalled that he had done it in Prolog, so of course being a righteous upstart, I asked him “why Prolog?”

OK? Good.

So I chose to represent trumpet fingerings by a discriminated union:

type fingering = 
    | NotANote
    | V000
    | V010
    | V100
    | V110
    | V101
    | V011
    | V111

I use NotANote as a placeholder when I start a sequence of transitions.

Next, I needed a way to measure the difficulty of any given fingering change. To do that, initially, I had a huge pattern match of the form:

match (vp1, vp2) with
| (V000, V000) -> 0.0
| (V010, V000) -> 1.0 // etc

Which required 49 cases, and then afterward I had to write unit tests to make sure that my arbitrary scores were symmetric (the same in either direction). In thinking about it, the problem should be easy to break down into a couple of rules:

  1. No valve changes are free (this is a lie, more on this later)
  2. Single valve changes are always cheap except for changing from V101 to V111, which is not quite as cheap.
  3. Double valve changes are more expensive than single, but V010 to V111 is bad.
  4. Triple valve changes are more expensive than double, but V101 to V010 is the worst.

Four rules struck me as more concise than 49 cases:

let fingeringDifficulty vp1 vp2 =
    // convert a note to a binary representation
    let noteToBinary note =
        match note with
        | NotANote -> 0b100000
        | V000 -> 0b000
        | V010 -> 0b010
        | V100 -> 0b100
        | V110 -> 0b110
        | V011 -> 0b011
        | V101 -> 0b101
        | V111 -> 0b111
    // naively count the number of bits set in a value
    let rec bitCountNaive n a =
        if n = 0 then a
        else bitCountNaive (n >>> 1) (a + if n &&& 1 = 0 then 0 else 1)
    let eitherIs a b c = (a = c) || (b = c)
    // rule implementation
    let binaryValveRules bp1 bp2 = 
        let changes = bp1 ^^^ bp2
        let nChanges = bitCountNaive changes 0
        match nChanges with
        | 0 -> 0.0
        | 1 -> if changes = 0b010 && (eitherIs bp1 bp2 0b101) then 2.0 else 1.0
        | 2 -> if changes = 0b101 then 2.0 else 1.5
        | 3 -> if (eitherIs bp1 bp2 0b101) then 4.0 else 3.0
        | _ -> raise(System.ArgumentOutOfRangeException())
    // filter out any NotANote elements
    match (vp1, vp2) with
    | (_, NotANote) | (NotANote, _) -> 0.0
    | _ -> binaryValveRules (noteToBinary vp1) (noteToBinary vp2)
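To see the rules in action, here are a few spot checks of my own (assuming the fingering type and fingeringDifficulty above are in scope):

```fsharp
fingeringDifficulty V000 V010 |> printfn "%f"  // 1.0: a single valve change is cheap
fingeringDifficulty V101 V111 |> printfn "%f"  // 2.0: the awkward 1-3 to 1-2-3 exception
fingeringDifficulty V010 V111 |> printfn "%f"  // 2.0: the bad double change
fingeringDifficulty V101 V010 |> printfn "%f"  // 4.0: the worst triple change
```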

Next, for want of a better standard, I use midi note numbers to represent pitch values. It’s good enough for what I need. Given that, we need a way to convert from midi note numbers to fingerings, so here is a programmatic trumpet fingering chart:

// The lowest practical note on a Bb trumpet is E2, or 52.
// While you can have screaming notes at Bb5 or higher, these are
// really "party trick" notes. We'll stop at F5, which is usually
// about the upper limit for pro big band lead parts (by comparison,
// the highest note in the West Side Story lead part is E5, and that
// is a very challenging part).
// This also ignores "false" fingerings that are part and parcel of the
// instrument.
let midiNoteToFingering midinote =
    let minNote = 52 // E2
    let maxNote = 89 // F5
    let noteToFingering = [|
        V111 ; V101 ; V011 ; V110 ; V100 ; V010 ;
        V000 ; V111 ; V101 ; V011 ; V110 ; V100 ; V010 ; V000 ; V011 ; V110 ; V100 ; V010 ;
        V000 ; V110 ; V100 ; V010 ; V000 ; V100 ; V010 ; V000 ; V011 ; V110 ; V100 ; V010 ;
        V000 ; V110 ; V100 ; V010 ; V000 ; V100 ; V010 ; V000 |]
    if midinote < minNote || midinote > maxNote then NotANote
    else noteToFingering.[midinote - minNote]

This code is essentially a table lookup that filters out notes that are outside the instrument’s practical working range (discounting pedal tones and the nosebleed notes).

Given modeling and measurement, now we need to apply it.

type notePairMeasurer = (int*int) -> float
let simpleMeasurer (midinote1, midinote2) =
    fingeringDifficulty (midiNoteToFingering midinote1) (midiNoteToFingering midinote2)

This is a definition of a function type that measures the difficulty in transitioning between two notes, passed as a tuple of ints, returning a float. I’m calling 0.0 free with difficulty increasing. The simpleMeasurer applies the rules that I’ve laid out earlier to any pair of notes by looking them up in the fingering chart and then passing the fingering into the earlier 4 rule function.
With that, we can measure a sequence of notes:

let measureNotes (measurer:notePairMeasurer) notes =
    let measureNote (prev, currentScore) curr = (curr, currentScore + (measurer (prev, curr)))
    let (_, totalScore) = Seq.fold measureNote (-1, 0.0) notes
    totalScore

which is done with one nice fold.
I want to measure a whole bunch of major scales, but since I only want to type in the data for a major scale once, I need a way to transpose a sequence of notes:

let transpose semitones notes =
    Seq.map (fun note -> note + semitones) notes

And now you can see why F# is handy – that’s very simple code.
I also wanted a way to print the scale name, so I wrote this chunk of code to convert a given midi note number to its name:

let midiName note =
    let notenames = [| "C" ; "C#" ; "D" ; "Eb" ; "E" ; "F" ; "F#" ; "G" ; "Ab" ; "A" ; "Bb" ; "B" |]
    let octavenumber = note / 12 - 2
    let notename = notenames.[note % 12]
    notename + (string octavenumber)
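A few spot checks under this numbering, where 60 comes out as C3 (these examples are mine, assuming midiName above):

```fsharp
midiName 52 |> printfn "%s"  // E2, the bottom of our range
midiName 60 |> printfn "%s"  // C3
midiName 89 |> printfn "%s"  // F5, the top of our range
```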

And while I’m at it, here’s the C major scale starting on C3:

let majorScale = [| 60 ; 62 ; 64 ; 65 ; 67 ; 69 ; 71 ; 72 |] |> Array.toSeq

And at long last, I tie it all together. Yes, I could probably do this more efficiently, but it works well enough.

let main argv = 
    let allScales =
        seq { for i in -8..11 do
                let score = majorScale |> transpose i |> measureNotes simpleMeasurer
                let scaleName = majorScale |> transpose i |> Seq.head |> midiName
                yield (score, score / (Seq.length majorScale |> float), scaleName) }
        |> Seq.sortBy (fun (score, avg, name) -> score)
    for (score, avg, name) in allScales do
        printfn "%s total score %f, average %f " name score avg

And here is the output:

 Eb3 total score 7.500000, average 0.937500
 F3 total score 7.500000, average 0.937500
 Bb3 total score 7.500000, average 0.937500
 C3 total score 8.000000, average 1.000000
 D3 total score 8.000000, average 1.000000
 G3 total score 8.000000, average 1.000000
 Ab3 total score 8.500000, average 1.062500
 F2 total score 9.000000, average 1.125000
 Bb2 total score 9.000000, average 1.125000
 A3 total score 9.500000, average 1.187500
 G2 total score 10.500000, average 1.312500
 E3 total score 10.500000, average 1.312500
 F#3 total score 10.500000, average 1.312500
 B3 total score 10.500000, average 1.312500
 Ab2 total score 11.000000, average 1.375000
 A2 total score 11.500000, average 1.437500
 B2 total score 12.000000, average 1.500000
 C#3 total score 12.000000, average 1.500000
 E2 total score 13.000000, average 1.625000
 F#2 total score 13.500000, average 1.687500

And for the most part, I agree with the scoring. The top 3 are tied and they are pretty much the easiest scales. In general, the consistency of the average scores is encouraging in that there may not even be a problem that needs solving.

What’s missing? Well. The difficulty measurement doesn’t generalize beyond scales. And I alluded to this by naming the measurer I use ‘simpleMeasurer’. In reality, the difficulty of going from one note to another on a trumpet is more complicated. A trumpet is really 7 bugles bound together. You get scales by switching the bugles in and out, but they’re still bugles. Any given bugle can only play the harmonic series and skipping between harmonics is much more difficult than valve changes. So really, one needs to take that into account too. Putting those parts in, this could be used to measure the difficulty of a piece of music. For that, it should consume a stream of MIDI and, in addition to measuring fingering changes and skips between or across harmonics, it should also measure the relative amount of time spent actively playing and resting and the rough range center and relative time spent in the upper range.

Now with all of this in place, make the rules driven by the instrument. Then the piece’s difficulty can be measured across an entire ensemble.