I’m Old, Part LXVI: CGDC

When I was at Newfire, we had a really good set of people doing marketing. I know this because I was at a party where I met my wife; her brother-in-law was there and he asked where I worked. When I mentioned Newfire, he said, “Dude – you’re the guys who do 3D really fast!”

We had a booth at the Computer Game Developers Conference. The company worked together to try to figure out how to maximize our reach at the conference. For example, we set up a conference-wide game of assassin and supplied the conference with Newfire-branded suction cup dart guns. We gave out t-shirts. We embraced the “fire” aspect of the name. I made a faux bomb: eight dowel sections painted red, bound together with black electrical tape, with wires running from them into a project box with a Radio Shack clock. These days, I can’t imagine that this prop would make it into the conference. We also set up a hospitality suite the first night of the conference. The way this worked is that we allowed each person in only if they got a Newfire temporary tattoo. Each person was allotted one cheap beer (Bud or Bud Light) and a bowl of chili. I bought a couple of very large bottles of Tabasco sauce, soaked off the labels and replaced them with our logo. In the suite, I was uncapping and handing out beers and Cowboy Dave was giving out chili. The thing that I really liked was that Cowboy Dave was being intentionally rude to the people coming through. When someone approached him, he would bark, “WHAT DO YOU WANT?!!!” which I found especially funny because there was no actual choice. It was chili or nothing.

The door was staffed by the girlfriends of Alan and Cowboy Dave. They were assigned the task of checking for tattoos and applying new ones. At one point there was some commotion at the door. A man was insisting that he get his tattoo applied to his butt. Oh Sweet Jebus. There was so much wrong with this. First, dude, not appropriate. Second, he hardly had a physique worth showing off. Third, he put the two women into a very uncomfortable situation.

After we ran out of chili and beer, we wound down the suite and headed out to mingle in the crowd. I remember being up on a balcony with Harry Vitelli, our salesman, and he was completely itchy. He was looking down at the crowd and muttering about how there were so many people and how could we take advantage of that. In a burst of inspiration, we grabbed a demo machine and a monitor, dragged them down to the pavilion, set up our “killer” demo on a grand piano and started pitching to people. I remember that we managed to get the attention of John Grigsby, an engineer at Atari with whom I had gone to college a few years earlier. We turned a lot of heads, which was the point.

As it turns out, this kind of sales work was not OK according to the conference rules, but we got away with it so yay?

During the days of the conference, we had people working the booth and also sent people out to get intel on the competition and information from other booths. I remember Mike, one of our QA techs, saying, “Let’s go out and get free stuff!” before disappearing into the crowd to get t-shirts and tchotchkes.

As I’ve matured, I’ve realized and embraced that I’m an introvert. This is funny because I’ve been a lifelong performer. Conferences, I’ve realized, are a challenge for me because even though I get along quite well with people, I get overloaded pretty quickly. However, a team that’s all working in the same direction makes it that much easier. With a small company like Newfire, I really liked having the opportunity to contribute to the trade show and seeing how our team came together to get our messaging across.

I’m Old, Part LXV: Hallowe’en

Hallowe’en. I prefer the version with the single quote.
I love Hallowe’en. I always have. As an adult, I have taken a lot of joy in the process of creating costumes for Hallowe’en. When I started at Adobe, I made a pair of costumes for me and my ex-wife: A large-size plug and socket. I was the plug. It was brilliant. Just rude enough. My ex-wife got stopped by people for weeks after, saying “you were the socket!” She wasn’t too happy with that.
A couple years later, I went as 2/3 of ZZ-Top. I made a cardboard version of Dusty Hill with an articulated right arm and I dressed as a coordinated Billy Gibbons. I made a connecting rod that went from my right arm to the cardboard Dusty Hill so that we moved in sync and I brought in my guitar and an amp and played Tush for the Adobe contest.
The next year, 1994, I spent three months making a face hugger from Alien.

This was no small task. First of all, in the proto-internet age, there were no easily accessible photo references. I had to special order a coffee table book about the production of Alien just to get a couple of marginal reference stills. I made clay pieces, made molds out of plaster of Paris, did castings in latex, built an armature out of hobby brass, wire, heat-shrink tubing and RC airplane hinges. Wearing this on my face, I had a really interesting experience. I could just barely see through some of the leg gaps via peripheral vision. As such, in order to make eye contact with someone, I had to turn slightly, which would make them turn and then I’d turn, and then they’d turn. Whoever I was talking to and I would end up doing a weird dance.

The next year, emboldened by my experience in molding and casting, I decided that I would go as a 10th-century interpretation of the crucifixion of Jesus.

Yes, that’s my own hair. I grew out my beard for a few months, made plastic spikes for my hands and feet, a prosthetic wound for my side and a muslin loincloth. One piece I was particularly proud of was my ID badge. Adobe had recently switched over to a photo ID system. The badges were pretty nice. Your badge had your photo and your first name in BIG LETTERS and your last name in small letters. My badge read “STEPHEN hawley”. The badge was made with some tweaks to try to minimize forgery. What they didn’t count on was that an employee at Adobe (me) had access to some decent scanners and printers. I made a scan of my badge and, using Adobe Photoshop, I removed my name and, matching the font, changed the name on the badge to “JESUS h. christ”. I printed it out on a nice dye sublimation printer named “DON’T USE. EXPLODES”, then laminated it and attached it to my badge. Oddly enough, I successfully used the badge for the next several months and was never challenged.

The next year, working at Axial (which would become Newfire), I decided that I would go as a cockroach.

This was so much fun. I bought sweatpants and a couple sweatshirts and some fabric and made the costume.

Wait.

You made this?

Yup. See, when I was in 8th grade, I had an awesome art teacher. Herb O’Brien. He did a unit with us on soft sculpture wherein we made gnomes. We learned how to work with sewing machines, how to do embroidery, stuffing and so on. I bought a sewing machine and applied most of that to this costume. I couldn’t get sweats in the right color, so I just dyed them.

I will point out that in 1996, my experience in fabric stores was interesting. I spent a lot of time walking aisles looking for just the right fabric for various projects and I got a lot of stares from the women in the store, as I was always the only man present. Oh – and if anyone says that there is a difference in thinking skills between men and women vis-à-vis 3D visualization, show them a pattern for a vest, because there are some truly mind-bending transformations in there.

The next year, I was dating a wonderful woman (I ended up marrying her, ha!) and we decided to hold our own Hallowe’en party with coordinated costumes. I went as The Tick and E went as American Maid.

E made her costume from scratch (awesome). I bought several dance leotards and dyed them (wrong color – easy to fix), and modified them as needed. This included sewing gloves, making booties, making foam rubber muscles, and so on.

When we moved to Massachusetts, we carried our joy of Hallowe’en along. One year I made a fake skunk skin cap and went as “Skunky Crockett – Davy’s lesser known, but more widely shunned brother”.

Atalasoft was fantastic. Most of my coworkers were way better at their costumes than me. Dave Terrell, when his wife was expecting, decided to grow out his beard through her pregnancy. At Hallowe’en, he dressed up as a Hasid (his wife is Jewish and approved, so no appropriation there). Jake Lauzier dressed up as a puppet theater. Bill Bither dressed up as a fairy (Tinker Bill). Christina Gay did a very respectable Awesomeo 3000. I was so happy.

I never really had a lot of time to dedicate to my own costumes since I’ve had kids. One year at Atalasoft, I managed to pull off a decent enough House costume. It was particularly funny because Eric Deutchman dressed up as the devil and kept offering people candy in exchange for their souls. Since I was House, I played the part as a devout atheist and repeatedly denied his existence and asserted that religion was fairy tales for the insecure. I can’t say who had the harder time keeping a straight face.

What does this have to do with software, with coding? The process of writing software is a fundamentally creative process, as is making Hallowe’en costumes. One of the better smells in your company is how creative the costumes are. Creative costumes = creative employees, and that means all kinds of good things.

Foster creativity in your peers and it will show in all kinds of ways and you will have a much better place to work.

Swift’s Limited Observers and How To Fix Them

Last week I was tearing apart code generated by the Swift 3.1 compiler to look at how the Swift property observers, willSet and didSet, are implemented. I was surprised by a number of things, mostly by how limited the observers are.

Let’s start with the background. Swift variables are, in actuality, a lot closer to .NET properties than to plain fields. When you declare the following class:

open class Sample {
   public var x:Int = 0
}

The Swift compiler will reserve space in the class for x, but it will also author two hidden methods, a getter for x and a setter for x. Inside the class, x will act like a field, but outside the class the compiler may choose to use the hidden methods instead.
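
Conceptually, it’s as if the compiler rewrote the property with a hidden backing field and an accessor pair. This is just a sketch of the idea (the names SampleSketch and _x are made up; you can’t see the real storage from source):

open class SampleSketch {
   // hypothetical backing storage – the real thing is generated by the compiler
   private var _x:Int = 0

   public var x:Int {
      get { return _x }      // the hidden getter
      set { _x = newValue }  // the hidden setter
   }
}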

Now let’s add in willSet and didSet methods:

open class Sample {
   public var x:Int = 0 {
       willSet {
          print("Sample x will be set to \(newValue)")
       }
       didSet {
          print("Sample x changed, old value is \(oldValue)")
       }
   }
}

Now if we make an instance of Sample and set its x to, say, 7, this will print “Sample x will be set to 7” and then “Sample x changed, old value is 0”.

There are some limitations though. In Swift, there are two main kinds of properties: stored properties and computed properties. In these examples, x is a stored property. If x were a computed property, we’d have problems.

open class Sample {
   public var w:Int = 0
   public var x:Int {
       get {
           return w * 2
       }
       set {
           w = newValue / 2
       }
       willSet { // compiler error
          print("Sample x will be set to \(newValue)")
       }
       didSet { // compiler error
          print("Sample x changed, old value is \(oldValue)")
       }
   }
}

Swift hates when you put observers on computed properties. Unless they’re overrides.

open class Sample {
   public var w:Int = 0
   public var x:Int {
       get {
           return w * 2
       }
       set {
           w = newValue / 2
       }
   }
}
open class SubSample : Sample {
    override var x:Int {
       willSet { // legal
          print("SubSample x will be set to \(newValue)")
       }
       didSet { // legal
          print("SubSample x changed, old value is \(oldValue)")
       }
   }
}

The reason this is the case is in the Swift language reference:

The stored or computed nature of an inherited property is not known by a subclass

And with all this information, we can get a clear picture of how this is implemented. Here’s a more complete example with output:

open class Sample {
   public var x:Int = 0 {
      willSet {
         print("Sample willSet")
      }
      didSet {
         print("Sample didSet")
      }
   }
}
open class SubSample : Sample {
    override var x:Int {
       willSet {
          print("SubSample willSet")
       }
       didSet { // legal
          print("SubSample didSet")
       }
   }
}
let cl = SubSample()
cl.x = 7
// output:
// SubSample willSet
// Sample willSet
// Sample didSet
// SubSample didSet

Note that the parent’s willSet/didSet pair is nested inside the child’s. This has to do with the implementation of the setters. Each setter looks syntactically like a field set, but there’s more. It ends up looking like this:

open class Sample {
   public var x:Int = 0 {
      set (newValue) {
          let oldValue = x_field
          willSet(newValue)
          x_field = newValue
          didSet(oldValue)
      }
      willSet {
         print("Sample willSet")
      }
      didSet {
         print("Sample didSet")
      }
   }
}
open class SubSample : Sample {
    override var x:Int {
      set (newValue) {
          let oldValue = super.x
          willSet(newValue)
          super.x = newValue
          didSet(oldValue)
      }
      willSet {
          print("SubSample willSet")
      }
      didSet { // legal
          print("SubSample didSet")
      }
   }
}
 

What happens is that if you don’t supply a setter, the Swift compiler will inject the willSet/didSet code for you. And that explains why you don’t get this code in computed properties. The compiler would have to inject the willSet/didSet as a prelude/postlude, but that starts to get complicated, especially when the setter has multiple exit points or unusual flow. They punted rather than generate unpredictable code and I don’t fault them for it.

But there’s a problem with the observers as is. They’re just not good for very much, honestly. I’d call them syntactic sugar, but they’re not, because they don’t really sweeten things. This is more like Java, where if you want to do something actually useful, you end up typing in a pile of boilerplate code (violating D.R.Y.). Let’s call it syntactic coffee (I dislike coffee; it’s nasty and bitter). Here’s what you’d really like an observer to do: have a list of clients that subscribe to it and get notified when the code reaches the point of inflection. Instead, Swift hides the points of inflection and only objects in the hierarchy have access to them. If you want more, you have to do a lot more work, and at that point why are you even using willSet/didSet?

To be fair, the .NET event pattern (which is a single broadcaster, multiple listener model) is nearly as bad in terms of repetition, but at least the use of EventHandler<T> has made that better, compared with what it used to be.

Can we make this better? Somewhat.

import Foundation

public class Broadcaster<T> {
    private var _currentTag:Int = 0
    private var _listeners:[(_listener:(T)->(), _tag:Int)] = []
    public init() { }
 
    // represents a synchronous lock
    private func lock( _ lock: Any, closure: () -> ()) {
        objc_sync_enter(lock)
        closure()
        objc_sync_exit(lock)
    }
 
    // add a listener to the chain of listeners
    public func add(listener:@escaping (T)->()) -> Int
    {
        var tag = 0
        lock (self) {
            tag = _currentTag;
            _currentTag = _currentTag + 1
            _listeners.append((listener, tag))
        }
        return tag;
    }
 
    private func indexOf(tag: Int) -> Int? {
        return _listeners.index(where: { (l, t) in return t == tag })
    }
 
    // remove a listener associated with tag.
    // returns true if it was found and removed
 
    public func remove(tag: Int) -> Bool
    {
        var index:Int? = nil
        lock (self) {
            index = indexOf(tag:tag)
            if index != nil {
                _listeners.remove(at: index!)
            }
        }
        return index != nil
    }
 
    // returns true if there is a listener associated with the given tag
    public func contains(tag:Int) -> Bool {
        var containsIt = false
        lock (self) {
            containsIt = indexOf(tag:tag) != nil
        }
        return containsIt
    }
 
    // returns a listener for the given tag, nil if not found
    public func listenerFor(tag: Int) -> ((T)->())? {
        var listener: ((T)->())? = nil
        lock (self) {
            if let index = indexOf(tag:tag) {
                listener = _listeners[index]._listener
            }
        }
        return listener
    }
 
    // broadcast the value to each listener
    public func broadcast(value:T) {
        lock (self) {
            _listeners.forEach() { listener in listener._listener(value) }
        }
    }
 
    // sugar operator
    static func <= (left: Broadcaster, right: @escaping (T)->()) -> Int {
        return left.add(listener: right)
    }
}
 
public class Sample {
   public var willSetX = Broadcaster<Int>()
   public var didSetX = Broadcaster<Int>()
   public var x:Int = 0 {
       willSet {
           willSetX.broadcast(value: newValue)
       }
       didSet {
           didSetX.broadcast(value: x)
       }
       }
   }
}
let cl = Sample()
cl.willSetX <= { newValue in print("outside listener heard a new value \(newValue)"); }
cl.x = 5
// output
// outside listener heard a new value 5

My Broadcaster class is a little more than is needed in most cases, but the thread safety is important, especially in apps where the UI is running on the main thread and broadcasting changes to other threads. I did this with the lock function. Swift has a cute syntactic hack: if you write a function whose last argument is a closure, you can either pass the closure in as a regular argument or put it after the call as a trailing closure. Since closure syntax was designed to always be in braces, it makes it look like you’re adding new syntax to the language. In fact, you’re not, and the abstraction here is leaky as all hell and bit me a couple of times while I was using it. For example, in the add() method, I wanted to return the tag from inside the closure, but that makes no sense since the closure type returns an empty tuple (aka void), so I had to put in a state variable. There’s not much you can do to fix this.
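
To make the trailing closure sugar concrete, here’s a tiny example with made-up names. Both calls below are identical, and because the closure’s type returns an empty tuple, you have to smuggle results out through a captured variable:

// hypothetical helper just to show the syntax
func perform(times: Int, action: () -> ()) {
    for _ in 0..<times { action() }
}

perform(times: 2, action: { print("hi") })  // closure as a normal argument
perform(times: 2) { print("hi") }           // trailing closure – looks like new syntax

// the closure returns (), so you can't return a value from it;
// capture a variable from the enclosing scope instead
var tag = 0
perform(times: 1) { tag = 42 }
print(tag)  // 42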

But we see that with this little gem, we can at least add proper event handling to Swift even if it still has a somewhat bitter syntactic coffee flavor.

I’m Old, Part LXIV: Pre Web Hacking

Around about 1993, Mattel released Teen Talk Barbie, a doll with a voice box that came programmed with 4 phrases (out of a suite of 270). Most were innocuous, but some were vapid, and the notorious “Math class is tough” raised a stink – so much so that a group named the BLO (Barbie Liberation Organization) made a video about how they swapped the voice boxes between these Barbies and their talking G.I. Joe equivalents and returned them to the shelves.

Alan Wootton heard about this, decided it was a great idea and set out to recreate it. He bought a talking G.I. Joe and spent some time trying to disassemble the doll. The first problem he encountered was that these kinds of dolls are not made to be taken apart. These companies would suffer badly if a kid took one apart and choked on the pieces. Alan mounted an X-Acto blade onto a soldering iron and used that to cut open the doll and get at the circuitry. It wasn’t going to be as simple as swapping chips – one doll was made by Mattel, the other by Hasbro. The likelihood of a compatible chipset was about none. The other choice was to swap the whole circuit boards, but again, the form factors for G.I. Joe’s chest and Barbie’s weren’t even close.

Still, the G.I. Joe circuit was pretty nifty. Rather than have a few stock sentences, it had a dozen or so phrase forms and many pieces that could be substituted in. It was a much more interesting toy than Teen Talk Barbie.

Alan gave up on the process, being pretty sure that the BLO work was either just a joke or a pretty bad hack job. Alan reassembled the G. I. Joe and used the same rig he used to cut it open, to weld it back together. He ended up giving it to one of his nephews who, if I recall correctly, enjoyed the battle scars.

I’m Old, Part LXIII: Oh We Like Bacon

I really liked the time that I spent at Atalasoft. Not only was I doing work that I really liked, I felt like any employee had the ability to make the workplace better on any given day.

One of the conferences that we attended every year was AIIM, which is dedicated for the most part to document management and imaging. I have worked the floor of that show a couple of times, but after we had filled out our sales staff more, we didn’t have to dig into engineering to run the booth.

The result was that during AIIM, our office was missing close to 1/3 of its people, who were working long days and then hitting the post-show bar with customers. Damn them. So on the Monday of AIIM week, I was talking to Christina Gay about what we should do and we decided that we would have bacon day. I knew a company, Burger’s Smokehouse, that sells truly excellent bacon and ships promptly and reliably. We ordered some bacon and arranged with the remaining staff to have some eggs and other things brought in. Christina volunteered her griddle for cooking.

We showed up a little early, cooked up piles of bacon and eggs and enjoyed a really nice change of pace. The office was in an old mill building and featured holes in the floor that allowed us to see into the offices below, and consequently the scent of cooking bacon spread. One of the other offices tweeted at us about the smell.

Of course the sweetest part was when the staff from AIIM returned and they found out what we had done in their absence.

Of course we did this at the next AIIM. And the next one. In fact, we started looking at the sales trade show schedule for when we should run the next bacon day. We invested in a second grill because our headcount was larger. For some reason, Kevin Hulse and I nearly always piloted the grills. Eventually we broke down and grudgingly scheduled bacon day when everyone was available. You know, just to stop sales from bellyaching.

One thing I particularly liked about the process of preparing for bacon day was that Burger’s online form has a field for putting in a personal message. Christina would fill that out with a message to herself or us. Something along the lines of “Hello, Christina. I am your bacon.” I loved this.

Really, what all this comes down to is that in a small office, you have lots of opportunities to improve your office and the office culture. Generally speaking, you make the rules and a small treat now and again is a very good thing.

I’m Old, Part LXII: College CS Value

I recently saw this tweet from Stephanie Hurlburt:

I have a 4-year CS degree, it helped me get my 1st job (via referral from my professor), but the actual education was not useful or worth it

And I won’t argue with her on that. Her experience was her own. More than anything else, I would like to hear more detail of her experience and how it could be improved.

I went to Oberlin for CS. At the time, it was a very young program and only became an actual major while I was there. I was the 5th student to declare.

Up until my third year, I would have agreed with Ms. Hurlburt. The classes that I took during that time were only putting names on things that I had invented myself or encountered in the wild. That changed more in the 300-level classes and some of the 200-level classes. Some of the classes have been of no use to me. Discrete Structures was super cool – I really liked the notion of being able to treat programming language elements as mathematical entities subject to proofs of correctness – but for day-to-day engineering, I’ve had no application of that.

There were some game changers (in no particular order):

  • Compilers – this class kicked my ass, but I learned a hell of a lot from it, including how to design a decent, easy-to-parse domain-specific language. I learned how to write a tokenizer, scanner, and parser. As a professional, I’ve implemented compilers/interpreters for 4 languages, including one that was specifically not subject to the Halting Problem. It also taught me how to decompile assembly back into a high-level language on sight. In my current job I need this skill frequently.
  • Theory Of Computer Science – this was a class on automata theory. Finite state automata have been hugely useful in my professional career. A decade ago, I wrote a PDF parser that included an FSA running the tokenizer and another running the interpreter. At one job, I designed an AI scripting language for a game engine that used a state machine as the central control flow. (There’s a minimal sketch of the FSA idea just after this list.)
  • Algorithms/Analysis – making something work is only part of the job. Making it work well is where this comes in and needs to be in the wheelhouse of every software engineer. It also reminds me of the time I was reworking some code in Acrobat and I found a chunk of code to do string comparison that was O(n²). That was an easy fix.
  • Functional Programming – one of the professors did a seminar class on functional programming based on Simon Peyton Jones’ book The Implementation of Functional Programming Languages. Having the background in FP made it much easier to make the decision to use F# as the main language for creating a pure managed, performant .NET graphics library port of a C++ version.
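
To give a flavor of what I mean by an FSA tokenizer, here is a minimal sketch in Swift with made-up token kinds (nothing like the real PDF tokenizer): the whole machine is a state variable and a switch.

// a minimal FSA sketch: one state per token kind, driven one character at a time
enum TokState { case idle, number, word }

func tokenize(_ input: String) -> [String] {
    var state = TokState.idle
    var current = ""
    var tokens: [String] = []

    func flush() {
        if !current.isEmpty { tokens.append(current) }
        current = ""
    }

    for ch in input.characters {   // Swift 3-era API; in later Swift, just `input`
        switch ch {
        case "0"..."9":
            if state != .number { flush() }
            state = .number
            current.append(ch)
        case "a"..."z", "A"..."Z":
            if state != .word { flush() }
            state = .word
            current.append(ch)
        default:                   // anything else ends the current token
            flush()
            state = .idle
        }
    }
    flush()
    return tokens
}
// tokenize("x1 = 42") yields ["x", "1", "42"]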

One thing that wasn’t in my CS program is practice in communication and writing. If you’re going to work in a group or have your work made public, you need to be able to write clearly.

I don’t think that any particular class got me a job. In fact, I know that even the things in my CV had little impact on my first full-time job out of college at Adobe. I had a boatload of Macintosh coding experience and figured that was what I was going to be brought in to do. Nope. PostScript printer engineering – embedded systems. I didn’t know PostScript. I didn’t know what an ICE was. I didn’t really know hardware, but I ended up doing/working with all those things.

I’m Old, Part LXI: Feed Your Troops

When I was teaching/running district tech, the principal at the school had a piece of paper taped to the file cabinet next to his desk which had a list of things to do to help build a team. One of the items was “Feed Your Troops”.

At Atalasoft, that was a thing that we did figuratively as well as actually. The picture above is a cake that Christina Gay got for the office for my birthday one year. I returned the favor by making her a cake for her birthday with a hand turkey on it, since her birthday is close to Thanksgiving.

We also routinely had pizza lunches, beer o’clock on Friday, and the occasional bacon day (I’ll save that story for another day). This is the actual feeding of the troops. The figurative was another thing entirely.

Lou Franco and I had a problem in that he and I were the most highly experienced engineers at the company and we had a decent number of talented junior engineers and a few moderately experienced engineers. The problem with that is that we needed more engineers in the moderately and highly experienced bins. The solution to that is time and opportunities. Not every engineer has the time (or thinks they have the time), nor are there always opportunities.

Lou had a great idea, which was to do some lunchtime seminars. He chose the topic of the GoF design patterns and rather than do a presentation each week himself, he assigned a pattern to one of our junior engineers. The engineer would have to explain the pattern and how it worked and also where it was used in our code base. Some engineers took the task seriously and presented in depth. Some engineers had clearly just copped the Wikipedia page. The fun part was the “where is it in the Atalasoft code base” portion. After the junior engineer did their bit, one of the more experienced engineers would list off the other places it was used. Funny thing: before these presentations, I had never known the GoF patterns under their names. This is partly because I independently invented most of them at some point in my career or encountered them reading someone else’s code. They didn’t have names; they were just what you used when you had certain classes of problems. Nifty!

Later, I set up an independent study program where junior engineers picked topics of study and I would work through it with them. Some of the topics included Scheme, C, and ray tracing in F#. Anything coding related was fair game.

We also did regular company staff meetings where we had a rotating schedule of people who ran them. Very often, the choice of the curator of the next meeting was a game of “not it”, but, hated or not, presenting to a group of people is a valuable work skill. We also tried to set aside time for work or non-work presentations. Sometimes they were technical, sometimes they were show and tell. They were always valuable.

One thing I remember talking about with Lou regarding the overall process was that in a small company like ours, we were free to make our own rules. What do you mean you can’t take time to teach and mentor your coworkers? How can you not? One of the things I liked about Atalasoft was that I felt like I had the opportunity to make it into the engineering organization that I most wanted to work in. Certainly, it had room for improvement and we actively tried to steer things in that direction.

If you take the time to thoughtfully create a structure that is engineer friendly, then your engineers will want to be there, even through the inevitable slog work. If you are actively involved in professional development, you will build a better, more capable team. How do you achieve that? Feed your troops.

Functional Image Processing, Are You F#ing Kidding Me?

I’ve approached 3 or 4 Functional Programming conferences with a talk entitled, “Functional Image Processing, Are You F#ing Kidding Me?” I haven’t gotten a nod on that (yet).

The story behind it had to do with work I did at Atalasoft. We had an image processing library for .NET that was written largely in C++ with a friendly veneer in C#. This was to get performance. At one point, I spent a couple weeks doing measurements to find out what the cost would be to work completely in a managed language. My results came down to the magic number 1.5. Typically, image processing code written in C# ran 1.5 times slower than the equivalent C++, which is honestly pretty dang good, but we didn’t have a lot of incentive to do that.

At one point, it looked like we needed to, though. A lot of our business came from people who were doing server-side image processing and we had customers who wanted to use managed hosting, which meant no C++. Around the same time, Microsoft released Silverlight, which was a version of .NET that ran in the browser and required an even more limited subset of .NET and certainly no C++.

In deciding how we were going to do this, Rick Minerich was getting heavily into F# and he suggested we look at that. I spent a week learning F# and then started doing some experiments. Rick seemed pretty surprised at how quickly I picked it up, but I had done some work in college in Miranda, so this type of coding wasn’t precisely new to me, even though the syntax was brand new. The first thing I did was a port of an image processing tool we had called Dynamic Threshold. This was a tool that turned grayscale images into black and white and had a decent amount of smarts to give better results than the industry-standard Adaptive Threshold. I chose this command specifically because it was the worst choice for a functional implementation since it had side-effects out the wazoo and was the antithesis of functional style. Still, with some effort, I got a functional version of it working that was a pixel-perfect match to the C++ code and it ran about 1.5 times slower.

If the worst possible command could work out, then I was sure that this would work with most of the typical image processing code. I threw out the F# implementation and redid it in C# instead, but then I went ahead with the rest of the library.

Since we already had a bunch of code that was just in C#, this created an interesting problem. We needed a hybrid assembly built by merging an F# assembly and a C# assembly. This involved some unusual build steps. One of my goals was that our resulting API should be 100% identical to our existing one. There was a problem, though. In a lot of my code, I like to use something akin to the GoF Template Method design pattern by building out a set of protected methods that implement reasonable default behaviors for each step in a process. Because they’re protected, an end user can sub-class the class and customize behaviors. For many things, this is a powerful tool. Unfortunately, F# (at the time) had a compiler bug where methods declared protected ended up public, which would give us an API that didn’t match and also violated the encapsulation. So for that, I did something truly grotesque. I put a .NET attribute on each method that was to be protected and wrote a tool that used the Mono library Cecil to tear apart the compiled assembly, find all methods marked with the attribute, make them protected, and then rebuild the assembly. Surprisingly, it worked quite well.

What really came in handy in the process was that F# has a proper way to declare functions as inline, which allowed me to make code read well while staying performant. This was used, for example, in code to set/get a 1-bit pixel from a scan line, or to set/clear a segment of pixels in a scan line. I really wanted macros for a lot of my work, but that just wasn’t there. I did, however, make heavy use of partial function application. One issue we had was that we supported 12 different pixel formats and many commands needed to operate on each of them. It was especially bad when you had to combine two images in two different pixel formats because that represented 144 cases to handle. What I could do was put the general work in a nice little tail-recursive function with a function to apply on each scan line. The function was inevitably the result of an F# match statement that used partial application to build a function that had the signature I needed.
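
I can’t reproduce the F# here, but the shape of the idea, translated into Swift with made-up types (closures returned from a switch stand in for the partially applied functions), looks something like this: pick one scan-line function per format pair up front, then let a single driver apply it to every scan line.

// hypothetical pixel formats – the real library had 12 of them
enum PixelFormat { case gray8, rgb24 }

typealias ScanLineOp = ([UInt8], inout [UInt8]) -> ()

// decide the format combination once, returning a function to run per scan line
func scanLineOp(from src: PixelFormat, to dst: PixelFormat) -> ScanLineOp {
    switch (src, dst) {
    case (.gray8, .gray8), (.rgb24, .rgb24):
        return { src, dst in dst = src }            // same format: straight copy
    case (.gray8, .rgb24):
        return { src, dst in
            for (i, g) in src.enumerated() {
                dst[i * 3] = g; dst[i * 3 + 1] = g; dst[i * 3 + 2] = g
            }
        }
    case (.rgb24, .gray8):
        return { src, dst in
            for i in 0..<dst.count {
                let r = Int(src[i * 3]), g = Int(src[i * 3 + 1]), b = Int(src[i * 3 + 2])
                dst[i] = UInt8((r + g + b) / 3)     // crude average, just for the sketch
            }
        }
    }
}

// the driver never has to know which formats it's dealing with
func apply(_ op: ScanLineOp, to lines: [[UInt8]], dstLineSize: Int) -> [[UInt8]] {
    return lines.map { line -> [UInt8] in
        var dst = [UInt8](repeating: 0, count: dstLineSize)
        op(line, &dst)
        return dst
    }
}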

When all was said and done, most of the code ran in the 1.5x-slower range, but surprisingly, there was a bunch of code that ran faster than the C++. In those cases, there was inline code that the F# compiler could optimize the hell out of in a way that the C++ compiler could not. This is one of the big secrets of functional programming: with proper attention to side-effect-free code, much of the computational work can be done ahead of time to throw away cases that can’t possibly happen.

One of the big problems in doing this work was answering the question “how do I know I did this right?” Fortunately, a few years earlier I had put in a suite of anti-unit tests. We had more than a hundred objects for performing various image processing tasks and each could work on all or a subset of the 12 pixel formats. I saw this as a pain point and wrote a suite of tests that ran every one of these tasks on every possible combination of pixel formats and checked to see that the results had not changed since the last time the test ran. These were the “Image Stability Tests” and I could leverage them to ensure that I was doing my job correctly. In the end, nearly all the tests ran unchanged from the C++ implementation to the F# equivalent. There were some exceptions that had to do with junk data in padding on the end of scan lines or differences in floating point rounding, and I also found some bugs in the C++ or the C# wrapping code that had been undiscovered for years.
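
The mechanism behind those tests is nothing fancy. Here’s a sketch of the idea with hypothetical names (the real suite tracked hundreds of command/pixel-format combinations): hash the output bytes of each operation and compare against the hash recorded the last time a human blessed the output.

// FNV-1a, used here only because it's tiny and deterministic
func stableHash(of bytes: [UInt8]) -> UInt64 {
    var hash: UInt64 = 0xcbf29ce484222325
    for b in bytes {
        hash ^= UInt64(b)
        hash = hash &* 0x100000001b3
    }
    return hash
}

// true if the operation's output still matches its recorded baseline;
// any change – intended or not – shows up as a failure to investigate
func isStable(name: String, output: [UInt8], baselines: [String: UInt64]) -> Bool {
    guard let expected = baselines[name] else { return false }   // no baseline recorded yet
    return stableHash(of: output) == expected
}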

Of all that work, I think the code that I enjoyed doing the most was the Octree Color Quantization. This was particularly fun because the C++ code was incredibly grody for the whole purpose of being memory stingy and fast. I was able to undo a lot of that filth by using F# discriminated unions to represent the nodes and leaves in a tree and some nice tail-recursive code to handle the construction and reduction of the tree. It was no small feat, but in the end, it worked very well.
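
Swift’s nearest analog to an F# discriminated union is an indirect enum, so the node representation looked roughly like this (a sketch only, in Swift rather than the F# I actually wrote, and nowhere near a full quantizer):

// octree color-quantization nodes as a discriminated union:
// a leaf accumulates color sums and a count, a branch holds up to 8 children
indirect enum OctreeNode {
    case leaf(count: Int, rSum: Int, gSum: Int, bSum: Int)
    case branch(children: [OctreeNode?])    // 8 slots, indexed by the next color bits

    // the average color a leaf stands for; nil for branches or empty leaves
    var averageColor: (r: Int, g: Int, b: Int)? {
        switch self {
        case let .leaf(count, rSum, gSum, bSum) where count > 0:
            return (rSum / count, gSum / count, bSum / count)
        default:
            return nil
        }
    }
}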

Not all of the code was pretty. There were times when I chose to play fast and loose with arrays passed for the sake of mutating state. There were tail-recursive functions that had entirely too many arguments and I shrugged hoping that the compiler would magically fix all that under the hood, but in the end it worked quite well.

We had two products that fell out of this – DotImage Pure (which would run in a ‘pure’ managed environment) and DotImage Sterling (which would run in a Silverlight environment).

Eventually, when we produced JoltImage, a version of DotImage for Java, I did the work by porting the F# to Java. I entertained the notion of working in Scala, but passed. But all of that is a story for another day.

In the end, there were several things that made this project work: an understanding of how to use the tools of a modern functional programming language; the ability to choose between doing something in F# or C#; a robust set of tests; and a solid expectation of the performance a priori to allow me to attend to performance hot spots as needed.

 

I’m Old, Part LX: The Humble Mouse

When I was at Adobe, they started me off with a Sun workstation for writing code. It was OK. It ran the compilers that I needed and was good enough for my day-to-day work. Most of the engineers in the department had similar machines. It came with Sun’s optical mouse that required a special metal mouse pad with a red and blue plaid pattern on it. This was a crappy misfeature. The mouse felt soupy and if a hair got into the optical sensor (which happened every couple of days), the mouse tracking went wonky.

In my second year, Computer Santa Claus came by and brought me a brand-spanking-new DECstation 3100, which was a shit ton faster than the Sun; the monitor was in color and it had a better mouse. Well, in some respects. It had a unique mechanism with two wheels, one canted around the X axis, the other canted around the Y axis. Unlike a ball mouse, no part of the rollers that comes in contact with your work surface ever enters the mouse itself, so it rarely needs cleaning. Unfortunately, the form factor is terrible. It’s too wide and too low. A soap bar shape would have been much better.

A few months in, I was getting wrist pain, so I figured I would try to find a replacement for it. DEC didn’t supply the pinouts for the mouse connector. But hey – it’s a Hawley mouse – I know that name! I found that the mouse had been designed by Jack Hawley, who had also worked on the Xerox Alto and lived in Berkeley. I called his company, The Mouse House, and left a message. He called me back and we had a nice long chat about it. He didn’t have the pinouts either and suspected that it wouldn’t help if I did. The form factor and the communications were all DEC’s doing; the mechanism was his work. He made some suggestions on modifying the mouse itself.

We got to talking about family history and we couldn’t find any common relatives going back a few generations. Oh well. He asked me if I knew the Hawley coat of arms. My grandparents had a copy of one, but I never thought it was correct. He said he had a very old heraldry book and because the Hawley coat of arms in it was so simple, he suspected that it was the correct one. He asked for the company fax number so he could send me copies of the pages.

The cover sheet had his business card, which was done up in a gilded-age style with the words: “Jack Hawley – Berkeley’s Great, Though Humble, Inventor”. Yes, he is definitely a Hawley. The pages that followed included the title page of the book and the page with the Hawley coat of arms. The book was over 300 years old! I couldn’t believe he put a 300-year-old book onto a copier!

He was quite an interesting fellow and although he couldn’t help me directly, it was nice to talk to him and find out a little more about my family name.

I’m Old, Part LIX: It’s OK to Explore

Much in software engineering is based on what you make. What is your product? What did you ship? This is flawed thinking. Sometimes exploration is its own reward and the dividends get paid much later.

When I was in 9th grade, I convinced my dad to buy me a toolkit for the Apple II called Bill Budge’s 3D Graphics System. The manual is here. For the time and the capabilities of the Apple, this was an astounding piece of software. Budge provided an editor for 3D shapes, a library for displaying them, and interfaces to the library in Integer Basic, Applesoft Basic, and assembly language. Neat!

I had an idea for a game called Virus. It involved a virus that lands on a cell wall, and you controlled a turret that shot antibodies at the virus. If you broke down the virus in time, no problem. If you didn’t, the top would crack open and create new viruses. I never finished it. And that was probably for the best. The game concept was missing a lot. It was not a particularly playable idea, although I imagine that it might work as a framework for a game in the tower defense genre.

I was shaken that I couldn’t finish this game and that I couldn’t think of another game that fit in this genre. I was also frustrated at the performance in Basic and I had a rough time getting anything solid in assembly language. Really, I was overly hard on myself.

What I didn’t know at the time was how this experience was going to affect me later. Four years later, in college, I created a 3D graphics application in MacPascal that ran on the original Mac. It turned out to be a stretch for the system. I discovered bugs and limitations in MacPascal, and I learned about issues with clipping and problems with divide-by-zero in perspective transformations. Still, it worked.

While I didn’t know it at the time, Budge’s library was based on a display list – a cool concept in which a drawing is either a self-drawing entity or a structure that can be ripped with a fairly simple state machine. I’ve used display lists in a number of projects. DotPDF uses display lists internally to represent page contents.
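
If you haven’t bumped into one before, here’s a minimal sketch of the idea in Swift (made-up operations, nothing like DotPDF’s internals): the list is just recorded drawing operations, and the ripper is a small state machine that replays them.

// a display list: recorded drawing operations replayed by a small state machine
enum DisplayOp {
    case moveTo(x: Double, y: Double)
    case lineTo(x: Double, y: Double)
    case setGray(Double)
}

struct Ripper {
    var x = 0.0, y = 0.0, gray = 0.0   // the current graphics state

    mutating func rip(_ ops: [DisplayOp]) {
        for op in ops {
            switch op {
            case let .moveTo(nx, ny):
                x = nx; y = ny
            case let .lineTo(nx, ny):
                print("line (\(x),\(y)) -> (\(nx),\(ny)) in gray \(gray)")
                x = nx; y = ny
            case let .setGray(g):
                gray = g
            }
        }
    }
}

var ripper = Ripper()
ripper.rip([.setGray(0.5), .moveTo(x: 0, y: 0), .lineTo(x: 10, y: 10)])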

So yes, in my youth, it was easy to give myself grief for not finishing a little game, but that was only because I didn’t know the true value of the learning experience that I would carry forward. So I don’t dwell long on abandoned projects, but I do celebrate what I have learned.