Wordsmithy and Rail Riding

This post has little to do with technology, but a lot to do with wordsmithy. Nomenclature in software engineering is a real pain in the ass. When writing a library for public consumption, it can be hard to pick the right names for things for a number of reasons. For example, the obvious name is used for something else, the non-obvious name is too obscure, the obvious name is a leaky abstraction, there is no reasonable single word or short phrase that describes what is happening, or you just can’t latch onto the right word. As Steve Martin said, “Some people have a way with words and other people … uh … not have way, I guess.”

There was a point when I was working at Bell Communications Research. A college friend of mine, Jim Blandy, was there as well. He was assigned the task of porting a text editor, sam, from the Blit terminal to the MGR windowing system. Sam was originally written by Rob Pike. Jim had been scratching his head over the source for a long time, and at one point he turned to me and asked, “What’s a rasp?” I thought he was asking about woodworking tools, so I answered, “It’s a file with holes in it.” Jim responded in a way that showed his exasperation. Within the text editor code was a structure called a RASP. It represented a non-contiguous text file on disk, i.e., a file with holes in it.

I think about words a lot. It helped that my mom had a huge vocabulary and collected such books as Mrs. Byrne’s Dictionary of Unusual, Obscure, and Preposterous Words, from which I learned such words as hircismus and papaphobia. The latter is a fear of the pope, although I can only imagine that its practical usage is narrow.

I recently read John Scalzi’s essay on the DNC convention and was struck by a couple of word choices: “shitshow” and “Wasserman-Schultz was ridden out of town on the nicest, most face-saving rail that could be found”.

Shitshow’s meaning is obvious. I just found it entertaining for its usage to describe the 2016 RNC Convention since, as another friend pointed out, there had been a norovirus outbreak at the convention, so it was both a figurative and an actual shitshow. Bravo.

Now. Rail-riding.

When I was in high school, I had a scheduling issue that resulted in me being shoe-horned into a Constitutional Law class that was well below my level so that I could meet the state requirement for history. The reasons behind this took up a long paragraph that I deleted. You don’t need to know. The teacher was a tired-looking man with less-than-great classroom management skills, who had moments of greatness. There were points when he injected small lectures about things that were decidedly not in our text, and one of them, about the process of being ridden out of town on a rail, has stuck in my mind.

[Image: a split-rail fence at Conner Prairie]

Itinerant charlatans made their way through towns in the United States. Snake oil salesmen and so on. If their scams came to light and they were caught by an angry mob, they were treated to a uniquely American punishment, which I note that Wikipedia refers to as ‘extrajudicial’ – a kinder term for mob law: tarring and feathering and being run out of town on a rail. Tarring and feathering is obvious, but my history teacher noted that the scoundrels were stripped of their clothes first. Getting a coating of hot tar is painful and long-lasting. The covering of feathers makes it harder for the miscreant to write it off as an accident at the next town.

Now, being ridden out of town on a rail. Many fences were “split-rail fences” like the one in the picture above. These were crude affairs, at best. The culprit was put on the rail, which was then hoisted up on the shoulders of a couple of men, who would then have a nice jog out of town with the perp bouncing along painfully. I wondered what kept the perp on the rail. Why not drop off and run? If he was bound, would it be better to let himself hang? Maybe the fear of further mob justice was an incentive to keep balance. I think about these things.

If you are ever under the delusion that we are more imaginative than our forebears because we have iPhones and distributed music services, just think about being run out of town on a rail and that will put those thoughts to rest.

At this point, considering how comprehensive the punishment is, I’m having difficulty thinking of a way that someone could make it gentle without it completely losing its teeth. Maybe if you left out the tar and feathers. Maybe if you used a wider, smooth, rounded rail. In other words: a log, which is no longer a rail. If you padded the rail, it becomes a parade float.

Yeah, I wonder about these things.

I’m Old, Part XVII: Burritos and Acrobat

When I worked in the printer department at Adobe, many people in our area liked to get lunch from La Costeña, a small Mexican grocery store in Mountain View that had a burrito “factory” in the back. The food was cheap, delicious, and filling. You would go through the grocery store to the back where there was a small room with a buffet. A worker would walk with you along the buffet as you pointed at the items you wanted in your burrito and they would scoop them onto the tortilla. At the end, they wrapped it up, put it into a small bag and marked on it what you got. You brought it to the front of the store, paid, and went on your way.

[Image: a breakfast burrito]

The problem with La Costeña was that many of the businesses in the area found out about it and the line to get a burrito was often long. Some people at Adobe suggested that the owner should put in a fax machine so that people could fax in orders and pick them up at the counter with a minimal wait. The owner agreed and even made a form for people to use when they were ordering their food.

Around 1992, Adobe was doing experiments to add fax capabilities to PostScript printers. There were several printers in our department that could send and receive faxes and more importantly, could send faxes automatically.

Ross Thompson, one of the engineers in our department, decided that he wanted to make a program that could order his burrito for him. It was called, oddly enough, burrito. It ran from the command line and had one of the most florid argument syntaxes I have ever encountered. It included a macro language that let you specify burritos in terms of other burritos, and of course all of this could be embedded in a ‘.costenarc’ file so you could make burritos based on burritos defined in your .costenarc file. To top it off, Ross embedded PostScript code to generate La Costeña’s fax form, complete with faithfully recreated typos, and to write the order on top of it before faxing it.

Unless I use certain syntaxes every day, I forget them quickly. I’ve had this problem with regular expressions, awk, and burrito. It’s frustrating. Since I ate at La Costeña every three weeks or so, I had to relearn the command-line syntax every single time.

Since I’m a programmer, I decided to do some work to make it easier to be lazy. I wrote a Motif-based X Windows program to run on the DECstation I used for development. The program presented a WYSIWYG version of La Costeña’s form, except with check boxes, pop-ups, and text fields that were all live. When you were done, the program would cobble together all the options then run burrito as a sub-process with the correct command line syntax to generate the form. Worked like a charm. I don’t think I ever ran burrito from the command line again. I called my program, unsurprisingly enough, xburrito.
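
The glue underneath was nothing deeper than flattening the dialog state into burrito’s command-line syntax and handing it off – something in the spirit of this sketch (hypothetical names and structure; definitely not the original Motif code):

#include <stdio.h>
#include <stdlib.h>

/* Sketch of the GUI-to-CLI glue (hypothetical; not the original xburrito).
   The dialog's state gets flattened into a burrito command line and the
   real work is delegated to Ross's program. */
struct order {
    const char *name;       /* who picks it up (-n) */
    const char *pickup;     /* pickup time (-t), e.g. "+:30" */
    const char *food_spec;  /* e.g. "b+g+cc+sf/Steve" */
};

static int place_order(const struct order *o)
{
    char cmd[512];

    snprintf(cmd, sizeof cmd, "burrito -n \"%s\" -t %s %s",
             o->name, o->pickup, o->food_spec);
    return system(cmd);     /* run burrito as a sub-process */
}

int main(void)
{
    struct order o = { "Steve", "+:30", "b+g+cc+sf/Steve" };
    return place_order(&o) == 0 ? 0 : 1;
}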

Around that time, I got scouted for the Acrobat team, which was then code-named Carousel. One evening, I was invited to meet with Frank Bozeman, who was a friend of Joe Holt. I don’t remember as much from that evening as I’d like, but Frank said I’d been described as “hot shit on a silver platter” which is a compliment, I think.

Yeah, pretty sure.

After some discussions between my department and theirs, I got slated to move to the Acrobat group. I was supposed to be on the team dedicated to full-text indexing and search in Acrobat 2.0, but I was pre-appropriated to work on Acrobat 1.0 as part of the Macintosh team, since Acrobat 1.0 was still in the demo release phase and there was a lot that needed to be done.

I knew the transition was going to be challenging with a very steep learning curve.

As an aside, I would like to say that every software job I’ve ever had since I left college has involved a very steep learning curve.

In this case, Acrobat was written in two distinct parts: the main engine, which was in portable-ish C, and the UI layer, which was written in a combination of C and hamstrung C++. The C++ implementation was whatever the Think C compiler supported, and it was missing a bunch of the more esoteric parts of the language. In addition, the UI was built using an application framework shipped with the compiler called TCL, the Think Class Library. It was OK – it solved a lot of the problems of typical Macintosh application development, but it also created all kinds of new ones in the process. You can’t have everything (“Where would you put it?” – Steven Wright).

TCL was substantial in scope. There was a lot to learn. Before working on the main Acrobat code base, I set myself a small task to get used to TCL. I decided to port xburrito to the Macintosh. Because a page full of check boxes and such seemed very un-Macintosh-like, the Mac version included a virtual tortilla and a panel of ingredients that you could drag and drop onto (or off of) the tortilla.

When you were done, you clicked the ‘order’ button and it would cobble together an order and place it on top of the same PostScript program that Ross had created and send it to your favorite printer. If it had fax, it would send it to La Costeña. This program was called MacBurrito.

Here is the article that ended up in rec.humor.funny about Ross’ burrito program:

Following is the documentation for a computer program which lives at
Adobe (PostScript/fonts/Acrobat/Photoshop/Illustrator) in Mountain
View, California. I got permission from the author to re-post it for
him.

Some background: La Costena is a Mexican restaurant local to Adobe.
Everything is made to order, and the cooks follow you down a sort of
burrito assembly line in order to customize your meal.  Not
surprisingly, the place is very popular, and there are often long
lines. Mr. Thompson has thus enlisted the aid of technology to avoid
wasting too much time in queue.

-----------------------------------------------------------------------

From thompson@mv.us.adobe.com Wed Aug 18 07:05:38 1993
Message-Id: <9308181404.AA10840@rhythmic.mv.us.adobe.com>
To: gurgle@netcom.com (Pete Gontier)
Subject: Re: Burritos 
Date: Wed, 18 Aug 93 07:04:48 MDT

Here you go.  You may want to add some editorial notes for those
unfortunate to live outside the Bay area.  By the way, this is a real
program.

- Ross

-----------------------------------------------------------------------

Tired of standing in line at La Costena?  This file documents an
automatic facility for sending a fax to La Costena that orders 1 or
more burritos, quesadillas, tacos, and whatever.  The command will
compose the fax, and send it to your favorite PostScript fax printer,
for direct transmission to La Costena, and no paper at this end will
be generated.  Then, when you get there, your food will be waiting.
No worries.

To use this, you will want to add the following lines to your .cshrc
file:

setenv BURRITOPRINTER = <printer>
alias burrito /user/thompson/public/burrito<mach>

where <mach> is dec, sun, or sparc, as appropriate.  Requests to
support other machine types will be greeted with enthusiasm if the
following conditions are met:
 1) I can get the code to compile with a minimum of effort.
	(I expect no difficulties, but you never know).
 2) I am provided with the name of a machine on which to do the build.

<printer> should be the name of a PostScript level 2 printer that
supports fax and is connected to an external phone line. I use
radiant, which is located in building E.

You will also probably want to create a .costenarc file, to define
your burrito macros in.  The one in /user/thompson/.costenarc is
designed to stand as an example that you can use.  Feel free to copy,
modify, whatever.  I think the syntax should be pretty
straightforward, if you understand how to describe a burrito.

In keeping with long standing Unix tradition, the syntax for
specifying burritos is somewhat obscure. Here is an attempt at
explanation, with some examples at the end.  For a better
understanding, the energetic reader will attempt to thoroughly
comprehend the contents of my .costenarc file.

burrito [-n "name"] [-t <time>] [-p phone#] [-9] [-d] [-x] [FoodSpec [...]]

    -n  specify the name at the top of the order blank.
	   This should be the name of the person who will pick
	   up the order.  Default is current user, as defined in
	   /etc/passwd.
    -t  specify the time at which you will pick up the order.
    	   time may be absolute 24 hour time or +delta. Default is +1:00.
    	   Note that La Costena specifies a 20 minute on small orders
	   and 60 minute on large orders minimum notification time.
    -p  specify callback number in case La Costena has questions.
	   default is as found in /usr/local/adobe/phones/adobe.phones.
	   If your phone number is not specified, and burrito can't
	   figure it out by looking in adobe.phones, an error will result,
	   and the order will not be transmitted.
	   syntaxes for phone numbers:
		    entry		interpretation
		(408)123-4567		(408)123-4567
		123-4567		(415)123-4567
		x4567			(415)962-4567
	    If you are entering the phone number on the command line
	    (instead of using a macro) please note that the ()'s need
	    to be escaped: \\(408\\)....
    -9	dial "9" before dialing the La Costena number.
  The following two options are installed primarily to help me debug
  the code.  There is probably no reason for general use of these options,
  unless you have some perverse desire to see the guts of this thing
  in operation.
    -d  debug: print the file locally rather than faxing it.
    -x  xmit off: don't run the shell script at all.  PostScript file
	will be left in ~/.faxorder.ps

Up to six FoodSpecs can be specified:

FoodSpec::=<type>[options*][/<name>]
<type> ::= [b|t|m|q|T]
    (burrito, taco, mexico city, quesadilla, Taqitaco)
options:
    +g  gucamole
    +c  cheddar
    +cc cotija
    +cl cilantro
    +cm monterey
    +i"note" special instructions (e.g. black beans, no rice, etc.)
	NOTE: the "s need to be escaped if the shell sees them:
	   +i\"note\"
    +j  whole jalepenos
    +jf fresh jalapenos
    +js sliced jalapenos
    +n:<i> <i> copies of this food item. (default = 1)
    +o  olives
    +s  medium salsa
    +sc sour cream
    +sf fire salsa
    +sh hot salsa
    +sm mild salsa
    +sv salsa verde
    +t  tomato
    +v:ca carne azada
    +v:cc chile colorado
    +v:cv chile verde
    +v:f  fiesta
    +v:l  lengua  (beef tongue)
    +v:m  mole    (chicken)
    +v:p  pastor
    +v:pb pollo borracho
    +v:rb rice and beans (default)
    +v:v  vegetarian
    +z:l  large
    +z:r  regular (default)
    +z:c  chico (small)

    -[option] cancels option.  Not valid for ":" options or +i.  
       This is useful for modifying burrito macros specified in
        .costenarc file.

example:

burrito -time +:30 b+g+cc+jf+jf+sf+sc-sc+i"Black Beans"+n:2/Ross \
   b+v:cc+g+cm+sc+i"no rice"/Kathie

interpretation:
logged in user will pick up an order in 30 minutes.
  Ross wants two rice and bean (default) burritos with
    guacamole
    cotija
    fresh jalepenos (double)
    fire salsa
    no sour cream (cancelled)
    Black beans (comment)
  Kathie wants a Chile Colorado burrito with
    guacamole
    montery cheese
    sour cream
    no rice (comment)

You should keep your +i comments short, because there isn't much space on
the form for them, and the space is not used particularly well by my
PostScript program.  "Black beans, no rice" is about as long a
message as it can handle.

FILES:
	/etc/passwd
	/usr/local/adobe/phones/adobe.phones
	$HOME/.costenarc
	$HOME/.faxorder.ps

ENVIRONMENT VARIABLES:
	BURRITOPRINTER

BUGS:
   There's all kinds of ways to break this thing.  The lines in your
\.costenarc file should be less than 1000 characters, or the stack will
get trashed.  The PostScript program does not make particularly good
use of the "Comments" section of the form (controlled by the "+i"
switch), and doesn't detect when it is writing things off the side of
the page.  I have no idea what will happen if the disk is full when
burrito tries to write the .faxorder.ps file, or if it can't open it
because the directory is protected, or whatever.  But if you are
reasonable in your expectations of the program, and don't try to break
it, I think you'll find that it's adequate.  For bug reports, see my
comments below about future enhancements.

FUTURE ENHANCEMENTS:
   This is the kind of thing that everybody will have suggestions on
how to improve.  I will duly record every feature enhancement request,
but I can't promise that I'll do any more than that.  I have no
intention whatsoever, for example, of writing a Graphical User
Interface for this thing, even though so many people think that it's a
natural.  (However, see below.)  As I have said (many times) before
about this: "When I'm done with the program, you are more than welcome
to add any features that you wish."  Well, I'm (essentially) done.
Anyone who wants the source, it's in /user/thompson/public/burrito.c.
Gombata Kodesai.

ACKNOWLEDGMENTS:
    Steve Hawley has written two programs that make use of burrito
technology.  I don't know anything about them except that they exist,
but I thought I would mention them for what it's worth.  As far as I
know, they were both done as intellectual exercises, and are not
necessarily supported.
    - xburrito is a GUI overlay to the burrito program, which makes
	use of the motif library to animate the La Costena order form.
	This has been successfully run on DecStations, Sparcs, and
	perhaps other platforms.
    - macburrito is a gooey which runs on Macs.  It has the look and
	feel of a real burrito: The user interface involves throwing
	toppings onto a tortilla.

I’m Old, Part XVI: One of Many Screw Ups

If you write code, you write bugs. It’s inevitable; it’s human. It’s what happens after that’s important.


As I got to be a more experienced programmer, I learned habits to catch bugs either by the compiler (ideally), or very quickly in the test cycle. Mind you, the tools have gotten much better and there is less reliance on doing things like writing custom allocators or sub-allocators, writing your own linked list or growable array code, and so on.

The first printer I worked on at Adobe, for DEC, had a chunk of EEROM/NVRAM on the controller. It had something like 128 bytes available and I didn’t own all of them. Since we were creating a PostScript cartridge, we had to share the memory with the hosting controller. As the project went along, there were a number of things that got added to the NVRAM, not the least of which was the serial port configuration, which got complicated since DEC wanted a bazillion baud rates, stop bits, parity, handshaking, and so on.

After all of the changes, I had coded a very subtle bug into the NVRAM handler. The read/write code worked just fine if the NVRAM had been initialized, but if it had never been initialized, the printer would execute a dreaded routine in the PostScript code base called CantHappen().

If you ever hit that routine in a debug build, the printer dropped into its low-level monitor and you could walk the stack to figure out how you got there. If you hit that routine in a release build, the printer rebooted.

Can you see the problem?
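
The general shape of the mistake looked something like this – hypothetical names and layout, not the actual DEC/Adobe code, but the CantHappen() call on the “impossible” path is the heart of it:

#include <stdint.h>

/* Hypothetical NVRAM layout: a magic word that says "this has been set up",
   followed by the packed settings (serial configuration and friends). */
#define NVRAM_MAGIC 0x5053u

typedef struct {
    uint16_t magic;        /* should be NVRAM_MAGIC once initialized */
    uint8_t  baud_code;    /* packed baud rate selection */
    uint8_t  line_params;  /* parity, stop bits, handshaking, ... */
} SerialConfig;

extern void CantHappen(void);                    /* monitor in debug, reboot in release */
extern void nvram_read(void *dst, int len);      /* hypothetical NVRAM access */
extern void write_defaults(SerialConfig *cfg);   /* hypothetical factory defaults */

void load_serial_config(SerialConfig *cfg)
{
    nvram_read(cfg, sizeof *cfg);
    if (cfg->magic != NVRAM_MAGIC) {
        /* The bug: this branch was treated as impossible, but it is exactly
           the branch a factory-fresh, never-initialized part takes. */
        CantHappen();
        /* What it needed to do instead: write_defaults(cfg), set the magic,
           and write the block back out to NVRAM. */
    }
}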

DEC sent the final accepted code off to a facility to make masked ROMs, which is an expensive process with a long lead time, but in quantity, it’s far cheaper than using EPROMs or One Time Programmable (OTP) devices/cartridges. DEC had gotten the first shipment of masked ROMs built into cartridges and when plugged into a printer, the printer would boot and start PostScript. The NVRAM was uninitialized and this triggered the failure. The code executed CantHappen(), then rebooted and the cycle repeated.

DEC called us and they were, understandably, at their wits’ end. They had no idea what had happened and what was going on.

It took me a couple of days to figure out, and once we did, the negotiations began. We explored all the options. DEC needed to get the product pipeline going and waiting for more masked ROMs was not in the cards. We talked about covering the cost of a set of OTP cartridges that they could use until more ROMs were burned – that was possible, but not ideal because OTPs of that size were costly.

I suggested an alternative – not ideal, but it worked. I could write a “special” version of the cartridge that contained no PostScript, but when it booted, it would detect uninitialized NVRAM and write to it in a way that the faulty cartridges would accept, then put a message on the front panel letting the user know that the printer was ready. It was awkward and confusing for the user, but they could get this code into the smallest OTP and add it into the box with a note to use the initialization cartridge first. Adobe offered to cover the cost of the OTP cartridges for DEC until a new set of masked ROMs could be made.

Because of this, I made our printer QA rather unhappy. Whenever a new class of bug is found, printer QA gets another task in their long list of things to do. In this case, part of every acceptance test for every printer from then on that had NVRAM included “remove the NVRAM and put in a new chip, boot the printer, ensure the NVRAM has been properly initialized.”

While I worked on the “fix it” cartridge, I had a paper cup on my desk of “used” NVRAM. They were perfectly good chips, but they all had been initialized and were unsuitable for my testing. It didn’t take me too long to write the fix, but it did take time to go through both our and their QA departments.

Nobody foresaw this bug and I don’t know if there was a good way to have done so. I think the takeaway from this situation is to do what we did: understand the nature of the bug, list out all the solutions and pick the one that works best for everyone in the circumstances.

I’m not proud of creating the problem, but I was happy with the solution. DEC was not happy with the bug, but I think they were happy with how Adobe management handled the situation. They were happy enough to continue a working relationship with Adobe and happy enough to have me on their next product.

But that’s a story for another day.

I’m Old, Part XV: The Bug Hunt

One of the things that happened routinely at Adobe was The Bug Hunt. When a new product was slated for release from the Applications department, QA would reserve a training room, arrange for the cafeteria to provide free sandwiches and cookies,  and host a bug hunt.


In the bug hunt, employees who didn’t work on the product could sit down and try out this new version of the application and try to find bugs in it. They had means of classifying the bugs and would very often set up prizes for certain numbers of unique serious bugs found.

If the engineers were foolish, they would hang around and watch the carnage. When there was a bug hunt, word spread like wildfire and very often, engineers from one project would descend together to ruin the next few weeks for the engineers working on the software under bug hunt.

There was a lot of vengeance visited upon the engineers from one team onto another as a result. “Oh, remember that time when you ruined my life on the last release of Illustrator? Payback is a bitch.”

There was a release of Type Align for Windows. The product would let you draw out a Bezier curve and then fit the baseline of the text to that curve. Then it would give you a parallel curve along the top of the text and you could modify either curve and warp the letters.

I sat down and selected the Bezier tool and started sketching. And sketching. And sketching. I kept going and going, treating the Bezier tool as if it were the pencil tool in MacPaint until I had painted the entire screen black.

Why? Well, I’m a programmer and I guessed that they had set aside a chunk of memory to record points from the cursor, for fitting a curve to later. I also guessed that the chunk of memory that they had chosen was big, but not ridiculously big, which was why I was sketching for a good 10 minutes.

What ended up happening was that I overflowed the chunk of memory they had set aside and the program started recording points onto its own stack. When I let go of the mouse, I took out the entire system. Blue screen. Booyah!

Another favorite of mine was for apps running on any version of MacOS before X. I would create a document and name it “^0” and save it. Then I would make a change to that document and close the document or quit the app. The app would inevitably put up a box that said, “Do you really want to discard changes to ‘^0’?”. Or at least it would try. MacOS had a built-in tool for putting up these boxes with OK and Cancel buttons. To prepare the box, there was a toolbox routine you would call named ParamText(). You could pass in up to 4 strings that it would use to replace text within the next Alert box that was displayed. Wait – how would it know what to replace? Well, if your Alert box had the characters ^0, ^1, ^2, or ^3 in it, it would substitute the parameters into those slots.

Problem was that if your Alert text was “Do you really want to discard changes to ‘^0’?”, and your file was named ^0, the Alert code would substitute ‘^0’ for ‘^0’ and continue. Then it would substitute ‘^0’ for ‘^0’ and continue. And if you haven’t figured it out, the Mac would lock up, hard.
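
If you want to see the lockup mechanically, here is a rough guess at the shape of the substitution loop – simplified, and certainly not Apple’s actual Toolbox code – with the pathological file name fed in:

#include <stdio.h>
#include <string.h>

/* A guess at the shape of the alert-text substitution (not Apple's code):
   find "^0".."^3", splice in the corresponding parameter, then re-scan from
   the start.  If the parameter itself is "^0", the scanner finds a fresh
   "^0" every time and never terminates. */
static void substitute(char *text, size_t cap, const char *params[4])
{
    char *caret;
    while ((caret = strstr(text, "^")) != NULL &&
           caret[1] >= '0' && caret[1] <= '3') {
        const char *repl = params[caret[1] - '0'];
        size_t head = (size_t)(caret - text);
        size_t rlen = strlen(repl);
        size_t tail = strlen(caret + 2);
        if (head + rlen + tail + 1 > cap)
            break;                              /* out of room; give up */
        memmove(caret + rlen, caret + 2, tail + 1);
        memcpy(caret, repl, rlen);
        /* loop re-scans from the start -- with repl == "^0" this spins forever */
    }
}

int main(void)
{
    char alert[256] = "Do you really want to discard changes to '^0'?";
    const char *params[4] = { "^0", "", "", "" };   /* the document named "^0" */
    substitute(alert, sizeof alert, params);        /* never returns */
    puts(alert);
    return 0;
}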

So even though this really convenient routine existed, you should never call it because of this problem. Yet, lots of engineers are either hurried or lazy or both and just use this routine to get the task done.

Adobe released an early WYSIWYG web page editor. It was from a new group that had come in from an acquisition. They had never encountered an Adobe bug hunt before. My goodness, it was like shooting fish in a barrel. On this particular bug hunt, they were offering gift certificates to Tower Records (remember them?). I ended up with a certificate for $150 for the bugs that I found. For example, I typed in a sentence and did the following (using keyboard shortcuts):

Select All, Copy, Paste, Paste, Select All, Copy, Paste, Paste, etc.

It didn’t take long until I took out the machine. Suckers.

Then again, I took the same pain when Acrobat went into bug hunts.

I’m Old, Part XVI: Lunchtime Camaraderie

One of the things that was nice about Adobe in the 90’s was that it had a wonderful cafeteria. The chefs that worked there were imaginative and worked with some pretty nice ingredients. I remember the time that Chef Rob pulled me aside to show me the whole swordfish that had just been delivered which was going to be the next day’s special.


On the Acrobat team, there was a group of us who frequently had lunch together: Alan Wootton, Mike Pell, Nathan Graham, Mike Diamond, and me. There were sometimes other people who drifted in and out: Mark Sonnes, Rob Heiser, or Dina Sakahara from QA; or people from other teams.

We walked from our building to the building with the cafeteria, which was all of 50 yards along a sidewalk that was lined with jasmine. In the winter, there was an occasional patch of rain. I used to tease the people on my team because they would go running for coats and umbrellas. I just walked. I went to college in Ohio and saw rain that came down sideways where your umbrella made no difference at all.

For whatever reason, we tended to sit at a round table in the corner if it was open, and spent the next 45 minutes or so bullshitting about geek stuff. At the time, Star Trek: The Next Generation was still in first run, so we gabbed about the previous episode of that or the Simpsons or whatever was new and exciting.

One of my favorite things was something we called “Alan On”. This was when Alan Wootton would start talking, nay, sermonizing about a subject. Alan on Dating. Alan on Science Fiction. Alan on Los Angeles. Alan on Losing Weight. Alan on Management. He took our ribbing in a good-natured way and I’m glad it didn’t deter him from the process.

While we were working on Acrobat 1.0, we had an interesting problem that happened very often at lunch. It seemed that the time that we slated for eating lunch was just a little bit earlier than the time that John Warnock, the CEO, picked for his.

John was very excited about the potential of Acrobat and was quick to tell us that and talk about whatever feature he had tried from the build that week. John had a top of the line Macintosh Quadra on his desk and he was always checking the builds for something new. If he saw something that sparked an idea, he would stop one of us (usually Alan) and tell him about it.

Because of this we invented a verb (and trust me, John, this was kind-hearted):

war•nock – v. to interrupt an engineer and give them an out-of-band task or feature to implement.

All of us at some point, on the way out from lunch, got warnocked and sat back down with John to find out what was next on our plate.

Alan got warnocked when we were in the Golden Master phase of Acrobat 1.0. This was the point when the software engineers waited on pins and needles to see if a particular build passed QA and when the QA engineers were working their butts off checking on fixed bugs, looking for recidivation on older bugs, and searching for new ones. Engineers aren’t supposed to change code during this time. Well, when the CEO pulls you aside and gives you a feature, you have to do a certain calculus to figure out which is going to get you in more trouble: doing what the CEO says (and pissing off your manager) or toeing the department line and pissing off the CEO. Alan chose to acquiesce to John, and it would’ve been all well and good if QA hadn’t found a bug in the feature – which was the first that QA or management had heard about this feature that had appeared during Golden Master.

Alan ultimately suffered for this. He was seen by his boss as a bit of a cowboy coder and unmanageable. When we started on 2.0, Alan was given task after task that was either well beneath him or a waste of time or both. I know that in reality, it was probably not that bad, but Alan took the hint and decided to move on and start doing research for a company that he intended to found later.

But that’s a story for another day.

I’m Old, Part XV: PostScript Printer Development

[Image: a Turbo button]

When I worked at Adobe on PostScript printers, there were four choices for development, and one of them was a non-choice. The best was if you had a system where the print controller was a device running a real operating system. This would mean that you could use actual development tools, like a debugger, on the system when it was live. Few of the printers had this. Next best was to have an I.C.E. (In-Circuit Emulator), which could let you do all kinds of nifty things like halt the processor when a particular value was written to or read from a memory location or range. This was fantastic for tracking down heap smashers and is still better than the state of the art for desktop development, IMHO.

Next, and most common, was a system where the controller was rigged to have extra RAM and there was a simple monitor that could download code into RAM for execution. If you were lucky, the monitor had some break point facilities, but they were usually very primitive in that you had to pull up the symbol table on the host, search for the function you wanted to break on, then type that address into the monitor to set a break point (or use it as a stepping-off point to disassemble the code and find the actual address). The monitor would typically replace the instruction at the address you specified with an instruction that would generate an interrupt so that the monitor could take over.
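
The mechanics were roughly this – a hedged sketch, since the details varied from monitor to monitor, and it only works because the code under test is sitting in RAM: save the instruction word at the target address, replace it with a trap the monitor owns, and put the original back when you want to resume.

#include <stdint.h>

/* TRAP #15 on the 68000 family is opcode 0x4E4F; assume the monitor owns
   that trap vector, so executing it hands control to the monitor. */
#define TRAP_15_OPCODE 0x4E4Fu

typedef struct {
    volatile uint16_t *addr;   /* where the breakpoint was planted */
    uint16_t           saved;  /* the original instruction word */
} Breakpoint;

static void set_breakpoint(Breakpoint *bp, volatile uint16_t *addr)
{
    bp->addr  = addr;
    bp->saved = *addr;          /* remember the real instruction */
    *addr     = TRAP_15_OPCODE; /* executing this address now traps to the monitor */
}

static void clear_breakpoint(const Breakpoint *bp)
{
    *bp->addr = bp->saved;      /* restore the original instruction to resume */
}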

The worst way to develop was to compile, burn ROMs, plug in the ROMs and test the code. Totally sucked, and I was glad I had to do it only rarely.

During all of this, there was a system that was built by Treve Bonser called a Soft Printer, which did its best to simulate a typical printer controller, but ran entirely on the host. This allowed you to take advantage of host debugging tools that didn’t totally stink, but to be honest, in the 1990’s, debuggers on the hosts weren’t very good.

Most of my work was done in the RAM-based version.

This was OK, but it wasn’t great. When I compiled a new version of the code, I could look forward to having to download it over a serial connection to the printer. Several megabytes at 19200 baud was still significant. Ed McCreight, an ex-Xerox engineer who co-invented the B-Tree, had made a nifty device that hooked into the SCSI port of the host and a ROM socket on the printer. With that in place, the monitor could pull in code via SCSI, which ran at around 10-20 megabytes/sec – way better than serial.
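
To put a rough number on it (assuming a start and a stop bit per byte): 19200 baud moves about 1,920 bytes a second, so a 2 MB image is on the order of 2,097,152 ÷ 1,920 ≈ 1,090 seconds – call it 18 minutes per download – while the same image over even a 10 megabyte/sec SCSI link takes a fraction of a second.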

With some help, I made a specialized socket for the 68K processor on one system with a push button that connected the non-maskable interrupt pin of the CPU to ground. This let me break into the monitor by pushing a button.

Still, the system was hardly ideal. RAM and ROM are fundamentally different and there are bugs that show when you run in RAM that don’t show in ROM and vice versa. Irritating. In addition, when you’re debugging, you typically end up commandeering one of the serial ports of the printer for communications to the monitor, which is a hassle when you have a bug reported that “sometimes you get a blank page when the printer is reading from both serial ports at the same time.”

In addition, there is an unfortunately too-common problem where the CPU jumps to address 0, which is typically the result of calling a NULL function pointer. It shouldn’t happen, but it does because programmers are lazy or sloppy or both.

With the 68K, low memory is dedicated to interrupt vectors. The problem with this is that the typical interrupt vectors look like valid code to the CPU, so when you jump to 0, it executes the interrupt vectors and often jumps into space, after having destroyed the contents of the CPU registers, making it hard to figure out (1) where you came from and (2) what the CPU state was when it happened.

There was an early interrupt vector that was guaranteed not to be used on our hardware, so I made a change to the master source code that would ensure that that vector was set to a value that would break into the monitor. Ta da.
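
In spirit, the change amounted to something like this sketch (the vector number and the value planted in it are hypothetical; the real change lived in the master source for the vector table):

#include <stdint.h>

/* On the 68K, the exception vector table starts at address 0, so a call
   through a NULL function pointer starts executing the table as if it were
   code.  Planting the ILLEGAL opcode (0x4AFC) in an early, known-unused
   vector slot means the runaway execution trips over it almost immediately
   and breaks into the monitor instead of wandering off into space. */
#define UNUSED_VECTOR   5            /* hypothetical slot, unused on that hardware */
#define ILLEGAL_PAIR    0x4AFC4AFCul /* two ILLEGAL instructions back to back */

static void poison_unused_vector(void)
{
    volatile uint32_t *slot = (volatile uint32_t *)(uintptr_t)(UNUSED_VECTOR * 4);

    *slot = ILLEGAL_PAIR;            /* never taken as a vector; fatal if executed as code */
}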

One of my favorite hardware mods was done to a controller used by Ross Thompson. If I recall correctly, the controller was mounted in an old PC case because it was there and the power supply was adequate. Ross insisted that the hardware engineer wire up the “Turbo” button so that the indicator lit up. It didn’t do anything else, but Ross was happier to have a turbo button.

I’m Old, Part XIV: Corporate Culture

I’m an engineer and I have always worked on things that were challenging to me. As such, I often got stuck on problems. Over time, I’ve developed a process that helps me solve them, but it is entirely non-linear. There are very few challenges that I’ve solved where I can say, “Yes, I looked at the problem, analyzed it, and brought it through to a solution.” If I do that, then the problem is not challenging. It is, in fact, the exact definition of straightforward.


Mind you, something that is straightforward to me might not be the same to you (and vice versa).

My process involves doing things that are orthogonal to the problem. For example, when I was stuck on a problem on a printer I was working on at Adobe, I found that modelling my cube in a ray tracer helped me get past my sticking point.

Other times, I would just walk around the buildings for a while and see what other people were working on. I highly recommend doing this. You need a modicum of tact to know when to interrupt and when to leave well enough alone, but there are benefits for multiple sides. First, you get to see what the other creative people in your company are doing. This is great because it removes barriers and you get a wide appreciation of what’s going on. Second, someone else’s cool work can be an inspiration to you. Third, chances are you’ll see someone who is stuck in their own problem domain and you can help them out. Fourth, sometimes it’s fun to play the “wouldn’t it be cool if” game with other creative people.

On my walks, I got to see an expansion card that was being built for an existing printer to add Fax send/receive. This card had a Z-80 on it separate from the printer CPU. Wouldn’t it be cool if it ran CP/M? Then you could run WordStar on your printer!

I got to see type designers working on new fonts.

I got to see pre-release versions of Photoshop and Illustrator. At one point, Joe Holt had gone to a hackathon at Apple and rebuilt Illustrator to use a new renderer based on a pre-release technology from Apple called “Serrano”. Joe also created some built-in tools to show the relative performance, dubbed the Serranodometer.

One of the things I also liked was seeing how other engineers built their spaces. Some were austere, some decorative, some sloppy (like mine). Dick Sweet was a senior printer engineer who decided to create a candy store in his office: Dick’s Sweet Shop. He had shelves of containers of candy and an honor jar for paying for what you wanted. Other engineers added to the environment. At one point, Dick had gotten some Flintstones candy and Joel made a new label for the jar that had the music to the Flintstones theme song with modified lyrics, “Flintstones, eat the Flintstones”. I’m pretty sure he hand-coded the PostScript to generate the label with the font Sonata for the music.

I asked Dick what the raison d’être was for his office and his answer was simple and consistent with a software engineer: “To make me look thinner by making other people fatter.”

I’m Old, Part XIII: Rendering Spheres

When I was in college, I wrote a program that could render a sphere lit from a single point light source. I wrote it in C on a Mac Plus and the output was a window with a 1-bit dithered image of the sphere.

Aside – dithering, in imaging, is the process of reducing an image with some number of bits per pixel to an image with fewer bits per pixel. The Mac Plus could only do black or white (no gray), so that’s 1 bit per pixel. A couple of years later, I ended up writing a chapter in the first volume of the book Graphics Gems on ordered dithering.
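
For the curious, ordered dithering fits in a few lines. This is an illustrative sketch using the classic 4×4 Bayer matrix – not the code from the book or from the sphere program:

#include <stdint.h>

/* Classic 4x4 Bayer matrix.  Each entry n maps to a threshold of roughly
   (n + 0.5) / 16 of full scale; comparing each gray pixel against the
   threshold for its (x, y) position trades spatial resolution for
   apparent gray levels on a 1-bit display. */
static const uint8_t bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

/* Convert an 8-bit grayscale image (0..255) to 1-bit black/white in place:
   1 = white, 0 = black, one byte per pixel for simplicity. */
void ordered_dither(uint8_t *gray, int width, int height)
{
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int threshold = (bayer4[y & 3][x & 3] * 255 + 127) / 16;
            gray[y * width + x] = (gray[y * width + x] > threshold) ? 1 : 0;
        }
    }
}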

It ran OK, and it looked pretty decent, but I wanted a larger image and didn’t want to wait too long for output. While the Macintosh was ostensibly faster running at 8 MHz, the school’s VAX had native floating point arithmetic, whereas my Mac did not.

I ported the program to the CS department’s VAX 11/750 and asked to borrow the reference manual for the line printer in the CS lab. I wrote the program such that it would print size information and gray scale values and then, this being Unix, wrote a separate program that could consume that output, dither it, then convert it to codes to send to the printer to print out an image.

The output was very satisfying. I set it up to print a sphere with a 512 pixel radius, which just fit onto one sheet of paper. It was a little slow though. The image took roughly an hour for the calculations.

[Image: still from High Anxiety]

Like Brophy in High Anxiety, I wanted to blow it up. I thought that 4 pages by 4 pages would be good. I couldn’t do this in one shot because students had a limited amount of disk space and I would be out of space before half the image had rendered and it would take 16 hours. So I modified the program so that I could render it in tiles. Each tile would take an hour, but if I ran them in parallel, I might do much better than 16 hours. Plus, each job would use none of my disk space since they would just end up in the print queue. I wrote a script to generate all the pages as background jobs and set it loose.

And I very quickly drove the poor VAX into the ground. Oh well. I could wait.

Unfortunately, nobody else could and the head of the CS department was one of those people. When he could get a slice of CPU time, he spotted 16 jobs all belonging to me and all topping out the CPU. He killed them all and sent me an email to come talk to him. I did and explained what I was doing and he grudgingly agreed to let me run it to completion but I could only run the jobs late at night and only one at a time. Damn. Still, I did just that and ran maybe two or three per night, serially. After a week, I collected the pages, cut off the borders and taped the pages together and put it on the wall in the lab.

Not too long after, Mike Dashow made a little drawing of the Millennium Falcon with a speech balloon that read “I’ve got a bad feeling about this.”. He taped it to the wall next to the sphere. The whole thing stayed there for the next year.

I considered going up to 8×8, but decided against it because it would take me a month if I was consistent and printing out the sphere was also a little hard on the ribbon cartridge that the printer used. The pages from the 4×4 were looking decidedly lighter in the last few panels, and I didn’t have the spare cash to buy my own printer ribbons, so that didn’t happen.

I don’t have the original code anymore, but this evening I recreated it in F#, running in Mono on my Toshiba x64 laptop, running Ubuntu. Took me about an hour. I think the original code took me 4 or 5.

Here is the output of a 512 radius sphere, no dithering, just continuous tone gray.

[Image: the rendered sphere]

Compared to the one hour of the original version on the VAX, this took 365 milliseconds. For grins, I upped it to the same size as the 4×4. It finished in 7.2 seconds.

For grins, here’s the source code. No command line arguments, and it writes a JPEG image to a fixed file, but hey.

open System
open System.Drawing
open System.Drawing.Imaging
 
type Vector = {
    X:single; Y:single; Z:single
}
 
let inline dot u v = u.X * v.X + u.Y * v.Y + u.Z * v.Z
 
let length2 v = dot v v
 
let inline length v = length2 v |> sqrt
 
let normalize v =
    let len = length v
    { X = v.X / len; Y = v.Y / len; Z = v.Z / len }
 
let cosanglebetween u v = (dot u v) / (length u * length v)
 
let pointOnSphere r x y =
    // x^2 + y^2 + z^2 = r^2
    // z^2 = r^2 - x^2 - y^2
    let z2 = (r * r) - (x * x) - (y * y)
    if z2 >= 0.0f then Some(sqrt z2, -sqrt z2) else None
 
 
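// Shade the point (x, y): if it lies on the sphere of radius r, brightness is the
// clamped cosine of the angle between the light position and the surface point;
// anything off the sphere gets the background level.  (The ambient parameter is
// carried along but not actually used in this quick recreation.)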
let illum background ambient lightpos r x y =
    let lighting z1 z2 =
        let v1 = { X = x; Y = y; Z = z1 }
        let v2 = { X = x; Y = y; Z = z2 }
        let v = if length2 v1 < length2 v2 then v1 else v2
        let cosangle = cosanglebetween lightpos v
        if cosangle < 0.0f then 0.0f else cosangle
    match pointOnSphere r x y with
    | Some(posz, negz) -> lighting posz negz
    | None -> background
 
let kSphereRad = 512.0f
let kLightPos = { X = 2048.0f; Y = -2048.0f; Z = -2048.0f }
let kAmbient = 0.1f
let kBackground = 0.2f
 
let myIllum = illum kBackground kAmbient kLightPos kSphereRad
 
let toPixelValue brightness =
    let ibright = (brightness * 255.0f) |> int
    Color.FromArgb(ibright, ibright, ibright)
 
[<EntryPoint>]
let main argv = 
    let startTime = Environment.TickCount
    let bm = new Bitmap(2 * (int kSphereRad), 2 * (int kSphereRad), PixelFormat.Format24bppRgb)
    for y = 0 to ((int)(2.0f * kSphereRad)) - 1 do
        for x = 0 to ((int)(2.0f * kSphereRad)) - 1 do
            let pv = myIllum ((float32 x)-kSphereRad) ((float32 y)-kSphereRad) |> toPixelValue
            bm.SetPixel(x, y, pv)
    let endTime = Environment.TickCount
    printf "total time: %d ticks\n" (endTime - startTime)
 
    bm.Save("output.jpg", ImageFormat.Jpeg);
    0

I’m Old, Part XII: Works for Me

 

While working on Adobe Acrobat, the worst bugs that I had to work on were the ones that had a mystical recipe from QA. When reproducing a bug took 10 or more steps, you knew you were in trouble. On top of that, we would often get bugs that happened on the QA tech’s machine, but not on ours. This was not a surprise – in most cases, QA was running a release build of the code and engineers were running a debug build. Since the code size of the two builds is different, the memory layout is different. That alone is enough to produce a “works for me” situation. It doesn’t help that the code flow, as well as the register allocation for local variables, is subtly different in a release build than in a debug build.

Worse on top of worse is the set of bugs that include a complex recipe as well as “crashes sometimes”.

We had one QA engineer, Dina Sakahara, who was incredibly tenacious about cutting the number of steps down as well as turning a “crashes sometimes” bug into a “crashes every time”.

At one point, I was assigned a bug that I could not reproduce. Worse, by the time it manifested, the actual damage had probably happened a long time before. One class of bug in this category is called a “heap smasher”, and it’s a classic Heisenbug: code writes into a chunk of memory that it shouldn’t and damages a data structure. The damage isn’t always manifest until much later in time.
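
The canonical toy version looks like this (illustrative only – the real thing is never this obvious):

#include <stdlib.h>
#include <string.h>

/* Toy heap smasher (illustrative).  The buffer is one byte too small for the
   string plus its terminator, so strcpy writes one byte past the end of the
   allocation.  Nothing fails here and now; the damage surfaces much later,
   when some unrelated allocation or free trips over the corrupted heap
   bookkeeping. */
int main(void)
{
    char *name = malloc(8);         /* room for 7 characters plus the NUL */
    if (name == NULL)
        return 1;

    strcpy(name, "Acrobat!");       /* 8 characters + NUL = 9 bytes: one too many */

    /* ... much later, somewhere else entirely ... */
    char *other = malloc(32);       /* may crash, may not -- depends on the build */
    free(other);
    free(name);
    return 0;
}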

On the Macintosh, Acrobat 1.0 was written in a combination of C and C++. Well, actually, the compiler we were using for the Mac wasn’t fully C++. It was close, though.

C, like most high level languages, allocates memory for local variables on the machine stack. This is very convenient and takes a scant few instructions to do on most architectures. One problem is that C doesn’t initialize local variables by default. This means that if you use a variable before it’s been initialized, you have garbage in your variable. Maybe. Sometimes, it might be conveniently correct. Like in the debug build, for example. Sometimes, it might be inconveniently incorrect. Like in the release build, for example. The end result is that you might end up shotgunning the heap in release, but not in debug.
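
A toy version of the trap (illustrative, not anything from Acrobat):

#include <stdio.h>

/* 'title' is only assigned on one path.  On the other path it holds whatever
   garbage happened to be on the stack or in the register chosen for it.  In
   a debug build that garbage may coincidentally be a valid pointer; in a
   release build, with different code and register layout, it may point
   anywhere. */
static void show_title(const char *doc_name)
{
    const char *title;              /* never initialized */

    if (doc_name != NULL)
        title = doc_name;

    printf("title: %s\n", title);   /* garbage when doc_name was NULL */
}

int main(void)
{
    show_title("report.pdf");       /* fine */
    show_title(NULL);               /* undefined behavior: uses uninitialized 'title' */
    return 0;
}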

I had one of these bugs and I spent 3 weeks trying to narrow it down and was making no progress at all. Crap.

While I thought through what causes this class of problem, what I wanted was a way to change the semantics of C such that local variables were initialized to known values.

Looking over the compiler documentation, I read up on its ability to do code profiling. Profiling is a way to measure how long each function takes as well as how often it gets called. The compiler had the ability to inject measurement code into every function. Even better, it had the ability to make that code replaceable with custom code. Ha! The game is afoot!

I wrote a custom profiler that did no profiling at all. Instead, when it was called to start profiling a function, it would execute some code that I crafted to walk the stack back to the function, find the code that set up the local variables, and write carefully crafted garbage into that space. I think I repeatedly wrote 0xdeadbeef into that space. If a local variable was a pointer to memory, the moment that it got used, the program would crash hard. I also initialized all the registers to 0xdeadbeef.
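
I don’t remember the exact stack-walking details, but the essence of the trick fits in a few lines: stamp a poison pattern over the freshly reserved local-variable area before the function gets a chance to use it. A sketch (the frame discovery is compiler- and ABI-specific, so here it is simply handed in):

#include <stddef.h>
#include <stdint.h>
#include <string.h>

#define POISON 0xDEADBEEFu

/* Sketch of the poison-fill idea.  In the real thing, the compiler's
   profiling hook ran at every function entry and located the caller's
   local-variable area itself; here its base and size are parameters,
   because that part is entirely compiler- and ABI-specific. */
void poison_locals(void *frame_base, size_t frame_size)
{
    unsigned char *p = frame_base;
    uint32_t pattern = POISON;
    size_t i;

    /* Any pointer that gets used before being assigned now holds 0xDEADBEEF,
       which faults immediately instead of "working" by accident. */
    for (i = 0; i + sizeof pattern <= frame_size; i += sizeof pattern)
        memcpy(p + i, &pattern, sizeof pattern);
}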

When I unleashed this code on the debug build, I found crash after crash. Some were in Macintosh-only code, some in shared code. After fixing the problems and checking the code in, my bug went away, as did a bunch of other bugs assigned to other engineers on Mac, Windows, and DOS.

I wasn’t happy, though. I didn’t really have proof that I fixed the bug I was after. I know I fixed many bugs, but I couldn’t ever say that this was one of them.

At this point, most programming languages have a solution to this particular problem:

  1. Issue an optional warning on use of uninitialized data
  2. Forbid the use of uninitialized data
  3. Initialize anything not specified otherwise to 0’s

One of the reasons this was such a problem in C was the specification of where local variables live within a function.

In both C and Pascal, local variables are declared immediately after a block start and before lines of code. This makes the language grammar a little less complicated and often makes the compiler easier to write, but the cost is brutal in terms of bugs.

C++ (and most other later languages) let you declare variables almost anywhere in the code stream. This is a good thing – having a variable declaration closer to use reduces bugs as well as makes refactoring easier.
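
The difference in a nutshell (illustrative):

#include <stdio.h>

/* C89/Pascal style: declarations pile up at the top of the block, far from
   their first use, and nothing forces them to be initialized. */
void old_style(void)
{
    int total;
    int i;
    /* ... dozens of lines of unrelated work ... */
    for (i = 0; i < 10; i++)
        total += i;                 /* 'total' was never given a starting value */
    printf("%d\n", total);          /* garbage */
}

/* C++/C99 style: declare at first use, with an initializer, so the variable
   never exists in an uninitialized state and refactoring moves it intact. */
void new_style(void)
{
    /* ... dozens of lines of unrelated work ... */
    int total = 0;
    for (int i = 0; i < 10; i++)
        total += i;
    printf("%d\n", total);
}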

Still, I have a soft spot for C. It is a minimal language and was designed to map onto most hardware very directly. It is such that, at the time, I could look at a chunk of code and not only figure out whether the source language was C or Pascal, but also tell you which of the major compilers was used for it.

In my career, I enjoy having these flashes of insight to solve problems. I only wish they didn’t have to come with the brutal overhead of weeks of searching for a bug.

I’m Old, Part XI: That Time I Got In Trouble For Swearing In Code

[Image: Q*bert arcade marquee]

I worked on PostScript printers at Adobe then got transferred to the department that was working on Acrobat (that’s a whole other story). In Acrobat 2.0, a separate sub-team, on which I was the only Mac programmer, was providing full-text indexing and searching. At the same time, Acrobat was being redesigned inside and out to be more portable, to have a better view model, and most importantly to have an extensibility model.

Unfortunately, the development and build process were fundamentally broken. File check-out was always exclusive and there was no branching. There was also no continuous building, as far as I knew. This meant that there was no way to get access to code before it was final-ish, nor was there a way to get an intermediate build of Acrobat to code against.

On top of this, the core engineering for Acrobat was overextended and frequently had nothing that was even close to ready until 6:00PM on the scheduled day. This meant that Carl Orthlieb (who was my Windows counterpart) and I had to stay late until the new version of Acrobat was built, and then we had to figure out why our code wouldn’t compile. Usually, the problem was that APIs had changed and nobody bothered to let us know. On top of that, there was nobody to talk to about it because the engineers who had worked on those changes were long gone.

We frequently couldn’t even begin our work until 7:00 or 8:00.

At one point, the entire model for building menus changed. In addition to the text that you associated with any given menu, you had to provide a non-UI-displayable string name for every menu, submenu, and menu item. Acrobat Search had a shit ton of menu items that we had to work with and I was getting more and more aggravated as I changed the code.

Sonny Boy Williamson was a blues musician, who did some recording. This wasn’t high budget work. He would go into the studio and the engineer would ask him the title of the song he was going to play. Williamson got really tired of that and this was the exchange:

LC: Go ahead we’re rolling, Take 1. What’s the name-a this?
SB: “Little Village” <pause> “A Little Village,” mother f*cker! “A Little Village!”
LC: There’s isn’t a mother f*ckin’ thing there about a village. You son-of-a-bitch! Nothin’ in the song has got anything to do with a village.
SB: Well, a small town
LC: I know what a village is!
SB: Well alright, goddamn it! You know, you don’t need no title. You name it up, you, I got-get through with it, son-of-a-bitch. You name it what you wanna. You name it your mammy, if ya wanna.

So in the midst of the late hours, changed APIs, no warning, and a ton of menus to name internally, I named one something like, “somenameforthisfuckingmenu”.

Code compiled and everything seemed to work well enough, so I checked in and walked away, leaving even more work for Mike Pell, who had to take the output of all the compiled builds and then build an installer for QA. No matter how bad I thought I had it, Mike had it worse.

This wasn’t the only time something like this happened. There were times when whole sets of APIs were totally removed and the Search team was SOL. At one point, I remember Carl and I went into Alan Padgett’s cube, stood behind him, and forced him to put an API back in which would allow us to have a more responsive UI to show progress as well as to blink the cursor in the Search dialog box. The day after release, his boss made him take it out again. This went back and forth until we could agree to get the right API in the right place that would let us do what we wanted.

Time went on and eventually we went into Beta release. We had a number of external customers who were excited to explore the new plug-in model and to try their hand at coding to it. One customer on the Mac wrote code to iterate through all the menus and print them out. Sure enough, s/he hit the Search menu and saw my glorious blue streak.

I don’t know how this trickled back into Adobe, but it found its way to Bob Wulff, who was the manager of the whole Acrobat team. Bob personally paid me a visit. He was clearly not happy with this being found by a customer and he made it clear that I had to clean it up and to be more careful about “colorful language” in the code. I suspect that he was also holding back.

Of course, I changed it to something more appropriate. It was my fault, after all.

Or was it?

On the one hand, at 26 I probably should have known better than to put profanity anywhere in my code.

On the other hand, the build and release process was fundamentally flawed: it was always a “sucks to be you” chain of events: core engineers made changes that weren’t communicated and certainly weren’t available. The build process was manual and summative rather than iterative. Schedules were overdone and pushed right up to the end of deadlines.

There was so much broken in the process, and even though there existed reasonable solutions to every broken element, it took a lot of pushing to get even simple changes in place, such as having the build begin the morning of the release, not at 5:00PM.