Eng-Tips is the largest engineering community on the Internet

Intelligent Work Forums for Engineering Professionals


The Story of Mel

Status
Not open for further replies.

flash3780

Mechanical
Dec 11, 2009
829
I came across this Usenet post from the early '80s: a response to another article called Real Programmers Don't Use Pascal. It's called "The Story of Mel", and I thought I'd share. Apparently Mel was a real programmer at Royal McBee Computer Corp (pictures of some of his routines here). Although I've admittedly never held a paper tape or punch card in my hand, even I can see the brilliance in using an arithmetic overflow to exit a loop to squeeze every ounce of speed out of your RPC-4000. Talk about clever problem solving, eh? :)

The Story of Mel said:
Source: usenet: utastro!nather, May 21, 1983.

A recent article devoted to the *macho* side of programming made the bald and unvarnished statement:

Real Programmers write in Fortran.

Maybe they do now, in this decadent era of Lite beer, hand calculators and "user-friendly" software but back in the Good Old Days, when the term "software" sounded funny and Real Computers were made out of drums and vacuum tubes, Real Programmers wrote in machine code. Not Fortran. Not RATFOR. Not, even, assembly language. Machine Code. Raw, unadorned, inscrutable hexadecimal numbers. Directly.

Lest a whole new generation of programmers grow up in ignorance of this glorious past, I feel duty-bound to describe, as best I can through the generation gap, how a Real Programmer wrote code. I'll call him Mel, because that was his name.

I first met Mel when I went to work for Royal McBee Computer Corp., a now-defunct subsidiary of the typewriter company. The firm manufactured the LGP-30, a small, cheap (by the standards of the day) drum-memory computer, and had just started to manufacture the RPC-4000, a much-improved, bigger, better, faster -- drum-memory computer. Cores cost too much, and weren't here to stay, anyway. (That's why you haven't heard of the company, or the computer.)

I had been hired to write a Fortran compiler for this new marvel and Mel was my guide to its wonders. Mel didn't approve of compilers.

"If a program can't rewrite its own code," he asked, "what good is it?"

Mel had written, in hexadecimal, the most popular computer program the company owned. It ran on the LGP-30 and played blackjack with potential customers at computer shows. Its effect was always dramatic. The LGP-30 booth was packed at every show, and the IBM salesmen stood around talking to each other. Whether or not this actually sold computers was a question we never discussed.

Mel's job was to re-write the blackjack program for the RPC-4000. (Port? What does that mean?) The new computer had a one-plus-one addressing scheme, in which each machine instruction, in addition to the operation code and the address of the needed operand, had a second address that indicated where, on the revolving drum, the next instruction was located. In modern parlance, every single instruction was followed by a GO TO! Put *that* in Pascal's pipe and smoke it.

Mel loved the RPC-4000 because he could optimize his code: that is, locate instructions on the drum so that just as one finished its job, the next would be just arriving at the "read head" and available for immediate execution. There was a program to do that job, an "optimizing assembler", but Mel refused to use it.

"You never know where it's going to put things", he explained, "so you'd have to use separate constants".

It was a long time before I understood that remark. Since Mel knew the numerical value of every operation code, and assigned his own drum addresses, every instruction he wrote could also be considered a numerical constant. He could pick up an earlier "add" instruction, say, and multiply by it, if it had the right numeric value. His code was not easy for someone else to modify.

I compared Mel's hand-optimized programs with the same code massaged by the optimizing assembler program, and Mel's always ran faster. That was because the "top-down" method of program design hadn't been invented yet, and Mel wouldn't have used it anyway. He wrote the innermost parts of his program loops first, so they would get first choice of the optimum address locations on the drum. The optimizing assembler wasn't smart enough to do it that way.

Mel never wrote time-delay loops, either, even when the balky Flexowriter required a delay between output characters to work right. He just located instructions on the drum so each successive one was just *past* the read head when it was needed; the drum had to execute another complete revolution to find the next instruction. He coined an unforgettable term for this procedure. Although "optimum" is an absolute term, like "unique", it became common verbal practice to make it relative: "not quite optimum" or "less optimum" or "not very optimum". Mel called the maximum time-delay locations the "most pessimum".

After he finished the blackjack program and got it to run, ("Even the initializer is optimized", he said proudly) he got a Change Request from the sales department. The program used an elegant (optimized) random number generator to shuffle the "cards" and deal from the "deck", and some of the salesmen felt it was too fair, since sometimes the customers lost. They wanted Mel to modify the program so, at the setting of a sense switch on the console, they could change the odds and let the customer win.

Mel balked. He felt this was patently dishonest, which it was, and that it impinged on his personal integrity as a programmer, which it did, so he refused to do it. The Head Salesman talked to Mel, as did the Big Boss and, at the boss's urging, a few Fellow Programmers. Mel finally gave in and wrote the code, but he got the test backwards, and, when the sense switch was turned on, the program would cheat, winning every time. Mel was delighted with this, claiming his subconscious was uncontrollably ethical, and adamantly refused to fix it.

After Mel had left the company for greener pa$ture$, the Big Boss asked me to look at the code and see if I could find the test and reverse it. Somewhat reluctantly, I agreed to look. Tracking Mel's code was a real adventure.

I have often felt that programming is an art form, whose real value can only be appreciated by another versed in the same arcane art; there are lovely gems and brilliant coups hidden from human view and admiration, sometimes forever, by the very nature of the process. You can learn a lot about an individual just by reading through his code, even in hexadecimal. Mel was, I think, an unsung genius.

Perhaps my greatest shock came when I found an innocent loop that had no test in it. No test. *None*. Common sense said it had to be a closed loop, where the program would circle, forever, endlessly. Program control passed right through it, however, and safely out the other side. It took me two weeks to figure it out.

The RPC-4000 computer had a really modern facility called an index register. It allowed the programmer to write a program loop that used an indexed instruction inside; each time through, the number in the index register was added to the address of that instruction, so it would refer to the next datum in a series. He had only to increment the index register each time through. Mel never used it.

Instead, he would pull the instruction into a machine register, add one to its address, and store it back. He would then execute the modified instruction right from the register. The loop was written so this additional execution time was taken into account -- just as this instruction finished, the next one was right under the drum's read head, ready to go. But the loop had no test in it.

The vital clue came when I noticed the index register bit, the bit that lay between the address and the operation code in the instruction word, was turned on-- yet Mel never used the index register, leaving it zero all the time. When the light went on it nearly blinded me.

He had located the data he was working on near the top of memory -- the largest locations the instructions could address -- so, after the last datum was handled, incrementing the instruction address would make it overflow. The carry would add one to the operation code, changing it to the next one in the instruction set: a jump instruction. Sure enough, the next program instruction was in address location zero, and the program went happily on its way.

I haven't kept in touch with Mel, so I don't know if he ever gave in to the flood of change that has washed over programming techniques since those long-gone days. I like to think he didn't. In any event, I was impressed enough that I quit looking for the offending test, telling the Big Boss I couldn't find it. He didn't seem surprised.

When I left the company, the blackjack program would still cheat if you turned on the right sense switch, and I think that's how it should be. I didn't feel comfortable hacking up the code of a Real Programmer.

Reposted from: http://www.pbm.com/~lindahl/mel.html
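The exit-by-overflow trick the story describes can be sketched in a few lines. Note that the field widths and opcode values below are invented purely for illustration; the real RPC-4000 instruction word was laid out differently, and the actual opcodes are not assumed here. The point is just the mechanism: incrementing the address field of an instruction word until the carry spills into the opcode field, turning the instruction into a jump.

```python
# Toy model of Mel's trick: an instruction word with the layout
# [opcode | address], where incrementing past the top of the address
# field carries into the opcode. Field widths and opcodes are invented
# for this demo, not taken from the RPC-4000 manual.
ADDR_BITS = 12
ADDR_MASK = (1 << ADDR_BITS) - 1

OP_ADD = 0x10   # hypothetical "add" opcode
OP_JUMP = 0x11  # hypothetical "jump" opcode, exactly one greater, so a
                # carry out of the address field converts add -> jump

def make_instruction(opcode, address):
    """Pack opcode and address into a single instruction word."""
    return (opcode << ADDR_BITS) | address

def fields(word):
    """Unpack an instruction word back into (opcode, address)."""
    return word >> ADDR_BITS, word & ADDR_MASK

# Mel placed his data at the top of memory, so the loop's working
# instruction addresses the highest location...
instr = make_instruction(OP_ADD, ADDR_MASK)

# ...and the loop body bumps the address by adding 1 to the whole word,
# exactly as if the address were just another number (which it is).
instr += 1

opcode, address = fields(instr)
print(hex(opcode), hex(address))  # → 0x11 0x0
```

After the last datum is handled, the increment overflows the address field, the carry turns the "add" into a "jump", and control lands at address zero, where the next part of the program is waiting. No explicit test anywhere, which is exactly why the loop looked impossible.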
 
I've always enjoyed that story.



Mike Halloran
Pembroke Pines, FL, USA
 
"Pessimum" has become my "Word of the Day" and has taken its rightful place on my whiteboard.

old field guy
 
"Pessimum" has become my "Word of the Day" and has taken its rightful place on my whiteboard.
Haha nice. Sometimes I feel like I'm asked to work on the "most pessimum" task required to produce a quality design. :)
 
I worked with an old guy (I think he was 68 at the time) in 1981 who had spent his whole career writing machine code. In 1979 the company did away with the last thing that needed to be coded that way and told him to learn COBOL or retire. He learned COBOL. For those of you who are not familiar with COBOL, it was the most verbose language ever designed--many of the variable names were over 30 characters, and it took 5 lines to do something you could do in FORTRAN in 2. This guy actually wrote COBOL programs that looked like machine code. The other programmers on the project hated to have to touch his code. They all admitted that it ran fast and used very few resources, but they couldn't imagine why anyone would care. He lasted 3 years as a COBOL programmer and then retired. Every line of code he wrote on my project was scrapped within a month of his leaving.

Those machine-code guys were amazingly creative, but they wrote the most unmaintainable code ever devised.

David
 
I worked with a guy having a similar mindset who wrote turbine control ladder logic in a way that was dependent on the order in which the compiler scanned the ladder for it to execute correctly. Visually it was nonsense, a shame, because that's one of the great benefits of ladder logic; but if you knew how the compiler behaved, it could be followed with a bit of effort. He could have made the code functionally equivalent using conventional logic at the cost of a few more CPU cycles. The original ran on an 8086 processor with a few kB of RAM; this was running on a 486 processor with a few MB of RAM, so a few more cycles didn't matter one way or the other. He did it 'because he could', even though it made the code tricky to maintain and there was no technical need.


----------------------------------

If we learn from our mistakes I'm getting a great education!
 
Seems like everyone knows someone like that. We had a guy who complained that C was too bloated, and that "real" programmers programmed in assembly, even in Windows. He wasn't even that old at the time, maybe 40-ish in 1995.

TTFN

FAQ731-376
Chinese prisoner wins Nobel Peace Prize
 
C - bloated? [lol]

I used to be quite adept at small assembler routines for the 6809 and 68000 series chips. Does that date me? Well I don't care! Anyway, somewhere down the line I saw the light and went over to playing with big wires. [smile]


----------------------------------

If we learn from our mistakes I'm getting a great education!
 
The old guy that David was talking about: in 1981, when David worked with him, RAM would have cost around US$4500/MB, or maybe a lot less; Wikipedia reminds me that the C-64 came out the following year, and Commodore was waiting for RAM to fall below US$100 for the 64k, which suggests more like $1500 to $2k/MB.

Let's say he had spent 15 years writing code. He would have learned coding at a time when the price of RAM was in the region of $2.5M/MB.

You can hardly be surprised at Mr. Old Guy's incentives at that time, and how they might have influenced his idea of good code. We have been spoiled by cheap RAM, none more so than Microsoft. Excel is a classic example of software that processes data inconceivably more slowly than you would expect it could.

Pricing references from this site, not sure if there has been any price adjustment for inflation or not.

Also, look at the plot linked from the above page: you can see why disk arrays won the war against big platters.
 
Yeah, I remember that the first PC my then-employer bought for me had to wait until 1MB SIMMs got below $100 each. I think the company leased all eight of them. ... and may still be paying on the lease.

Before that, we were using 'homemade' CP/M machines. ... and I remember roaching added functionality into unused space within application programs, even using two-byte holes. Not on the same order as Mel, but great fun.



Mike Halloran
Pembroke Pines, FL, USA
 
You could imagine that the price of memory almost certainly influenced how people used it. That sort of thing happens all over.

For example, a platinum-plated aluminum pyramid (aluminum was a new metal at the time) tops the Washington Monument, designed to serve as a lightning rod for the structure. It was quite an expensive proposition back then; these days, we make Coke cans out of the same stuff.

You have to appreciate how prudently these guys used what was, at the time, a very scarce resource. Scarce resources aside, though, I'm in awe of how clever they were.
 
KiwiMace,
The issue wasn't money at all. He was programming early logic controllers, and they were eventually capped around 64k; you couldn't address any more even if you had Amoco's money (and he did). He managed bits, not bytes, and he said that it felt luxurious the first time he had a task with 16k addressable.

It was amazing what those guys could do with 16k compared to today's programs that take 1,000 times that much to initialize.

David
 
Everything is relative. You couldn't build this website, and most of the Internet would be unbuildable, if you were limited to the resources of a 64K IBM PC and the 300-baud modems of the day.

TTFN

FAQ731-376
Chinese prisoner wins Nobel Peace Prize
 
Programming bloat is nothing new; programs take every byte they can get.
Windows hasn't helped with the bloat. It adds a lot of overhead just to get a program to run.
At one time I had WordPerfect 5.1. In DOS, the install folder was 3.1MB; the Windows version's folder was 10.2MB. I'm sure there were more files linked to the app at the system level that weren't even in the install folders.


"Wildfires are dangerous, hard to control, and economically catastrophic."

Ben Loosli
 
Today's programs are pretty. Programs in the '70s and early '80s were efficient. Both are a response to the memory available to run in. I wouldn't want to go back to IRstuff's 300-baud modem, but dang, if I want to write a subroutine to do a simple calculation, callable from a bunch of places in a pretty program, I have to include all the graphics overhead in the subroutine, and it becomes a pig within a pig. I sometimes wish I still had a FORTRAN compiler to make the subroutines reasonably efficient, but I don't know how to link FORTRAN into VBA (which is all I use much anymore), and I don't do it often enough to bother to learn.

David
 