Wednesday, December 12, 2007

I was an Amiga user back in the day, and this is nothing more than the history of the Amiga, written by Jeremy Reimer of Ars Technica and released free of charge, presented here in single-page format.

Chapter 1: Genesis

Introduction
The Amiga 1000

The Amiga computer was a dream given form: an inexpensive, fast, flexible multimedia computer that could do virtually anything. It handled graphics, sound, and video as easily as other computers of its time manipulated plain text. It was easily ten years ahead of its time. It was everything its designers imagined it could be, except for one crucial problem: the world was essentially unaware of its existence.

With personal computers now playing such a prominent role in modern society, it's surprising to discover that a machine with most of the features of modern PCs actually first came to light back in 1985. Almost without exception, the people who bought and used Amigas became diehard fans. Many of these people would later look back fondly on their Amiga days and lament the loss of the platform. Some would even state categorically that despite all the speed and power of modern PCs, the new machines have yet to capture the fun and the spirit of their Amiga predecessors. A few still use their Amigas, long after the equivalent mainstream personal computers of the same vintage have been relegated to the recycling bin. Amiga users, far more than any other group, were and are extremely passionate about their platform.

So if the Amiga was so great, why did so few people hear about it? The world has plenty of books about the IBM PC and its numerous clones, and even a large library about Apple Computer and the Macintosh platform. There are also many books and documentaries about the early days of the personal computing industry. A few well-known examples are the excellent book Accidental Empires (which became a PBS documentary called Triumph of the Nerds) and the seminal work Fire in the Valley (which became a TV movie on TNT entitled Pirates of Silicon Valley).

These works tell an exciting tale about the early days of personal computing, and show us characters such as Bill Gates and Steve Jobs battling each other while they were still struggling to establish their new industry and be taken seriously by the rest of the world. They do a great job telling the story of Microsoft, IBM, and Apple, as well as other companies that did not survive as those three did. But they mention Commodore and the Amiga rarely and in passing, if at all. Why?

When I first went looking for the corresponding story of the Amiga computer, I came up empty-handed. An exhaustive search for Amiga books turned up only a handful of old technical manuals, software how-to guides, and programming references. I couldn't believe it. Was the story so uninteresting? Was the Amiga really just a footnote in computing history, contributing nothing new and different from the other platforms?

As I began researching, I discovered the answer, and it surprised me even more than the existence of the computer itself. The story of Commodore and the Amiga was, by far, even more interesting than that of Apple or Microsoft. It is a tale of vision, of technical brilliance, dedication, and camaraderie. It is also a tale of deceit, of treachery, and of betrayal. It is a tale that has largely remained untold.

This series of articles attempts to explain what the Amiga was, what it meant to its designers and users, and why, despite its relative obscurity and early demise, it mattered so much to the computer industry. It follows some of the people whose lives were changed by their contact with the Amiga and shows what they are doing today. Finally, it looks at the small but dedicated group of people who have done what many thought was impossible and developed a new Amiga computer and operating system, ten years after the bankruptcy of Commodore. Long after most people had given up the Amiga for dead, these people have given their time, expertise and money in pursuit of this goal.

To many people, these efforts seem futile, even foolish. But to those who understand, who were there and lived through the Amiga at the height of its powers, they do not seem foolish at all.

But the story is about something else as well. More than a tale about a computer maker, this is the story about the age-old battle between mediocrity and excellence, the struggle between merely existing and trying to go beyond expectations. At many points in the story, the struggle is manifested by two sides: the hard-working, idealistic engineers driven to the bursting point and beyond to create something new and wonderful, and the incompetent and often avaricious managers and executives who end up destroying that dream. But the story goes beyond that. At its core, it is about people, not just the designers and programmers, but the users and enthusiasts, everyone whose lives were touched by the Amiga. And it is about me, because I count myself among those people, despite being over a decade too late to the party.

All these people have one thing in common. They understand the power of the dream.


The dream (1977-1984)



Jay Miner and his dog, Mitchy.

There were many people who helped to create the Amiga, but the dream itself was the creation of one man, known as the father of the Amiga. His name was Jay Miner.

Jay was born in Prescott, Arizona on May 31, 1932. A child of the Depression, he was interested in electronics from an early age. He started university at San Diego State. By this time, the Korean War was in full swing, and Jay opted to join the Coast Guard. His education and interest worked in his favor, landing him in electronics school in Groton, Connecticut. It was here that he met his future wife, Caroline Poplawski. They were married in a quiet ceremony in 1952.

Jay's interest in electronics continued to grow, and he brought his new bride with him to California, where he enrolled at the University of California, Berkeley. He completed his degree in electrical engineering in 1958. Berkeley would later become a hotbed of computer science, contributing, among other things, the TCP/IP communications protocol that would later become the standard for the entire Internet.

For the next ten years, Jay moved around from company to company, many of them startups. His desire to be involved at a fundamental level in the design process was far greater than his need for steady employment. At startups, all the traditional rules about management and procedure are typically thrown out the window. People don't worry about sticking to their job descriptions; employees on every level from intern to CEO simply do whatever work needs to be done. This type of environment suited Jay well.

Jay then landed a position at a hot young company called Atari, which had gone from nothing to worldwide success overnight with the invention of the first computerized arcade games, including the blockbuster PONG. Atari was by no means a typical company. Its founder, Nolan Bushnell, was a child of the 1960s and believed that corporations could be more than emotionless profit machines: they should be like families, helping each other to prosper in more ways than just financially. There were few rules at Atari, and it didn't matter how weird a person you were if you could do the work. (One such Atari hire was Steve Jobs, who later moved on to bigger and better things.)


Nolan Bushnell with a PONG console.

The man at Atari who hired Jay Miner in the mid-1970s was Harold Lee, who became a lifelong colleague and friend. Harold once said of Jay that "he was always designing. He never stopped designing." That kind of attitude could get you far at a company like Atari. Jay wound up being the lead chip designer for a revolutionary product that would create a multibillion dollar industry: the Atari 2600, otherwise known as the Video Computer System or VCS.

Atari days

The VCS was the first massively popular game console, and despite having incredibly primitive hardware inside, it managed to have a commercial life span far greater than any of its competitors. Much of this longevity was due to Jay Miner's brilliant design, which allowed third-party programmers to coax the underpowered machine to achieve things never dreamed of by its creators.

The Atari 2600
The Atari 2600 and the game that made it famous.

An example of this was Atari's Chess game. The original packaging for the VCS showed a screenshot of the machine playing chess, although its designers knew that there was no way it was powerful enough to do so. However, when someone sued Atari for misleading advertising, the programmers at Atari realized they had better try and program such a game. Clever programming made the impossible possible, something that would be seen many times on the Amiga later on in our story.

Having achieved such great success with the VCS game console, Jay was next assigned to design Atari's first personal computer system. In 1978, personal computers had barely been invented, and the few companies that had developed them were often small, quirky organizations, barely moved out of their founders' garages. Apple (started by the aforementioned Steve Jobs and Steve Wozniak) was one of the major players, as were Tandy Radio Shack and even Commodore.

The Atari 400/800
An early ad for the Atari 400/800. Note the years!
The computer Jay designed was released in 1979 as the Atari 400. A more powerful version, the 800, was also released with a better keyboard. At the time, most of its competitors were awkward, clunky machines, often large, heavy and temperamental, and if they created any graphics at all they were either in monochrome or, in the case of the Apple ][, limited to a palette of only eight colors. The Atari 400/800 machines had a maximum of 40 simultaneous colors, and featured custom chips to accelerate sound and graphics to the point that accurate conversions of popular arcade games became possible. Compared to an Apple ][ or a TRS-80, the Atari machine seemed to come from the future. The same thing would happen with the Amiga a few years later.

However, Atari management undermined the success of the 400/800 in several ways. Firstly, to avoid competition with the VCS, they downplayed the importance or even the existence of games for the platform, insisting that it be considered a "serious" machine. Ironically, when the company was struggling to produce a successor to the 2600, they ended up simply putting an Atari 400 in a smaller, keyboard-less case. Even worse, Atari was reluctant to give out information about how the hardware worked, believing that such data should be kept a trade secret, known only to internal Atari programmers. Some individuals, such as the superstar game programmer John Harris, considered this a challenge, and they managed to unlock most of the Atari's secrets by a process similar to reverse engineering. But the lack of strong third-party development for the computer doomed it to also-ran status in the nascent industry.

After the 400 and 800 had shipped, Atari management wanted Jay to continue developing new computers. However, they insisted that he work with the same central processing unit, or CPU, that had powered the VCS and the 400/800 series. That chip, the 6502, was at the heart of many of the computers of the day. But Jay wanted to use a brand new chip that had come out of Motorola's labs, called the 68000.


The 68000

The 68000 was an engineer's dream: fast, years ahead of its time, and easy to program. But it was also expensive and required more memory chips to operate, and Atari management didn't think that expensive computers constituted a viable market. Anyone who had studied the history of electronics knew that in this industry, what was expensive now would gradually become cheaper over time, and Jay pleaded with his bosses to reconsider. They steadfastly refused.
The Motorola 68000
The dream chip: Motorola's 68000.
Atari at this time was changing, and not necessarily for the better. The company's rapid growth had resulted in a cash flow crunch, and in response Nolan Bushnell had sold the company to Warner Communications in 1978. The early spirit of family and cooperation was rapidly vanishing. The new CEO, Ray Kassar, had come from a background in clothing manufacturing and had little knowledge of the electronics industry. He managed to alienate all of Atari's VCS programmers, refusing their demands for royalty payments on the games they designed (which were at the time selling in incredible numbers) and even referred to them at one point as "prima donna towel designers." His attitude led to a large number of Atari programmers quitting the company and forming their own startups, such as the very successful Activision, started by Larry Kaplan. Larry had been Atari's very first VCS programmer.

Larry Kaplan.
Jay had incredible visions of the kind of computer he could create around the 68000 chip, but Atari management simply wasn't interested, so finally he gave up in disgust and left the company in early 1982. He joined Zimast, a small electronics company that made chips for pacemakers. It seemed like his dream was dead.

However, as would happen many times in the short history of this industry, forces would align to make a previously impossible dream possible. While technology was advancing rapidly, the number of people who really understood the technology remained small. These people would not be limited by the short-sighted management of large companies. They would find each other, and together, they would find a way.

It was this feeling that caused Larry Kaplan to pick up the phone and make the fateful call to Jay Miner in the middle of 1982.

Larry was enjoying the fruits of his success with Activision, yet still felt the limitations of being primarily a developer for the Atari VCS. Video games were a hot property at this time, and there was no shortage of investment money that people were willing to put into new gaming startups. A consortium out of Texas, which included an oil baron (who had also made money from sales of pacemaker chips, which was how Jay knew him) and three dentists, had approached Larry about investing seven million dollars in a new video game company.

Larry immediately phoned Jay at Zimast to ask if he would like to be involved in this new venture. The idea was to spread the development around: Larry and Activision would develop the games, Jay and Zimast would design and build the new hardware to run them, and everybody would make money. They had to quickly decide on a name for the new venture, and "Hi-Toro" was chosen because it sounded both high-tech and Texan. The company needed a management person to oversee all this development, so David Morse was recruited from his position of vice president of marketing at Tonka Toys. A small office was located in Santa Clara, California, and the three co-founders got down to the business of designing the ultimate games machine.

David Morse
It was around this time that Larry Kaplan began to get cold feet about the whole idea. Jay speculated that perhaps things weren't moving fast enough for him, or maybe he was worried that the games industry was becoming too crowded, but he suddenly decided to quit the company in late 1982. It turned out that Kaplan had been given a very generous offer from Nolan Bushnell to come back to Atari, an offer that later turned out to be less than expected.

In any case, Kaplan's departure presented the fledgling venture with a problem: they had no chief engineer. While Larry was a software developer and not a true hardware engineer, he had still been in charge of engineering management for the company. The next logical choice for this position was Jay Miner.

Jay knew this was his chance. He agreed to take over as chief of engineering at Hi-Toro under two conditions: the new video game machine had to use the 68000 chip, and it had to be able to work as a computer as well.


Chapter 2: The birth of Amiga

Born as a console, but with the heart of a computer


Game consoles and personal computers are not all that different on the inside. Both use a central processing unit as their main engine (the Apple ][, Commodore 64, and Atari 400/800 all used the 6502 CPU, a close variant of which powered the original Nintendo console). Both allow user input (keyboards and mice on computers, joysticks and game pads on consoles), and both output to a graphical display device (either a monitor or a TV). The main difference is in user interaction. Gaming consoles do one thing only—play games—whereas personal computers also allow users to write letters, balance finances and even enter their own customized programs. Computers cost more, but they also do more. It was not too much of a stretch to imagine the new Hi-Toro console being optionally expandable to a full computer.

However, the investors weren't likely to see things that way. They wanted to make money, and at the time the money in video games dwarfed the money in personal computers. Jay and his colleagues agreed that they would design the new piece of hardware to look like a games unit, with the option of expansion into a full computer cleverly hidden.

This was one of those decisions that, in retrospect, seems incredibly prescient. At the time, it was merely practical—the investors wanted a game console, the new company needed Jay Miner, and Jay wanted to design a new computer. This compromise allowed everyone to get what they wanted. But events were transpiring that would make this decision not only beneficial, but necessary for the survival of the company.


The video game crash

The great video game crash of 1983, was, like all great crashes, easy to predict after it had already happened. With sales of home consoles and video games rising exponentially, companies started to think that the potential for earning money was unlimited. Marketing executives at Atari bragged that they could "shit in a box and sell it." And inevitably, that's exactly what happened.

There were too many software companies producing too many games for the Atari VCS and other competing consoles. The quality of games began to suffer, and the technological limitations of the first generation of video game machines were starting to become insurmountable. Clever programming could only take you so far. Today, it is understood that each new generation of game consoles has a limited lifecycle, and new hardware platforms are scheduled for release just as the old ones are starting to wane. Back then, however, the industry was so new that the sinusoidal-like demand for a game platform was not understood at all. People just expected sales to keep going up forever.

Just like the dotcom bubble in the late 1990s, a point was reached where the initial enthusiasm was left behind and replaced with sheer insanity. This point can be traced precisely to the release of a new game for the Atari VCS in late 1982, timed to coincide with the release of a new blockbuster movie: E.T. The Extra Terrestrial.

E.T. The Extra Terrestrial for Atari 2600
The game that ended it all.

Atari paid millions of dollars for the license to make the game, but marketing executives demanded that it be developed and sent to manufacturing in six weeks. Good software is like good wine—it cannot be rushed. The game that Atari programmers managed to produce turned out to be a very nasty bottle of vinegar. It was repetitive, frustrating, and not much fun. Atari executives, however, did not realize this. They compounded their mistake by ordering the manufacture of five million cartridges, which was nearly the number of VCS consoles existing at the time. But the insanity didn't stop there. For the release of the game Pac-Man, Atari actually manufactured more cartridges than there were VCS consoles to run them!

An Atari marketing manager was actually asked about this disparity, and his response clearly expressed his total disconnect from reality. He said that people might like to buy two copies: one for home, and one for a vacation cottage!

Instead of two copies, most people decided to buy zero. Atari (and thus Warner) posted huge losses for the year and was forced to write off most of its unsold inventory of VCS cartridges. In a famous ceremony, tens of thousands of E.T., Pac-Man, and other carts were buried and bulldozed in an industrial waste dump.

The E.T. debacle was the exact moment when the bubble burst. Millions of kids around the world decided that Atari and, by extension, all console video games weren't "cool" anymore. Sales of all game systems and software plummeted. Suddenly, venture funding for new game companies vanished.

Personal computer sales, however, were still climbing steadily. Systems like the Apple ][, the Commodore 64, and even the new IBM PC were becoming more popular in the home. Parents could justify paying a little more money for a system that was educational, while the kids rejoiced in the fact that these little computers could also play games.

This set the stage for a fateful meeting. The nervous Hi-Toro investors, watching the video game market crumble before their eyes, anxiously asked Jay Miner if it might be possible to convert the new console into a full-blown personal computer. Imagine their relief as he told them he had been planning this all along!

There was only one problem remaining: the company's name. Someone had done a cursory check and found out that the name Hi-Toro was already owned by a Japanese lawnmower company. Jay wanted his new computer to be both friendly and sexy, and "Amiga," the Spanish word for a female friend, was suggested. Perhaps not coincidentally, Amiga would also come before Atari in the phone book! Jay wasn't terribly pleased with the name initially. However, as none of the other employees could think of anything better, the name stuck.

Now everything was in place. The players were set; the game was under way.

The dream was becoming a reality.


Early days at Amiga

Jay Miner once described the feeling of being involved in the young Amiga company as being like Mickey Mouse in the movie Fantasia, creating magical broomsticks to help carry buckets of water, then being unable to stop his runaway creations as they multiplied beyond control. He immediately hired four engineers to help him with the hardware design, and a chief of software design, Bob Pariseau. Bob then quickly hired four more software engineers to help him. The young company quickly became an unruly beast, devouring money at an insatiable pace. But it was necessary.

In high technology, even more so than in other industries, speed is always important, and there is never enough time. Things change so quickly that this year's hot new design looks stale and dated next year. The only way to overcome this problem is to apply massive amounts of concentrated brainpower and come up with a very clever design, then rush as quickly as possible to get the design through the initial prototype and into an actual product. Even the inelegant, unimaginative and graphically inept IBM PC, introduced in 1981, was the result of an unprecedented one-year crash building program. Not even the mighty IBM, with resources greater than those of small nations, was immune to the pressures of time.

A tiny company like Amiga had even greater problems. On top of the maddening rush to ramp up staffing and develop a new product, Jay and his team had to worry about much larger corporations and their industrial espionage teams stealing their new ideas and applying much greater resources to beat them to market. Nobody knew what Amiga, Inc. was up to, and the company's founders liked it that way. So an elaborate two-pronged attack was devised to ensure that nobody got wise to Amiga's ambitions before they were ready to show them to the world.

Firstly, the company would create a deceptive business front. This had to be something simple enough that it would not take away too many resources from the actual work, yet still deliver actual products and generate some revenue. The company decided to stick to its videogame roots and produce hardware and software add-ons for the Atari VCS. One of the first products, a collector's item today, was the Amiga Joyboard, a kind of joystick that was used by sitting or standing on top of it and leaning back and forth, left and right. The company also wrote some simple games for it that involved skiing and skateboarding. While income from these games and peripherals helped sustain the company in its early days, it was also affected by the video game crash of '83 and sales quickly dwindled.
The Amiga Joyboard
The Amiga Joyboard. Note the small Amiga logo at bottom.

This short-lived era of the young company's history had one long-lasting impact on the Amiga computer. RJ Mical, a programmer writing some of the complicated routines that would bring the Amiga to life, developed a simple Joyboard game designed to help him relax. The game was called "Zen Meditation," and the object was to sit absolutely still. The game was a kind of running joke in the Amiga offices, and when the time came to write the text for a serious error message for the Amiga operating system, a programmer came up with the term "Guru Meditation Error." This would remain in the operating system for years to come, until a nameless and unimaginative Commodore executive insisted on removing the Guru and changing the message to "Software Failure."

The second front of deception against industrial espionage involved codenames for the powerful new custom chips the team was designing for the Amiga computer. Dave Morse decided that henceforth all these chips would be referred to by women's names. The idea was that if anyone intercepted telephone conversations between Amiga people, they would be unable to figure out that they were discussing parts of a computer. The idea of "Agnes" being temperamental or "Denise" not living up to expectations also appealed to the engineers' sense of humor. The computer itself was codenamed "Lorraine," the name of Dave's wife.

Jay Miner may have been leading the team, but the details of the new computer were hammered out at team design meetings, held in a seminar-like room that had whiteboards covering the walls. Everyone could pitch ideas for inclusion in the machine, and the small group would have to come to a consensus about which features to include and which to leave out. Engineering is all about tradeoffs, and you can't just decide to include "the best of everything" and have it all work. Cost, speed, time to develop, and complexity are just some of the factors that must be taken into account at this crucial stage of a new computer's design. The way the Amiga team came to a consensus was with foam rubber baseball bats.

It isn't known who first came up with the idea, but the foam bats became an essential part of all design meetings. A person would pitch an idea, and if the other engineers felt it was stupid or unnecessary, they would hit the person over the head with a bat. As Jay said, "it didn't hurt, but the humiliation of being beaten with the bat was unbearable." It was a lighthearted yet still serious approach, and it worked. Slowly the Amiga design began to take shape.


Hold and modify

Jay had always had a passion for flight simulators, and it was something that would stay with him for the rest of his life. A friend of his took him on a field trip to Link, a company that made multimillion-dollar flight simulators for the military. Jay was enthralled by the realistic sights and sounds and vowed that he would make the Amiga computer capable of playing the best flight simulators possible.

Two major design decisions came out of this trip: the blitter and HAM mode. Jay had already read about blitters in electronic design magazines and had taken a course at Stanford on their use, so they were not a new idea for him. However, the flight simulator experience had made him determined to create the best possible blitter for the Amiga.

A blitter is a dedicated chip that can move large chunks of graphics around the screen very quickly without having to involve the CPU. All modern video cards have what is essentially an advanced descendant of a blitter inside them. Again, Jay was ahead of his time.
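The job a blitter does can be sketched in software: copy a rectangular block of pixels from one buffer to another in a single bulk operation, which is exactly the work the dedicated chip takes off the CPU. This is only an illustrative sketch, not Amiga code; the flat-buffer layout and the function name are invented for the example.

```python
def blit(src, src_w, dst, dst_w, sx, sy, dx, dy, w, h):
    """Copy a w-by-h block of pixels from src to dst.

    Buffers are flat lists in row-major order; src_w and dst_w are the
    widths (in pixels) of the source and destination buffers. A hardware
    blitter performs this same bulk move without involving the CPU.
    """
    for row in range(h):
        src_start = (sy + row) * src_w + sx
        dst_start = (dy + row) * dst_w + dx
        dst[dst_start:dst_start + w] = src[src_start:src_start + w]

# Copy a 2x2 block from the top-left of a 4x4 source buffer
# into a 4x4 destination buffer at position (1, 1).
src = list(range(16))   # source pixels numbered 0..15
dst = [0] * 16
blit(src, 4, dst, 4, 0, 0, 1, 1, 2, 2)
```

Done in hardware, the row-by-row copy above happens in parallel with the CPU's own work, which is why a blitter made such a difference for animation.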

HAM mode, which stood for Hold And Modify, was a way of getting more colors onto the screen than could normally fit into the display memory. At the time, memory chips were very expensive, and the cost of displaying millions of colors at once was too high even for military applications like the Link simulator. So instead of storing all the color information for each dot (or pixel) on the display, the hardware could be programmed to start with one color and then, for each subsequent pixel along a line, hold the previous color and modify just one of its components (in the final Amiga design, the red, green, or blue value). Jay decided to put this into the Amiga.

Later on in the design process, Jay would become concerned that HAM mode was too slow and even asked his chip layout artist if he could take it out. The chip designer replied that it would take many months and leave an aesthetically unappealing "hole" in the middle of the chip. Jay decided to keep the feature in, and later admitted that this was a good decision. The Amiga shipped with the ability to display 4096 colors in this mode, far more than any of its competitors, with clever programmers squeezing even more colors out of future Amiga chipset revisions. Despite HAM being suitable only for displaying pre-calculated images, a software company would even develop a graphics editor that operated in HAM mode. Like the chess game on the Atari 2600 before it, programmers would make the impossible possible on the Amiga.
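The hold-and-modify scheme can be illustrated with a short decoder. In the Amiga's shipped HAM mode, each pixel is six bits: the top two bits select an operation (load one of 16 base palette colors, or hold the previous pixel's color and modify only its blue, red, or green component) and the low four bits carry the data, which is how 16 palette entries stretch to 4096 on-screen colors. The sketch below follows that encoding, but the function and variable names are invented for the example.

```python
# HAM6 control codes (top two bits of each 6-bit pixel):
SET_PALETTE, MOD_BLUE, MOD_RED, MOD_GREEN = 0, 1, 2, 3

def decode_ham6(pixels, palette):
    """Decode one scanline of HAM6 pixels into (r, g, b) tuples.

    `pixels` is a list of 6-bit values; `palette` is a list of 16
    (r, g, b) base colors, each component 0-15. Each pixel either
    loads a palette color or holds the previous color and modifies
    one 4-bit component -- hence "Hold And Modify".
    """
    out = []
    r = g = b = 0                      # color carried from pixel to pixel
    for p in pixels:
        op, data = p >> 4, p & 0x0F
        if op == SET_PALETTE:
            r, g, b = palette[data]
        elif op == MOD_BLUE:
            b = data
        elif op == MOD_RED:
            r = data
        else:                          # MOD_GREEN
            g = data
        out.append((r, g, b))
    return out

palette = [(0, 0, 0)] * 16
palette[1] = (15, 0, 0)                # base color 1: bright red
line = [0x01,                          # load palette entry 1
        (MOD_GREEN << 4) | 8,          # hold, set green to 8
        (MOD_BLUE << 4) | 3]           # hold, set blue to 3
decoded = decode_ham6(line, palette)
```

The catch visible in the sketch is that only one component can change per pixel, so an abrupt color change takes up to three pixels to complete; that "fringing" is why HAM suited pre-calculated images better than arbitrary drawing.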

Screens like no other

Another new invention for the Amiga computer was the "copper" chip. This was essentially a special-purpose CPU designed specifically for direct manipulation of the display. It had only three instructions, but it could directly access any part of the other display chips at any time. What's more, it could turn amazing tricks in the fraction of a second that it took for the monitor to refresh the display. This allowed a trick that no other computer has ever reproduced: the ability to view multiple different screens, opened at different resolutions, at the same time. These "pull-down" screens would amaze anyone who saw them. Modern computers can open different screens at different resolutions (say, for example, to open a full-screen game at a lower resolution than the desktop is displaying, in order to play the game faster or at a higher frame rate) but they can only switch between these modes, not display multiple modes at once.
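The copper's instruction set really was that small: MOVE writes a value into a display chip register, WAIT stalls until the video beam reaches a given screen position, and SKIP conditionally jumps. A toy simulation gives the flavor of how a "copper list" could change the display in mid-frame; the register name and list format here are heavily simplified for illustration, and real copper lists also wait on horizontal beam position.

```python
def run_copper(copper_list, num_lines):
    """Return the background color in effect on each scanline.

    Simulates a simplified copper: MOVE writes a register immediately,
    WAIT holds execution until the beam reaches the given scanline.
    (The real chip's third instruction, SKIP, is omitted here.)
    """
    regs = {"COLOR00": 0}              # COLOR00: background color register
    colors, pc = [], 0
    for line in range(num_lines):
        # Execute instructions until a WAIT for a future scanline.
        while pc < len(copper_list):
            op, a, b = copper_list[pc]
            if op == "WAIT" and a > line:
                break                  # beam hasn't reached this line yet
            if op == "MOVE":
                regs[a] = b
            pc += 1
        colors.append(regs["COLOR00"])
    return colors

# Classic raster trick: change the background color partway down the
# frame, with no CPU involvement once the list is set up.
copper_list = [
    ("MOVE", "COLOR00", 0x00F),        # top of frame: blue
    ("WAIT", 100, None),               # wait for scanline 100
    ("MOVE", "COLOR00", 0xF00),        # rest of frame: red
]
bars = run_copper(copper_list, 200)
```

Because register writes could be scheduled against the beam position like this, the hardware could also swap display resolutions partway down the screen, which is what made the Amiga's draggable multi-resolution screens possible.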

The design eventually coalesced down to three chips named Agnes, Denise, and Paula. Agnes handled direct access to memory and contained both the blitter and copper chips. Denise ran the display and supported "sprites," or graphical objects that could be displayed and moved over a complex background without having to redraw it. Finally, Paula handled sound generation using digitally-sampled waveforms and was capable of playing back four channels at once: two on the left stereo channel and two on the right. It would be years before competing computer sound capabilities came anywhere close to this ability. Paula also controlled the Amiga's floppy disk drive.

These chips formed the core of what would be referred to as the Amiga's "custom chipset." However, they did not yet exist, except on paper. While the software development team was able to get started planning and writing programs that would support the chipset's features, the hardware team needed some way to test that their chips would actually work before committing to the expense of manufacturing them. In addition, the operating software could not be fully tested without having real Amiga hardware to run it on.


Chapter 3: The first prototype

Prototyping the hardware


Modern chips are designed using high-powered workstations that run very expensive chip simulation software. However, the fledgling Amiga company could not afford such luxuries. It would instead build, by hand, giant replicas of the silicon circuitry on honeycomb-like plastic sheets known as breadboards.


Hobbyist breadboard

Breadboards are still used by hobbyists today to rapidly build and test simple circuits. The way they work is fairly simple. The breadboard consists of a grid of tiny metal sockets arranged in a large plastic mesh. Short vertical strips of these sockets are connected together on the underside of the board so that they can serve as junctions for multiple connectors. Small lengths of wire are cut precisely to length and bent into a staple-like shape, with the exposed wire ends just long enough to drop neatly into the socket. Small chips that perform simple logic functions (such as adding or comparing two small numbers in binary code) straddle the junctions, their centipede-like rows of metal pins precisely matching the spacing of the grid.

At the time, nobody had ever designed a personal computer this way. Most personal computers, such as the IBM PC and the Apple ][, had no custom chips inside them. They consisted of nothing more than a simple motherboard that defined the connections between the CPU, the memory chips, the input/output bus, and the display. Such motherboards could be designed on paper and printed directly to a circuit board, ready to be filled with off-the-shelf chips. Some, like the prototype for the Apple ][, were designed by a single person (in this case, Steve Wozniak) and assembled by hand.

The Amiga was nothing like this. Its closest comparison would be to the minicomputers of the day—giant, refrigerator-sized machines like the DEC PDP-11 and VAX or the Data General Eagle. These machines were designed and prototyped on giant breadboards by a team of skilled engineers. Each one was different and had to be designed from scratch—although to be fair, the minicomputer engineers had to design the CPU as well, a considerable effort all by itself! These minicomputers sold for hundreds of thousands of dollars each, which paid for the salaries of all the engineers required to construct them. The Amiga team had to do the same thing, but for a computer that would ultimately be sold for under $2,000.

There were three chips, and each took eight breadboards to simulate, with each board measuring about three feet by one and a half feet. The boards were arranged in a circular, spindle-like fashion so that all the ground wires could run down the center. Each board was populated with about 300 MSI logic chips, giving the entire unit roughly 7,200 chips and an ungodly number of wires connecting them all. Constructing and debugging this maze of wires and chips was a painstaking and often stressful task. Wires could wiggle loose from their connections. A slip of a screwdriver could pull out dozens of wires, losing days of work. Worse, a snippet of cut wire could fall inside the maze, causing random and inexplicable errors.

Agnus breadboards

However, Jay never let the mounting stress get to him or his coworkers. The Amiga offices were a relaxed and casual place to work. As long as the work got done, Jay and Dave Morse didn't care how people dressed or behaved on the job. Jay was even allowed to bring his beloved dog, Mitchy, into work; the dog sat by his desk and had a separate nameplate manufactured just for him.

Jay even let Mitchy help in the design process. Sometimes, when designing a complex logic circuit, one comes to a layout choice that could go either way. The choice may be an aesthetic one, or merely an intuitive guess, but one can't help feeling that it should not be left to random chance. On these occasions Jay would look at Mitchy, and the dog's reaction would determine the choice Jay made.

Slowly, the Amiga's custom chips began to take shape. Connected to a Motorola 68000 CPU, they could accurately simulate the workings of the final Amiga, albeit more slowly than the final product would run. But a computer, no matter how advanced, is nothing more than a big, dumb pile of chips without software to run on it.


Raising the bar for operating systems

All computers, going back to the very first electronic calculators, have required some kind of "master control program" to handle basic housekeeping: running application programs, managing the user environment, talking to peripherals such as floppy and hard disks, and controlling the display. This master program is called the operating system, and on most personal computers of the day it was a very simple program capable of doing only one thing at a time.

Jay's specialty was designing hardware, not software, so he had little input on the design of the Amiga's operating system. But he did know that he wanted his computer to be more advanced than the typical personal computers of the time, which ran such primitive operating systems as Apple DOS and MS-DOS. His hire for chief of software engineering, Bob Pariseau, did not come from a microcomputer background. He had worked for the mainframe computer company Tandem, which made massive machines that were (and still are) used by the banking industry.

Bob was used to powerful computers that could handle many tasks and transactions at the same time, and he saw no reason why microcomputers should not be capable of the same thing. At the time, no personal computer could multitask; it was generally felt that the machines' small memories and slow CPUs made multitasking impossible. But Bob went ahead and hired people who shared his vision.

The four people he hired initially would later become legends of software development in their own right. They were RJ Mical, Carl Sassenrath, Dale Luck, and Dave Needle. Carl's interview was the simplest of all: Bob asked him what his ultimate dream job would be, and he replied, "To design a multitasking operating system." Bob hired him on the spot.

Carl Sassenrath had been hired from Hewlett-Packard where he had been working on the next big release of a multitasking operating system for HP's high-end server division. According to Carl:

"What I liked about HP was that they really believed in innovation. They would let me buy any books or publications I wanted... so I basically studied everything ever published about operating systems. I also communicated with folks at Xerox PARC, UC Berkeley, MIT, and Stanford to find out what they were doing.

In 1981-82 I got to know CP/M and MS-DOS, and I concluded that they were poor designs. So, I started creating my own OS design, even before the Amiga came along."

So the Amiga operating system would be a multitasking design, based on ideas of Carl's that academic OS researchers would later name the "microkernel." Carl had invented the approach before it even had a name: the kernel, or core of the operating system, would be small, fast, and capable of doing many things at once, attributes that would then pervade the rest of the operating system.
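The kernel Carl built, Exec, let tasks cooperate by exchanging messages through ports rather than sharing state directly. The sketch below models that idea only in spirit: the class and method names are invented for illustration (they echo, but are not, Exec's real C calls), and a trivial cooperative generator-based scheduler stands in for Exec's preemptive one.

```python
from collections import deque

class MessagePort:
    """A queue that tasks post messages to -- the microkernel's rendezvous point."""
    def __init__(self):
        self.queue = deque()

    def put_msg(self, msg):          # analogous in spirit to Exec's PutMsg()
        self.queue.append(msg)

    def get_msg(self):               # analogous in spirit to Exec's GetMsg()
        return self.queue.popleft() if self.queue else None

# Two "tasks" written as generators; yielding hands the CPU back to the
# scheduler, so neither task ever monopolizes the machine.
def producer(port, items):
    for item in items:
        port.put_msg(item)
        yield

def consumer(port, received):
    while True:
        msg = port.get_msg()
        if msg is not None:
            received.append(msg)
        yield

def schedule(tasks, steps):
    """Round-robin scheduler: give every live task one time slice per step."""
    for _ in range(steps):
        for t in list(tasks):
            try:
                next(t)
            except StopIteration:
                tasks.remove(t)      # task finished; drop it from the run queue

port, received = MessagePort(), []
schedule([producer(port, ["draw", "beep"]), consumer(port, received)], steps=4)
```

The design choice worth noticing is that the kernel itself stays tiny: it only moves messages and switches tasks, and everything else (file systems, GUIs, drivers) is just another client of those two services.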

The decision to build a multitasking kernel had a huge impact on how the Amiga performed, and its effects can still be felt today. The mainstream PC market did not gain true multitasking until 1995 (with Windows 95), and the Macintosh not until 2001 (with OS X), so an entire generation of software developers grew up on those platforms without knowing what multitasking offered. The Amiga, which had the feature from its inception, gave its developers and users a different mindset: the user should never have to wait for the computer. As a result, programs developed for the Amiga tend to have a more responsive feel than those developed for other platforms.


Adding a GUI

There was one more significant design decision that was made about the Amiga at this time: to design it with a graphical user interface. Most personal computers at the time were controlled by a command line interface; the user had to type in the name of a program to run it and enter a long series of commands to move files or perform maintenance tasks on the computer.

The idea of a graphical user interface was not new. Douglas Engelbart had demonstrated most of its features, along with the world's first computer mouse, in 1968, and researchers at Xerox PARC had created working models in the mid-1970s. At the beginning of the 1980s, it seemed everyone was trying to cash in on the graphical interface idea, although implementing it on the primitive computers of the day was problematic. Xerox itself released the Star in 1981, but it cost $17,000 and sold poorly, serving mostly as an inspiration for other companies. Apple's version, the Lisa, came out in 1983. It cost $10,000 and also sold poorly. Clearly, personal computers were price-sensitive, even when they had advanced new features.

Apple solved the price issue by creating a stripped-down version of the Lisa. It took away the large screen, replacing it with a tiny 9-inch monochrome monitor. Instead of two floppy drives, the new machine came with only one. There were no custom chips to accelerate sound or graphics. As much hardware as possible was removed from the base model, including most of the memory: the operating system was completely rewritten to squeeze into 128 kilobytes of RAM. That stripped-down operating system could run only one application at a time; it couldn't even switch between paused tasks.

This was the Macintosh, which was introduced to the world in dramatic fashion by Steve Jobs in January of 1984. What most people don't remember about the Macintosh was that initially it was not a success—it sold reasonably well in 1984, but the following year sales actually went down. The Mac in its original incarnation was actually not very useful. The built-in word processor that came with the machine was limited to only eight pages, and because of the low memory and single floppy drive, making a backup copy of a disk took dozens of painful, manual swaps.

The Amiga operating system team wasn't thinking like this. The hardware design group wasn't compromising and stripping things down to the bare minimum to save money, so why should they?

The Amiga user interface WorkBench
The Amiga user interface, Workbench.

One of the more difficult parts of writing a graphical user interface is building the low-level plumbing, called an API (Application Programming Interface), that other programmers will use to create windows, menus, and other objects on the system. An API needs to be done right the first time: once it is released to the world and becomes popular, it can't easily be changed without breaking everyone's programs. Mistakes and bad design choices in the original API will haunt programmers for years to come.

RJ Mical, the programmer who had come up with the "Zen Meditation" game, took this task upon himself. According to Jay Miner, he sequestered himself in his office for three weeks, only coming out once to ask Carl Sassenrath a question about message ports. The resulting API was called Intuition, an appropriate name given its development. It wound up being a very clean, easily-understandable API that programmers loved. In contrast, the API for Windows, called Win16 (later updated to Win32) was constructed by a whole team of people and ended up as a mishmash that programmers hated.


Working 90-hour weeks

RJ Mical recalled what life was like back in those busy early days:

"We worked with a great passion... my most cherished memory is how much we cared about what we were doing. We had something to prove... a real love for it. We created our own sense of family out there."


Robert J. Mical

Like the early days at Atari, people were judged not on their appearance or their unusual behavior but merely on how well they did their jobs. Dale Luck, one of the core OS engineers, looked a bit like a stereotypical hippie, and there were even male employees who would come to work in purple tights and pink fuzzy slippers. "As long as the work got done, I didn't mind what people looked like," was Jay Miner's philosophy. Not only was it a family, but it was a happy one: everyone was united by their desire to build the best machine possible.

Why was everybody willing to work so hard, putting in tons of late (and sometimes sleepless) nights just to build a new computer? This above-and-beyond dedication of high-tech workers has been a constant ever since Silicon Valley became Silicon Valley. Companies have often reaped the rewards of workers willing to put in hundreds of hours of unpaid overtime each month. Managers in other industries must look at these computer companies and wonder why they can't get their workers to put in that kind of effort.

Part of the answer lies with the extreme, nearly autistic levels of concentration that are achieved by hardware and software engineers when they are working at peak efficiency. Everyday concerns like eating, sleeping, and personal hygiene often fade into the background when an engineer is in "the zone." However, I think it goes beyond that simple explanation. Employees at small computer companies have a special position that even other engineers can't hope to achieve. They get to make important technical decisions that have far-reaching effects on the entire industry. Often, they invent new techniques or ideas that significantly change the way people interact with their computers. Giving this kind of power and authority to ordinary employees is intoxicating; it makes people excited about the work that they do, and this excitement then propels them to achieve more and work faster than they ever thought they could. RJ Mical's three-week marathon to invent Intuition was one such example, but in the story of the Amiga there were many others.

The employees of Amiga, Inc. needed this energy and passion, because there was a hard deadline coming up fast. The Consumer Electronics Show, or CES, was scheduled for January 1984.


The January CES and the buyout of Amiga

CES had expanded significantly since its inception in 1967. The first CES was held in New York City, drawing 200 exhibitors and 17,500 attendees. Among the products that had already debuted at CES were the VCR (1970), the camcorder (1981), and the compact disc player (also 1981). CES was also home to the entire nascent video game industry, which would not get its own expo (E3) until 1995.

Amiga, Inc. didn't have a lot of money left over for shipping its prototype to the show, and the engineers were understandably nervous about putting such a delicate device through the rigors of commercial package transport. Instead, RJ Mical and Dale Luck purchased an extra airline seat between the two of them and wrapped the fledgling Amiga in pillows for extra security. According to airline regulations, the extra "passenger" required a name on the ticket, so the Lorraine became "Joe Pillow," and the engineers drew a happy face on the front pillowcase and added a tie! They even tried to get an extra meal for Joe, but the flight attendants refused to feed the already-stuffed passenger.

The Lorraine prototype
The Lorraine prototype, with its three custom "chips."

The January 1984 CES show was an exciting and exhausting time for the Amiga engineers. Amiga rented a small booth in the West Hall at CES, with an enclosed space behind the public display to showcase their "secret weapon," the Lorraine computer. A guarded door led into the inner sanctum, and once inside people could finally see the massive breadboarded chips, sitting on a small table with a skirt around the edges. Skeptical customers would often lift the skirt after seeing a demonstration, looking for the "real" computer underneath.

The operating system and other software were nowhere near ready, so RJ Mical and Dale Luck worked all night to create software that would demonstrate the incredible power of the chips. The first demo they created was called Boing and featured a large, rotating checkered ball bouncing up and down, casting a shadow on a grid in the background, and creating a booming noise in stereo every time it hit the edge of the screen. The noise was sampled from Bob Pariseau hitting the garage door with one of the team's celebrated foam baseball bats. The Boing Ball would wind up becoming an iconic image and became a symbol for the Amiga itself.

The Amiga Boing Ball demo
The famous Amiga Boing Ball demo.

The January CES was a big success for the Amiga team, and the company followed it up by demonstrating actual prototype silicon chips at the June CES in Chicago, but the fledgling company was rapidly running out of money. CEO Dave Morse gave presentations to a number of companies, including Sony, Hewlett-Packard, Philips, Apple, and Silicon Graphics, but the only interested suitor was Atari, who lent the struggling company $500,000 as part of a set of painful buyout negotiations. According to the contract, Amiga had to pay back the $500,000 by the end of June or Atari would own all of their technology. "This was a dumb thing to agree to but there was no choice," said Jay Miner, who had already taken a second mortgage out on his house to keep the company going.

Fortunately for Amiga (or unfortunately, depending on how you imagine your alternate histories) Commodore came calling at the last minute with a buyout plan of its own. It gave Amiga the $500,000 to pay back Atari, briefly thought about paying $4 million for the rights to use the custom chips, and then finally went all in and paid $24 million to purchase the entire company. The Amiga had been saved, but it now belonged to Commodore.


Chapter 4: Enter Commodore

Deus ex machina

The company that rescued Amiga in 1984 was the creation of a single man. Born in Poland in 1928 as Idek Tramielski, he was imprisoned in the Nazi work camps after his country was invaded in World War II. Rescued from the camps by the US Army, he married a fellow concentration camp survivor named Helen Goldgrub, and the two emigrated to the United States. Upon arrival, he changed his name to Jack Tramiel.

Jack Tramiel
Jack Tramiel
Jack enlisted in the US Army in 1948 and served in the Equipment Repair Office. He served two tours of duty in Korea, then left the Army to work at a small typewriter repair company. In 1955, Jack and his wife left for Canada to start their own typewriter manufacturing firm. Jack wanted a military-sounding name for the company, but General and Admiral were already taken, so he settled on Commodore after seeing a car on the street with that name.

The little firm grew quickly, going public in 1962, but it later became enveloped in a financial scandal that threatened to consume it. Jack was a survivor, however, and would not give up. He found a financier named Irving Gould, who purchased a large chunk of Commodore and moved it in new directions. Inexpensive Japanese typewriters were eating into Commodore's profits, so the company switched to selling calculators. Then cheap calculators from Japan and from US firms like Texas Instruments threatened to take that business away as well. Jack realized that in order to survive the price wars, he needed to control the chips that went into his calculators. In 1976 he bought MOS Technology, the company whose engineers had left Motorola to produce the legendary 6502 chip that ended up in the Apple ][, various game consoles, and the Atari 400/800 series.

The MOS purchase got Commodore into the computer business, starting with the PET, then the low-cost VIC 20, and finally in 1982 the company released the best-selling personal computer model of all time: the Commodore 64.


Commodore 64
The 64 was a huge hit, selling over 22 million machines over its life span and firmly cementing Commodore as one of the major players in the burgeoning personal computer industry. However, things were not all rosy at the company.

Jack was determined not just to compete with other computer companies, but to destroy them. "Business is war," was his motto, and while the price war he initiated did take out some competitors—including getting revenge against TI, which withdrew from the computer business in October 1983—it also strained Commodore's profits. Tramiel often fought with Gould over matters of money: the financier wanted Jack to grow the business without any extra capital, but Jack wanted more cash in order to lower costs and thus wipe out the rest of his competitors. "We sell computers to the masses, not the classes," he once said, reflecting the price difference between a $199 Commodore 64 and machines from Apple and IBM that cost thousands of dollars.

In the end, as is often the case when battling your financiers, the money people won. Jack Tramiel was forced out of his own company by the board of directors in late 1983.

This ouster would have a huge effect on the fledgling Amiga company, because Tramiel did not go quietly.

Jack Tramiel was a study in conflicts and contradictions, like any human being, but more so. His hardheaded management style made him enemies, but also made him steadfast friends—many key employees quit Commodore when he left to join him in his new ventures. His tendency to jump from project to project paid huge dividends when the company moved from the PET to the VIC-20 to the Commodore 64, but that same line of thinking hurt the company when ill-conceived successors such as the Plus/4 failed in the marketplace.

So it should come as no surprise that Tramiel's departure from Commodore both saved and doomed the Amiga at the same time.

Before Tramiel had left, Commodore had already engaged in halfhearted talks to purchase the struggling Amiga, Inc., but nothing had come from them. Atari was developing a new personal computer and game console and wanted access to the Amiga chipset. The initial offer was for $3 a share and kept getting lower. When it hit 98¢ per share, both sides walked away from the table. It was at this point that Atari "loaned" Amiga $500,000 to continue operations for a few more months.

This poisonous deal was put together by none other than Jack Tramiel, who had managed to purchase Atari's computer division after being kicked out of Commodore. Due to the video game crash of 1983, Atari's parent company Warner Communications had been looking to dump the computer and home console video game portions of Atari (they would retain the arcade division, which was still doing well), and Tramiel managed to work out a spectacular deal that gave him ownership of Atari's computer division for no money down.

When Jack left Commodore for Atari, the former company's stock fell while the latter's rose, as public opinion still considered (and rightly so) Jack to have been the driving force that built Commodore's success. A steady flow of engineers followed Tramiel to Atari, which prompted Commodore to sue Atari for theft of trade secrets. (Tramiel, in his inimitable style, would later countersue; both lawsuits were eventually settled out of court.) To compete with Tramiel and regain the engineering talent they had lost, Commodore decided to purchase Amiga wholesale. Keeping the original Amiga team intact saved the computer as Jay Miner and the others had originally envisioned it.

However, it also made Tramiel more determined than ever to get his revenge on Commodore. That revenge would come in the form of the Atari ST—sometimes called the Jackintosh—which was rushed into production to compete against the Amiga. Had Jack never been kicked out of Commodore, the Atari line of computers might have just faded into oblivion after Warner had dumped the company. The competition between Amiga and Atari would wind up hurting both platforms as they focused their resources on fighting each other rather than making sure they had a place in a world increasingly dominated by the IBM PC.

Still, all that was in the future, and the Amiga team, now a fully-owned subsidiary of Commodore, had but one thing on their minds: finishing the computer.


Finalizing the design

One huge benefit of being owned by a large computer company was that the Amiga team no longer (for the moment, anyway) had to worry about money. The team moved 10 miles to a spacious rented facility in Los Gatos, California. They could afford to hire more engineers, and the software development team went from ten people sharing a single Sage workstation to everyone having their own Sun workstation on their desk.

The influx of resources made the release of the Amiga computer possible, but it was still a race against time to get the computer finished before the competition took away the market.

While the hardware was mostly done, pending a few adjustments by Jay Miner and his team, the software (as is usually the case in high-tech development) was falling behind schedule. The microkernel, known as Exec, was mostly complete, thanks to the brilliant work by Carl Sassenrath, and the GUI was coming together as well, building on RJ Mical's solid framework (for a short time, his new Commodore business card read "Director of Intuition").

However, there was a third layer necessary to complete the picture. Exec, like modern microkernels, handled basic memory and task management, but there was still a need for another component to handle mundane tasks such as the file system and other operating system duties.


The CAOS debacle

Originally, that third layer was known as CAOS, which stood for the Commodore Amiga Operating System. Exec programmer Carl Sassenrath wrote up the design spec for CAOS, which had all sorts of neat features such as an advanced file system and resource tracking. The latter was a method of keeping track of such things as file control blocks, I/O blocks, message ports, libraries, memory usage, shared data, and overlays, and freeing them up if a program quit unexpectedly. As the Amiga software engineers were already behind schedule, they had contracted out parts of CAOS development to a third party. Still, as is often the case in software, the development hit some unforeseen roadblocks.

According to Commodore engineer Andy Finkel, the management team "decided that it wouldn't be possible to complete [CAOS] and still launch the Amiga on time, especially since the software guys had already given up weekends at home. And going home. And sleeping."

Lack of time wasn't the only problem. The third-party development house learned that Amiga, Inc., had been bought out by Commodore, and they suddenly demanded significantly more money than had originally been agreed upon. "Commodore tried to negotiate with them in good faith, but the whole thing fell apart in the end," recalled RJ Mical, who was upset by the whole event. "It was a jerk-butt thing that they did there."


TripOS to the rescue

When the CAOS deal fell apart, the Amiga team suddenly needed a replacement operating system. Relief came in the form of TripOS, written by Dr. Tim King at the University of Cambridge in the 1970s and early '80s, originally for the DEC PDP-11 and later ported to the 68000. Dr. King formed a small company called MetaComCo to quickly rewrite TripOS for the Amiga, where it became known as AmigaDOS.

AmigaDOS handled many of the same tasks as CAOS, but it was an inferior replacement. "Their code was university-quality code," said Mical, "where optimized performance was not important, but where theoretical purity was important." The operating system also lacked resource tracking, which hurt the overall stability of the system. This oversight had repercussions that remain to this day: the very latest PowerPC-compiled version of AmigaOS will still sometimes fail to free up all resources when a program crashes.

Interestingly, TripOS (and thus AmigaDOS) was written in the BCPL language, a predecessor of C. Later versions of the operating system would replace this code with a combination of C and assembly language.

With the kernel, OS, and GUI ready, and with last-minute adjustments made to the custom chips, all that remained was designing a case for the system, which had been dubbed the Amiga 1000. Jay Miner felt it would be appropriate for the signatures of all 53 Amiga team members, both Amiga, Inc. employees and the Commodore engineers who later joined the project, to be preserved on the inside of the computer's case. Both Joe Pillow and Jay's dog Mitchy got to sign in their own unique ways.

Dave Morse, who was still nominally in charge of Commodore Amiga, added his own idea for the case: a raised "garage" on the bottom that users could slide their keyboards into when not in use.

There was only one potential stumbling block preventing the release of the Amiga 1000: the decision about how much RAM to put in the system. Cost-conscious Commodore wanted to ship with only 256KB. Knowing that the operating system and GUI needed more memory, Jay insisted on shipping with 512KB. The two sides were unable to come to an agreement, so a compromise was reached: the Amiga would ship with 256KB but come with an easily-accessible expansion cage on the front of the case that could accommodate more memory. Jay would later say that he had to "put his job on the line" just to get Commodore to put the expansion port in.

Amiga 1000
The final Amiga 1000 design

Now that all the pieces were in place, Commodore decided to announce the Amiga to the world. For the first time in the company's history, management decided to pull out all the stops. The Amiga announcement would be the most lavish and expensive new product showcase in the history of personal computers.


The announcement

Commodore rented the Lincoln Center and hired a full orchestra for the Amiga announcement ceremony, which was videotaped for posterity. All Commodore employees were given tuxedoes to wear for the event: RJ Mical one-upped the rest by finding a pair of white gloves to complete his ensemble. The band played a jaunty little number with tubas and xylophones as a brilliant laser display revealed the Amiga name in its new font.

The master of ceremonies was Commodore marketing vice president Bob Truckenbrode, but he soon gave way to the real star of the show: the head of software engineering, Bob Pariseau. With his long hair elegantly tied back in a ponytail, Pariseau directed the demonstration like a maestro conducting a symphony. With each wave of his hand, he would signal his counterpart, sitting at a real Amiga 1000, to demonstrate each new feature.

Robert Pariseau
Robert Pariseau

"At Amiga, the user controls how he uses his time, not the computer," Pariseau said, as his assistant showed the flexibility of the then-new graphical user interface. He then brought up a graphical word processor called TextCraft to show how a GUI could be applied to everyday work: the word processor featured menus, toolbar buttons, and an on-screen ruler for setting margins and tab stops. Pedestrian stuff for 1995, but astounding for a decade earlier!

Then he moved on to showing off the Amiga's graphics capabilities, showing all 4,096 colors at once on the same screen, followed by a close-up photo of a baboon's face in 640 by 400 resolution: an image that many people might remember gazing at in VGA monitor advertisements from the early 1990s.

Baboon picture
It's looking at you!

From static images, he moved on to the Amiga's strong suit: animation. The custom chips included hardware commands to flood fill arbitrary areas; anyone who remembers using flood fill in Photoshop on older computers will recall how slow it was when it had to rely on the CPU. The Amiga's hardware-accelerated version filled multiple rotating and intersecting triangles with different colors as they spun across the screen, all at a constant 30 frames per second. Another animation demo, Robot City, showed off the Amiga's built-in sprite and collision-detection features, allowing large animated characters to move over complex backgrounds and interact with one another.

Hardware triangles
Hardware flood fills

None of the demos took over the entire computer to do their magic. Each full-screen demo could be smoothly slid down to reveal other running applications beneath.

The concept of multitasking was virtually unknown to personal computer users in 1985, and Bob went through several examples of how the feature could serve not just entertainment but business applications as well. A bar chart and a pie chart were built simultaneously from the same numerical data, and the user could quickly switch from one window to the other to see the results in either format.

Moving on from graphics to sound, Bob demonstrated the four-channel synthesized sound hardware by using the keyboard as a virtual piano playing various sampled instruments. "With all four channels going simultaneously, the 68000 [CPU] is idle," Pariseau commented, something that would not be true of other computers for many years, until sampled-waveform sound cards became available for PCs. A close-up of the Amiga operator at the keyboard showed his fingers shaking slightly—there was a lot riding on these demos, and the software was brand new and still largely untested. Yet the Amiga performed masterfully in its first time on stage, without crashing once.

The next demonstration was of computer-generated speech: the Amiga spoke in a male voice, a female voice, a fast and a slow voice, and all were pitch-modulated to sound more like a real person; the last voice was spoken in a monotone, "just like a real computer." This line got a good laugh from the audience.

Even back in 1985, the market was already showing signs of standardizing on the IBM PC platform, and Bob acknowledged this fact in his speech. "You know, it's hard," he said, "it's hard to be innovative in an industry that has been dominated by one technology for so long. We at Commodore Amiga knew that to do this [introduce a new platform] we had to be at least an order of magnitude better than anything anyone had ever seen.

"We've done that," he continued, "and then we decided: why stop there? Why not include that older technology in what we had already done?" Thus was set the stage for the very first IBM PC emulator on the Amiga, called Amiga Transformer. The program was started up, a PC-DOS installation disk was placed into an attached 5.25-inch floppy drive, and this was then replaced with a Lotus 1-2-3 disk. "Standard, vanilla, IBM DOS," Bob said with a sigh, and the crowd laughed again. Compared to the exciting graphics and sound demos of a few minutes earlier, it was a bit of a letdown seeing the industry-standard spreadsheet take over the screen.

To lighten the mood, Bob finished off with a replay of the original Boing Ball demo that was first shown at CES only a year earlier. "We've lived our dream," he said, "and seen it come to life. Now it's your turn. What will you do with the Amiga Computer?"

Two unlikely celebrities were then invited on stage to show what creative folks might do with their Amigas. Deborah Harry, the lead singer of Blondie, walked on stage along with counterculture art icon Andy Warhol, who took a quick appreciative glance at her red dress as they sat down. "Are you ready to paint me now?" Debbie asked, her voice slightly nervous.
Andy and Debbie Harry

Andy sat down in front of the Amiga 1000, looking at it like it was some kind of alien technology from another world. "What other computers have you worked with?" asked resident Amiga artist Jack Hager. "I haven't worked on anything," Andy replied truthfully. "I've been waiting for this one." A nearby video camera was attached to a digitizer, and from this setup a monochrome snapshot of Debbie's face appeared on the Amiga screen, ready for Andy to add a splash of color.

It is a cardinal rule of doing computer demos in public that you never let anyone else take control of the machine, lest they do something off-script that winds up crashing the computer. The paint program being used (ProPaint) was a very early alpha, and the software engineers knew that it had bugs in it. One of the known bugs was that the flood fill algorithm—the paint program didn't use the hardware fills that were demonstrated earlier—would crash the program roughly every second time it was used. Yet there was Andy clicking here, there, and everywhere with the flood fill. Somehow, the demo gods were smiling on Amiga that day, and the program didn't crash. "This is kind of pretty," Andy said, admiring his work. "I think I'll keep that."

Debbie Harry on Amiga
The finished product

The show ended with a short video—powered by the Amiga—of a wireframe ballerina, who then turned into a solid-shaded figure, and finally a fully rotoscoped animated image. A real ballerina then came out on stage and danced in sync with her animated counterpart.


Reactions to the show

While the crowd attending the show went away extremely impressed with what they had seen, the reaction from the rest of the world was mixed. Articles about the demo were published in magazines such as Popular Computing, Fortune, Byte, and Compute. The Fortune article both praised and dismissed the Amiga at the same time: "While initial reviews praised the technical capabilities of the Amiga, a shell-shocked PC industry has learned to resist the seductive glitter of advanced technology for its own sake."

Think about that last line for a few moments. Can any computer user today honestly say that color, animation, multichannel sound, and multitasking are merely seductive glitter that exists only for its own sake? Like Doug Engelbart's revolutionary demonstration of the first mouse-driven graphical user interface back in 1968, many of the ideas shown in the Amiga unveiling were a little too far ahead of their time, at least for some people.

Nevertheless, Commodore had some great buzz leading up to the introduction of the Amiga 1000. The machine had great hardware and software. It had features that no other computer could even hope to emulate. Freelance writer Louis Wallace described it thusly: "To give you an idea of its capabilities, imagine taking all that is good about the Macintosh, combine it with the power of the IBM PC-AT, improve it, and then cut the price by 75 percent." This last part was a bit of an exaggeration, but not by much: the final price of the Amiga 1000 was set at $1,295 for the 256KB version and $1,495 for the 512KB one. This compared favorably to the Macintosh, which had only 128KB and sold for $2,495.

Commodore looked like it had everything going for it. The new Amiga computer was years ahead of the competition, and many people in the company—including Jay Miner—felt that they had a real chance to significantly impact the industry. Sitting in the crowd during the Amiga's unveiling was Thomas Rattigan, an enthusiastic executive who had come from Pepsi and was being groomed for the position of CEO at Commodore. He had big plans for the Amiga. The original designers had achieved their dream by creating the Amiga from nothing, but now bigger dreams were being imagined for the little computer.

Unbeknownst to him, however, larger forces were at work that would turn these dreams into nightmares.

Chapter 5: Postlaunch blues

On the cusp of greatness

By July 1985, Commodore had everything going for it. The Amiga computer had been demonstrated in public to rave reviews, and everyone was excited at the potential of this great technology.

That's when the problems started.

Commodore's primary woes were always about money, and 1985 was no exception. Sales of the Commodore 64 were still going strong, but the price wars had slashed the profits on the little computer. The company had invested millions of dollars creating new and bizarre 8-bit computers that competed directly against the venerable C-64, such as the wholly incompatible Plus/4, which had no chance in the marketplace. To make things worse, the company had to deal with lawsuits from its ousted founder, Jack Tramiel. Finally, Commodore had invested $24 million to purchase Amiga outright, but as the computer had not gone on sale yet, there was no return on this investment.
The Commodore Plus/4

All these financial problems put a strain on the company's ability to get the Amiga ready to sell to the public. Without a lot of spare cash, it was difficult to rush the production of the computer. Further software delays pushed back the launch as well. The end result was that the Amiga did not go on sale until August of 1985.

This wouldn't have been a huge problem, had Commodore been able to gather enough resources to ship the machine in quantity. Instead, production delays meant that the computers trickled off the assembly lines, and by October there were only 50 Amiga 1000 units in existence, all used by Commodore for demos and internal software development.

This delay was doubly crippling because Jack Tramiel had managed to rush the development of the Atari ST, using off-the-shelf chips and an operating system and GUI purchased from Digital Research. Tramiel was able to show the ST off at the January CES and started taking orders for the computer shortly thereafter. This sudden competition from Commodore's former CEO took everyone by surprise.

The Atari ST
Atari ST

Missing Christmas

Amiga 1000 computers did not start to appear in quantity on retail shelves until mid-November 1985. This was too late to make a significant impact on the crucial holiday buying season. Most retailers make 40 percent or more of their yearly sales over the holidays, and Commodore had missed the boat.

To make matters worse, the company was not really clear about how it was going to sell the computer. The Commodore 64 had been sold at big retail chains like Sears and K-Mart, but marketing executives felt that the Amiga was better positioned as a serious business computer. Astoundingly, Commodore actually turned down Sears' offer to sell Amigas. Back in the 1980s, Sears was a major player in computer sales; I personally used to cherish parental shopping visits so that I could get my hands on the latest in computer technology. The Atari ST was sold there, but the Amiga was not.

Even these blunders might have been mitigated had Commodore come up with some truly amazing advertising campaigns to drum up interest in the new computer. The delays gave the company extra time to do this, but what Commodore came up with was so awful that it sickened many of its own employees.


Bad advertising

Because the Amiga was years ahead of its time compared to the competition, many Commodore executives believed that the computer would sell itself. This was not—and has never been—true of any technology. When personal computers first came on the scene in the late 1970s, most people had no idea what they would be useful for. As a result, the only people who bought them initially were enthusiastic and technically skilled hobbyists—a limited market at best. It took a few killer applications, such as the spreadsheet, combined with an all-out marketing assault, to drive sales to new levels.

The Amiga was in the same position in 1985. It was a multimedia computer before the term had been invented, but there were no killer applications yet. What it needed was a stellar advertising campaign, one that would drive enough sales to get software companies interested in supporting the new platform. Instead, what it got was a half-hearted series of television ads that ran over Christmas and were never seen again. The first commercial had a bunch of zombie-like people shuffling up stairs towards a pedestal, from which a computer monitor emanated a blinding light. It was a poor copy of Apple's famous 1984 advertisement, and failed to generate even a tiny amount of buzz in the industry.



From there, things got worse. The next ad was a rip-off of the ending of 2001: A Space Odyssey and featured an old man turning into a fetus. Some pictures of the commercial's production made their way to the Commodore engineers, and soon the "fetus on a stick" became a standard joke about their company's marketing efforts.

Further advertising used black-and-white and sepia-toned footage of typical family home movies, with some vague narration: "When you were growing up, you learned you faced a world full of competition." Amiga did indeed face a world full of competition, but this kind of lifestyle avant-garde advertising was already being done—and being done much better—by Apple.

What Commodore really needed at that time was some simple comparative advertising. A picture of an IBM PC running in text mode on a green monochrome screen, then a Macintosh with its tiny 9-inch monochrome monitor, then the Amiga with full color, multitasking, animation, and sound. For extra marks, you could even put prices under all three.

As a result of Commodore dropping the ball on production and marketing, the firm sold only 35,000 Amigas in 1985. This didn't help with the balance sheet, which was getting grim.


Missing CES

Commodore had experienced a financial crunch at the worst possible time. In the six quarters between September 1984 and March 1986, Commodore Business Machines International lost over $300 million. Money was tight, and the bean-counters were in charge.

As a result, Commodore was a no-show for the January 1986 Consumer Electronics Show (CES). Ahoy! Magazine reflected on this conspicuous absence:

Understand that the last four CES shows in a row, dating back to January 1984, Commodore's exhibit had been the focal point of the home computer segment of CES, the most visited computer booth at the show—as befitted the industry's leading hardware manufacturer. Their pulling out of CES seemed like Russia resigning from the Soviet Bloc.

Commodore also missed the following computer dealer exhibition, COMDEX, as well as the June 1986 CES. The company had defaulted on its bank loans and could not get the bankers to lend any more money for trade shows.

The company's advertising also slowed to a trickle. Thomas Rattigan, who was being groomed to become Commodore's CEO, recalled those troubling times. "Basically, the company was living hand to mouth," he said. "When I was there, they weren't doing very much advertising because they couldn't afford it."

This strategic retreat from the market had a hugely negative impact on Amiga sales. In February 1986, Commodore revealed that it was moving between 10,000 and 15,000 Amiga 1000 computers a month. Jack Tramiel's Atari ST was beating the Amiga in sales figures, in signing up dealers, and worse still, in application support.


"They fucked it up"

Many Amiga engineers felt betrayed by Commodore's financial ineptitude and pathetic marketing efforts. They were disgusted that their company could take such an advanced and powerful computer and fail to capitalize on it. Most of these bad feelings were confined to grumblings in the hallways, but some of them wound up hurting the Amiga directly.

One of the software engineers working on upgrades to Workbench, the Amiga's graphical desktop environment, decided he would "get back" at Commodore for its failure to properly market the Amiga. He programmed in a hidden message, commonly known as an "Easter egg" in the software industry, that would appear only when the user pressed a certain combination of keys simultaneously. The message was "We made the Amiga, they fucked it up."

RJ Mical got a slight chuckle out of the message, but told the engineer (who remains nameless to this day) that it was unacceptable, and he would have to take it out. The engineer relented, and when Mical checked the final code, the offending text had been replaced with the message "Amiga: born a champion." He thought that was the end of it.

Little did he know that the engineer had added a second Easter Egg with the original message encrypted inside. To get to the message, you had to hold down eight separate keys, which would pop up the text "We made the Amiga" on the screen. If you kept the keys held down, and were very dexterous or had a friend to help you, inserting a floppy in the drive would flash the latter part of the message for 1/60th of a second. The engineer thought that nobody would ever see this last part, but because the Amiga could output its graphics directly to video, you could just tape the whole experience and press pause on the VCR to see it.

The message was discovered embedded in the ROMs for the European PAL version of the Amiga 1000, just after the computer had gone on sale in the United Kingdom. Managers at Commodore UK pulled tens of thousands of Amigas off the shelves and refused to sell the machines until replacement ROM chips were sent out that excised the offending message. The little joke by a single software engineer cost the Amiga over three months of sales in a major market and had ramifications that shook the whole company.


Leaving Los Gatos

After the Easter Egg fiasco, Commodore management decided that they should move the Amiga team closer to headquarters so that they could keep a closer eye on their activities. The Amiga engineers were asked to move across the country, from their offices in Los Gatos, California, to West Chester, Pennsylvania.

Many of the engineers shrugged their shoulders and started packing, but for some this was the last straw. RJ Mical, the software guru who had written the Intuition programming interface and designed much of the Amiga's GUI, decided that his future would lie elsewhere. He wound up working as an independent contractor on Amiga peripherals and software, including an early video capture device called a frame grabber.

Despite his issues with Commodore, Mical still was proud of the role he played in developing the Amiga. "Those were such cool days, you just couldn't believe it," he would later tell Commodore documentary author Brian Bagnall. "It was one of the most magical periods of my entire life working at Amiga. God, what an incredible thing we did."

The father of the Amiga, Jay Miner, also refused to switch coasts. While he left Commodore as an official employee, he continued to work as a consultant for them for many years. He also donated much of his time giving talks to Amiga user groups around North America, telling the story of how he brought his dream computer to life.


Searching for stability

The trials and tribulations of Commodore Business Machines International weren't the only problems that dogged the young Amiga computer. The initial release of the operating system was rushed, and as a result the first Amiga 1000 machines shipped with many bugs in the OS. The "Guru Meditation" error that started as a joke in the Amiga offices would come to haunt many early Amiga users.

The infamous Guru Meditation error

Because the OS lacked memory protection, a fatal error in the OS or even in an application could lock up the system completely, forcing a reboot. Users might be taking advantage of the multitasking abilities of the Amiga to run many programs at once, only to lose work in all of them when the machine went down. The PC, Macintosh, and Atari ST, which had much simpler operating systems that could only run one application at a time, did not suffer from this problem.

As a result, the Amiga gained a reputation for instability that would stay with the machine for many years to come. The lack of memory protection wasn't the real problem—an operating system with full memory protection can still be brought down by a bug in the OS itself, and an application that crashes all the time isn't useful even if the OS keeps running. The software engineers at Commodore worked tirelessly to track down these bugs and eliminate them, as did the application developers. Years later, most Amiga users would run many applications at once and keep their machines operating for weeks and even months without crashing or requiring a reboot. However, the initial stability problems hurt the reputation of the Amiga—and it carried this reputation for the rest of its life.
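For readers who have never used a machine without memory protection, the following toy Python model shows the underlying problem (all names, sizes, and offsets here are invented for illustration; this is not AmigaOS code): when every task shares one flat address space, a single stray write by one buggy task can silently destroy another task's state.

```python
# Toy model of a machine with no memory protection: one flat address
# space shared by all tasks. (Purely illustrative; not AmigaOS code.)
RAM = bytearray(16)

# "Task B" keeps its state at offsets 8-12.
RAM[8:13] = b"hello"

# "Task A" owns offsets 0-7, but an off-by-four bug writes 12 bytes.
for i in range(12):          # should have been range(8)
    RAM[i] = 0

# Task B's state is now silently corrupted. With hardware memory
# protection, the stray writes would instead have faulted only Task A.
print(bytes(RAM[8:13]))      # b'\x00\x00\x00\x00o' - "hello" is mostly gone
```

On a machine with an MMU, Task A's out-of-bounds writes would trigger a fault and the OS could terminate just that task; without one, as on the Amiga, the corruption goes unnoticed until something crashes.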

Monday, October 8, 2007

Welcome

This blog is intended to be a forum to publish my views on the world wide web. I tried keeping my diary online from 2004-2006, but it was met with universal disinterest. I decided that there was no benefit to writing an online journal except that it was legible, unlike my handwriting (perhaps I will tell that story at a later date). In the past year or so I have wanted to put things out there on the internet from time to time. Things you might run a Google search on to see if anyone else has thought of the same thing. Or topics that amaze you with the complete lack of attention given to them.

These are the sort of things that a blog is all about, in my opinion. It's more of an electronic publishing forum than a way to share your day-to-day experiences. I think an isolating factor for me in the diary-keeping aspect is that I don't live in North America. The U.S.A. (and its close cousin, Canada, 75% of whose population lives within 150km of the border) is the third most populous nation in the world. If that's not enough, it has the most computer users per capita. So even if the entire planet's population spoke the same language, the US would have fairly large representation. But in the English-speaking world, the result is that everyone on the internet "by default" lives somewhere in North America, with occasional exceptions such as Britons, Australasians, and fluent English speakers in places like India.
In Australia or Britain we are intimately familiar with North America through prodigious consumption of movies, television shows, and internet sites that are all made there. However, the reverse is not true, especially for Australia. They're about as familiar with us as we are with, say, South Africa: the average North American would know some basic facts and stereotypes about Australia, but nothing beyond that.

So when journalling on, e.g., LiveJournal, the fact alone that I do not live in North America is enough to cause disinterest. There is too great a cultural divide. I am thousands of kilometres away in an entirely different region of the world. There are different names and spellings for the same things. So this blog will be primarily aimed at an Australian audience, or at people who don't mind reading things written by someone in Australia.