Thursday, January 12, 2012

The Myth of Obsolescence

There's an insidious belief held by many people that computer hardware inevitably becomes obsolete and useless after only a few years, at which point it must simply be thrown away as e-waste and replaced with the "next new thing." It's true that the capabilities of computer hardware continue to improve at an exponential rate, but that doesn't mean a computer that was good enough to perform a task in the past is no longer adequate today, simply because something else has come along that can perform the same task slightly faster, or otherwise better.

One point I'm trying to make with this series on old computers is that we don't have to buy into the assumptions of planned obsolescence that companies often try to foist on consumers in order to make more money. In fact, with nearly 7 billion people on the planet today, our collective environmental impact, along with pressures like peak oil and global climate change, will force us to use our natural resources far more conservatively in the 21st century than the United States and other industrialized countries ever had to consider in the last half of the 20th.

If we continue to treat computers and other electronics as cheap, disposable products, and give no thought to reusing our older equipment, then eventually new products will become more expensive as demand rises and the supply of raw materials shrinks. They'll probably be of lower quality, too: when people don't expect something to last, they're less likely to demand (and be willing to pay the slightly higher price for) products that are designed to last.

Where hardware is concerned, you often do get what you pay for, and high-quality components that were expensive when new are often available used for a tiny fraction of the original price. The Apple IIe (Platinum series) and Apple IIc+ (upgraded to 8MHz) that I recently purchased on eBay for $154.70 and $249.99 respectively were admittedly nostalgia-driven purchases, but they are so solidly built, and the keyboards are so nice to type on, that I was willing to pay the price. They were also extremely expensive computers when new: I would never have paid $750 in 1988 for an Apple IIc+, or $1400 for an Apple IIe, at a time when an Amiga 500 cost about $550 and was far more powerful. But now that I own one of each, I can almost see why some people were willing to pay such high prices for Apple II gear.

One area I will spend a lot of time covering is the SCSI bus, since it's the interface used to connect hard drives, CD-ROMs, and tape drives to many of my older machines, including the Amiga, the Alphas, and the VAXstations, and it has a number of tricky details. Some years ago, Seagate released an informative white paper explaining why SCSI drives almost always cost more than IDE drives of the same era. The differences all came down to the higher performance and reliability demanded of drives in the server applications where SCSI was used, versus the lower prices that users were willing to pay for IDE drives, accepting lower performance and reliability in exchange.

In the paper they mention an MTBF of 1,000,000 hours for a typical high-end SCSI drive. That's over 114 years of continuous usage! Now, I have no idea how manufacturers justify this estimate without a time machine, but the point is that these are the goals the drives were built to achieve. I had to go through 5 or 6 different 36 GB SCSI drives before I found one that was quiet enough to put in the Amiga: most of them had annoying high-pitched whines, and I have extremely sensitive ears. Fortunately, I live near Weird Stuff, which has an excellent selection of used computer gear and a good return policy, so I could keep swapping drives until I found a suitably quiet one for this project. According to smartctl, it had a lifetime usage of 13756 hours (about 1.5 years) when I bought it (the drive itself is probably about 10 years old). If I'm lucky, this drive will last another 20-30 years, and hopefully the Amiga itself will last as long.
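
If you want to check those numbers yourself, here's a quick sanity check from a shell prompt. The smartctl invocation is only an example (device names vary by OS and drive), and the bc lines just convert hours into years:

    # Ask smartmontools to report the drive's health data, including
    # power-on hours (the device name here is only an example).
    smartctl -a /dev/sd0

    # 1,000,000 hours of MTBF, expressed in years of continuous operation:
    echo "1000000 / (24 * 365)" | bc            # => 114
    # The 13756 power-on hours that smartctl reported for my drive:
    echo "scale=1; 13756 / (24 * 365)" | bc     # => 1.5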

Tuesday, January 10, 2012

Restoring Old Computers

Now that I'm fairly well adjusted to my new, happier life working at Google on Android, I'm returning to blogging with a series of articles on one of my favorite hobbies: restoring and reusing old computers. I'm going to focus on three very different computers: the Commodore Amiga, the DEC Alpha, and the Apple IIe.

What do these systems have in common (besides having names that start with the letter A)? The main feature that attracts me is that none of them uses an Intel processor or runs MS-DOS or Windows (and they're not Macs either). So if your definition of PC is a generic personal computer, then they all qualify, but far more people use the term PC to mean a very specific type of computer, namely one with an Intel (or compatible) "x86" processor that runs Windows (as in Apple's famous Get a Mac ad campaign).

If you're under 25, you might be too young to remember this, but there was a time before Microsoft's relentless, predatory, and sometimes illegal business practices forced nearly the entire computer industry to standardize on its Windows-branded operating systems and the Intel (or Intel-compatible) CPUs that ran the most compatible flavors of Windows. It was a time of great innovation. For example, the first Web browser (along with the early versions of HTTP and HTML that form the backbone of the Web itself) was invented by Tim Berners-Lee on the NeXT computer and OS, Steve Jobs's ill-fated proprietary UNIX-based platform of the 90's. NeXT was failing miserably in the marketplace despite being quite advanced technically, and it would have disappeared completely from the mainstream, just like the Amiga, the Alpha, BeOS, OS/2, and many other quirky and often cool platforms disappeared under the onslaught of the MS-DOS/Windows juggernaut, if Mr. Jobs hadn't been canny enough to convince Apple to buy his company. Steve then worked his way back into the reins of power, and NeXTSTEP became the foundation of the now amazingly successful Mac OS X.

The significant thing about the invention of the Web is that it might never have happened if Mr. Berners-Lee had been working on a PC running Windows, because you simply couldn't build something like a Web browser on that platform without running into all sorts of its annoyances and weaknesses. He might have written the first Web browser on an Amiga, or perhaps on a Mac, but I think it's fairly well established that NeXT had the best development environment if you wanted to create powerful and reliable software with an extremely small team (Berners-Lee worked with another computer scientist and an intern) in a short amount of time. It took Windows another 10 years before its development tools really caught up to what NeXT (itself a very small company) created back in 1990.

Similarly, the PC hardware of the 90's left a lot to be desired. Non-autoconfiguring devices, such as cards for the standard 8-bit and 16-bit ISA bus, offered the fun of setting jumpers or DIP switches on the card to manually select a port range, an IRQ, and possibly a DMA channel; making sure your choices didn't conflict with the other cards in the system; and editing your CONFIG.SYS and AUTOEXEC.BAT startup files to inform the OS and your programs of those choices. On top of that, you could spend hours trying to get all your MS-DOS drivers to load into "high memory" in order to maximize the "conventional memory" below 640K, plus a whole bunch of other annoying chores that I remember wasting literally hundreds of hours on myself as a young PC hacker, before the PCI bus came along and things slowly started catching up to what the other platforms had been doing right all along.
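
For anyone who never had the pleasure, the ritual looked roughly like this. The driver names, paths, and sound card settings below are purely illustrative, not from any particular machine of mine:

    REM --- CONFIG.SYS: load the memory managers, then push drivers high ---
    DEVICE=C:\DOS\HIMEM.SYS
    DEVICE=C:\DOS\EMM386.EXE NOEMS
    DOS=HIGH,UMB
    DEVICEHIGH=C:\DRIVERS\ASPI4DOS.SYS
    DEVICEHIGH=C:\DRIVERS\CDROM.SYS /D:MSCD001

    REM --- AUTOEXEC.BAT: point programs at the sound card, load TSRs high ---
    SET BLASTER=A220 I5 D1
    LOADHIGH C:\DOS\MSCDEX.EXE /D:MSCD001
    LOADHIGH C:\DOS\SMARTDRV.EXE

And of course none of it worked unless the port, IRQ, and DMA jumpers on the cards themselves agreed with what you had typed.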

That's one reason I want to point out the Apple II: Woz had "Plug & Play" functionality in the Apple II in 1977, and IBM didn't bother to make its PC similarly user friendly, even though it came out four years later and had the full weight of IBM's engineering resources behind it (compared to a single geek in a garage). IBM didn't rectify this mistake until 1987, when it introduced a new proprietary computer bus called MCA in the PS/2 series, which other vendors couldn't use in their PC compatibles without paying patent royalties to IBM. This didn't sit well with Compaq, HP, Tandy (remember them?), and the rest of the Gang of Nine, who created their own alternative architecture called EISA, which also didn't catch on. VESA Local Bus was another short-lived standard, mainly for video cards, but eventually PCI won out, and we're better off for it. The "PC compatible" architecture still carries a lot of legacy junk from the ISA days, though, even if it's all emulated inside a single chip these days.

The Apple II has auto-configuring cards, my two Alpha workstations use PCI (including a few slots supporting the rare 64-bit variant), and the Amiga includes a proprietary auto-configuring bus called Zorro. So at least from a usability perspective, they're all as friendly to upgrade as a PC from the year 2000, and far superior to the PC's of just five years earlier. It's a similar story on the OS front: MS-DOS and Windows (particularly the 16-bit and CE variants) were broken and pathetic compared to the far more capable OS's of the other platforms, and it wasn't until Windows XP that the PC-compatible industry had an OS that was both easy to use and not completely broken when compared to a typical UNIX system. You could make the same argument about the Mac's OS, which didn't really get good until Mac OS X 10.3 came out in 2003 (in my opinion).

What did UNIX (and other heavy-duty OS's like VMS) do that the wimpier platforms like DOS and Windows could not? Quite a number of things. Virtual memory, with memory protection that kept buggy apps from bringing down the system or interfering with each other. Support for multiple users, with file system security so users couldn't read each other's files without permission or modify critical system files. The X Window System, which grew quickly from simple beginnings because it was extensible enough to support a variety of approaches to making the platform more user friendly and better looking, such as the popular commercial graphical toolkit called Motif (now available as open source) and today's GTK and Qt toolkits that form the basis of the popular GNOME and KDE desktops on Linux and other UNIX systems. X has been so wildly successful that one of the nice features of Mac OS X (for a UNIX geek, at least) is Apple's inclusion of an X server with the OS, which makes running just about any UNIX app really easy, compared to Microsoft's half-hearted efforts to provide UNIX compatibility in Windows, which they only grudgingly seem to support.
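
To illustrate that last point: with Apple's bundled X server running, displaying a graphical UNIX program from a remote machine takes nothing more than ssh's X11 forwarding. The hostname and program below are just examples, and this assumes the remote sshd permits X11 forwarding:

    # From a Terminal window on the Mac, with the X server running:
    ssh -X user@netbsd-box.example.org
    # ...then, on the remote machine, start any X client and it shows up
    # on the Mac's desktop:
    xterm &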

So I'll be running UNIX on the Amiga and the Alphas, specifically the NetBSD flavor, which is open source and well supported on a huge number (57 at last count!) of different platforms, including x86 PC's, of course. I've been running the Amiga version of the 5.x stable branch, and it has been rock-solid reliable for me. I'll be switching over to the current branch that will become NetBSD 6.0, so that I can help fix bugs and submit to the NetBSD community a few changes and improvements that I've written (the current branch is where active work on new features is done).
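
For reference, this is roughly how you fetch the -current sources from the project's anonymous CVS server; the target directory is just my habit, and a release branch would need an extra -r tag:

    # NetBSD development happens in CVS; checking out without a branch tag
    # gives you HEAD, i.e. -current.
    export CVS_RSH=ssh
    cd /usr
    cvs -d anoncvs@anoncvs.NetBSD.org:/cvsroot checkout -P src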

I'll have more to say about NetBSD and the Commodore Amiga series in the next post, so stay tuned if you're interested in this sort of stuff. Comments and feedback are welcome. I also have a Google+ post about some hacking I did to get CD-quality audio out of the Amiga, something I didn't think possible until I learned about the 14-bit hack, the Paula calibration hack, and the 31.5 kHz video hack.