New forms of technology tend not to materialize from thin air. Innovation takes existing, known technologies and remixes, extends, and co-opts them to create novelty.
Gordon Brander refers to it in this piece as "exapting infrastructure." Take the internet: it wasn't nonexistent one day and suddenly connecting all of our computers the next. Nor was it purposely designed from the beginning as a way to connect our millions of computers, phones, and smart TVs. In fact, many types of computers and the things we do with them evolved as a consequence of the internet's expansion, enabled by interconnection to do new things we didn't predict.
Former railroad corridors are regularly reused as cycling trails
"Exaptation" is a term of art in evolutionary biology: the phenomenon of an organism using a biological feature for a function other than the one it was adapted for through natural selection. Dinosaurs evolved feathers for insulation and display, which were eventually exapted for flight. Sea creatures developed air bladders for buoyancy regulation, later exapted into lungs for respiration on land.
In the same way, technologies beget new technologies, even seemingly unrelated ones. In the case of the internet, early modems literally broadcast information as audio signals over phone lines intended for voice. Computers talked to each other this way for a couple of decades before we went digital native. We didn't build a web of copper and voice communication devices to make computers communicate, but it could be made to work for that purpose. Repurposing the existing, already-useful network allowed the internet to gain a foothold without much new capital infrastructure:
The internet didn't have to deploy expensive new hardware, or lay down new cables to get off the ground. It was conformable to existing infrastructure. It worked with the way the world was already, exapting whatever was available, like dinosaurs exapting feathers for flight.
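That modem trick is concrete enough to sketch. Below is a minimal, illustrative Python snippet of frequency-shift keying, the scheme Bell 103-class modems used to exapt voice lines for data: each bit simply becomes one of two audible tones. The tone frequencies and baud rate follow the Bell 103 originating modem; everything else (the sample rate, the demo bitstream) is an arbitrary assumption for the example.

import math

SAMPLE_RATE = 8000             # samples/sec, telephone-grade audio
BAUD = 300                     # Bell 103 signalling rate
FREQ = {0: 1070.0, 1: 1270.0}  # Hz: "space" and "mark" tones (originating side)

def fsk_modulate(bits):
    """Return audio samples (floats in [-1, 1]) encoding bits as tones."""
    samples = []
    phase = 0.0
    samples_per_bit = SAMPLE_RATE // BAUD
    for bit in bits:
        step = 2 * math.pi * FREQ[bit] / SAMPLE_RATE
        for _ in range(samples_per_bit):
            samples.append(math.sin(phase))
            phase += step  # continuous phase: no clicks at bit boundaries
    return samples

audio = fsk_modulate([0, 1, 1, 0, 1, 0, 0, 1])  # one arbitrary byte
print(f"{len(audio)} samples, {len(audio) / SAMPLE_RATE:.3f}s of sound")

Feed those samples to a speaker and you get the familiar modem warble; the phone network never knew it was carrying anything but sound.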
Just like biological adaptations, technologies also evolve slowly. When we're developing new technologies, protocols, and standards, we'd benefit from less greenfield thinking and should explore what can be exapted to get new tech off the ground. Enormous energy is spent trying to brute-force new standards from the ground up when we'd often be better off bootstrapping on existing infrastructure.
Biology has a lot to teach us about the evolution of technology, if we look in the right places.
Investor Esther Dyson published this piece in her Release 1.0 newsletter in 1994. It's a forward look at how the market for content and digital goods would change with the rise of the internet.
These were the days of CompuServe and AOL, when you had to pay by the hour for access to the net. Software was still sold in a box, still on diskettes, and effectively all media was still consumed in print. Even search engines were in their very early days.
Dyson is prescient here, with some amazingly accurate predictions about how value would shift to different parts of the value chain for digital works:
Intellectual property that can be copied easily likely will be copied. It will be copied so easily and efficiently that much of it will be distributed free in order to attract attention or create desire for follow-up services that can be charged for.
The way to become a leading content provider may be to start by giving your content away. This "generosity" isn't a moral decision: it's a business strategy.
She references what eventually would become the open source movement, SaaS companies, and subscription publishing. It's an astonishingly forward-looking view of how intellectual property would evolve, and it also assumes an internet explosion that wasn't obviously inevitable at the time.
I still would love to see a publication (looking at you, Increment) that curates and republishes historically important articles from the history of computers.
The back issues are all available from O'Reilly as well; they make an excellent historical archive of the early days of the internet era.
Physicians hang diplomas in their waiting rooms. Some fishermen mount their biggest catch. Downstairs in Westborough, it was pictures of computers.
Over the course of a few decades beginning in the mid-40s, computing moved from room-sized mainframes with teletype interfaces to connected panes of glass in our pockets. At breakneck speed, we went from the computer as a massively expensive, extremely specialized tool to a ubiquitous part of daily life.
During the 1950s, the days of Claude Shannon, John von Neumann, and MIT's Lincoln Lab, a "computer" was a batch processing system. Models like the EDVAC were really just massive calculators. It would be another decade before the computer would be thought of as an "interactive" tool, and longer still before that idea became mainstream.
The 60s saw the rise of IBM and its mainframe systems. Moving on from paper tape, time clocks, and tabulating machines, IBM pushed its massive resources into the mainframe computer market. The S/360 dominated the computer industry until the 70s (and beyond with the S/370), when the minicomputer emerged as an interim phase between mainframes and what many computer makers were really pursuing: a personal, low-cost computer.
The emergence of the minicomputer should be considered the beginning of the personal computer revolution. Before minis, computers were touched only by trained operators; they were too complex and expensive for students, secretaries, or hobbyists to use directly. Minis promised something different: a machine a programmer could use interactively. In 1964, DEC shipped the first successful mini, the PDP-8, and from then on computer upstarts were sprouting up all over the country to get into the business.
The DEC PDP-8
One of those companies was Data General, a firm founded in 1968 and the subject of Tracy Kidder's book, The Soul of a New Machine. A group of disaffected DEC engineers, impatient with the company's strategy, left to form Data General and attack the minicomputer market. Founder Edson de Castro, formerly the lead engineer on the PDP-8, thought there was an opportunity DEC was too slow to capitalize on with its minis. So DG designed and brought to market its first offering, the Nova: an affordable, 16-bit machine designed for general computing applications, and one that made DG massively successful in the growing competitive landscape. The Nova and its successor sold like gangbusters into the mid-70s, when DEC brought the legendary VAX "supermini" to market.
DEC's announcement of the VAX and Data General's flagging performance in the middle of that decade provide the backdrop for the book. Kidder's narrative takes you inside the company as it battles for a foothold in the mini market, not only against DEC and the rest of the computer industry, but also with itself.
The VAX was set to be the first 32-bit minicomputer, an enormous upgrade from the prior generation of 16-bit machines. In 1976, Data General spun up a project codenamed "Fountainhead," their big bet to develop a VAX killer, which would be headquartered in a newly-built facility in North Carolina. But back at their New England headquarters, engineer Tom West was already at work leading the Eclipse team in building a successor. So the company ended up with two competing efforts to create a next-generation 32-bit machine.
Data General's Eclipse S230
The book is the story of West's team as they toil against the clock, with limited company resources, to get to market with the "Eagle" (as it was then codenamed) before the competition, and before Fountainhead could ship. As the company's most important new product, Fountainhead had drawn away many of the best engineers, who wanted to be working on the flagship. But the engineers who stayed behind weren't content to iterate on old products; they wanted to build something new:
Some of the engineers who had chosen New England over FHP fell under West's command, more or less. And the leader of the FHP project suggested that those staying behind make a small machine that would solve the 32-bit, logical-address problem and would at the same time exhibit a trait called "software compatibility."
Some of those who stayed behind felt determined to build something elegant. They designed a computer equipped with something called a mode bit. They planned to build, in essence, two different machines in one box. One would be a regular old 16-bit Eclipse, but flip the switch, so to speak, and the machine would turn into its alter ego, into a hot rod: a fast, good-looking 32-bit computer. West felt that the designers were out to "kill North Carolina," and there wasn't much question but that he was right, at least in some cases. Those who worked on the design called this new machine EGO. The individual initials respectively stood one step back in the alphabet from the initials FHP, just as in the movie 2001 the name of the computer that goes berserk, HAL, plays against the initials IBM. The name, EGO, also meant what it said.
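The mode bit is a neat architectural hack, and easy to caricature in code. Here's a toy, entirely hypothetical Python sketch of the "two machines in one box" idea: one instruction stream, two decoders, and a single bit of state choosing between them. None of this reflects Data General's actual microarchitecture; it's just the shape of the trick.

MODE_16, MODE_32 = 0, 1

def exec_16bit(op, state):
    # stand-in for the "regular old" 16-bit Eclipse decoder
    state["acc"] = (state["acc"] + op) & 0xFFFF
    return state

def exec_32bit(op, state):
    # stand-in for the 32-bit "hot rod" decoder
    state["acc"] = (state["acc"] + op) & 0xFFFFFFFF
    return state

def step(op, state):
    """Dispatch one instruction to whichever machine the mode bit selects."""
    decoder = exec_16bit if state["mode"] == MODE_16 else exec_32bit
    return decoder(op, state)

state = {"mode": MODE_16, "acc": 0xFFFF}
step(1, state)            # 16-bit personality: accumulator wraps to 0x0
state["mode"], state["acc"] = MODE_32, 0xFFFF   # "flip the switch"
step(1, state)            # 32-bit alter ego: accumulator becomes 0x10000
print(hex(state["acc"]))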
What proceeded was a team engaged in long hours, nights and weekends, and hard iteration on a new product, racing to market before their larger, much better-funded counterparts down south. As West described it to his team, it was all about getting their hands dirty and working with what they had at hand, the definition of the scrappy upstart:
West told his group that from now on they would not be engaged in anything like research and development but in work that was 1 percent R and 99 percent D.
The pace and intensity of technology companies became culturally iconic during the tech and internet boom of the 1990s: the garage startup living in a house together, working around the clock to build the product, became a signature of the Silicon Valley lifestyle. But the seeds of those trends were planted back in the 70s and 80s, and they're on display with the Westborough team and the Eagle (which eventually went to market as the Eclipse MV/8000). Kidder spent time with the team on-site as they worked on the Eagle project, providing an insider's perspective of life in the trenches with the "Hardy Boys" (who made hardware) and "Microkids" (who wrote software). He observes the team's engineers as they horse-trade for resources. This was a great anecdote, a testament to the autonomy the young engineers had to get the job done however they could manage:
A Microkid wants the hardware to perform a certain function. A Hardy Boy tells him, "No way, I already did my design for microcode to do that." They make a deal: "I'll encode this for you, if you'll do this other function in hardware." "All right."
If you've ever seen the TV series Halt and Catch Fire, this book reads like a direct inspiration for the show's Cardiff Electric team trying to break into the PC business. The Eagle team could stand in for any of the scrappy startups of the 2000s.
It's a surprisingly approachable read given its heavy focus on engineers and the technical nature of their work designing hardware and software. The book won the Pulitzer in 1982 and has become a standard on the shelves of managers and engineers alike. The Soul of a New Machine sparked a deeper interest for me in the history of computers, one that has led to a wave of new reads I'm just getting started on.
In those days, you could always count on business products to have sufficiently boring names.
This is a presentation from Fernando Corbató from the 1990 ACM Turing Award lectures. Corbató was one of the creators of both the Compatible Time-Sharing System (CTSS) at MIT and Multics, the operating system that influenced Unix.
Fernando Corbató at MIT
He describes the challenges of those programs and their novel approaches: you always encounter failure states when pushing the edge. His takeaways from his experience building "ambitious systems":
First, the evolution of technology supports a rich future for ambitious visions and dreams that will inevitably involve complex systems.
Second, one must always try to learn from past mistakes, but at the same time be alert to the possibility that new circumstances require new solutions.
And third, one must remember that ambitious systems demand a defensive philosophy of design and implementation. Or in other words, "Don't wonder if some mishap may happen, but rather ask what one will do about it when it does occur."
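That third point translates directly into code. Here's a small, hypothetical Python sketch of the defensive posture Corbató describes: assume the write *will* fail sometimes, decide up front what happens when it does, and never leave the system worse off. The function and file names are invented for the example.

import json
import os
import tempfile

def save_record(path, record, retries=3):
    """Write a record to disk assuming the write will sometimes fail."""
    data = json.dumps(record)
    for attempt in range(1, retries + 1):
        try:
            # Write to a temp file, then rename into place: a crash or
            # error mid-write never corrupts the existing file.
            fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
            with os.fdopen(fd, "w") as f:
                f.write(data)
                f.flush()
                os.fsync(f.fileno())
            os.replace(tmp, path)  # atomic on POSIX filesystems
            return True
        except OSError as err:
            print(f"write attempt {attempt} failed: {err}")  # the plan, not a surprise
    return False  # and the caller must decide what *it* does on failure

save_record("state.json", {"users": 3, "uptime_s": 1042})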
I love these stories from the early days of computers. Already the creators were encountering problems similar to those we still deal with today:
One important consequence of developing CTSS was that for the first time users had persistent on-line storage of programs and data. Suddenly the issues of privacy, protection and backup of information had to be faced. Another byproduct of the development was that because we operated terminals via modems, remote operation became the norm. Also the new-found freedom of keeping information on-line in the central file system suddenly made it especially convenient for users to share and exchange information among themselves.
And there were surprises too. To our dismay, users who had been enduring several-hour waits between jobs run under batch processing were suddenly restless when response times were more than a second. Moreover, many of the simplifying assumptions that had allowed CTSS to be built so simply, such as a one-level file system, suddenly began to chafe. It seemed like the more we did, the more users wanted.
Boy, does that sound familiar!
Here's a version from the MIT website with his slides included.
In 1972 at Xerox PARC, Butler Lampson wrote this memo to the Xerox leadership making the case to produce the Alto computer from Chuck Thacker's original design. It's amazing to think how significant this piece of communication was to the subsequent progress of personal computers and the internet.
At this stage in '72, the ARPANET had been live for only a couple of years, and Bob Metcalfe's Ethernet design was still in the future. The closest thing to a personal computer at that point was Wes Clark's LINC, designed during his time at Lincoln Lab. Two of the critical points Lampson used as justification concerned networking potential and personal computing:
Distributed computing. We can very easily put in an Aloha-like point-to-point packet network between Alto's, using a coax as the ether (or microwave with a repeater on a hill for home terminals). We can then do a large variety of experiments with dozens of machines. It is easy to try experiments which depend on the independence of the participants as well as those which use specialized components which must cooperate to accomplish anything. In particular, we can set up systems in which each user has his own files and communications is done solely for the interchange of sharable information, and thus shed some light on the long-standing controversy about the merits of this scheme as against centralized files.
Personal computing. If our theories about the utility of cheap, powerful personal computers are correct, we should be able to demonstrate them convincingly on Alto. If they are wrong, we can find out why. We should, for example, be able to satisfy heavy Lisp users such as Warren and Peter with an Alto. This would also take a big computing load away from Maxc. It should also be quite easy to simulate the hardware configuration of other proposed personal computers (e.g., different memory hierarchies) and thus to validate those designs. This is important because more compact machines will require a much larger investment in engineering development and more precise optimization of the memory system.
The "Maxc" refers to PARC's emulation of the PDP-10.
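Lampson's "Aloha-like" network is worth a brief digression: ALOHA-style random access is simple enough to simulate in a few lines. Here's a toy Python model of the slotted variant (a simplification; the memo doesn't specify one) showing the scheme's signature behavior: channel throughput climbs with offered load, peaks near 1/e, then collapses as collisions take over. All parameters are arbitrary assumptions for the example.

import random

def slotted_aloha_throughput(n_stations, p_send, n_slots=100_000, seed=1):
    """Fraction of time slots carrying exactly one transmission (a success)."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        senders = sum(rng.random() < p_send for _ in range(n_stations))
        if senders == 1:   # zero senders = idle slot, two or more = collision
            successes += 1
    return successes / n_slots

for p in (0.01, 0.05, 0.10, 0.20):
    print(f"p={p:.2f}  throughput={slotted_aloha_throughput(10, p):.3f}")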
Pivotal moments can come down to a single well-articulated idea.
Today on the nerdy computer history feed, we've got a 1982 video from Bell Labs: The UNIX System: Making Computers More Productive.
Most of the video has Brian Kernighan explaining the structure of UNIX and why it's different from its contemporary operating systems. I should do more work with the keyboard in my lap and my feet on the desk.
Navigating a Linux shell looks almost identical to this today, 50 years later.
I liked this quote from John Mashey, a computer scientist who worked on UNIX at Bell:
Software is different from hardware. When you build hardware and send it out, you may have to fix it because it breaks, but you don't demand, for example, that your radio suddenly turn into a television. And you don't demand that a piece of hardware suddenly do a completely different function, but people do that with software all of the time. There's a continual demand for changes, enhancements, new features that people find necessary once they get used to a system.
Continuing my dive into the history of computers, I ran across this extended, detailed article covering the development and boom of the minicomputer industry.
A technical piece on restoring Alan Kay's Xerox Alto, which he donated to Y Combinator. It's an amazing piece of technology history, one that inspired so many future developments in computing: graphical user interfaces, WYSIWYG text editing, bitmapped graphics, the mouse, and Ethernet for connectivity.
Xerox built about 2000 Altos for use in Xerox, universities and research labs, but the Alto was never sold as a product. Xerox used the ideas from the Alto in the Xerox Star, which was expensive and only moderately successful. The biggest impact of the Alto was in 1979 when Steve Jobs famously toured Xerox and saw the Alto and other machines. When Jobs saw the advanced graphics of the Alto, he was inspired to base the user interfaces of the Lisa and Macintosh systems on Xeroxâs ideas, making the GUI available to the mass market.
A fun story from Jimmy Maher about the 1991 partnership with IBM that moved Apple from the Motorola 88000 chips to PowerPC. It was a savvy deal that kept the Macintosh (and Apple) alive and kicking long enough to bridge into the transition back to Steve Jobs's leadership, and the eventual move of the Mac lineup to Intel in 2006.
While the journalists reported and the pundits pontificated, it was up to the technical staff at Apple, IBM, and Motorola to make PowerPC computers a reality. Like their colleagues who had negotiated the deal, they all got along surprisingly well; once one pushed past the surface stereotypes, they were all just engineers trying to do the best work possible. Apple's management wanted the first PowerPC-based Macintosh models to ship in January of 1994, to commemorate the platform's tenth anniversary by heralding a new technological era. The old Project Cognac team, now with the new code name of "Piltdown Man" after the famous (albeit fraudulent) "missing link" in the evolution of humanity, was responsible for making this happen. For almost a year, they worked on porting MacOS to the PowerPC, as they'd previously done to the 88000. This time, though, they had no real hardware with which to work, only specifications and software emulators. The first prototype chips finally arrived on September 3, 1992, and they redoubled their efforts, pulling many an all-nighter. Thus MacOS booted up to the desktop for the first time on a real PowerPC-based machine just in time to greet the rising sun on the morning of October 3, 1992. A new era had indeed dawned.
The revenue plunge IBM suffered in the early 90s during the Wintel explosion was stunning. Just look at these numbers:
In 1991, when IBM first turned the corner into loss, they did so in disconcertingly convincing fashion: they lost $2.82 billion that year. And that was only the beginning. Losses totaled $4.96 billion in 1992, followed by $8.1 billion in 1993. IBM lost more money during those three years alone than any other company in the history of the world to that point; their losses exceeded the gross domestic product of Ecuador.
One of the great things about YouTube is being able to find gems of history like Doug Engelbart's "Mother of All Demos" presentation from 1968. How amazing it must've been to see something like this live, 50 years ago:
The live demonstration featured the introduction of a complete computer hardware and software system called the oN-Line System or, more commonly, NLS. The 90-minute presentation essentially demonstrated almost all the fundamental elements of modern personal computing: windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a collaborative real-time editor (collaborative work). Engelbart's presentation was the first to publicly demonstrate all of these elements in a single system. The demonstration was highly influential and spawned similar projects at Xerox PARC in the early 1970s. The underlying technologies influenced both the Apple Macintosh and Microsoft Windows graphical user interface operating systems in the 1980s and 1990s.