"Personal" Computing

In 1971, Intel released the Intel 4004, the first commercially available microprocessor: for the first time, all of the transistors and circuits of a CPU could be placed on a single silicon chip. The 4004 contained 2,300 transistors and had roughly as much computing power as the ENIAC, and yet it was only about the size of a small postage stamp!

By the late 1970s, computer processors and other hardware had dropped in price and shrunk in size to such an extent that it was becoming feasible for individuals to own their own computers, an idea that had been completely unthinkable in previous decades.

Intel, the company whose first microprocessors drove many of these early so-called microcomputers, had considered marketing a personal computer. However, it couldn't come up with an answer to the obvious question: Who would buy such a thing, and what would they do with it? At the time, computers were still primarily considered to be number-crunchers, and it seemed that few individuals really needed that much computation done. Computers were seen as "industrial strength" machinery, and, to a lot of people, suggesting that there might be a market for a "personal" computer made about as much sense as trying to sell the average person their own bulldozer!

Another part of the problem was that although computer hardware had become smaller, cheaper, and more widely available, computer software was still largely "custom-made." In other words, if you wanted to use a computer, that usually meant programming it yourself (or hiring someone else to do it for you). For most tasks, there was no "off-the-rack," ready-made solution. There was no real software industry, apart from the folks who were selling programming languages.

Even so, companies began marketing "microcomputers." In 1977, Apple introduced the enormously popular Apple II. In 1981, IBM responded by releasing the IBM PC. Both companies sold many of these early machines. But to whom were they selling them? What sort of "person" was implied in the phrase personal computer?

The answer was this: by and large, those purchasing microcomputers in the late 1970s and early 1980s were programmers and hobbyists. Again, programming was largely a matter of necessity: chances were, you couldn't go out and buy the software you needed. If you wanted to use the computer, you probably had to program it yourself, in a language such as BASIC.

It would take a young programmer named Bill Gates to foresee a market for software itself.
