Basics of Software

The astonishing flexibility of computers, their ability to perform such an amazing array of tasks, is directly related to their "programmability." Again, the idea behind a computer program is to reduce a human task to a sequence consisting only of the basic arithmetic operations (the "instruction set") that a computer can perform.

Because modern computers are really, at rock bottom, machines that can mimic binary arithmetic, this leads to an astonishing fact: namely, that anything we wish a computer to do must first be boiled down to a sequence of binary arithmetic operations.

In fact, we can go one step further in considering the computer as a binary "number cruncher": even the instructions that make up a computer program must themselves be binary numbers. Why? Because computers are machines built to respond to binary numbers. These sets of 0s and 1s set off "chain reactions" in the two-state devices inside the computer. When these reactions are complete, the two-state devices are left in a state that can be read as a sequence of 0s and 1s. Again, the computer is simply engineered so that this ending state is the "correct" result for the binary numbers that were input. Thus, although we say that we give "instructions" to a computer, it is important to remember that this is far from what we mean when we give instructions to human beings. Computers don't "understand" instructions; they merely react to them.

So, if we want a computer to perform a task, that task must be broken down into steps of binary arithmetic, and these steps themselves must be described in terms of binary numbers. Does this sound tedious? It should. It's mind-boggling to consider that early computer programmers had to write their programs entirely in 1s and 0s and, after running those programs, received only 1s and 0s as results!

This level of computing that consists only of 1s and 0s is called "machine language." And it is still at the heart of computer software.

It's not surprising that computer programmers soon grew exasperated with having to work in nothing but 0s and 1s. It was this frustration that led some ingenious people such as Grace Hopper to develop what are now called "high-level" programming languages. COBOL, FORTRAN, BASIC, Pascal, C, C++, and Java are only a few examples of such languages. The most attractive thing about high-level programming languages is that they are far more similar to human language than they are to instructions consisting merely of 0s and 1s.
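
To get a feel for the difference, here is a short, purely illustrative sketch of a program written in C++ (the numbers, the variable names, and the message it prints are all invented for this example):

    #include <iostream>              // make the language's input/output tools available

    int main() {
        int price = 6;               // store the number 6 under the name "price"
        int tax = 3;                 // store the number 3 under the name "tax"
        int total = price + tax;     // add the two numbers and call the result "total"

        std::cout << "Total: " << total << std::endl;   // display the words "Total: 9"
        return 0;                    // report that the program finished normally
    }

Even a reader who has never programmed can make a reasonable guess at what each line does, which is exactly the appeal of a high-level language; yet not one of these lines can be carried out by the computer until it has been turned into binary machine instructions.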

"But, wait!" you say. "I thought computers could only respond to inputs of 0s and 1s." And you are correct. Any computer programs written in a high-level programming language must be "compiled" - that is, translated into the 0s and 1s that the computer expects. This translation is usually handled by a piece of software called a "compiler." Thus, for example, a programmer in C++ must first execute a program called the "C++ compiler." This translates her C++ program (called "source code") into the instructions that the computer can execute (called "machine language").

The rise of high-level programming languages certainly had a great effect on computing: a much broader range of people were able and willing to write programs in these languages than had ever been willing to program in machine language. These languages -- along with the rising power and falling cost of computers -- fueled a worldwide explosion in the use of computers during the 1960s and 1970s.

Moreover, instead of storing software programs as patterns of punched holes on cards or paper tape, magnetic storage devices were developed - first tapes, and later disks. However, because these devices are still "mechanical" - again, because they rely on moving parts - they are slow compared to the electronic internal components of a computer. This posed a problem for early computers: the computer would finish executing one instruction of a program long before the next one could be loaded from the storage device, and this greatly slowed down the system. For this reason, computers are now designed so that a software program is first read into the computer's memory - which is made up of extremely fast electronic "chips" - and its instructions are then executed from there.
