Computer Programming Languages (Part 1) [Archives:2003/654/Education]

July 28, 2003

By Akram Yahia Baker
APTECH, Sana'a Center

Ever since Charles Babbage devised his difference engine in 1822, computers have required a means of instructing them to perform a specific task. This means is known as a programming language. Computer languages were at first composed of a series of steps for wiring a particular program into the machine; these morphed into a series of steps keyed into the computer and then executed, and later these languages acquired advanced features such as logical branching and object orientation. The computer languages of the last fifty years have come in two stages: the first major languages, and the second major languages, which are in use today.
In the beginning, Charles Babbage's difference engine could only be made to execute tasks by changing the gears that executed the calculations. Thus, the earliest form of a computer language was physical motion. Eventually, physical motion was replaced by electrical signals when the US government's ENIAC was completed in 1945. It followed many of the same principles as Babbage's engine and hence could only be “programmed” by presetting switches and rewiring the entire system for each new “program” or calculation. This process proved to be very tedious.
In 1945, John Von Neumann, who was working at the Institute for Advanced Study, developed two important concepts that directly affected the path of computer programming languages. The first was known as the “shared-program technique” (www.softlord.com). This technique held that the actual computer hardware should be simple and need not be hand-wired for each program. Instead, complex instructions should be used to control the simple hardware, allowing it to be reprogrammed much faster.
The second concept was also extremely important to the development of programming languages. Von Neumann called it “conditional control transfer” (www.softlord.com). This idea gave rise to the notion of subroutines: small blocks of code that can be jumped to in any order, instead of a single set of chronologically ordered steps for the computer to take. The second part of the idea stated that computer code should be able to branch based on logical statements, such as IF (expression) THEN, and to loop, as with a FOR statement. “Conditional control transfer” also gave rise to the idea of “libraries”, blocks of code that can be reused over and over; the short sketch below illustrates both ideas.
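To make these concepts concrete, here is a minimal sketch in FORTRAN 77 notation (a descendant of the FORTRAN discussed below, not code from Von Neumann's era); the program and subroutine names are invented for illustration:

C     Illustrative only: a reusable subroutine plus a conditional
C     branch and a loop, the two ideas described above
      PROGRAM DEMO
      INTEGER I
C     A loop: repeat the labelled block five times
      DO 10 I = 1, 5
         CALL REPORT(I)
   10 CONTINUE
      END

C     A subroutine: a block of code that can be jumped to from
C     anywhere and that returns to its caller when done
      SUBROUTINE REPORT(N)
      INTEGER N
C     Conditional control transfer: branch on a logical test
      IF (N .GT. 3) THEN
         PRINT *, N, ' IS GREATER THAN 3'
      ELSE
         PRINT *, N, ' IS 3 OR LESS'
      END IF
      END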
In 1949, a few years after Von Neumann's work, the language Short Code appeared (www.byte.com). It was the first computer language for electronic devices, and it required the programmer to change its statements into 0's and 1's by hand. Still, it was the first step towards the complex languages of today. In 1951, Grace Hopper wrote the first compiler, A-0 (www.byte.com). A compiler is a program that turns a language's statements into 0's and 1's for the computer to understand. This led to faster programming, as the programmer no longer had to do the work by hand.
Hopper, who was working for Remington Rand at the time, had begun design work on A-0, the first widely known compiler, in 1951; when Rand released the language in 1957, it was called MATH-MATIC.
In 1952 Alick E. Glennie, in his spare time at the University of Manchester, devised a programming system called AUTOCODE, a rudimentary compiler.
In 1957, the first of the major languages appeared in the form of FORTRAN. Its name stands for FORmula TRANslating system. The language was designed at IBM for scientific computing. Its components were very simple and gave the programmer low-level access to the machine. Today, this language would be considered restrictive, as it included only IF, DO, and GOTO statements, but at the time these commands were a big step forward. The basic data types in use today got their start in FORTRAN. These included logical variables (TRUE or FALSE) and integer, real, and double-precision numbers; the sketch below shows them in FORTRAN notation.
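As a hedged sketch, written in later FORTRAN 77 notation rather than the original 1957 dialect, the following fragment shows the data types and the IF, DO, and GOTO statements mentioned above; the variable names are invented:

      PROGRAM EARLY
C     The basic data types that trace back to FORTRAN
      LOGICAL FLAG
      INTEGER COUNT
      REAL X
      DOUBLE PRECISION PI
      FLAG = .TRUE.
      X = 2.5
      PI = 3.141592653589793D0
      COUNT = 0
C     DO loop: repeat the labelled block ten times
      DO 30 I = 1, 10
         COUNT = COUNT + 1
   30 CONTINUE
C     Logical IF with a GOTO: skip the next statement when FLAG holds
      IF (FLAG) GO TO 40
      PRINT *, 'REACHED ONLY IF FLAG IS FALSE'
   40 PRINT *, 'COUNT = ', COUNT
      END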
To be continued next week
——
[archive-e:654-v:13-y:2003-d:2003-07-28-p:education]