A brief history of programming
In the 17th century, the first calculating machines were invented by Wilhelm Schickard and Blaise Pascal (who created the "Pascaline" in 1642). These mechanical devices were remarkable creations, but they could only perform specific calculations. Arguably the first programmable computer was the Analytical Engine, conceived by Charles Babbage in 1835 but never completed.
Analytical Engine - 1835
With the Analytical Engine, Babbage conceived of a machine that could be programmed to solve any logical or computational problem. This project came to the attention of Lady Ada Lovelace (who, incidentally, was the only legitimate child of Lord Byron). Lovelace became obsessed with the project and wrote notes on programming techniques, sample programs and the potential for programmable machines to play chess and compose music. She is regarded as the world's first computer programmer and is credited with the invention of the programming loop and the subroutine.
Babbage's ideas were conceived in terms of mechanical technology, and it wasn't until a century later that advances in electronic technology enabled many of his ideas to be fully realized.
Z-3, Robinson and Mark I
During the Second World War, the British invested significant resources in the "Ultra" project based at Bletchley Park. This top-secret project used code-breaking machines, designed with the help of Alan Turing, to decode German military messages encoded with the "Enigma" enciphering machine. One such machine, called Robinson, was built in 1940 and is generally regarded as the first operational (although non-programmable) computer.
The first programmable computer was actually built in Germany by Konrad Zuse in 1941. In contrast to the British, the German military apparently overlooked the significance of Zuse's achievements, and his work received only minor support and very little recognition after the war (the original Z-3 machine was destroyed during the war, but a replica is on display at the Deutsches Museum in Munich).
In the US, a team of Harvard and IBM scientists led by Howard Aiken was also working on a programmable computer. This computer, called Mark I, was completed in 1944. The person credited with fully harnessing the power of this programmable computer is Captain Grace Murray Hopper. She was one of the first to recognize the value of reusable libraries of subroutines, is credited with inventing the term 'debug' (after she removed a dead moth stuck in a relay), and wrote the first high-level compiler (A-0). She also led the effort to develop COBOL, a programming language not identified with a particular manufacturer. Among Hopper's many achievements was winning the "Computer Science Man-of-the-Year Award" in 1969 (despite not being a 'man').
Programming Languages
The language a computer can understand (called "machine code") is composed of strings of zeros and ones. The smallest element of a computer's language is called a "bit": 0 or 1. Four bits make a nibble, and two nibbles (8 bits) make a byte. The "words" of a computer's language are the fixed-size groups of bits that encode a single instruction (for example, many computers speak a language with words that are 32 bits long).
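To make these sizes concrete, here is a minimal sketch in Python (the values are invented purely for illustration) showing a nibble, a byte and a 32-bit word written out as bits:

    value = 0b1011            # a nibble: four bits
    byte = 0b10110110         # a byte: eight bits, i.e. two nibbles
    word = 0xCAFEF00D         # a 32-bit word: four bytes

    print(bin(value))         # 0b1011
    print(bin(byte))          # 0b10110110
    print(word.bit_length())  # 32 -- this word needs all 32 bits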
As machine code is extremely difficult to work with, a type of language called "assembly" was soon developed. Using an assembly language, programmers write a series of mnemonics that are then translated by a program (an assembler) into machine code the computer can understand. However, assembly is very similar to machine code in that every procedure has to be spelt out in exact detail, a process that is extremely difficult, slow and prone to errors.
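To see what that translation involves, here is a toy assembler in Python; the instruction set (LOAD, ADD, STORE, HALT) and its one-byte opcodes are invented for the example and do not correspond to any real processor:

    OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

    def assemble(lines):
        # Translate "MNEMONIC operand" lines into machine-code bytes.
        code = bytearray()
        for line in lines:
            parts = line.split()
            code.append(OPCODES[parts[0]])  # look up the opcode byte
            for operand in parts[1:]:
                code.append(int(operand))   # append a one-byte operand
        return bytes(code)

    program = ["LOAD 7", "ADD 5", "STORE 0", "HALT"]
    print(assemble(program).hex())          # 010702050300ff

Real assemblers handle labels, addressing modes and far richer instruction encodings, but the essential job is the same table-driven translation.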
Translators (Compilers and Interpreters)
Grace Hopper is credited with pioneering the idea of a "compiler" to translate a more human-friendly language into the language of the computer. These more human-friendly languages are called "higher-level languages" and were developed to let programmers concentrate on the abstract problem to be solved rather than on all the painful detail required for machine-code or assembly-language programming.
A compiler converts source code written in some high-level language into executable machine code (also called binary code or object code). The resulting machine code can only be understood by a specific processor, such as a Pentium or PowerPC.
An interpreter translates and executes a program one instruction at a time as the program runs, working either from source code or from an intermediate token form. Unlike a compiler, an interpreter does not generate a standalone machine-code program.
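The distinction can be sketched with a tiny made-up language (purely illustrative, not any real interpreter's behaviour): the interpreter below reads and acts on one instruction at a time, whereas a compiler would translate the whole program before any of it runs:

    def interpret(source):
        # Execute a two-instruction "SET/ADD" language line by line.
        variables = {}
        for line in source.splitlines():
            op, name, value = line.split()
            if op == "SET":
                variables[name] = int(value)   # create the variable
            elif op == "ADD":
                variables[name] += int(value)  # update it in place
        return variables

    print(interpret("SET x 40\nADD x 2"))      # {'x': 42}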
High level languages
One of the first higher-level languages to get wide use was FORTRAN, first released in 1957. The language was very good at number crunching but not so good at input and output. COBOL, released soon after, was "designed from the ground up as the language for businessmen" and used "a very English-like grammar"1.
These languages are generally considered to reflect a 'procedural' paradigm of programming. In 1958, John McCarthy at M.I.T. began work on LISP, which went on to become one of the most important languages in the field of "Artificial Intelligence". LISP, which gets its name from LISt Processing, reflects a language model based on recursive functions. Another language, PROLOG, invented in the 1970s, used a model based on 'logic programming' with predicate calculus.
The next most significant language to appear, at least from a Director/Lingo perspective, was Smalltalk, which was developed by Alan Kay at Xerox PARC.
1 http://www.princeton.edu/%7Eferguson/adw/programming_languages.shtml