Monday, July 14, 2008

Inventors of the Modern Computer

FORTRAN - The First Successful High-Level Programming Language - Invented by John Backus and IBM

By Mary Bellis
"I really didn't know what the hell I wanted to do with my life...I said no, I couldn't. I looked sloppy and disheveled. But she insisted and so I did. I took a test and did OK." - John Backus on interviewing for IBM.
FORTRAN, short for FORmula TRANslation, was the first high-level programming language. It was invented by John Backus for IBM in 1954 and released commercially in 1957, and it is still used today for programming scientific and mathematical applications. Fortran began as a digital code interpreter for the IBM 701 and was originally named Speedcoding. John Backus wanted a programming language closer to human language, which is the definition of a high-level language; other high-level languages include Ada, Algol, BASIC, COBOL, C, C++, LISP, Pascal, and Prolog.
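To make "formula translation" concrete, here is a small hypothetical sketch (in C rather than Fortran; C is used for all examples in this post, and the equation and variable names are invented for illustration). The point is what a high-level language buys you: the programmer writes the algebraic formula almost verbatim instead of spelling out individual machine steps.

    #include <math.h>
    #include <stdio.h>

    /* One root of x^2 - 3x + 2 = 0, written as a near-verbatim formula.
       Early high-level languages such as Fortran let scientists write
       expressions like this instead of hand-coding each machine step. */
    int main(void) {
        double a = 1.0, b = -3.0, c = 2.0;
        double root = (-b + sqrt(b * b - 4.0 * a * c)) / (2.0 * a);
        printf("one root: %f\n", root);   /* prints 2.000000 */
        return 0;
    }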
The first generation of code used to program a computer was called machine language or machine code. It is the only language a computer really understands: a sequence of 0s and 1s that the computer's controls interpret, electrically, as instructions. The second generation of code was called assembly language. Assembly language turns the sequences of 0s and 1s into human words like 'add', and is always translated back into machine code by programs called assemblers.
The third generation of code was called high-level language, or HLL, which has human-sounding words and syntax (like words in a sentence). For a computer to understand any HLL, a compiler translates the high-level language into either assembly language or machine code. All programming languages must eventually be translated into machine code for a computer to carry out the instructions they contain.
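As a rough illustration of all three layers, the hypothetical C sketch below interprets numeric instruction codes for an imaginary toy machine. The comments show the assembly-style mnemonic each code stands for and the one high-level statement the whole program amounts to; the opcodes are invented for this example and do not belong to any real machine.

    #include <stdio.h>

    /* Invented opcodes for an imaginary machine. */
    enum { HALT = 0, LOAD = 1, ADD = 2, PRINT = 3 };

    int main(void) {
        /* "Machine code" for the high-level statement:  print(2 + 3)
           Assembly-style view:  LOAD 2 / ADD 3 / PRINT / HALT        */
        int program[] = { LOAD, 2, ADD, 3, PRINT, HALT };

        int acc = 0;                 /* accumulator register */
        int pc  = 0;                 /* program counter      */
        for (;;) {
            switch (program[pc++]) {
            case LOAD:  acc = program[pc++];  break;
            case ADD:   acc += program[pc++]; break;
            case PRINT: printf("%d\n", acc);  break;   /* prints 5 */
            case HALT:  return 0;
            }
        }
    }

A real CPU does this decoding in hardware rather than in a software loop, but the correspondence between numeric codes, mnemonics, and high-level statements is the same.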
John Backus headed the IBM team of researchers at the Watson Scientific Laboratory that invented Fortran. The IBM team included notable scientists such as Sheldon F. Best, Harlan Herrick (who ran the first successful Fortran program), Peter Sheridan, Roy Nutt, Robert Nelson, Irving Ziller, Richard Goldberg, Lois Haibt, and David Sayre. The IBM team didn't invent the HLL or the idea of compiling a programming language into machine code, but Fortran was the first successful HLL, and the Fortran I compiler held the record for translating code for over 20 years. The first computer to run the first compiler was the IBM 704, which John Backus helped design.
Fortran is now over forty years old and, though it has been constantly updated, remains the top language in scientific and industrial programming. The invention of Fortran launched a $24 million computer software industry and spurred the development of other high-level programming languages. Fortran has been used for programming video games, air traffic control systems, payroll calculations, numerous scientific and military applications, and parallel computer research. John Backus won the National Academy of Engineering's 1993 Charles Stark Draper Prize, the highest national prize awarded in engineering, for the invention of Fortran.

BASIC Language:

BASIC (Beginner's All-purpose Symbolic Instruction Code) was written in 1963 at Dartmouth College by mathematicians John George Kemeny and Thomas Kurtz as a teaching tool for undergraduates. BASIC has been one of the most commonly used computer programming languages: a simple language considered an easy step for students to learn before more powerful languages such as FORTRAN.
BASIC's popularity spread in 1975, when Paul Allen and William Gates (both Microsoft founding fathers) wrote a version of BASIC for the Altair personal computer. It was the first product Microsoft sold. Gates and Microsoft later wrote versions of BASIC for the Apple computer, and the DOS that Gates provided for IBM came with its own version of BASIC.

Integrated Circuit (IC)

[Illustration from Jack Kilby's inventor's journal]
"What we didn't realize then was that the integrated circuit would reduce the cost of electronic functions by a factor of a million to one, nothing had ever done that for anything before" - Jack Kilby
It seems that the integrated circuit was destined to be invented. Two separate inventors, unaware of each other's activities, invented almost identical integrated circuits or ICs at nearly the same time.
Jack Kilby, an engineer with a background in ceramic-based silk screen circuit boards and transistor-based hearing aids, started working for Texas Instruments in 1958. A year earlier, research engineer Robert Noyce had co-founded the Fairchild Semiconductor Corporation. From 1958 to 1959, both electrical engineers were working on an answer to the same dilemma: how to make more of less.
In designing a complex electronic machine like a computer, it was always necessary to increase the number of components involved in order to make technical advances. The monolithic (formed from a single crystal) integrated circuit placed the previously separated transistors, resistors, capacitors, and all the connecting wiring onto a single crystal (or 'chip') made of semiconductor material. Kilby used germanium and Noyce used silicon for the semiconductor material.
In 1959 both parties applied for patents. Jack Kilby and Texas Instruments received U.S. patent #3,138,743 for miniaturized electronic circuits. Robert Noyce and the Fairchild Semiconductor Corporation received U.S. patent #2,981,877 for a silicon based integrated circuit. The two companies wisely decided to cross license their technologies after several years of legal battles, creating a global market now worth about $1 trillion a year.
In 1961 the first commercially available integrated circuits came from the Fairchild Semiconductor Corporation. All computers then started to be made using chips instead of the individual transistors and their accompanying parts. Texas Instruments first used the chips in Air Force computers and the Minuteman Missile in 1962. They later used the chips to produce the first electronic portable calculators. The original IC had only one transistor, three resistors and one capacitor and was the size of an adult's pinkie finger. Today an IC smaller than a penny can hold 125 million transistors.
Jack Kilby now holds patents on over sixty inventions and is also well known as the inventor of the portable calculator (1967). In 1970 he was awarded the National Medal of Science. Robert Noyce, with sixteen patents to his name, founded Intel, the company responsible for the invention of the
microprocessor, in 1968. But for both men the invention of the integrated circuit stands historically as one of the most important innovations of mankind. Almost all modern products use chip technology.

Programming Language
A programming language is the syntax, grammar, and symbols or words used to give instructions to a computer.

Development of Low-Level Languages
All computers operate by following machine language programs, a long sequence of instructions called machine code that is addressed to the hardware of the computer and is written in binary notation (see numeration), which uses only the digits 1 and 0. First-generation languages, called machine languages, required the writing of long strings of binary numbers to represent such operations as "add," "subtract," and "compare." Later improvements allowed octal, decimal, or hexadecimal representation of the binary strings.
Because writing programs in machine language is impractical (it is tedious and error prone), second-generation languages, known as symbolic or assembly languages, were introduced in the early 1950s. They use simple mnemonics such as A for "add" or M for "multiply," and a computer program called an assembler translates the mnemonic program into a machine language program. An extension of such a language is the macro instruction, a mnemonic (such as "READ") for which the assembler substitutes a series of simpler mnemonics. The resulting machine language programs, however, are specific to one type of computer and will usually not run on a computer with a different type of central processing unit (CPU).
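Here is a minimal sketch in C of the substitution an assembler performs, assuming a made-up two-entry opcode table (a real assembler also handles operands, addresses, labels, and macro expansion):

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical numeric opcodes for the mnemonics mentioned above. */
    struct { const char *mnemonic; int opcode; } table[] = {
        { "A", 0x1 },   /* add      */
        { "M", 0x2 },   /* multiply */
    };

    int main(void) {
        /* A toy "assembly program": translate each mnemonic to its code. */
        const char *source[] = { "A", "M", "A" };
        for (int i = 0; i < 3; i++)
            for (size_t j = 0; j < sizeof table / sizeof table[0]; j++)
                if (strcmp(source[i], table[j].mnemonic) == 0)
                    printf("%s -> opcode 0x%X\n", source[i], table[j].opcode);
        return 0;
    }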

Evolution of High-Level Languages
The lack of portability between different computers led to the development of high-level languages—so called because they permitted a programmer to ignore many low-level details of the computer's hardware. Further, it was recognized that the closer the syntax, rules, and mnemonics of the programming language could be to “natural language” the less likely it became that the programmer would inadvertently introduce errors (called “bugs”) into the program. Hence, in the mid-1950s a third generation of languages came into use. These algorithmic, or procedural, languages are designed for solving a particular type of problem. Unlike machine or symbolic languages, they vary little between computers. They must be translated into machine code by a program called a compiler or interpreter.
Early computers were used almost exclusively by scientists, and the first high-level language, Fortran [FORmula TRANslation], was developed (1953–57) for scientific and engineering applications by John Backus at the IBM Corp. A language that handled recursive algorithms better, LISP [LISt Processing], was developed by John McCarthy at the Massachusetts Institute of Technology in the late 1950s; implemented in 1959, it has become the standard language of the artificial intelligence community. COBOL [COmmon Business Oriented Language], the first language intended for commercial applications, is still widely used; it was developed by a committee of computer manufacturers and users under the leadership of Grace Hopper, a U.S. Navy programmer, in 1959. ALGOL [ALGOrithmic Language], developed in Europe about 1958, is used primarily in mathematics and science, as is APL [A Programming Language], published in the United States in 1962 by Kenneth Iverson. PL/I [Programming Language 1], developed in the late 1960s by the IBM Corp., and Ada [named for Ada Augusta, countess of Lovelace, who collaborated with Charles Babbage], developed in 1981 for the U.S. Dept. of Defense, are designed for both business and scientific use.
BASIC [Beginner's All-purpose Symbolic Instruction Code] was developed by two Dartmouth College professors, John Kemeny and Thomas Kurtz, as a teaching tool for undergraduates (1966); it subsequently became the primary language of the personal computer revolution. In 1971, Swiss professor Niklaus Wirth developed a more structured language for teaching that he named Pascal (for the French mathematician Blaise Pascal, who built the first successful mechanical calculator). Modula-2, a Pascal-like language for commercial and mathematical applications, was introduced by Wirth in 1982. Ten years before that, to implement the UNIX operating system, Dennis Ritchie of Bell Laboratories produced a language that he called C; along with its extension C++, developed by Bjarne Stroustrup of Bell Laboratories, it has perhaps become the most widely used general-purpose language among professional programmers because of its ability to deal with the rigors of object-oriented programming. Java is an object-oriented language similar to C++ but simplified to eliminate features that are prone to programming errors. Java was developed specifically as a network-oriented language, for writing programs that can be safely downloaded through the Internet and immediately run without fear of computer viruses. Using small Java programs called applets, World Wide Web pages can be developed that include a full range of multimedia functions.
Fourth-generation languages are nonprocedural—they specify what is to be accomplished without describing how. The first one, FORTH, developed in 1970 by American astronomer Charles Moore, is used in scientific and industrial control applications. Most fourth-generation languages are written for specific purposes. Fifth-generation languages, which are still in their infancy, are an outgrowth of
artificial intelligence research. PROLOG [PROgramming LOGic], developed by French computer scientist Alain Colmerauer and logician Philippe Roussel in the early 1970s, is useful for programming logical processes and making deductions automatically.
Many other languages have been designed to meet specialized needs. GPSS [General Purpose System Simulator] is used for modeling physical and environmental events, and SNOBOL [String-Oriented Symbolic Language] is designed for pattern matching and list processing. LOGO, a version of LISP, was developed in the 1960s to help children learn about computers. PILOT [Programmed Instruction Learning, Or Testing] is used in writing instructional software, and Occam is a nonsequential language that optimizes the execution of a program's instructions in
parallel-processing systems.
There are also procedural languages that operate solely within a larger program to customize it to a user's particular needs. These include the programming languages of several database and statistical programs, the scripting languages of communications programs, and the macro languages of
word-processing programs.

COBOL:

The word COBOL is an acronym that stands for COmmon Business Oriented Language. As the expanded acronym indicates, COBOL is designed for developing business, typically file-oriented, applications. It is not designed for writing systems programs: for instance, you would not develop an operating system or a compiler using COBOL.
COBOL is a high-level programming language first developed by the CODASYL Committee (Conference on Data Systems Languages) in 1960. Since then, responsibility for developing new COBOL standards has been assumed by the American National Standards Institute (ANSI).
Three ANSI standards for COBOL have been produced: in 1968, 1974, and 1985. A new COBOL standard introducing object-oriented programming to COBOL is due within the next few years.


...Ravi
