

The earliest calculating machine was the abacus, believed to have been invented in Babylon around 2400 B.C.E. The abacus was used by many different cultures and civilizations, including the major advance known as the Chinese abacus from the 2nd Century B.C.E.

The Chinese developed the South Pointing Chariot in 115 B.C.E. This device featured a differential gear, later used in modern times to make analog computers in the mid-20th Century.

The Indian grammarian Panini wrote the Ashtadhyayi in the 5th Century B.C.E. In this work he created 3,959 rules of grammar for India’s Sanskrit language. This important work is the oldest surviving linguistic text and introduced the ideas of metarules, transformations, and recursion, all of which have important applications in computer science.

The first true computers were made with intricate gear systems by the Greeks. These computers turned out to be too delicate for the technological capabilities of the time and were abandoned as impractical. The Antikythera mechanism, discovered in a shipwreck in 1900, is an early mechanical analog computer from between 150 B.C.E. and 100 B.C.E. The Antikythera mechanism used a system of 37 gears to compute the positions of the sun and the moon through the zodiac on the Egyptian calendar, and possibly also the fixed stars and five planets known in antiquity (Mercury, Venus, Mars, Jupiter, and Saturn) for any time in the future or past. The system of gears added and subtracted angular velocities to compute differentials. The Antikythera mechanism could accurately predict eclipses and could draw up accurate astrological charts for important leaders. It is likely that the Antikythera mechanism was based on an astrological computer created by Archimedes of Syracuse in the 3rd century B.C.E.

The first digital computers were made by the Inca using ropes and pulleys. Knots in the ropes served the purpose of binary digits. The Inca had several of these computers and used them for tax and government records. In addition to keeping track of taxes, the Inca computers held databases on all of the resources of the Inca empire, allowing for efficient allocation of resources in response to local disasters (storms, drought, earthquakes, etc.). Spanish soldiers acting on orders of Roman Catholic priests destroyed all but one of the Inca computers in the mistaken belief that any device that could give accurate information about distant conditions must be a divination device powered by the Christian “Devil” (and many modern Luddites continue to view computers as Satanically possessed devices).

In the 1800s, the first programmable devices were the machines that controlled weaving looms in the factories of the Industrial Revolution. Created by Joseph Marie Jacquard, these early controllers used punched cards as data storage (the cards contained the control codes for the various patterns). These cards were very similar to the famous Hollerith cards developed later. The first computer programmer was Lady Ada, for whom the Ada programming language is named.

In 1822 Charles Babbage proposed a difference engine for automated calculating. In the 1830s Babbage started work on his Analytical Engine, a mechanical computer with all of the elements of a modern computer, including control, arithmetic, and memory, but the technology of the day couldn’t produce gears with enough precision or reliability to make his computer possible. The Analytical Engine would have been programmed with Jacquard’s punched cards. Babbage also designed the Difference Engine No. 2. Lady Ada Lovelace wrote a program for the Analytical Engine that would have correctly calculated a sequence of Bernoulli numbers, but was never able to test her program because the machine wasn’t built.

George Boole introduced what is now called Boolean algebra in 1854. This branch of mathematics was essential for creating the complex circuits in modern electronic digital computers.
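The connection between Boolean algebra and digital circuits can be sketched with a small example: a half adder, one of the simplest arithmetic circuits, is nothing more than two Boolean operations (XOR for the sum bit, AND for the carry bit). This is an illustrative modern sketch in Python, not period code.

```python
# A half adder expressed as Boolean operations, illustrating how
# Boolean algebra describes digital circuits:
#   sum bit   = A XOR B
#   carry bit = A AND B
def half_adder(a, b):
    total = a != b        # XOR
    carry = a and b       # AND
    return total, carry

# Print the full truth table: every combination of two binary inputs.
for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(int(a), int(b), "->", int(s), int(c))
```

Chaining half adders (plus OR gates for carry propagation) yields full adders and, from there, the arithmetic units of a digital computer.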

In the 1900s, researchers started experimenting with both analog and digital computers using vacuum tubes. Some of the most successful early computers were analog computers, capable of performing advanced calculus problems rather quickly. But the real future of computing was digital rather than analog. Building on the technology and math used for telephone and telegraph switching networks, researchers started building the first electronic digital computers.

The first modern computer was the German Zuse computer (Z3) in 1941. In 1944 Howard Aiken of Harvard University created the Harvard Mark I and Mark II. The Mark I was primarily mechanical, while the Mark II was primarily based on reed relays. Telephone and telegraph companies had been using reed relays for the logic circuits needed for large scale switching networks.

The first modern electronic computer was the ENIAC in 1946, using 18,000 vacuum tubes. See below for information on von Neumann’s important contributions.

The first solid-state (or transistor) computer was the TRADIC, built at Bell Laboratories in 1954. The transistor had previously been invented at Bell Labs in 1947.


Von Neumann architecture

John Louis von Neumann, mathematician (born János von Neumann 28 December 1903 in Budapest, Hungary, died 8 February 1957 in Washington, D.C.), proposed the stored program concept while professor of mathematics (one of the original six) at the Institute for Advanced Study in Princeton, in which programs (code) are stored in the same memory as data. The computer knows the difference between code and data by which it is attempting to access at any given moment. When evaluating code, the binary numbers are decoded by some kind of physical logic circuits (later other methods, such as microprogramming, were introduced), and then the instructions are run in hardware. This design is called von Neumann architecture and has been used in almost every digital computer ever made.
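The fetch-decode-execute cycle described above can be sketched as a toy stored-program machine. This is a deliberately simplified illustration with an invented instruction set (the opcodes 0–3 are hypothetical, chosen for this sketch, not any real machine's): code and data sit in the same memory, and the decode step turns numeric opcodes into actions.

```python
# A toy stored-program machine: code and data share one memory array,
# and a decode step maps numeric opcodes to actions.
# Hypothetical instruction set (two words per instruction: opcode, operand):
#   0 = HALT, 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr
def run(memory, pc=0):
    acc = 0                                 # single accumulator register
    while True:
        op, arg = memory[pc], memory[pc + 1]   # fetch
        pc += 2
        if op == 1:        # LOAD: acc = memory[addr]
            acc = memory[arg]
        elif op == 2:      # ADD: acc += memory[addr]
            acc += memory[arg]
        elif op == 3:      # STORE: memory[addr] = acc
            memory[arg] = acc
        elif op == 0:      # HALT
            return memory

# Program (words 0-7) and data (words 8-10) in one memory:
# load mem[8], add mem[9], store the result into mem[10], halt.
mem = [1, 8, 2, 9, 3, 10, 0, 0, 5, 7, 0]
print(run(mem)[10])  # 12
```

Because instructions are just numbers in memory, the same machine can run any program loaded into it, which is exactly the flexibility the next paragraph describes.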

Von Neumann architecture introduced flexibility to computers. Previous computers had their programming hard wired into the computer. A particular computer could only do one task (at the time, mostly building artillery tables) and had to be physically rewired to do any new task.

By using numeric codes, von Neumann computers could be reprogrammed for a wide variety of problems, with the decode logic remaining the same.

As processors (especially supercomputers) get ever faster, the von Neumann bottleneck is starting to become an issue. With data and code both being accessed over the same circuit lines, the processor has to wait for one while the other is being fetched (or written). Well designed data and code caches help, but only when the requested access is already loaded into cache. Some researchers are now experimenting with Harvard architecture to solve the von Neumann bottleneck. In Harvard architecture, named for Howard Aiken’s experimental Harvard Mark I (ASCC) calculator [computer] at Harvard University, a second set of data and address lines along with a second set of memory are set aside for executable code, removing part of the conflict with memory accesses for data.

Von Neumann became an American citizen in 1937 to be eligible to help on top secret work during World War II. There is a story that Oskar Morgenstern coached von Neumann and Kurt Gödel on the U.S. Constitution and American history while driving them to their immigration interview. Morgenstern asked if they had any questions, and Gödel replied that he had no questions, but had found some logical inconsistencies in the Constitution that he wanted to ask the immigration officers about. Morgenstern recommended that he not ask questions, but just answer them.

Von Neumann occasionally worked with Alan Turing from 1936 through 1938, when Turing was a graduate student at Princeton. Von Neumann was exposed to the concepts of logical design and the universal machine proposed in Turing’s 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem”.

Von Neumann worked with such early computers as the Harvard Mark I, ENIAC, EDVAC, and his own IAS computer.

Early research into computers involved doing the computations to create tables, especially artillery firing tables. Von Neumann was convinced that the future of computers involved applied mathematics to solve specific problems rather than mere table generation. Von Neumann was the first person to use computers for mathematical physics and economics, proving the utility of a general purpose computer.

Von Neumann proposed the concept of stored programs in the 1945 paper “First Draft of a Report on the EDVAC”. Influenced by the idea, Maurice Wilkes of the Cambridge University Mathematical Laboratory designed and built the EDSAC, the world’s first operational, production, stored-program computer.

The first stored computer program ran on the Manchester SSEM (the “Baby”, predecessor of the Manchester Mark 1) on June 21, 1948.

Von Neumann foresaw the advantages of parallelism in computers, but because of construction limitations of the time, he worked on sequential systems.

Von Neumann advocated the adoption of the bit as the measurement of computer memory and solved many of the problems regarding obtaining reliable answers from unreliable computer components.

Interestingly, von Neumann was opposed to the idea of compilers. When shown the idea for FORTRAN in 1954, von Neumann asked “Why would you want more than machine language?”. Von Neumann had graduate students hand assemble programs into binary code for the IAS machine. Donald Gillies, a student at Princeton, created an assembler to do the work. Von Neumann was angry, claiming “It is a waste of a valuable scientific computing instrument to use it to do clerical work”.

Von Neumann also did important work in set theory (including measure theory), the mathematical foundation for quantum theory (including statistical mechanics), self-adjoint algebras of bounded linear operators on a Hilbert space closed in weak operator topology, non-linear partial differential equations, and automata theory (later applied to computers). His work in economics included his 1937 paper “A Model of General Economic Equilibrium” on a multi-sectoral growth model and his 1944 book “Theory of Games and Economic Behavior” (co-authored with Morgenstern) on game theory and uncertainty.

I leave the discussion of von Neumann with a couple of quotations:

 “If people do not believe that mathematics issimple, it is only because they do not realize how complicated life is.”

 “Anyone who considers arithmetical methods ofproducing random numbers is, of course, in a state of sin.”



One solution to this problem was to have programmers prepare their work off-line on some input medium (often on punched cards, paper tape, or magnetic tape) and then hand the work to a computer operator. The computer operator would load up jobs in the order received (with priority overrides based on politics and other factors). Each job still ran one at a time with complete control of the computer, but as soon as a job finished, the operator would transfer the results to some output medium (punched cards, paper tape, magnetic tape, or printed paper) and deliver the results to the appropriate programmer. If the program ran to completion, the result would be some end data. If the program crashed, memory would be transferred to some output medium for the programmer to study (because some of the early business computing systems used magnetic core memory, these became known as “core dumps”).

The concept of computer operators dominated the mainframe era and continues today in large scale operations with large numbers of servers.

Device drivers and library functions

Soon after the first successes with digital computer experiments, computers moved out of the lab and into practical use. The first practical application of these experimental digital computers was the generation of artillery tables for the British and American armies. Much of the early research in computers was paid for by the British and American militaries. Business and scientific applications followed.

As computer use increased, programmers noticed that they were duplicating the same efforts.

Every programmer was writing his or her own routines for I/O, such as reading input from a magnetic tape or writing output to a line printer. It made sense to write a common device driver for each input or output device and then have every programmer share the same device drivers rather than each programmer writing his or her own. Some programmers resisted the use of common device drivers in the belief that they could write “more efficient” or faster or “better” device drivers of their own.

Additionally, each programmer was writing his or her own routines for fairly common and repeated functionality, such as mathematics or string functions. Again, it made sense to share the work instead of everyone repeatedly “reinventing the wheel”. These shared functions would be organized into libraries and could be inserted into programs as needed. In the spirit of cooperation among early researchers, these library functions were published and distributed for free, an early example of the power of the open source approach to software development.

Computer manufacturers started to ship a standard library of device drivers and utility routines with their computers. These libraries were often called a runtime library because programs connected up to the routines in the library at run time (while the program was running) rather than being compiled as part of the program. The commercialization of code libraries ended the widespread free sharing of software.

Manufacturers were pressured to add security to their I/O libraries in order to prevent tampering or loss of data.



In the earliest days of electronic digital computing, everything was done on the bare hardware. Very few computers existed and those that did exist were experimental in nature. The researchers who were making the first computers were also the programmers and the users. They worked directly on the “bare hardware”. There was no operating system. The experimenters wrote their programs in machine or assembly language and a running program had complete control of the entire computer. Often programs and data were entered by hand through the use of toggle switches. Memory locations (both data and programs) could be read by viewing a series of lights (one for each binary digit). Debugging consisted of a combination of fixing both the software and hardware, rewriting the object code and changing the actual computer itself.

The lack of any operating system meant that only one person could use a computer at a time. Even in the research lab, there were many researchers competing for limited computing time. The first solution was a reservation system, with researchers signing up for specific time slots. The earliest billing systems charged for the entire computer and all of its resources (whether used or not) and were based on outside clock time, with billing running from the scheduled start to the scheduled end time.

The high cost of early computers meant that it was essential that the rare computers be used as efficiently as possible. The reservation system was not particularly efficient. If a researcher finished work early, the computer sat idle until the next time slot. If the researcher’s time ran out, the researcher might have to pack up his or her work in an incomplete state at an awkward moment to make room for the next researcher. Even when things were going well, a lot of the time the computer actually sat idle while the researcher studied the results (or studied the memory of a crashed program to figure out what went wrong). Simply loading the programs and data took up some of the scheduled time.


Input output control systems

The first programs directly controlled all of the computer’s resources, including input and output devices. Each individual program had to include code to control and operate each and every input and/or output device used.

One of the first consolidations was placing common input/output (I/O) routines into a common library that could be shared by all programmers. I/O was separated from processing.

These first rudimentary operating systems were called Input Output Control Systems, or IOCS.

Computers remained single user devices, with main memory divided into an IOCS section and a user section. The user section consisted of program, data, and unused memory.

The user remained responsible for both set up and tear down.

Set up included loading data and programs by front panel switches, punched cards, magnetic tapes, paper tapes, disk packs, drum drives, and other early I/O and storage devices. Paper might be loaded into printers, blank cards into card punch machines, and blank or formatted tape into tape drives, or other output devices readied.

Tear down would include unmounting tapes, drives, and other media.

The very expensive early computers sat idle during both set up and tear down.

This waste led to the introduction of less expensive I/O computers. While one I/O computer was being set up or torn down, another I/O computer could be communicating a readied job with the main computer.

Some installations might have several different I/O computers connected to a single main computer to keep the expensive main computer in use. This led to the concept of multiple I/O channels.


As computers spread from the research labs and military uses into the business world, the accountants wanted to keep more accurate counts of time than mere wall clock time.

This led to the concept of the monitor. Routines were added to record the start and end times of work using computer clock time. Routines were added to the I/O library to keep track of which devices were used and for how long.

With the development of the Input Output Control System, these time keeping routines were centralized.

You will notice that the word monitor appears in the name of some operating systems, such as the FORTRAN Monitor System. Even decades later many programmers still refer to the operating system as the monitor.

An important motivation for the creation of a monitor was more accurate billing. The monitor could keep track of actual use of I/O devices and record run time rather than clock time.

For accurate time keeping the monitor had to keep track of when a program stopped running, regardless of whether it was a normal end of the program or some kind of abnormal termination (such as a crash).

The monitor reported the end of a program run or error conditions to a computer operator, who could load the next job waiting, rerun a job, or take other actions. The monitor also notified the computer operator of the need to load or unload various I/O devices (such as changing tapes, loading paper into the printer, etc.).



Some operating systems from the 1950s include: FORTRAN Monitor System, General Motors Operating System, Input Output System, SAGE, and SOS.

SAGE (Semi-Automatic Ground Environment), designed to monitor weapons systems, was the first real time control system.

Batch systems

Batch systems automated the early approach of having human operators load one program at a time. Instead of having a human operator load each program, software handled the scheduling of jobs. In addition to programmers submitting their jobs, end users could submit requests to run specific programs with specific data sets (usually stored in files or on cards). The operating system would schedule “batches” of related jobs. Output (punched cards, magnetic tapes, printed material, etc.) would be returned to each user.

General Motors Operating System, created by General Motors Research Laboratories in early 1956 (or late 1955) for their IBM 701 mainframe, is generally considered to be the first batch operating system and possibly the first “real” operating system.

The operating system would read in a program and its data, run that program to completion (including outputting data), and then load the next program in series as long as there were additional jobs available.

Batch operating systems used a Job Control Language (JCL) to give the operating system instructions. These instructions included designation of which punched cards were data and which were programs, indications of which compiler to use, which centralized utilities were to be run, which I/O devices might be used, estimates of expected run time, and other details.

This type of batch operating system was known as a single stream batch processing system.
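The single-stream idea can be sketched in a few lines of modern Python: jobs wait in a queue in submission order, and each runs to completion before the next one starts. The job names and the use of plain Python callables are invented for this illustration; real systems read jobs from cards or tape under the direction of job-control statements.

```python
# A minimal single-stream batch loop: jobs run one at a time, to
# completion, in the order received. Each job is (name, program, data).
from collections import deque

def batch_run(jobs):
    results = []
    queue = deque(jobs)                      # jobs in submission order
    while queue:
        name, program, data = queue.popleft()
        results.append((name, program(data)))  # run to completion
    return results

# Hypothetical jobs for illustration.
jobs = [
    ("payroll", sum, [100, 250]),
    ("inventory", len, ["bolts", "nuts"]),
]
print(batch_run(jobs))  # [('payroll', 350), ('inventory', 2)]
```

The weakness of this design, discussed below, is that the processor sits idle whenever the one running job waits on I/O.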

Examples of operating systems that were primarily batch-oriented include: BKY, BOS/360, BPS/360, CAL, and Chios.



The early 1960s saw the introduction of time sharing and multi-processing.

Some operating systems from the early 1960s include: Admiral, B1, B2, B3, B4, Basic Executive System, BOS/360, Compatible Timesharing System (CTSS), EXEC I, EXEC II, Honeywell Executive System, IBM 1410/1710 OS, IBSYS, Input Output Control System, Master Control Program, and SABRE.
The first major transaction processing system was SABRE (Semi-Automated Business Research Environment), developed by IBM and American Airlines.



There is a huge difference in speed between I/O and running programs. In a single stream system, the processor remains idle for much of the time as it waits for the I/O device to be ready to send or receive the next piece of data.

The obvious solution was to load up multiple programs and their data and switch back and forth between programs or jobs.

When one job idled to wait for input or output, the operating system could automatically switch to another job that was ready.
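This switch-on-I/O-wait idea (multiprogramming) can be sketched with Python generators standing in for jobs: yielding the string "io" simulates a job blocking on input or output, at which point the system rotates to the next ready job. The job names and the generator convention are inventions of this sketch, not any historical interface.

```python
# Sketch of multiprogramming: when the running job blocks on I/O,
# the system switches to another ready job instead of sitting idle.
def job(name, steps):
    for i in range(steps):
        yield f"{name}: compute step {i}"
        yield "io"                      # simulate blocking on I/O

def multiprogram(jobs):
    log = []
    ready = list(jobs)                  # queue of runnable jobs
    while ready:
        current = ready.pop(0)
        try:
            event = next(current)
            while event != "io":        # run until the job blocks
                log.append(event)
                event = next(current)
            ready.append(current)       # blocked: rotate to next job
        except StopIteration:
            pass                        # job finished; drop it
    return log

print(multiprogram([job("A", 2), job("B", 2)]))
# ['A: compute step 0', 'B: compute step 0', 'A: compute step 1', 'B: compute step 1']
```

The interleaved output shows the processor doing useful work for one job while another waits on its I/O.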



The first operating system to introduce system calls was the University of Manchester’s Atlas I Supervisor.

Time sharing

The operating system could have additional reasons to rotate through jobs, including giving higher or lower priority to various jobs (and therefore a larger or smaller share of time and other resources). The Compatible Timesharing System (CTSS), first demonstrated in 1961, was one of the first attempts at timesharing.
While most of the CTSS operating system was written in assembly language (all previous OSes were written in assembly for efficiency), the scheduler was written in the programming language MAD in order to allow safe and reliable experimentation with different scheduling algorithms. About half of the command programs for CTSS were also written in MAD.
Timesharing is a more advanced version of multiprogramming that gives many users the illusion that they each have complete control of the computer to themselves. The scheduler stops running programs based on a slice of time, moves on to the next program, and eventually returns back to the beginning of the list of programs. In little increments, each program gets its work done in a manner that appears to be simultaneous to the end users.
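The rotating-slice idea above is essentially round-robin scheduling, which can be sketched in a few lines. Each job is modeled simply as a name plus a remaining step count (a simplification invented for this sketch; the quantum of 2 "steps" is likewise arbitrary).

```python
# A round-robin scheduler sketch: each job runs for at most one
# quantum of "time" (here, a count of steps), then goes to the back
# of the queue, giving the illusion of simultaneous progress.
from collections import deque

def round_robin(jobs, quantum=2):
    order = []                          # (job name, steps run) per turn
    queue = deque(jobs)                 # each job is (name, remaining steps)
    while queue:
        name, remaining = queue.popleft()
        ran = min(quantum, remaining)   # run up to one quantum
        order.append((name, ran))
        if remaining > ran:
            queue.append((name, remaining - ran))  # not done: requeue
    return order

print(round_robin([("editor", 3), ("compile", 5)]))
# [('editor', 2), ('compile', 2), ('editor', 1), ('compile', 2), ('compile', 1)]
```

With a small enough quantum and a fast enough processor, each user of the interleaved schedule perceives continuous progress.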



Some operating systems from the mid-1960s include: Atlas I Supervisor, DOS/360, Input Output Selector, Master Control Program, and Multics.
The Atlas I Supervisor introduced spooling, interrupts, and virtual memory paging (16 pages) in 1962. Segmentation was introduced on the Burroughs B5000. MIT’s Multics combined paging and segmentation.
The Compatible Timesharing System (CTSS) introduced email.



Some operating systems from the late-1960s include: BPS/360, CAL, CHIPPEWA, EXEC 3, EXEC 4, EXEC 8, GECOS III, George 1, George 2, George 3, George 4, IDASYS, MASTER, Master Control Program, OS/MFT, OS/MFT-II, OS/MVT, OS/PCP, and RCA DOS.



In 1968 a group of scientists and engineers from Mitre Corporation (Bedford, Massachusetts) created the Viatron Computer company and an intelligent data terminal using an 8-bit LSI microprocessor made with PMOS technology. A year later in 1969 Viatron created the 2140, the first 4-bit LSI microprocessor. At the time MOS was used only for a small number of calculators and there simply wasn’t enough worldwide manufacturing capacity to build these computers in quantity.
Other companies saw the benefit of MOS, starting with Intel’s 1971 release of the 4-bit 4004 as the first commercially available microprocessor. In 1972 Rockwell released the PPS-4 microprocessor, Fairchild released the PPS-25 microprocessor, and Intel released the 8-bit 8008 microprocessor. In 1973 National released the IMP microprocessor.
In 1974 Intel released the faster NMOS 8080 8-bit microprocessor, the first in a long series of microprocessors that led to the current Pentium.
In 1974 Motorola released the 6800, which included two accumulators, index registers, and memory-mapped I/O. Monolithic Memories introduced bit-slice microprocessing. In 1975 Texas Instruments introduced a 4-bit slice microprocessor and Fairchild introduced the F-8 microprocessor.



Some operating systems from the early-1970s include: BKY, Chios, DOS/VS, Master Control Program, OS/VS1, and UNIX.
In 1970 Ken Thompson of AT&T Bell Labs suggested the name “Unix” for the operating system that had been under development since 1969. The name was an intentional pun on AT&T’s earlier Multics project (uni- means “one”, multi- means “many”).


UNIX takes over mainframes
I am skipping ahead to the development and spread of UNIX, not because the early history isn’t interesting, but because I notice that a lot of people are searching for information on UNIX history.
UNIX was originally developed in a laboratory at AT&T’s Bell Labs (now an independent corporation known as Lucent Technologies). At the time, AT&T was prohibited from selling computers or software, but was allowed to develop its own software and computers for internal use. A few newly hired engineers were unable to get valuable mainframe computer time because of lack of seniority and resorted to writing their own operating system (UNIX) and programming language (C) to run on an unused mini-computer.
The computer game Space Travel was originally written by Ken Thompson for Multics. When AT&T pulled out of the Multics project, Thompson ported the program to FORTRAN running on GECOS on the GE 635. Thompson and Dennis Ritchie then ported the game to DEC PDP-7 assembly language. The process of porting the game to the PDP-7 computer was the beginning of Unix.
Unix was originally called UNICS, for Uniplexed Information and Computing Service, a play on words variation of Multics, Multiplexed Information and Computing Service.
AT&T’s consent decree with the U.S. Justice Department on monopoly charges was interpreted as allowing AT&T to release UNIX as an open source operating system for academic use. Ken Thompson, one of the originators of UNIX, took UNIX to the University of California, Berkeley, where students quickly started making improvements and modifications, leading to the world famous Berkeley Software Distribution (BSD) form of UNIX.
UNIX quickly spread throughout the academic world, as it solved the problem of keeping track of many (sometimes dozens) of proprietary operating systems on university computers. With UNIX all of the computers from many different manufacturers could run the same operating system and share the same programs (recompiled on each processor).
When AT&T settled yet another monopoly case, the company was broken up into “Baby Bells” (the regional companies operating local phone service) and the central company (which had the long distance business and Bell Labs). AT&T (as well as the Baby Bells) was allowed to enter the computer business. AT&T gave academia a specific deadline to stop using “encumbered code” (that is, any of AT&T’s source code anywhere in their versions of UNIX).
This led to the development of free open source projects such as FreeBSD, NetBSD, and OpenBSD, as well as commercial operating systems based on the BSD code.
Meanwhile, AT&T developed its own version of UNIX, called System V. Although AT&T eventually sold off UNIX, this also spawned a group of commercial operating systems known as Sys V UNIXes.
UNIX quickly swept through the commercial world, pushing aside almost all proprietary mainframe operating systems. Only IBM’s MVS and DEC’s OpenVMS survived the UNIX onslaught.
“Vendors such as Sun, IBM, DEC, SCO, and HP modified Unix to differentiate their products. This splintered Unix to a degree, though not quite as much as is usually perceived. Necessity being the mother of invention, programmers have created development tools that help them work around the differences between Unix flavors. As a result, there is a large body of software based on source code that will automatically configure itself to compile on most Unix platforms, including Intel-based Unix.
Regardless, Microsoft would leverage the perception that Unix is splintered beyond hope, and present Windows NT as a more consistent multi-platform alternative.” —Nicholas Petreley, “The new Unix alters NT’s orbit”, NC World


UNIX to the desktop

Among the early commercial attempts to deploy UNIX on desktop computers was AT&T selling UNIX in an Olivetti box. Microsoft sold its own version of UNIX, called Xenix. Apple Computer offered its A/UX version of UNIX running on Macintoshes. None of these early commercial UNIXes was successful. “Unix started out too big and unfriendly for the PC. … It sold like ice cubes in the Arctic. … Wintel emerged as the only ‘safe’ business choice”, Nicholas Petreley.

“Unix had a limited PC market, almost entirely server-centric. SCO made money on Unix, some of it even from Microsoft. (Microsoft owns 11 percent of SCO, but Microsoft got the better deal in the long run, as it collected money on each unit of SCO Unix sold, due to a bit of code in SCO Unix that made SCO somewhat compatible with Xenix. The arrangement ended in 1997.)” —Nicholas Petreley, “The new Unix alters NT’s orbit”, NC World
To date, the most widely used desktop version of UNIX is Apple’s Mac OS X, combining the groundbreaking object-oriented NeXT technology with some of the user interface of the Macintosh.



Some operating systems from the mid-1970s include: CP/M and Master Control Program.
In 1973 the kernel of Unix was rewritten in the C programming language. This made Unix the world’s first portable operating system, capable of being easily ported (moved) to any hardware. This was a major advantage for Unix and led to its widespread use in the multi-platform environments of colleges and universities.


Late 1970s

Some operating systems from the late-1970s include: EMAS 2900, General Comprehensive OS, VMS (later renamed OpenVMS), and OS/MVS.



Some operating systems from the 1980s include: AmigaOS, DOS/VSE, HP-UX, Macintosh, MS-DOS, and ULTRIX.
The 1980s saw the commercial release of the graphic user interface, most famously on the Apple Macintosh, Commodore Amiga, and Atari ST, followed by Microsoft’s Windows.



Some operating systems from the 1990s include: BeOS, BSDi, FreeBSD, NeXT, OS/2, Windows 95, Windows 98, and Windows NT.



Some operating systems from the 2000s include: Mac OS X, Syllable, Windows 2000, Windows Server 2003, Windows ME, and Windows XP.



Timeline Notes

In addition to listing the years that various operating systems were introduced, this timeline also includes information on supporting technologies to give better context.

Year: Year that items were introduced.
Operating Systems: Operating systems introduced in that year.
Programming Languages: Programming languages introduced. While only a few programming languages are appropriate for operating system work (such as Ada, BLISS, C, FORTRAN, and PL/I), the programming languages available with an operating system greatly influence the kinds of application programs available for an operating system.
Computers: Computers and processors introduced. While a few operating systems run on a wide variety of computers (such as UNIX and Linux), most operating systems are closely or even intimately tied to their primary computer hardware. Speed listings in parentheses are in operations per second (OPS), floating point operations per second (FLOPS), or clock speed (Hz).
Software: Software programs introduced. Some major application programs that became available. Often the choice of operating system and computer was made by the need for specific programs or kinds of programs.
Games: Games introduced. It may seem strange to include games in the time line, but many of the advances in computer hardware and software technologies first appeared in games. As one famous example, the roots of UNIX were the porting of an early computer game to new hardware.
Technology: Major technology advances, which influence the capabilities and possibilities for operating systems.



Computers: Zuse Z1 (Germany, 1 OPS, first mechanical programmable binary computer, storage for a total of 64 numbers stored as 22-bit floating point numbers with a 7-bit exponent, 15-bit significand [one implicit bit], and sign bit)
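The sign/exponent/significand layout described above can be illustrated with a short sketch. This is a generic decoder for that field layout, not the Z1's exact encoding; in particular, the exponent bias of 63 and the unsigned exponent field are illustrative assumptions for demonstration only.

```python
# Generic decoder for a 22-bit float laid out as:
#   [1 sign bit][7 exponent bits][14 stored significand bits]
# with one implicit leading significand bit (giving 15 significand
# bits total, as in the Z1 entry above).
# NOTE: the bias of 63 and the unsigned exponent field are assumptions
# for illustration, not the Z1's documented encoding.

EXP_BITS = 7
SIG_BITS = 14   # stored bits; the 15th significand bit is implicit
BIAS = 63       # assumed bias, for demonstration only

def decode(word: int) -> float:
    sign = (word >> (EXP_BITS + SIG_BITS)) & 1
    exponent = (word >> SIG_BITS) & ((1 << EXP_BITS) - 1)
    stored = word & ((1 << SIG_BITS) - 1)
    significand = 1 + stored / (1 << SIG_BITS)  # implicit leading 1
    return (-1) ** sign * significand * 2 ** (exponent - BIAS)

# With this layout, an exponent field of 63 (the assumed bias) and an
# all-zero stored significand encode 1.0:
print(decode(63 << 14))                 # 1.0
print(decode((1 << 21) | (63 << 14)))   # -1.0 (sign bit set)
```

The same decode-by-shifting-and-masking pattern applies to the other word formats listed later in this timeline (for example the UNIVAC single- and double-precision layouts); only the field widths and bias change.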



Computers: Atanasoff-Berry Computer; Zuse Z3 (Germany, 20 OPS, added floating point exceptions, plus and minus infinity, and undefined)
Computers: work started on Zuse Z4
Computers: Harvard Mark I (U.S.); Colossus 1 (U.K., 5 kOPS)
Computers: Colossus 2 (U.K., single processor, 25 kOPS)
Programming Languages: Plankalkül (Plan Calculus)
Computers: Zuse Z4 (relay-based computer, first commercial computer)
Computers: UPenn ENIAC (5 kOPS); Colossus 2 (parallel processor, 50 kOPS)
Technology: electrostatic memory

Computers: IBM SSEC; Manchester SSEM
Technology: random access memory; magnetic drums; transistor

Computers: Manchester Mark 1
Technology: registers


Computers: Ferranti Mark 1 (first commercial computer); Leo I (first business computer); UNIVAC I; Whirlwind


Programming Languages: A-0; first version of FORTRAN
Computers: UNIVAC 1101; IBM 701
Games: OXO (a graphic version of Tic-Tac-Toe created by A.S. Douglas on the EDSAC computer at the University of Cambridge to demonstrate ideas on human-computer interaction)


Computers: Strela


Programming Languages: Mark I
Computers: IBM 650; IBM 704 (vacuum tube computer with floating point);IBM NORC (67 kOPS)
Technology: magnetic core memory

Operating Systems: GMOS (General Motors OS for IBM 701)
Computers: Harwell CADET
Operating Systems: GM-NAA I/O
Computers: IBM 305 RAMAC; MIT TX-0 (83 kOPS)
Technology: hard disk
Computers: IBM 608
Programming Languages: FORTRAN
Technology: dot matrix printer
Operating Systems: UMES
Programming Languages: ALGOL 58; LISP
Computers: UNIVAC II; IBM AN/FSQ-7 (400 kOPS)
Games: Tennis For Two (developed by William Higinbotham using an oscilloscope and an analog computer)
Technology: integrated circuit
Operating Systems: SHARE
Computers: IBM 1401




Operating Systems: IBSYS
Programming Languages: COBOL
Computers: DEC PDP-1; CDC 1604; UNIVAC LARC (250 kFLOPS)

Operating Systems: CTSS, Burroughs MCP
Games: Spacewar! (created by group of M.I.T. students on the DEC PDP-1)
Computers: IBM 7030 Stretch (1.2 MFLOPS)
Operating Systems: GECOS
Programming Languages: APL, SIMULA

Computers: ATLAS; UNIVAC 1100/2200 (introduced two floating point formats, single precision and double precision; single precision: 36 bits, 1-bit sign, 8-bit exponent, and 27-bit significand; double precision: 72 bits, 1-bit sign, 11-bit exponent, and 60-bit significand); IBM 7094 (followed the UNIVAC, also had single and double precision numbers)

Computers: DEC PDP-6
Technology: mouse

Operating Systems: DTSS, TOPS-10
Programming Languages: BASIC, PL/I
Computers: IBM 360; DEC PDP-8; CDC 6600 (first supercomputer, scalar processor, 3 MFLOPS)
Technology: super computing


Operating Systems: OS/360; Multics
Technology: time-sharing; fuzzy logic; packet switching; bulletin boardsystem (BBS); email
Programming Languages: ISWIM, Logo
Computers: BESM-6
Operating Systems: ITS; CP/CMS; WAITS
Computers: DEC PDP-10
Technology: microprocessor; interactive computing (including mouse,windows, hypertext, and fullscreen word processing)
Operating Systems: ACP; TENEX/TOPS-20; work started on Unix
Programming Languages: Smalltalk
Computers: CDC 7600 (36 MFLOPS)

Games: Space Travel (written by Ken Thompson for Multics; when AT&T pulled out of the Multics project, Thompson ported the program to FORTRAN running on GECOS on the GE 635; it was then ported by Thompson and Dennis Ritchie into PDP-7 assembly language; the process of porting the game to the PDP-7 computer was the beginning of Unix)

Technology: ARPANET (military/academic precursor to the Internet); RS-232; networking; laser printer (invented by Gary Starkweather at Xerox)



Operating Systems: Unix; RT-11; RSTS-11
Programming Languages: Pascal; Prolog
Computers: Datapoint 2200; DEC PDP-11
Technology: dynamic RAM; flight data processor


Computers: Intel 4004 (microprocessor)
Games: Computer Space (first commercial video game)
Technology: floppy disk; first electronic calculator (TI)


Operating Systems: VM/CMS
Programming Languages: C
Computers: Intel 8008 (microprocessor); Rockwell PPS-4 (microprocessor);Fairchild PPS-25 (microprocessor)
Games: Pong
Technology: game console (Magnavox Odyssey); first scientific calculator(HP); first 32-bit minicomputer; first arcade video game

Computers: National IMP (microprocessor)
Technology: TCP/IP; ethernet
Operating Systems: MVS
Programming Languages: SQL
Computers: Intel 8080 (microprocessor); Motorola 6800 (microprocessor);CDC STAR-100 (100 MFLOPS)
Programming Languages: Scheme
Computers: Altair 8800 (first personal computer); Fairchild F-8 (microprocessor); MOS Technology 6502 (microprocessor); Burroughs ILLIAC IV (150 MFLOPS)
Technology: single board computer; laser printer (commercial release by IBM)
Operating Systems: CP/M
Computers: Zilog Z-80 (microprocessor); Cray 1 (250 MFLOPS); Apple I
Technology: inkjet printer; Alan Kay’s Xerox NoteTaker developed at Xerox PARC

Programming Languages: OPS5; FP
Computers: DEC VAX-11; Apple II; TRS-80; Commodore PET; Cray 1A
Operating Systems: Apple DOS 3.1; VMS (later renamedOpenVMS)
Programming Languages: CSP
Computers: Intel 8086 (microprocessor)
Games: Space Invaders (arcade game using raster graphics)
Technology: LaserDisc
Programming Languages: REXX; work started on C with Classes (later renamed C++); VisiCalc
Computers: Motorola MC68000 (microprocessor); Intel 8088(microprocessor)
Games: Lunar Lander (arcade video game, first to use vector graphics);Asteroids (vector arcade game); Galaxian (raster arcade game, color screen)
Technology: first spreadsheet; object oriented programming; compact disc; Usenet discussion groups



Operating Systems: OS-9
Programming Languages: dBASE-II; Smalltalk-80
Computers: Commodore VIC-20; ZX80; Apple III

Games: Battlezone (vector arcade video game, dual joystick controller and periscope-like viewer); Berzerk (raster arcade video game, used primitive speech synthesis); Centipede (raster arcade video game, used trackball controller); Missile Command (raster arcade video game, used trackball controller); Defender (raster arcade video game); Pac-Man (raster arcade video game); Phoenix (raster arcade video game, use of musical score); Rally-X (raster arcade video game, first game to have a bonus round); Star Castle (vector arcade video game, color provided by transparent plastic screen overlay); Tempest (vector arcade video game, first color vector game); Wizard of Wor (raster arcade video game)
Operating Systems: MS-DOS; Pilot
Computers: 8010 Star; ZX81; IBM PC; Osborne 1 (first portable computer);Xerox Star; MIPS I (microprocessor); CDC Cyber 205 (400 MFLOPS)
Games: Donkey Kong (raster arcade video game); Frogger (raster arcade video game); Scramble (raster arcade video game, horizontal scrolling); Galaga (raster arcade video game); Ms. Pac-Man (raster arcade video game); Qix (raster arcade video game); Gorf (raster arcade video game, synthesized speech)
Technology: portable PC; ISA bus; CGA video card


Operating Systems: SunOS
Computers: Cray X-MP; BBC Micro; Commodore C64; Compaq Portable; ZX Spectrum; Atari 5200; Intel 80286 (microprocessor)

Games: BurgerTime (raster arcade video game); Dig Dug (raster arcade video game); Donkey Kong Junior (raster arcade video game); Joust (raster arcade video game); Moon Patrol (raster arcade video game, first game with parallax scrolling); Pole Position (raster arcade video game); Q*bert (raster arcade video game); Robotron 2084 (raster arcade video game, dual joystick); Time Pilot (raster arcade video game); Tron (raster arcade video game); Xevious (raster arcade video game, first game promoted with a TV commercial); Zaxxon (raster arcade video game, first game to use axonometric projection)

Technology: MIDI; RISC; IBM PC compatibles