1. What is an Operating System? A Historical Perspective
First Generation Computers (1949-1956)
- Physical access to the machine was required. A single user writes a program and operates the computer through the console (which could be as simple as a wiring panel). At any time, only one user uses and owns the machine.
- The entire address space belongs to the user program of the user currently owning the machine.
- Absolute machine language (branches target absolute addresses): compilers are simple, as there is no need for relocatable code.
- I/O devices - development of libraries and functions to control and access I/O devices.
- Memory space - divided between system libraries and the user program.
Second Generation Computers (1956-1963)
- Computer infrastructure in those days was expensive, so utilization of the computer had to be maximized. Most time was wasted on programming and on setting up the computer.
- Observation: those times needed to be reduced. Solutions:
- Separate the programming from the operation: Don’t develop and enter the program on the machine itself. Instead, move the programming offline to a medium that can be easily read by the computer.
- Automate the Load/Translate/Load/Execute process
- New concepts
- This separated the programmer and user from the computer operator, whose responsibility was to ensure the smooth operation of the computing system.
- Computers now started to execute batches of programs, where at any point in time there was a queue of jobs waiting to be executed, each possibly from a different user and most likely with its own user program to be run.
- Special programs had to be run on the system to sequence and handle the incoming stream of jobs (Monitors).
- To separate programming from the operation, the program had to be stored on a medium that could be easily read by the computer. A popular medium was the punch card.
- New memory organization
- The monitor contains the code for the device drivers, the code to sequence, load, and execute a stream of user programs, and the control card interpreter, which understands the job control and the job entry control commands.
- Monitor loads and executes user programs, no longer the user's job to do it.
- Monitor stays resident in memory after user programs execute, for example, to mediate access to devices, or to load the next program.
- The monitor program must be protected from the user programs to ensure that misbehaving user programs don’t affect the execution of subsequent user programs.
Third Generation Computers (1964-1973)
- Key Observations
- I/O devices are very slow compared to the speed of computers
- I/O operations are very expensive
- Ex: The CPU might wait for an I/O device to return, and thus it idles (self-suspension) when it could instead start job2 and job3.
- Multiprogramming: use idle time created by self-suspension to execute other workloads
- We need to have several programs ready to execute
- We need to keep several programs in memory
- Therefore, the memory space changes. Rather than dividing memory into monitor and user-program areas, the system now has to separate it into monitor memory and a separate portion of memory for each job that the system decides to keep in memory.
- Issues with multiprogramming
- How to schedule the jobs
- How to allocate memory to the jobs
- How to protect jobs from each other
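The idle-time argument above can be illustrated with a toy calculation (the job timings here are hypothetical, purely for illustration): when one job blocks on I/O, the CPU can run another job instead of idling, so total turnaround time shrinks toward the total CPU demand.

```python
# Toy illustration of multiprogramming (hypothetical timings).
# Each job has a CPU burst followed by an I/O wait, during which
# the CPU is free to run another job.

jobs = {
    "job1": {"cpu": 4, "io": 6},
    "job2": {"cpu": 3, "io": 5},
    "job3": {"cpu": 2, "io": 4},
}

# Uniprogramming: the CPU idles through every I/O wait.
serial_time = sum(j["cpu"] + j["io"] for j in jobs.values())

# Ideal multiprogramming: I/O waits overlap with other jobs' CPU
# bursts, so the best case is bounded by the larger of the total
# CPU demand and the longest single job.
total_cpu = sum(j["cpu"] for j in jobs.values())
longest_job = max(j["cpu"] + j["io"] for j in jobs.values())
overlapped_time = max(total_cpu, longest_job)

print(f"serial: {serial_time}, overlapped (ideal): {overlapped_time}")
# serial: 24, overlapped (ideal): 10
```

With these numbers, overlapping cuts elapsed time from 24 units to 10, which is why multiprogramming was worth the added complexity of scheduling, memory allocation, and protection.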
- Time-Sharing (the mid-1960s on) - system designers were adding mechanisms to operating systems that would allow multiple - typically remote - users to access the computer simultaneously.
- You can run multiple instances of command interpreter programs, one for each user, and assign the CPU in a round-robin fashion similar to multiprogramming.
- The system needs to maintain the illusion that the user owns the machine. If the user loses connectivity to the computer and reconnects, their data needs to be preserved. Thus persistent storage was developed, giving rise to file systems.
- The OS must be protected from system misuse, so passwords were required for authentication.
- It was easy to overload memory when multiple programs run simultaneously, so there had to be a way to swap unused portions of memory to secondary storage; thus virtual memory was invented.
- It’s important to maintain interactivity for impatient users, so CPU scheduling became very important.
More Recently...