CSC 539: Operating Systems Structure and Design
HW2: Batch and Timeshared CPU Scheduling
The following questions concern the simple batch/timesharing
simulator discussed in class.
The simulator reads in a collection of jobs from a file, with each job
specified by an arrival time (assumed to be time 0 for all batch jobs), an ID number, and a job length.
It stores the jobs in a queue
and processes them in order. By changing constants, the delay time
between jobs and the time slice allotted to each job in turn can be adjusted.
Source code for the simulator and the sample data file can be viewed and downloaded by clicking on
the following links and cutting and pasting into the Visual C++ .NET editor:
jobs.txt. (If you would like a quick review
of useful C++ classes and libraries, some of which may be new to you, see
CSC539 REVIEW SHEET: C++ Classes & Libraries.)
- Build a C++ project and execute the simulator on the provided
job data file (jobs.txt). To simulate a batch environment, set
the LOAD_DELAY and TIME_SLICE to be high (5 and 1000, respectively).
Likewise, to simulate a simple timesharing environment, set these
constants to lower values (1 and 10, respectively). Print the log of each
execution and hand the logs in with the assignment.
Note: you can copy the contents of the output window by right-clicking
within the window, selecting Select All from the menu, and then pasting
that text into whatever text editor or word processor you choose.
- Create a data file with the following jobs, and provide printouts of
execution logs using the batch (LOAD_DELAY = 5, TIME_SLICE = 1000) and
timesharing (LOAD_DELAY = 1, TIME_SLICE = 10) settings from above.
ARRIVAL JOB # LENGTH
0 1 20
0 2 4
0 3 7
0 4 31
0 5 13
- In both of the above data files, the timesharing settings produced shorter overall
simulations than with the batch settings. That is, the total time to complete all jobs
was less. Will this always be the case? If so, explain why. If not, give an
example where the batch settings will require less time than the timesharing settings.
- Suppose that our only concern was minimizing the time it takes to
complete all of the jobs. Given a fixed LOAD_DELAY, which would make more sense in
order to minimize the total time: increasing or decreasing the TIME_SLICE?
Justify your answer,
with references to specific data where applicable.
- While the total time to complete all jobs is relevant in a batch
environment, a more meaningful measure in a timesharing environment
is the average time to completion for all the jobs. Modify the CPUScheduler
class to have another member function named displayStats. When called, this
method should display the CPU utilization percentage for the system, and the average completion time
for all completed jobs. Your CPU.cpp program should call
this method at the end of the simulation to display system statistics.
For each of the job data files above (the provided sample file and the
new job data from Exercise 3), report the CPU utilization and average completion time
for jobs using the batch and timesharing settings.
- Which type of environment, batch or timesharing, tends to do
better with respect to average completion time? Does that environment
always do better, or can you cite specific examples where it is
actually worse? Justify your answer, with references to specific data where applicable.
- Does the order of the jobs affect the average completion time? That
is, is it possible to shuffle the order of a collection of jobs and obtain
significantly different results? If so, which environment (batch vs.
timesharing) is affected more by order variation? Justify your
answer, with references to specific data where applicable.