

Introduction & History

AP Computer Science

From: Oberta A. Slotterbeck
Computer Science Department
Hiram College
Edited by: Suzy Crowe
The slides - credits
  Some of these slides are provided on the CS
  Illuminated textbook web site by the authors
  for instructors’ use. You can see copies there.
  Obie added many slides to those and, in
  some cases, modified the original slides,
  without specifying precisely where changes
  have been made.
  If other sites were used, especially for
  pictures, Obie referenced those sites.

Chapter Goals
Describe the layers of a computer system
Describe the concept of abstraction and its
relationship to computing
Outline the history of computer hardware and software
Describe the changing role of the computer user
Distinguish between systems programmers and
applications programmers
Distinguish between computing as a tool and
computing as a discipline
Computing Systems
Computing systems are dynamic!

What is the difference between hardware
and software?

Computing Systems             (Cont’d)
Hardware: The physical elements of a
computing system (printer, circuit boards,
wires, keyboard…)

Software: The programs that provide the
instructions for a computer to execute

Layers of a Computing System
[Figure: the layers of a computing system, with the operating system as one of the layers]

Abstraction
A mental model that removes complex details.
This is a key concept. Abstraction will reappear throughout your studies – be sure to understand it!

Early History of Computing

The abacus: an early device to record numeric values.
We normally do not call it a computer, but a computing device.
It is still used in parts of the world today.

This distinction between a computer and a computing device will become clearer as we look at other aspects of the history of computing.
Napier’s Bones (or Rods)
Rods were marked with multiplication tables.
These were used to provide a fairly simple means of multiplying large numbers.
On the web site, one of the labs goes into some detail on Napier’s Bones.
The honor for constructing the first calculating machine belongs to a German named Wilhelm Schickard. In 1623 he completed a mechanical calculating machine based on Napier's work.

[Figure: using the bones to compute 46732 x 5]
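The digit-by-digit procedure the bones mechanize can be sketched in code: each rod contributes one digit times the multiplier, and the diagonal additions on the rods are just carries. A minimal sketch (the function name is ours, not from the lab):

```python
def bones_multiply(n: int, digit: int) -> int:
    """Multiply n by a single digit the way Napier's bones do:
    read each rod's entry (rod digit x multiplier), then add
    along the diagonals, which amounts to ordinary carrying."""
    result = 0
    carry = 0
    place = 1
    for rod in reversed(str(n)):          # rightmost rod first
        entry = int(rod) * digit + carry  # rod entry plus carry from the right
        result += (entry % 10) * place    # units digit stays in this place
        carry = entry // 10               # tens digit carries left
        place *= 10
    return result + carry * place

print(bones_multiply(46732, 5))  # the slide's example: 233660
```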
Slide Rule
1614: John Napier discovers logarithms, making it possible to multiply and divide using addition and subtraction.
To avoid having to look up logs, Edmund Gunter created a number line in which the positions of numbers were proportional to their logs. Go Guntie!
Soon afterwards, William Oughtred created a slide rule with two Gunter’s lines which would slide relative to each other.
The slide rule was widely used by the end of the 17th century and remained popular for the next 300 years.
Ms. Crowe used a slide rule in her Chemistry, Physics, and Calculus classes. Go Crowe!
Later models included powers and roots of numbers, but not addition or subtraction.
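The principle behind Gunter's lines can be checked directly: adding lengths proportional to logarithms multiplies the underlying numbers, since log(ab) = log a + log b. A quick sketch:

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply by adding logarithms, as a slide rule does:
    slide one log scale along the other (the addition), then
    read the answer back off the scale (the antilog)."""
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(3, 4))  # ~12, up to floating-point error
```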
Blaise Pascal
In 1642 Blaise Pascal, a Frenchman, invented a new kind of computing device.
It used wheels instead of beads. Each wheel had ten notches, numbered '0' to '9'. When a wheel was turned seven notches, it added 7 to the total on the machine.
Pascal's machine, known as the Pascaline, could add up to 999999.99.
It could also subtract.
Gottfried Leibnitz
Leibnitz improved on Pascal's adding machine so that it could also perform multiplication and division and calculate square roots.
Grillet’s Pocket Calculator
One very early machine which incorporated Napier’s ideas was built by a French clockmaker named Grillet.
Grillet included a set of Napier's Rods in an adaptation of the Pascaline. It could be considered the world's first pocket calculator.
The top section of the device consisted of 24 dials, or sets of wheels. The lower section contained a set of inverted Napier’s Rods engraved on cylinders.
Although the device was limited, it did allow simple operations to be performed.
It could carry out eight-digit additions, something that would have been very useful at a time when very few people had skill with numbers.
 Grillet’s Machine
Joseph Jacquard
In the late 1700s in France, Joseph Jacquard invented a way to control the pattern on a weaving loom used to make fabric.
Jacquard punched pattern holes into paper cards. The cards told the loom what to do.
Instead of a person making every change in a pattern, the machine made the changes all by itself.
Jacquard's machine didn't count anything, so it wasn't a computer or even a computing device. His ideas, however, led to many other inventions later.

Jacquard Loom - A mechanical device that influenced early computer design
Intricate textile patterns were prized in France in the early 1800s.
Jacquard’s loom (1805-6) used punched cards to allow only some rods to bring the thread into the loom on each shuttle pass.

[Figure: sheets of punched cards set the pattern of the weave]
The Luddites
During the 1700's and early 1800's, part of the world saw the development of industrialization.
Before the Industrial Revolution, manufacturing was done by hand or with simple machines.
The Industrial Revolution caused many people to lose their jobs.
Groups of people known as Luddites attacked factories and wrecked machinery in Britain between 1811 and 1816.
The Luddites received their name from their mythical leader, Ned Ludd.
They believed that the introduction of new textile machines in the early 1800's had caused unemployment and lowered the textile workers' standard of living.
Note this is similar to the way some people see that computers today are taking the jobs of workers.
Charles Babbage
Babbage is known as the father of modern computing because he was the first person to design a general-purpose computing device.
In 1822, Babbage began to design and build a small working model of an automatic mechanical calculating machine, which he called a "difference engine".
Example: It could find the first 30 prime numbers in two and a half minutes.

[Photo: the difference engine in the Science Museum, London]
A closer look at the difference engine

Babbage continued work to produce a full-scale working Difference Engine for 10 years, but in 1833 he lost interest because he had a "better idea": the construction of what today would be described as a general-purpose, fully program-controlled, automatic mechanical digital computer.

Babbage called his machine an "analytical engine".

He designed, but was unable to build, this Analytical Engine (1856), which had many of the characteristics of today's computers:
    an input device – a punched card reader
    an output device – a typewriter
    memory – rods which when rotated into position "stored" a number
    a control unit – punched cards with instructions encoded as with the Jacquard loom
The machine was to operate automatically, by steam power, and would require only one attendant.

Some call Babbage’s analytic engine the first computer, but, as it was never built by him, most people place that honor elsewhere.
Babbage's analytical engine contained all the basic elements of an automatic computer: storage, working memory, a system for moving between the two, an input device, and an output device.
But Babbage lacked funding to build the machine, so Babbage's computer was never completed.

Babbage also designed a printer, which has just been built at the Science Museum in London: 4,000 working parts!

Ada Lovelace
Ada Byron Lovelace was a close friend of Babbage.
Ada thought so much of Babbage's analytical engine that she translated a previous work about the engine.
Because of the detailed explanations she added to the work, she has been called the inventor of computer programming.
Today, in honor of her work in computing, a programming language, Ada, is named after her.
Herman Hollerith
In 1886, Herman Hollerith invented a machine known as
the Automatic Tabulating Machine, to count how many
people lived in the United States.
This machine was needed because the census was taking
far too long.
His idea was based on Jacquard's loom. Hollerith used holes punched in cards. The holes stood for facts about a person, such as age, address, or type of work. The cards could hold up to 240 pieces of information.
Hollerith also invented a machine, a tabulator, to select special cards from the millions.
To find out how many people lived in Pennsylvania, the machine would select only the cards punched with a Pennsylvania hole. Hollerith's punched cards made it possible to count and keep records on over 60 million people.
Hollerith Tabulator
Hollerith founded the Tabulating Machine Company.
In 1924, the name of the company was changed to International Business Machines Corporation (IBM).
This is the 1890 version used in tabulating the 1890 federal census.

Photo taken at Computer Science History Museum, San Jose, CA, by Dr. Robert Walker on VLSI Trip to Silicon Valley.
Punched cards
The punched card used by the Hollerith Tabulator for the 1890 US census.

The punched card was standardized in 1928. It was the primary input medium of data processing and computing from 1928 until the mid-1970s, and was still in use in voting machines in the 2000 USA presidential election.
History of Hardware
Harvard Mark I, ENIAC,
UNIVAC I, ABC and others
These are the names of some of the early computers that launched a new era, initially in mathematics, physics, engineering, and economics; subsequently, almost every area has been impacted by computers.
The early computers were physically huge and very limited by today's standards.

First Generation Hardware (1951-1959) – Major Characteristics

Vacuum Tubes
Large, not very reliable, generated a lot of heat

Magnetic Drum Storage
Memory device that rotated under a read/write head

Card Readers → Magnetic Tape Drives
Development of these sequential auxiliary storage devices
ABC, built by Professor John Atanasoff and a graduate student, Clifford Berry, at Iowa State University between 1939 and 1942.
A special-purpose computer that was not truly programmable.
The instructions to the machine were entered by buttons.
Input: punched paper tape
Output: punched cards
Mark I, designed by Howard Aiken and Grace Hopper at Harvard University in 1939-1944.
Contains more than 750,000 components, is 50 feet long, 8 feet tall, and weighs approximately 5 tons.

Instructions were pre-punched on paper tape.
Input was by punched cards.
Output was displayed on an electric typewriter.
Could carry out addition, subtraction, multiplication, division, and reference to previous results.
Still exists in the Computer Science Building at Harvard University and can be turned on and run!
Zuse’s Machines, Z1-Z4, built by Konrad Zuse in Berlin, Germany, 1938-1944 (all supposedly destroyed in the Berlin bombings).

If these machines did exist as described by Zuse after the war, they were the first computers.
A rebuilt model of the Z3 is housed in the Deutsches Technik Museum, Berlin.

Input: from a numeric, decimal, 20-digit keyboard
Output: numbers displayed with lamps, 4 decimal digits with decimal point
Programmed via a punch tape and punch tape reader

Multiplication 3 seconds, division 3 seconds, addition 0.7 seconds
Used a 600-relay numeric unit and a 1600-relay storage unit
Computer vs computing device
  Most, but not all, people claim a
  computer must be
     Digital
     Programmable
     Electronic
     General purpose
  If any characteristic is missing, at best,
  you have a computing device.

Mauchly and Eckert or Zuse –
built the first computer
 Many claim the ENIAC was the first computer as
 there was proof that it did exist.
John Mauchly envisioned the ENIAC. He was a professor of Physics at Ursinus College. In 1943 he attended a workshop at Penn where he saw a machine calculating firing tables. Mauchly realized that he could build an electronic machine that could be much faster.
 J. Presper Eckert solved the engineering challenges.
 The chief challenge was tube reliability. Eckert was
 able to get good reliability by running the tubes at
 1/4 power.
ENIAC (Electronic Numerical Integrator And Calculator), built by Presper Eckert and John Mauchly at the Moore School of Engineering, University of Pennsylvania, 1941-46.

Often called the first computer (that was electronic, general purpose, and digital).

18,000 vacuum tubes and weighed 30 tons.
Duration of an average run without some failure was only a few hours, although it was predicted to not run at all!
When it ran, the lights in Philadelphia dimmed!
ENIAC stored a maximum of twenty 10-digit decimal numbers.
Input: IBM card reader
Output: punched cards, lights
ENIAC’s Vacuum Tubes

Photo taken at Computer Science History Museum, San Jose, CA, by Dr. Robert Walker on VLSI Trip to Silicon Valley.

[Photo: a vacuum tube similar to those used in the earliest computers]

Programming required rewiring of the machine.
UNIVAC – the first commercial computer
March 31, 1951: the Census Bureau accepts delivery of the first UNIVAC computer.
Final cost: ~ $1,000,000
46 UNIVAC computers were built for both government and business uses.
Remington Rand became the first American manufacturer of a commercial computer.
Their first non-government contract was with General Electric in Louisville, Kentucky, who used the UNIVAC computer for a payroll application.
UNIVAC’s prediction ignored
A 1952 UNIVAC made history by predicting
the election of Dwight D. Eisenhower as US
president before the polls closed.
The results were not immediately reported by
Walter Cronkite because they were not
believed to be accurate.
Democratic presidential candidate Adlai
Stevenson was the front-runner in all the
advance opinion polls, but by 8:30 p.m. on the
East Coast, well before polls were closed in the
Western states, UNIVAC projected 100-to-1
odds that Dwight D. Eisenhower would win by
a landslide, which is in fact what happened.
   1952 election night

Whirlwind at MIT - 1952
The first digital computer capable of displaying real-time text and graphics on a large oscilloscope screen.
Second Generation Hardware (1959-1965) - Characteristics

Transistors
Replaced vacuum tubes; fast, small, durable, cheap

Magnetic Cores
Replaced magnetic drums; information available instantly

Magnetic Disks
Replaced magnetic tape; data can be accessed directly

A Typical Computing Environment in 1960 – UNIVAC 1107 at Case Institute of Technology
The true purpose of computers is finally realized in 1961, when an MIT student, Steve Russell, created the first computer game, Spacewar, on a DEC PDP-1, a minicomputer.
It took 200 hours to program!
Father of Graphics - Ivan Sutherland

Ph.D. Thesis, 1963, MIT:
Sketchpad: The First Interactive Computer Graphics Package, on the TX-2 (forerunner of DEC machines).
The TX-2 was a giant machine for the day:
    320 kilobytes of memory, about twice the capacity of the biggest commercial machines
    magnetic tape storage
    an on-line typewriter
    the first Xerox printer
    paper tape for program input
    a light pen for drawing
    a nine-inch CRT (i.e. display screen)!

 Light Pen Input
“Sketchpad: A Man-machine Graphical Communications
System," used the light pen to create engineering drawings
directly on the CRT.

1964: CDC 6600, Control Data Corporation

CDC 6600 at the University of Texas, 1964-68

Cost: $2,000,000+

Kept in a dust-free room behind locked doors
CDC 6600 – University of Texas
Terminals were available to only a few.
Most had to use punched cards handed in through a window.
  A sampling of 1960-1965 circuit boards:

Photo taken at Computer Science History Museum, San Jose, CA,
        by Dr. Robert Walker on VLSI Trip to Silicon Valley
Third Generation Hardware (1965-1971)

Integrated Circuits
Replaced circuit boards; smaller, cheaper, faster, more reliable

Transistors
Now used for memory construction

Terminals
An input/output device with a keyboard and screen

1968: a 1.3 MHz CPU with half a megabyte of RAM and a 100 megabyte hard drive cost a mere US $1.6 million.
PDP-1 – first of the minicomputers to be used by many universities.

PDP-40: One of Hiram’s first computers, purchased in 1974 (designed in the 3rd generation).
  The DEC-10 was popular at a lot of large universities.

Photo taken at Computer Science History Museum, San Jose, CA,
        by Dr. Robert Walker on VLSI Trip to Silicon Valley
    Dumb terminals or workstations were used to tie into the

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
Fourth Generation Hardware
Large-scale Integration
Great advances in chip technology

PCs, the Commercial Market, Workstations
Personal Computers were developed as new companies like Apple and Atari came into being. Workstations emerged as well.

Personal Computers were introduced around 1977.

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
Typical prices on early PCs
Contrary to belief today, these were not cheap. Prices for the IBM 5100 in the mid-1970s:

Memory    BASIC      APL        Both
16K       $8,975     $9,975     $10,975
32K       $11,975    $12,975    $13,975
48K       $14,975    $15,975    $16,975
64K       $17,975    $18,975    $19,975

Note that $1 in 1975 would be equal to $3.76 today, so multiply these by ~3.8!
Vax 780 – purchased by Hiram
College in 1981.

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
   Dumb terminals were used for some input with these
   machines and line printers were used for output:

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
Parallel Computing and Networking
Parallel Computing
Computers rely on interconnected central processing units that increase processing speed.

Networking
With the Ethernet, small computers could be connected and share resources. File servers connected PCs in the late 1980s.

ARPANET and LANs → Internet

There are many different kinds of parallel machines – this is one type
A parallel computer must be capable of working on one task even though many individual computers are tied together.
Lucidor is a distributed memory computer (a cluster) from HP. It consists of 74 HP servers and 16 HP workstations. Peak: 7.2 GFLOPs (one GFLOP = one billion floating-point operations per second).
    Cray Machines Are Another Type
    of Parallel Machine –Cray 1

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
    Cray 2

Photo taken at Computer Science History Museum, San Jose, CA,
        by Dr. Robert Walker on VLSI Trip to Silicon Valley
    Another Earlier Parallel Computer at the
    University of Illinois was the Illiac IV

Photo taken at Computer Science History Museum, San Jose, CA,
        by Dr. Robert Walker on VLSI Trip to Silicon Valley
    Another View of the Illiac IV

Photo taken at Computer Science History Museum, San Jose, CA,
by Dr. Robert Walker on VLSI Trip to Silicon Valley
 CM-2 Connection Machine

Interesting fact: The lights are there only for show!
IBM 360 (late ’60s) Console
Photo taken at Computer Science History Museum, San Jose, CA,
        by Dr. Robert Walker on VLSI Trip to Silicon Valley
 A History of the Web - Let’s Be Precise
Late 1960s – the ARPANET was conceived as a
network of computers in which packets of information
(i.e. data) could be transmitted between various
computers via telephone lines or higher speed
dedicated data lines.

ARPA was the Advanced Research Projects Agency.

This allowed remote logins to computers (using
telnet), the ability to transfer files between computers
(using ftp = file transfer protocol), and e-mail.

Many people now say it was designed for the
military, but that was newspaper hype.
The purpose of the ARPANET was to allow the sharing of National Science Foundation (NSF) research project information between participating sites.
In 1972, there were 29 nodes (i.e. computer sites that were interconnected).

ARPANET → Internet Protocol
To move data from computer A to computer B:
   Break the data up into packets.
   Each packet carries the IP (Internet Protocol) address of computer B, which consists of 4 numbers.
   At each computer, the address is read and then the packet is shipped along to another computer using a recipe called a routing algorithm.
A and B need never be directly connected (although if they are adjacent nodes on the network, they might be).
The different packets will not necessarily follow the same route.
At computer B, the packets are reassembled into the document.
A packet is typically in the 40 – 1500 byte range (1 byte = 1 character).
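The steps above can be sketched as a toy model (this is not real IP: the fixed packet size, the address string, and the sequence numbers for reassembly are illustrative simplifications, and no actual routing happens):

```python
import random
from typing import List, Tuple

PACKET_SIZE = 1500  # bytes; real packets are typically in the 40-1500 byte range

def packetize(data: bytes, dest: str) -> List[Tuple[str, int, bytes]]:
    """Split data into packets, each tagged with the destination
    address and a sequence number so the pieces can be reassembled
    even if they arrive out of order."""
    return [
        (dest, seq, data[i:i + PACKET_SIZE])
        for seq, i in enumerate(range(0, len(data), PACKET_SIZE))
    ]

def reassemble(packets) -> bytes:
    """At the destination, sort by sequence number and rejoin."""
    return b"".join(chunk for _, _, chunk in sorted(packets, key=lambda p: p[1]))

document = b"x" * 4000                      # a 4000-byte document
packets = packetize(document, "10.0.0.2")   # hypothetical destination address
random.shuffle(packets)                     # packets may arrive in any order
assert reassemble(packets) == document
print(len(packets))  # 3 packets: 1500 + 1500 + 1000 bytes
```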

[Diagram: packet-based transmission of document X from Computer A to Computer B. Packets X-1, X-2, and X-3, each addressed to B, travel through intermediate computers C, D, E, F, and G along different routes and are reassembled as document X at B.]

Note: If one computer fails, the document may still be transmitted.
Tracing the Route Your Packets Take
Traceroute is a tool that traces the route your packets take.
Ping is a tool that tells you whether or not a web site is up.
Ping Plotter is a tool that graphically shows you the route your packets take.
   It can be used free for 30 days; after that it requires a subscription of $0.99 a month.
   Interesting to see how your packets travel!
One Ping Plotter Trace to

A Trace to KSU
If you watch these interactively you’ll see the route changes.

 Internet to World Wide Web
The Internet is a collection of computers using the
internet protocol to transmit information
The World Wide Web is a multimedia environment
invented by Tim Berners-Lee, in 1990.
   Berners-Lee was a physicist at CERN (the European Organization for Nuclear Research).
   He wrote the first web browser (WorldWideWeb) and the software for the first web server.
   He invented both the HTML markup language in which many web pages are written and the HTTP protocol used to request and transmit web pages between web servers and web browsers.
             Growth of WWW
Number of unique Web sites (adjusted to account for sites duplicated at multiple IP addresses):
      1998:     2,636,000
      1999:     4,662,000
      2000:     7,128,000
      2001:     8,443,000
      2002:     8,712,000

A Web site is defined as a distinct location on the Internet, identified by an IP address, that returns a response code of 200 and a Web page in response to an HTTP request for the root page. The Web site consists of all interlinked Web pages residing at the IP address.
Statistics provided by OCLC Online Computer Library Center, Inc., Office of Research
Active Internet Users by Major Markets

Average Web Usage in U.S.

More Than Half of People in U.S. and Canada Regularly Use Internet

Global Usage – Includes All Countries Monitored:
~20 countries accounting for an estimated 90% of all Internet users
From the “Computer Industry Almanac”
   Worldwide Internet Population 2004: 934 million
   Projection for 2005: 1.07 billion
   Projection for 2006: 1.21 billion
   Projection for 2007: 1.35 billion
Negative Properties of the WWW
Uneven capabilities of users’ browsers and other software
   e.g. plug-ins differ widely and can interact with each other in strange ways.
No central control – therefore, unregulated and somewhat uncensored.
Anonymity of site owners.
Contrary to popular belief, it is not free.
Device independent – but not all hardware acts the same.
Copyright issues are more pronounced.
Uneven bandwidth – the number of bits that can be transmitted per second varies.
Be Cautious and Critical
There is a famous New Yorker cartoon: "On the Internet, nobody knows you're a dog."

Cartoon by Peter Steiner reproduced from page 61 of the July 5, 1993 issue of The New Yorker (Vol. 69 (LXIX) no. 20).
A Brief Mention of the History of Theory
We’ll study this more later
Alan Turing
Turing Machine, Artificial Intelligence
Much of the early work on computers was theoretical and done by mathematicians.
Alan Turing, and others, studied the questions “What tasks can be computed?” and “What is a computer?” His abstract model, called the Turing Machine, is of great interest in studying these questions.
Another big question was, and still is, “Are machines intelligent?”
Alan Turing devised a test, now known as the Turing Test (1950), to answer this question.
We’ll delve into these issues a little later in the course.
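Turing's abstract model is simple enough to simulate in a few lines. A minimal sketch (this particular machine, which flips the bits on its tape, is our illustrative example, not one from the course):

```python
def run_turing_machine(tape, rules, state="start", pos=0, accept="halt"):
    """Simulate a one-tape Turing machine.

    rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). Blank cells read as " ".
    """
    tape = dict(enumerate(tape))  # sparse tape
    while state != accept:
        symbol = tape.get(pos, " ")
        write, move, state = rules[(state, symbol)]
        tape[pos] = write
        pos += move
    return "".join(tape[i] for i in sorted(tape)).strip()

# Example machine: flip every bit, then halt at the first blank cell.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", " "): (" ", +1, "halt"),
}
print(run_turing_machine("1011", flip))  # -> 0100
```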
History of Software
First Generation Software
Machine Language
Computer programs were written in binary (1s and 0s)

Assembly Languages and translators
Programs were written in languages that mimicked
machine language and were then translated into
machine language
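The idea of a translator can be illustrated with a toy assembler: mnemonics are looked up in an opcode table and emitted as the binary words a first-generation programmer would have written by hand. The instruction set and opcodes below are invented for illustration and do not correspond to any real machine:

```python
# Opcodes for a hypothetical 8-bit machine (invented for illustration).
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(source: str) -> list:
    """Translate mnemonic instructions like 'ADD 3' into binary words:
    a 4-bit opcode followed by a 4-bit operand."""
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        opcode = OPCODES[parts[0]]
        operand = int(parts[1]) if len(parts) > 1 else 0
        words.append(f"{opcode:04b}{operand:04b}")
    return words

program = """
LOAD 2
ADD 3
STORE 4
HALT
"""
print(assemble(program))  # ['00010010', '00100011', '00110100', '11110000']
```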

Programmers begin to specialize
Programmers divide into application programmers and
systems programmers.

Systems vs Applications
Computer scientists who design programs and systems for other computer scientists to use are called systems programmers.

Computer scientists who design programs and systems for non-computer scientists to use are called applications programmers.

Second Generation Software (1959-1965)
High-Level Languages
Use English-like statements and made programming easier: Fortran, COBOL, Lisp.

[Diagram: high-level language program translated into machine language]

Third Generation Software (1965-1971)
Systems Software Developed
    utility programs,
    language translators,
    and the operating system, which decides which programs to run and when.
Separation between Users and Hardware
Computer programmers now created programs to be used by people who did not know how to program.
Third Generation Software

        Application Package
          Systems Software
        High-Level Languages
         Assembly Language

        Machine Language

Fourth Generation Software (1971-1989)

Structured Programming
Pascal, C

New Application Software for Users
Spreadsheets, word processors, database management systems

Fifth Generation Software (1990-present)
The Windows operating system, and other Microsoft
application programs dominate the market

Object-Oriented Design
Based on a hierarchy of data objects (i.e. Java, C++)

World Wide Web
Allows easy global communication through the Internet

New Users
Today’s user can “get by” with little computer knowledge
Computing as a Tool

             Programmer / User

Systems Programmer         Applications Programmer
    (builds tools)               (uses tools)

                        Domain-Specific Programs

     User with No
  Computer Background
Computing as a Discipline

What can be (efficiently) automated?

Four Necessary Skills
 1. Algorithmic Thinking
 2. Representation
 3. Programming
 4. Design

Some Systems Areas of
Computer Science
 Study of Algorithms and Data Structures
 Programming Languages
 Operating Systems
 Software Methodology and Engineering
 Human-Computer Communication
 Systems Programming

Some Application Areas of
Computer Science
 Numerical and Symbolic Computation
 Databases and Information Retrieval
 Artificial Intelligence and Robotics
 Organizational Informatics
 Game development
Some other disciplines sharing some
ground with computer science

    Computer engineering
    Electrical engineering
    Computational physics
    Computational chemistry
    Computational biology

Hot off the presses:
  What field has…
  • …the best-rated job, and 5 of the top 10 highest paid, highest growth jobs?
  • …shown strong job growth in the face of outsourcing?
  • …a looming severe shortage in college graduates?
  Computer Science!
This slide and the rest of the slides in this presentation were collated from SIGCSE
announcements and displayed at the Gettysburg College Department of Computer Science
 website. Individual sources are given on the last slide.
• Software engineers top the list of best jobs according to a Money magazine and Salary.com survey based on “strong growth prospects, average pay of $80,500 and potential for creativity”. [1]

     • 5 computing jobs are in the top 10 salary jobs
     from the Bureau of Labor Statistics’ list of the 30 fastest
     growing jobs through 2014. [2]

1.   Computer systems software engineer: $81,140
2.   Computer applications software engineer: $76,310
6.   Computer systems analyst: $67,520
7.   Database administrator: $61,950
9.   Network systems and data communication analyst:

Salaries are given as mean annual salaries over all regions.

• In April 2006, more Americans were employed in IT
than at any time in the nation’s history. [3]

• In May 2004, “U.S. IT employment was
      17% higher than in 1999,
      5% higher than in 2000, and
      showing an 8% growth in the [following] year …

The compound annual growth rate of IT wages has been about 4% since 1999 while inflation has been just 2% per year.
… Such growth rates swamp predictions of the outsourcing
job loss in the U.S., which most studies estimate to be 2% to
3% per year for the next decade.” [4]

• “According to the National Science Foundation, the need
for science and engineering graduates will grow 26%,
or by 1.25 million, between now and 2012.

The number of jobs requiring technical training is growing at
five times the rate of other occupations. And U.S. schools are
nowhere near meeting the demand, according to multiple
studies.” [5]

• The percentage of college freshmen listing computer
science as their probable major fell 70% between 2000
 and 2004. [6]

[1] Wulfhorst, Ellen. Apr. 12, 2006.

[2] Morsch, Laura. Jan. 27, 2006.

[3] Chabrow, Eric. Apr. 18, 2006.

[4] Patterson, David. President’s Letter: Restoring the Popularity of Computer Science. Communications of the ACM, Sept. 2005, Vol. 48, No. 9.

[5] Deagon, Brian. Investor’s Business Daily, May 12, 2006.

[6] Robb, Drew. July 17, 2006.

