

          The Most Important Software Innovations
                       David A. Wheeler
         First version 2001-08-01; Revised 2009-05-26

Introduction
Too many people confuse software innovations with other factors, such as the increasing
speed of computer and network hardware. This paper tries to end the confusion by
identifying the most important innovations in software, removing hardware advances and
products that didn’t embody significant new software innovations. This paper presents its
criteria for selecting the most important software innovations and its sources, lists the
software innovations themselves, discusses software patents and what’s not an important
software innovation, and then closes with conclusions.

The results may surprise you.


Criteria
This paper lists the “most important software innovations,” so we first need to clarify
what each of those words means:

   1. To be a “most important” innovation, an innovation has to be an idea that is very
      widely used and is critically important where it applies. Innovations that are only
      used by a very small proportion of software (or software users) aren’t included.
   2. To be a “software” innovation, it has to be a technological innovation that impacts
      how computers are programmed (e.g., an approach to programming or an
      innovative way to use a computer).

       I’m intentionally omitting computer hardware innovations or major hardware
       events that don’t involve software innovation. For example, court cases have
       decided that John Vincent Atanasoff is the legal inventor of the electronic digital
       computer, but that’s a hardware innovation. I’ve omitted other strictly hardware
       innovations such as the transistor (1947) and integrated circuits (1958). I’ve also
       omitted Ethernet, which Bob Metcalfe developed in 1973, for the same reason.

       I’ve omitted inventions that aren’t really technological inventions (e.g., social or
       legal innovations), even if they are important for software technology and/or are
       widespread. For example, the concept of a copylefting license is an innovative
       software licensing approach that permits modification while forbidding the
       software from becoming proprietary; it is used by a vast array of software via the
       General Public License (GPL). The first real copylefting license (the Emacs
       Public License) was developed by Richard Stallman in 1985 - but since copyleft
       is really a social and legal invention, not a technological one, it’s not included in
this list. Also, the “smiley” marker :-) is not included - it’s certainly widespread,
but it’s not really critical for use of computers, and it’s really a social invention,
not a technological one.

   3. We also have to define “innovation” carefully. An “innovation” is not simply
      combining two functions into a single product - that’s “integration” and usually
      doesn’t require any significant innovation (just hard work). In particular,
      integrating functions to prevent customers from using a competitor’s product is
      “predation,” not “innovation.” An “innovation” is not a product, either, although a
      product may embody or contain innovations. Re-implementing a product so that it
      does the same thing on a different computer or operating system isn’t an
      innovation, either. An innovation is a new idea. And in this paper, what’s meant
      is a new idea in software technology.

As a result, you may be surprised by the number of events in computing history that are
not on this list. Most software products are not software innovations by themselves, since
most products are simply re-implementations of another idea. For example, WordStar
was the first microprocessor word processor, but it wasn’t the first word processor -
WordStar was simply a re-implementation of a previous product on a different computer.
Later word processors (such as Word Perfect and Word) were re-implementations by
other vendors, not innovations themselves. Some major events in computing are simply
product announcements of hardware, and have nothing to do with innovations in
software. Thus, while the IBM PC and Apple ][’s appearances were important to the
computing world, they didn’t represent an innovation in software - they were simply
lower-cost hardware, with some software written for them using techniques already well-
known at the time.

Occasionally a product is the first appearance of an innovation (e.g., the first spreadsheet
program), in which case the date of the product’s release is the date when the idea was
announced to the public. Some innovations are innovative techniques, which aren’t
directly visible to software users but have an extraordinary effect on software
development (e.g., subroutines and object-orientation) - and these are included in this list
of software innovations. For the more debatable entries, I’ve tried to discuss why I
believe they should be included.

I’ve tried to identify and date the earliest public announcement of an idea, rather than its
embodiment in some product. The first implementation and first widespread
implementation are often noted as well. “Public” in this case means, at least, an
announcement to a wide inter-organizational audience. In some cases identifying a
specific date or event is difficult; I welcome references to earlier works. For example,
sometimes it is difficult to identify a “first” because an idea forms gradually through the
actions of many.

Sources
Since I haven’t found some sort of consensus of what the most important computing
innovations are, I’ve developed this list by selecting events from many other sources. I
used many sources so I wouldn’t miss anything important. In particular, I used IEEE
Computer’s historical information (including their 50-year timeline), the Virtual Museum
of Computing, Hobbes’ Internet Timeline, Paul E. Ceruzzi’s A History of Modern
Computing, and John Naughton’s A Brief History of the Future. I also used Janet
Abbate’s Inventing the Internet in a few cases, but I tried to double-check everything in
that source because (unfortunately) Abbate makes several errors that make its use as a
source suspect. For example, Abbate (page 22) doesn’t realize that although both
Strachey and John McCarthy used the same word (“time-sharing”) for their ideas, they
didn’t mean the same thing at all. Also, Abbate (page 201) claims Steve Bellovin was at
Duke, but this is wrong. I’ve also examined other sources, such as James Durham’s
History-Making Components and A History and Future of Computing. Note that, in
general, these sources mix computer hardware and software together. Another source is
the “Software Pioneers” conference (June 28-29, 2001, Bonn) sponsored by Software
Design and Management. Many specific sources such as “OSI and TCP: A History” by
Peter H. Salus were checked too. If you find computing history interesting, you might
also enjoy the 20 Year Usenet Timeline, a Brief History of Hackerdom, and Landley’s
Computer history page, though they aren’t sources for the material here.

Since this paper was originally published, I’ve received several additional suggestions
which rounded out this paper. My thanks to those who have provided those suggestions.
It’s quite possible this paper is still missing some important innovations; please contact
me if you have a correction or addition (dwheeler, at dwheeler.com, no spam please).


The Most Important Software Innovations
Here is a list of the most important software innovations:

1837: Software (Babbage’s Analytical Engine)
Charles Babbage was an eminent scientist; he was elected Lucasian Professor of
Mathematics at Cambridge in 1828 (the same chair held by Isaac Newton and Stephen
Hawking). In 1837 he publicly described an analytical engine, a mechanical device that
would take instructions from a program instead of being designed to do only one task.
Babbage had apparently been thinking about the problem for some time before this; as
with many innovations, pinning down a single date is difficult. This appears to be the
first time the concept of software (computing instructions for a mechanical device) is
seriously contemplated. Babbage even notes that the instructions can be reused (a key
concept in how today’s software works). In 1842 Ada Augusta, Countess of Lovelace,
released a translation of “Sketch of the Analytical Engine” with extensive commentary
of her own. That commentary has a clear description of computer architecture and
programming that is quite recognizable today, and Ada is often credited as being the
“first computer programmer”. Unfortunately, due to many factors the Analytical Engine
was never built in Babbage’s lifetime, and it would be many years before
general-purpose computers were built.

1854: Boolean Algebra
George Boole published “An Investigation of the Laws of Thought”. His system for
symbolic and logical reasoning became the basis of computing.

1936-37: Turing Machines
Alan Turing wrote his paper “On computable numbers, with an application to the
Entscheidungsproblem”, where he first describes Turing Machines. This mathematical
construct showed the strengths - and fundamental limitations - of computer software.
For example, it showed that there were some kinds of problems that could not be solved
(in finite time).

1945: Stored Program
In the “First Draft of a Report on the EDVAC”, John von Neumann described the concept
of storing a program in the same memory as data. This is a fundamental concept that all
software development is based on. Eckert, Mauchly, and Konrad Zuse have all claimed
prior invention, but this is uncertain, and this draft document is the one that spurred
the concept’s use. Alan Turing published his own independent conception, and went
further in showing that computers could be used for the logical manipulation of symbols
of any kind. The approach was first implemented (in a race) by the prototype Mark I
computer at Manchester in 1948.

1945: Hypertext
Hypertext was first described in Vannevar Bush’s “As We May Think”. The word
“hypertext” itself was later coined by Ted Nelson in his 1965 article “A File Structure
for the Complex, the Changing, and the Indeterminate” (20th National Conference, New
York, Association for Computing Machinery).

1951: Subroutines
Maurice Wilkes, Stanley Gill, and David Wheeler (not me) developed the concept of
subroutines in programs to create re-usable modules and began formalizing the concept
of software development.

1952: Assemblers
Alick E. Glennie wrote “Autocoder”, which translated symbolic statements into machine
language for the Manchester Mark I computer. Autocoding later came to be a generic term
for assembly language programming.

1952: Compilers
Grace Murray Hopper described techniques to select (compile) pre-written code segments
in correspondence with codes written in a high-level language, i.e., a compiler. Her
1952 paper is titled “The Education of a Computer” (Proc. ACM Conference), and is
reprinted in the Annals of the History of Computing (Vol. 9, No. 3-4, pp. 271-281),
based on her 1951-1952 effort to develop A-0. She was later instrumental in developing
COBOL. A predecessor of the compiler concept was developed by Betty Holberton in 1951,
who created a “sort-merge generator”.

1954: Practically Compiling Human-like Notation (FORTRAN)
John Backus proposed the development of a programming language that would allow users
to express their programs directly in commonly understood mathematical notation. The
result was Fortran. The first Fortran implementation was completed in 1957. There were
a few compilers before this point; languages such as A-0, A-1, and A-2 inserted
subroutines, the Whirlwind I included a special-purpose program for solving equations
(but couldn’t be used for general-purpose programming), and an “interpreter” for the
IBM 701 named Speedcoding had been developed. However, Fortran used notation far more
similar to human notation, and its developers developed many techniques so that, for
the first time, a compiler could create highly optimized code [Ceruzzi 1998, 85].

1955: Stack Principle
Friedrich L. Bauer and Klaus Samelson developed the “stack principle” (“the operation
postponed last is carried out first”) at the Technische Universität München. This
served as the basis for compiler construction, and was naturally extended to all
bracketed operation structures and all bracketed data structures.
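
To make the principle concrete, here is a minimal sketch in Python (a modern language
chosen purely for brevity; Bauer and Samelson worked decades before it existed) of a
stack-based evaluator for postfix arithmetic, where the operands postponed last are
consumed first:

    # A stack evaluates "3 4 + 5 *" (postfix for (3 + 4) * 5): operands
    # are pushed, and each operator consumes the operands postponed last.
    def eval_postfix(expression):
        stack = []
        for token in expression.split():
            if token in ("+", "-", "*", "/"):
                right = stack.pop()   # the operand postponed last...
                left = stack.pop()    # ...is consumed first
                stack.append({"+": left + right, "-": left - right,
                              "*": left * right, "/": left / right}[token])
            else:
                stack.append(float(token))
        return stack.pop()

    print(eval_postfix("3 4 + 5 *"))  # prints 35.0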

1957: Time-sharing
In Fall 1957 John McCarthy (MIT, US) began proposing time-sharing operating systems,
where multiple users could share a single computer (and each believes they control an
entire computer). On January 1, 1959, he wrote a memo to Professor Philip Morse
proposing that this be done for an upcoming machine. This idea caused immense
excitement in the computing field. It’s worth noting that Christopher Strachey
(National Research Development Corporation, UK) published a paper on “time-sharing” in
1959, but his notion of the term was having programs share a computer, not that users
would share a computer (programs had already been sharing computers, e.g., in the SAGE
project) [Naughton 2000, 73]. By November 1961 Fernando Corbató (also at MIT) had a
four-terminal system working on an IBM 709 mainframe. Soon afterwards CTSS (Compatible
Time Sharing System) was running, the first effective time-sharing system. Even in
those systems of today which aren’t shared by different users, these mechanisms are a
critical support for computer security.

1958-1960: List Processing (LISP)
McCarthy (at Stanford) developed the LISP programming language for supporting list
processing; it continues to be critical for Artificial Intelligence and related work,
and is still widely used. List processing was not completely new at this point; at the
1956 Dartmouth Summer Research Project on Artificial Intelligence, Newell, Shaw, and
Simon described IPL 2, a list processing language for Rand Corporation’s JOHNNIAC
computer. However, McCarthy realized that a program could itself be represented as a
list, refining the approach into a flexible system fundamentally based on list
processing. In 1956-1958 he began thinking about what would be needed for list
processing, with significant work beginning in 1958 with hand-simulated compilations.
LISP demonstrated other important innovations used in many later languages, including
polymorphism and unlimited-extent data structures.
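
The key insight - a program is itself a list - can be sketched in a few lines of Python
(a stand-in here, not actual LISP): nested lists play the role of s-expressions, and a
tiny evaluator walks them.

    # Nested Python lists stand in for s-expressions:
    # (+ 1 (* 2 3)) becomes ["+", 1, ["*", 2, 3]].
    def evaluate(expr):
        if not isinstance(expr, list):        # atoms evaluate to themselves
            return expr
        op, *args = expr
        values = [evaluate(a) for a in args]  # recurse into sub-lists
        if op == "+":
            return sum(values)
        if op == "*":
            result = 1
            for v in values:
                result *= v
            return result
        raise ValueError("unknown operator: %r" % op)

    program = ["+", 1, ["*", 2, 3]]   # the program is itself a list
    print(evaluate(program))          # prints 7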

1959-1960: Vendor-Independent Exchange Standards for Software (COBOL and ASCII)
In the early days of computing every vendor had their own incompatible method for
creating programs and storing data. IBM, for example, encoded characters using systems
such as BCD and EBCDIC. This created terrible problems for users, who could not easily
exchange information and were kept hostage by the various vendors. The solution was to
create vendor-independent exchange standards.

The basic idea of creating standards was not new, even then. But creating standards for
something ephemeral like software was new, so vendor-independent exchange standards for
software are being counted as an innovation. Such standards are critical; standards
finally made it possible for users to choose and change their suppliers, since they
could work together even with different suppliers. (Even today, people fail to
understand the need for standards and thus fall victim to vendor lock-in.) Two of the
first efforts to create such standards were COBOL (for exchanging programs) and ASCII
(for exchanging text).

In 1959, an industry-wide team was assembled to formulate a standardized business
programming language, Common Business Oriented Language (COBOL). The initial
specification was presented in April 1960, and was developed in cooperation with
computer manufacturers, users (including the U.S. Department of Defense), and
universities. Soon afterwards, in May 1962, a committee began developing a standard for
the Fortran language.

American Standard Code for Information Interchange (ASCII) is a way of encoding
characters as numbers, so that there is a standard number to represent each character
of text. Work on ASCII began in 1960, and it was first published in 1963. For many
years ASCII competed with the vendor-specific EBCDIC, but eventually the open,
vendor-neutral ASCII beat the vendor-specific format (a pattern that has often repeated
over the years).

1960: Packet-Switching Networks
In 1960 Paul Baran (RAND) proposed a message switching system that could forward
messages over multiple paths. Unlike previous approaches (which required large storage
capacities at each node), his approach used higher transmission speeds, so each node
could be small, simple, and cheap. Baran’s approach routed messages to their
destination instead of broadcasting them to all, and these routing decisions were made
locally. In 1961 Leonard Kleinrock (MIT) published “Information Flow in Large
Communication Nets,” the first larger work examining and defining packet-switching
theory. In 1964 Paul Baran wrote a series of papers titled “On Distributed
Communications Networks” that expanded on this idea. This series described how to
implement a distributed packet-switching network with no single outage point (so it
could be survivable). In 1966 Donald Davies (NPL, UK) publicly presented his ideas,
which he termed “packet switching”, and learned that Baran had already invented the
idea (though we still use Davies’ term “packet switching”). Davies started the “Mark I”
project in 1967 to implement it, and ARPANET planning (the ancestor of the Internet)
also began in 1967.

It’s worth noting here that in a similar time period, ARPA was looking for solutions to
some of the problems that packet-switching solves. J.C.R. Licklider, head of two ARPA
departments for a time, had formed the jokingly named “Intergalactic Computer Group” in
the early 1960s. In 1963 he wrote a memo to its members pleading for standardization
among the various computer systems so they could easily communicate data between them,
a memo that spurred on the search for and implementation of ways to link computers
together. In 1965 (after he left ARPA) Licklider wrote the book “Libraries of the
Future”, which also hinted at the Internet and World Wide Web of the future; Licklider
said that “the concept of a ‘desk’ may have changed from passive to active: a desk may
be primarily a display-and-control station in a telecommunication-telecomputation
system - and its most vital part may be the cable (‘umbilical cord’) that connects it
[into the] net [to obtain] everyday business, industrial, government, and professional
information, and perhaps, also to news, entertainment, and education.”

On September 2, 1969, UCLA professor Len Kleinrock, along with graduate students
Stephen Crocker and Vinton Cerf, sent the first test data between two ARPA computers in
a system that would eventually become the Internet. These packet-switching concepts are
the fundamental basis of the Internet, defining how the Internet uses packet-switching,
though it would be several years before the TCP/IP protocols we now use would be
developed. Note that TCP/IP and the Internet were not themselves designed to survive
nuclear attack or other security issues like that. Instead, the later developers of
TCP/IP needed their network to have lots of nice properties, and the packet-switching
concept created by Baran (which was developed to be survivable) turned out to have the
properties they needed.

1964: Word Processing
The first “word processor” was IBM’s product MT/ST (Magnetic Tape/Selectric
Typewriter), which combined the features of the Selectric typewriter with a magnetic
tape drive. For the first time, typed material could be edited without having to retype
the whole text or chop up a coded copy. Later, in 1972, this would be morphed into a
word processing system we would recognize today.

1964: The Mouse
The mouse was invented in 1964 by Douglas C. Engelbart at SRI, using funding from the
U.S. government’s ARPA program [Naughton 2000, 81]. Although this could be viewed as a
hardware innovation, it isn’t much of a hardware innovation (it’s nothing more than an
upside-down trackball). The true innovations were in the user interface approaches that
use the mouse, which are entirely a software innovation. It was patented, though this
never resulted in much money for the inventor.

1965: Semaphores
E. W. Dijkstra defined semaphores for coordinating multiple processes. The term derives
from railroad signals, which in a similar way coordinate trains on railroad tracks.
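
A minimal sketch of the idea using Python’s standard threading module (the railroad
analogy made literal): a semaphore initialized to two lets at most two trains on the
track at once, while the rest wait.

    import threading, time

    track = threading.Semaphore(2)           # two slots on the track

    def train(name):
        with track:                           # P operation (wait/acquire)
            print(name, "is on the track")
            time.sleep(0.1)
        # leaving the with-block is the V operation (signal/release)

    trains = [threading.Thread(target=train, args=("train-%d" % i,))
              for i in range(5)]
    for t in trains:
        t.start()
    for t in trains:
        t.join()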

1965: Hierarchical Directories, Program Names as Commands (Multics)
The Multics project spurred several innovations. Multics was the first operating system
to sport hierarchical directories, as described in a 1965 paper by Daley and Neumann.
Multics was also the first operating system where, in an innovation developed by Louis
Pouzin, what you type at command level is the name of a program to run. This caused
related innovations like working directories and a shell. In earlier systems, like
CTSS, adding a command required recompiling; to run your own program you had to execute
a system command that then loaded and ran the program. Louis Pouzin implemented a very
limited form of this idea on CTSS as “RUNCOM”, but the full approach was implemented on
Multics with his help. Although fewer ordinary users use a command line interface
today, these are still important for many programmers. The Multicians.org site has more
information on Multics features.

1965: Unification
J.A. Robinson developed the concept of “unification”. This concept - and algorithms
that implement it - became the basis of logic programming.
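
A minimal sketch of what a unification algorithm does, in Python: terms are tuples,
strings beginning with an uppercase letter are variables, and the result is the
substitution that makes two terms equal. (This sketch omits the occurs check that real
logic-programming systems need.)

    # Terms are tuples such as ("f", "X", "a"); strings starting with an
    # uppercase letter are variables. Returns a substitution dict, or
    # None if the terms cannot be unified.
    def is_var(term):
        return isinstance(term, str) and term[:1].isupper()

    def walk(term, subst):
        while is_var(term) and term in subst:
            term = subst[term]
        return term

    def unify(a, b, subst=None):
        subst = dict(subst or {})
        a, b = walk(a, subst), walk(b, subst)
        if a == b:
            return subst
        if is_var(a):
            subst[a] = b
            return subst
        if is_var(b):
            subst[b] = a
            return subst
        if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
            for x, y in zip(a, b):
                subst = unify(x, y, subst)
                if subst is None:
                    return None
            return subst
        return None

    # f(X, a) unifies with f(b, Y) under the substitution {X: b, Y: a}
    print(unify(("f", "X", "a"), ("f", "b", "Y")))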

1966: Structured Programming
Böhm and Jacopini defined the fundamentals of “structured programming”, which showed
that programs could be created using a limited set of instructions (looping,
conditional, and simple sequence) - thus showing that the “goto” statement was actually
not essential. Edsger Dijkstra’s 1968 letter “GO TO Statement Considered Harmful”
popularized the use of this approach, claiming that the “goto” statement produced code
that was difficult to maintain.
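
The claim is easy to illustrate: control flow that older code expressed with goto
labels can be written with just those three constructs, as in this small Python
example (an illustration only, chosen for brevity):

    # Sum positive inputs, stopping at the first zero - a flow once
    # written with goto labels, built here from the three structured
    # constructs alone.
    def sum_until_zero(values):
        total = 0                 # simple sequence
        for v in values:          # looping
            if v == 0:            # conditional
                break
            if v > 0:
                total += v
        return total

    print(sum_until_zero([3, -1, 4, 0, 99]))  # prints 7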

1966: Spelling Checker
Les Earnest of Stanford developed the first spelling checker circa 1966. He later
improved it around 1971 and this version quickly spread (via the ARPAnet) throughout
the world. Earnest noted that 1970s participants on the ARPAnet “found that both
programs and data migrated around the net rather quickly, to the benefit of all” - an
early note of the amplifying effect of large networks on OSS/FS development.

1966: Pseudo-Code (p-Code) Machine (in BCPL)
In computer programming, “a pseudo-code or p-code machine is a specification of a CPU
whose instructions are expected to be executed in software rather than in hardware
(i.e., interpreted).” Basic Combined Programming Language (BCPL) is a computer
programming language designed by Martin Richards of the University of Cambridge in
1966. He developed a way to make it unusually portable, by splitting the compiler into
two parts: a compiler into an intermediate pseudo-code (which he called O-code), and a
back-end that translated that into the actual machine code. Since the intermediate code
could be exchanged between arbitrary machines, it enabled portability. Later Pascal
implementations used this approach, calling it p-code and popularizing the technique.
Java and later C# are based on this fundamental approach, which enables compiled code
to be sent to different computer architectures. Even many text adventure games have
been built with this approach, most famously the Z-machine used to implement many
Infocom games (such as Zork). BCPL also significantly influenced the C programming
language, including its use of curly brackets {...}.
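
A toy p-code machine in Python shows the shape of the idea (the instruction set is
invented for illustration; O-code and later p-code were of course far richer): the
front end emits a portable instruction list, and only this small loop must be rewritten
for each real machine.

    # Each instruction is a tuple; the front-end compiler would emit
    # this portable list, and only this loop is rewritten per machine.
    def run(program):
        stack, pc = [], 0
        while pc < len(program):
            op, *arg = program[pc]
            if op == "PUSH":
                stack.append(arg[0])
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == "PRINT":
                print(stack.pop())
            pc += 1

    # portable "object code" for: print((2 + 3) * 10)
    run([("PUSH", 2), ("PUSH", 3), ("ADD",),
         ("PUSH", 10), ("MUL",), ("PRINT",)])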

1967: Object-Oriented Programming
Object-oriented (OO) programming was introduced to the world by the Norwegian Computing
Centre’s Ole-Johan Dahl and Kristen Nygaard when they released Simula 67. Simula 67
introduced constructs that much later became common in computer programming: objects,
classes, virtual procedures, and inheritance. OO programming was later popularized in
Smalltalk-80, and still later C++, Java, and C#. This approach proved especially
invaluable later when graphical user interfaces became widely used.
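
The constructs Simula 67 introduced map directly onto modern languages; a small Python
sketch (illustrative names only) showing objects, classes, inheritance, and a virtual
procedure (a method resolved by the object’s run-time class):

    class Shape:                        # a class
        def area(self):                 # a virtual procedure
            raise NotImplementedError

    class Rectangle(Shape):             # inheritance
        def __init__(self, w, h):
            self.w, self.h = w, h       # per-object state
        def area(self):
            return self.w * self.h

    class Circle(Shape):
        def __init__(self, r):
            self.r = r
        def area(self):
            return 3.14159 * self.r ** 2

    for shape in (Rectangle(2, 3), Circle(1)):   # objects
        print(shape.area())             # dispatch picks the right method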

1967: Separating Text Content from Format
The first formatted texts manipulated by computer had embedded codes that described how
to format the document (“font size X”, “center this text”). In contrast, in the late
1960s, people began to use codes that described the meaning of the text (such as “new
paragraph” or “title”), with separate information on how to format it. This had many
advantages, such as allowing specialists to devise formats and easing searching, and
influenced later technologies such as SGML, HTML, and XML. Although it is difficult to
identify a specific time for this idea, many credit the use of this approach (sometimes
called “generic coding”) to a presentation made by William Tunnicliffe, chairman of the
Graphic Communications Association (GCA) Composition Committee, during a meeting at the
Canadian Government Printing Office in September 1967 (his topic was the separation of
the information content of documents from their format).

1968: The Graphical User Interface (GUI)
Douglas C. Engelbart gave a 90-minute, staged public demonstration of a networked
computer system at the Augmentation Research Center; this was the first public
appearance of the mouse, windows, hypermedia with object linking and addressing, and
video teleconferencing. These are the innovations that are fundamental to the graphical
user interface (“GUI”). This kind of interface made it much easier to implement a
driving idea of J.C.R. “Lick” Licklider, who envisioned a human-computer symbiosis. One
of Licklider’s central ideas was that “a close coupling between humans and computers
would result in better decision-making. In this novel partnership, computers would do
what they excelled at - calculations, routine operations, and the rest - thereby
freeing humans to do what they in turn did best. The human-computer system would thus
be greater than the sum of its parts.” (Summary from [Naughton, page 71]; see
“Man-Computer Symbiosis”, IRE Transactions on Human Factors in Electronics, vol. HFE-1,
March 1960, pp. 4-11.)

1968: Regular Expressions
Ken Thompson published in the Communications of the ACM, June 1968, the paper “Regular
Expression Search Algorithm,” the first known computational use of regular expressions.
Regular expressions had been studied earlier in mathematics, based on work by Stephen
Kleene. Thompson later embedded this capability in the text editor ed to implement a
simple way to define text search patterns. ed’s command ‘g/regular expression/p’ was so
useful that a separate utility, grep, was created to print every line in a file that
matched the pattern defined by the regular expression. Later, many libraries included
this capability, and the widely-used Perl language makes regular expressions a
fundamental underpinning for the language. See Jeffrey E.F. Friedl’s Mastering Regular
Expressions, 1998, pp. 60-62, for more about this history.
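
The grep idea fits in a few lines of Python using its standard re library - print every
line of a file matching a pattern. (A sketch only; Thompson’s original algorithm
compiled regular expressions on the fly, which this does not attempt.)

    import re, sys

    def grep(pattern, path):
        regex = re.compile(pattern)
        with open(path) as f:
            for line in f:              # print every matching line
                if regex.search(line):
                    print(line, end="")

    if __name__ == "__main__":          # e.g.: python grep.py "error" log.txt
        grep(sys.argv[1], sys.argv[2])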

1969-1970: Standard Generalized Markup Language (SGML)
In 1969 Charles F. Goldfarb, Ed Mosher, and Ray Lorie developed what they called a
“Text Description Language” to enable integrating a text editing application, an
information retrieval system, and a page composition program. The documents had to be
selectable by query from a repository, revised with the text editor, and returned to
the database or rendered by the composition program. This was an extremely advanced set
of capabilities for its time, and one that simple markup approaches did not support
well. They solved this problem by creating a general approach to identifying different
types of text, supporting formally-defined document types, and creating an explicit
nested element structure. Their approach was first mentioned in a 1970 paper, renamed
after their initials (GML) in 1971, and use began in 1973. GML became the basis for the
Standard Generalized Markup Language (SGML), ISO Standard 8879. HTML, the basis of the
World Wide Web, is an application of SGML, and the widely-used XML (another critically
important technology) is a simplified form of SGML. For more information see The Roots
of SGML -- A Personal Recollection and A Brief History of the Development of SGML.
These standard markup languages have been critical for supporting standard interchange
of data that supports a wide variety of display devices and querying from a vast store
of documents.

1970: Relational Model and Algebra (SQL)
E.F. Codd introduced the relational model and relational algebra in a famous article in
the Communications of the ACM, June 1970. This is the theoretical basis for relational
database systems and their query language, SQL. The first commercial relational
database, the Multics Relational Data Store (MRDS), was released in June 1976.
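
A small sketch of the model using Python’s built-in sqlite3 module (the table and data
are invented for illustration): data lives in relations, and queries are declarative
expressions over them - here, relational selection and projection written in SQL -
rather than hand-coded record traversals.

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE employee (name TEXT, dept TEXT)")
    db.executemany("INSERT INTO employee VALUES (?, ?)",
                   [("Ada", "math"), ("Grace", "compilers"),
                    ("Edsger", "math")])

    # relational selection (WHERE) and projection (SELECT name)
    for (name,) in db.execute("SELECT name FROM employee WHERE dept = 'math'"):
        print(name)                     # prints Ada, then Edsger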

1971: Distributed Network Email
Richard Watson at the Stanford Research Institute suggested that a system be developed
for transferring mail from one computer to another via the ARPANET network. Ray
Tomlinson of Bolt, Beranek and Newman (BBN) implemented the first email program to send
messages across a distributed network, derived from an intra-machine email program and
a file-transfer program. This quickly became the ARPANET’s most popular and influential
service. Note that Tomlinson defined the “@” convention for email addresses. It isn’t
clear when single-computer email was developed; it’s known that MIT’s CTSS computer had
a message feature in 1965 [Abbate 1999, page 109]. However, email that can span
computers is far more powerful than email limited to a single computer. In 1973 the
basic Internet protocol for sending email was formed (though RFC 630 wasn’t released
until April 1974), and in 1975 the Internet mail headers were first officially defined
[Naughton 2000, 149]. In 1975, John Vittal released the email program MSG, the first
email program with an “answer” (reply) command [Naughton 2000, 149].

1972: Modularity Criteria
David Parnas published a definition and justification of modularity via information
hiding.

1972: Screen-Oriented Word Processing
Lexitron and Linolex developed the first word processing system that included video
display screens and tape cassettes for storage; with the screen, text could be entered
and corrected without having to produce a hard copy. Printing could be delayed until
the writer was satisfied with the material. It can be argued that this was the first
“word processor” of the kind we use today (see a brief history of word processing for
more information). Other word processors were developed afterwards. In 1979, Seymour
Rubenstein and Rob Barnaby released “WordStar”, the first commercially successful word
processing software program produced for microcomputers, but this was simply a
re-implementation of a previous concept. In March of 1980, SSI*WP (the predecessor of
Word Perfect) was released.

1972: Pipes
Pipes are “pipelines” of commands, allowing programs to be easily “hooked together”.
Pipes were originally developed for Unix and widely implemented on other operating
systems (including all Unix-like systems and MS-DOS/Windows). M. D. McIlroy insisted on
their original implementation in Unix; after a few months their syntax was changed to
today’s syntax. Redirection of information pre-existed this point (Dartmouth’s system
supported redirection, as did Multics), but it was only in 1972 that pipes were
implemented in a way that didn’t require programs to specially support them and
permitted programs to be rapidly connected together.
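
In the shell, a pipeline is a one-liner such as “grep -i error log.txt | sort”; the
same hookup can be assembled programmatically. A sketch using Python’s standard
subprocess module (log.txt is a hypothetical input file):

    import subprocess

    grep = subprocess.Popen(["grep", "-i", "error", "log.txt"],
                            stdout=subprocess.PIPE)
    sort = subprocess.Popen(["sort"], stdin=grep.stdout,
                            stdout=subprocess.PIPE)
    grep.stdout.close()                  # grep gets SIGPIPE if sort exits early
    output = sort.communicate()[0]       # the sorted, matching lines
    print(output.decode(), end="")

Neither program was written to know about the other; the operating system connects
them, which is exactly the property the 1972 implementation added.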

1972: B-Tree
Rudolf Bayer and Edward M. McCreight published the seminal paper on B-trees, a critical
data structure widely used for handling large datasets.

1972, 1976: Portable Operating Systems (OS6, Unix)
By this date high-level languages had been used for many years to reduce development
time and increase application portability between different computers. But many
believed entire operating systems could not be practically ported in the same way,
since operating systems needed to control many low-level components. This was a
problem, since it was often difficult to port applications to different operating
systems. Significant portions of operating systems had been developed using high-level
languages (Burroughs wrote much of the B5000’s operating system in a dialect of Algol,
and later much of Multics was written in PL/I), but both were tied to specific
hardware. In 1972 J.E. Stoy and C. Strachey discussed OS6, an experimental operating
system for a small computer that was to be portable. In 1973 the fledgling Unix
operating system was rewritten in C, a high-level programming language that had just
been developed, though at first the primary goal was not general machine portability of
the entire operating system. In 1976-1977 the Unix system was modified further to be
portable, and the Unix system did not limit itself to being small - it intentionally
included significant capabilities such as a hierarchical filesystem and multiple
simultaneous users. This allowed computer hardware to advance more rapidly, since it
was no longer necessary to rewrite an operating system when a new hardware idea or
approach was developed.

1972: Internetworking Using Datagrams (leading to the Internet’s TCP/IP)
The Cyclades project began in 1972 as an experimental network project funded by the
French government. It demonstrated that computer networks could be interconnected
(“internetworked”) by the simple mechanism of transferring data packets (datagrams),
instead of trying to build session connections or trying to create highly reliable
“intelligent” networks or “intelligent” systems which connected the networks. Removing
the requirement for “intelligence” when trying to hook networks together had great
benefits: it made systems less dependent on a specific medium or technology, and it
also made systems less dependent on central authorities to administer them.

At the time, networks were built and refined for a particular medium, making it
difficult to make them interoperate. For example, the ARPANET protocols (NCP) depended
on highly reliable networks, an assumption that broke down for radio-based systems
(which used an incompatible set of protocols). NCP also assumed that it was networking
specific computers, not networks of networks. The experience of Xerox PARC’s local
system (PARC Universal Packet, or PUP), based on Metcalfe’s 1973 dissertation, also
showed that “intelligence” in the network was unnecessary - in their system,
“subtracting all the hosts would leave little more than wire.”

In June 1973, Vinton Cerf organized a seminar at Stanford University to discuss the
redesign of the Internet, where it was agreed to emphasize host-based approaches to
internetworking. In May 1974, Vinton Cerf and Robert E. Kahn published “A Protocol for
Packet Network Interconnection,” which put forward their ideas of using gateways
between networks and packets that would be encapsulated by the transmitting host. This
approach would later be part of the Internet.

In 1977, Xerox PARC’s PUP was designed to support multiple layers of network protocols.
This approach resolved a key problem of Vinton Cerf’s Internet design group. Early
attempts to design the Internet tried to create a single protocol, but this required
too much duplication of effort as both network components and hosts tried to perform
many of the same functions. By January 1978, Vint Cerf, Jon Postel, and Danny Cohen
developed a design for the Internet, using two layered protocols: a lower-level
internetwork protocol (IP) which did not require “intelligence” in the network, and a
higher-level host-to-host transmission control protocol (TCP) to provide reliability
and sequencing where necessary (without requiring network components to implement
TCP). This was combined with the earlier approach of using gateways to interconnect
networks. By 1983, the ARPANET had switched to TCP/IP. This layering concept was later
expanded by ISO into the “OSI model,” a model still widely used for describing network
protocols. Over the years, TCP/IP was refined to what it is today.

In the early 1980s, DARPA sponsored or encouraged the development of TCP/IP
implementations for many systems. BSD implemented TCP/IP as open source software, which
led to its being available to many. After TCP/IP had become wildly popular, Microsoft
added support for TCP/IP to Windows (originally by licensing TCP/IP code from Spider
Systems as well as using BSD-developed code; Microsoft later rewrote portions).

As an aside, there are two different misconceptions about the Internet and TCP/IP that
should be clarified. Some mistakenly claim that the Internet and TCP/IP were
specifically created to resist nuclear attacks; this is absolutely not true, since its
parent the ARPANET was specifically created to share large systems. Yet it’s also a
mistake to claim that there was no connection between the Internet and survivable
networks; the Internet TCP/IP technology is an internetwork of data packets, and as
noted earlier, packet-switching of data packets was created to be survivable in case of
disaster.

In 2005, Vinton Cerf and Robert Kahn were awarded the prestigious Turing Award for
their role in creating the Internet’s basic components, particularly the TCP/IP
protocols. One of the reasons given for the adoption of the TCP/IP protocols was that
they were unencumbered by patent claims; “Dr. Cerf said part of the reason their
protocols took hold quickly and widely was that he and Dr. Kahn made no intellectual
property claims to their invention.”

1973: Font Generation Algorithms
There have been many efforts to create fonts using mathematical techniques; Felice
Feliciano worked on doing so around 1460. However, these older attempts generally
produced ugly results. In 1973-1974 Peter Karow developed Ikarus, the first program to
digitally generate fonts at arbitrary resolution. In 1978, Donald Knuth revealed his
program Metafont, which generated fonts as well (this work went hand-in-hand with his
work on the open source typesetting program TeX, which is still widely used for
producing typeset papers with significant mathematical content).
Algorithmically-generated fonts were fundamental to the Type 1 fonts of PostScript and
to TrueType fonts as well. Font generation algorithms made it possible for people to
vary their font types and sizes to whatever they wanted, and for displays and printers
to achieve the best possible presentation of a font. Today, most fonts displayed on
screens and printers are generated by some font generation algorithm.

1974: Monitor

Hoare (1974) and Brinch Hansen (1975) proposed the monitor, a higher-level synchronization primitive; it’s now built into several programming languages (such as Java and Ada).
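
To make the idea concrete, here is a minimal monitor-style sketch in Python (the names BoundedCounter, increment, and decrement are invented for illustration; Hoare’s and Brinch Hansen’s formulations differ in detail from what any one language provides). All access to the shared state goes through operations that hold the monitor’s lock, and a condition variable lets callers wait and be signalled:

    import threading

    class BoundedCounter:
        """A monitor: shared state plus operations guarded by one lock."""
        def __init__(self, limit):
            self._lock = threading.Lock()
            self._not_full = threading.Condition(self._lock)
            self._count = 0
            self._limit = limit

        def increment(self):
            with self._not_full:           # enter the monitor
                while self._count >= self._limit:
                    self._not_full.wait()  # sleep until signalled
                self._count += 1

        def decrement(self):
            with self._not_full:           # enter the monitor
                if self._count > 0:
                    self._count -= 1
                self._not_full.notify()    # wake one waiting caller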

1975: Communicating Sequential Processes (CSP)

C. A. R. Hoare published the concept of Communicating Sequential Processes (CSP) in “Parallel Programming: an Axiomatic Approach” (Computer Languages, vol 1, no 2, June 1975, pp. 151-160). This is a critically important approach for reasoning about parallel processes.
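
As a rough illustration of the CSP style, here is a Python sketch (all names invented) of two sequential processes that share no state and interact only by passing messages over a channel. Note that Python’s queue.Queue is buffered, unlike CSP’s synchronous rendezvous, so this only approximates the model:

    import threading, queue

    def producer(channel):
        for i in range(5):
            channel.put(i)          # roughly "channel ! i" in CSP notation
        channel.put(None)           # sentinel: no more messages

    def consumer(channel):
        while True:
            item = channel.get()    # roughly "channel ? item"
            if item is None:
                break
            print("received", item)

    channel = queue.Queue()
    threading.Thread(target=producer, args=(channel,)).start()
    consumer(channel)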

1976: Diffie-Hellman Security Algorithm

Whitfield Diffie and Martin Hellman published their public key algorithm, the first such algorithm the public could read about. According to the United Kingdom’s GCHQ, M. J. Williamson had invented this algorithm (or something very similar to it) in 1974, but it was classified, and I’m only counting those discoveries made available to the public. This algorithm allowed users to create a secure communication channel without having to meet first to exchange a secret key.
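
The core of the algorithm fits in a few lines; here is a toy Python exchange with deliberately tiny numbers (illustrative only; real deployments need very large primes and a vetted cryptographic library):

    p, g = 23, 5                 # public: a prime modulus and a generator

    a = 6                        # Alice's secret
    b = 15                       # Bob's secret

    A = pow(g, a, p)             # Alice sends g^a mod p over the open channel
    B = pow(g, b, p)             # Bob sends g^b mod p

    alice_key = pow(B, a, p)     # (g^b)^a mod p
    bob_key   = pow(A, b, p)     # (g^a)^b mod p
    assert alice_key == bob_key  # both sides derive the same shared secret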

1978: RSA Security Algorithm

Rivest, Shamir, and Adleman published their seminal paper describing the RSA algorithm, a critical basis for security. The RSA algorithm permits authentication or encryption without having to previously exchange a secret shared key, greatly simplifying security. It’s amusing to note that this paper also introduced “Alice” and “Bob”, fictitious characters who are trying to securely communicate; Alice and Bob have been a standard part of security notation ever since the RSA paper. According to the United Kingdom’s GCHQ, Clifford Cocks had invented the RSA algorithm in 1973, but it was classified.
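
Here is a toy RSA keypair in Python with textbook-sized primes (illustrative only; real RSA needs large random primes and padding schemes such as OAEP):

    p, q = 61, 53
    n = p * q                      # public modulus (3233)
    phi = (p - 1) * (q - 1)        # 3120
    e = 17                         # public exponent, coprime to phi
    d = pow(e, -1, phi)            # private exponent (2753); Python 3.8+

    message = 65
    ciphertext = pow(message, e, n)           # encrypt with the public key
    assert pow(ciphertext, d, n) == message   # decrypt with the private key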

1978: Spreadsheet

Dan Bricklin and Bob Frankston invented the spreadsheet application (as implemented in their product, VisiCalc). Bricklin and Frankston have made information on VisiCalc’s history available on the web.

1978: Lamport Clocks

Leslie Lamport published “Time, Clocks, and the Ordering of Events in a Distributed System” (Communications of the ACM, vol 21, no 7, July 1978, pp. 558-565). This is an important approach for ordering events in a distributed system.
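
The paper’s rules are short enough to sketch; here is a minimal Python rendering (class and method names invented): every process ticks its counter on each local event, stamps outgoing messages, and on receipt advances its counter past the stamp:

    class LamportClock:
        def __init__(self):
            self.time = 0

        def local_event(self):
            self.time += 1          # tick on every local event
            return self.time

        def send(self):
            self.time += 1          # the timestamp travels with the message
            return self.time

        def receive(self, msg_time):
            # jump past the sender's clock, preserving causal order
            self.time = max(self.time, msg_time) + 1
            return self.time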

1979: Distributed Newsgroups (USENET)

Tom Truscott and Jim Ellis (Duke University, Durham, NC), along with Steve Bellovin (University of North Carolina, Chapel Hill), set up a system for distributing electronic newsletters, originally between Duke and the University of North Carolina using dial-up lines and the UUCP (Unix-to-Unix copy) program. This was the beginning of the informal network USENET, supporting online forums on a variety of topics, and it took off once USENET was bridged with the ARPANET. ARPANET already had discussion groups (basically mailing lists), but the owner of an ARPANET discussion group determined who received the information; in contrast, everyone could read USENET postings (a more democratic and scalable approach) [Naughton 2000, 177-179].

1980: Model View Controller (MVC)

The “Model, View, Controller” (MVC) triad of classes for developing graphical user interfaces (GUIs) was first introduced as part of the Smalltalk-80 language at Xerox PARC. This work was overseen by Alan Kay, but it appears that many people were actually involved in developing the MVC concept, including Trygve Reenskaug, Adele Goldberg, Steve Althoff, Dan Ingalls, and possibly Larry Tesler. Krasner and Pope later documented the approach extensively and described it as a pattern so it could be more easily used elsewhere. This doesn’t mean that all GUIs have been developed using MVC; indeed, in 1997 and 1998, Alan Kay’s team moved their Smalltalk graphic development efforts and research to another model based on display trees called Morphic, which they believe obsoletes MVC. However, this design pattern has since been widely used to implement flexible GUIs, and has influenced later thinking about how to develop GUIs.
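
A minimal sketch of the triad in Python may help (all names invented; Smalltalk-80’s version is richer, with a general dependency/update mechanism): the model holds state and notifies views of changes, the view renders, and the controller interprets input:

    class Model:
        def __init__(self):
            self.value, self.observers = 0, []
        def set(self, value):
            self.value = value
            for view in self.observers:     # notify dependents of the change
                view.update(self)

    class View:
        def update(self, model):
            print("display:", model.value)  # render the model's state

    class Controller:
        def __init__(self, model):
            self.model = model
        def handle_input(self, text):       # interpret user input
            self.model.set(int(text))

    model = Model()
    model.observers.append(View())
    Controller(model).handle_input("42")    # prints "display: 42"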

1981?: Remote Procedure Call (RPC)

An RPC (Remote Procedure Call) allows one program to request a service from another program, potentially located in another computer, without having to understand network details. The requestor usually waits until the results are returned, and local calls can be optimized (e.g., by using the same address space). This calling is facilitated through an “interface definition language” (IDL) to define the interface. It’s difficult to trace this innovation back in time; the earliest I’ve identified this concept is Xerox’s Courier RPC protocols, but I believe the concept is much older than the date shown here. Sun’s RPC (later an RFC) was derived from this, and later on DCE, CORBA, component programming (COM, DCOM), and web application access (SOAP/WSDL, XML-RPC) all derive from it.
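
Python’s standard library happens to include a small XML-RPC implementation, which makes the idea easy to demonstrate (the function add is invented for illustration). The caller invokes what looks like an ordinary local function, and the library handles the network details:

    # Server half (run this first; it blocks, serving requests):
    from xmlrpc.server import SimpleXMLRPCServer

    def add(x, y):
        return x + y

    server = SimpleXMLRPCServer(("localhost", 8000))
    server.register_function(add)     # expose "add" to remote callers
    server.serve_forever()

    # Client half (run in a separate process):
    # from xmlrpc.client import ServerProxy
    # proxy = ServerProxy("http://localhost:8000/")
    # print(proxy.add(2, 3))          # a remote call that reads like a local one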

1982: Computer Virus

A computer virus is a program that can ‘infect’ other programs by modifying them to include a possibly evolved copy of itself. While not a positive development, this was certainly an innovation. The program “Elk Cloner” is typically identified as the first “in the wild” computer virus. Elk Cloner was written in 1982 by junior high school student Richard Skrenta as a practical joke. It attached to the Apple DOS 3.3 operating system, and spread through floppy disks that were inserted afterwards. The history of computer viruses is more complicated, and some consider earlier programs named Creeper, Rabbit, or Animal as the first virus. In particular, in 1975 John Walker released “Animal” on Univac systems with a PERVADE subroutine that caused copies of Animal to reappear elsewhere. But since Walker was careful to only use this to update his own software, it’s not clear that it fits the definition above, and few (other than Walker) understood the dangers of the idea. Fred Cohen later wrote academic works studying computer viruses.

1984: Distributed Naming (DNS)

The “domain name system” (DNS) was invented, essentially the first massively distributed database, enabling the Internet to scale while allowing users to use human-readable names of computers. Every time you type in a host name such as “www.dwheeler.com”, you’re relying on DNS to translate that name to a numeric address. Some theoretical work had been done before on massive database distribution, but not as a practical implementation on this scale, and DNS innovated in several ways to make its implementation practical (e.g., by not demanding complete network-wide synchronicity, by distributing data maintenance as well as storage, and by distributing “reverse lookups” through a clever reflective scheme).
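
From a program’s point of view, the whole distributed database hides behind one call; for example, in Python (using the host name mentioned above):

    import socket

    # Ask the resolver, and ultimately the DNS hierarchy, to translate
    # a human-readable name into a numeric address.
    print(socket.gethostbyname("www.dwheeler.com"))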

1986: Lockless Version Management (CVS)

Dick Grune released to the public the Concurrent Versions System (CVS), the first lockless version management system for software development. In 1984-1985, Grune wanted to cooperate with two of his students when working on a C compiler. However, existing version management systems did not support cooperation well, because they all required that files be “locked” before they could be edited, and once locked only one person could edit the file. While standing at the university bus stop, waiting for the bus home in bad autumn weather, he created an approach for supporting distributed software development that did not require project-wide locking. After initial development, CVS was publicly posted by Dick Grune to the newsgroup mod.sources on 1986-07-03 in volume 6 issue 40 (and also to comp.sources.unix) as source code (in shell scripts). CVS has since been re-implemented, but its basic ideas have influenced all later version management systems. The initial CVS release did not formally state a license (a common practice at the time), but in keeping with the common understanding of the time, Mr. Grune intended for it to be used, modified, and redistributed; he has specifically stated that he “certainly intended it to be a gift to the international community... for everybody to use at their discretion.” Thus, it appears that the initial implementation of CVS was intended to be open source software / free software (OSS/FS) or something closely akin to it. Certainly CVS has been important to OSS/FS since that time; while OSS/FS development can be performed without it, CVS’s ideas were a key enabler for many OSS/FS projects, and are widely used by proprietary projects as well. CVS’s ideas have been a key enabler in many projects for scaling software development to much larger and more geographically distributed development teams.

1989: Distributed Hypertext via Simple Mechanisms (World Wide Web)

The World Wide Web (WWW)’s Internet protocol (HTTP), language (HTML), and addressing scheme (URL/URIs) were created by Tim Berners-Lee. The idea of hypertext had existed before, and Nelson’s Xanadu had tried to implement a distributed scheme, but Berners-Lee developed a new approach for implementing a distributed hypertext system. He combined a simple client-server protocol, markup language, and addressing scheme in a way that was new, powerful, and easy to implement. Each of the pieces had existed in some form before, but the combination was obvious only in hindsight. Berners-Lee’s original proposal was dated March 1989, and he first implemented the approach in 1990.
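
The protocol’s simplicity was a large part of its power; the following Python sketch speaks a minimal HTTP/1.0 request over a raw socket (using www.dwheeler.com purely as an example host):

    import socket

    sock = socket.create_connection(("www.dwheeler.com", 80))
    sock.sendall(b"GET / HTTP/1.0\r\nHost: www.dwheeler.com\r\n\r\n")
    response = b""
    while True:
        chunk = sock.recv(4096)
        if not chunk:            # server closes the connection when done
            break
        response += chunk
    sock.close()
    print(response.split(b"\r\n")[0])  # status line, e.g. b'HTTP/1.0 200 OK'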

1991: Design Patterns

Erich Gamma published his PhD thesis in 1991; it was the first serious examination of software design patterns as a subject of study, including a number of specific design patterns. In 1995 Gamma, Helm, Johnson, and Vlissides (the “Gang of Four”) published “Design Patterns,” which widely popularized the idea. The concept of “design patterns” is old in other fields, specific patterns had been in use for some time, and algorithms had already been collected for some time. Some notion of patterns is suggested in earlier works (see the references in both). However, these works crystallized software design patterns in a way that was immediately useful and had not been done before. This has spawned other kinds of thinking, such as trying to identify anti-patterns (“solutions” whose negative consequences exceed their benefits; see the Antipatterns website, including information on development antipatterns).

1992: Secure Mobile Code (Java and Safe-Tcl)

A system supporting secure mobile code can automatically download potentially malicious code from a remote site and safely run it on a local computer. Sun built its new programming language, Oak (later called Java), in 1990-1992, and demonstrated it in September 1992 as part of the Green project’s demonstration of its *7 PDA. Oak combined an interpreter (preventing certain illegal actions at run-time) and a bytecode verifier (which examines the mobile code for certain properties before running the program, speeding later execution). Originally intended for the “set-top” market, Oak was modified to work with the World Wide Web and re-launched (with much fanfare) as Java in 1995. Nathaniel Borenstein and Marshall Rose implemented a prototype of Safe-Tcl in 1992; it was first used to implement “active email messages.” An expanded version of Safe-Tcl was incorporated into regular Tcl in April 1996 (Tcl 7.5).

1993: Refactoring

Refactoring is the process of changing a software system in a way that does not alter its external behavior but improves its internal structure. It’s sometimes described as “improving the design after it’s written”, and could be viewed as design patterns in the small. Specific refactorings and the general notion of restructuring programs were known much longer, of course, but creating and studying a set of source code refactorings was essentially a new idea. This date is based on William F. Opdyke’s PhD dissertation, the first lengthy discussion of it (including a set of standard refactorings) I’ve found. Martin Fowler later published his book “Refactoring”, which popularized this idea.
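
As a tiny illustration, here is one classic refactoring (“Extract Method”, here extracting a function) sketched in Python with invented names; the external behavior is identical before and after, but the structure improves:

    # Before: one function mixes calculation with formatting.
    def print_invoice_v1(items):
        total = 0
        for price, quantity in items:
            total += price * quantity
        print("Total: $%.2f" % total)

    # After: the calculation is extracted into its own well-named function.
    def invoice_total(items):
        return sum(price * quantity for price, quantity in items)

    def print_invoice_v2(items):
        print("Total: $%.2f" % invoice_total(items))

    items = [(9.99, 2), (5.00, 1)]
    print_invoice_v1(items)    # both print "Total: $24.98"
    print_invoice_v2(items)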

1994: Web-Crawling Search Engines

The World Wide Web Worm (WWWW) indexed 110,000 web pages by crawling along hypertext links and providing a central place to make search requests; this is one of the first (if not the first) web search engines. Text search engines far precede this, of course, so it can be easily argued that this is simply the reapplication of an old idea. However, text search engines before this time assumed that they had all the information locally available and would know when any content changed. In contrast, web crawlers have to locate new pages by crawling through links (selectively finding the “important” ones).
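
The crawling idea itself is compact; here is a toy Python crawler (illustrative only; a real crawler needs robots.txt politeness, robust HTML parsing, and a way to rank which links are “important”):

    import re
    from urllib.parse import urljoin
    from urllib.request import urlopen

    def crawl(start_url, limit=10):
        to_visit, seen = [start_url], set()
        while to_visit and len(seen) < limit:
            url = to_visit.pop()
            if url in seen:
                continue
            seen.add(url)
            try:
                html = urlopen(url).read().decode("utf-8", "replace")
            except OSError:
                continue          # unreachable page; move on
            # A real engine would index `html` here before following links.
            for link in re.findall(r'href="([^"]+)"', html):
                to_visit.append(urljoin(url, link))
        return seen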

1996: Content-Based Addressing (rsync)

Content-based addressing (aka content-based storage) calculates cryptographic hashes of data (usually whole files) and uses that value as the “address” of the data. This dramatically saves network bandwidth - applications need only exchange hashes over the network instead of the actual content. The application “rsync” by Andrew Tridgell used it in 1996, and later applications such as BitTorrent and git use it as well. Although bandwidth has dramatically increased over the years, the amount of data we want to send has grown as well - and content-based addressing makes many large data transfers much more efficient. A more formal analysis is in “Content-Based Addressing and Routing: A General Model and its Application” by Antonio Carzaniga, David S. Rosenblum, and Alexander L. Wolf.
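
The essence is that the data’s hash is its name; here is a minimal Python sketch of a content-addressed store (invented names; rsync itself adds rolling checksums to find matching file regions efficiently):

    import hashlib

    store = {}

    def put(data: bytes) -> str:
        address = hashlib.sha256(data).hexdigest()  # content determines address
        store[address] = data
        return address

    addr = put(b"hello, world")
    assert store[addr] == b"hello, world"
    # Two parties can now compare short addresses instead of shipping the
    # data itself; identical content always yields the same address.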

2004: Massively-Parallel MapReduce

In 2004, Google’s Jeffrey Dean and Sanjay Ghemawat revealed MapReduce, a programming model that enables processing and generating large data sets with huge clusters of parallel machines, yet is remarkably easy to program. In this approach, developers “specify a map function that processes a key/value pair to generate a set of intermediate key/value pairs, and a reduce function that merges all intermediate values associated with the same intermediate key.” The developers also specify an input reader, a partition function, a compare function, and an output writer. These are then fed to the MapReduce framework, which executes those definitions on a potentially large distributed computer cluster, handling complications such as computer and network failure. They note that “many real world tasks are expressible in this model”. Programmers without any experience with parallel or distributed systems can, through this model, use large distributed systems to handle large data sets. The basic MapReduce approach has since been implemented in other tools such as Hadoop and Qt Concurrent; Eugene Ciurana has an article demonstrating how to use MapReduce approaches (using Mule). Google’s reworked search engine no longer uses MapReduce, but it is still widely applicable to other projects.
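
The canonical example is word counting; here is a single-process Python sketch of the model (a real framework runs the map and reduce tasks across a cluster and handles partitioning, I/O, and machine failures):

    from itertools import groupby
    from operator import itemgetter

    def map_fn(document):
        for word in document.split():
            yield (word, 1)              # intermediate key/value pairs

    def reduce_fn(word, counts):
        yield (word, sum(counts))        # merge all values for one key

    documents = ["the quick fox", "the lazy dog", "the fox"]
    pairs = sorted(kv for doc in documents for kv in map_fn(doc))  # "shuffle"
    for word, group in groupby(pairs, key=itemgetter(0)):
        for result in reduce_fn(word, (count for _, count in group)):
            print(result)                # e.g. ('the', 3)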


Software Patents
One source that was not helpful for this analysis was software patents. The reason?
Software patents are actually harmful, not helpful, to software innovation, as confirmed
by a myriad of data. Those unfamiliar with software patents may find that shocking.

There are several basic problems with software patents, compared to actual innovation:

   1. almost all truly important innovations in software were never covered by patents,
      so using patents as a primary source would omit almost all of the most important
      software innovations;
   2. as software patentability has increased, the number of software innovations has
      decreased; and
   3. software patents are often granted to cover ideas that are obvious to practitioners
      of the art or have prior art (even though these aren’t supposed to be patented).

There are many reasons most of the most important software innovations were never patented. Historically, software was not patentable, and it’s still not patentable in a vast number of countries (including the EU). Many believe software should never be patentable, and many of them oppose software patents on ethical or moral grounds as well as on pragmatic grounds (and many of them will not apply for patents for these reasons). For more about the many who oppose software patents, you can see the ffii.org site and the League for Programming Freedom, including statements by software vendor Oracle and a list of software luminaries opposed to software patents (including Donald Knuth). Dan Bricklin (inventor of the spreadsheet) explains why introducing
patents to the software industry, about 50 years after the industry began (and after it had
already been flourishing without them), is a mistake and hardship. AutoCAD’s co-author
and Autodesk founder John Walker wrote “Patent Nonsense”, where he states that “Ever
since Autodesk had to pay $25,000 to ‘license’ a patent which claimed the invention of
XOR-draw for screen cursors (the patent was filed years after everybody in computer
graphics was already using that trick), I’ve been convinced that software patents are not
only a terrible idea, but one of the principal threats to the software industry... the
multimedia industry is shuddering at the prospect of paying royalties on every product
they make, because a small company in California has obtained an absurdly broad patent
on concepts that were widely discussed and implemented experimentally more than 20
years earlier.” Forbes’ article “Patently Absurd” also notes the problems of patents, as
does eWeek. One survey of professional programmers found that by a margin of 79.6%
to 8.2%, computer programmers said that granting patents on computer software
impedes, rather than promotes, software development (the remaining 12.2% were
undecided). By 59.2% to 26.5% (2:1), most went even further, saying that software
patents should be abolished outright. Professors Bessen and Maskin, two economists at
the Massachusetts Institute of Technology (MIT), have demonstrated in a report that
introducing patenting into the software economy only has economic usefulness if a
monopoly is the most useful form of software production. This is concerning, because
few believe that a monopoly is truly the most useful (or desirable) form of software
production.
Paul Vick, lead architect for Visual Basic .Net at Microsoft, was required by his employer to file for a patent on an obvious pre-existing idea (the IsNot operator), which the patent office nevertheless granted -- Paul’s posting on software patents states, “I don’t believe software patents are a good idea... software patents generally do much more harm than good. As such, I’d like to see them go away and the US patent office focus on more productive tasks... One of the most unfortunate aspects of the software patent system is that there is a distinct advantage, should you have the money to do so, to try and patent everything under the sun in the hopes that something will stick.... [overwhelming the patent system.] Microsoft has been as much a victim of this as anyone else, and yet we’re right there in there with everyone else, playing the game. It’s become a Mexican standoff, and there’s no good way out at the moment short of a broad consensus to end the game at the legislative level. As far as the specific IsNot patent goes, I will say that at a personal level, I do not feel particularly proud of my involvement in the patent process in this case.”

As patentability has increased, there’s good evidence that the number of software innovations has decreased. Bessen and Maskin also demonstrated a statistical correlation between the spread of patentability in the United States and a decline in innovation in software. In particular, between 1987 and 1994, software patent issuance rose 195%, yet real company-funded R&D fell by 21% in these (software) industries while rising by 25% in industries in general. This paper gives additional evidence that software patents are inversely related to innovation; it’s hard not to notice that as patenting became more common (e.g., 1987 and later), the number of major innovations slowed, and the major innovations were almost never patented anyway. Although these only show correlation and not causality, other data suggest that there is a causal relation. The more recent book
"Patent Failure: How Judges, Bureaucrats, and Lawyers Put Innovators at Risk" by James
Bessen and Michael J. Meurer (Princeton University Press, March 2008) provides more
information about the failures of software patents. Chapter 9 notes, "In Chapter 7, we
noticed that patents on software and especially patents on business methods (which are
largely software patents) stood out as being particularly problematic. These patents had
high rates of litigation and high rates of claim construction review on appeal. This
chapter [argues] that there is, in fact, something crucially different about software:
software is an abstract technology. This is a problem because at least since the 18th
century, patent law has had difficulty dealing with patents that claimed abstract ideas or
principles... Such patents often have unclear boundaries and give rise to opportunistic
litigation... Software also seems to be an area with large numbers of relatively obvious
patents. For these reasons, it is not surprising that a substantial share of current patent
litigation involves software patents... no other technology has experienced anything like
the broad industry opposition to software patents that arose beginning during the 1960s.
Major computer companies opposed patents on software in their input to a report by a
presidential commission in 1966 and in amici briefs to the Supreme Court in Gottschalk
v. Benson in 1972. Major software firms opposed software patents through the mid-1990s
(for example in USPTO hearings in 1994). Perhaps more surprising, software inventors
themselves have mostly been opposed to patents on software. Surveys of software
developers in 1992 and 1996 reported that most were opposed to patents... Software
patents... play a central role in the failure of the patent system as a whole. Any serious
effort at patent reform must address these problems and failure to deal with the problems
of software patents... will likely doom any reform effort." Thus, not only do software
patents fail to help encourage innovation - they actually inhibit innovation.

The book "Against Intellectual Monopoly" by Michele Boldrin and David K. Levine presents a great deal of evidence that software patents are harmful to the software industry and its users. Actually, it goes much further, presenting evidence against patents and copyrights in general, but it's the evidence against software patents that I find especially compelling. "Patents are an economic absurdity" argues against patents in general, but has some additional words on the specific problems of software patents.

L. Gordon Crovitz's "Patent Gridlock Suppresses Innovation" (Wall Street Journal, July
14, 2008, Page A15) states that "for most industries [including software], today's patent
system causes more harm than good... Our patent system for most innovations has
become patently absurd. It's a disincentive at a time when we expect software and other
technology companies to be the growth engine of the economy. Imagine how much more
productive our information-driven economy would be if the patent system lived up to the
intention of the Founders, by encouraging progress instead of suppressing it."

Bruce Perens explains why patents cause serious problems in creating and implementing
standards. Since patents retard the creation and use of standards, they also retard the
industry as a whole (since relevant, widely-implemented standards are a key need in the
software industry).

The patented European webshop is an excellent illustration of the problem - it shows a
few of the many obvious, widely-used ideas covered by granted European patents. In short, it
demonstrates why patents are a poor match for software.

There are also many reasons why software patents are often granted that cover obvious
ideas and prior art (which can give the illusion of innovation without actually having
any). As noted by an FTC analysis of patents, in the U.S. about 1,000 patent applications
now arrive each day, so patent examiners have from eight to 25 hours to read and
understand each application, search for prior art, evaluate patentability, communicate
with the applicant, work out necessary revisions, and reach and write up conclusions.
(The article also notes -- somehow without irony -- that most granted patents are in fact
obvious to practitioners, even though that is illegal.) Many other studies have noted that
patent examiners have a poor database of prior art in software, so it’s hard for them to
find prior art. But the biggest problem is that there are no incentives for anyone in the
patent process to reject bogus patents. The patent applicant has every incentive to
ignore prior art, the patent examiner has little time or resources to do this search, and a
patent examiner who doesn’t commit enough resources to the search is rewarded (in
contrast, a patent examiner who spends too much time on each patent will be punished).
And it’s difficult for a patent examiner to declare something is “obvious”; after all, the
people who are paying money say that their patent request isn’t obvious, and there’s little
downside for an examiner to agree with the petitioner. Also, other areas of the software
industry generally pay more than a patent examiner’s salary, decreasing the likelihood
that a software patent examiner has the best software experience. The entire software
patent examination process favors granting software patents for obvious and prior art.
The patent “review” process has become so much of a rubber stamp that Steven Olson
managed to obtain a patent on swinging sideways on a swing, an absurd patent that was granted by the U.S. patent process.

Frankly, I think permitting software patents in the U.S. was a tremendous mistake, and a
misuse of the original patent laws. Very few of the innovations listed here were patented,
and of the few that were (e.g., the mouse and RSA), there’s little evidence that granting
the patents encouraged innovation. The mouse patent never made much money for its
inventor, and although the developers of RSA did make money, there’s no evidence that
they would not have developed RSA without the offer of a patent. All evidence seems to
show that these ideas would have occurred without the patents! In short, patents impeded
deployment and increased customer costs without encouraging innovation. The patent
laws were originally written to specifically prevent patenting mathematical algorithms,
and courts have basically rewritten the laws to re-permit patenting of mathematical
algorithms (which is fundamentally what any software patent is). Permitting software
patents has done almost nothing to encourage innovation or reward innovators, and the
harm that it’s done far, far exceeds any claimed good. Most key software technology
innovations were never patented, so tracking patents is certain to miss most of the most
important innovations. Conversely, since patent examiners have a poor database of prior
art in software and there are no incentives for anyone in the patent process to seriously
search for prior art, software patents are routinely granted for previous and obvious
inventions in software technology. Basically, the number of patents granted for software
primarily shows how much money an organization is willing to spend to submit patent
applications - it has nothing to do with innovation. The W3C has noted that its policy of
ensuring that all W3C standards were royalty free has been key to universal web access;
anything else would cause dangerously harmful balkanization. Vint Cerf stated that part
of the reason the Internet protocols took hold so quickly and widely was that he and Dr.
Kahn made no intellectual property (patent) claims to their invention. “It was an open
standard that we would allow anyone to have access to without any constraints.”

“The Software Patent Experiment” by James Bessen (Research on Innovation and Boston
University) and Robert M. Hunt (Federal Reserve Bank of Philadelphia) is a sobering
less-technical summary of important research they did on software patents. They found
that in the 1990s, the firms that were increasingly patenting software were the ones that
were decreasing their research and development -- that is, patents are replacing research
and development, not encouraging it. They found strong statistical evidence that patents in
the software field do not provide an incentive for research and development -- the vast
majority of software patents are obtained by firms outside the software industry which
have little investment in the software developers required to develop software inventions.
They don’t say it directly, but their research results seem to clearly show that software
patents have become legalized extortion, instead of a means to encourage innovation.

The software industry’s solution has been to cross-license patents between companies, creating a sort of software patent détente. More recently, this has included cross-licensing
patents with the open source community (through mechanisms like the Open Invention
Network). Of course, such mechanisms tend to inhibit newcomers, so software patents’
primary impact is to prevent new ideas from becoming available to end-users, subverting
the official justification for them. The only group that seems to be unambiguously aided
by software patents are patent lawyers - and since they make the rules, they are happy to
have them.

Any statistic based on software patents is irrelevant when examining software innovation -- because today’s software patents have nothing to do with innovation. End Software Patents is an organization that is trying to eliminate the nonsense of software patents; I hope they succeed, since software patents harm instead of help innovation.


What’s Not an Important Software Innovation?
It’s difficult to identify the “most important” innovations within the last few years.
Usually what is most important is not clear until years after its development. Software
technology, like many other areas, is subject to fads. Most “exciting new technologies”
are simply fashions that will turn out to be impractical (or only useful in a narrow niche),
or are simply rehashes of old ideas with new names.

As I noted earlier, many important events in computing aren’t software innovations, such
as the announcements of new hardware platforms. Indeed, sometimes the importance
isn’t in the technology at all; when IBM announced their first IBM PC, neither the
hardware nor software was innovative - the announcement was important primarily
because IBM’s imprimatur made many people feel confident that it was “safe” to buy a
personal computer.

Standards are extremely important in computing (just as they are in many other fields). In
earlier versions of the document I noted that standards long preceded computing, and did
not note them as an innovation. However, I’ve since added an entry for vendor-independent standards: the notion of having computer-related standards was not something that immediately came to mind in the computing industry, so it is now included above as an innovation. There are many important
events in computing history involving standards, but very few standards are listed above
as innovations... and for good reason. Standards themselves generally do not try to create
significant new innovations, and rarely work well when they do. Instead, standards
usually attempt to create agreements based on well-understood technology, where the
innovations have already been demonstrated as being useful. Any significant innovation
embodied in a standard was usually developed and tested many years before the
standards’ development.

Here are a few technologies that, while important, aren’t really innovative:
   1. XML. XML is simply a simplified version of SGML, which has been around for
      decades.
   2. SOAP. SOAP is yet another remote procedure call system, employing XML (and
      often HTTP).

There’s nothing wrong with not being innovative. Indeed, a technology should primarily be measured by whether or not it solves real-world problems (without causing more
problems than it solves). However, the focus of this paper is innovation, not utility. Do
not confuse innovation with utility.


Conclusions
Clearly, humankind has been impacted by major new innovations in software technology.
But the number of major new innovations is smaller than you might expect, especially
given the many who declare that software technology “changes rapidly.” If you only
consider major new innovations in software technology, instead of various updates to
software products, fundamental software technology is not changing as rapidly as
claimed by some.

I believe that this list is evidence that people are far more affected by other issues in
computing than by major new software innovations. In particular, I believe that there are
at least three reasons for the illusion of rapid changes in major software technology:

   1. People have been able to apply computing technology to more and more areas due
      to rapidly decreasing costs. Computer hardware performance has improved
      exponentially, its size has dropped significantly, and its cost has decreased
      exponentially, making it possible to apply computing technology in more and
      more situations. The increasing hardware performance has also allowed
      developers to use techniques that decrease development time by increasing
      computing time; this trade reduces the development cost and time for software,
      again making it less costly to apply computing technology (by reducing the cost
      of software development). The idea of automating actions is not, by itself, innovative, but automation can certainly change an environment.
   2. Increasing use begets increasing use. And when a technology is widespread or ubiquitous, it often enables many widespread uses and social changes. In those
      cases, people are feeling multiple rapid social changes, caused by the widespread
      availability of a technology, rather than multiple rapid changes and innovations in
      the technology itself. When the web browser was first introduced, comparatively few people used it, because there was relatively little information or few services of interest available through it. But once some content was available, providers of information and services had waiting users/customers, enticing more providers onto the WWW and causing an exponential increase in use. A service can become particularly
      influential if it becomes a standard (either because it’s formally specified as a “de
      jure” standard, or simply through widespread use as a “de facto” standard). The
      idea of creating standards is not new, but once something becomes a standard, the
      idea’s very ubiquity can mean that it will be widely used in places that it wouldn’t
      be used before.
   3. Software functionality can be changed over time, adding new functionality and
      generalizing a particular program’s capabilities. However, when functionality is
      changed over time, this often requires that the human interfaces change as well.
      As a result, people constantly have to learn how to handle changes in a given
      program’s interface. This gives some the illusion of constant innovative change in
      software technology, while instead, what is changing is a particular
      implementation.

Intriguingly, the richest and most powerful software company currently, Microsoft, did
not create any major software innovation as identified in this list. Microsoft did not even
create the first useful or widely-used implementation of any major software innovation.
Others have come to the same conclusions; for example, see the Microsoft “Hall of Innovation”. This certainly casts doubt on Microsoft’s claims to be an innovative
company. For more information about this, see Microsoft, the Innovator?.

In contrast, several major innovations were first implemented as open source software /
Free Software (OSS/FS) projects, especially for those involving networks. Examples of
innovations initially released as OSS/FS or first widely distributed as OSS/FS include
DNS, web servers, the TCP/IP implementations on BSD systems to create internetworks
using datagrams, the first spell checker, and the initial implementation of lockless version
management. Tim Berners-Lee, inventor of the World Wide Web, stated in December
2001 that “A very significant factor [in widening the Web’s use beyond scientific
research] was that the software was all (what we now call) open source. It spread fast,
and could be improved fast - and it could be installed within government and large
industry without having to go through a procurement process.” This may be because the
ideas of open source software are quite similar to research approaches in general, e.g., in
both systems publications are available to all and can be used as the basis of further work
(as long as credit is given). The paper “Altruistic individuals, selfish firms? The structure of motivation in Open Source Software” found in a 2002 survey of 146 Italian firms that
their primary reason for supplying OSS/FS programs was that “Open Source software
allows small enterprises to afford innovation”. For more information, see the information
on OSS/FS innovation from my paper, “Why OSS/FS? Look at the Numbers!”

It's important not to overstate the value of innovation. As noted in Shapin's article “What Else Is New?”, just doing something radically different does not make it important. Indeed, we are surrounded by “old” technology that still serves us well. A useful innovation has to be useful, not just a new idea. That said, sometimes new ideas
truly are useful, and when they are, they can improve our world.

In addition, it's important to note that innovation is primarily a matter of incremental
improvement and hard work. While I think it's useful to note dates (where possible) for
new innovations, it can give the illusion that innovation is primarily a matter of "Eureka!"
moments that change everything. As noted in Eureka! It Really Takes Years of Hard
Work (by Janet Rae-Dupree, New York Times, February 3, 2008), innovation is
(primarily) "a slow process of accretion, building small insight upon interesting fact upon
tried-and-true process." See Scott Berkun's 2007 book "The Myths of Innovation" for
more.

Software technology changes, but that is not software’s primary impact on us. Software
primarily impacts us because of its ubiquity and changeability, as the computers that
software controls become ubiquitous and the software is adapted to changing needs.


Appendix: Software Innovations Being Considered
No list of “software innovations” can be complete. At the least, there is always the hope
that there will be new innovations, ones that we have not even heard of yet. But
innovations must be given time to see if they are truly important, or simply a fad that will
quickly fade away. Another problem is that it is sometimes difficult to track backwards to
find out when an idea was created, or by whom. There is also the challenge of determining
if it was really innovative, and what its impact was.

Some ideas have been identified that may be added to future versions of this document.
These include the following:

   1. Algorithms - Euclid (GCD), and before.
   2. Backus-Naur Form (BNF), a format for defining language syntax.
   3. Abstract Data Types (ADTs) - this preceded object-orientation, but a “first” use
      seems to be very hard to find.
   4. Quick sort (Tony Hoare) - I believe this also spurred development of other algorithms.
   5. Big O / Complexity theory
   6. Hashes
   7. Transactions (esp. database transactions) - it can be a little difficult to determine
      the origins of the idea of transactions. David Lomet (who did key work in this
      area) has helpfully pointed out to me three papers, along with useful recollections
      of his:
          o Obermarck, R. (1980) “IMS Program Isolation Feature”. IBM San Jose
              Research Report RJ 72879. [This was written about 10 years after the
              product containing the feature was released. IMS never quite abstracted
the notion completely, but the properties are there if one looks
              carefully.]
          o Eswaran, K., Gray, J., Lorie, R., Traiger, I. “The Notions of Consistency
              and Predicate Locks in a Database System.” Comm. ACM 19 (11). 1976
              [This is the first paper that clearly identifies transactions in the context of
              databases. Several flavors are involved. Unfortunately, the title isn’t very
              transparent on this point.]
       o    Lomet, D. Process structuring, synchronization, and recovery using atomic
            actions. ACM Conf. on Language Design for Reliable Software, Raleigh,
            NC SIGPLAN Notices 12,3 (Mar 1977). [Yes, I have some claim on this. I
            independently discovered the notion of “atomic procedure” while working
            with Brian Randell at the U. of Newcastle on system reliability.]
8. Recursion (though this was a mathematical concept predating computers, and
    might not really be a “software” innovation at all)
9. Database management systems (it is very difficult to trace back to the “first”
    ones)
10. Operating systems (again, it’s very difficult to trace back to “first” ones)
11. Aspect-oriented programming
12. The “Page Rank” algorithm used by Google
13. Directories (X.500 and before) - these are read-mostly databases, implemented
    later by LDAP (and still later by Active Directory)
14. Bazaar-style programming/development (e.g., Linux; noted by Raymond)
15. Open-source software / Free Software
16. Fourth-generation programming languages (4GLs)
17. Wiki+P2P (they change knowledge distribution and coordination!).
