Boot on the government's role in innovation

Over at Volokh, Max Boot writes:
Even where government has played a big role in the development process, as with the Internet and the electronic computer, the key advances were usually made by people not on its payroll: William Shockley, John Bardeen, and Walter Brattain (the transistor); Jack Kilby and Robert Noyce (the microchip); Ted Hoff (the microprocessor); Paul Allen and Bill Gates (MS-DOS and Windows); Tim Berners-Lee (the World Wide Web); Marc Andreessen and Eric Bina (the Mosaic browser); and many others.

Well, this is sort of true and sort of not. First, I think the word "payroll" is misleading. The way that the government does a lot of its research is by handing out grant money to other people to do the work for it. So, for instance, much of the Internet work was paid for by DARPA under contract. As a practical matter, the vast majority of university-level research work in science in the US is paid for by government grants.

More importantly, either Boot doesn't really understand the history of computers or these examples are cherry-picked, because these aren't really the key moments at all. First, they start way too late. The basic theory of the modern computer was worked out by people working more or less directly for the government (Turing and later von Neumann), and the first general-purpose electronic computer in the US (ENIAC) was built for the government in order to compute ballistics tables1. Real computers existed (based on vacuum tubes) substantially before transistors were in wide use.

I certainly agree that Boot's first three examples (transistors, ICs, and microprocessors) are really important and were developed in the commercial sector. But then when we get to MS-DOS and Windows, things go badly awry. Obviously, Microsoft is incredibly important, but as technologies DOS and Windows are nothing special. MS-DOS is basically an imitation of CP/M, and even in 1981, when MS-DOS was released, UNIX2 was far superior to MS-DOS (it had pre-emptive multitasking, among other things). Similarly, Windows is an imitation of MacOS, which has its roots in interfaces designed at Xerox PARC and SRI. The bottom line here is that the history of operating systems and UIs is complicated, and innovations were made in a whole bunch of places and systems, some government funded and some not.

Similarly, when we turn to the Internet, packet switching was originally proposed by Baran (at RAND, which was mostly a government contractor), by Davies (at the UK's National Physical Laboratory), and by Kleinrock (at MIT; remember that almost all university research work in CS is paid for by government grants). And, as I said earlier, almost all of the post-Baran Internet work before 1993 or so was paid for by ARPA (later renamed DARPA). Finally, when we turn to his last two examples, they're just plain wrong. Tim Berners-Lee was at CERN, which is basically a multi-country national lab. Marc Andreessen and Eric Bina were at NCSA, which, as the name "National Center for Supercomputing Applications" suggests, was government funded. And even during the .com boom, a lot of the .com companies came out of universities, where, again, research is mostly funded by NSF, DARPA, etc.

1. Yeah, yeah, I know about Zuse, who may have been first, but whose research didn't really end up on the main line.
2. Remember, 4BSD dates from November of 1980.

2 Comments

Also, it's a bit of a stretch to say that any work done at the old Bell Labs wasn't done on the government payroll. True, there wasn't direct government money involved, but Bell Labs was the great American technology lab, created as a loss leader by AT&T to convince the world that they were a force for the public good, and not the extraordinarily profitable private company with a government-granted monopoly which they really were. The Labs continued to do good work, and in exchange AT&T was allowed to continue to exist.

Great post!

The National Academies' Computer Science and Telecommunications Board (CSTB) put together a nifty (if complex) chart tracing the development of 19 billion-dollar segments of the IT industry -- from the emergence of the first research, to the introduction of the first product, to the time that product became a billion-dollar industry. You can find a copy of the chart (called the "tire tracks" chart, for reasons that should be apparent when you see it) here. It's a little dense, but worth some time to figure out.


Listed on the left are the various technologies (Timesharing, Client/server computing, Graphics, etc.). Each technology has four separate lines plotted on a timeline -- a red line showing when work was taking place in universities (almost all supported by federal research dollars), a blue line showing work in industrial labs, a black dotted line showing when the first product appeared, and a green line noting when that technology became a billion-dollar industry. The arrows between the lines and between the various technologies show the flow of people and ideas. So, the chart shows, for example, that work in universities on timesharing prior to 1970 helped lead to developments in the early 1970s in the research that would lead to the Internet.

The chart makes a lot of the points that those of us who advocate for more federal funding for basic research try to emphasize in our arguments:

  • There's a complex and rich interplay between federally supported research in universities and industrial R&D. In some cases (RISC and RAID, for example) the initial ideas came from industry, but gov't-sponsored research in universities was needed to advance the technology. In other cases (the Internet, GUIs, timesharing), the ideas originated in universities long before they matured to a point where subsequent research by industry moved the tech towards commercialization. University and industrial research are complementary -- with different goals (university = long-term, fundamental questions; industry = short-term, development-oriented), one doesn't supplant the other.
  • The IT R&D ecosystem is very interdependent.
  • There's often a long incubation period between the time a technology was first conceived and the time it arrived in the market as a commercial product.
  • University research produces people -- researchers and practitioners -- as well as ideas.
Essentially every aspect of information technology we rely upon today bears the stamp of federally supported research.
