A profoundly fortuitous convergence of historical factors has led us to today’s marvelous status quo, and many of us (with a few well-known exceptions like record company CEOs and cyber-stalkees) have enjoyed the benefits of the generative Internet/PC grid while being at most inconvenienced by its drawbacks. Unfortunately, this quasi-utopia can’t last. The explosive growth of the Internet, both in amount of usage and in the breadth of uses to which it can be put, means we now have plenty to lose if our connectivity goes seriously awry. The same generativity that fueled this growth poses the greatest threat to our connectivity. The remarkable speed with which new software from left field can achieve ubiquity means that well-crafted malware from left field can take down Net-connected PCs en masse. In short, our wonderful PCs are fundamentally vulnerable to a massive cyberattack.
[…] If the Internet (more precisely, the set of PCs attached to it) experiences a crisis of this sort, consumers may begin to clamor for the kind of reliability in PCs that they demand of nearly every other appliance, whether a coffeemaker, a television set, a BlackBerry, or a mobile phone. This reliability can come only from a clamp on the ability of code to instantly run on PCs and spread to other computers, a clamp applied either by the network or by the PC itself. The infrastructure is already in place to apply such a clamp. Both Apple and Microsoft, recognizing that most PCs these days are Internet-connected, now configure their operating systems to be updated regularly by the companies. This stands to turn vendors of operating-system products into service-providing gatekeepers, possessing the potential to regulate what can and cannot run on a PC. So far, consumers have chafed at clamps that would limit their ability to copy digital books, music, and movies; they are likely to look very differently at those clamps when their PCs are crippled by a worm.
To be effective, a clamp must assume that nearly all executable code is suspect until the operating system manufacturer or some other trusted authority determines otherwise. This creates, in essence, a need for a license to code, one issued not by governments but by private gatekeepers. Like a driver’s license, which identifies and certifies its holder, a license to code could identify and certify software authors. It could be granted to a software author as a general form of certification, or it could be granted for individual software programs. Were a licensed software author to create a program that contained a virus, the author’s license would be revoked.
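The certify-and-revoke scheme described above can be pictured as a simple allow-list keyed on a cryptographic digest of each program. The sketch below is purely illustrative (the function names `certify`, `revoke`, and `may_run` are invented for this example, and a real gatekeeper would use signed certificates rather than a bare hash set), but it shows the essential logic: code is suspect by default, runs only once a trusted authority has certified it, and stops running the moment its certification is revoked.

```python
import hashlib

# Hypothetical gatekeeper state: digests the trusted authority
# has certified, and digests it has since revoked.
APPROVED: set[str] = set()
REVOKED: set[str] = set()

def certify(code: bytes) -> str:
    """The trusted authority vouches for a program; returns its digest."""
    digest = hashlib.sha256(code).hexdigest()
    APPROVED.add(digest)
    return digest

def revoke(digest: str) -> None:
    """Pull a previously granted 'license to code', e.g. after a virus."""
    APPROVED.discard(digest)
    REVOKED.add(digest)

def may_run(code: bytes) -> bool:
    """Default-deny: only certified, unrevoked code may execute."""
    digest = hashlib.sha256(code).hexdigest()
    return digest in APPROVED and digest not in REVOKED
```

Note that under this default-deny rule an unknown program is blocked outright, which is exactly the generativity cost the next paragraph turns to.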
[…] The downside to licensing may not be obvious, but it is enormous. Clamps and licenses managed by self-interested operating-system makers would have a huge impact upon the ability of new applications to be widely disseminated. What might seem like a gated community (offering safety and stability to its residents, and a predictable landlord to complain to when something goes wrong) would actually be a prison, isolating its users and blocking their capacity to try out and adopt new applications. As a result, the true value of these applications would never be fully appreciated, since so few people would be able to use them. Techies using other operating systems would still be able to enjoy generative computing, but the public would no longer automatically be brought along for the ride.
[…] What is needed at this point, above all else, is a 21st-century international Manhattan Project that brings together people of good faith in government, academia, and the private sector for the purpose of shoring up the miraculous information technology grid that is too easy to take for granted and whose seeming self-maintenance has led us into an undue complacency. The group's charter would embrace the ethos of amateur innovation while being clear-eyed about the ways in which the research Internet and hobbyist PC of the 1970s and 1980s are straining under the pressures of serving as the world's information backbone.
The transition to a networking infrastructure that is more secure yet roughly as dynamic as the current one will not be smooth. A decentralized and, more important, exuberantly anarchic Internet does not readily lend itself to collective action. But the danger is real and growing. We can act now to correct the vulnerabilities and ensure that those who wish to contribute to the global information grid can continue to do so without having to occupy the privileged perches of established firms or powerful governments, or conduct themselves outside the law.
Or we can wait for disaster to strike and, in the time it takes to replace today’s PCs with a 21st-century Mr. Coffee, lose the Internet as we know it.