Larry Lessig – the end-to-end principle (e2e) and the future of ideas

(entry last updated: 2003-07-01 15:46:44)

(Larry’s brought the lights down, so it must be time to start the next session) (Donna’s notes are here)

1935 – Edwin Howard Armstrong broadcast an organ recital from a transmitter on Long Island, demonstrating FM radio. A novel technology, launched in the face of entrenched AM radio. Demonstrably a better technology for sound transmission: no static, higher fidelity, and it penetrates the ionosphere, so lower power is required to transmit over the same distance.

An employee of RCA, Armstrong threatened the company’s stake in AM – and Sarnoff of RCA fought him all the way. He co-opted the FCC and contested Armstrong’s patents for six years, until Armstrong was bankrupted and killed himself.

1964 – Paul Baran of RAND came up with “packet switching.” AT&T was shown the technology and hated it: “We doubt it will work, and we won’t help to create a competitor to ourselves.” Thus, the internet was delayed.

How about another example – streaming video over the internet. But we were bandwidth-limited until broadband came into being. In 1998, Excite and AT&T joined up, and some thought to try to do this. Somers of AT&T said no way – why compete with yourself?

Innovators:

Internet – Cerf/Kahn – students

WWW – CERN/Swiss grad student

ICQ – Israeli kid

Hotmail – Indian immigrant

Napster – BU Students

Note: all foreigners and kids – whom we might call “outsiders.”

Does the architecture help or hinder this sort of innovation? The key idea is the end-to-end character of the logical layer. "Intelligence at the edge; simple at the core." A design concept.

Contrast with switched networks. If AT&T liked your innovation, it would get deployed. If they didn’t like it, AT&T would keep you out. If profits are challenged, you’re out; if they are enhanced, you’re in. (cf. Baran and the video story).

In an end-to-end network, this sort of control cannot be exerted. The network is blind to the use of the packets – it just routes and delivers the packets. Thus, it’s driven by what the users want, rather than what the network allows.

David Isenberg introduced this idea to AT&T – he noted that the company’s smart network built limitations into the way the network could evolve. He wrote about it and circulated it within the company. Isenberg’s 1997 “stupid network” paper meant he had to go; he left once he was vested, and he left to sell this idea.

This was a reinvention of some MIT ideas by Jerry Saltzer, David Clark, and David Reed (van Schewick dissertation – versions of stupid nets). The notion is that network functions are best done by letting the application complete the intent of the transmission. E.g., rather than checking data within the network, let the application manage the problem of data integrity – it will have to anyway.
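The data-integrity argument can be sketched in a few lines of Python (a toy illustration of the idea, not the Saltzer/Clark/Reed paper in code; the function names and the “flaky network” are invented for this sketch):

```python
import hashlib

def send(payload: bytes) -> bytes:
    """The sending endpoint adds its own integrity check; the core stays 'stupid'."""
    return hashlib.sha256(payload).digest() + payload

def flaky_network(frame: bytes, corrupt: bool = False) -> bytes:
    """The network just moves bytes; it neither checks nor repairs them."""
    if corrupt:
        frame = frame[:-1] + bytes([frame[-1] ^ 0xFF])  # flip bits in the last byte
    return frame

def receive(frame: bytes):
    """The receiving endpoint -- not the network -- detects corruption."""
    digest, payload = frame[:32], frame[32:]
    if hashlib.sha256(payload).digest() != digest:
        return None  # an application-level retransmit would go here
    return payload

assert receive(flaky_network(send(b"hello"))) == b"hello"
assert receive(flaky_network(send(b"hello"), corrupt=True)) is None
```

The point of the sketch: even if the network also checked data, the endpoints would still have to, so the check belongs at the edge.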

This evolves into the notion of a preferred design – a bias in the design of the network. To the extent possible, make choices that put functionality at the edge of the network, rather than within the network. Not an explicit rule, but a working design argument.

Implications far beyond the architecture of the machines. The original notion of the internet IP protocols is that it’s as simple as possible.

First, the technical consequences.

  1. Flexibility in the way that the network develops, in part because the simple core makes no assumptions about particular applications.

  2. No coordination among network users needed to try something new. (e.g., voice over IP. Just take sound, digitize, packetize, drop into the net, reverse the process – sound) – innovation without coordination
  3. Fast evolution of network applications (e.g., gopher: from 1991 to 1993 it took off, with massive growth; in 1993 the first browser was released and UMinn decided to charge money for gopher; gopher died – “dead gophers everywhere” – no network administrator intervened; demand and innovation took care of it) (cf. the gopher manifesto). The network cannot defend itself.
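The voice-over-IP point in (2) – innovation needs no coordination because the endpoints do all the work – can be sketched as well (a toy Python illustration; the packet format and function names are invented):

```python
import random

def packetize(samples: bytes, size: int = 4):
    """Chunk 'digitized sound' into numbered packets to drop into the net."""
    return [(seq, samples[i:i + size])
            for seq, i in enumerate(range(0, len(samples), size))]

def reassemble(packets):
    """The receiving endpoint reorders by sequence number and concatenates."""
    return b"".join(data for _, data in sorted(packets))

audio = bytes(range(16))     # stand-in for digitized sound
packets = packetize(audio)
random.shuffle(packets)      # the network may deliver packets out of order
assert reassemble(packets) == audio
```

Nothing in the middle of the network needed to change, or even to know, for this application to exist – which is the “innovation without coordination” claim.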

Competitive consequences:

  1. Maximizes competition: We start with the commons – a resource that everyone has access to, priced or not. All, however, are free in the sense that no one has proprietary control over access to that resource (e.g., language is a commons). Garrett Hardin’s “The Tragedy of the Commons” poisons the concept of a commons in most of the US educational process.

    Lessig’s refutation of Hardin: the tragedy requires a rivalrous resource (if I use it, you can’t). Not all resources are rivalrous (e.g., ideas, language). Language, for example, actually becomes MORE valuable as more people use it – a “comedy” of the commons. So: is the resource one that invites a tragedy, or not? Is this a question of tangible versus intangible assets?

    L: economists keep seeing the commons aspect of the internet, and assuming it will die (Larry fails/elects to consider the network economists, of course).

    Larry proposes an “innovation commons” as a product of the end-to-end architecture. Everyone has an equal right to innovate in this space. The need is to maintain the commons, although there are many efforts to break it, generally by introducing property rights – which Larry asserts are only appropriate for rivalrous goods, or, more carefully, property is only appropriate when the benefits of the property system exceed its costs.

    Competitive power to innovate is maximized in this space, therefore, because no one is locked out.

  2. Minimizes strategic threat: Strategic behavior, in the law, is behavior that undermines the intent of the competitive marketplace. One such strategy is defensive monopolization – the core of the government’s case against Microsoft.

    (The case: Netscape and Java could possibly change competition for applications on the PC platform. With Netscape/Java, applications could be written once and run everywhere – not technically successful, it appears, but no matter. The case asserts that Microsoft decided to attack this threat by displacing Netscape with something MS controls, like IE. Thus, MS could protect itself from this insidious plan by closing the platform to competition in applications.)

    The US courts found defensive monopolization is certainly illegal. End-to-end takes this kind of protection out of the game – the network cannot protect itself from innovation.

    This lowers the cost of innovation – barriers to entry go away, so the cost of entry falls.

  3. Consumer financed growth: If you think like a utility company and you think about innovation, your notion of innovation is how expensive is it to deploy this innovation, and how much will you benefit from doing so? However, some of the most rapid innovation (e.g., the internet) takes place in domains where the consumers invest in the deployment of new technology. (Note: always using the internet as the example of rapid innovation is a bit of intellectual monoculture that weakens this discussion).

    Consider 3G v. 802.11. 3G was conceived in the old model of innovation deployment: invest and execute the five-year plan. Now it’s obsolete at the point of deployment. 802.11 is much faster, cheaper, and better for at least some things. The market pull for 802.11, operating in unlicensed spectrum, led to rapid deployment of innovation.

    If this is possible with just this small innovation commons in spectrum, what might happen if there were more space within which to work?

So, “e2e is heaven.” But then the pessimism – we’re “on our way to hell.”

What is happening is that the e2e layer is being pressured by the owners of the physical layer and the content layer. Corrupting the core:

  1. Policy based routing: a layering on of a new technology that allows the physical layer to treat certain packets differently than others (“All pigs are equal, but some pigs are more equal than others”).

    Xbox and cable as an example. Microsoft is now favoring the e2e network: MS wants to use the Xbox to allow game playing over the network. Cable companies want to permit (for a fee) gaming on their networks. So, cable companies want to extract some rent.

  2. Content begins to eat the conduit: Consider media concentration – one company controls/coordinates messages, stifling independent voices. Diller attacked it because of the corruption of the message; Turner attacked it because he could never have competed in this space today.

    The FCC response is that the internet will solve this problem – it’s an independent source. Of course, this is true only to the degree that the internet remains e2e. But if companies can influence the architecture so that this is no longer true, then the internet solves nothing.

    Does the FCC act to keep the internet e2e? No – the FCC says that businesses should be allowed to innovate in whatever way they want in the internet space – including changing the architecture as they see fit.
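The policy-based routing idea in (1) above can be sketched as a toy classifier (the policy table and app labels are entirely invented; no real router exposes this API – the point is only that the carrier, not the endpoints, ranks the traffic):

```python
from dataclasses import dataclass

@dataclass
class Packet:
    src: str
    dst: str
    app: str  # e.g. "partner-game-service", "web"

# Hypothetical carrier policy: lower number = higher priority.
POLICY = {"partner-game-service": 0, "web": 1}

def priority(pkt: Packet) -> int:
    # A neutral (e2e) core would return the same value for every packet;
    # here, unknown or unpaying apps go to the back of the queue.
    return POLICY.get(pkt.app, 9)

queue = sorted(
    [Packet("a", "b", "upstart-game"), Packet("c", "d", "partner-game-service")],
    key=priority,
)
assert queue[0].app == "partner-game-service"  # the paying partner jumps ahead
```

“All pigs are equal, but some pigs are more equal than others”: the same routing code that treats every packet identically becomes discriminatory the moment a policy table is consulted.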

Possible solutions: maintain e2e like the electric power grid, rather than turning it into the cable TV network, which is allowed to discriminate in service based on fees or worse. Three debates:

  1. Physical layer – open access: ISPs shouldn’t be able to block participation in the network hardware; ensure competition among providers. It failed in the US, but has been successful in Japan (100 megabit/sec for $50/month). NTT was not facing competition, so they did what they were told. In the US, where the Baby Bells were fighting to exist, they barely complied and fought all the way.

  2. A logical-layer push for regulating a “neutral network”: the FCC regulates the net to make sure that price discrimination doesn’t take place among net participants.

  3. “Free culture” – the power of the content layer to behave strategically – copyright, DRM, CBDTPA, etc.

So, the layers are all represented in the issues that we face, and this framework leads us to a set of policy discussions that we have to consider with care. Changes at any of these layers, leading to lockup, will finish off the end-to-end objective/benefits.

Do any governments get this? So far, the basic claim is that there is “no need” to preserve this – the “market will take care of it(self).” So, end to end is at risk….


Questions:

  1. Doesn’t the notion of the innovation commons ignore the effects of scale upon innovation? Aren’t there some innovations that require scale that only companies can supply?

    Larry agrees that these firms certainly had great innovators, but there seems always to have been a conflict within these firms between the innovators and the business divisions. Moreover, Larry argues that complex systems should be allowed to self organize, rather than be managed. (This is an interesting hypothesis; and it’s something that Larry defends, although he conflates self-organization with simplicity, I think.)

  2. What happens to the notion of property? Investment was made, either in monies or effort, and the sacredness of property influences all of this discussion.

    Larry argues that the notion of property emerges out of a social need, rather than an absolute right. What has been lost is the idea that property exists to satisfy certain social needs – we have the cart before the horse. There are many assets that we elect not to “propertize,” and we have to be careful about what we elect to call property.

    In the Eldred case, there was a brief signed by 17 economists, including several Nobel laureates, who asserted that property as a concept requires careful consideration of what the property is expected to accomplish for society. It’s not just efficiency – there are other social objectives to be served, and a balance has to be struck.