[e2e] Protocols breaking the end-to-end argument

Richard Bennett richard at bennett.com
Fri Oct 23 20:23:21 PDT 2009

People who are interested in the evolution, refinement, application, and 
re-definition of end-to-end arguments, principles, doctrines, dogmas, 
and guidelines may enjoy my paper, "Designed for Change: End-to-End 
Arguments, Internet Innovation, and the Net Neutrality Debate", 
available at http://www.itif.org/index.php?id=294 along with a video of 
a nice discussion of the stagnation of Internet protocol development 
with Dave Farber, John Day, Chris Yoo, Bill Lehr, and yours truly.

I think Jaime's usage, "breaking end-to-end", is common in today's IETF, 
where people tend to regard end system function placement as a default, 
and the caveats of the Arguments are pretty much ignored. This kind of 
reduction is to be expected, given the way that complex ideas tend to be 
simplified by time.

The best discussion I've seen of function placement in a datagram 
network to this day is found in Louis Pouzin's monograph on the CYCLADES 
network, _Cyclades Computer Network: Towards Layered Network 
Applications_, Elsevier Science Ltd (September 1982). The book is out of 
print, but it's available through interlibrary loan from several 
institutions in the US. Pouzin takes a very pragmatic and empirical 
approach to function placement, whereas later engineers tended to argue 
from first principles. The worst treatment is David Isenberg's second 
"stupid network" paper, "Dawn of the Stupid Network"; it's much more 
doctrinaire than "Rise of the Stupid Network" by the same dude.

A couple of great critiques of "End-to-End Args" are RFC 1958 and Tim 
Moors' "A Critical Review of End-to-End Arguments in System Design", 
http://www.ee.unsw.edu.au/~timm/pubs/02icc/published.pdf. Moors shows 
that the Saltzer, Reed, and Clark argument for end-to-end placement is 
both circular and inconsistent with the FTP example that is supposed to 
demonstrate it. But the tres amigos of e2e were writing in 1981 when 
network engineering was mostly a matter of intuition, so what do you 
expect?

One of the more interesting unresolved questions about "End-to-End Args" 
is why it was written in the first place. Some people see it as a salvo 
in the ISO protocol wars, others as an attack on BBN's ARPANET, some as 
an attempt to cross the divide between engineering and policy, and there 
are probably other theories as well.

The Blumenthal and Clark "Brave New World" paper was very influential 
because it lit the fire under Larry Lessig that got him storming around 
about "protecting the Internet" from the threats of stagnation and loss 
of freedom. There's a fairly clear path from Lessig's reaction to "Brave 
New World" to the immoderate regulatory climate in the US today that's 
so hostile to Internet progress.


rick jones wrote:
> On Oct 23, 2009, at 3:58 AM, Jeroen Massar wrote:
>> Jaime Mateos wrote:
>>> Hi,
>>> I'm working on a project about the current challenges the Internet is
>>> presenting to the end-to-end argument. I'd be interested to know about
>>> any protocols, currently in use, that break the end-to-end principle 
>>> and
>>> the context where they are used.
>> Everything that needs an NAT helper, thus any protocol that embeds
>> addresses or ports, thus most games, everything that has a listening
>> port where the listening port is not on a public IP or firewalled away.
> Isn't the sense incorrect there?  I always thought it was the NAT 
> itself, and its need for helpers that was in opposition to the 
> quasi-mythical end-to-end principle?
> rick jones
> Wisdom teeth are impacted, people are affected by the effects of events
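To make Jeroen's point concrete: FTP active mode is the classic case of a protocol embedding an address in its payload. A minimal sketch (the addresses below are illustrative, not from the thread):

```python
# Sketch: why address-embedding protocols need a NAT helper.
# FTP active mode carries the client's own IP and port inside the
# application payload (the RFC 959 PORT command). A NAT rewrites only
# the IP/TCP headers, so a private address in the payload leaks through
# to the server unless an ALG ("helper") rewrites the payload as well.

def ftp_port_command(ip: str, port: int) -> str:
    """Encode an address as an RFC 959 PORT command: h1,h2,h3,h4,p1,p2."""
    octets = ip.split(".")
    p1, p2 = port // 256, port % 256
    return "PORT " + ",".join(octets + [str(p1), str(p2)]) + "\r\n"

# A client behind a NAT advertises its private (RFC 1918) address:
cmd = ftp_port_command("192.168.1.10", 50000)
print(cmd.strip())  # PORT 192,168,1,10,195,80

# The server on the public side cannot connect back to 192.168.1.10,
# so the data connection fails unless a NAT helper rewrites this line
# to the NAT's public address and a mapped port.
```

Which supports rick's reading: the PORT command itself is end-to-end in the original sense (the endpoints exchange it directly); it is the NAT in the middle, rewriting headers but not payloads, that creates the need for helpers.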

Richard Bennett
Research Fellow
Information Technology and Innovation Foundation
Washington, DC

More information about the end2end-interest mailing list