[e2e] patents on routing algorithms

David P. Reed dpreed at reed.com
Fri Jan 4 09:16:40 PST 2008


Jon Crowcroft wrote:
> why do you think a loose coupled system cannot be described
> mathematically?
Perhaps it is an article of faith on my part.   Dawkins claims that he 
takes nothing on faith - rejecting the concept of faith as 
non-mathematical, and therefore not scientifically real.  I reject his 
views as self-contradictory, though I find them interesting and worthy 
of study.

Of course there are many mathematicians (I think a minority, but a 
substantial one) who don't buy into Hilbert's pure formalism program, 
and further don't believe that the world must be formalizable in all 
aspects.  I believe I am (to the extent I claim to be a mathematician) 
in that group.  Bayesians tend to be in that group - you can be a 
Bayesian without expecting that there is a "ground truth" of ideals in 
the sense that Plato and his followers wished for.

One can, of course, *define* a meaning for the term "loose coupling" as 
a mathematical property either as an axiom or by reduction to axiomatic 
elements of some formal system.   But that definition, I personally 
suspect, will not be complete, in the sense of properly capturing the 
notion of protocol embodied in English, or Linear-B, or any other human 
mode of communication.  For example, we speak English and expect a wide 
variety of "systems" to change state in reasonably non-surprising ways 
that match our intentions.
> clearly I can describe the code at each end of a
> protocol, and the channel, and at least derive theorems about how
> these combine. just because I don't over-determine things doesn't mean
> that there isn't a precise mathematical description - it just means
> that there are non-deterministic (external) events....that's been
> something that process algebras have addressed since CCS, CSP, LOTOS
> etc etc...even though those systems were somewhat unwieldy
>   
Process algebras start with assertions and definitions.   That makes 
them useful tools, but it does not make them true.
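
To make concrete what is being pointed at: here is a minimal sketch, mine 
rather than anything taken from CCS, CSP, or LOTOS themselves, of how a 
process-algebra-style description leaves the choice of the next event to 
the environment rather than to the process.  The Event and Process types 
and the sender example below are illustrative names only, not part of any 
of those calculi.

    -- A CSP-style toy (hypothetical names): the environment, not the
    -- process, decides which offered branch is taken.
    data Event = Send | Timeout | Close
      deriving (Eq, Show)

    data Process
      = Stop                          -- deadlock: engages in no further events
      | Prefix Event Process          -- a -> P: engage in a, then behave as P
      | External [(Event, Process)]   -- external choice offered to the environment

    -- A sender willing either to Send and then Close, or to accept a
    -- Timeout and retry; which branch occurs is decided outside the process.
    sender :: Process
    sender = External
      [ (Send,    Prefix Close Stop)
      , (Timeout, sender) ]

The description is precise about what "sender" offers, but whether it says 
anything true about the world still rests on the assertions and definitions 
it started from.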
> many processor designs today are specified - but a lot of asynch
> circuit design has to be underdetermined....that doesn't mean it isn't
> amenable to math (code/algorithmic) description.
>
> you're starting to sound like Richard Dawkins, who seems to think that
> human consciousness is not amenable to emulation by machine because
> of quantum mechanics....an even greater heresy
>
> description doesn't mean 100% prediction...:)
>
> In missive <477E37CF.5010707 at reed.com>, "David P. Reed" typed:
>
>  >>Jon Crowcroft wrote:
>  >>> it is a goal of much recent work (see Sewell et al in sigcomm 05
>  >>> "Rigorous specification and conformance testing techniques for network protocols,
>  >>> as applied to TCP, UDP, and sockets"
>  >>> and various papers
>  >>> by Griffin and Sobrinho on Metarouting) 
>  >>> to render protocols merely 
>  >>> algorithmic specifications that are fed into engines that run them
>  >>>
>  >>> shame on us as computer scientists that
>  >>> we don't use such techniques on a daily basis for
>  >>> well-founded engineering instead of the handwaving that passes
>  >>> for communications work still in the 21st century
>  >>>
>  >>> it is a technical AND ethical goal to make it so
>  >>> and should be a duty on all of us to get the law to recognize it
>  >>>
>  >>>   
>  >>That's a plausible point of view.   I heartily disagree, however.  In 
>  >>1974 or so, our research group (Saltzer, Clark, Reed, Liskov, Svobodova, 
>  >>as I recall) decided that a *crucial* aspect of distributed systems was 
>  >>that they exhibited "autonomy", which implies a serious notion of loose 
>  >>coupling, flexibility, revisability, etc.  That set of attributes is 
>  >>crucial; leaving them out for the sake of formal methods is just another 
>  >>Procrustean bed, where they are the feet.
>  >>
>  >>*Protocols* are techniques for achieving communications in the face of 
>  >>uncertainty about who is on the other side of the network.  Not just an 
>  >>unreliable network in the middle, but an uncertainty in a very 
>  >>fundamental sense about what is on the other side.
>  >>
>  >>In "distributed systems" that must function in the real world, a core 
>  >>and *essential* concept is that one must specify parts of the system to 
>  >>work "right" EVEN IF THE DEFINITION OF RIGHT CANNOT BE WELL-DEFINED 
>  >>MATHEMATICALLY.
>  >>
>  >>To someone who speaks English as a protocol, this is obvious.   I can 
>  >>try to convince you, for example, by the words above that I am right.   
>  >>And I am using English correctly, and this can be verified.  But it has 
>  >>nothing whatsoever to do with being able to prove that you *will* agree 
>  >>with me at the end of the conversation.  Maybe it will take more 
>  >>conversations, maybe not.
>  >>
>  >>But a protocol is not an algorithm executed by a complete set of formal 
>  >>machines, though some protocols (a small subset) might be in that 
>  >>category.  That is a sad, little, boring, and ultimately trivial subset 
>  >>of the "protocols" of the world.   Maybe it makes small-minded 
>  >>mathematicians happy because they can close off a "formal system" and 
>  >>prove theorems, as if proving theorems is the desired endpoint of system 
>  >>design.    But the ability to prove theorems is not the test of a 
>  >>*useful* protocol set - neither of engineering value, nor of human 
>  >>value.   The ability to communicate (which cannot be formalized in any 
>  >>way I know) is the correct test.   The Internet is one example of a 
>  >>system that succeeds in communicating, and there really was NOT a need 
>  >>to define a formal specification of a collection of machines to achieve 
>  >>that result.
>  >>
>  >>
>
>  cheers
>
>    jon
>
>
>   

