David P. Reed
dpreed at reed.com
Tue Apr 16 17:53:36 PDT 2002
At 02:39 PM 4/16/2002 -0400, J. Noel Chiappa wrote:
> I realize this whole topic of how to maximize the throughput of a mesh,
> given an offered traffic load matrix, is a complex one, but it does seem
> to be one that people care about.
I agree people care about it. But throughput is rarely a useful
measure. What applications (i.e. customers) care about is whether their
application works as well as it can, given the interaction with other
applications sharing the same network. The two ways it can work badly are:
- the network responds badly to the offered load, and comes nowhere near
a possible solution that meets all users' needs fairly (one might further
break this into the "resonance" problem, by analogy with systems that
resonate uncontrollably with inputs, and the "damping" problem, by
analogy with systems that actively restrain inputs so that the outputs
miss crucial timing and reliability requirements).
- the network picks favorites - certain types of traffic work well, while
others get discriminated against.
Latency is as important to certain applications as throughput, and the
ability to reallocate resources to bursty applications is equally important.
I've always felt it unreasonable for networks to be designed as if all
traffic is built out of constant-rate flows that stay stable for long
periods. Almost no computer applications, except FTP and voice telephony,
have such a property, and it becomes harder and harder to force them into
that Procrustean bed.
If you are going to invent modeling frameworks to be used in resource
allocation, they should have some relation to the application abstractions
that are most useful for building interesting applications. Very
simplified, necessarily, but there should be a clear connection.
The kinds of models that are used to build "Small World" networks of
relationships might be interesting "generating rules" for application
traffic - nodes tend to talk to nodes they have talked to before, or be
introduced to new nodes that are known to nodes they have recently talked
to. And traffic on these links tends to arrive randomly with probability
decreasing with time.
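A toy generator following those rules might look like the sketch below.
The function name, node model, and the particular probabilities are my
own illustrative assumptions, not anything from the original discussion;
the point is just that each node prefers previous contacts, is sometimes
introduced to a contact's contact, and only occasionally talks to a
stranger.

```python
import random
from collections import defaultdict

def simulate_small_world_traffic(n_nodes=50, steps=1000,
                                 p_repeat=0.7, p_introduce=0.25, seed=1):
    """Toy 'small world' traffic generator (illustrative only).

    At each step a node either re-contacts a peer it has talked to
    before (p_repeat), is introduced to a peer known to one of its
    existing contacts (p_introduce), or else contacts a random
    stranger.  Returns the list of (src, dst) conversation events.
    """
    rng = random.Random(seed)
    known = defaultdict(set)   # peers each node has talked to before
    events = []
    for _ in range(steps):
        src = rng.randrange(n_nodes)
        r = rng.random()
        if known[src] and r < p_repeat:
            # Repeat a previous conversation.
            dst = rng.choice(sorted(known[src]))
        elif known[src] and r < p_repeat + p_introduce:
            # Be introduced to a node known to an existing contact.
            via = rng.choice(sorted(known[src]))
            candidates = known[via] - {src}
            dst = rng.choice(sorted(candidates)) if candidates else via
        else:
            # Talk to a random stranger.
            dst = rng.randrange(n_nodes)
            while dst == src:
                dst = rng.randrange(n_nodes)
        known[src].add(dst)
        known[dst].add(src)
        events.append((src, dst))
    return events
```

Feeding such event streams into a simulated mesh would give offered-load
patterns with the clustered, bursty character described above, rather
than the constant-rate flows most allocation models assume.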
This probably is a much more interesting way to think about how traffic
scales as the networks get much larger and the applications get more