[e2e] TCP ex Machina

Keith Winstein keithw at mit.edu
Sun Jul 21 17:14:41 PDT 2013

Thanks, Jon, I think the analogy to the Axelrod experiment is quite apt!
Wish I had thought of that myself. :-) Similar to Axelrod, my dream is to
have a "kaizen for congestion" where anybody could contribute a new
algorithm and we would evaluate it vs. the existing schemes and see how it
performs on different kinds of benchmark networks, and then add it to these
same plots.

Detlef, I'm afraid I don't think your email quite summarized our approach
accurately. We do not give the optimizer advance information about who
wants to send what to whom and we don't calculate an optimized "schedule."
Remy develops rules for a TCP sender -- e.g., when to increase the window and
by how much, when and how to decrease the window, and when to enforce a
minimum interval between outgoing packets (a pacer) and what that interval
should be. It tries to find the best rules for a given set of assumptions
specified explicitly -- e.g., what is the range of possible networks the
protocol is intended for, and what is the goal.
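To sketch the flavor of what such sender rules look like (this is only an
illustrative toy -- the signal and action names here are made up, and the
real RemyCC uses an optimized rule table over EWMA congestion signals):

```python
# Toy sketch of window/pacing rules of the kind Remy optimizes.
# The rule table and signals here are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Action:
    window_multiple: float   # multiply the current window by this
    window_increment: int    # then add this many packets
    pacing_fraction: float   # min send interval, as a fraction of min RTT

# A two-entry toy rule table mapping a coarse congestion signal to an action.
RULES = {
    "low_delay":  Action(window_multiple=1.0, window_increment=2, pacing_fraction=0.0),
    "rising_rtt": Action(window_multiple=0.8, window_increment=0, pacing_fraction=0.5),
}

def on_ack(window: int, min_rtt: float, last_rtt: float) -> tuple[int, float]:
    """On each ACK, classify the congestion signal and apply the matching rule."""
    signal = "rising_rtt" if last_rtt > 1.5 * min_rtt else "low_delay"
    act = RULES[signal]
    new_window = max(1, int(window * act.window_multiple) + act.window_increment)
    return new_window, act.pacing_fraction * min_rtt
```

The point is that the *rules* are what gets optimized offline, not any
per-connection schedule.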

We model the arrival and departure of flows as drawn from some stochastic
process, e.g., flows are "on" for some amount of time or some number of
bytes, drawn from an exponential or Pareto distribution or from an
empirical CDF of flow lengths on the Internet. The traffic model given to
Remy at design time usually is not the same as the case we then evaluate in
ns-2 when comparing against the other schemes.
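For concreteness, here is a toy version of drawing flow "on" durations from
such a distribution (parameter values are arbitrary, chosen only for
illustration):

```python
# Sample flow "on" durations from an exponential or a heavy-tailed Pareto
# distribution, as in the traffic models described above. Parameters are
# illustrative, not the ones used in the paper.
import random

def sample_flow_durations(n: int, dist: str = "exponential", seed: int = 1) -> list[float]:
    rng = random.Random(seed)
    if dist == "exponential":
        return [rng.expovariate(1.0) for _ in range(n)]   # mean 1 second
    elif dist == "pareto":
        return [rng.paretovariate(1.5) for _ in range(n)] # heavy-tailed, >= 1
    raise ValueError(f"unknown distribution: {dist}")
```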

Regarding wireless links, you might be interested in some of our prior work
(http://alfalfa.mit.edu) that shows one can achieve considerable gains by
modeling the link speed variation of cellular networks as a simple
stochastic process, then making conservative predictions about future link
speeds at the endpoints in order to compromise between throughput and delay.
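The underlying idea can be sketched very simply -- act on a low quantile of
the predicted rate distribution rather than its mean (this is only the
principle; the actual forecast model in that work is more sophisticated):

```python
# Conservative link-rate estimate: take a low empirical quantile of recent
# rate samples instead of the mean, trading throughput for lower delay.
# This is a sketch of the principle, not the actual forecasting algorithm.
def conservative_rate(samples: list[float], quantile: float = 0.05) -> float:
    """Return a pessimistic (low-quantile) estimate of near-future link rate."""
    ordered = sorted(samples)
    idx = min(int(quantile * len(ordered)), len(ordered) - 1)
    return ordered[idx]
```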

Best regards,

On Sun, Jul 21, 2013 at 5:14 PM, Jon Crowcroft
<jon.crowcroft at cl.cam.ac.uk> wrote:

> it is a tiny bit cleverer than that - the work is the moral equivalent of
> the Axelrod experiment in emergent cooperation, but neater because it is
> quantitative rather than just qualitative selection of strategies - what is
> important (imho) is that they use many many simulation runs to evaluate a
> "fitness" of a given protocol...this is heavy lifting, but pays off - so it
> will be nice to see empirical follow up work but this isn't some naive
> "overfitting" undergrad work - it is rather different and requires a
> considered response
> On Sun, Jul 21, 2013 at 9:28 PM, Detlef Bosau <detlef.bosau at web.de> wrote:
> > To my understanding, you write down the whole communication (who wants to
> > send what to whom) and afterwards you calculate an optimized schedule.
> >
> > Reminds me of undergraduate homework in operating systems, gantt diagrams
> > and that funny stuff.
> >
> > You cannot predict your link's properties (e.g. in the case of wireless
> > links), you cannot predict your user's behaviour, so you conjecture a lot
> > from presumptions which hardly ever will hold.
> >
> > Frankly spoken: This route leads to nowhere.
> >
> >
> > --
> > ------------------------------------------------------------------
> > Detlef Bosau
> > Galileistraße 30
> > 70565 Stuttgart                            Tel.:   +49 711 5208031
> >                                            mobile: +49 172 6819937
> >                                            skype:     detlef.bosau
> >                                            ICQ:          566129673
> > detlef.bosau at web.de                     http://www.detlef-bosau.de
> >
> >

More information about the end2end-interest mailing list