[e2e] end2end-interest Digest, Vol 19, Issue 11
nichols at mountainfog.com
Thu Sep 15 11:27:19 PDT 2005
I'm feeling a curmudgeonly rant coming on.
David Reed gives some excellent perspective on the use of simulation. In
my experience, it's the "perspective" that seems to be out of order in
much of the research use of simulation. From the start, I've cast a
suspicious eye at the results of my own simulations, not to mention
others. The most egregious problem seems to be claiming internet-wide
performance improvements based on very limited simulations. (Jon
Crowcroft gave me the chance to rant on this at the Decides BoF at
the Oslo IETF in 1999 in the diffserv context.) Making sweeping changes
in widely deployed internet protocols on the basis of simple simulation,
or even simple lab experiments, just seems like a bad idea.
On the other hand, simulation can be excellent for understanding the
dynamics of smaller scale interactions. Years ago while working at
the Bell Labs That Was, I was working on a visual simulator and
modeled a colleague's distributed computer network. The visuals
beautifully showed a deadlock. I told him not to worry, my simulator
was still being developed. Three weeks later he told me that they'd
found the deadlock in the lab. Okay, sometimes simulation works.
When working at my very first start-up, I simulated TCP interactions
with a cable data system and found some problems in the protocol.
This seems another nice use of simulation - to explore inter-layer
interactions before everything gets built and deployed.
I went through all the channels to explain how this needed to be fixed.
The engineer writing the code decided not to implement the change.
When a problem showed up in the beta roll-out, it was obvious the
change hadn't been implemented. Over the years, I've used simulations
to learn various cool things about how things interact and get
some insight. It never occurred to me to recommend new protocols
for the internet on the basis of a simulation though.
The first time I used ns (before ns-2), I was persuaded to do so by
Van Jacobson who told me that ns's event sorting was very efficient
and that the TCP code was very good. I found out that events were
stored in a linked list (I think the statute of limitations on my
complaining about this is about up, though) and the TCP implementation
did not have the three way handshake nor did it shut down connections
properly. On the other hand, it took me less time to fix this stuff and
get a useful set of simulations than the amount of time I'd put into
trying to model MAC-layer/TCP interactions in OPNET. When I picked up
ns-2 the first time, I found the "full TCP" code was full of bugs and
the OTcl/C++ interaction made it much harder to code your own models.
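For what it's worth, the efficiency complaint is easy to see in miniature: inserting an event into a time-sorted linked list costs O(n) per insert, while a binary heap does it in O(log n). Here's a toy sketch of the heap-based approach (the names and structure are mine, not ns's):

```python
import heapq

class HeapScheduler:
    """Toy discrete-event scheduler backed by a binary heap.
    Insert and pop are O(log n); a time-sorted linked list,
    as in the early ns scheduler, pays O(n) per insert."""
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker so same-time events stay FIFO

    def schedule(self, time, event):
        heapq.heappush(self._queue, (time, self._seq, event))
        self._seq += 1

    def run(self):
        # Pop events in timestamp order until the queue drains.
        fired = []
        while self._queue:
            time, _, event = heapq.heappop(self._queue)
            fired.append((time, event))
        return fired

sched = HeapScheduler()
sched.schedule(2.0, "timeout")
sched.schedule(0.5, "send")
sched.schedule(1.0, "ack")
print(sched.run())  # fires in timestamp order regardless of insert order
```

A calendar queue, which ns-2 later adopted, does even better for typical event distributions, but the heap is the simplest fix over a linked list.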
So, I think I'm supporting Detlef and those who say it's important to
read through the models and to know what you are modeling and to know
what the goals of your simulation are. I'm sort of looking for something
without the baggage of ns-2 these days. Has anyone used OMNeT++?
Perspective is a good thing to have in life. Especially when it comes
to the use of simulation. I'm not sure we can expect the average
grad student to have such perspective, but it seems like it might
reasonably be expected of those who advise graduate students.