[e2e] end2end-interest Digest, Vol 19, Issue 11

Yogesh Prem Swami yogesh.swami at nokia.com
Wed Sep 14 15:29:31 PDT 2005


I have a somewhat general question about simulations. Is there any
scientific reason why simulations should be able to predict the behavior
of real packet transmission phenomena? Unless you assume that packet
transmission/interaction is a non-chaotic phenomenon (chaotic in the
sense used in physics), there is no reason to believe that a simulation
would be able to model real-world events.

In other words, how did the networking community come to the conclusion
that the error between simulation results and real-world packet
transmission would be bounded if someone ran the simulations long
enough? Also, stability of the mean and variance only signifies that the
system (the simulator) has a saddle point, nothing more. (Although I do
agree that this is the very least a researcher can do.)

To give an analogy, take weather prediction. Accurately predicting the
weather is hard because weather is fundamentally a chaotic process that
cannot be easily simulated. If packet transmissions are like weather,
then there is no reason why simulations and real-world implementations
of protocols would show any similarity in their behavior.
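
To make the "sensitive dependence on initial conditions" point concrete,
here is a minimal Python sketch. It is not a network model at all, just
the textbook logistic map in its chaotic regime; the starting values and
the 1e-12 perturbation are purely illustrative:

    # Two trajectories of the chaotic logistic map, started 1e-12 apart.
    r = 4.0                                  # chaotic regime of x -> r*x*(1-x)
    x, y = 0.400000000000, 0.400000000001    # nearly identical initial states
    for step in range(61):
        if step % 10 == 0:
            print(step, abs(x - y))          # separation between the two runs
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
    # The separation grows roughly exponentially and then saturates at
    # order 1, so any fixed modeling error eventually dominates the forecast.

Whether packet dynamics actually behave this way is exactly the question
I am asking; the sketch only shows what "chaotic" would mean for the
prediction error of a simulation.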

Unless someone shows me a proof that packet transmissions fall within
the non-chaotic region, I will have a hard time accepting or advocating
the use of NS-2 or any other simulation tool.

Thanks
Yogesh

ext S. Keshav wrote:
> Detlef, 
>     Since you mentioned my web site, I will save others the effort of wading
> through it by posting the single relevant sentence: "The goal of simulation is
> intuition, not numbers," R.W. Hamming. I was taught this by Sam Morgan at
> Bell Labs, who heard it from the horse's mouth.
> 
> In terms of simulator validation, I can tell you how I validated REAL when I
> first wrote it in 1988 (BTW, this was not my idea: I was only carrying out
> Scott Shenker's instructions): I wrote the same simulation using two
> packages -- CSIM and NEST (which eventually became REAL). Then, I compared
> every packet transmission and reception at the time granularity of one
> microsecond. If there was a difference, I found and fixed any bugs. This
> allowed me to find several bugs in both simulators.
>
> Sam Morgan did not trust REAL, and he spent a few months comparing REAL
> results with queueing theoretic results for M/M/1 and M/D/1 queues.
> (Thankfully, he did not find any bugs.) I wonder if any other simulators
> have been compared using this straightforward technique.
> 
> My two cents:
> 
> Simulation results that do not include
> 
> * an analysis of parameter stability, i.e. the length of time you need to
> run the simulations before the metrics reach their steady-state values, and
> 
> *  both means and standard deviations (or error bars)
> 
> are just plain bogus.
> 
> I was surprised to find that of the 44 'good' papers I taught last year,
> only ONE reported standard deviations with its results. All the rest that had
> simulation results showed a single data value. Imagine what the situation is for 'not so
> good' papers!
> 
> keshav
> 
> 
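
For anyone who wants to repeat the kind of M/M/1 cross-check described
above, a minimal sketch follows. The event loop, names, and parameters
are illustrative only (this is not code from REAL or any other
simulator); it runs a toy single-server FIFO queue and compares the
empirical mean time in system against the closed-form M/M/1 value
W = 1/(mu - lambda):

    import random

    def mm1_mean_sojourn(lam, mu, n_customers=200000, seed=1):
        """Toy single-server FIFO queue with Poisson arrivals and
        exponential service; returns the empirical mean time in system."""
        rng = random.Random(seed)
        arrival = 0.0          # arrival time of the current customer
        server_free_at = 0.0   # time at which the server next becomes idle
        total_sojourn = 0.0
        for _ in range(n_customers):
            arrival += rng.expovariate(lam)               # Poisson arrivals
            start = max(arrival, server_free_at)          # wait if server busy
            server_free_at = start + rng.expovariate(mu)  # exponential service
            total_sojourn += server_free_at - arrival     # time in system
        return total_sojourn / n_customers

    lam, mu = 0.7, 1.0
    print("simulated mean sojourn:", mm1_mean_sojourn(lam, mu))
    print("M/M/1 theory          :", 1.0 / (mu - lam))    # about 3.33 here

The same harness with a constant service time gives the M/D/1 case,
whose mean waiting time (by Pollaczek-Khinchine) is half that of M/M/1
at the same load.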

