[e2e] Compression of web pages

Woojune Kim wkim at airvananet.com
Tue Aug 27 11:30:22 PDT 2002


> Are you referring to
> http://online.wsj.com/article/0,,SB1029974251705867835,00.html ?
> 

Not sure of the exact URL (I don't have web access to the WSJ), but I think it was printed on Monday or last Friday.

> My guess is that is baloney he was fed by Sprint 
> salescritters and that

:-)

> Recall that PPP compression doesn't do much for web surfing because
> the stuff that takes the most time is generally already compressed
> pictures.  No compression, no matter how magical, can significantly
> compress or speed up the transmission of a .gif or .jpg (except
> perhaps in weird, very unusual cases).
> 

From Mr. Mossberg's comments that the figures were grainy, I had the impression that some sort of proxy was taking the .gif and .jpg files apart and re-encoding them at lower quality, so as to make them smaller. Wouldn't that work?
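
For what it's worth, here is a rough sketch of the kind of image transcoding I am imagining, written in Python with the Pillow imaging library. The file name and quality setting are made-up assumptions for illustration; I have no idea what Sprint's proxies actually do.

    # Hypothetical transcoding step: re-encode a JPEG at a lower quality
    # setting.  Assumes Pillow; "photo.jpg" and quality=30 are made up.
    from io import BytesIO
    from PIL import Image

    def recompress_jpeg(original_bytes, quality=30):
        # Decode the original image, then re-save it as a lower-quality
        # JPEG, trading clarity for size.
        img = Image.open(BytesIO(original_bytes))
        out = BytesIO()
        img.save(out, format="JPEG", quality=quality)
        return out.getvalue()

    with open("photo.jpg", "rb") as f:
        original = f.read()

    smaller = recompress_jpeg(original)
    print("original: %d bytes, recompressed: %d bytes"
          % (len(original), len(smaller)))

On a photo-heavy page, that kind of lossy transcoding could cut the bytes on the wire by a large factor, which would explain both the speedup and the graininess.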

> The other issue is that the bits in a web page are usually too few for
> compression to matter at link speeds above 30 Kbit/sec.  The latencies
> of DNS resolution, of the reach over the Internet, and of the HTTP
> server itself are generally greater than the time needed to transmit
> a few Kbytes of HTML.
> 
> 400 Kbit/sec through a real link running 50-70 Kbit/sec is a 6:1 to
> 8:1 compression ratio.  That is impossible except in special cases.
> The data used by `ping` or lists of IP addresses or email addresses
> can often be compressed (e.g. by LZW) by 8:1, 30:1, or even more, but
> "average" compression rates for "typical," not already compressed
> network data (e.g. not GIFs) using good compression schemes (plenty
> of history, and not just per packet as in common uses of LZS) are 2:1
> to less than 4:1.
>


Overall, no big difference in opinion. But Mr. Mossberg mentions that he used some of the bandwidth-measuring websites. Do you know what sort of data they use? Would their data be more susceptible to compression?
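
On the ratio point, here is a quick sketch of the difference between repetitive data and data that is already compressed, using Python's zlib (DEFLATE rather than LZW, but the point is the same). The sample data below is made up for illustration.

    import os
    import zlib

    def ratio(data):
        # Uncompressed size divided by compressed size.
        return float(len(data)) / len(zlib.compress(data, 9))

    # Highly repetitive data, like a list of IP addresses, compresses well.
    ip_list = "\n".join("10.0.%d.%d" % (i // 256, i % 256)
                        for i in range(10000)).encode()

    # Pseudo-random bytes stand in for already-compressed payloads
    # (.gif / .jpg), which barely compress at all.
    already_compressed = os.urandom(100000)

    print("IP address list:    %.1f:1" % ratio(ip_list))
    print("already compressed: %.1f:1" % ratio(already_compressed))

If the bandwidth testers push text or other repetitive patterns, a compressing proxy would make them look far faster than it could ever make a real, image-heavy page.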

thanks



