Bandwidth Challenge Results 111
the 1st sandman writes "SC2005 published some results of several challenges, including bandwidth utilization. The winner (a Caltech-led team from several institutes) was measured at 130 Gbps. On their site you can find some more information on their measurements and the equipment they used. They claimed they had a throughput of several DVD movies per second. How is that for video on demand!"
home use (Score:2, Interesting)
Re:home use (Score:2)
Never ! (Score:1)
Re:home use (Score:2)
My Comcast service already hits 4 Mbps whenever I ask it to, so it feels within reach, but I guess we'll see.
Re:home use (Score:2)
LOC'ed in. (Score:2, Funny)
How many Library Of Congress'es is that?
Re:LOC'ed in. (Score:4, Informative)
1 Library of Congress is 20 TB
1 fortnight is 1,209,600 s
130 Gbps is 16.25 GB/s, or about 0.0158691406 TB/s (binary TB)
0.0158691406 x 1,209,600 = 19,195.31 TB
So at 130 Gbps, after 1 fortnight 19,195.31 TB would be transferred.
19,195.31 / 20 = 959.77 Libraries of Congress per fortnight.
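A quick Python sanity check of that arithmetic, assuming (as above) 20 TB per Library of Congress and binary prefixes:

```python
# Sketch of the LoC-per-fortnight figure quoted above.
rate_gbps = 130                          # measured throughput, gigabits per second
rate_tb_per_s = rate_gbps / 8 / 1024     # ~0.0158691 TB/s (binary TB)
fortnight_s = 14 * 24 * 60 * 60          # 1,209,600 seconds
tb_per_fortnight = rate_tb_per_s * fortnight_s    # ~19,195.31 TB
loc_per_fortnight = tb_per_fortnight / 20         # ~959.77 LoC
print(round(tb_per_fortnight, 2), round(loc_per_fortnight, 2))
```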
Re:LOC'ed in. (Score:1)
959.77/14 = 68.555 LOC/DAY
68.555/24 = 2.856 LOC/HOUR
2.856/60 = 0.0476 LOC/MINUTE
0.0476/60 = 0.00079 LOC/SECOND
If my calculations are correct, that is. Gee, I guess that's not very fast then, is it?
I'll be impressed when they can get up to 1 LOC/SEC.
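For scale, a quick sketch of what 1 LOC/sec would actually take, using the same 20 TB figure:

```python
# How much faster than the challenge run would "1 LOC per second" be?
loc_tb = 20
required_tbps = loc_tb * 8                   # 160 terabits per second
current_gbps = 130
print(required_tbps * 1000 / current_gbps)   # ~1230x the challenge throughput
```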
Re:LOC'ed in. (Score:5, Funny)
Re:LOC'ed in. (Score:2)
John Doe can be an awful lot of people.
Sponsors? (Score:5, Funny)
The Bandwidth Challenge, sponsored by the good fellows at the MPAA and RIAA. I think they forgot to put their logos on the sponsor page.
Re:Sponsors? (Score:1)
Microsoft - What virus do you want today?
Probably not enough DVDs/sec (Score:5, Insightful)
They claimed they had a throughput of several DVD movies per second. How is that for video on demand!"
Given you might need to serve a few thousand people an hour (or more?), I'd say it's still got a while to go. Kinda sobering, when you think about it. Shiny discs and station wagons are going to be around for a while.
Re:Probably not enough DVDs/sec (Score:2, Insightful)
Re:Probably not enough DVDs/sec (Score:2)
Re:Probably not enough DVDs/sec (Score:2)
Besides, for the foreseeable future video on demand will be pay per view, so the number of simultaneous users will be far fewer than the number of households.
Re:Probably not enough DVDs/sec (Score:1, Informative)
You don't have them download the entire fucking DVD in one second or in one hour. Who, other than nerds, wants to fill up their hard drives with movies that they can simply watch at any time over the internet for a small subscription fee?
You stream it to them.
On a DVD movie the HIGHEST bitrate you're going to see is around 10 Mbps.
If you had a 130 Gbps pipe... that would allow you to serve 13 thousand customers on one connection, and that is at the highest quality setting available on DVD movies nowadays.
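As a sketch of that arithmetic (the 10 Mbps peak DVD bitrate is the parent's assumption):

```python
# How many peak-rate DVD streams fit in a 130 Gbps pipe.
pipe_gbps = 130
stream_mbps = 10
streams = pipe_gbps * 1000 / stream_mbps   # 13,000 simultaneous streams
print(int(streams))
```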
re: multicast (Score:2)
You get random access to the whole movie for 24 hours for about $4.
A good deal, I think, and I regularly watch a movie.
Re:Probably not enough DVDs/sec (Score:1)
Re:Probably not enough DVDs/sec (Score:1)
Re:Probably not enough DVDs/sec (Score:2)
VOD doesn't do that - it streams it in realtime, so you're talking about being able to serve many tens of thousands of customers simultaneously.
Take into account multicast and align each 'broadcast' to a minute granularity (so for a roughly 90-minute movie you only need 90 simultaneous streams of the most popular titles to serve everyone) and there's more than enough bandwidth to scale to even the largest city.
Even if you were wanting to download the whole DVD to a hard disk (a
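A back-of-envelope sketch of the multicast idea above (the 90-minute runtime and 10 Mbps stream are assumptions, not challenge numbers):

```python
# Multicast with minute-granularity start times: one "broadcast" per minute per title.
movie_minutes = 90
start_granularity_min = 1
streams_per_title = movie_minutes // start_granularity_min    # 90 concurrent streams
bandwidth_per_title_mbps = streams_per_title * 10              # ~900 Mbps per popular title
titles_in_130gbps = (130 * 1000) // bandwidth_per_title_mbps   # ~144 titles at once
print(streams_per_title, bandwidth_per_title_mbps, titles_in_130gbps)
```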
Re:Probably not enough DVDs/sec (Score:2)
Re:Probably not enough DVDs/sec (Score:1)
Re:Probably not enough DVDs/sec (Score:2, Interesting)
Let's take this scenario. There are around 10,000 users watching the movie (that's an average - we are not looking at Star Wars kind of popularity).
Each user needs at least 100 Mbps or more for an average viewing (this too is very conservative - consider HDTV).
10,000 * 100 Mbps = 1,000,000 Mbps = 100 Gbps (take 1 Gbps = 1000 Mbps)
Now what are we looking at? Serving 10,000 people? Eh!
Re:Probably not enough DVDs/sec (Score:1)
Re:Probably not enough DVDs/sec (Score:1)
Plus, you made an error, your calculation results should have been 1000 Gbps.
So 10,000 users would require 10,000 * 13 Mbps = 130,000 Mbps = 130 Gbps.
Coincidentally, that is what Caltech achieved.
By using a better encoding (H264) they might even double that number.
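A quick sketch of the corrected arithmetic in this subthread (the 13 Mbps DVD figure and the grandparent's 100 Mbps HDTV figure are both taken from the posts above):

```python
# 10,000 simultaneous viewers at DVD quality vs. the grandparent's HDTV assumption.
users = 10_000
per_user_mbps = 13
total_gbps = users * per_user_mbps / 1000    # 130 Gbps - the challenge figure
hdtv_mbps = 100
hdtv_total_gbps = users * hdtv_mbps / 1000   # 1,000 Gbps, i.e. 1 Tbps
print(total_gbps, hdtv_total_gbps)
```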
Re:Probably not enough DVDs/sec (Score:2)
Re: (Score:3, Insightful)
Re:Probably not enough DVDs/sec (Score:2)
Given you might need to serve a few thousand people an hour (or more?), I'd say it's still got awhile to go.
"Several per second" is equivalent to "a few thousand an hour."
Re:Probably not enough DVDs/sec (Score:1)
Re:Probably not enough DVDs/sec (Score:2)
Satellite ---> World
The internet uses packets.
Packets go from Point A ---> Point B
If you want another person to receive the aforementioned packets...
someone, somewhere has to send those packets to Person C.
And since we usually want to authenticate the person receiving the stream... well, you see why it's not so simple.
I'm just spouting at the mouth here, so if I'm wrong, feel free to correct me in a technical manner.
Re:Probably not enough DVDs/sec (Score:1)
But they said several. Let's guess three: that's 10,800 people an hour. Wikipedia says that in the 2000 census, the US was reported to have 281,421,906 people. If everyone watched their own movie simultaneously, you would need 26,058 of these servers. Now, peo
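Spelled out as a sketch (the "three per second" guess and the census figure are the parent's):

```python
# How many such servers would be needed if every US resident streamed at once.
dvds_per_second = 3
viewers_per_hour = dvds_per_second * 3600                 # 10,800 per server per hour
us_population = 281_421_906                               # 2000 census figure cited above
servers_needed = -(-us_population // viewers_per_hour)    # ceiling division -> 26,058
print(viewers_per_hour, servers_needed)
```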
shout out to my folks (Score:1)
I don't want to denigrate the Caltech crew either, as I know Stephen Low and find that he's one of the nicest guys I've ever gotten to work with.
They sure know how to make a ... (Score:1)
Mr. Phelps, (Score:3, Funny)
This packet will self-destruct in 8..7..6..5..
where's all the (Score:1)
Whatever you do... (Score:5, Funny)
In other news... (Score:1, Redundant)
In other news, the MPAA released a statement today saying...
Mandatory Spaceballs... (Score:3, Funny)
Re:Mandatory Spaceballs... (Score:1)
Sorry, I couldn't help but fix that.
farthings per furlong (Score:5, Interesting)
Re:farthings per furlong (Score:1)
It's actually 131Gb/s (Gigabits per second), which works out to about 16.4 GB/s (Gigabytes/sec).
16.4 / 0.00132 = approx. 12,424 users/sec streaming 1x speed DVD data.
Or, using your metrics, about 12.4 KDVDs/sec
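As a sketch of that figure, using the parent's 1.32 MB/s (0.00132 GB/s) value for 1x DVD read speed:

```python
# Concurrent 1x-speed DVD streams that fit in 131 Gb/s.
rate_gbytes_per_s = 131 / 8                          # ~16.4 GB/s
dvd_1x_gbytes_per_s = 0.00132
streams = rate_gbytes_per_s / dvd_1x_gbytes_per_s    # ~12,400 concurrent 1x streams
print(round(streams))
```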
92 Tbits/sec via Cisco gear about 18 months ago (Score:3, Interesting)
Ex-MislTech
So that means.. (Score:4, Funny)
What do we do next ?
Re:So that means.. (Score:2)
Bandwidth? (Score:1, Interesting)
RAM (Score:1)
While googling in an attempt to find what I was thinking about, I found this article from a year ago about a HUGE one of these bought by the US government for 'database crosschecking' (Spying on people in real time, for those of you wearing your tinfoil hats)
http://www.techworld.com/storage/news/index.cfm?NewsID=1176 [techworld.com]
Enjoy.
They used disks (Score:2)
The bandwidth challenge used to be about copying from
DVDs/sec? How about (Score:2, Funny)
That's nice, but what is it in Libraries of Congress per microfortnight?
Re:DVDs/sec? How about (Score:1)
1 Library of Congress = 10 TB (LOC is base 10) = 10^13 or 10,000,000,000,000 bytes
1 Fortnight = 1,209,600 seconds
1 Microfortnight = 1,209,600 * 10^-6 = 1.2096 seconds
So to move 1 LOC per microfortnight we need: 10^13 / 1.2096 = 8.26719577 * 10^12 bytes/s, which is (8.26719577 * 10^12) * 8 ≈ 6.61 * 10^13 bps
Now using the 130Gbps (130*1000*1000*1024) we get: 133,120,000,000 / ((8.26719577 * 10^12)*8)
Which is:
0.0020127744 LOC/mFtnght
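The same arithmetic as a sketch, using the parent's definitions (1 LOC = 10^13 bytes, 1 microfortnight = 1.2096 s, and 130 Gbps counted as 130 * 1,024,000,000 bps):

```python
# Libraries of Congress per microfortnight at the challenge rate.
loc_bytes = 10**13
microfortnight_s = 1_209_600 * 1e-6                          # 1.2096 seconds
bits_needed_per_ufn = loc_bytes / microfortnight_s * 8       # ~6.61e13 bps for 1 LOC/uFn
link_bps = 130 * 1000 * 1000 * 1024                          # 133,120,000,000 bps
print(link_bps / bits_needed_per_ufn)                        # ~0.00201 LOC per microfortnight
```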
The final line said it all folks! (Score:2)
But don't tell the RIAA or the MPAA, they'll have a press release out yet tonight about how much they lost to piracy. But I'll bet it's never crossed their minds that if they'd quit treating the customer like a thief, and give him an honest hour's entertainment for an honest hour's wages, plus letting us see how much the talent got out of that, we'd be a hell of a lot happier when we do fork over.
We don't like the talent t
Missing infrastructure (Score:4, Interesting)
This is nothing but an impressive statistic until ISPs provide this kind of bandwidth into homes (the infamous "last mile" connection). Not to mention that even the fastest hard drives available to consumers can't write data this fast.
Re:Missing infrastructure (Score:2)
100s of millions of people at 5Mbps == a heck of a lot of load.
Though yeah, Gbps to the home would be nice...
Tom
Re:Missing infrastructure (Score:1)
You might not be able to say "I want to watch a movie right now" and get it on demand, but what's wrong with a Netflix-like queue of films you would like to see and a "trickle" download system?
You could list 5 films you liked, and the system could merrily go off and trickle download 20 films that people who also liked the first 5 liked. You don't have to watch them all, or even pay for them.
HDD space is becoming less and less of a
Re:Missing infrastructure (Score:2)
Re:Yes, but (Score:1)
Re:Yes, but (Score:2)
hey i want one of those (Score:1)
A chain of airplanes has more throughput (Score:3, Insightful)
Imagine you had as many big planes as possible taking off from each airport and landing at the other every day.
Imagine they were all filled with hard disks or DVDs.
Now THAT is a lot of bandwidth.
Latency sucks though.
The moral of the story:
Bandwidth isn't everything.
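A toy version of the parent's point, with completely made-up numbers (20,000 half-terabyte disks on a 6-hour flight):

```python
# Sneakernet throughput of one hypothetical cargo plane full of hard disks.
disks = 20_000
disk_gb = 500
flight_hours = 6
payload_bits = disks * disk_gb * 8 * 1e9                       # ~8e16 bits on board
throughput_gbps = payload_bits / (flight_hours * 3600) / 1e9   # ~3,700 Gbps effective
latency_hours = flight_hours                                    # ...and that's the catch
print(round(throughput_gbps), latency_hours)
```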
They used Linux 2.6 kernel (Score:4, Interesting)
Impressive work, either way.
Re:They used Linux 2.6 kernel (Score:5, Interesting)
The problem with latency is that everyone lies about the figures. I talked to some of the NIC manufacturers and got quoted the latency of the internal logic, NOT the latency of the card as a whole, and certainly not the latency of the card when integrated. There was one excellent-sounding NIC - until you realized that the bus wasn't native but went through a whole set of layers to be converted into the native system, and that the latency of these intermediate steps, PLUS the latencies of the pseudo-busses it went through, never figured in anywhere. You then had to add in the latency of the system's bus as well. In the end, I reckoned that you'd probably get data out at the end of the week.
I also saw at SC2005 that the architectures sucked. The University of Utah was claiming that clusters of Opterons didn't scale much beyond 2 nodes. Whaaaa???? They were either sold some VERY bad interconnects, or used some seriously crappy messaging system. Mind you, the guys at the Los Alamos stand had to build their packet collation system themselves, as the COTS solution was at least two orders of magnitude too slow.
I was impressed with the diversity at SC2005 and the inroads Open Source had made there, but I was seriously disgusted by the level of sheer primitiveness of a lot of offerings, too. Archaic versions of MPICH do not impress me. LAM might, as would LAMPI. OpenMPI (which has a lot of heavy acceleration in it) definitely would. The use of OpenDX because (apparently) OpenGL is "just too damn slow" was interesting - but if OpenDX is so damn good, why hasn't anyone maintained the code in the past three years? (I'd love to see OpenGL being given some serious competition, but that won't happen if the code is left to rot.)
Microsoft - well, their servers handed out cookies. Literally.
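On the latency point above, a toy illustration (all numbers invented for the example) of why a quoted NIC-internal figure says little about the end-to-end number:

```python
# End-to-end latency is the sum of every stage, not just the one the vendor quotes.
latencies_us = {
    "NIC internal logic (the quoted number)": 1.0,
    "bus bridge / protocol conversion": 3.0,
    "host bus arbitration": 2.0,
    "driver + OS stack": 10.0,
}
print(sum(latencies_us.values()), "us end to end, not 1.0")
```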
Re:They used Linux 2.6 kernel (Score:2)
Perhaps OpenDX is significantly more complex to write but executes more efficiently for the job at hand.
Re:They used Linux 2.6 kernel (Score:2)
Re:They used Linux 2.6 kernel (Score:2)
Some BSDs were considered, but would probably not make too big of a difference directly. Most of this was more about getting individual nodes working together than raw bandwidth out of a single box. A lot of the tools used were intended for Linux and were otherwise kind of untested or not really configured yet for use on one of the BSDs (although I think that may be looked into soon). In the end, with each node pushing out about 940-950 Mbps on a 1 Gbps connection, there is not too much more to squeeze out.
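For reference, a sketch of why ~940-950 Mbps is roughly the ceiling for TCP over gigabit Ethernet with a standard 1500-byte MTU (assuming IPv4 and no TCP options; timestamps would shave off a few more Mbps):

```python
# Per-frame overhead on GigE: Ethernet header, FCS, preamble and interframe gap
# all count against the 1 Gbps wire rate, plus IP and TCP headers inside the frame.
mtu = 1500
tcp_payload = mtu - 20 - 20            # minus IPv4 header and TCP header
on_wire = mtu + 14 + 4 + 8 + 12        # Ethernet header, FCS, preamble, interframe gap
efficiency = tcp_payload / on_wire     # ~0.949
print(round(efficiency * 1000, 1), "Mbps of TCP payload on a 1 Gbps link")
```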
Re:They used Linux 2.6 kernel (Score:1)
1) Only some of the members used patched 2.6.12.x kernels. I used XFS patched Scientific Linux (i.e., RHELv3 compatible) 2.4.21-37 kernels on my nodes (21 senders, 41 receivers), so nothing terribly special.
2) It wasn't 131Gbps over a single link: we had 22 10Gbps links in to and out of the routers in our booths on the show floor that were measured simultaneously. The links were full duplex, so the maximum theoretical bandwidth in and out was 440Gbps. Ho
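Spelling out the aggregate described above (the 131 Gbps measured figure is the one quoted elsewhere in the thread):

```python
# Theoretical ceiling across the show-floor links vs. the measured total.
links = 22
per_link_gbps = 10
theoretical_max_gbps = links * per_link_gbps * 2    # full duplex -> 440 Gbps
measured_gbps = 131
print(theoretical_max_gbps, round(measured_gbps / theoretical_max_gbps * 100), "% utilized")
```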
hahah (Score:3, Funny)
Sony Rootkit? (Score:1)
connectionListener();
sendSpam();
startDOS();
Gbps (Score:3, Interesting)
I'm looking through these charts and I am not finding an important number, how far the signal can be sent at that rate before it starts dying. Repeaters could be responsible for keeping this in vaporworld.
Re:Gbps (Score:2)
How much money do you have? That's the limiting factor. The hardware is available, it's just very expensive. There are fiber optic amplifiers that boost the signal level without having to demodulate it and regenerate it.
Re:Gbps (Score:1)
available bandwidth actually used is pretty impressive.
_ Booker C. Bense
Big deal (Score:2)
j/k
Tom
Only measured 17 of the 22 wavelengths (Score:1)
A better standard instead of DVDs (Score:2)
And, at least the size is standard.
PNG? (Score:3, Funny)
Re:PNG? (Score:2)
We're in luck ... (Score:2)
That's almost as fast as the movie industry is generating crappy movies to download!
BW Challenge: less and less relevant (Score:2)
Re:BW Challenge: less and less relevant (Score:1)
What's cool about that is that we have made a leap in technology, and we have new bottlenecks to fatten up for next year. It's like advancing an army one phalanx at a time.
Re:BW Challenge: less and less relevant (Score:2)
I'd like to see a phased-array radio telescope that supplies raw data to each remote user for beam forming.
Re:BW Challenge: less and less relevant (Score:1)
Especially when using a protocol'd network.
Need a bigger quota... (Score:1)
What does each component do? (Score:2, Informative)
Re:What does each component do? (Score:2)
haha, okay, j/k. They set it all up so that nobody could make the obligatory "what about a beowulf cluster of those" joke.
not sure.... (Score:2)
I can't answer the question until I know WHICH several movies.
Re:not sure.... (Score:1)
Re:not sure.... (Score:2)
I pulled those movies from here [imdb.com] and here [maximonline.com]
(The Maxim list loses most of its credibility by rating Dune as one of the worst movies ever.)
I hope NOT! (Score:1)