World Series Ticket Sales Overwhelm Servers
vlakkies writes "The Colorado Rockies Major League Baseball team decided to only sell tickets for the World Series games at Coors Field online. As a result of overwhelming interest, the ticket vendor Paciolan experienced a system meltdown resulting in a suspension of all ticket sales."
Contract with Akamai (Score:3, Interesting)
Sure, it wouldn't make sense after the initial week, but this is becoming a major joke lately. These places always seem to underestimate demand by a factor of, like, hundreds.
You'd be indeed too late. (Score:3, Interesting)
Scalpers' plans fell through (Score:4, Interesting)
A hell of a pain in the ass indeed.. (Score:4, Interesting)
Just seconds after 10 AM Mountain time, the site (evenue.net) became completely unresponsive. After about an hour of reloading and fighting with the system, I finally got in. I was able to (excruciatingly slowly) pick seats and get to my shopping cart. After that they took me to a captcha (which didn't load), and following that to a registration page to take all my info and credit card number.
Hitting submit on that page caused an hour-long hang that eventually just kicked me back out to the waiting page. I had several family members across the country try to get in as well, all with no success.
What's interesting though is it seems that evenue was using a load balancing system to automatically assign the end user to one of their servers...
Over the course of trying to get tickets I was connected to ev14.evenue.net, ev15.evenue.net, ev9.evenue.net, ev5.evenue.net, and finally (the server that got me through), ev8.evenue.net.
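That retry-until-one-responds experience can be sketched in a few lines. This is a toy illustration of the client's side of it, not evenue's actual balancer; the host names are the ones observed above, and the availability check is a stand-in for a real HTTP probe:

```python
import random

# Hosts observed during the failover dance (hypothetical pool order).
POOL = ["ev5", "ev8", "ev9", "ev14", "ev15"]

def pick_server(pool, is_up):
    """Try servers in random order; return the first responsive one, or None."""
    candidates = pool[:]
    random.shuffle(candidates)
    for host in candidates:
        if is_up(host):
            return host
    return None

# Toy check: only ev8 is answering, as in the account above.
up = {"ev8"}
print(pick_server(POOL, lambda h: h in up))  # -> ev8
```

With every server but one swamped, you keep landing on dead hosts until luck (or an hour of reloading) routes you to the live one.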
I'm willing to bet that they're all on the same backbone connection though, and from the way things went I can't imagine it being any fatter than a 45 Mbps link.. then again, 8.5 million hits in an hour *is* a lot. In order to sustain that load from a single datacenter (not that they'd have to, but from the sounds of it they were; all their servers seem to be in the same datacenter in California) they'd need, oh, 8,500,000/60/60*56/1024 ≈ 130 Mb/s... which really isn't that much at all (that's assuming a 56 kbps connection per person for a reasonable experience on the site).
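Working that arithmetic out longhand (same numbers as above, nothing added):

```python
hits_per_hour = 8_500_000
kbps_per_user = 56  # assumed dial-up-class connection per visitor

hits_per_second = hits_per_hour / 3600        # ~2361 requests/s
total_kbps = hits_per_second * kbps_per_user  # ~132,222 kbps
total_mbps = total_kbps / 1024                # ~129 Mb/s
print(round(total_mbps))  # -> 129
```

So roughly 130 Mb/s of aggregate bandwidth, i.e. about three 45 Mbps DS3 links' worth. Bandwidth alone wasn't the bottleneck.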
So what it really boils down to, then, is the inefficiency of their server code and the number of servers they have. From the failover numbers it looks like they only had ~20 servers handling this... And from the design of their site (Lots of java
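The ~20-server guess above can be sanity-checked the same way (assuming the 8.5 million hits/hour figure and an even spread across servers):

```python
hits_per_hour = 8_500_000
servers = 20  # estimate inferred from the evX host numbering above

per_server_rps = hits_per_hour / 3600 / servers  # requests/s per server
print(round(per_server_rps))  # -> 118
```

Roughly 118 requests per second per machine is trivial for static pages, but easily enough to crush servers doing heavy dynamic work (seat selection, cart state, payment) on every hit.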
In any case, the Rockies are new at this