The technology big leagues
Take a look at the technology and content behind MLB.com.
Every day from March to October is game day for MLB.com, and the IT challenges stack up. Issues of scale, new products, digital asset management and juggling online access with television rights are just some of the challenges. Bottom line: What you see on MLB.com looks easier than it is. ZDNet took a tour of MLB.com’s facilities in Chelsea, New York to talk technology (see related blog post). Here’s a look at the key players and technologies behind the scenes.
Best of Breed
MLB.com’s chief technology officer Joe Choti says he takes a best-of-breed approach with technology. He also prefers to build his own applications. Under the hood of MLB.com is hardware and software from Sun Microsystems, running on Oracle databases. Most applications are built on Java. Akamai is MLB.com’s delivery platform. Tibco, Quova and SAS are also key vendors. On game days, MLB.com can see 20 million to 50 million page views an hour. Overall, MLB.com garners about 4 million unique users a day.
Content is king
Here’s a picture of MLB.com’s “bullpen,” which consists of workers who log and track the plays of a game. These stories, tidbits and play-by-play are disseminated on the Web site in real time. Stringers at the game also produce data; they focus more on getting the play tracking correct than on timeliness.
The bullpen and an army of loggers and producers create data sliced into many formats. Here’s an example.
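As an illustration of one logged play being sliced into multiple formats, here is a minimal sketch; the field names and the two output formats are assumptions for illustration, not MLB.com’s actual schema.

```java
// Sketch of a single logged play emitted in two of the many formats
// the bullpen could produce. All names here are illustrative.
public class PlayLog {
    record Play(String inning, String batter, String result) {}

    // Terse scoreboard line.
    public static String scoreboard(Play p) {
        return p.inning() + ": " + p.batter() + ", " + p.result();
    }

    // Sentence form for the running game story.
    public static String story(Play p) {
        return p.batter() + " " + p.result() + " in the " + p.inning() + " inning.";
    }
}
```

The same underlying record feeds both renderings, which is what lets one team of loggers supply scoreboards, tickers and game stories at once.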
Wireless: Geolocation matters
Choti has built a team of wireless application developers who help deliver scores, video and data feeds of MLB games. The group’s biggest challenge: geolocation. Each team in the league has forged its own television deals that restrict how video is delivered online. For instance, if you’re in New York you can’t watch the Yankees or Mets on MLB.com; hop a train to Philadelphia and you can. MLB.com relies on a company called Quova to track IP addresses and determine where a customer is while viewing MLB.com. “If you’re on a train and out of the local viewing area we have to be able to check,” says Choti.
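The blackout logic described above can be sketched as a simple lookup: resolve the viewer’s IP to a market (the job Quova does for MLB.com), then check that market against the team’s blacked-out markets. The class, mapping and market names below are assumptions for illustration, not Quova’s API or MLB.com’s actual blackout table.

```java
import java.util.Map;
import java.util.Set;

// Hypothetical blackout check: the viewer's market is derived from
// their IP address (via a service like Quova), then tested against
// the markets where the team's TV deals block online video.
public class BlackoutCheck {
    // Illustrative team -> blacked-out market mapping.
    private static final Map<String, Set<String>> BLACKOUTS = Map.of(
        "Yankees", Set.of("New York"),
        "Mets", Set.of("New York"),
        "Phillies", Set.of("Philadelphia")
    );

    // Returns true if the viewer's current market may stream this team.
    public static boolean canStream(String team, String viewerMarket) {
        Set<String> blocked = BLACKOUTS.getOrDefault(team, Set.of());
        return !blocked.contains(viewerMarket);
    }
}
```

Because the check keys on the viewer’s current IP-derived market rather than their account, a fan on a moving train is re-evaluated as their location changes, which is exactly the case Choti describes.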
Video: Do your own
MLB.com built its own video studio two years ago as it moved from delivering Web quality to broadcast quality. Here’s a shot of the control center where MLB.com delivers nightly recaps, interviews and manages video feeds.
Here’s another bullpen where an army of producers creates ancillary products from footage. For instance, on MLB.com you can get video footage of individual players in key settings, say Barry Bonds’ home runs on grass fields. This footage is catalogued by “loggers,” who tag footage and plays by time, player, location and other attributes.
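The loggers’ tags are what make a query like “Barry Bonds’ home runs on grass” possible. Here is a minimal sketch of that retrieval, assuming a simple tagged-clip record; the field names and query shape are illustrative, not MLB.com’s actual asset-management schema.

```java
import java.util.List;
import java.util.stream.Collectors;

// Illustrative sketch of tag-driven clip retrieval. Each clip carries
// the kinds of tags the loggers apply (player, play, field surface).
public class FootageIndex {
    record Clip(String player, String play, String surface, int inning) {}

    // Return every clip whose tags match the requested combination.
    public static List<Clip> find(List<Clip> clips, String player,
                                  String play, String surface) {
        return clips.stream()
            .filter(c -> c.player().equals(player)
                      && c.play().equals(play)
                      && c.surface().equals(surface))
            .collect(Collectors.toList());
    }
}
```

Querying for ("Barry Bonds", "home run", "grass") returns only clips tagged with all three attributes, which is how producers assemble player-specific packages from raw footage.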
Murals of baseball scenes and other league memorabilia line the walls, providing ambiance at work.
Video data center
Here’s where all the video gets managed and stored. This isn’t MLB.com’s primary data center, which resides offsite. This collection of Sun hardware primarily handles the on-site digital assets. After two years of partnering for capturing, encoding and streaming video, MLB.com decided to bring those functions in-house.
The IT offseason
When the baseball season starts, Choti is focused on maintenance, support and keeping IT infrastructure running efficiently. Once the World Series ends, Choti’s IT season begins. This offseason Choti moved to a service-oriented architecture (SOA) so MLB.com could swap components of various applications easily. For instance, with SOA, MLB.com can more readily shift its resources. That means at 9 a.m. EDT the computing power of video servers can be used elsewhere, for testing or transactions. Once 1 p.m. EDT hits, that computing power switches back to video. Choti estimates the site is 40 times faster than it was a year ago, largely due to SOA. Choti’s mission beyond the 2007 season: prepping the MLB.com network to handle high-definition television.
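The time-of-day reallocation described above can be sketched as a simple schedule: before the afternoon games, video-serving capacity is lent to other workloads. The 1 p.m. cutoff comes from the article; the pool names and single-cutoff design are assumptions for illustration, not MLB.com’s actual SOA implementation.

```java
import java.time.LocalTime;

// Minimal sketch of time-of-day resource assignment: the same servers
// handle testing/transactions in the morning and video from 1 p.m. on.
public class ResourcePool {
    // Returns which workload the video servers are assigned to right now.
    public static String assignment(LocalTime now) {
        return now.isBefore(LocalTime.of(13, 0)) ? "testing" : "video";
    }
}
```

The point of SOA here is that this switch is a routing decision rather than a re-deployment: because components are decoupled behind service interfaces, the same hardware can back different services at different hours.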