10 obstacles holding back microserver adoption

Chip makers and server vendors are stirring things up in the microserver market, but a number of factors are hampering widespread adoption.

Microservers take the familiar concept of the blade server, which essentially puts multiple servers into one physical chassis for shared power and cooling, to another level. A Dell microserver chassis, for example, houses 12 physical servers in 3U of rack space, theoretically lowering space, cooling, and hardware costs. However, there are several challenges to widespread microserver adoption. Here are some of them.
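
As a rough back-of-the-envelope illustration of the density argument, the sketch below compares an assumed 12-node, 3U chassis against conventional 1U servers in a 42U rack. Every figure is an assumption for illustration, not a vendor specification.

# Rough density comparison: microserver chassis vs. 1U rack servers.
# All figures are illustrative assumptions, not vendor specifications.
RACK_UNITS = 42            # assumed usable height of a standard rack
CHASSIS_HEIGHT_U = 3       # assumed 3U microserver chassis
NODES_PER_CHASSIS = 12     # assumed 12 nodes per chassis
TRADITIONAL_PER_U = 1      # a conventional 1U rack server

micro_nodes = (RACK_UNITS // CHASSIS_HEIGHT_U) * NODES_PER_CHASSIS
traditional_nodes = RACK_UNITS * TRADITIONAL_PER_U
print(f"Microserver nodes per rack: {micro_nodes}")        # 168
print(f"Traditional nodes per rack: {traditional_nodes}")  # 42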

1: Unfamiliarity

Microservers bring a fairly new hardware form factor to the data center, departing from the blade concept that has been in place for years. While the underlying ideas are similar, everyone from data center designers to front-line operators will have to learn new design and operational considerations.

2: A gunfight at the processor corral

The mainstream x86 server market has largely standardized on Intel processors, with AMD holding limited market share. Intel's server chips are powerful beasts with well-understood capabilities. In the microserver market, Intel has introduced lower-powered versions of its traditional server chips, as well as a "server-grade" version of the Atom chip. However, the ARM-based chips that dominate mobile phones and tablets are also making a play for the data center, and the workloads targeted at microservers seem well suited to ARM designs. Companies may choose to wait out the first few generations of the microserver market until a processor architecture emerges as the standard.

3: The curse of the Atom

One of Intel's two classes of microserver processors uses the company's Atom branding. Recent Atom processors are excellent in terms of power consumption and processing capabilities for the right workloads, yet the brand still engenders skepticism, particularly from people with bad memories of the early days of the netbook craze. While it's patently unfair to discount the current generation of Atom processors, the brand alone will cause some to dismiss microservers.

4: Sizing

Sizing traditional servers is a well-understood exercise. Most vendors and implementation providers have proven sizing tools that quickly determine server and processor configurations. Because microservers are relatively new to the market, you or your partner may be guessing when sizing a microserver deployment, and a poor guess could be expensive.
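
In the absence of mature sizing tools, a first-pass estimate often comes down to simple capacity arithmetic. The sketch below is a minimal example of that kind of guesswork; the throughput and headroom figures are hypothetical and should be replaced with your own benchmarks or vendor guidance.

# Minimal capacity-planning sketch for a microserver deployment.
# All inputs are hypothetical; real sizing should come from benchmarks
# on representative workloads or from vendor sizing tools.
import math

peak_requests_per_sec = 30_000   # assumed peak load for the service
requests_per_node = 2_500        # assumed throughput of one microserver node
headroom = 0.30                  # keep 30% spare capacity for spikes/failover

nodes = math.ceil(peak_requests_per_sec / (requests_per_node * (1 - headroom)))
print(f"Estimated microserver nodes needed: {nodes}")   # 18 with these inputs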

5: Lack of clarity around workloads

Microservers have a compelling story around power and space savings, but when it comes to the types of workloads they're best equipped to handle, the story becomes a bit less clear. Since microserver processors generally eliminate some of the technology embedded in higher-end server chips, it's fairly obvious they're not the best choice for complex analytics or graphical calculations. Outside some well-defined niches, it's hard to tell if a microserver is the best choice. Most vendors mention areas like web serving or serving up virtual desktops, but they lack compelling use cases that show the savings that can be achieved by a microserver. After all, if a workload that could be accomplished with two traditional blades takes four or five microservers, savings may be illusory at best.
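
To see how easily those savings can evaporate, consider a rough comparison of the two-blade versus five-microserver scenario above. Every wattage and price below is an assumption for illustration, not a quote.

# Back-of-the-envelope comparison: two traditional blades vs. five
# microserver nodes handling the same workload. All figures are
# hypothetical assumptions; substitute your own quotes and measurements.
blade_count, blade_watts, blade_price = 2, 350, 5_000
micro_count, micro_watts, micro_price = 5, 45, 2_200

blade_power = blade_count * blade_watts   # 700 W
micro_power = micro_count * micro_watts   # 225 W
blade_capex = blade_count * blade_price   # $10,000
micro_capex = micro_count * micro_price   # $11,000

print(f"Power: blades {blade_power} W vs. microservers {micro_power} W")
print(f"Capex: blades ${blade_capex:,} vs. microservers ${micro_capex:,}")
# With these assumptions the power savings are real, but the hardware
# costs slightly more; the outcome swings entirely on the node counts.

Change the assumed per-node throughput and the balance tips the other way, which is exactly why clearer workload guidance from vendors matters.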

6: Open Compute

Technology companies like Facebook have been pioneering nontraditional servers for a number of years and have launched the Open Compute platform, which specifies designs for servers targeted at high-density web service providers. While Open Compute is arguably a distinct physical platform from microservers, it targets a similar market and presents early adopters with a major choice between two divergent paths: Open Compute or microservers.

7: Competing against virtualization

Facebook and others have largely proven that a giant farm of microservers or a similar technology is the best way to handle millions of concurrent web requests, but that doesn't mean a move away from virtualization is appropriate for every data center. Many companies have invested significantly in their virtualized infrastructure, and microservers return us to the days of the physical server. There is certainly some overhead to virtualization, in both technical and human terms, but a pool of physical servers also poses its own challenges.

8: Managing workloads

One of the great benefits of technologies like virtualization is that you can dynamically allocate processor and memory resources as workloads shift. With microservers, you are physically committed to a fixed level of resources once you place your order. With effective planning and load-balancing tools in front of your servers, managing that fixed capacity may be straightforward. But other workloads may be difficult to break into chunks that can be processed in parallel.
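
The sketch below illustrates the constraint: with a fixed pool of nodes, work has to be split into independent chunks up front rather than absorbed by a dynamically resized virtual machine. The node count and the toy workload are assumptions for illustration, and a local process pool stands in for the physical microservers.

# Illustration of splitting a workload into independent chunks for a
# fixed pool of nodes. A local process pool stands in for the physical
# microservers; the node count and workload are illustrative assumptions.
from concurrent.futures import ProcessPoolExecutor

NODE_COUNT = 12  # e.g., one 12-node microserver chassis

def process_chunk(chunk):
    # Placeholder per-node work; chunks must not depend on one another.
    return sum(x * x for x in chunk)

def split(items, parts):
    # Divide the work into roughly equal, independent chunks.
    return [items[i::parts] for i in range(parts)]

if __name__ == "__main__":
    workload = list(range(1_000_000))
    chunks = split(workload, NODE_COUNT)
    with ProcessPoolExecutor(max_workers=NODE_COUNT) as pool:
        results = list(pool.map(process_chunk, chunks))
    print(sum(results))

Workloads that decompose this cleanly map well onto a fixed pool of small nodes; the ones that don't are the ones you may struggle with.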

9: Moving to the cloud

Companies are increasingly hesitant about owning fixed IT assets like data centers and their racks of servers. In extreme cases, I've encountered clients who want to own no IT infrastructure other than the networks that connect their leased end-user devices to cloud services. A rack full of microservers now competes against a cloud provider, which in some cases lets you simply write a check and leave the questions of which hardware to select, and how to provision and maintain it, to someone else.

10: A product looking for a market?

I can't shake the feeling that the microserver might be a product in search of a market. Sure, there are niche markets where the microserver might fit nicely. But you're paying a premium for technology that has yet to be fully standardized, has complexities around workloads, and is new to the market. The upshot for vendors is that they can charge a slightly higher price, since microservers have not been subjected to the brutal competition and commoditization of the standard x86 server market. Unless you're a high-volume web provider or highly sensitive to power requirements, make sure there's a compelling case for the move to microservers before jumping on board.

About

Patrick Gray works for a global Fortune 500 consulting and IT services company and is the author of Breakthrough IT: Supercharging Organizational Value through Technology as well as the companion e-book The Breakthrough CIO's Companion. He has spent ...
