Data Centers

Create pools to maximize Cisco UCS blades' stateless computing features

Learn how to create LAN and storage pools that will let you take advantage of the stateless computing concepts offered by Cisco UCS blades.


I've written about Cisco UCS B-Series blade servers in my three previous articles. I covered how to rack and stack them, make sense of the Cisco UCS Manager software, and configure ports for the environment.

In this article I focus on getting various pools created so you can create service templates and get your blade servers running. If you haven't already cabled your UCS environment or configured ports, I suggest going back to the articles I've recently written or reading the Cisco UCS documentation.

There are several pools you'll want to create so you can better take advantage of the stateless computing concepts offered by Cisco UCS. I'll start with the UUID pool. A UUID (universally unique identifier) identifies a particular server; UCS Manager assigns one from this pool to each service profile, and the profile carries it to whichever blade it's associated with.

Follow these steps to create a UUID pool:

  1. Click the Servers tab in the UCS Manager.
  2. Expand Pools and right-click UUID Suffix Pools.
  3. Click Create UUID Suffix Pool.
  4. Give it a name. The name is arbitrary, but you might go with something like Company_ESXi_UUID.
  5. You can make the Assignment Order sequential if you like. Then click Next.
  6. Under the Add UUID Blocks screen, click the Add button.
  7. Enter a number from which to start your UUID pool. You can be creative with this and even use something like an address or site number so you can easily recognize it.
  8. Enter a size (Figure A). Consider entering a larger number than the number of blades you currently have in case you decide to add more blades later.
  9. Click OK and then click Finish.
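The arithmetic behind the block you just defined can be sketched in a few lines of Python. UCS Manager expects suffixes in the 4-plus-12 hex-digit format shown in Figure A; the site-number prefix of 0x0A01 and the starting counter of 1 below are made-up values for illustration.

```python
def uuid_suffix_block(site_prefix: int, size: int) -> tuple[str, str]:
    """First and last UUID suffix of a sequential block that starts at 1.

    UCS Manager UUID suffixes use a 4-hex-digit prefix, a dash, and a
    12-hex-digit counter (e.g. 0A01-000000000001).
    """
    return (f"{site_prefix:04X}-{1:012X}",
            f"{site_prefix:04X}-{size:012X}")

# A hypothetical site number 0x0A01 and a pool sized for 32 blades:
print(uuid_suffix_block(0x0A01, 32))  # ('0A01-000000000001', '0A01-000000000020')
```

Note that the size you enter in the GUI is a count, so the last suffix of a 32-address block is hex 20, not 32.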

Figure A


Next you'll configure the pools under the LAN tab; these include IP pools for the Cisco Integrated Management Controller (CIMC) connections, as well as IP addresses for the KVM connections. You probably won't end up using the CIMC management too often on your blades, but it's a good idea to set it up just in case. If you're managing C-Series UCS servers, the CIMC can be helpful there as well. You'll also need to configure the MAC address pools.

Follow these steps to create a new IP pool for CIMC management:

  1. Click the LAN tab in the UCS Manager.
  2. Expand Pools and Root.
  3. Right-click IP Pools and select Create IP Pool.
  4. Give it a descriptive name (e.g., ESXi_Management_IP).
  5. Choose sequential for the Assignment Order.
  6. Click Next.
  7. Click the Add button.
  8. Enter the correct IP information for your network along with the range you'd like to use for the CIMC IP addresses (Figure B).
  9. Click OK and Finish.
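The range you enter in Figure B is just a start address plus a count, and Python's standard ipaddress module makes the last address easy to check. The 10.1.1.50 start and block size of 16 below are made-up examples, not values from the article.

```python
import ipaddress

def cimc_block(first_ip: str, count: int) -> tuple[str, str]:
    """First and last address of a sequential CIMC management block."""
    start = ipaddress.IPv4Address(first_ip)
    return str(start), str(start + count - 1)

print(cimc_block("10.1.1.50", 16))  # ('10.1.1.50', '10.1.1.65')
```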

Figure B


To create the KVM IP pool, follow essentially the same steps, except this time right-click the ext-mgmt IP pool that exists by default. Under this pool, create the range of IPs you'd like to use for KVM access to the servers. You'll definitely want this when it's time to install an operating system.

Now you can move on to creating MAC pools; these will be the pools from which our vNICs get assigned MAC addresses. You'll create two different pools in this case: one for Fabric A and one for Fabric B so you can have redundant connections. If you have eight vNICs assigned to each server, four will go to Fabric A and four will go to Fabric B.

Follow these steps to create a MAC pool:

  1. While still on the LAN tab under Pools, right-click MAC pools and select Create MAC Pool.
  2. Give it a name, but keep in mind we're creating two pools, one for each fabric (e.g., ESXi_MAC_FAB_A).
  3. Click Sequential and then click Next.
  4. Click the Add button.
  5. Cisco automatically fills in 00:25:B5 as the first three bytes, and I recommend leaving them in place; you can set the last three however you like, for example, putting 0A in the fourth byte to mark Fabric A.
  6. Specify a size. If we stick with eight vNICs per server, four of which land on Fabric A, then a full chassis of eight blades needs 32 addresses per fabric (Figure C).
  7. Click OK and Finish.
  8. Repeat for Fabric B.
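As a sanity check on the sizing above, here's a small Python sketch that lays out a sequential MAC block under Cisco's default 00:25:B5 prefix, with the fabric marker in the fourth byte. The 0x0A marker and the size of 32 follow the article's suggestions; the helper itself is mine.

```python
def mac_block(fabric_byte: int, size: int) -> tuple[str, str]:
    """First and last MAC of a sequential block under the 00:25:B5 OUI.

    The fabric marker (e.g. 0x0A for Fabric A, 0x0B for Fabric B) lands
    in the fourth byte so you can tell the fabrics apart at a glance.
    """
    def fmt(n: int) -> str:
        return f"00:25:B5:{fabric_byte:02X}:{(n >> 8) & 0xFF:02X}:{n & 0xFF:02X}"
    return fmt(0), fmt(size - 1)

# 32 addresses covers four Fabric A vNICs on each of eight blades:
print(mac_block(0x0A, 32))  # ('00:25:B5:0A:00:00', '00:25:B5:0A:00:1F')
```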

Figure C


That pretty much handles the LAN pools. You can move on to the storage pool creation that will be necessary for connecting to storage arrays. In this example I'm going to stick with Fibre Channel connections, though iSCSI can be used instead. At the very least you'll need to create a WWNN pool and a couple of WWPN pools. For more information on these storage terms, see your storage vendor's documentation.

Follow these steps to create a WWNN pool:

  1. Click the SAN tab and expand Pools and Root.
  2. Right-click WWNN Pools and select Create WWNN Pool.
  3. Give it a name (e.g., ESXi_WWNN).
  4. Click Next.
  5. Click the Add button.
  6. Leave the default prefix alone, but feel free to customize the last three bytes.
  7. Assign a size. You can stick with 32, or whatever number you've been using for scalability; each server consumes exactly one WWNN.
  8. Click OK and Finish.
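For orientation, the WWNN block can be sketched the same way as the other pools. The 20:00:00:25:B5 prefix below is UCS Manager's default for WWN pools; treat the starting suffix of zero as an assumption.

```python
def wwnn_block(size: int, start: int = 0) -> tuple[str, str]:
    """First and last WWNN of a sequential block under the default
    20:00:00:25:B5 prefix; only the last three bytes vary."""
    def fmt(n: int) -> str:
        return ("20:00:00:25:B5:"
                f"{(n >> 16) & 0xFF:02X}:{(n >> 8) & 0xFF:02X}:{n & 0xFF:02X}")
    return fmt(start), fmt(start + size - 1)

print(wwnn_block(32))  # ('20:00:00:25:B5:00:00:00', '20:00:00:25:B5:00:00:1F')
```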

Follow these steps to create WWPN pools:

  1. While still under the SAN tab, right-click WWPN Pools and select Create WWPN Pool.
  2. Give it a name and include the Fabric like you did for the MAC addresses above. In this example I'll assume two vHBAs per server. One will go to Fabric A and one will go to Fabric B.
  3. Click Next.
  4. Click the Add button.
  5. Leave the first five bytes alone, but in the third-to-last byte consider putting 0A to signify that the pool belongs to Fabric A (and 0B for the Fabric B pool).
  6. Enter a size. If you plan on scaling up to 32 blades, you'll need 32 WWPNs for Fabric A and 32 for Fabric B (Figure D).
  7. Click OK and Finish.
  8. Repeat for Fabric B.
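The two WWPN pools differ only in the fabric byte, which a short sketch makes concrete. The 20:00:00:25:B5 prefix is UCS Manager's default for WWN pools, and the 0x0A/0x0B markers follow the convention from the steps above; the helper and its block size are illustrative.

```python
def wwpn_block(fabric_byte: int, size: int) -> tuple[str, str]:
    """First and last WWPN of a sequential block; the fabric marker sits
    in the third-to-last byte so the fabric is readable off the address."""
    def fmt(n: int) -> str:
        return f"20:00:00:25:B5:{fabric_byte:02X}:{(n >> 8) & 0xFF:02X}:{n & 0xFF:02X}"
    return fmt(0), fmt(size - 1)

# One pool per fabric, each sized for 32 blades with one vHBA apiece:
for fabric, marker in (("A", 0x0A), ("B", 0x0B)):
    print(fabric, wwpn_block(marker, 32))
```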

Figure D


This completes the pools setup. Look for an article on setting up templates soon. If you have any questions or comments about this series, please feel free to leave them in the comments section.


Lauren Malhoit has been in the IT field for over 10 years and has acquired several data center certifications. She's currently a Technology Evangelist for Cisco focusing on ACI and Nexus 9000. She has been writing for a few years for TechRepublic, Te...
