Storage load testing used to be the exclusive domain of hardware vendors, but it’s becoming a routine task that enterprise customers take on themselves.

Several years ago, vendors began running black-box tools such as Virtual Instruments’ Load Dynamix or open-source software to measure how their disk arrays perform under pressure. But as data volumes grew and applications diversified, the vendors’ numbers became almost as artificial as synthetic benchmarks.

“If you’re running any of the standard off-the-shelf benchmarking tools, you’re really not simulating a real-world load,” storage analyst George Crump said. His company, Storage Switzerland (the name was chosen to signal its vendor neutrality), is a Virtual Instruments customer. “We use it on a case-by-case basis. To me, if I’m a larger IT shop, this is something to have all the time,” he said.

SEE: Ethics policy: Vendor relationships (Tech Pro Research)

Virtual Instruments on Tuesday announced version 5.6 of its software and version 5.3 of the enterprise appliance package. “There’s nothing I’ve seen thus far that does what these guys do. I’m always looking for something that is similar,” Crump said, citing the open-source packages Big Data Benchmark, Fio, and Iometer as options for doing part of the job at lower cost. Load Dynamix starts at $50,000, company officials in San Jose, California said.
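For teams that start down the open-source route, Fio illustrates both the appeal and the limitation: it can generate a heavy synthetic load in minutes, but that load is exactly the kind of canned pattern Crump says fails to mirror real-world behavior. As a rough sketch (assuming Fio is installed on a Linux host; the file path, block size, and runtime are placeholder values to adjust), a small Python wrapper could launch a random-read test and pull the headline numbers from Fio's JSON output:

```python
import json
import subprocess

# Minimal sketch: a 60-second, 4KiB random-read test against a 1GiB file.
# All parameters here are illustrative, not a recommended test plan.
FIO_CMD = [
    "fio",
    "--name=randread-probe",      # arbitrary job name
    "--filename=/tmp/fio.test",   # point this at the storage under test
    "--rw=randread",              # random reads
    "--bs=4k",                    # 4KiB block size
    "--size=1g",                  # 1GiB working set
    "--ioengine=libaio",          # Linux asynchronous I/O
    "--iodepth=32",               # queue depth
    "--direct=1",                 # bypass the page cache
    "--runtime=60",
    "--time_based",
    "--output-format=json",
]

result = subprocess.run(FIO_CMD, capture_output=True, text=True, check=True)
stats = json.loads(result.stdout)["jobs"][0]["read"]

print(f"IOPS: {stats['iops']:.0f}")
print(f"Bandwidth: {stats['bw'] / 1024:.1f} MiB/s")  # Fio reports bw in KiB/s
```

A uniform random-read job like this says something about an array's ceiling, but nothing about how it behaves under a production mix of block sizes, read/write ratios, and bursts, which is the gap products like Load Dynamix aim to fill.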

Large organizations may be able to justify that bill. “Storage today is a very sophisticated software stack. The characteristics of customer workloads can dramatically affect the performance of arrays,” explained Tim Van Ash, senior VP of products.

The new version of Load Dynamix gained support for 32 gigabit Fibre Channel and 40 gigabit Ethernet connections. It also has new workload models for network-attached storage, along with assorted usability improvements, Van Ash said.

“It started out as an opportunity to make better decisions for new technology; [however] there are pretty significant decisions at times with some of the firmware upgrades,” Van Ash explained. Something as seemingly benign as a Microsoft Exchange point upgrade can dramatically change storage workloads, he said.

The new generation of hybrid and all-flash arrays also warrants in-house storage testing, Van Ash asserted. “It goes well beyond whether it just works today,” he said. “The problem with going all flash, unless you’ve upgraded your [host-bus adapters] and your storage network and everything else, is you need testing more than ever. The environment’s going to crash your network sooner.”

“We’ve seen situations where customers are buying equipment from different vendors because the attributes of the product are better for different workloads,” Van Ash continued. In some cases, the speed of flash storage is creating hotspots where multiple applications try to access the same data simultaneously. “That can cause many of the arrays out there to stumble today,” he noted.
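To picture the hotspot pattern Van Ash describes, consider a toy sketch (the file name, sizes, and thread counts below are hypothetical) in which several threads hammer one small region of a file while others scatter their reads across it, concentrating queue pressure on a single spot:

```python
import os
import random
import threading

PATH = "/tmp/hotspot.dat"         # hypothetical test file
FILE_SIZE = 256 * 1024 * 1024     # 256MiB
BLOCK = 4096                      # 4KiB reads
READS_PER_THREAD = 10_000

# Create a sparse test file once.
with open(PATH, "wb") as f:
    f.truncate(FILE_SIZE)

fd = os.open(PATH, os.O_RDONLY)

def hot_reader():
    # Every "hot" thread reads the same 4KiB block, the way multiple
    # applications contending for one popular dataset would.
    for _ in range(READS_PER_THREAD):
        os.pread(fd, BLOCK, 0)

def spread_reader():
    # Control case: reads scattered uniformly across the whole file.
    max_block = FILE_SIZE // BLOCK
    for _ in range(READS_PER_THREAD):
        os.pread(fd, BLOCK, random.randrange(max_block) * BLOCK)

threads = [threading.Thread(target=hot_reader) for _ in range(4)]
threads += [threading.Thread(target=spread_reader) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
os.close(fd)
```

On a laptop the operating system cache hides the difference, but against a shared array the hot threads all land on the same blocks, and how gracefully the array absorbs that contention is precisely what workload testing is meant to reveal.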

Crump said it would be good for the storage market to see Virtual Instruments gain a direct competitor. “This flash bubble that we live in, for the first time storage is faster than we need it to be, and that’s going to be short-lived,” he observed. That could expand the niche for storage testing, he said.
