Until a couple of years ago, the best way to test your storage hardware was with a black-box appliance, but now there are more options. The new options are performed entirely in software, courtesy of Calypso Systems, the Storage Networking Industry Association (SNIA), and perhaps soon the University of New Hampshire’s InterOperability Laboratory.
Load-testing software isn’t new, but it generally works by running sample data through your network. In the Calypso system, released as a specification via SNIA in May 2018, testing is performed by capturing your company’s actual workload. They call it the Real World Storage Workload (RWSW) Performance Test Specification for Datacenter Storage, a name that doesn’t roll off the tongue but will be music to the ears of the storage administrators who use it to uncover performance issues.
Traditional load-testing systems take a different approach. “They use these synthetic benchmarks to saturate the drives and test the numbers,” explained Calypso CEO Eden Kim. “People realize when you put storage into a real application they perform different.”
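The contrast Kim draws can be sketched in a few lines. A synthetic benchmark hammers the device with one fixed access pattern, which a real application rarely matches. The sketch below is purely illustrative, not any vendor’s benchmark tool; the file size, block size, and I/O count are arbitrary choices for the example.

```python
# Illustrative sketch of a synthetic benchmark: a fixed pattern of
# random 4 KiB reads that can saturate a drive but may look nothing
# like a real application's I/O mix.
import os
import random
import tempfile
import time

BLOCK = 4096                 # fixed block size, unlike real mixed workloads
FILE_SIZE = 1024 * 1024      # 1 MiB test file; real benchmarks use far more
IO_COUNT = 1000

# Create a scratch file to read back from.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(FILE_SIZE))
    path = f.name

random.seed(0)
start = time.perf_counter()
with open(path, "rb") as f:
    for _ in range(IO_COUNT):
        # Every I/O is identical in size and randomly placed: a
        # "saturate the drive" pattern, not an application trace.
        f.seek(random.randrange(0, FILE_SIZE - BLOCK))
        f.read(BLOCK)
elapsed = time.perf_counter() - start

print(f"{IO_COUNT} random {BLOCK}-byte reads in {elapsed:.4f}s")
os.unlink(path)
```

A real application, by contrast, mixes block sizes, read/write ratios, and queue depths in ways this single-pattern loop never exercises, which is exactly the gap the RWSW approach targets.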
The new system defines how to capture your workload and how to use it for assessing the results. “You can open the file and then create an I/O stream map. It’s a way to allow you to grab and see what actually occurs in your application. Once you have that, then you can go to the analysis part. You can then save that and create automatically a test script which would be a replay,” Kim said. Users can filter data to just the applications they want to test, he added.
None of this works on cloud storage unless you’re the cloud administrator, Kim noted. Meanwhile, a future update could build test libraries from real-world data uploaded by users. “The difficulty in doing that is everyone thinks their data is super-secret,” he acknowledged.
About 1,500 users have downloaded the tool and uploaded results so far, Kim said. Results can be analyzed with free online tools for basic analysis, or with Calypso’s paid software for more advanced work, although anyone is free to build analysis products.
Meanwhile, at the UNH laboratory, testing has traditionally been limited to storage vendors willing to bring their products in themselves. The only companies that usually do their own storage testing are hyperscale organizations such as Amazon or Google; occasionally a very large user, such as a financial company, gets involved, lab engineer David Woolf said.
But looking forward, he said, “I think there are some things that would be useful for people to understand around testing, and in particular it would be understanding whether or not you’re getting the best performance for your workload.” That could include knowing what your applications are doing, what traffic patterns they generate, and what infrastructure design would serve them best, he added.
For storage users who want access to such tests, “This is something that we’re working on expanding,” Woolf said. “There have been some data center folks that take those and run them in their infrastructure… When we found out that some data centers are using it, it was a little bit of a surprise to us.”
For those users, “I think in the future we’ll be working on our tools,” Woolf continued. “Being able to address performance issues, not just compliance issues, and that I think will be far more interesting.”
There’s not yet a timeline. “It’s a little bit of a greenfield area for us… Is there a tool out there that can tell you not just what your data center, your workload, is doing, but also how to improve that? That I think is an area that is not well-covered today,” he said. “I would think that’s one of the most useful things. There’s a lot that needs to be done in order to tune these networks.”