Exploring new restaurants can be one of the most enjoyable parts of spending time in a new city. Picking one out of so many good choices is no small task, which is why many people rely on sites like Yelp that provide a way to objectively compare restaurants. Selecting the right VDI vendor requires even more scrutiny, and a similarly objective way to compare different solutions.
Recently, SimpliVity released two white papers (VDI without Compromise and Citrix XenDesktop Performance) covering our End User Computing (EUC) performance testing on SimpliVity hyperconverged infrastructure. Both papers describe the performance characteristics we measured when using the Login VSI benchmarking tool to put our technology through its paces.
Comparing performance tests from different vendors can be difficult, given how drastically even minor changes in configuration or test process can alter the outcome. Using Login VSI allowed SimpliVity to publish standardized performance results that can be compared directly with those of other vendors. As with any benchmarking tool, though, Login VSI has many configuration options that can change the results of a test run, which is why Login VSI created its “Tested By” and “Validated By” programs.
The “Tested By” logo is available to vendors using any version of Login VSI; it provides a marketing and technical review of vendor publications, along with promotional support from Login VSI. The “Validated By” program, however, is a much more rigorous process. To receive this newly created logo, a vendor must use the latest version of Login VSI, which matters because newer versions take different approaches to measuring relative performance. Vendors seeking the “Validated By” logo must also submit their test environment for review by Login VSI engineers and pass an audit of the results. The audit verifies that the tests were run according to the very specific testing guidelines Login VSI has defined, improving the comparability of test results between vendors.
When presented with the opportunity to participate in the new “Validated By” program, SimpliVity found it an easy decision to pursue the more rigorous option and give our results the greatest possible credibility. We worked extensively with Login VSI to understand their guidelines, ensure our infrastructure was properly configured, interpret the test results, and jointly release them. The outcome was a very deep understanding of our performance results, with the full weight of Login VSI standing behind them.
That consistency shows up in the metrics and graphs Login VSI creates after completing test runs. Testing starts by measuring the responsiveness of the environment in an idle state, which establishes the VSIbase metric (669 in the chart below). Adding 1000 to VSIbase establishes the VSImax v4.1 threshold, the level at which the end-user experience will no longer be satisfactory (1669 in the chart below). Login VSI then uses a number of separate launcher machines to start sessions against the VDI environment and measures the responsiveness of those sessions, producing the running VSImax v4.1 average as each additional desktop workload kicks off (the final measurement was 1246 in the chart below).
This chart shows the result of a 1000-user login storm on a VMware Horizon View environment hosted on SimpliVity OmniStack. The thick blue line is the VSImax v4.1 average measurement as each user logs in; it starts at the same value as VSIbase (the thin blue line) and slowly increases as additional desktops log in. The logical limit of desktops this environment can run is reached when the VSImax v4.1 average line crosses the VSI Threshold (the thin red line). The thick red and green lines are the maximum and minimum response times across all the desktops, respectively.
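To make the arithmetic concrete, here is a minimal sketch of the threshold logic described above. This is an illustration only, not Login VSI’s actual implementation: the real VSImax v4.1 calculation weights response times from multiple timed actions within each session, and the function names (vsimax_threshold, find_vsimax) are hypothetical.

```python
# Hypothetical sketch of the VSImax v4.1 threshold logic described above.
# Not Login VSI's code; the real calculation is more involved.

def vsimax_threshold(vsibase_ms: float) -> float:
    """The VSImax v4.1 threshold is VSIbase plus a fixed 1000 ms."""
    return vsibase_ms + 1000


def find_vsimax(avg_response_ms: list[float], vsibase_ms: float):
    """Return the session count at which the running average response
    time first crosses the threshold, or None if it never does."""
    threshold = vsimax_threshold(vsibase_ms)
    for sessions, avg in enumerate(avg_response_ms, start=1):
        if avg >= threshold:
            return sessions
    return None  # threshold never reached in this run


# Using the numbers from the chart: a VSIbase of 669 ms gives a
# threshold of 1669 ms. A final running average of 1246 ms stays
# below that, so VSImax was never reached in the 1000-session storm.
print(vsimax_threshold(669))  # 1669
```

In other words, because the thick blue line in the chart never crosses the thin red threshold line, the test confirms the environment still had headroom at 1000 concurrent users.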
The important things to look for when comparing Login VSI results are the environment configuration and the type of tests being performed. In version 4.0, Login VSI completely changed the tiers of workloads that can be run in the tests, so be aware of which workloads are in use when comparing older test results with newer ones. Alternatively, you can rely on Login VSI’s “Validated By” program, which ensures the tests were run appropriately and nothing in the results was doctored.
Overall, the process was a great experience with a wonderful outcome. Login VSI was a pleasure to work with, and it was clear they have incorporated their experience running benchmarking exercises across many vendors into both the product and their validation programs. SimpliVity would definitely recommend their tools when investigating the implementation of an End User Computing environment.