72% of Networking Performance Statistics Are Misleading


One element of the networking industry’s marketing machine is the citing of performance statistics. This { box | software package | interface } can perform this many operations this quickly. Statistics are nice for technically minded people. They make us feel like we’re making an informed decision about a product. “Well, gosh. This product is faster than that one, see? Clearly, it’s superior.”

Like my tongue-in-cheek title, performance statistics are often misleading or, at best, meaningless without context. As a savvy consumer of any networking product, you should treat performance statistics as little more than a rough indicator of how a { box | software package | interface } performed under one specific test circumstance. Hint: the tests are usually rigged. Specifically, networking device performance tests are fed traffic chosen to show the product in the best possible light. That traffic is not going to look much like your network, which is why the statistics are misleading.

Here’s what you need to ask yourself when it comes to reported performance statistics.

  1. Who did the testing, and how were they paid? If the vendor did their own testing…well…no doubt the tests have some root in reality, but it’s unlikely they’ll display the ugly underneath. If the testing was independent, but commissioned by the vendor, well…then you need to dig into the specific testing methodology. Was it a test designed to favor a predetermined outcome? Hint: yes, most likely. If the testing was independent and funded by a consortium of vendors, then I think it’s more likely to be useful data.
  2. What sort of testing was done? For example, was the traffic mixed? Of varying packet sizes? At what scale? Depending on the hardware and software being tested, different traffic mixes and rates can produce very different results. In the emerging world of networking software and SDN controllers, the concerns will be about software performance and scalability, measured more in operations per second than raw throughput. Again, different sorts of tests are likely to yield different sorts of numbers, as the sketch below illustrates.
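
To make the packet size point concrete, here's a quick back-of-the-envelope sketch of my own (standard Ethernet arithmetic, nothing from any vendor test): the same 10 Gb/s of traffic means wildly different per-packet workloads depending on frame size. A vendor quoting line-rate throughput at 1518-byte frames is telling you very little about small-packet forwarding, where the device has to do roughly 18x the per-packet work.

```python
# Packets per second required to fill a 10 GbE link at various frame sizes.
# Every Ethernet frame also occupies 20 bytes on the wire beyond the frame
# itself: 7-byte preamble + 1-byte start-of-frame delimiter + 12-byte
# minimum inter-frame gap.

LINK_RATE_BPS = 10_000_000_000  # 10 GbE
WIRE_OVERHEAD_BYTES = 20        # preamble + SFD + inter-frame gap

for frame_bytes in (64, 512, 1518):
    wire_bits = (frame_bytes + WIRE_OVERHEAD_BYTES) * 8
    pps = LINK_RATE_BPS / wire_bits
    print(f"{frame_bytes:>5}-byte frames: {pps / 1e6:5.2f} Mpps at line rate")

# Prints roughly:
#    64-byte frames: 14.88 Mpps at line rate
#   512-byte frames:  2.35 Mpps at line rate
#  1518-byte frames:  0.81 Mpps at line rate
```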

You need to do your own testing before committing to any product. Marketing performance statistics are, at best, a rough indication of how well the { switch | controller | firewall } will perform in your specific environment. For those of you with limited scale requirements, this might not be a big deal. But if you operate a large environment where maximum performance is key, take the claims with a grain of salt. Do your own testing with your own real data.
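
If you have no dedicated test gear at all, even open tooling can give you a directional sanity check before the lab time gets expensive. Here's a minimal sketch of my own, assuming iperf3 is installed on both sides of the device under test, a server is already listening (iperf3 -s), and the host name is a placeholder:

```python
# Rough sketch: push TCP traffic through a device under test with iperf3
# at a few parallel-stream counts and record the received throughput.
# "dut-far-side.example.net" is a hypothetical host; substitute your own.
import json
import subprocess

SERVER = "dut-far-side.example.net"

for streams in (1, 4, 16):
    proc = subprocess.run(
        ["iperf3", "-c", SERVER, "-P", str(streams), "-t", "30", "--json"],
        capture_output=True, text=True, check=True,
    )
    result = json.loads(proc.stdout)
    gbps = result["end"]["sum_received"]["bits_per_second"] / 1e9
    print(f"{streams:>2} streams: {gbps:5.2f} Gb/s received")
```

Bear in mind this measures TCP goodput with large packets and a handful of flows. It says nothing about small-packet forwarding, session setup rates, or what happens when deep inspection is switched on, which is where dedicated test kit earns its keep.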

Oh — and then do the rest of us a favor: publicize your testing methods and results. The FUD of marketing hyperbole is a tedious weight hanging around the neck of this industry. We could do with more end user testing data.




Comments

  1. JC

    We happen to be about to launch an RFP for new firewalls, and we've been reflecting on how to spec/evaluate the performance (throughput) aspect. What would be your recommendations for how to do our own performance testing? We have the fortune of having an Ixia load tester, but it's an oldish unit with only GigE interfaces, and some of our requirements will be for multi-gig throughput (10/40 Gig interfaces on the firewall). We were thinking of just spec'ing very specific benchmark traffic and letting the submitters provide the corresponding performance numbers (in a contractually binding fashion). What do you think?

    1. Ethan Banks (Post Author)

      In principle, I like the idea of specifying a specific traffic mix in the RFP. I think it will get fussy, though, if you run into issues with claimed throughput vs. actual in your environment. Lots of room for finger-pointing unless you're able to be very specific in the RFP, and/or actually provide the test data yourself for the respondents to loop through.
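
      For example, and this is a sketch of my own rather than language from any real RFP, you could pin the mix down numerically so every respondent benchmarks against the same offered load. The 7:4:1 ratio below is in the spirit of the commonly cited "simple IMIX" mixes; substitute whatever size distribution you actually measure on your network:

      ```python
      # Sketch: turn an RFP traffic-mix spec into a concrete forwarding-rate
      # requirement. Sizes and weights here are illustrative (IMIX-style);
      # replace them with your own measured packet size distribution.
      TRAFFIC_MIX = {64: 7, 576: 4, 1518: 1}  # frame size (bytes) -> weight
      TARGET_GBPS = 10                        # contractual throughput target

      total_weight = sum(TRAFFIC_MIX.values())
      avg_bytes = sum(size * w for size, w in TRAFFIC_MIX.items()) / total_weight
      pps = TARGET_GBPS * 1e9 / (avg_bytes * 8)

      print(f"average frame size: {avg_bytes:.0f} bytes")
      print(f"required rate: {pps / 1e6:.2f} Mpps to sustain {TARGET_GBPS} Gb/s")
      ```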

      Would still be nice if you could run some tests yourself. Firewall testing is so painful, as dramatic performance changes come up just because you turned on some inspection feature that you'd think would be no big deal. I haven't done this recently, but I recall that renting Ixia kit was an option back in the day. I have lost touch with this market somewhat, but BreakingPoint had a testing solution that was far cheaper than Ixia. Not sure what's happened to the pricing of that product since Ixia bought BreakingPoint. Spirent also offers affordable testing kit for enterprises: http://www.spirent.com/Solutions/Enterprise.
