72% of Networking Performance Statistics Are Misleading

Ethan Banks · 2 minutes to read
Published June 5, 2015 · Updated June 5, 2015

One element of the networking industry’s marketing machine is the citing of performance statistics. This { box | software package | interface } can perform this many operations this quickly. Statistics are nice for technically minded people. They make us feel like we’re making an informed decision about a product. “Well, gosh. This product is faster than that one, see? Clearly, it’s superior.”

Like my tongue-in-cheek title, performance statistics are often misleading or, at best, meaningless without context. As a savvy consumer of any networking product, you should treat performance statistics as little more than a rough indicator of how a { box | software package | interface } performed under a specific test circumstance. Hint: the tests are usually rigged. Specifically, networking device performance tests are rigged such that the test data thrown at the device will show the product in the best possible light. That test data will not look much like your network's traffic, which is why the statistics are misleading.

Here’s what you need to ask yourself when it comes to reported performance statistics.

  1. Who did the testing, and how were they paid? If the vendor did their own testing…well…no doubt the tests have some root in reality, but it’s unlikely they’ll display the ugly underneath. If the testing was independent, but commissioned by the vendor, well…then you need to dig into the specific testing methodology. Was it a test designed to favor a predetermined outcome? Hint: yes, most likely. If the testing was independent and funded by a consortium of vendors, then I think it’s more likely to be useful data.
  2. What sort of testing was done? For example, was the traffic mixed? Of varying packet sizes? At what scale? Depending on the hardware and software being tested, different traffic mixes and rates can produce very different results. In the emerging world of networking software and SDN controllers, the concern shifts to software performance and scalability, measured more in operations per second than in raw throughput. Again, different sorts of tests are likely to result in different sorts of numbers.
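To see how much frame size alone skews a throughput number, consider the theoretical maximum packet rate of a 10 Gbps Ethernet link. This is a minimal sketch of the standard line-rate arithmetic (the 20 extra bytes per frame are the Ethernet preamble plus inter-frame gap):

```python
# Maximum packet rate on a 10 Gbps Ethernet link by frame size.
# Every frame costs 20 extra bytes on the wire (8-byte preamble +
# 12-byte inter-frame gap), so small frames pay far more overhead.

LINE_RATE_BPS = 10_000_000_000  # 10 GbE
WIRE_OVERHEAD = 20              # preamble + inter-frame gap, in bytes

def max_pps(frame_bytes: int) -> float:
    """Theoretical maximum frames per second at line rate."""
    bits_on_wire = (frame_bytes + WIRE_OVERHEAD) * 8
    return LINE_RATE_BPS / bits_on_wire

for size in (64, 512, 1518):
    print(f"{size:>5}-byte frames: {max_pps(size):>12,.0f} pps")
# 64-byte frames demand ~14.88 million pps; 1518-byte frames under 1 million.
```

A device that holds line rate at 1518-byte frames may fall well short at 64 bytes, so a test report that only quotes large-frame throughput tells you very little about small-packet behavior.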

You need to do your own testing before committing to any product. Marketing performance statistics are only a general indication of how well the { switch | controller | firewall } will perform in your specific environment. For those of you with limited scale requirements, this might not be a big deal. But for those operating large environments for whom maximum performance is key, take the claims with a grain of salt. Do your own testing with your own real data.
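If you want lab traffic that resembles your real traffic, one place to start is measuring the frame-size distribution of an actual capture from your network. Here is a minimal sketch that reads a classic libpcap-format file using only the standard library; the bucket boundaries follow the common RFC 2544 frame sizes, and the handling of the file's magic number is simplified to the classic microsecond-timestamp format:

```python
# Sketch: bucket the original frame lengths in a libpcap capture so a lab
# test can replay a realistic size mix rather than a vendor's ideal one.
import struct
from collections import Counter

def size_histogram(pcap_path: str) -> Counter:
    """Count frames per RFC 2544-style size bucket in a classic pcap file."""
    buckets = Counter()
    with open(pcap_path, "rb") as f:
        magic = f.read(24)[:4]  # global header is 24 bytes; magic leads it
        # Classic pcap magic 0xa1b2c3d4 appears as d4 c3 b2 a1 when the
        # writer was little-endian. (Nanosecond-format pcaps not handled.)
        endian = "<" if magic == b"\xd4\xc3\xb2\xa1" else ">"
        while True:
            hdr = f.read(16)  # per-record header: ts_sec, ts_usec, lengths
            if len(hdr) < 16:
                break
            _sec, _usec, incl_len, orig_len = struct.unpack(endian + "IIII", hdr)
            f.seek(incl_len, 1)  # skip the captured packet bytes
            for limit in (64, 128, 256, 512, 1024, 1518):
                if orig_len <= limit:
                    buckets[f"<= {limit}"] += 1
                    break
            else:
                buckets["jumbo"] += 1
    return buckets
```

Feed the resulting distribution to your traffic generator and you are at least testing with a size mix the device will actually see, rather than a single convenient frame size.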

Oh — and then do the rest of us a favor: publicize your testing methods and results. The FUD of marketing hyperbole is a tedious weight hanging around the neck of this industry. We could do with more end user testing data.

Filed Under: Concepts. Published on ethancbanks.com.




Copyright © 2007–2022 Ethan Banks