Vendor benchmark numbers do not mean "our system will perform this well." They mean "we guarantee that, under the most favorable possible conditions, this device will never perform better than this." And if you manage to prove them wrong, they will change their testing methodology to use the new, more favorable tests.
As a result, in the real world, they are worthless for any purpose other than comparing different models of devices from the same manufacturer.
As an example of this, real-world Cisco performance is frequently as bad as 1/10 of the vendor-published numbers.
There are a handful of honest vendors who rate their products in a sane default configuration (so if you turn off features, their performance will actually improve), but they tend to be small, if for no other reason than that purchasing managers at big companies won't buy their devices because they're "not as good as Brand X," which publishes the inflated numbers.
This is why big companies always require proof-of-concept testing: so they can configure the products themselves and see how they behave under real configurations and real-world traffic.