I get a lot of requests for performance benchmarks of APOLLO vs. lots of other systems. We provide extremely analytical and detailed performance results for our server every release. A couple of issues always crop up in any performance benchmark:
1. Features - first of all, NO competitive product can DO what APOLLO can do...so I find myself either A: dumbing down APOLLO and the test just to be able to do an "apples to apples" comparison, or B: not making it "apples to apples" at all.
2. Return on Investment - ROI on software is not just performance, but HOW LONG IT TOOK TO SET UP, ADMINISTER and GET the feature operational in a production scenario!! I find myself spending a HUGE part of my time just getting the competitive software to "work" well enough to run the test.
I've been following the FOSS4G Web Mapping Shootout announced for their 2009 conference. I get a really HUGE chuckle, because their "shootout" couldn't be performed on a more CARTOON set of data or a more NON-REALISTIC use case. I don't know ONE client whose requirements amount to one image and a handful of vector data sets (3 to be precise).
Our smallest benchmark has 459 seven-band images...choke on that, Open Source.
They should call it a "water gun fight" instead of a "Shootout".
Also, what will NOT be collected in the "shootout" is how long it took them to set up the system and service-enable the data...how many WEEKS are you willing to struggle with that?
PERFORMANCE is about ROI on the investment and, of course, the ability of the system to handle a user load. Weigh both when you're looking at the numbers!!