Once upon a time there lived a bored young aristocrat named Stanley. Growing tired of his indolent lifestyle, Stanley decided to go into the manufacturing business, so he purchased a haggis factory, which was going cheap as its previous owner had died in a tragic golfing accident.
On his first day as the new owner, Stanley arrived in front of the factory gates bright and early, eager to find out what his new investment was capable of. Here is what he saw:

6:00am: the staff arrive, clean the equipment, and begin heating up the cooker
12:00pm: the first haggis of the day is ready
"Yikes!", thought Stanley. "It took six hours to prepare a single haggis. Assuming I can sell this for $12, that gives an income of $2 per hour; nowhere near enough to cover payroll for my 20 staff. I fear this investment may have been a mistake."
Stanley has made the same error as many beginning graphics programmers, who render a single model (or sometimes just the default CornflowerBlue template) and then post on Internet forums complaining about the resulting framerate.
It is obviously ridiculous to judge the throughput of a factory by examining just one haggis. Sure, it takes a while to clean the equipment and heat up the cooker, but you only have to do that once in the morning. If you were to make 100 haggis, they could all cook at the same time in the same pot, so they would take no longer than a single one. If you wanted 200, they might not all fit in the pot at once, but you could reuse the existing hot water, and the second batch of 100 could be cooking while the first batch was cooling.
Graphics cards work the same way. When you see something like this:

Drawing nothing: 800 fps
Drawing 100 triangles: 500 fps
it is easy to worry that your framerate will decrease by 300 each time you add 100 triangles. If this were true, drawing more graphics would result in:

Drawing 200 triangles: 200 fps
Drawing 300 triangles: -100 fps
Huh? A negative framerate is obviously impossible. This proves there must be something wrong with my logic.
My first mistake was to assume that framerate is a linear scale, when in fact framerate is equal to one divided by the time spent drawing each frame. To convert into linear millisecond units, we must divide 1000 by the framerate:

Drawing nothing: 1000 / 800 = 1.25 milliseconds per frame
Drawing 100 triangles: 1000 / 500 = 2.0 milliseconds per frame
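The conversion is trivial to express in code. A minimal sketch (the 800 and 500 fps readings are just the example measurements from above):

```python
def frame_time_ms(framerate: float) -> float:
    """Convert frames per second into milliseconds per frame."""
    return 1000.0 / framerate

print(frame_time_ms(800))                       # 1.25 ms drawing nothing
print(frame_time_ms(500))                       # 2.0 ms drawing 100 triangles
print(frame_time_ms(500) - frame_time_ms(800))  # 0.75 ms to draw the 100 triangles
```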
Looking at the difference between these frame times, it took 0.75 milliseconds to draw 100 triangles. Time is a linear scale, so we can predict how performance will change as we add more triangles:

Drawing 200 triangles: 2.75 milliseconds = 364 fps
Drawing 300 triangles: 3.5 milliseconds = 286 fps
Drawing 1000 triangles: 8.75 milliseconds = 114 fps
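The same prediction can be written as a simple linear cost model. The 1.25 ms fixed cost and 0.75 ms per 100 triangles are the example measurements from above, not universal constants:

```python
def predicted_frame_ms(triangles: int,
                       empty_scene_ms: float = 1.25,
                       ms_per_100_triangles: float = 0.75) -> float:
    """Linear estimate: a fixed per-frame cost plus a cost proportional to triangle count."""
    return empty_scene_ms + ms_per_100_triangles * (triangles / 100)

for count in (100, 200, 300, 1000):
    ms = predicted_frame_ms(count)
    print(f"{count} triangles: {ms:.2f} ms = {1000 / ms:.0f} fps")
```

Note how the predicted framerate drops steeply at first and then flattens out: each extra 100 triangles costs the same number of milliseconds, but a smaller and smaller number of frames per second, and it can never go negative.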
But this estimate is still too pessimistic, because graphics drawing time is not linear with regard to how many things are being drawn. In some cases, adding more triangles might be free, if the hardware is able to boil them up in the same pot it is already using to cook your previous graphics. In most cases, adding more triangles will slow you down, but by less than you would expect from measuring just a few in isolation.
It is appealing to think we might be able to predict the performance of a full game by measuring something smaller and simpler, but this is not possible, because we have no way to know how much of that small measurement represents real work, versus how much is just warming up the boiler and cleaning our equipment ready to start cooking.
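To see why, consider two hypothetical cost models (the numbers here are invented purely for illustration). Both produce exactly the same 2.0 ms reading for a small 100-triangle test, yet they behave completely differently at a real game's workload:

```python
def mostly_overhead(triangles: int) -> float:
    """Frame cost dominated by fixed setup work (heating up the pot)."""
    return 1.9 + 0.001 * triangles

def mostly_per_triangle(triangles: int) -> float:
    """Frame cost dominated by per-triangle work (preparing each haggis)."""
    return 0.1 + 0.019 * triangles

for model in (mostly_overhead, mostly_per_triangle):
    print(f"{model.__name__}: {model(100):.1f} ms at 100 triangles, "
          f"{model(100_000):.1f} ms at 100,000 triangles")
```

A single small measurement cannot tell these two worlds apart; only measuring under a realistic load can.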
In fact, measuring the framerate of a game that does only a small amount of work tells you pretty much nothing. If you want to know how long it will take to make a large number of haggis, the only accurate way to find out is to crank up the production line and actually make that many haggis!
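When you do take that measurement, time many frames and report the average in milliseconds, rather than glancing at an fps counter. A minimal sketch, where draw_frame() is a hypothetical stand-in for rendering one frame of your actual game:

```python
import time

def draw_frame() -> None:
    """Hypothetical placeholder: substitute the real rendering work of your game."""
    time.sleep(0.016)  # pretend drawing took about 16 ms

def average_frame_ms(frame_count: int = 100) -> float:
    """Run many frames under full load and report the average cost per frame."""
    start = time.perf_counter()
    for _ in range(frame_count):
        draw_frame()
    elapsed_seconds = time.perf_counter() - start
    return elapsed_seconds * 1000 / frame_count

print(f"{average_frame_ms():.2f} ms per frame")
```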