DiGiSTORM pointed us to this post by John Carmack on Slashdot (in a thread about Hardware Central's Dual 1GHz PIII article). JC says, from his own experience, that motion blur in first-person shooters such as Quake III isn't all that impressive. Here is his complete post:
A GeForce should be able to run Q3 at 200 fps at 400x300 (r_mode 1) or possibly even 512x384 resolution if the cpu was fast enough. A dual willamette at the end of this year will probably do it.
We currently see 100+ fps timedemos at 640x480 with either a 1ghz processor or dual 800's, and that isn't completely fill rate limited. DDR GeForce cards are really, really fast.
Yes, it is almost completely pointless.
The only reasonable argument for super high framerates is to do multi frame composited motion blur, but it turns out that it isn't all that impressive.
I did a set of offline renderings of running Q3 at 1000 fps and blending down to 60 fps for display. Looked at individually, the screenshots were AWESOME, with characters blurring through their animations and gibs streaking off the screen, but when they were played at 60hz, nobody could tell the difference even side by side.
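The blending Carmack describes, rendering many sub-frames and averaging each group down to one displayed frame, can be sketched roughly as below. This is a minimal illustration, not his actual tooling; the frame representation (nested lists of grayscale pixel values) and the function names are assumptions for the example.

```python
def composite_motion_blur(subframes):
    """Average a group of sub-frames into one displayed frame.

    Each sub-frame is a 2D list of pixel intensities; the output is
    their per-pixel mean, which is what produces the blur streaks.
    """
    n = len(subframes)
    height = len(subframes[0])
    width = len(subframes[0][0])
    out = [[0.0] * width for _ in range(height)]
    for frame in subframes:
        for y in range(height):
            for x in range(width):
                out[y][x] += frame[y][x] / n
    return out

def blend_down(frames, render_fps=1000, display_fps=60):
    """Blend a high-framerate sequence down to display rate.

    With 1000 fps rendered and 60 fps displayed, each output frame
    averages roughly 16 consecutive sub-frames.
    """
    group = max(1, render_fps // display_fps)
    return [composite_motion_blur(frames[i:i + group])
            for i in range(0, len(frames) - group + 1, group)]
```

A fast-moving object lit in one sub-frame and gone in the next ends up as a faint streak in the composited frame, which matches the "gibs streaking off the screen" effect he describes in the individual screenshots.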
Motion blur is more important at 24hz movie speeds, but at higher monitor retrace rates it really doesn't matter much.
There are some poster-child cases for it, like a spinning wagon wheel, but for most aspects of an FPS, realistic motion blur isn't noticeable.
Exaggerated motion blur (light sabers, etc) is a separate issue, and doesn't require ultra-high framerates.
There are still plenty of things we can usefully burn faster cpu's on...