Voodoo Extreme posted a question from a visitor about Unreal's poor OpenGL performance. Tim Sweeney answers:
Theory 1 (Quake theory):
Hardware vendors are writing and optimizing their OpenGL drivers for Quake, and Quake uses small textures, so they don't spend any time optimizing their drivers for handling the huge amounts of textures that Unreal uses in order to improve graphical realism.
Theory 2 (historic code theory):
Hardware vendors base their OpenGL code off the source code SGI provided them, and that code is very inefficient at handling texture changes and state changes.
Theory 3 (general purpose equals slow theory):
OpenGL has a ton more options for texture formats, memory layouts, etc. than Direct3D and Glide, so hardware vendors don't have time to optimize for all of those cases, whereas they can for Glide and Direct3D.
Theory 4 (Sweeney sucks theory):
Tim's OpenGL code sucks (judge for yourself, it's available on http://unreal.epicgames.com).
Theory 5 (Nobody focuses on OpenGL theory):
Most games use Direct3D, so hardware vendors put most of their optimization effort into their Direct3D drivers instead of their OpenGL drivers.

Aside from theory 4, I don't have firsthand knowledge of the code involved, so it's not really clear to me why OpenGL drivers are slower than Direct3D drivers on the same card. But I guess it's a combination of all those things.