http://blog.mecheye.net/2015/12/why-im-excited-for-vulkan/
Interesting read.
What's Vulkan
In order to program GPUs, we have a few APIs: Direct3D and OpenGL are currently the most popular ones. OpenGL has the advantage of being implemented independently by most vendors, and is generally platform-agnostic. The OpenGL API and specification are managed by the standards organization Khronos. Note that in closed environments you can find many others: Apple has Metal for its own set of PVR-based GPUs, and in the game console space, Sony had libgcm on the PS3 and GNM on the PS4, while Nintendo has the GX API for the GameCube and Wii, and GX2 for the Wii U. Since it wasn't expected that GPUs would be swappable by consumers like on the PC platform, these APIs were extremely low-level.
OpenGL was originally started back in the mid-80s as a library called Graphics Layer, or GL, for SGI's internal use on their own hardware and systems. They then released it as a product, IRIS GL, allowing customers to render graphics on SGI workstations. As a strategic move, SGI allowed third parties to implement the API and opened up the specifications, transferring it from IRIS GL to OpenGL.
In the 30+ years since GL was started, computing has grown a lot, and OpenGL's model has grown outdated. Vulkan is the first attempt at a cross-platform, vendor-neutral low-level graphics API. Low-level APIs of this kind have been seen in the console space for close to a decade, offering higher levels of performance, but instead of being tied to a single GPU vendor, Vulkan allows any vendor to implement it for their own hardware.
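To give a rough feel for what "low-level and explicit" means in practice, here is a minimal sketch (not from the post) of the very first step of a Vulkan program: creating an instance. Even at this stage the application spells out structure types and versions itself rather than relying on driver defaults; the application name used here is a made-up placeholder.

#include <vulkan/vulkan.h>
#include <cstdio>

int main() {
    // Describe the application explicitly -- nothing is implied by the driver.
    VkApplicationInfo appInfo = {};
    appInfo.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    appInfo.pApplicationName = "hello-vulkan";   // hypothetical name
    appInfo.apiVersion = VK_API_VERSION_1_0;

    VkInstanceCreateInfo createInfo = {};
    createInfo.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    createInfo.pApplicationInfo = &appInfo;

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&createInfo, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    // ... from here, the application enumerates physical devices, creates a
    // logical device and queues, and manages memory and synchronization itself.

    vkDestroyInstance(instance, nullptr);
    return 0;
}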
I could go on. NVIDIA has broken Chromium because it patched out the localtime function. The Dolphin project has hit bugs simply because its executable is named Dolphin.exe. We were told by an NVIDIA employee that there was a similar internal testing tool that used the API wrong, and they simply patched it up themselves. A very popular post briefly touched on how much game developers get wrong, from an NVIDIA-biased perspective, but having talked to these developers, I know they're often told to remove such calls for performance, or because driver heuristics cause strange behavior around them. It's common industry knowledge that most drivers ship with hand-compiled or optimized forms of shaders used in popular games as well.
You might have heard of tricks like AZDO, or approaching zero driver overhead. Basically, since game developers were asking for a slimmer, simpler OpenGL, NVIDIA added a number of extensions to their driver to support more modern GPU usage. The general consensus across the industry was a resounding sigh.
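As a concrete illustration of the kind of technique the AZDO talks pushed (a hedged sketch, not code from the post): persistently mapped buffers via GL_ARB_buffer_storage, core in OpenGL 4.4. Instead of calling glBufferSubData every frame and letting the driver shuffle copies around, the application maps the buffer once and writes into it directly. This assumes a GL context and an extension loader such as GLEW are already set up.

#include <GL/glew.h>
#include <cstring>

struct Vertex { float pos[3]; float uv[2]; };

GLuint createPersistentVertexBuffer(size_t bytes, void** mappedPtr) {
    GLuint buf = 0;
    glGenBuffers(1, &buf);
    glBindBuffer(GL_ARRAY_BUFFER, buf);

    GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
    // Immutable storage: the driver knows the size and usage up front.
    glBufferStorage(GL_ARRAY_BUFFER, bytes, nullptr, flags);
    // Map once; the pointer stays valid for the lifetime of the buffer.
    *mappedPtr = glMapBufferRange(GL_ARRAY_BUFFER, 0, bytes, flags);
    return buf;
}

// Per frame: write straight into the mapping. The upload itself needs no
// further driver calls; synchronizing with the GPU (fences) is the app's job.
void uploadVertices(void* mappedPtr, const Vertex* verts, size_t count) {
    std::memcpy(mappedPtr, verts, count * sizeof(Vertex));
}

The point of techniques like this is to take the driver out of the per-frame path, which is exactly the direction Vulkan then standardized.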
A major issue in shipping GLSL shaders in games is that, since there is no conformance test suite for GLSL, different drivers accept different variants of GLSL. For a sense of how complex these shaders can get in practice, see page 85 of the Glyphy slides.
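A hypothetical illustration (not from the post) of the kind of divergence meant here: under "#version 110" the GLSL spec has no implicit int-to-float conversions (they arrived in GLSL 1.20), so the first variant below is out of spec, yet lenient compilers have historically accepted it while stricter ones reject it at compile time.

// Shader accepted only by lenient GLSL compilers under #version 110.
const char* lenientOnly =
    "#version 110\n"
    "void main() {\n"
    "    float brightness = 1;              // int literal initializing a float\n"
    "    gl_FragColor = vec4(brightness);\n"
    "}\n";

// Portable variant: spell out the float literal.
const char* portable =
    "#version 110\n"
    "void main() {\n"
    "    float brightness = 1.0;\n"
    "    gl_FragColor = vec4(brightness);\n"
    "}\n";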
NVIDIA has cemented themselves as the king of video games simply by having the most tricks. Since game developers optimize for NVIDIA first, they have an entire empire built around being dishonest. The general impression among most gamers is that Intel and AMD drivers are written by buffoons who don't know how to program their way out of a paper bag. OpenGL is hard to get right, and NVIDIA has millions of lines of code invested in that. The Dolphin Project even concludes that NVIDIA's OpenGL implementation is the only one to really work.
How does one get out of that?