So, with all the talk about AMD's Mantle, and Nvidia claiming DirectX doesn't actually degrade performance much, what is the truth?
I've only really developed for PC and have only just started using DX, so I'm curious about the real impact.
I remember there being a lot of talk, back when DOS was going away, that losing low-level access would make games worse, but to my knowledge that never really happened.
Currently, is DX really that inefficient? And if so, how much is low-level access actually going to improve things?
This doesn't even have to pertain to gaming or high/low-level graphics APIs specifically. For instance, if I write machine code, will it actually perform significantly faster than code that runs through an interpreter?
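To make that last question concrete, here's the kind of toy benchmark I have in mind (purely my own illustration, nothing to do with actual DX or Mantle internals): the same counting loop run natively versus through a minimal bytecode interpreter. Any gap you measure is pure fetch/dispatch overhead per "instruction", which is the same flavor of per-call overhead people attribute to thick API layers.

```cpp
// Toy comparison: native loop vs. a minimal bytecode interpreter.
// Build with optimizations on (e.g. g++ -O2) and run with no arguments,
// so both versions do exactly one increment per iteration.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <vector>

enum Op : uint8_t { ADD };

int main(int argc, char**) {
    const int64_t N = 100000000;

    // Native version: one volatile increment per iteration. The volatile
    // keeps the compiler from collapsing the whole loop into "sum = N".
    volatile int64_t native_sum = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (int64_t i = 0; i < N; ++i) native_sum = native_sum + 1;
    auto t1 = std::chrono::steady_clock::now();

    // Interpreted version: same work, but each iteration also fetches an
    // opcode and dispatches through a switch. Sizing the program by argc
    // (1 when run with no arguments) keeps the compiler from
    // constant-folding the dispatch away.
    std::vector<Op> program(static_cast<size_t>(argc), ADD);
    volatile int64_t interp_sum = 0;
    auto t2 = std::chrono::steady_clock::now();
    for (int64_t i = 0; i < N; ++i) {
        for (Op op : program) {
            switch (op) {
                case ADD: interp_sum = interp_sum + 1; break;
            }
        }
    }
    auto t3 = std::chrono::steady_clock::now();

    auto ms = [](auto a, auto b) {
        return std::chrono::duration_cast<std::chrono::milliseconds>(b - a).count();
    };
    std::printf("native:      %lld ms\n", static_cast<long long>(ms(t0, t1)));
    std::printf("interpreted: %lld ms\n", static_cast<long long>(ms(t2, t3)));
    return 0;
}
```

My rough understanding is that in a naive dispatch loop like this, the interpreter overhead can easily be several times the useful work; real interpreters narrow the gap with JITs, and driver stacks with batching, which is roughly the pitch behind Mantle-style thin APIs. Exact numbers obviously depend on the compiler and flags.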
