Don't speak in absolutes because it backs you into a corner when things change. And they change often.
Not with Nvidia helping MS monopolise the PC gaming industry, it won't.
Are you implying they are the only ones? This just comes across as a premature gushing ad, and we haven't even seen a result. Did you fail to notice the driver wars about a year ago (or was it already two years ago?) where they were getting massive gains, particularly for the 79xx series? There were some drivers with double-digit gains.
Of course AMD love DirectX 12; it's a copy-paste of the API that they wrote...
Nope. That's yet more self-proclaimed hype.
Don't forget about the disruption factor. If
a) SteamOS takes off
b) an Android giant like Amazon launches a serious Android console
then NVidia has new opportunities to put a gaming box under your TV. They're clearly streets ahead of AMD in the Android space, though there they are up against the likes of Qualcomm instead, and their Linux drivers are generally much better than AMD's.
Not saying that either of these will happen, of course, but bear them in mind.
Actually it seems Nvidia is pushing SteamOS and OpenGL harder than anyone else right now.
Neither of you can prove it, but clearly Mantle and DX12 will be very closely related (at least portions of DX12).
While also pushing the Microsoft DX monopoly? Do you think that's helpful to OGL?
No. It's the other way round.
If Nvidia is pushing significant support for the alternatives, how is Nvidia pushing a Microsoft DX monopoly?
If it was "clear" then it could easily be proven. So tell us how it's clearly related? Or how something that is so clear cannot be proven?
LMAO... tolerating. Ok ATM. Show me any indication you were tolerating DX all these years before Mantle was announced, and I'll send you a cookie.
I know you have read the thread with links to knowledgeable people talking about the similarities. Go read some.
https://www.google.com/#q=dx12+mantle+similarities
I just got done playing ARMA 3 in DX11, which is the gold standard for PC graphics. I doubt anyone with enthusiast hardware has any issues with needing "close to the metal".
The "good enough" attitude is not shared by game devs, at all. Also most people don't have an enthusiast system, have you ever looked at the average system reported by STEAM?I just got done playing ARMA 3 in DX11, which is the gold standard for PC graphics. I doubt anyone with enthusiast hardware has any issues with needing "close to the metal".
Maybe if you've been a casual gamer, or someone that didn't pay much mind to the technical side of things over the last ~20 years, then you would never really have thought about or been bothered by the state of D3D and Windows in general. But going back to the beginning of gaming, recall that to-the-metal programming was critical simply because the hardware was too slow to even begin to think about wasting computing cycles. This was true for quite a long time; anyone remember melto-vision on the Atari Jaguar? Some really neat programming tricks were used to eke out very interesting effects. An absolute classic example of eking as much from the hardware as humanly possible was arcade games.
On the Amiga side of things, same deal: programmers were digging right into the hardware to get as much as possible out of the available power. Same with DOS, same with the original PlayStation and other consoles (several of which vanished). Then programmers got lazy because processing power increased exponentially; just toss more silicon at the problem and everyone is happy enough. But that ramp-up of processing cycles has leveled off, and guess what: efficiency matters again.
And in all that time, D3D was never exactly a beacon of efficiency, and anyone that delved into programming, or knew a little bit about the state of game engines and the like, knew full well that a great deal of potential was being left on the table. So to answer your question, as immature and mocking as it was, the "indication" has been there all along for anyone paying attention.
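To give a concrete flavour of the kind of trick being described, here is a minimal C++ sketch of the classic precomputed sine table that old arcade and demo coders leaned on when calling a real sin() per pixel was unaffordable; the table size and fixed-point scale are illustrative choices, not taken from any specific game:

#include <cmath>
#include <cstdint>
#include <cstdio>

// Old-school trick: trade a little memory for a lot of CPU time by precomputing
// sin() into a fixed-point lookup table instead of calling it per pixel/frame.
constexpr int kTableSize = 1024;            // illustrative; power of two so we can mask
static int16_t sineTable[kTableSize];       // stores sin(angle) * 4096

void BuildSineTable() {
    const double kPi = 3.14159265358979323846;
    for (int i = 0; i < kTableSize; ++i) {
        double angle = (2.0 * kPi * i) / kTableSize;
        sineTable[i] = static_cast<int16_t>(std::sin(angle) * 4096.0);
    }
}

// Cheap lookup with wrap-around; "phase" is in table units, not radians.
inline int FastSin(int phase) {
    return sineTable[phase & (kTableSize - 1)];
}

int main() {
    BuildSineTable();
    // e.g. a wobbly scanline offset, the kind of effect seen in old 2D games/demos
    for (int y = 0; y < 8; ++y) {
        int offset = FastSin(y * 40) / 256;  // scale the fixed-point value down to pixels
        std::printf("scanline %d shifted by %d pixels\n", y, offset);
    }
    return 0;
}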
Like every other tech AMD has come up with, it would have been shared with Nvidia once it was more refined; they said right from the start they would do that.
It seems Nvidia don't want to go down that route.
Instead they have aligned themselves with console pushers Microsoft, which perpetuates an API monopoly; Nvidia are empowering Microsoft to keep control of graphics APIs.
All Nvidia had to do was team up with AMD and work with them to kill Microsoft's unhealthy API dominance, which has been the reason PC gaming has been stuck in a rut for the past 10 years.
Put your Green loyalties aside for just long enough to think about what Nvidia did here.
And all just because they want to fight AMD (at all costs) instead of working with them.
What about those of us that don't run enthusiast hardware? What are we, chopped liver?
The "good enough" attitude is not shared by game devs, at all. Also most people don't have an enthusiast system, have you ever looked at the average system reported by STEAM?
......also it is not only about speed, read up what game devs are saying about D3D and the creative limitations.
I have yet to see a modern alternative API improve graphics quality over DX on PC.
ARMA/Crysis etc. prove that if a game developer wants to take the time to make incredible graphics, they can do it. That pretty much throws cold water on the argument that developers deep down want to make better-looking games but are limited by DX.
What developers are limited by is the studio's budget for man-hours, as well as the hardware of the average gamer, which is a console. Games that are made first for a console, with scaled-up PC graphics as an afterthought (Titanfall), will always be terrible graphically.
If I am wrong, it will be glaringly obvious once the public Mantle SDK is released for any developer to use, and we start seeing non-AMD-subsidized games use Mantle, and use it to increase IQ.
I don't agree with this at all. Visuals are scalable, and Mantle makes higher settings possible on the same hardware. There is nothing stopping a dev from pushing visuals to a level that is only playable on Mantle or really high-end rigs, and then giving settings/options to lower them when running D3D/slower systems. In fact I think this is fairly straightforward and something we've seen in games for a very long time: crank up the details if you can, lower them if you need to.
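To make the "scalable visuals" point concrete, here is a hypothetical C++ sketch (the names and numbers are invented for illustration, not taken from any real engine) of a renderer choosing its per-frame budget based on how cheap the API's draw submissions are:

#include <cstdio>

// Hypothetical back-ends; names are illustrative only.
enum class RenderBackend { D3D11, LowOverhead /* e.g. a Mantle-style path */ };

struct SceneBudget {
    int maxVisibleObjects;   // how many individually drawn objects we allow per frame
    int shadowCascades;      // one example of a quality knob tied to CPU headroom
};

// Made-up numbers, purely to show "crank it up if you can, lower it if you must".
SceneBudget ChooseBudget(RenderBackend backend) {
    if (backend == RenderBackend::LowOverhead)
        return {20000, 4};   // cheap draw calls -> denser scenes on the same GPU
    return {5000, 2};        // higher per-call overhead -> trim the scene instead
}

int main() {
    SceneBudget b = ChooseBudget(RenderBackend::D3D11);
    std::printf("objects=%d cascades=%d\n", b.maxVisibleObjects, b.shadowCascades);
    return 0;
}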
Mantle suffers from the same fate as GPU PhysX until such time as it becomes an open standard. Any IQ improvements will be relegated to fluff that cannot change core gameplay. Mantle could be used to create games that drastically increase draw calls, but since the majority of the GPU market is on DX11 hardware, no game dev will utilize that extra CPU grunt in a game outside of just making Radeons run faster.
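For anyone who wants the draw-call argument spelled out: the cost in question is CPU-side work per submission, not GPU work. A toy C++ model of that idea (the busy-loop is a stand-in for driver validation/translation, not a real API call, and the overhead numbers are made up):

#include <chrono>
#include <cstdio>

// Stand-in for the CPU-side work a driver does per draw call (state validation,
// hazard tracking, command translation). Purely illustrative.
volatile long sink = 0;
void SubmitDraw(int perCallOverheadUnits) {
    for (int i = 0; i < perCallOverheadUnits; ++i) sink += i;
}

double FrameCpuMs(int drawCalls, int perCallOverheadUnits) {
    auto t0 = std::chrono::steady_clock::now();
    for (int d = 0; d < drawCalls; ++d) SubmitDraw(perCallOverheadUnits);
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(t1 - t0).count();
}

int main() {
    // Same scene, two hypothetical per-call costs: a "thick" API vs a "thin" one.
    std::printf("10000 draws, thick API: %.2f ms of CPU\n", FrameCpuMs(10000, 2000));
    std::printf("10000 draws, thin API:  %.2f ms of CPU\n", FrameCpuMs(10000, 200));
    // The GPU never enters into it; that is why piling on draw calls eats CPU "grunt".
    return 0;
}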
In an interview with technology news site CRN, Huddy said only a select few developers had expressed to him a desire to go around the current driver standards, including Battlefield developer DICE and Crysis developer Crytek.
“It’s not something most developers want,” he said. “If you held a vote among developers, they would go for DirectX or OpenGL, because it’s a great platform.”
AMD senior director of ISV relations Neal Robison also pointed out to CRN that most developers have welcomed the stability and standardization that come with developing through DirectX.
“It’s hard to crash a machine with DirectX, as there’s lots of protection to make sure the game isn’t taking down the machine, which is certainly rare especially compared to ten or fifteen years ago," Robison said. "Stability is the reason why you wouldn’t want to move away from DirectX, and differentiation is why you might want to."
While developers can get more direct access to the graphics hardware by using an alternative to DirectX, such as OpenCL, Robison said it takes a particular type of obsessive programmer personality to think they could replicate all of DirectX's features better on their own.