AMD @ GDC: Partnership with MS on next-generation graphics.


NTMBK

Lifer
Nov 14, 2011
10,448
5,831
136
That article is 3 years old. Also, "it's hard to crash a Windows machine/D3D application"? HAHA, no.

When's the last time a game crash full-on bluescreened your PC? It almost never happens these days. I remember back in the Win98 days, when hard crashes were a daily occurrence.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
When's the last time a game crash full-on bluescreened your PC? It almost never happens these days. I remember back in the Win98 days, when hard crashes were a daily occurrence.

I've had a few crashes on my Kaveri system, but that's with beta drivers and/or a mild overclock.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Put simply, this is just AMD pushing the hardware industry forward again.
I know praising AMD or their stuff doesn't go down well around here, but they're the Google of the hardware industry. They did it with the Athlon, AMD64, Eyefinity, HSA/APUs and now Mantle. Mantle is a big deal; if it weren't, you wouldn't see everyone scrambling their jets. It's an 'ignore at your own peril' thing. :)

But while everyone else is talking about it, AMD has always done it. I can't think of anything Nvidia has done to push the industry forward at all. PhysX? 3D Vision?? Nvidia's main claim to fame is their marketing; it's really, really good, and that's not a slight. AMD drops the ball in many areas and it's frustrating to watch. But I don't consider NV an engineering company at heart, rather a marketing company. It obviously works out. It does for Apple too.

The 'throw hardware at the problem' days have been over for a while. Things change.
I'm a software developer by trade, and it's going to be hard for Windows to compete with SteamOS on efficiency as time goes on. I don't think many people are going to stick with being forced to buy new versions of Windows for DX API updates every few years once SteamOS is up and running. It doesn't make sense. I'm not doing it. I'm on my last version of Windows (7). It or XP will get run in a VM on SteamOS for the old Windows 95-XP games I want to play, DOSBox for everything else, and new games Linux/SteamOS only.

If my rig blew up tonight, I'd replace it with a cheap Kaveri rig, because games have plateaued for a while now and the PS4/XB1 aren't really going to push things forward either.
I'm in a holding pattern until the consumer Oculus Rift is ready, or until I find a good 30" 60Hz 4K LCD. Once the Rift is released, if it's 1440x2, a lot of people are going to be upgrading. Eyefinity provides an incentive if people like it; I never did. Right now, though, according to Steam the most popular resolution is just old-fashioned 1920x1080. When I play games on my plasma I sometimes use 720p because it's easier to read text.
So most gamers are at pretty low res. I've always been convinced that this forum, and most of AnandTech outside the PC Gaming subforum, doesn't play many games at all. This is a place where people gather who are excited about specs and enjoy spending money on hardware for the thrill. I was there once, for a long time. There are some gamers here of course, I'm one, but I'm convinced many are simply interested in the hardware. Which is getting stranger over time, because most hardware is grossly overpowered and the obvious benefits are gone.
I was more into the hardware update cycle when we actually needed it; for a long time we HAD to have upgrades or games wouldn't even run. I don't have any games that won't run on my Q9450 (6 years old) / 5870 (5 years old).
But I can tell you about the many times I had to buy very expensive RAM or a new CPU just to play games in the early '90s. Change happens.

Like I said, I'm a developer; I can afford whatever computer I want. I priced out a new system on Newegg tonight, without a monitor, and it came to $2,200. That's an all-out Intel system. Looking at the games I play, I think it's a waste. Kaveri or Carrizo makes more sense to me, but as APUs catch up and HSA gets more software adoption, I'd rather wait it out if I can.
We'll see when the Oculus Rift arrives; maybe then $2,200 on a rig to replace my currently well-working system will be worth it.

+100,000,000

Definitely, they are an integral and vital part of the industry. Mantle is just another in the long list of pioneering innovations developed by AMD. They are central to the gaming industry, owning the Triple Crown in consoles. HSA and its associated foundation are the direction the industry is headed.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No one said that they'd tell me about it; only the hardware vendors would've had early access, not even game developers. But you keep peddling the statement "that it was in development as early as 4 years back" as fact, which is frankly a lie, because some bits of it would've made their way into Win8 at some point in time! Development on DX12 probably started near the time the XBONE was initially conceived alongside AMD, or perhaps later on when the APU was fully functional. That's the best guesstimate for a timeline during which DX12 would've even made it to the drawing board; any time before that certainly doesn't make sense to me.

That was Nvidia's statement, not mine, and I would believe them over you any day.

And fresh off the printer in the AnandTech DX12 article:

Wrapping things up, while DirectX 12 is not scheduled for public release until the Holiday 2015 time period, Microsoft tells us that they’ve already been working on the API for a number of years now.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It's obvious that DX12 has been in development for years. The Xbox One's DX11.x API is one example of that research, and it uses a lot of DX12's new features.

That makes all the claims that DX12 is based on Mantle hilarious.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It's obvious that DX12 has been in development for years. The Xbox One's DX11.x API is one example of that research, and it uses a lot of DX12's new features.

That makes all the claims that DX12 is based on Mantle hilarious.

Technically, Mantle is based on Direct3D. I don't find it that strange that Microsoft would use any CPU optimizations AMD came up with for Mantle.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
It's obvious that DX12 has been in development for years. The Xbox One's DX11.x API is one example of that research, and it uses a lot of DX12's new features.

That makes all the claims that DX12 is based on Mantle hilarious.

Whose hardware is in the XB1? Connect the dots. ;)
 

FiendishMind

Member
Aug 9, 2013
60
14
81
I can't think of anything Nvidia has done to push the industry forward at all. PhysX? 3D Vision??

3D Vision was lame, but it did Trojan-horse 120Hz into many monitors, which definitely helped accelerate the development and adoption of 120Hz LCD monitors.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
3D Vision was lame, but it did Trojan-horse 120Hz into many monitors, which definitely helped accelerate the development and adoption of 120Hz LCD monitors.

I honestly agree with that. I'd question its value, though, if any at all.
As a side effect of 3D Vision they did push the adoption of 120Hz, which wasn't really successful. There still are not enough 120Hz models on the market. I'd love to get 60Hz on a 4K panel but can't even get that. So 3D Vision might not have had much impact after all.
I certainly haven't seen a post on AnandTech asking for 3D Vision on 4K, at least, so I'm guessing the demand for 3D Vision is lacking.

I even opted -against- a 120Hz LCD for my latest monitor. It didn't make sense on price/performance, and it would have looked odd next to the two 60Hz panels flanking it. Frankly I'm not a fan of LCD in general; it was pre-broken to begin with, requiring tons of jury-rigging to get anywhere near the point CRT/plasma-style technologies were at from day one. Hz improvements (30Hz, 60Hz, 120Hz, 240Hz!!! Oh ya, 4K, back to 30Hz - d'oh!), de-judder fixes, backlight changes (I just bought a full-array local-dimming backlit LCD TV; they're STILL trying to fix LCD in 2014), the list goes on.

I've always been a plasma (RIP) guy, and I'm eagerly awaiting OLED so we can bury LCD. I don't want anyone else like Nvidia trying to 'fix' it. I'm kind of tired of all these hacks for LCD myself; now we have 120Hz on the desktop, and G-Sync!
Yeah, I'm sure Nvidia wants me going ga-ga over those. Next please.

-(modern) GPGPU
-Variable refresh monitors for gaming
-(modern) multi-GPU implementations
-Frame pacing (making multi-GPU actually useful)

Just to name a few. You just have to think harder.

It's actually much easier to think of Nvidia innovations when it comes to GPUs.

Hey, first-post guy. Welcome to the forum; interesting first post, to say the least!
I'd like to make a few remarks on your points. By the way, if you were making similarly outrageous claims about AMD, you'd be eaten alive here. You only survived that post because you're in Nvidia territory. I'm happy to hold up the Mantle (get it?) of truth, though.

But first, quite the stretch on the 'modern' points. In fact, 'modern' concedes they've been done before. The Wright Brothers are famous for being first in flight, for a reason. The thousands of engineers at Boeing? Rightly or wrongly, not so much. Sucks, I know, but you're not going to be able to steal the show from the people who did it first.

I admire your attempt though. I wouldn't downplay anyone's work, but let's credit innovation where there's true innovation.

No one but newbies who weren't gaming in the '90s believes NV are masters of innovation. They tried a couple of times, namely with quadratic surface rendering. Flopped.

3dfx did most of your list first; there's nothing 'modern' about Nvidia buying 3dfx and continuing development on 3dfx's firsts. That blows away points 1, 3, and 4. 3dfx had frame limiting; there's nothing new about frame pacing. Oh, trust me, I'm thinking, but you have to be over 30 to have lived through and remember the history. :)

And G-Sync. Really? Hey, I'd honestly love to have it in my hardware. But let's face it, I don't want to go below 60Hz/60fps anyway, so I keep my settings adjusted accordingly. With that being the case, G-Sync offers me nothing. It has less market penetration than 120Hz, which itself was influenced by Nvidia, but also failed to become a market sensation.

Face it, there is little if anything Nvidia can chalk up that comes close to even ONE of AMD's innovations, such as AMD64. There is no comparison between these two companies in that regard. If you want to bash AMD on financials, OK; I'm guessing you'd get off on Apple's performance on Wall Street too, instead of getting fired up about technology.
This is, unfortunately for many, a technology- and engineering-oriented forum, not the New York Stock Exchange.

I don't want to write off all of your points as nonsense, as Nvidia is a successful company. But praise them for their reinvention of guerrilla (viral) marketing and their stock market success, not for innovations in engineering.
That really is Intel and AMD's territory.
 

Alatar

Member
Aug 3, 2013
167
1
81

Nice long post but in the end it has very little substance.

- modern GPGPU

This one is clear as day. There's a theoretical argument to be made about fixed-function stuff existing well before, but the point is that starting with Tesla, CUDA completely changed the GPGPU landscape. CUDA is also what caused OpenCL to be created. Pretty much everything in the current GPGPU industry is the way it is because of Nvidia's GPGPU push with Tesla (G80) and CUDA, including modern OpenCL implementations and GPU compute in games, for example. I have no clue why you would think that 3dfx in any way counters this point. What we currently understand by 'GPGPU' is what Nvidia created with CUDA and Tesla.
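
To make it concrete, here's roughly what that programming model looks like: an arbitrary computation written as a data-parallel kernel and launched across thousands of GPU threads. (This is my own toy sketch for illustration, not vendor sample code.)

Code:
#include <cstdio>
#include <cuda_runtime.h>

// SAXPY (y = a*x + y), the "hello world" of GPGPU:
// each GPU thread handles exactly one array element.
__global__ void saxpy(int n, float a, const float* x, float* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side data
    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy to the GPU, launch the kernel, copy the result back
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]); // expect 4.0

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}

Before G80 and CUDA you had to contort that kind of computation through pixel shaders and texture fetches in a graphics API. That's the whole point.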

- variable refresh monitors

If you say Mantle - usable by ~15% of the gaming market and included in 2 games at the moment - is a huge innovation, then so is a completely new technology that makes lower frame rates seem much smoother. Whether you personally are interested in the tech or not is irrelevant; it's still a huge change from what was previously possible, and it also made AMD react.
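
For anyone fuzzy on why that's a real change: on a fixed-refresh panel, a frame that misses vblank has to wait for the next one, so uneven frame times turn into visible judder; with variable refresh the panel scans out when the frame is ready. A toy model of the difference (mine, purely illustrative):

Code:
#include <cstdio>
#include <cmath>

// Fixed refresh: a finished frame waits for the next vblank.
// Variable refresh: the panel starts scanout when the frame is ready.
double next_vblank(double t, double period)
{
    return std::ceil(t / period) * period;
}

int main()
{
    const double refresh = 16.667; // 60Hz panel, in milliseconds
    const double render_ms[] = { 20.0, 23.0, 19.0, 27.0, 21.0 }; // ~45fps, uneven

    double t = 0.0;
    for (double r : render_ms) {
        t += r; // frame finishes rendering at time t
        double fixed_delay = next_vblank(t, refresh) - t;
        printf("frame ready at %.1f ms: fixed-refresh delay %.1f ms, VRR delay ~0 ms\n",
               t, fixed_delay);
    }
    return 0;
}

The average frame rate is the same either way; what changes is that the extra, randomly sized wait for vblank disappears.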

- modern multi-GPU

Again, quite self-explanatory. 3dfx's scan-line interleave was a completely different technology from Nvidia's SLI, and they were done at different points in time. The fact of the matter is that Nvidia's SLI is what pushed multi-GPU systems to where they are right now. It's Nvidia's SLI that caused AMD to make their first CrossFire implementations. Not 3dfx's, but Nvidia's.

- frame pacing

Again, this isn't frame limiting, frame caps, vsync or anything like that. Proper multi-GPU frame pacing from Nvidia is what made Nvidia multi-GPU setups give a better experience than single-card setups. The same couldn't be said of AMD's, so they were forced to follow the GPU market leader yet again once they realized their CrossFire was mostly useless.
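
To show why it isn't just a frame cap, here's a toy sketch of the two ideas (my own illustration, nothing to do with any vendor's actual driver code): a limiter only enforces a maximum rate, while pacing delays each present so the frame-to-frame intervals stay even - the uneven bunching is exactly what reads as micro-stutter on AFR setups.

Code:
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;
using us = std::chrono::microseconds;

// Frame *limiter*: only enforces a maximum rate. Below the cap,
// frames can still arrive in an uneven short/long/short pattern.
void frame_limit(Clock::time_point& last, us min_interval)
{
    std::this_thread::sleep_until(last + min_interval);
    last = Clock::now();
}

// Frame *pacing*: meters each present against a smoothed estimate
// of the recent frame time, so alternating GPUs (AFR) don't bunch
// their frames together.
struct FramePacer {
    Clock::time_point last_present = Clock::now();
    double avg_frame_us = 16666.0; // running average, starts at ~60fps

    void present()
    {
        auto now = Clock::now();
        double dt = std::chrono::duration<double, std::micro>(now - last_present).count();
        avg_frame_us = 0.9 * avg_frame_us + 0.1 * dt; // smooth the estimate

        // Hold this frame until ~one average interval after the
        // previous one, evening out the delivery cadence.
        std::this_thread::sleep_until(last_present + us((long long)avg_frame_us));
        last_present = Clock::now();
    }
};

Average FPS can be identical with and without pacing; it's the evenness of the intervals that pacing fixes.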

----------

I mean, again, the fact that you have to resort to CPU innovations (a market Nvidia isn't even part of) tells enough of a story here. Aside from Eyefinity and Mantle it's hard to think of AMD GPU innovations. In general they really haven't been the driving force in the GPU industry. They have played catch-up (with features or performance) much more often than Nvidia, and there's nothing wrong with that. But to call Nvidia a PR company is just ignoring the history of the GPU industry.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
As much as I would love to get into how Nvidia re-inventing things is good for keeping competition fierce - which is what pushes the industry forward - I will hold that for another thread and avoid that discussion in a moderated forum ;).

As to AMD's cooperation with MS - I wonder what innovations this console generation will bring to the PC market. I hope for something groundbreaking like we had last time.
This gen already gave us:
-Mantle
-TrueAudio
-unified memory
-high-res textures
-multi-core optimizations (in some cases)
-a new DX?
-easier ports, with little to no console exclusives :)
 