So WHEN is the next big thing for the high end happening?


ozzy702

Golden Member
Nov 1, 2011
Developers will do extra work to support AMD because it's AMD GPUs in the consoles. It's the one advantage AMD has.

We've heard this same thing for years and it's never proven true. Consoles will get love, yes, consoles are always optimized for, but AMD GPUs on PC will see no love, just as they haven't in the past. Having single-digit market share (among actual gamers) means no love, and AMD has a massive uphill battle to fight moving forward.
 

swilli89

Golden Member
Mar 23, 2010
Do you really expect the performance delta between the xx60 & xx80Ti models to be around 40-50% (30 + a bit)?

Maybe I'm giving the 2080 Ti too much credit. Taking another look at it, it seems it's only about 20-25% faster than the 2080, not 35% or more as I assumed. The delta from the 3060 to the 3080 Ti will surely be larger than 50%. The 3080 Ti might even be $1499 if NV really pushes performance on 7nm. If the last few gens can be used to predict things, the 3060 will perform around the 2080, especially given the node jump. The only question is how much the 3060 will cost; the range will be $299 to $499. Probably depends on how Navi looks.
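As a rough sanity check of how those steps compound (reading the quoted "30 + a bit" as roughly 30% from the xx60 to the xx80, plus the 20-25% estimated here for the step up to the Ti):

\[ 1.30 \times 1.20 \approx 1.56 \qquad 1.30 \times 1.25 \approx 1.63 \]

so roughly a 56-63% total delta across the stack, which is where "larger than 50%" comes from.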
 

ozzy702

Golden Member
Nov 1, 2011
Do you really expect the performance delta between the 3060 & 3080Ti models to be around 40-50% (30 + a bit)?

On 7nm, absolutely. A mature 7nm should bring huge gains, and 35-45% is probably about right, especially after the negatively received 2000 series. When is the last time that NVIDIA had two missteps in a row? I doubt we'll be disappointed in the 3000 series. NVIDIA knows that AMD is now flush with cash and that Intel is entering the market; they can't afford to be complacent.
 

lifeblood

Senior member
Oct 17, 2001
We've heard this same thing for years and it's never proven true. Consoles will get love, yes, consoles are always optimized for, but AMD GPUs on PC will see no love, just as they haven't in the past. Having single-digit market share (among actual gamers) means no love, and AMD has a massive uphill battle to fight moving forward.
Yeah, you're right. Consoles (which are not running Windows) get love but PCs do not. I concede the point.
 

moonbogg

Lifer
Jan 8, 2011
On 7nm, absolutely. A mature 7nm should bring huge gains, and 35-45% is probably about right, especially after the negatively received 2000 series. When is the last time that NVIDIA had two missteps in a row? I doubt we'll be disappointed in the 3000 series. NVIDIA knows that AMD is now flush with cash and that Intel is entering the market; they can't afford to be complacent.

I wouldn't hold my breath for Intel to deliver a decent gaming GPU. AMD is just done it seems, so I expect Nvidia prices to just keep climbing. I'm definitely scratching my head trying to figure out who these cards are for though, lol. Hardly a handful of people are going to spend $1500 on a gaming card, but that looks like where the high end is going to stay. Oh well. I got so sick of waiting for a 1080ti replacement that I actually just spent way over a grand on a tennis ball launcher instead, lol. Not kidding.
I'm willing to give Nvidia my play money, but Jesus, they really don't want it this time around it seems. They just aren't interested. They aren't interested in my money when they offer me the same performance for $100 more and call it an RTX 2080, or 20% more performance for a 50% price increase and call it an RTX 2080 Ti. I call both cards a joke. Don't expect anything good from anyone at a decent price for GOD knows how long. I give it a few years and hopefully Intel actually produces and drives costs down, because I think we know AMD is just done now.
 

maddie

Diamond Member
Jul 18, 2010
I wouldn't hold my breath for Intel to deliver a decent gaming GPU. AMD is just done it seems, so I expect Nvidia prices to just keep climbing. I'm definitely scratching my head trying to figure out who these cards are for though, lol. Hardly a handful of people are going to spend $1500 on a gaming card, but that looks like where the high end is going to stay. Oh well. I got so sick of waiting for a 1080ti replacement that I actually just spent way over a grand on a tennis ball launcher instead, lol. Not kidding.
I'm willing to give Nvidia my play money, but Jesus, they really don't want it this time around it seems. They just aren't interested. They aren't interested in my money when they offer me the same performance for $100 more and call it an RTX 2080, or 20% more performance for a 50% price increase and call it an RTX 2080 Ti. I call both cards a joke. Don't expect anything good from anyone at a decent price for GOD knows how long. I give it a few years and hopefully Intel actually produces and drives costs down, because I think we know AMD is just done now.
Ready for some more cat chow?
 

moonbogg

Lifer
Jan 8, 2011
I'm turning vegetarian, starting now.

My claim is that Intel will vomit crap technology on us all the way through 2020 and try to call it a "GPU". As far as the risk I'm willing to take to defend my claim is concerned, I think you know what I'm capable of. It involves large hands and a pile of meow mix. Come at me.
 

ozzy702

Golden Member
Nov 1, 2011
My claim is that Intel will vomit crap technology on us all the way through 2020 and try to call it a "GPU". As far as the risk I'm willing to take to defend my claim is concerned, I think you know what I'm capable of. It involves large hands and a pile of meow mix. Come at me.

That's probably a pretty fair assessment. I don't think we'll see anything exciting overall for a few years (if ever). I do expect the 3000 series to provide a solid performance bump over the 2000 series, but as you mentioned above, don't expect it to be at affordable prices.
 

bryanW1995

Lifer
May 22, 2007
I was hoping that Foxconn would enter the GPU arena, but apparently a sea goddess had other ideas...
 
Mar 11, 2004
You'll see soon enough.

Everything after is also that.

Rome is the start of AMD's chiplets strategy.

X360 already did that.

SC contracts are all separate dies.

The GPU is the NB there.

In the absolute sense, but not in the way I meant, and I'm fairly certain you knew that. Supposedly there will be a substantial change to the base architecture of AMD's GPUs.

Not with GPUs it isn't, and that's also clearly what I mean, and I'm fairly certain you knew that too. I mean, really, you could argue that AMD did chiplets long before then (in fact you straight up did, by saying the Xbox 360 did that).

Lots of consoles did that, and it wasn't quite the same thing. I'm not even sure what your point is, as that has absolutely no bearing on this situation. There are fundamental differences in how AMD is approaching things now. The market has changed. At the time of the 360's development, AMD probably couldn't have offered a monolithic APU, so it wasn't even really an option.

I'm not entirely sure what you mean, or how you can know that for certain if it's what I think you mean (that AMD is not using any off-the-shelf chips in its semi-custom business). I think that could change. There will be some custom silicon, it just wouldn't have to be in the CPU or GPU, letting AMD maximize economies of scale. That means they could offer lower prices, or similar prices with performance that Sony would be fine with, while AMD saves on R&D and cuts them a volume discount. Or AMD could make the case that the savings from not doing a hefty custom chip lets them include something like NAND, or maybe more memory, that Sony otherwise couldn't get if they were paying for custom CPU chiplets.

We'll see. I think that will change starting with these. I could see them keeping the GDDR6 memory controllers on the GPU, but then routing through the I/O die to the CPU and NAND.

The thing is, as a consumer, I don't grade on a curve. I want the best product - whether that's best in absolute terms, or in terms of perf/$, or perf/watt, or whatever metric I'm trying to optimize. And right now, AMD isn't really being all that competitive in those terms. (One exception is the Radeon WX 5100, currently the best pro card under 75W and the most powerful card of any kind under 75W. But even there, the poor driver support for stuff like OpenGL hurts them.) I don't care if they are working with fewer resources; that's their problem, not mine.



As long as they stick with GCN, they won't be able to compete in terms of perf/transistor and perf/watt. GCN is not optimal for gaming compared to Nvidia's Maxwell and subsequent architectures. And it's clear that they don't know how to fix it (if it even is fixable). The current Chinese team didn't invent GCN and doesn't understand it; the Canadian team that did was laid off in 2013. They clearly can't fix the inherent limitations of the architecture, such as 4 triangles/clock (Nvidia's top architectures have done 6 triangles/clock for years).



The only way that could possibly ever work is if they managed to pull off transparent multi-GPU with 2+ cards presenting to the system as 1 card. Anything requiring any support from developers is an absolute nonstarter, since no developer will do extra work for AMD support/optimization unless they're paid to.

That's fine, but sorry, it's not that huge of a disparity. We've seen both companies make up bigger disparities in a single generation. And the market is far more diverse, so they don't even need to directly compete as much for both to be successful. I think that's true even with Intel: the overall demand for GPUs is strong enough that they can all do well. Certainly competition will happen and will impact them, but I don't think it necessarily spells doom for any of them.

We've been hearing the same thing (that GCN is horrible; heck, some have been arguing for years that it's outright broken), and yet it's kept AMD fairly competitive even while AMD focused on CPU development (at the expense of GPU development resources). Speaking of hearing the same thing, I still see people saying the console market can't support more than 2 companies, yet Sony, Microsoft, and Nintendo have all been around competing for nearly 2 decades now, all making billions of dollars from gaming, and they've continued to even though there's competition from other markets (smartphones, PC).

I agree AMD needs to improve their GPUs, but I don't think it's nearly as drastic as people have been saying. I also don't think it's even the hardware that has been the issue, but rather the software resources that have been the real problem for AMD. We've seen that when the software utilizes AMD's GPUs, they're very strong (and outright competitive, sometimes even better than Nvidia's). That's generally in compute-related stuff, but we saw with Vulkan and Doom that it can happen in gaming as well. Heck, even outside that AMD is competitive (in perf/$). The issue is that in higher-end discrete gaming, AMD's GPUs aren't well utilized, and to try and make up for it AMD pushes outside the efficiency range of their products on the manufacturing processes they've used.

The thing is, that's just how these newer graphics APIs look at GPUs (a rough sketch of what that looks like is below, after the links). Heck, remember when they touted that it would actually improve mGPU?
https://arstechnica.com/gaming/2016...lly-work-together-but-amd-still-has-the-lead/
And I totally forgot about this (I've been speculating that AMD won't call their future mGPU stuff Crossfire; turns out they already did that):
https://hothardware.com/news/amd-ab...lti-gpu-setups-since-dx12-does-it-differently
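Purely as an illustration of the "look at GPUs" point above, and not something taken from the linked articles: under an explicit API like Vulkan, every adapter is simply enumerated and handed to the application, and any mGPU scheme is something the engine has to build on top of that. A minimal sketch, assuming the standard Vulkan headers (names are illustrative):

```c
/* Hypothetical sketch, not from the linked articles: enumerating GPUs the way
 * an explicit API exposes them. Each physical device is listed individually;
 * splitting work across them is left entirely to the application/engine. */
#include <stdio.h>
#include <vulkan/vulkan.h>

int main(void)
{
    VkApplicationInfo app_info = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "mgpu-enumeration-sketch",  /* illustrative name */
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo create_info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app_info,
    };

    VkInstance instance;
    if (vkCreateInstance(&create_info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* First call gets the count, second call fills the array. */
    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, NULL);

    VkPhysicalDevice gpus[16];
    if (gpu_count > 16)
        gpu_count = 16;
    vkEnumeratePhysicalDevices(instance, &gpu_count, gpus);

    for (uint32_t i = 0; i < gpu_count; ++i) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpus[i], &props);
        /* Every card (or iGPU) shows up as its own adapter; nothing here
         * makes two of them look like one device to the game. */
        printf("GPU %u: %s\n", i, props.deviceName);
    }

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

DX12's explicit multi-adapter is the same idea conceptually: the adapters show up individually, and linking them is the developer's job, which is exactly why the "who does the work" question matters here.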

Another thing is that scaling on modern games is possibly already at the level I was talking about:
It's definitely noticeable that the most recent games don't have Crossfire support, but if they target just the top games (or top engines) it could be worthwhile for quite a lot of gamers. And again, scaling doesn't have to be great for it to be worth it, because of Nvidia's prices.

So yes, there needs to be some work done. But the devs might already be doing that work (for Google, and possibly others that might be doing similar game streaming services; Google is touting mGPU support on their streaming service). And AMD could pay devs to do the work if they haven't already (or possibly do it for them), or they could work with the companies doing the game engines to get some level of inherent support that would make it easy for game developers to enable. Heck, if they just got some OK level of support inherent to UE and Unity, that'd quickly change the situation due to how many games use those. AMD already needs to be working to get developers to better utilize their CPUs as well.

The single biggest disparity between AMD and Nvidia in gaming is probably the developer outreach. AMD has recognized that (it was the impetus behind them pushing open source, in that it gave an avenue for developers to utilize when AMD couldn't provide as much support, and developers tend to prefer open source so it was also an attempt to appease them; it does take time to develop, but we are starting to see some fruits of that, as there's been big progress made with AMD GPUs on Linux).

We'll see what they do to change things. I think AMD should have something like 4 dev boxes sent to major developers: one with their mobile/low-end APUs; one with a mainstream CPU and GPU (think a 2600 with an RX 470-class GPU); a higher-end one with a top-of-the-line consumer CPU and a single top-of-the-line GPU; and then a Threadripper system with multiple GPUs. That last one might become more important as they move to multi-die, possibly even at the consumer level, so maybe a 5th that includes a two-die Ryzen with a pair of Navi GPUs.

mGPU just needs the right situation to make it worth being developed again. I think that exists now, through a confluence of high prices, a lack of other ways of pushing performance forward (gains are getting smaller and smaller, and the gains from new processes are as well), the industry trying to transition to a MUCH more power/performance-hungry graphics method, and gaming moving to the cloud (where mGPU has been the standard for other GPU tasks).

And yes, this is just pure speculation on my part, with nothing (not even rumors) supporting that AMD will be doing mGPU with Navi. I don't think it'll be an overnight thing, and I don't think it'll be some magic great scaling across all titles (like I said, it doesn't need to be, given the high prices of higher-end stuff). In fact, I won't be surprised if it's not even hinted at with Navi's launch, especially if it doesn't really need to be (as in, Navi is good enough that it eclipses anything Nvidia has within $100 of its price). But it would be something they could announce months later, maybe if sales slow some (or once they could meet any extra demand without it hurting stock and driving up prices; let more people get Navi cards early without straining stock and pushing end prices higher), or say around the time that Nvidia might launch their 7nm GPUs.