Question: A regular guy asking questions about where the GPU market is headed next

Caveman

Platinum Member
Nov 18, 1999
Context: My last rig was a 4770K with a GTX 970 that I upgraded to a 1080 Ti. Building a new rig now on the AM5 platform to last another 5 or so years. Planning to snag a founders 7900XTX when they come out Dec 13. Looking for some insight into the following questions:

1) Does anyone else think it's logical for AMD to release a 7950XTX in a few months to compete with (and beat) the 4090? I can see it priced at ~$1200 and capturing market share in a big way with better framerates. Do people really care about ray tracing? It's hard to tell much difference in the image - reminds me of audiophiles getting giddy about hearing those 22,000+ Hz frequencies on their tweeters...

2) The 7900 series cards seem like when the Zen CPUs were released. They are "new" and show promise - about as much as the 6000 series did against Nvidia's 3000 series. In that context, do the folks here think it's likely that the next series of video cards from AMD will wipe the floor with Nvidia's offerings because of the superior architecture they have to build upon? Sort of like, we're just getting started... wait till you see what's next...?

3) When would the industry expect the next series of GPUs to drop? Isn't it about a 2 year cycle so maybe Nov 2024? Sound about right?
 

Leeea

Platinum Member
Apr 3, 2020
AMD to release a 7950XTX in a few months to compete (and beat) the 4090?
Not going to happen.

Nvidia's Ada is a 608 mm² chip built on the best 4 nm process available to man.

AMD's RDNA3 is a 350 mm² chip built on 5 nm that's no longer good enough for Apple.

AMD's engineers are good, but they are not that good.

Do people really care about ray tracing?
In the $1200 category they do

reminds me of audiophiles getting giddy about hearing those 22,000+ Hz frequencies on their tweeters
You pretty much hit the nail on the head with that analogy.

In that context, do the folks here think it's likely that the next series of video cards from AMD will wipe the floor with Nvidia's offerings because of the superior architecture they have to build upon?
The RX 8000 series will be 2 years from now - irrelevant to the market for now.

Sort of like, we're just getting started... wait till you see what's next...?
Seems unlikely.

Look at Ada: 2x the chip size of AMD's, on a better process. That is a statement from Nvidia that cannot be ignored.

Nvidia is willing to spend greater than 4x* more per die to stay ahead of AMD. That is not going to change two years from now.

AMD is not going to catch up to that.


*With silicon wafers, cost per good chip climbs faster than linearly as die size increases, due to defect probabilities.
At $10,000 per wafer, 20 defects spread across 100 chips take out 20% of them, yielding 80 good chips at $125 per chip.
The same 20 defects spread across 50 chips take out 40% of them, yielding 30 good chips at $333.33 per chip.

*Another issue is patterning, and how many chips fit on the circular wafer: as chip size goes up, fewer chips can fit along the outside curve of the wafer.
*Another issue is plain process cost: Nvidia's 4 nm is the bleeding edge, with bleeding-edge prices.
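The footnoted yield math can be sketched in a few lines. The $10,000 wafer cost and 20-defect count are the illustrative numbers from the post above, not real foundry data, and the model naively assumes each defect lands on a distinct chip and kills it:

```python
# Illustrative wafer-yield arithmetic: a fixed number of defects per
# wafer hurts large dies more, because each lost chip wastes a larger
# slice of the wafer's fixed cost.
WAFER_COST = 10_000  # dollars per wafer (illustrative, from the post)
DEFECTS = 20         # defects per wafer (illustrative, from the post)

def cost_per_good_chip(chips_per_wafer: int) -> float:
    """Cost per good chip, assuming each defect kills one distinct chip."""
    good_chips = chips_per_wafer - DEFECTS
    return WAFER_COST / good_chips

# Small die: 100 candidates -> 80 good -> $125.00 each
# Large die:  50 candidates -> 30 good -> $333.33 each
print(cost_per_good_chip(100), round(cost_per_good_chip(50), 2))
```

Real foundry models use a defect *distribution* (e.g. Poisson over die area) rather than one-defect-one-chip, but the direction is the same: per-chip cost rises much faster than die area.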

When would the industry expect the next series of GPUs to drop? Isn't it about a 2 year cycle so maybe Nov 2024? Sound about right?
That is about as good a guess as anyone's.
 

Caveman

Platinum Member
Nov 18, 1999
Thanks for the response... It pretty much makes sense, though from what I've been picking up it seems like Nvidia's process is maxed out with no headroom, while AMD has headroom, so something like a 7950XTX would be possible. One can hope. Also, aren't AMD's drivers typically so unpolished at release that there may be a lot of performance still on the table once they get a thorough tuning? I'm not arguing with you, just trying to learn.

Back to ray tracing... I'm a flight simmer and saw that FS2020 has it now (doubtful DCS or IL-2 will ever have it). From what I saw, it's a difference that really makes no difference. It's just "different", not really an improvement per se...
 

GodisanAtheist

Diamond Member
Nov 16, 2006
@Leeea was more or less on top of it.

1) At most we'll see something at the mid-gen/1-year mark as a refresh part. Maybe AMD will put out a larger chip to compete for the pole position, but more likely we'll just get a higher-clocked/faster-RAM part.

Ray tracing is a person-to-person thing. I think most people don't care for ray tracing because of the performance hit; if we could get ray tracing with no performance hit, I figure most people would prefer it. And when you're spending big bucks on a GPU, you want it to perform best in class, and that's what NV offers.

2) There is no denying AMD is more competitive now than it has been since the 290X/780 Ti generation, but NV is no slouch and has shown time and again that it will do anything and everything to win the halo spot each gen. AMD competes with itself (its Zen CPU products) for manufacturing space at TSMC, and as a result designs its GPUs for bang-for-buck rather than best-in-class, because it is simply working under a different set of priorities.

3) Holiday Season 2024 is a very reasonable guess for RDNA4/Blackwell.
 

Leeea

Platinum Member
Apr 3, 2020
Thanks for the response... It pretty much makes sense, though from what I've been picking up it seems like Nvidia's process is maxed out with no headroom, while AMD has headroom, so something like a 7950XTX would be possible. One can hope. Also, aren't AMD's drivers typically so unpolished at release that there may be a lot of performance still on the table once they get a thorough tuning? I'm not arguing with you, just trying to learn.

Back to ray tracing... I'm a flight simmer and saw that FS2020 has it now (doubtful DCS or IL-2 will ever have it). From what I saw, it's a difference that really makes no difference. It's just "different", not really an improvement per se...
AMD doesn't have headroom. The 3 GHz hopes are dead; something went wrong with the production chips. AMD will be lucky to get another 10% out of what they have. It is going to be 2 years before we see a respin; new mask sets are just too expensive. In the meantime, Nvidia is not going to be sitting still either. They can see AMD not far behind in the rear-view mirror with a chip 1/4 the cost, and Nvidia is already designing a cost-is-no-object monster to keep AMD there.

AMD drivers are pretty good these days. It is unlikely there will be much performance left on the table at release.



Ray tracing is good - better than people think. But not in the new games or multiplayer games the majority of people play.

Ray tracing is good in old titles. Nvidia is quietly going back and releasing mod packs for things like Far Cry, and yes, it is just better. Old games also don't need much horsepower to run, so the ray-tracing performance hit is tolerable. For a certain segment, that is very valuable.



If you are an AMD fanboy like I am, RDNA3 is a sweet chip to cheer for.

Half the size, an older process, fewer watts, cheap non-X GDDR memory, and right behind Nvidia's best. An engineering masterpiece.


AMD will be able to crank these things out at less than 1/4 the cost of Nvidia's top-end part. AMD can sell them at $999 all day long and make 2x more money per card than Nvidia makes on $1600 RTX 4090s. Remember, the RTX cards use boutique GDDR6X memory chips made just for Nvidia, while AMD uses the more generic GDDR6 chips used in everything from consoles to FPGAs.

In both $ per frame for the consumer
and in $ per unit sold for AMD
rdna3 knocks it out of the park.


These are difficult economic times, and AMD made a chip for difficult times. It is not the chip I wanted, but it is the chip AMD needed to build.
 

Caveman

Platinum Member
Nov 18, 1999
Thanks guys - learning so much. Very interesting stuff. One basic question: are Gsync and Freesync completely compatible with one another now? i.e., can a Gsync monitor do Freesync with an AMD card, and can a Freesync monitor do Gsync with an Nvidia card?
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Thanks guys - learning so much. Very interesting stuff. One basic question: are Gsync and Freesync completely compatible with one another now? i.e., can a Gsync monitor do Freesync with an AMD card, and can a Freesync monitor do Gsync with an Nvidia card?
- Sort of. Freesync will work with any NV card from the 10xx series onward, while the "Gsync Compatible" moniker means the monitor will work with any AMD GPU from Vega/Polaris onward.

If something is only "Gsync" or "Gsync Ultimate", it means it is still using NV's proprietary module and will only work with NV cards. However, these monitors are extremely rare nowadays.
 

Caveman

Platinum Member
Nov 18, 1999
Sorry, slow on this one... it sounds like if a consumer wants full compatibility and is purchasing a 30x0-series Nvidia card or 6x00-series AMD card, a monitor with Freesync would have VRR capability with either card?

On PCIe 5.0... Realistically, how many years will it be before video cards use PCIe 5.0 capability? Some of the videos I've been watching make it seem like it's a long way out, so I'm wondering why AMD would push PCIe 5.0 tech to the AM5 socket. Perhaps the 8x00 series will be PCIe 5.0? Begs the question why anyone would buy a B650 board if they plan to ride out the AM5 upgrade path.
 

pj-

Senior member
May 5, 2015
- Sort of. Freesync will work with any NV card from the 10xx series onward, while the "Gsync Compatible" moniker means the monitor will work with any AMD GPU from Vega/Polaris onward.

If something is only "Gsync" or "Gsync Ultimate", it means it is still using NV's proprietary module and will only work with NV cards. However, these monitors are extremely rare nowadays.
I dunno about "extremely rare"... The Alienware QD-OLED I have is Gsync Ultimate, and it is considered one of the best gaming monitors available and seems to be pretty popular. It also works with Freesync.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
4,668
3,935
136
I dunno about "extremely rare"... The Alienware QD-OLED I have is Gsync Ultimate, and it is considered one of the best gaming monitors available and seems to be pretty popular. It also works with Freesync.
-I figured I'd get called out on that because I never really look at top-end monitors. Freesync/Gsync Compatible monitors are certainly far more plentiful than proper Gsync monitors, and definitely more so at lower price points.
 

Aapje

Senior member
Mar 21, 2022
AMD doesn't have headroom. The 3 GHz hopes are dead; something went wrong with the production chips. AMD will be lucky to get another 10% out of what they have. It is going to be 2 years before we see a respin; new mask sets are just too expensive.
I disagree. If they can truly get a big gain from a respin, one where they can ask $200 more, then I think they will do it. Especially if the N32 parts don't have this bug and would have to be held back to avoid undermining the 7900 XT(X). By replacing Navi 31, they can 'free' the lower cards and get better returns on those too.
 

pj-

Senior member
May 5, 2015
It would be worth mentioning that the newer version of that monitor ditches Gsync.
They also slightly reduced the refresh rate, and the HDR seems a bit dimmer.

Not sure if that's because they removed the Gsync module or if it's just Dell being Dell.
 

kschendel

Senior member
Aug 1, 2018
"Freesync" is probably too broad a term. What you want in a monitor is VESA Adaptive-Sync, which is essentially Freesync over DisplayPort. I don't know about current monitor offerings; back in the day, you could find monitors that only did Freesync over HDMI, and those won't work (AFAIK) with Nvidia.

As for PCIe 5.0, my guesses are that they did it a) because they could and it was seen as a marketing win, and b) because the real gain is in the CPU-to-chipset link, which now has double the bandwidth without increasing lane (and hence pin/trace) counts.
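As a rough sanity check on that doubling: per-lane raw rates come from the PCIe spec generations (8/16/32 GT/s for gen 3/4/5), effective bandwidth scales linearly with lane count, and gen 3+ uses 128b/130b encoding. The x4 lane count below is an assumption matching typical CPU-to-chipset links, not a confirmed AM5 figure:

```python
# Per-lane raw transfer rates by PCIe generation, in GT/s (from the spec).
GT_PER_LANE = {3: 8.0, 4: 16.0, 5: 32.0}

def link_bandwidth_gbps(gen: int, lanes: int = 16) -> float:
    """One-way effective bandwidth in GB/s (128b/130b encoding, gen 3+)."""
    return GT_PER_LANE[gen] * lanes * (128 / 130) / 8

# A hypothetical x4 chipset-style link doubles from ~7.9 GB/s at gen 4
# to ~15.8 GB/s at gen 5, with no extra lanes, pins, or traces.
print(round(link_bandwidth_gbps(4, lanes=4), 1))  # ~7.9
print(round(link_bandwidth_gbps(5, lanes=4), 1))  # ~15.8
```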
 

Caveman

Platinum Member
Nov 18, 1999
"Freesync" is probably too broad a term. What you want in a monitor is VESA Adaptive-Sync, which is essentially Freesync over DisplayPort. I don't know about current monitor offerings; back in the day, you could find monitors that only did Freesync over HDMI, and those won't work (AFAIK) with Nvidia.

As for PCIe 5.0, my guesses are that they did it a) because they could and it was seen as a marketing win, and b) because the real gain is in the CPU-to-chipset link, which now has double the bandwidth without increasing lane (and hence pin/trace) counts.
Thanks. So... look for a monitor with VESA Adaptive-Sync and it should be able to do VRR with either an AMD or Nvidia card? And it all works best over DisplayPort.

On another note...

Interesting news on the possibility of an AMD graphics card refresh next fall with V-Cache and higher clocks:

 

Leeea

Platinum Member
Apr 3, 2020
Thanks. So... Look for a monitor with VESA adaptive sync and it should be able to have VRR with either an AMD or Nvidia card? And... it all happens best over Displayport.

On another note...

Interesting news on the possibility of an AMD graphics card refresh next fall with V-Cache and higher clocks:
If you want to dream, you can dream.

We will see the RX 7800 next, then the 7700, 7600, 7500, etc.

The money is in the low end for both Nvidia and AMD.
 
