[Rumor, Tweaktown] AMD to launch next-gen Navi graphics cards at E3


SteveGrabowski

Diamond Member
Oct 20, 2014
9,348
8,030
136
PS5 is rumored to cost $399, and Xbox Series X is rumored to cost $499.

Rumors say that the XSX is more powerful, which would lend credibility to the rumored TFLOPs figures: PS5 - 9.2 TFLOPs, XSX - 12 TFLOPs.

Who cares? If the PS5 is $399, it's a day-one buy for me.

I think most are expecting $499 for the PS5. If it's got a Ryzen 7 + 5700 XT for $399, oh my god.
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
Also, what about die size? The RX 5700 XT is already 251 mm² without RT and without the CPU cores. How big will this APU be? What is its transistor budget?

Exactly. 8-core Zen 2 + Navi 10? Not gonna happen, and certainly not monolithic. I suspect it will be smaller than Navi 10; I wouldn't be surprised if it's only 4 Zen 2 cores and not a monolith.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Exactly. 8-core Zen 2 + Navi 10? Not gonna happen, and certainly not monolithic. I suspect it will be smaller than Navi 10; I wouldn't be surprised if it's only 4 Zen 2 cores and not a monolith.
I think most are expecting $499 for the PS5. If it's got a Ryzen 7 + 5700 XT for $399, oh my god.
It may have 2560 ALUs, but for yield and to salvage every possible APU die, most likely they will cut it to 2304 ALUs. Just like with every single console APU in Sony's history.

Yes, the die will be monolithic.
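
For what it's worth, 2304 ALUs lines up with the rumored 9.2 TFLOPs if you assume a ~2.0 GHz game clock, and the 12 TFLOPs XSX figure works out with 3328 ALUs at ~1.8 GHz. The clocks and the XSX ALU count are my assumptions picked to make the rumored numbers fit, nothing confirmed. Quick sanity check in Python:

# FP32 throughput: 2 ops (one FMA) per ALU per clock
def tflops(alus, clock_ghz):
    return 2 * alus * clock_ghz / 1000  # ALU count * GHz -> TFLOPs

print(tflops(2304, 2.0))  # 9.216 ~ the 9.2 TFLOPs PS5 rumor (clock assumed)
print(tflops(3328, 1.8))  # ~11.98 ~ the 12 TFLOPs XSX rumor (ALUs and clock assumed)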
 

uzzi38

Platinum Member
Oct 16, 2019
2,747
6,657
146
Exactly. 8-core Zen 2 + Navi 10? Not gonna happen, and certainly not monolithic. I suspect it will be smaller than Navi 10; I wouldn't be surprised if it's only 4 Zen 2 cores and not a monolith.

It's happening, and the Xbox is definitely larger than the PS5 too.
 
  • Like
Reactions: Glo.

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Let's be honest here. To the everyday person, "crushed" is definitely not the descriptor to use. But we aren't everyday people. We're enthusiasts. And when we see NV's historically mid-range GPU outperform AMD's newest top-range GPU, "crushed" is putting it nicely.

You are forgetting that the GTX680 was released 3 months after the HD7970, and it was NVIDIA's top-range card until the GTX780 was released 14 months later. Also, both chips were mid-level, with GK104 a little smaller than Tahiti XT.

GTX 680 demolished 7970. And all the "fine wine" and "OC it to the moon" rhetoric won't change this.

If the GTX680 demolished the HD7970, then the R9 290X XXX the GTX780 in size, price and longevity. It was a really embarrassing moment for NVIDIA. The real XXX happened a few months later when the R9 290X even demolished the GTX780 Ti.
 
  • Like
Reactions: Ranulf

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
PS5 is rumored to cost $399, and Xbox Series X is rumored to cost $499.

Rumors say that the XSX is more powerful, which would lend credibility to the rumored TFLOPs figures: PS5 - 9.2 TFLOPs, XSX - 12 TFLOPs.

Who cares? If the PS5 is $399, it's a day-one buy for me.
Yeah, I'm sure it won't be less than $499. I find it very hard to believe that even for $500 we can get a console with such high-end specs. Could be a good reason for someone to switch from PC to console gaming.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Exactly. 8-core Zen 2 + Navi 10? Not gonna happen, and certainly not monolithic. I suspect it will be smaller than Navi 10; I wouldn't be surprised if it's only 4 Zen 2 cores and not a monolith.

7nm+ (EUV) is denser and cheaper than 7nm

7nm+ vs 7nm = 15% higher density and 10% lower power.

Also, unlike 7nm, there is no need for triple patterning on 7nm+ EUV. That will lower cost (fewer masks, fewer rules and much easier IC design), and fewer manufacturing steps mean fewer consumables, less time to finish the wafers, etc.

The PS5 chip could be 300-330 mm² and cost no more than what Navi 10 did at launch.
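
As a rough sanity check on that number: Navi 10 is 251 mm² and the desktop Zen 2 8-core chiplet is about 74 mm², both on 7nm. Simply adding the two and applying the 15% density gain is a crude assumption on my part (it ignores RT hardware, console I/O and uncore), but it lands in the same ballpark:

# back-of-the-envelope APU die estimate, all figures in mm^2
navi10 = 251       # RX 5700 XT die (7nm)
zen2_ccd = 74      # desktop Zen 2 8-core chiplet (7nm)
raw_7nm = navi10 + zen2_ccd         # ~325 if simply glued together
scaled_7nm_plus = raw_7nm / 1.15    # ~283 with 15% higher density
print(raw_7nm, round(scaled_7nm_plus))  # 325 283

The gap between ~283 mm² and 300-330 mm² leaves room for the RT hardware and console-specific I/O that neither die includes.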
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Videocardz leaked a picture and specs of an overclocked ASRock 5600 XT. The GPU is essentially a slightly cut-down (192-bit bus) Navi 10 (assuming the leak is real): same CU count but lower clocks and slower VRAM. The ASRock model had a triple-fan cooler on it, but everything about the card's specs says it runs cooler and uses a lot less power, plus it only had a single 8-pin power connector. I think ASRock took the lazy path and used a 5700 XT cooler for the 5600 XT. The cooler is 4" - 5" longer than the actual card.
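
If the leaked memory spec is right, the bus cut matters more than the clocks. Assuming 12 Gbps GDDR6 on the 5600 XT (the leaked figure, not confirmed) against the 5700 XT's 14 Gbps on a 256-bit bus:

# memory bandwidth = bus width in bytes * per-pin data rate
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin  # GB/s

print(bandwidth_gbs(256, 14))  # 448.0 GB/s - RX 5700 XT
print(bandwidth_gbs(192, 12))  # 288.0 GB/s - rumored RX 5600 XT, ~36% less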
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You are forgetting that the GTX680 was released 3 months after the HD7970, and it was NVIDIA's top-range card until the GTX780 was released 14 months later. Also, both chips were mid-level, with GK104 a little smaller than Tahiti XT.

I'm not forgetting anything; I'm talking about the competition between the two products, which means comparing the GTX 680's March release to AMD's paper launch in December / physical launch in January. Enthusiasts were aware of what was under the hood of the GTX 680.

Regarding the later release of the GTX 780, AMD had to roll out a whole new chip to even compete with it, whereas NV just released a product they would normally have released 14 months earlier had Tahiti not been so underwhelming. EDIT: in comparison to what Kepler could do.

If the GTX680 demolished the HD7970, then the R9 290X XXX the GTX780 in size, price and longevity. It was a really embarrassing moment for NVIDIA. The real XXX happened a few months later when the R9 290X even demolished the GTX780 Ti.

You're right, NV must have been so embarrassed that the product they made to compete with AMD's first release had to settle for competing with AMD's second release! I'm sure AMD called it a win.

The issue with these arguments is that you only focus on performance. NV rolled out one chip that basically competed with two chips from AMD. They made bank, while AMD hemorrhaged. By the time AMD could compete, bitmining screwed them, HARD. AMD fans couldn't buy the Radeon 290/290X due to gouging, and then AMD couldn't sell new stock because of the second-hand market. Some of us got our 290/290Xs at ridiculously low prices.

I don't know how anyone can look back at any of this and believe "AMD was doing good" or, worse, "AMD was beating Nvidia."
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
Videocardz leaked a picture and specs of an overclocked ASRock 5600 XT. The GPU is essentially a slightly cut-down (192-bit bus) Navi 10 (assuming the leak is real): same CU count but lower clocks and slower VRAM. The ASRock model had a triple-fan cooler on it, but everything about the card's specs says it runs cooler and uses a lot less power, plus it only had a single 8-pin power connector. I think ASRock took the lazy path and used a 5700 XT cooler for the 5600 XT. The cooler is 4" - 5" longer than the actual card.
The leaked 3DMark scores were for standard GPUs. AIB versions might be up to 5% faster than the reference models.

That would put this GPU extremely close to the RTX 2060 in performance. It is already on par with the GTX 1070 Ti; adding 5% brings the RX 5600 XT to RTX 2060 levels.

Also, the TBP should be lower than 150W for some mildly overclocked AIB models. Heavily OCed versions will exceed this power draw.

If AIB models are $299, that might still be acceptable for what it is. It will be the best-value GPU of this generation, after all.
 
  • Like
Reactions: lightmanek

uzzi38

Platinum Member
Oct 16, 2019
2,747
6,657
146
The leaked 3DMark scores were for standard GPUs. AIB versions might be up to 5% faster than the reference models.

That would put this GPU extremely close to the RTX 2060 in performance. It is already on par with the GTX 1070 Ti; adding 5% brings the RX 5600 XT to RTX 2060 levels.

Also, the TBP should be lower than 150W for some mildly overclocked AIB models. Heavily OCed versions will exceed this power draw.

If AIB models are $299, that might still be acceptable for what it is. It will be the best-value GPU of this generation, after all.
There's no reference 5600 XT.

Only AIBs.

There is a reference spec, but no blower model, for example.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
The leaked 3DMark scores were for standard GPUs. AIB versions might be up to 5% faster than the reference models.

That would put this GPU extremely close to the RTX 2060 in performance. It is already on par with the GTX 1070 Ti; adding 5% brings the RX 5600 XT to RTX 2060 levels.

Also, the TBP should be lower than 150W for some mildly overclocked AIB models. Heavily OCed versions will exceed this power draw.

If AIB models are $299, that might still be acceptable for what it is. It will be the best-value GPU of this generation, after all.
My point was that I think this card will not be clocked to high heaven, as is AMD's usual modus operandi. This card might run cool with excellent perf/watt. Of course, that would also give it nice overclocking potential.

As for price, I'm not too optimistic. AMD seems to be overpricing their stuff (compared to their own history, not compared to Nvidia; they're both guilty of it currently).
 
  • Like
Reactions: Ranulf

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I'm not forgetting anything; I'm talking about the competition between the two products, which means comparing the GTX 680's March release to AMD's paper launch in December / physical launch in January. Enthusiasts were aware of what was under the hood of the GTX 680.

Regarding the later release of the GTX 780, AMD had to roll out a whole new chip to even compete with it, whereas NV just released a product they would normally have released 14 months earlier had Tahiti not been so underwhelming. EDIT: in comparison to what Kepler could do.



You're right, NV must have been so embarrassed that the product they made to compete with AMD's first release had to settle for competing with AMD's second release! I'm sure AMD called it a win.

The issue with these arguments is that you only focus on performance. NV rolled out one chip that basically competed with two chips from AMD. They made bank, while AMD hemorrhaged. By the time AMD could compete, bitmining screwed them, HARD. AMD fans couldn't buy the Radeon 290/290X due to gouging, and then AMD couldn't sell new stock because of the second-hand market. Some of us got our 290/290Xs at ridiculously low prices.

I don't know how anyone can look back at any of this and believe "AMD was doing good" or, worse, "AMD was beating Nvidia."

The problem is you're completely wrong; the GTX680 and GTX780/Ti didn't use the same chip.

HD7970 = Tahiti - released Jan 2012

GTX680 = GK104 - released March 2012

GTX780 = GK110 - released May 2013

R9 290X = Hawaii - released Oct 2013

GTX780 Ti = GK110B - released Nov 2013


Both AMD and NVIDIA released two chips and made 4 products, 2 from each chip.
GK110 was way larger and more expensive than Hawaii and it was definitely not meant to compete against Tahiti (HD7970).

Tahiti = 352 mm²
GK104 = 294 mm²
Hawaii = 438 mm²
GK110 = 561 mm²
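
Putting ratios on those figures makes the point plain (straight division of the numbers above, nothing assumed):

# die-size ratios from the figures above (mm^2)
print(294 / 352)  # ~0.84: GK104 was ~16% smaller than Tahiti
print(561 / 438)  # ~1.28: GK110 was ~28% larger than Hawaii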
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I'm not forgetting anything; I'm talking about the competition between the two products, which means comparing the GTX 680's March release to AMD's paper launch in December / physical launch in January. Enthusiasts were aware of what was under the hood of the GTX 680.

Regarding the later release of the GTX 780, AMD had to roll out a whole new chip to even compete with it, whereas NV just released a product they would normally have released 14 months earlier had Tahiti not been so underwhelming. EDIT: in comparison to what Kepler could do.

This is wrong. The GK110 wasn't even announced until November 12, 2012, and then only for things like the Tesla, with NO consumer cards. It was NEVER going to be released alongside the GTX 680. The 780 coming six months after the Tesla is about when nVidia typically puts out a "consumer" version of the big-chip GPU.

The whole idea of the GK-104 being a "mid-range chip" stemmed from it having all of its compute ripped out, which made it a smaller chip. But this didn't make it lower-end; it was part of nVidia changing how it built its gaming GPUs. The 480 and 580 were both monster chips that sucked power and had lots of compute. For AMD, the 5xxx/6xxx series was more of a slim gaming GPU.

But then AMD went "we need to compete with the 580", so Tahiti had a bunch of compute. That turned out to be great long-term, but it really hurt early on, as nVidia then released a slim GK-104 that gamed just as fast but at much lower power because it lacked all that extra compute. So each maker countered the other's previous generation.

We are now back to both makers doing things roughly the same way, only AMD currently lacks a big-chip design. We will have to see what this year brings.
 
  • Like
Reactions: Ranulf

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The problem is you're completely wrong; the GTX680 and GTX780/Ti didn't use the same chip.

HD7970 = Tahiti - released Jan 2012

GTX680 = GK104 - released March 2012

GTX780 = GK110 - released May 2013

R9 290X = Hawaii - released Oct 2013

GTX780 Ti = GK110B - released Nov 2013


Both AMD and NVIDIA released two chips and made 4 products, 2 from each chip.
GK110 was way larger and more expensive than Hawaii and it was definitely not meant to compete against Tahiti (HD7970).

Tahiti = 352 mm²
GK104 = 294 mm²
Hawaii = 438 mm²
GK110 = 561 mm²

Just like the HD 5870 (334 mm²) wasn't supposed to compete with the GTX 480 (529 mm²)?

You're going to tell me AMD's goal was to compete with GK104? Or that they were blindsided by GK104's performance? This is like Glo. trying to spin AMD originally meaning to charge more, only to reduce the price days before release, as a "gotcha!" move and not an "oh shoot!"

When I said "one chip", you know I meant one family. NV didn't have to design a new chip to compete; they had the GK100 -> GK110 ready to respond to whatever AMD brought out.

You're right, GK110 wasn't supposed to compete with the HD 7970; they didn't need it to. Their half-chip did it for them, allowing NV to cash-cow GK100/110. This is not a win for anyone but Nvidia.

This is wrong. The GK110 wasn't even announced until November 12, 2012, and then only for things like the Tesla, with NO consumer cards. It was NEVER going to be released alongside the GTX 680. The 780 coming six months after the Tesla is about when nVidia typically puts out a "consumer" version of the big-chip GPU.

There was no "need" for a consumer version. Oddly, when there was, guess who got trotted out. It's as if no one remembers HD 5K vs GTX 400: NV released a clearly unready GF100 to compete with a healthy AMD and Cypress. That is an example of AMD forcing NV to respond, not this nonsense of AMD constantly playing catch-up since the HD 5K.

Again, we enthusiasts knew GK100/GK110 was waiting; it was just a matter of when.

The whole idea of the GK-104 being a "mid-range chip" stemmed from it having all of its compute ripped out, which made it a smaller chip. But this didn't make it lower-end; it was part of nVidia changing how it built its gaming GPUs. The 480 and 580 were both monster chips that sucked power and had lots of compute. For AMD, the 5xxx/6xxx series was more of a slim gaming GPU.

But then AMD went "we need to compete with the 580", so Tahiti had a bunch of compute. That turned out to be great long-term, but it really hurt early on, as nVidia then released a slim GK-104 that gamed just as fast but at much lower power because it lacked all that extra compute. So each maker countered the other's previous generation.

We are now back to both makers doing things roughly the same way, only AMD currently lacks a big-chip design. We will have to see what this year brings.

Exactly! NV segmented their product line, and it caught AMD off guard. You can form your own opinions, but I believe that's why AMD upped the HD 7970 to $550 from the HD 6970's $380 launch price. They were probably not expecting NV's gains. And with Navi we see AMD doing what NV did with Kepler - driving a wider wedge between their compute cards and "sleek" gamer cards while simultaneously raising prices.

The only thing I disagree with is that it was a long-term gain for AMD. The HD 7970 is where you can trace a lot of AMD's bad luck. This card, while an amazing product, was a sore spot for AMD for years, spawning bogus memes like FineWine and further eroding AMD's brand value with 3+ game bundles on top of price cuts. Just look at the vicious reception to AMD raising prices.
 
Last edited:
  • Like
Reactions: DooKey

DooKey

Golden Member
Nov 9, 2005
1,811
458
136
@railven

I purchased two 7970s on release. Ouch. Talk about expensive. The Xfire experience was awful, and I happily sold them and purchased two 680s.

With that said, I eventually purchased two 290s, and that wasn't as bad an Xfire experience.
 

Ranulf

Platinum Member
Jul 18, 2001
2,911
2,584
136
Videocardz leaked a picture and specs of an overclocked ASRock 5600 XT. The GPU is essentially a slightly cut-down (192-bit bus) Navi 10 (assuming the leak is real): same CU count but lower clocks and slower VRAM. The ASRock model had a triple-fan cooler on it, but everything about the card's specs says it runs cooler and uses a lot less power, plus it only had a single 8-pin power connector. I think ASRock took the lazy path and used a 5700 XT cooler for the 5600 XT. The cooler is 4" - 5" longer than the actual card.

Heh, you weren't kidding about the size. Well, it had better be cool and whisper-quiet.

 
  • Like
Reactions: lightmanek

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
6GB, what idiots. Here's hoping we will see decently priced refreshes of Navi, with 12GB for the 5600 and 16GB for the 5700 series.
 

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
The problem is you're completely wrong; the GTX680 and GTX780/Ti didn't use the same chip.

HD7970 = Tahiti - released Jan 2012

GTX680 = GK104 - released March 2012

GTX780 = GK110 - released May 2013

R9 290X = Hawaii - released Oct 2013

GTX780 Ti = GK110B - released Nov 2013


Both AMD and NVIDIA released two chips and made 4 products, 2 from each chip.
GK110 was way larger and more expensive than Hawaii and it was definitely not meant to compete against Tahiti (HD7970).

Tahiti = 352 mm²
GK104 = 294 mm²
Hawaii = 438 mm²
GK110 = 561 mm²
The GTX680 was small Kepler, just like the GTX980/1080/2080, and like the GTX460/560 Ti before the Kepler naming change.
The GTX780 was big Kepler, just like the 980 Ti/1080 Ti/2080 Ti, and like the GTX280/285 and GTX480/580 before the Kepler naming change.
The only reason they didn't release big Kepler alongside small Kepler was that the 7970 was slow and overpriced, so for the first time they didn't need the big die to compete with AMD's fastest GPU. Instead they renamed what should have been the GTX660 Ti into the GTX680 flagship, charged $500 instead of $300 (or whatever it should have been), held the big die back for datacenters only, and a few months later released a cut-down big die as the $1000 Titan instead of as a $500 GTX680 (the GTX480 and 580 were big dies for $500).
 
Last edited:

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Man, the GPU market just sucks right now. My $340 GTX 970 was pretty good for almost five years. It wasn't until around that fifth year or so that I had to turn any settings below high to get a consistent 60 fps at 1080p. Now I can't see anything on the market at that price range that I wouldn't want to replace in two years max, given how powerful the next-gen consoles look to be, and you know AAA games will mostly target them ahead of PC.

I mean, there's no doubt these consoles are going to be subsidized by Sony and MS, since the long game is to get you to buy your games and microtransactions on their platform so Sony/MS can take their 30 cents on the dollar there, and also so you'll pay for memberships like PS Plus / Xbox Live that are virtually pure profit. Even the puny PS4 and XB1 were likely sold at a small loss at launch once you factor in the retailer's cut. But man, a low-binned Ryzen 7 + an RX 5700 XT-level GPU + 16GB GDDR6 (guessing on size) + 1TB SSD (also guessing on size) + a power supply to run it, for $500? It sounds absurd based on PC hardware prices. But that's seemingly the best guess at what's going to be in the system.
Yeah, it's sad how both AMD and NV work together now just to milk gamers.
Console:
R7 3700X + 12-16GB GDDR5/6
512GB-1TB SSD
RX 5700 XT or faster, with RT support
power supply
$500 for all that

PC: an RX 5700 XT AIB alone is $440 with NO RT support. o_O I'm saying it again and again: someone needs to sue them for price fixing.
 
Last edited:

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
Yeah, it's sad how both AMD and NV work together now just to milk gamers.
Console:
R7 3700X + 12-16GB GDDR5/6
512GB-1TB SSD
RX 5700 XT or faster, with RT support
power supply
$500 for all that

PC: an RX 5700 XT AIB alone is $440 with NO RT support. o_O I'm saying it again and again: someone needs to sue them for price fixing.

Yes, I agree with what you're saying. I understand the bulk pricing of console orders, but this is ridiculous.

AMD may not care if gamers start fleeing by the millions; they may not get you as a customer through a graphics card, but they will get you as a customer through a console. The problem is what Nvidia is going to do. If they lose you as a customer, they lose you for good.

I really hope Jensen is watching and they cut the crap with Ampere. I want 10 TFLOPs and 12GB at $400, or they will get the D! Ideally, I would like the same from big Navi as well.
 