[Rumor, TweakTown] AMD to launch next-gen Navi graphics cards at E3

Page 112
Status
Not open for further replies.

Elfear

Diamond Member
May 30, 2004
7,169
829
126
Beyond a few people on this board, does anyone specifically buy a monitor because it supports FreeSync or G-Sync?

Absolutely. It's rare for a new monitor technology to make such a big gameplay difference. I avoided G-Sync monitors in the past because I didn't want to be locked down to one GPU manufacturer (even though I've primarily had Nvidia cards the last few years). Until Nvidia started supporting "G-Sync compatible" monitors, I avoided throwing big money down on any Adaptive Sync monitor. Now I specifically look for FreeSync 2 or G-Sync compatible monitors. I would have to get an extremely good deal on a G-Sync-only monitor to even consider it.
 
  • Like
Reactions: Ranulf

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
It fits the definition of "something the mass market doesn't care about" quite perfectly

It doesn't, because without the sweet, sweet margins from Tesla cards we would either not be getting any new dGPUs, or they would be even more costly.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Beyond a few people on this board, does anyone specifically buy a monitor because it supports FreeSync or G-Sync?
When I bought my most recent monitor I refused to pay the G-Sync tax but I specifically looked for a Freesync monitor. If I get an Nvidia card in the near future and the adaptive sync function doesn't work with it, oh well, I will not shed a tear. But while I have an AMD card I'm enjoying it.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,227
126
Am I on drugs? A website is selling the MSI 1660 Ti for $69.99. Is this for real?
I'm not even going to go to, nor quote, the link. I can tell you without even checking: 100% FAKE.

eBay is flooded with FAKE "GTX 950" / "GTX 750" / "GTX 1050" cards with "unusual" memory configurations, shader counts, etc.

What they do is take an ancient GPU (Fermi rides again!), something like a GTX 560 Ti equivalent, solder it to a board with some RAM, slap on a modern-looking cooler, and doctor the BIOS so that it shows up in drivers and in Windows as some "modern" GPU model. This unfortunately works because of NV's much-"unified" drivers, which work across cards/GPUs; in this case, that works against them.

GPU-Z now has a "FAKE" indicator for when it can detect that a card is a fake.

Edit: Unless it's a simple typo and the card is supposed to be $269.99 - that I could believe.
N.B. I own three "legit" GTX 1660 Ti cards. I paid $280, $210 (used, private party), and $310 ('Gaming X' model). You're not going to find them significantly cheaper; they are in demand for mining too.
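For illustration only, here is a toy sketch of the kind of cross-check a fake-card detector could run: compare what the spoofed BIOS/driver reports against a known-good spec table for that model name. The model entries and spec values below are made-up examples, not GPU-Z's actual data or method.

```python
# Made-up reference table: what a real card of each model should report.
KNOWN_SPECS = {
    "GTX 1050": {"shaders": 640, "bus_width_bits": 128, "vram_gb": 2},
    "GTX 750":  {"shaders": 512, "bus_width_bits": 128, "vram_gb": 1},
}

def looks_fake(reported_name, reported):
    """Flag a card whose reported hardware doesn't match its model name."""
    expected = KNOWN_SPECS.get(reported_name)
    if expected is None:
        return False  # unknown model: can't judge
    # A doctored BIOS can spoof the name string, but the shader count and
    # memory layout of the underlying (older) silicon usually give it away.
    return any(reported.get(k) != v for k, v in expected.items())

# A rebadged Fermi-era chip claiming to be a "GTX 1050":
print(looks_fake("GTX 1050", {"shaders": 384, "bus_width_bits": 256, "vram_gb": 2}))  # True
# A card whose reported specs actually match:
print(looks_fake("GTX 1050", {"shaders": 640, "bus_width_bits": 128, "vram_gb": 2}))  # False
```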
 
  • Like
Reactions: Mopetar

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
What does a supposedly good price for a freaking GTX 1660 Ti have to do with the freaking next-gen Navi thread?

And you are asking if you are on drugs?:p
 
  • Haha
Reactions: Elfear

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
I found that ad when I searched Google Shopping. It was listed 3 or 4 times. I just repeated the same search, and now it appears only once. I guess Google's filter smelled a rat or something.
It didn't take a search engine filter to realize what was wrong there.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,501
9,931
136
What does a supposedly good price for a freaking GTX 1660 Ti have to do with the freaking next-gen Navi thread?

And you are asking if you are on drugs?:p

- Ran out of things to talk about a few pages ago. Nothing to really say until the RX 5500 or the RX 5900 (or whatever) starts leaking out. Or when we get that Anandtech deep dive article that was suggested in the launch review ;)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
GPU-Z now has a "FAKE" indicator, when it can detect that the card is a fake.

Then I hope they do a better job by the time Intel dGPUs arrive. They are usually wrong on the ROP/TMU information for Intel GPUs. The HD 620 is shown as having 24 TMUs when it has half that: 12.

I even told them on their forum. They corrected the information for a while, but then reverted to the wrong numbers.

The CPU-Z guys are good; GPU-Z, not so much. I think they just gather information and enter it into their database.
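A hypothetical sketch of the database-lookup approach described above: specs keyed by PCI device ID and typed in by hand, so a wrong entry just sits there until someone edits it. The table contents are illustrative, not GPU-Z's real data.

```python
# Hypothetical spec database keyed by PCI device ID (0x5916 is a common
# HD Graphics 620 device ID). The TMU value mirrors the wrong entry the
# post describes; nothing here queries the actual hardware.
SPEC_DB = {
    0x5916: {"name": "HD Graphics 620", "tmus": 24},  # should be 12
}

def tmus_for(device_id):
    """Return whatever was hand-entered for this device ID."""
    return SPEC_DB[device_id]["tmus"]

# The lookup faithfully reports the bad entry until the database is fixed:
print(tmus_for(0x5916))  # 24
```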
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The CPU-Z guys are good; GPU-Z, not so much. I think they just gather information and enter it into their database.

They are literally the same guys. I am betting that when they query the Intel GPU, it's actually returning a doubled number for some reason.
 
Feb 4, 2009
35,862
17,407
136
What they do is take an ancient GPU (Fermi rides again!), solder it to a board with some RAM, slap on a modern-looking cooler, and doctor the BIOS so that it shows up in drivers and in Windows as some "modern" GPU model. [...]

Wow, that explains those "I bought xxx card from Wish or wherever and here is what I got" stories.
Totally makes sense too.
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
Assuming the rumor mill is to be trusted, we should learn about the 1650/1660 Super cards tomorrow. That would be good, because I suspect part of the delay in AMD releasing the retail 5500 is that they're waiting to see what price Nvidia sets for their cards.

I really wonder what the delay is in AMD releasing the 5500. It just seems like they've been outmaneuvered yet again. If they had released the 5500 a month ago they would have absolutely owned the low-end GPU market, but their delay has allowed Nvidia to release a credible response. Of course, this assumes the Supers are equivalent to the 5500 & 5500 XT, something we won't know until AMD actually releases their cards.

One thing is for sure: the 5500 absolutely inspired Nvidia to up its game for the 1650. The 1650 Super, assuming rumors are true, is going from 896 to 1280 CUDA cores, from 56 to 80 texture units, and from GDDR5 to GDDR6. That is no small change. The 1660 Super, by comparison, is just getting an upgrade from GDDR5 to GDDR6. Apparently the 1650 Super will be using the TU116 die versus the TU117. I wonder if Nvidia will keep the 1650 around under the same name, release it as a 1640 or something, or maybe just release it in China or India? Maybe in 3 or 4 years it will reappear as bootleg cards from China, like we talked about a few posts ago.
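Just to put numbers on the rumored 1650 to 1650 Super jump quoted above (the spec figures are the rumor's, not confirmed):

```python
# Rumored 1650 -> 1650 Super changes, as quoted in the post above.
cuda_old, cuda_new = 896, 1280
tmu_old, tmu_new = 56, 80

cuda_uplift = 100 * (cuda_new - cuda_old) / cuda_old
tmu_uplift = 100 * (tmu_new - tmu_old) / tmu_old
print(f"CUDA cores:    +{cuda_uplift:.0f}%")  # about +43%
print(f"Texture units: +{tmu_uplift:.0f}%")   # about +43%
```

Both units scale by the same ~43%, which fits the rumored move from a cut-down TU117 to a larger TU116 die.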
 
  • Like
Reactions: GodisanAtheist

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
I really wonder what the delay is in AMD releasing the 5500. It just seems like they've been outmaneuvered yet again. [...]
I suspect they aren't releasing the 5500 because they don't have sufficient chips: there's a lot of competition for 7nm capacity, which makes it hard to get large quantities unless you are willing to pay over the odds. Not ideal for a high-volume mainstream chip.
 
  • Like
Reactions: Mopetar

Mopetar

Diamond Member
Jan 31, 2011
8,526
7,786
136
I suspect they aren't releasing the 5500 because they don't have sufficient chips: there's a lot of competition for 7nm capacity, which makes it hard to get large quantities unless you are willing to pay over the odds. Not ideal for a high-volume mainstream chip.

I assume the other part of this is that AMD is focusing on getting their cards into more OEM products, which also means more of what they do produce needs to go toward supplying their partners before the consumer market gets anything. So even if they were producing a high volume of chips, they'd have more that are already spoken for than in their recent launches.

Some of it may also just be a matter of finding a good time for a product launch. It's not uncommon to wait until right around Black Friday and the start of the holiday season to take advantage of the increased consumer spending. If they're looking to release a value-oriented product, it makes a good deal of sense to hold off until then. That can also be aimed at generating good press or buzz for investors: holding back to ensure strong sales during a holiday period can go a long way toward driving investor confidence (and stock prices) if you can post impressive launch figures. Some of that trickles out to consumers, and there's a natural tendency to get behind a "winning" product among people who don't have any brand loyalty or aren't particularly savvy.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
8,313
3,177
146
Apparently Big Navi may be coming pretty soon - but hey, the source was WCCFtech.
 
Mar 11, 2004
23,444
5,852
146
Had to check what the new rumor was. Apparently it was submitted for some certification in Korea, which generally indicates it's 3-6 months away. That would make quite a bit of sense, as I'd guess it's a Radeon VII replacement, so an announcement at CES and a release later would actually give AMD a surprisingly consistent update cadence for a change (perhaps that was what Radeon VII was really about: giving the complete team a sort of product prep run, from engineering to production, marketing, etc.).

I'm hoping we see the three bigger chips that are rumored, with a couple at the 80-100 CU level (with four unique cards) and featuring HBM2. They'd be targeting the enterprise/pro markets more (CAD/render, game-streaming stuff like Stadia and Microsoft's), but perhaps they'd throw enthusiasts a bone, like with the Radeon VII, and sell versions for a reasonable price ($750-1000: say 80 CU/18GB/800-900GB/s/$750, 84 CU/24GB/900GB/s/$825, 90 CU/24GB/960-1000GB/s/$900, 96-100 CU/32GB/1.2TB/s/$1000).

I'm personally hoping the third chip is the same one used in the next Xbox. As in the exact same one, just with the Xbox running lower clock speeds (25-33% lower) in order to fit into console power limits (putting it closer to 15 TF). Something like 72 CUs, with a 384-bit GDDR6 bus for 640-750GB/s, and 16GB. The console version would have lower-clocked memory (basically doubling up on the One X's memory bandwidth, to 640GB/s); the discrete card version would use newer 16Gbps chips. It'd give developers the final GPU to prep for the system launch, and it'd help AMD on the PC side: having the exact same GPU would make console ports easy, and Microsoft could offer those just on the Windows Store, so it should help them some as well.

They could sell it for basically the same price as the Xbox itself (which would make the Xbox seem good value), while the higher clock/memory speeds would make it a bit better (compensating some for the overall difference, like the PC using DDR4 as system memory). I'm guessing the Xbox will be $500 or maybe $600. That'd leave room for a cut-down 64 CU version at $450-500, and then room for an update to the 5700 line in the $300-400 range. It also would play somewhat nice with Sony if the PS5 is 48-56 CUs starting at $400: even though I wouldn't expect a dGPU version of their chip (so they don't have to worry about positioning it in that market), it would slot the PS5's GPU power into AMD's overall GPU tiers, making it seem like OK value in the overall market.
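A quick sanity check on the numbers in that speculation, using the standard FP32 throughput formula (2 FLOPs per shader per clock) and the GDDR bandwidth formula. The CU counts, clocks, and memory speeds plugged in are the post's guesses, not confirmed specs:

```python
def fp32_tflops(cus, clock_ghz, shaders_per_cu=64):
    """Peak FP32 TFLOPS: shaders * 2 FLOPs/clock (fused multiply-add) * clock."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Memory bandwidth in GB/s: bus width in bytes * per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

# 72 CUs at a console-ish ~1.6 GHz vs. a higher-clocked discrete card:
print(f"{fp32_tflops(72, 1.6):.1f} TFLOPS")  # 14.7
print(f"{fp32_tflops(72, 2.0):.1f} TFLOPS")  # 18.4
# 384-bit GDDR6 at 14 and 16 Gbps per pin:
print(f"{bandwidth_gbs(384, 14):.0f} GB/s")  # 672
print(f"{bandwidth_gbs(384, 16):.0f} GB/s")  # 768
```

So 72 CUs at console clocks does land near the 15 TF figure, and a 384-bit GDDR6 bus covers the 640-750GB/s range the post mentions.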
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
I have seen everything from 6 months to a year after RRA certification; Vega was ~1 year, for example.
 

Mopetar

Diamond Member
Jan 31, 2011
8,526
7,786
136
Apparently it was submitted for some certification in Korea, which generally indicates it's 3-6 months away. That would make quite a bit of sense, as I'd guess it's a Radeon VII replacement [...]

As far as performance goes, the 5700 XT is already a Radeon VII replacement: same performance for $300 less, with much lower power consumption on top of it.

Radeon VII was about having a 7nm GPU to announce alongside their 7nm CPUs because Navi wasn't ready to debut yet, for whatever reason. They took some of their professional MI cards and rebranded a small number of them for the consumer market.
 

killster1

Banned
Mar 15, 2007
6,205
475
126
When I bought my most recent monitor I refused to pay the G-Sync tax but I specifically looked for a Freesync monitor. If I get an Nvidia card in the near future and the adaptive sync function doesn't work with it, oh well, I will not shed a tear. But while I have an AMD card I'm enjoying it.
The new Nvidia driver works very well with my 65" NU8000 at 1440p @ 120Hz. Excited for 4K @ 120Hz. One day, maybe in 2021, they will have good gfx cards :p
 

lifeblood

Senior member
Oct 17, 2001
999
88
91
The new Nvidia driver works very well with my 65" NU8000 at 1440p @ 120Hz. Excited for 4K @ 120Hz. One day, maybe in 2021, they will have good gfx cards :p
Unless AMD puts something better on the table at the $200-$250 price point, I'll be getting a GTX 1660 Super come Black Friday. My secret hope is that G-Sync just magically works for me. Of course, I'm also hoping an RX 5700 magically goes on sale for ~$250, even if it has the reference closed cooler. I'm not betting on it, but hope springs eternal.
 