
Question 'Ampere'/Next-gen gaming uarch speculation thread


VirtualLarry

No Lifer
Aug 25, 2001
51,360
6,589
126
Some quick searches show that it isn't more efficient at mining than a 5700, so miners would get those before scooping up Ampere cards.
To butt into this thread unceremoniously, that's exactly what I think is happening. I went looking on Newegg just tonight at their RX 5700/5700 XT cards. Pretty much ALL OF THEM, that weren't being scalped for something obviously well above MSRP, were SOLD OUT.

Ebay listings for BIN for popular RX 5700 XT cards, NEW, were around $700-800. Just like the good ol' mining heydays... :(

I snagged an XFX DD RX 5700 non-XT, for the mind-bendingly exorbitant price of $450. More than an RTX 3060 Ti. IF ONLY those were in stock and available, I wouldn't be reduced to that. But still, I should be able to make $45/mo using that card, at current earning rates, after electricity costs. It should therefore pay for itself after 10-12 months. Free fancy expensive PC hardware; hard to say NO to that. (Other than having the initial investment up-front, and a place to run the cards with adequate cooling and power.) I'm just a "hobby miner" though, I don't actually have a real "farm". Just a few gaming PCs, and one purpose-built mining rig in a Rosewill mining server chassis, which worked out surprisingly well.
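For anyone curious about the back-of-the-envelope math, here's a rough sketch. The $450 price and $45/mo net earnings are just my numbers, and the flat earning rate is a big assumption, since mining income swings constantly:

```python
# Rough GPU-mining payback estimate. All figures here are assumptions:
# card price, net monthly earnings (after electricity), and a flat
# earning rate, which real mining income never is.
def payback_months(card_price: float, net_monthly: float) -> float:
    """Months until cumulative net earnings cover the card's cost."""
    if net_monthly <= 0:
        raise ValueError("card never pays for itself")
    return card_price / net_monthly

months = payback_months(card_price=450.0, net_monthly=45.0)
print(f"Payback in about {months:.0f} months")  # ~10 months
```

If earnings dip, the payback window stretches accordingly, which is why my 10-12 month estimate has some slack built in.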

If newest-gen GPUs start being in stock at non-scalper prices regularly, I may sell off some of my current fleet of cards and buy some newer ones. In truth, perhaps, I should be selling off my fleet now, while the market is hot. But I have fun mining; it gives my meager life some purpose, and it gives me grocery money.
 

guidryp

Golden Member
Apr 3, 2006
1,317
1,443
136
The tables have really turned on this. GCN architecture was very compute-oriented, while Pascal, Turing and previous NVidia architectures were very gaming-oriented.

Thus for the same gaming performance AMD usually delivered significantly more compute performance, or, from the other perspective, for the same compute performance it delivered significantly less gaming performance.

RDNA shifted AMD's balance significantly toward gaming, such that they were balanced more similarly to NVidia.

But with Ampere, NVidia essentially doubled up on FP32 compute units, shifting the balance strongly in favor of compute performance.

Now we are in the opposite situation of previous generations.
 

Ajay

Diamond Member
Jan 8, 2001
8,958
3,638
136
The RTX 3060 Ti cards look like a real nice upgrade for me. Have to see how the AMD 6700XT performs to make a choice. I probably won't upgrade till next Christmas, we need more practical gifts this year :-(
 

Mopetar

Diamond Member
Jan 31, 2011
5,968
2,787
136
The RTX 3060 Ti cards look like a real nice upgrade for me. Have to see how the AMD 6700XT performs to make a choice. I probably won't upgrade till next Christmas, we need more practical gifts this year :-(
I wouldn't worry too much. By then the supply issues should be resolved so you can easily pick one up and I suspect the increased competition will drive prices down as well.

I've been itching to upgrade myself but can't get any of the new cards and I refuse to buy from scalpers on principle so I'm not likely to get anything for Christmas this year either.

Kind of a bummer but not going to kill me either. I've been stuck on what kind of monitor to get as an upgrade and kind of want to just jump all the way to 4K, but the 3060 Ti is such a solid 1440p card that it makes that all the more tempting.
 

Ajay

Diamond Member
Jan 8, 2001
8,958
3,638
136
Kind of a bummer but not going to kill me either. I've been stuck on what kind of monitor to get as an upgrade and kind of want to just jump all the way to 4K, but the 3060 Ti is such a solid 1440p card that it makes that all the more tempting.
Yeah, I'd love a curved ultra-wide monitor, but then I'd need an even more powerful GFX card - so the costs just escalate too much for our budget at this time. As I have a 1440p monitor, a 3060 Ti would give me great min frame rates (I'm stuck at 60Hz). Even though I've been an Nvidia fan for the past 15 years, AMD has really come out strong this year with their 6xxx line. And, yeah, I don't expect any problems with supply or pricing by the time I buy - I'm always about a year behind when I buy GFX cards for whatever reason :astonished:.
 

tajoh111

Senior member
Mar 28, 2005
228
218
116
Not really. It's still based on a claim from an analyst. There's still no hard evidence that Nvidia sold any product to miners specifically.

It's no different than wccftech regurgitating the same rumor that popped up somewhere on Twitter or Reddit. Repeating something that isn't true enough times doesn't make it any more true. But it will get clicks because it's sensationalism and likely to draw outrage.

Still no evidence of any direct sales to miners or a majority of the cards actually sold through retail channels going to miners. Some quick searches show that it isn't more efficient at mining than a 5700, so miners would get those before scooping up Ampere cards.
The funny thing is the timing of this mining story which benefits AMD by taking the heat off of them.

It literally came out when it was clear AMD had botched their launch. When AMD's limited availability turned out worse than Nvidia's, after boasting it would be better, you had youtubers who were starting to turn on AMD because AIB MSRPs were worse than Nvidia's and cards weren't being sold in physical stores. Now this mining story comes up, and these youtubers and sensationalist news outlets jump on it like flies on poop, forgetting AMD's botched launch.

This story could have a less dramatic explanation:

https://www.tomshardware.com/news/upset-investors-accuse-nvidia-of-masking-dollar1-billion-in-mining-gpu-revenue-as-gaming

Considering Nvidia is currently being sued by some of their investors for not accurately disclosing the size of the mining market and thus inflating gaming sales, it would be far more transparent to investors to enumerate this figure. Should Nvidia continue to mask this figure and risk getting sued again?
 

tajoh111

Senior member
Mar 28, 2005
228
218
116
That story is about the 2017 mining bubble...
I know. Hence Nvidia would want to learn from their mistakes and actually try to enumerate how much mining cards represent of their gaming revenue.

I.e., not get sued again by saying $175 million and acknowledging miners are part of their gaming revenue.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Yeah, I'd love a curved ultra-wide monitor, but then I'd need an even more powerful GFX card - so the costs just escalate too much for our budget at this time. As I have a 1440p monitor, a 3060 Ti would give me great min frame rates (I'm stuck at 60Hz). Even though I've been an Nvidia fan for the past 15 years, AMD has really come out strong this year with their 6xxx line. And, yeah, I don't expect any problems with supply or pricing by the time I buy - I'm always about a year behind when I buy GFX cards for whatever reason :astonished:.
You shouldn't worry about ultra-wide, assuming you're talking about 3440x1440 and not the newer higher-resolution monitors. I think that resolution is much better than 4K because you will be able to run at over 100 fps in most games, and 144Hz monitors are cheaply available. You're not going to do that with 4K. With a 3070, I'm getting 130-200 fps in Doom Eternal at ultra nightmare, around 100 in Forza Horizon 4, and 70-80 in Horizon Zero Dawn, which is CPU-bottlenecked by my potato Ryzen 1700. The only game so far that has given me trouble at max settings was Godfall; I don't remember Gears 5, Gears Tactics, Wolfenstein, Serious Sam 4, or Grid 2019 giving me any sort of trouble at max settings. Gameplay is incredibly smooth. A 3060 Ti should give you a similar experience.
 

Mopetar

Diamond Member
Jan 31, 2011
5,968
2,787
136
The funny thing is the timing of this mining story which benefits AMD by taking the heat off of them.

It literally came out when it was clear AMD had botched their launch. When AMD's limited availability turned out worse than Nvidia's, after boasting it would be better, you had youtubers who were starting to turn on AMD because AIB MSRPs were worse than Nvidia's and cards weren't being sold in physical stores. Now this mining story comes up, and these youtubers and sensationalist news outlets jump on it like flies on poop, forgetting AMD's botched launch.

This story could have a less dramatic explanation

https://www.tomshardware.com/news/upset-investors-accuse-nvidia-of-masking-dollar1-billion-in-mining-gpu-revenue-as-gaming

Considering Nvidia is currently being sued by some of their investors for not accurately mentioning the size of the mining market and thus inflating gaming sales, it would be far more transparent to investors to enumerate this figure. Should Nvidia continue to mask this figure and get sued again?
This is just adding conspiracy on top of hearsay. It has nothing to do with AMD's launch, and as @guidryp pointed out, the information you posted is from years ago. Your specific accusation once again relies on some notion that Nvidia would have intentionally sold a massive amount of cards directly to miners, such that they would know the cards didn't go to the gaming market. There's still absolutely zero proof of this or anything to substantiate it. Otherwise Nvidia shouldn't have any real clue what number of cards were purchased from retailers by gamers as opposed to miners. If they could tell that with certainty, their cards would be so loaded with spyware that I'd never want to put one in my PC regardless of performance. It's a card meant for the gaming market and sold as a gaming card. Analysts can estimate all they want, but Nvidia isn't being deceitful.
 

VirtualLarry

No Lifer
Aug 25, 2001
51,360
6,589
126
If they could tell that with certainty, their cards would be so loaded with spyware that I'd never want to put one in my PC regardless of performance.
GeForce Experience can tell, and unless you paid only with cash and didn't use a retailer rewards program, they have your name, address, CC number, phone number, and card serial number.
Same deal with Win10 "S Mode".
 

Mopetar

Diamond Member
Jan 31, 2011
5,968
2,787
136
I'm beginning to see why bitcoin is so popular. I don't appreciate merchants that leak information about my business with them to outside parties (including the manufacturer) either. If I want them to know that I've purchased their product, I'll fill out the extended warranty card or jump through the product registration hoops they have.
 

Leeea

Senior member
Apr 3, 2020
896
1,051
96
I'm beginning to see why bitcoin is so popular. I don't appreciate merchants that leak information about my business with them to outside parties (including the manufacturer) either. If I want them to know that I've purchased their product, I'll fill out the extended warranty card or jump through the product registration hoops they have.
The transaction fees* on bitcoin are crazy. Last time I messed with it, Ethereum was just as widely accepted and more practical. But it has been a few years.

*https://ycharts.com/indicators/bitcoin_average_transaction_fee
$7.57 per transaction on bitcoin :(

I quit using cryptocurrencies because the bank/credit card is superior. No privacy, but that is the point.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
3,199
1,721
136
The transaction fees* on bitcoin are crazy. Last time I messed with it, Ethereum was just as widely accepted and more practical. But it has been a few years.

*https://ycharts.com/indicators/bitcoin_average_transaction_fee
$7.57 per transaction on bitcoin :(

I quit using cryptocurrencies because the bank/credit card is superior. No privacy, but that is the point.
- Sounds like $7.57 is the cost of performing a transaction with discretion...
 

beginner99

Diamond Member
Jun 2, 2009
4,821
1,210
136
I quit using cryptocurrencies because the bank/credit card is superior. No privacy, but that is the point.
- Sounds like $7.57 is the cost of performing a transaction with discretion...
Bitcoin and Ethereum are not private. In fact, they are very public. Once you know someone's address, you can see all their transactions, including amounts. So it will quickly get annoying having to create new addresses every month? Every week?

There are tools out there (used by the FBI and co.) that can easily track the flow of bitcoin, even through "scramblers". The state / espionage apparatus will be able to track you just as easily as with classical finance. Hell, traditional finance has such complex structures that it might even be harder (or surely slower) to trace the money there, since it's not easily available to everyone.

For privacy you would need Monero, but I haven't actually looked into it much, especially how they deal with the in/out situation. You probably need to create a new address for each in/out transaction, because otherwise it's easy to know from where it came / where it goes.

Privacy has its costs...
Privacy doesn't exist if you are on the internet, especially not when posting on forums. I'm sure Google by now has AI that analyzes the content of forum posts and associates it with real-life users. Want privacy? Don't go on the internet.

Of course there are levels, and posting a ton of stuff on Facebook etc. probably isn't smart, but it's like with protecting the environment: less is more. The only way to have privacy is to use the internet less. The only way to save the environment is to consume less. Anything else is just a drop in the bucket and marketing to sell you something new.
 

dr1337

Member
May 25, 2020
148
231
76
lmao yikes, sounds like Nvidia really needs to give their marketing a PR and an HR refresher. Directly trying to sway reviewers to push a certain narrative is pretty unethical; things might get pretty interesting if this blows up
 

coercitiv

Diamond Member
Jan 24, 2014
4,446
5,970
136
This shouldn't happen, the fact that anyone at any level at Nvidia did this is unacceptable.
Hubris overload, right when Cyberpunk launches with punishing RT workloads for previous gen RTX hardware and right before youtubers get to have their say on RT and DLSS benefits in this title. Nvidia literally set themselves up, this will send ripples through the entire community:

GamersNexus said:
I have something else to say about NVIDIA's latest decision to shoot both its feet: They've now made it so that any reviewers covering RT will become subject to scrutiny from untrusting viewers who will suspect subversion by the company. Shortsighted self-own from NVIDIA.
GamersNexus said:
Won't be the last you hear of this. We'll let HUB start since it's their story, but they have our full support. Implying the demand for a "change" in "editorial direction" is way over the line. This makes NVIDIA look weak. This isn't necessary if the product is good.
Both HUB and GamersNexus were warming up to DLSS and embracing it where available in newer titles. RT would have been next as more and more titles are added. Now we're back to square one, every flickering pixel will be marked and counted.

Oh well, at least we get drama this Christmas.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
3,199
1,721
136
What's especially bizarre about that move is that it's not like NV is performing poorly in rasterized workloads either; they're extremely competitive, with the value add of having better RT performance and DLSS where the competition has nothing.

I mean, I'd *kinda* get it if AMD was 20% ahead in Raster while NV was 20% ahead in RT... but that's not even the case.

And in the end, everyone gets shot in the foot. NV looks like a bully, and people will think GamersNexus will be biased toward AMD because of the bad blood now created by this move.
 

Panino Manino

Senior member
Jan 28, 2017
428
518
136
Not only that, HUB has been doing separate, dedicated RT and DLSS videos for some time now. It's not that they neglected these Nvidia features; it's that they want to give them even more focus than they can in normal videos.
The accusation makes no sense.
 
