Question 'Ampere'/Next-gen gaming uarch speculation thread


Ottonomous

Senior member
May 15, 2014
How much of a gain is the Samsung 7nm EUV process expected to provide?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping Turing around at cheaper prices while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 4K90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent/uncalled for, just interested in the forum members' thoughts.
 

CP5670

Diamond Member
Jun 24, 2004
The current modular PSUs typically have 1x8-pin at the PSU going to 2x8-pin PCIe at the video card. This Nvidia cable has 2x8-pin at the PSU, suggesting it's comparable to 4x8-pin PCIe. Some PSUs even support the latter, but not together with 2x8-pin for the CPU at the same time.
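
As a rough sanity check: the per-connector ratings below are straight from the PCIe spec, and the rest just follows from the cable layout described above - a quick back-of-envelope sketch, not anything confirmed about the actual card.

```python
# Back-of-envelope power budget using PCIe CEM spec ratings (not measured figures).
PCIE_8PIN_W = 150   # spec rating per 8-pin PCIe connector
PCIE_SLOT_W = 75    # power available through the PCIe slot itself

# A typical modular cable: one PSU-side plug feeding 2x 8-pin at the card,
# i.e. that single PSU-side connector and its wiring carry ~300 W.
psu_side_plug_w = 2 * PCIE_8PIN_W

# The rumored Nvidia cable uses two such PSU-side plugs,
# so it is wired like 4x 8-pin PCIe on the card side.
nvidia_cable_w = 2 * psu_side_plug_w

print(f"One PSU-side plug   : ~{psu_side_plug_w} W")                      # ~300 W
print(f"Nvidia 12-pin cable : ~{nvidia_cable_w} W")                       # ~600 W
print(f"Plus slot power     : ~{nvidia_cable_w + PCIE_SLOT_W} W ceiling") # ~675 W
```

Whether the card actually pulls anywhere near that ceiling is a separate question, of course.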
 

Karnak

Senior member
Jan 5, 2017
It's not what you think. If it was, the 8-pin ends would be female. As can be seen in the product photos, they're male, meaning it's meant to be plugged directly into the PSU.
I was talking about an adapter for existing PSUs, not that one from Seasonic, which is a separate cable and not an adapter. Or let's call it an extender rather than an adapter, which I think will be included.
 

JasonLD

Senior member
Aug 22, 2017
Well, it would suck if you're after an FE card and you don't have a modular PSU.

At this point, personally, I would only be interested in an FE card depending on the outcome of the cooling solution. Other than that, I wouldn't really want to spend $100 extra for the FE tax.

It would be hard to imagine people who would be shopping for this card wouldn't have a modular PSU.
 

FaaR

Golden Member
Dec 28, 2007
They did the BIG Huge Ampere to satisfy his ego. Depending on the outcome he can still at least claim his is bigger....
The Founder's Edition boxes even contain a little black leather jacket to dress up your graphics card in!

With this connector you just plug it in, and then you only need to worry about the wattage of the PSU.
What you say makes a lot of sense; the current system is a bit of a mess. Like, who enjoys fiddling with those finicky 6+2-pin GPU power connectors and trying to insert two separate connectors at the same time? Ugh. (Also: writing that sentence without it sounding too sexual in nature... lol)

That said, it should be a standardized thing, not an Nvidia exclusive, because we've had enough proprietary Nvidia crap by now. I still remember Huang whining about the proprietary nature of Glide back when 3dfx was still the big dog in the 3D graphics card market; the taste of NV hypocrisy is beyond bitter at this point.
 

maddie

Diamond Member
Jul 18, 2010
The current modular PSUs typically have 1x8-pin at the PSU going to 2x8-pin PCIe at the video card. This Nvidia cable has 2x8-pin at the PSU, suggesting it's comparable to 4x8-pin PCIe. Some PSUs even support the latter, but not together with 2x8-pin for the CPU at the same time.
Yep, this 12-pin connector is going to be delivering more than 2x 8-pin PCIe. Let's hope it isn't close to the max possible from 4x 8-pin PCIe, but the massive cooler is an indicator of serious power consumption.
 

FaaR

Golden Member
Dec 28, 2007
the massive cooler is an indicator of serious power consumption.
I've been wondering about that myself: whether the massive cooler really is there to deal with monster power consumption, or instead meant to handle a perhaps somewhat-higher-than-usual high-end power draw at or near whisper-quiet noise levels.

It's an interesting situation, I can think of a bunch of reasons to argue for either case...
 

maddie

Diamond Member
Jul 18, 2010
I've been wondering about that myself: whether the massive cooler really is there to deal with monster power consumption, or instead meant to handle a perhaps somewhat-higher-than-usual high-end power draw at or near whisper-quiet noise levels.

It's an interesting situation, I can think of a bunch of reasons to argue for either case...
By itself, yes, the argument for low noise can be made, but it's not as strong once you realize that the 12-pin connector can deliver up to 600W.

Maybe it's an attempt to raise the oil price. :)
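
For what it's worth, the 600W figure falls out of the pin count if you assume Micro-Fit-class terminals rated around 8.5 A each - that per-pin rating is my assumption, not a confirmed spec for this connector.

```python
# Rough ceiling for a 12-pin connector: 6 pins carry +12V, 6 are grounds.
# The ~8.5 A per-terminal figure is an assumption based on typical
# Micro-Fit-class contacts, not a confirmed Nvidia spec.
POWER_PINS = 6
VOLTS = 12.0
AMPS_PER_PIN = 8.5

max_watts = POWER_PINS * VOLTS * AMPS_PER_PIN
print(f"Theoretical ceiling: ~{max_watts:.0f} W")  # ~612 W
```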
 

GodisanAtheist

Diamond Member
Nov 16, 2006
It would be hard to imagine people who would be shopping for this card wouldn't have a modular PSU.

- Modular PSU cables are not interchangeable, however, as different PSUs can have different pin-outs on the PSU side of things. That would potentially mean you would need different modular cable adapters for all the different PSU OEMs.

Just an example, but you cannot take the cables for, say, a Seasonic PSU and plug them into a Corsair or EVGA PSU. Hell, you can't even take a cable designed for one unit from a manufacturer and use it on another unit from the same manufacturer with any guarantee that you won't end up with a dead component at best or an electrical fire at worst.

I'm almost more curious to see how the logistics of this are going to work than I am about the cables themselves.

All that said, I would love a new standard that brought down the number of connectors/cables to the motherboard and GPU and shrunk the connectors as well. I disassembled and reassembled my rig today for cleaning, and while it's old hat for me by now, I still get a twinge of frustration having to drag that 4+4-pin CPU power cable over my RAM and cooler, across the motherboard, and plug in the connectors blind because there is just no room for my hand and eyes to work together on that one...
 

Stuka87

Diamond Member
Dec 10, 2010
All that said, I would love a new standard that brought down the number of connectors/cables to the motherboard and GPU and shrunk the connectors as well. I disassembled and reassembled my rig today for cleaning, and while it's old hat for me by now, I still get a twinge of frustration having to drag that 4+4-pin CPU power cable over my RAM and cooler, across the motherboard, and plug in the connectors blind because there is just no room for my hand and eyes to work together on that one...

That's basically what 12VO does. ONE connector to the motherboard, one at the PSU, and the power supply only outputs 12V DC - no 3.3V or 5V. The motherboard handles that conversion for things like USB and such.
 

Veradun

Senior member
Jul 29, 2016
Well, it would suck if you're after an FE card and you don't have a modular PSU.

At this point, personally, I would only be interested in an FE card depending on the outcome of the cooling solution. Other than that, I wouldn't really want to spend $100 extra for the FE tax.
I would expect them to bundle a 2x8-pin to 1x12-pin adapter with the FE. Not doing so would be foolish.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
After reading this thread last week, then spending the weekend fishing and thinking about many things, I came back to this thread to ask:

Why? Why is nvidia doing this? Why now?

I've been encouraged that some others asked that question too.

Typically these mammoths age terribly. All this power consumption puts a lot of load on a lot of components, and how many of these will die when PSUs that are a couple of years old get hit with an actual load nearing what they're spec'd for (man, a top-of-the-line Intel CPU and a 3090 could legit draw 1kW?) and lose the magic smoke? These will suck to have in any kind of smallish room if they're hitting their power numbers in gaming loads; a 290X could heat my theater room up with the door closed during a gaming session, and this is way past that... dorm rooms, offices, etc. are going to feel it.
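
To put some very rough numbers on that 1kW worry (every figure below is a ballpark assumption on my part, not a measurement):

```python
# Ballpark DC load for a high-end build; all figures are rough assumptions.
cpu_w  = 250   # top Intel desktop chip running at its boost power limit
gpu_w  = 350   # rumored 3090-class board power
rest_w = 100   # motherboard, RAM, drives, fans, USB devices

dc_load_w  = cpu_w + gpu_w + rest_w
efficiency = 0.90  # typical Gold-class efficiency at this load

wall_draw_w = dc_load_w / efficiency
print(f"Sustained DC load : ~{dc_load_w} W")        # ~700 W
print(f"Draw at the wall  : ~{wall_draw_w:.0f} W")  # ~780 W, before transient spikes
```

Not quite a kilowatt sustained, but close enough that transient spikes and an aging PSU's derating could make it uncomfortable.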

This seems like AMD's cards from the past, delivered by Nvidia in 2020. Fury X, Vega 64: solid architectures in their own right, but clocked and volted to high heaven and paired with exotic memory... they haven't aged very well at all. The Fury and Vega 56 were likely the better long-term values, if there really were any. But we know at least some of "why" AMD did that - to take a shot at the halo.

It seems pretty obvious that the 3080 is going to be a halo product by itself and is supremely likely to hold the single card performance crown AND still chew up north of 300W.

I will readily admit that I am strongly averse to spending more than ~$500, because I feel that is where the value prop changes so drastically, and I only "need" 2K performance.

Also, because I expect ~$500 is going to net me an entire PS5. Maybe $600?

I have really respected Nvidia's right to make money on their efficient, well-mannered, and solid video cards. But this???

My mind is boggled.
 

Glo.

Diamond Member
Apr 25, 2015
It seems pretty obvious that the 3080 is going to be a halo product by itself and is supremely likely to hold the single card performance crown AND still chew up north of 300W.
Yes, and no.

No. It won't hold the performance crown.
Yes. It will eat 300W of power for breakfast.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
Yes, and no.

No. It won't hold the performance crown.
Yes. It will eat 300W of power for breakfast.

Really? Is the 300W AMD card really going to be more efficient?

@Stuka87

I get that - I meant that a 3080 seems likely to be powerful enough to be extremely competitive as the best gaming card you can buy at launch.

A 3080 Ti released in four to six months, as needed, seems like it would make more sense than a $1,600, special-power-connector-required hail-mary product that shows up *before* it's obvious it's needed (a connector that can't be fed by 2x 8-pin power cables, so you need a new PSU too - which I know isn't that big of a deal for the hundreds of people willing to go to the ends of the earth to get this to work).

I mean, it will exist, but how many times are we going to hear "Well, you could get a Big Navi or a 3080 Double Black Ultra Edition with an AIO, *or* you can put your big boy pants on and spend 2x the money on a 3090"? That seems unlikely in this forum.

If you have to define an entirely new price point, require specialized PSU support, and deal with exhausting an unprecedented amount of heat, have you really taken the "performance" halo? Or did you just define a new performance category (super double crazy) that the competition didn't realize existed before?

I won't dip into car analogies (ugh) but plenty come to mind in this case.

Finally, is this targeted at the 200Hz 4K full-ultra-settings crowd? Does that crowd exist and also have money to burn? I'm not kidding when I say this segment seems like it's hundreds of people large.
 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
What 300W AMD card? ;)

There is none in the upcoming AMD lineup.

Well, to be fair, AMD can always choose to crank the heat and clocks up and make it a burning furnace, just like Nvidia did, but at the moment there are no plans to exceed 275W TBP.

Fair enough. I'll try again :D

The ~275W RDNA 2 GPU is really going to be convincingly faster than the 300W "ish" 3080?

I mean, that sounds great, but based on most of the competitive history of the two companies it sounds like extremely wishful thinking.

Then again, as I read more about the 3090 I couldn't believe what I was seeing either so... yeah. Can't wait to see what happens over the next few weeks in GPU land.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Fair enough. I'll try again :D

The ~275W RDNA 2 GPU is really going to be convincingly faster than the 300W "ish" 3080?

I mean, that sounds great, but based on most of the competitive history of the two companies it sounds like extremely wishful thinking.

Then again, as I read more about the 3090 I couldn't believe what I was seeing either so... yeah. Can't wait to see what happens over the next few weeks in GPU land.
Uhh... why would it not be faster than the RTX 3080?

Let me ask you, and everybody here, one simple question.

If ALL of the Nvidia rumors turned out to be true, why is it so hard to believe that, for once, AMD has a better GPU architecture and better products than their direct competitor?

Why would the AMD rumors be incorrect, if ALL of the Nvidia leaks and rumors turned out to be true?
 

blckgrffn

Diamond Member
May 1, 2003
9,301
3,442
136
www.teamjuchems.com
Uhh... why would it not be faster than the RTX 3080?

Let me ask you, and everybody here, one simple question.

If ALL of the Nvidia rumors turned out to be true, why is it so hard to believe that, for once, AMD has a better GPU architecture and better products than their direct competitor?

Why would the AMD rumors be incorrect, if ALL of the Nvidia leaks and rumors turned out to be true?

Well, AMD is getting what, a half-node push, at best? Nvidia is getting a substantial (but maybe not so great?) full-step improvement...

Comparing the 2070 Super with the 5700 XT was largely a push, and there were still the 2080 and 2080 Ti above that...

I guess I am just dubious that a 3080 eating ~85 more watts than a 2080 is going to be slower than an evolutionary touch-up on RDNA. A straight die shrink of the 2080 Ti would seem more prudent to profit from.

Again, if that turns out to be the case, great. RDNA v1 seemed really raw (like a 'launch now to get dev kits for consoles into existence' type of product), and maybe there is that much efficiency left to be unlocked. That'd be cool.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Well, AMD is getting what, a half-node push, at best? Nvidia is getting a substantial (but maybe not so great?) full-step improvement...

Comparing the 2070 Super with the 5700 XT was largely a push, and there were still the 2080 and 2080 Ti above that...

I guess I am just dubious that a 3080 eating ~85 more watts than a 2080 is going to be slower than an evolutionary touch-up on RDNA. A straight die shrink of the 2080 Ti would seem more prudent to profit from.

Again, if that turns out to be the case, great. RDNA v1 seemed really raw (like a 'launch now to get dev kits for consoles into existence' type of product), and maybe there is that much efficiency left to be unlocked. That'd be cool.
RDNA2 is a Maxwell-like step for AMD.


P.S. Have you paid attention to the Xbox Series X's performance? It has 52 CUs clocked at 1.8 GHz that consume between 130 and 140W of power.

Scale that up to a full dGPU and you get 60 CUs at 2.2 GHz using around 225W of power for the whole board.
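
A naive scaling of those numbers gets you to the same ballpark, assuming power grows roughly linearly with CU count and clock at constant voltage (optimistic), plus a rough allowance for GDDR6 and VRM losses - the 135W midpoint and the 35W overhead are my assumptions:

```python
# Naive scaling from the Xbox Series X GPU figures quoted above.
# Assumes power scales linearly with CU count and clock (i.e. constant
# voltage), which is optimistic; higher clocks usually cost extra voltage.
xsx_cus, xsx_clock_ghz, xsx_gpu_w = 52, 1.8, 135   # midpoint of the 130-140W estimate

dgpu_cus, dgpu_clock_ghz = 60, 2.2
core_w = xsx_gpu_w * (dgpu_cus / xsx_cus) * (dgpu_clock_ghz / xsx_clock_ghz)

board_overhead_w = 35   # rough allowance for GDDR6, VRM losses, fan
print(f"Scaled core power : ~{core_w:.0f} W")                    # ~190 W
print(f"Whole-board guess : ~{core_w + board_overhead_w:.0f} W") # ~225 W
```

Constant-voltage scaling is generous, so treat that core number as more of a floor than a ceiling.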