
Question 'Ampere'/Next-gen gaming uarch speculation thread

Page 91

Ottonomous

Senior member
How much is the Samsung 7nm EUV process expected to provide in terms of gains?
How will the RTX components be scaled/developed?
Any major architectural enhancements expected?
Will VRAM be bumped to 16/12/12 for the top three?
Will there be further fragmentation in the lineup? (Keeping turing at cheaper prices, while offering 'beefed up RTX' options at the top?)
Will the top card be capable of >4K60, at least 90?
Would Nvidia ever consider an HBM implementation in the gaming lineup?
Will Nvidia introduce new proprietary technologies again?

Sorry if this is imprudent or uncalled for; I'm just interested in the forum members' thoughts.
 
What I think is somewhat relevant to the pricing and wafer supply issue:

A local online shop just started a "sale" of Intel CPUs and motherboards and, more importantly, Nvidia GPUs. "Sale" in quotes because it's maybe $50-$100 off the GPU, and you will almost certainly get a better deal with an Ampere card. My point, however, is that they have a lot of stock (given the size of the target population), especially of the 2080 Ti. I see no way they will sell even half of that stock. So this shortage seems to be a US-specific problem and not really a good argument for the high prices.
 

Where did you hear there was a shortage of Turing GPUs in the US?

Intel, on the other hand, is having a shortage issue, but it's of higher-end CPUs, not low-end ones.
 
Some news from Igor regarding the memory/temperatures:

The memory is, even by the board partners' own admission, the real problem child, and it confronts current air coolers with serious problems. One report mentions up to 98 °C on the hottest module. At least here AMD can sleep a bit easier thanks to its wider memory interface, without having to tinker with GDDR6 on steroids (in the form of GDDR6X).
[...]
NVIDIA’s so-called “overhang feature”, i.e. the very short boards with a protruding cooler, generates a very high heat flux density on a rather small area, especially with double-sided memory assembly. This is going to be hot, really hot. Dual-slot designs, however, won’t be possible without a vapor chamber, because the up to 350 watts really have to be transported away quickly.

At least for air cooling that doesn't sound promising. For water it shouldn't be a problem though.
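Igor's point about heat flux density is simple division: the same power over a smaller board area means more watts per square centimetre for the cooler base to spread. The board dimensions below are assumptions picked for illustration, not leaked specs; only the 350 W figure comes from the quote above.

```python
# Rough sketch of why a shortened PCB raises heat flux density.
# Dimensions are illustrative assumptions, not actual card specs.

def heat_flux(power_w: float, length_cm: float, width_cm: float) -> float:
    """Average heat flux density in W/cm^2 over a rectangular board area."""
    return power_w / (length_cm * width_cm)

POWER = 350.0  # rumoured board power in watts

full_board = heat_flux(POWER, 27.0, 11.0)   # typical full-length high-end card
short_board = heat_flux(POWER, 17.0, 11.0)  # shortened PCB under a protruding cooler

print(f"full-length board: {full_board:.2f} W/cm^2")
print(f"shortened board:   {short_board:.2f} W/cm^2")
```

With these assumed dimensions the shortened board carries roughly 60% higher flux, which is why the quote points at vapor chambers: they spread heat away from the dense spot faster than a plain baseplate.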
 
I think AMD sets prices so high because they simply couldn't fulfill demand anyway
This. AMD is not a charity; they are going to make as much money as possible. This next gen uses a process node that AMD has to compete for with several bigger, richer companies, and what allocation AMD has is also being used for its CPUs and consoles. They are not going to be able to pump out huge numbers of GPUs, hence they'll be able to sell what they have at higher prices. Nvidia, if anything, is the one in a position to lower prices, since it has a lot more capacity if the rumours are true that it is using the less popular 8nm (not that I expect them to).
 
What about the GDDR6X advantage? As far as we know it's NV-only, since they have a deal with Micron.
If AMD wants more memory bandwidth, they need to push a wider bus to memory.
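The trade-off in the post above is simple arithmetic: peak bandwidth scales with both bus width and per-pin data rate, so a wider GDDR6 bus can approach a narrower GDDR6X one. The data rates and bus widths below are illustrative assumptions, not confirmed card specs.

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: pins * per-pin rate, divided by 8 bits/byte."""
    return bus_width_bits * data_rate_gbps / 8

# Illustrative configurations (rates/widths are assumptions):
print(bandwidth_gbs(256, 14.0))   # 256-bit GDDR6  @ 14 Gbps   -> 448.0 GB/s
print(bandwidth_gbs(384, 19.5))   # 384-bit GDDR6X @ 19.5 Gbps -> 936.0 GB/s
print(bandwidth_gbs(512, 14.0))   # 512-bit GDDR6  @ 14 Gbps   -> 896.0 GB/s
```

Under these assumptions, a 512-bit GDDR6 bus lands close to the 384-bit GDDR6X figure, which is exactly the "push a wider bus" option the post describes, at the cost of more memory chips and a bigger memory interface on the die.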
 
I would have bought a 5700 XT at launch if they had said in no uncertain terms there wouldn't be a more powerful AMD card for at least another year and a half. Now I'm probably going to buy a 3080 unless AMD finally gives us some information on their upcoming products, instead of leaving us once again hoping they will release a high-end card before 2021. "Year of the Gamer" my ass, Lisa.
You don't talk about your future products. Period.
 
There's always the option of going HBM, at least on the high end products.

No doubt AMD sold the Radeon VII at $700 because they'd have a hard time charging more, but if NVidia is charging $1,300 or more for their top card then I think AMD will have some room to maneuver even if they aren't the top dog in terms of performance.

If heat really is a big concern for NVidia, then the third-party cards will start to offer water solutions. I'm not sure why NVidia doesn't do it themselves, unless they're worried about the optics. They wouldn't be the first though (AMD offered the Fury with a water-cooling solution, if I recall correctly), and honestly it's not hard to sell it as a premium solution for a premium product.

Regardless of how it all shakes out, I'm just glad we'll be seeing new cards before the end of the year.
 
GA101 might have been HBM, but unless they revive it there won't be any 'gaming' cards with HBM. Would have to be on a node that's not SS8 though.

If heat really is a big concern for NVidia, then the third-party cards will start to offer water solutions. I'm not sure why NVidia doesn't do it themselves, unless they're worried about the optics. They wouldn't be the first though (AMD offered the Fury with a water-cooling solution, if I recall correctly), and honestly it's not hard to sell it as a premium solution for a premium product.

That's why they are using the double sided cooler. Can't be water if it's going to be used by OEMs.
 
So we can expect 2-sided water blocks? It would have to sandwich the card between two blocks and have the inlet on one block and the outlet on the other block with a flow connector between the blocks, right? If that's the case, I'd imagine water blocks to be more expensive as well.
 
That's why they are using the double sided cooler. Can't be water if it's going to be used by OEMs.

I don't necessarily agree. Look at an OEM like Alienware today: the FE Nvidia cards are dual-fan coolers, but Alienware puts cheap blower cards in their new PCs, not FEs. So Nvidia could certainly do an AIO FE and OEMs could stick with their generic blowers.

Fury X and Vega 64 Liquid (I own both) are fantastic AIO designs. I would LOVE it if Nvidia followed suit.
 
😀 Nope. There is nothing like a "sandwich". There is no heatsink on the backside; all you can see there is just a backplate.
 
These things are just backplates; you can see the PCB just below them.
[Image: nvidia-geforce-rtx-30c3jaf.jpg]
 
The fan on the backside is embedded into the heatsink such that it lines up with the front side of the PCB. As Krteq said, there's no sandwich happening outside of the backplate.

So you're saying the fan is on the side of the PCB? That doesn't make any sense; that's stupid.
 
So you're saying the fan is on the side of the PCB? That doesn't make any sense; that's stupid.

:shrug: It's not my design, but that's what is in the picture. There are several things in this design that don't make sense to me in terms of airflow and heat flux, but I'm not a mechanical engineer, so maybe I'm just missing the point.
 
Guys enough AMD talk in a NVIDIA THREAD.

I really do not care about NAVI in this thread.
I do not care about speculative comparison on AMD cards in a NVIDIA thread.
The only time AMD should be talked about in again a NVIDIA thread is after real numbers are out and you want to do a hard pass comparison.
You can't even accurately compare PRICE because, again, it's SPECULATIVE.

STOP POLLUTING MY NVIDIA THREAD
or GO TO THE AMD SECTION.

---end of rant---


Since the 3090 is now officially spoiled, I'm assuming there will be no 3080 Ti, as I hear the 3090 is now the official flagship.
However, is there any chance you guys think Nvidia will pull a 3090 Ti a few months after launch and rub salt in early adopters' wounds?
 
Since the 3090 is now officially spoiled, I am assuming there is going to be no 3080ti, as i hear the 3090 is now the official flag ship.
However is there any chance that you guys think Nvidia will pull a 3090Ti, like a few months after launch and rub salt on early adopters?

The 3080 Ti would be something slightly slower than the 3090 but at a lower price. Effectively this would be a price cut without actually cutting the 3090's price. This assumes the top Big Navi is faster than the 3080 in raster but slower than the 3090, and priced in between.
 
I believe this is a fanmade mockup, but this is closer to what I was envisioning:

That would be a horrible design.

Instead of blowing the hot air out the rear, it would blow the air up top toward the CPU and the MOSFETs.
Or, if they were both intakes, it would essentially cause back pressure in the middle of the card.

I highly doubt Nvidia's engineers would design something that bad.

Not that it would matter to me personally, though, as I will most definitely wait and get an EVGA FTW version with a waterblock.
But I would not even recommend the FE edition if that's how the stock heatsink is going to be.
 
That would be a horrible design.

Instead of blowing the hot air out the rear, it would blow the air up top toward the CPU and the MOSFETs.
Or, if they were both intakes, it would essentially cause back pressure in the middle of the card.

I highly doubt Nvidia's engineers would design something that bad.

Not that it would matter to me personally, though, as I will most definitely wait and get an EVGA FTW version with a waterblock.
But I would not even recommend the FE edition if that's how the stock heatsink is going to be.

It does match all the images that have come out, though. From the images, the rear fan blows in the opposite direction of the fan that sits over the GPU. That is still really weird and would really screw with case cooling, setting up a strange circular motion of air. In my case, it would create a hot zone under the GPU, which would result in the front fan pulling in the hot air from the rear fan.
 
That would be a horrible design.

Instead of blowing the hot air out the rear, it would blow the air up top toward the CPU and the MOSFETs.
Or, if they were both intakes, it would essentially cause back pressure in the middle of the card.

I highly doubt Nvidia's engineers would design something that bad.

I guess it would depend on the mobo? We are talking about 325+ W it needs to remove.
 
That would be a horrible design.

Instead of blowing the hot air out the rear, it would blow the air up top toward the CPU and the MOSFETs.
Or, if they were both intakes, it would essentially cause back pressure in the middle of the card.

I highly doubt Nvidia's engineers would design something that bad.

Not that it would matter to me personally, though, as I will most definitely wait and get an EVGA FTW version with a waterblock.
But I would not even recommend the FE edition if that's how the stock heatsink is going to be.

Both are intakes, based on the fan blade shape. From what I can tell, and based on the assumption that under the shroud there is a path for airflow from the fan on the front of the PCB to the fins in the middle X section, here is the airflow that I think is happening.

The front-of-PCB fan takes air from the bottom side of the case (standard) and pushes air out the back of the case as well as towards the back of the card (front of the case), with some air coming out towards the bottom of the case too, but most pushed around the second fan towards the front of the case.

The back of pcb fan takes air from the middle of the case and pushes it straight through towards the bottom of the case. This design would push most of the hot air towards the front and bottom of the case. In an open air setup this wouldn't be an issue at all but in an actual case it could become a problem with trying to get cool air to the front side fan while removing the hot air coming off the card.

Ideally you'd probably want fresh air flowing from the top or upper back of the case and have fans pushing air out in the front and bottom of the case with a side panel fan towards the back of the case feeding the front side GPU fan fresh air. Ventilated PCI slot covers on the case would probably work OK too.
 