Discussion Ada/'Lovelace'? Next gen Nvidia gaming architecture speculation

Page 74 - AnandTech Forums

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
The specs of the laptop RTX 3050 are new/changed. It's pretty wild that they have all those different combinations.
RTX 3050 4GB-6GB has 64-128bit?
2x 2GB memory chips for 64-bit, that's ok.
128-bit looks like 2x 2GB + 2x 1GB, that's more exotic but not unheard of.
If you exceed 4GB, then the last 2GB have only half the bandwidth.
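The chip mix above can be sketched numerically. A minimal sketch, assuming GDDR6 at 14 Gbps per pin and a 32-bit interface per chip (both illustrative figures; `memory_regions` is a made-up helper, not anything from an actual driver):

```python
def memory_regions(chips_gb, gbps_per_pin=14, bits_per_chip=32):
    """Split a mixed-capacity VRAM pool into regions of uniform bandwidth.

    While every chip still has free capacity, accesses interleave across
    all chips (full bus width); once the smaller chips fill up, only the
    larger chips can serve the remaining gigabytes (narrower bus).
    """
    regions = []
    remaining = sorted(chips_gb)
    filled = 0  # GB already consumed on each still-active chip
    while remaining:
        smallest = remaining[0]
        n = len(remaining)
        region_gb = (smallest - filled) * n   # capacity at this bus width
        bus_bits = n * bits_per_chip          # combined bus width
        bw_gbs = bus_bits * gbps_per_pin / 8  # GB/s at this width
        regions.append((region_gb, bus_bits, bw_gbs))
        filled = smallest
        remaining = [c for c in remaining if c > smallest]
    return regions

# 2x 2GB + 2x 1GB on a 128-bit bus
print(memory_regions([2, 2, 1, 1]))
```

With the assumed numbers this prints `[(4, 128, 224.0), (2, 64, 112.0)]`: the first 4GB run at the full 128-bit/224 GB/s, and the last 2GB only at 64-bit/112 GB/s, which is the "last 2GB have half the bandwidth" point above.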
I would rather pay more for RTX 4050 than this.
Not a fan of fake frames, but it supposedly helps when you have low FPS. The motion looks smoother, even if input lag stays the same.

BTW, dynamic boost is up to 25W for RTX 40 series.
Saw it here: Link
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,816
7,258
136
RTX 3050 4GB-6GB has 64-128bit?
2x 2GB memory chips for 64-bit, that's ok.
128-bit looks like 2x 2GB + 2x 1GB, that's more exotic but not unheard of.

I think it's 2560 cores and either 64-bit 2x2 or 96-bit 3x2. Why they wouldn't just give it a different name than the original 3050 is beyond me.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,696
3,260
136
I think it's 2560 cores and either 64-bit 2x2 or 96-bit 3x2. Why they wouldn't just give it a different name than the original 3050 is beyond me.
If there is a mistake in that table, then it's 96-bit as you said.
If it's correct, then it's what I wrote.
2560 will likely be the 6GB version.
The naming is pretty bad, considering it should perform like a ~3050 Ti, which has only 4GB. On the other hand, even the standard RTX 3050 could be faster if its TGP is higher than the RTX 3050 Ti's.

@jpiniero You are correct.
 
Last edited:

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I saw that the bald guy still calls it a xx70 "TIE" card. At least Nvidia is consistent in its delusions on pricing, naming, and FPS numbers. The gaming trailers emphasizing DLSS 3.0 numbers really brought home how silly it has all become.

The YouTube comments were hilarious. As soon as he showed up, everybody started saying "oh, the tie guy".
 
  • Like
Reactions: Ranulf

dlerious

Platinum Member
Mar 4, 2004
2,118
932
136
No, I don't think so either. Maybe they'll just stick to the same prices as RDNA2: $999 for x900 and $649 for x800.
Yeah, I'm basing my guess on the 7900XTX costing the same as 6900XT. The 7800XT is the next purchase I'm considering if price/performance are reasonable.
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
Yeah, I'm basing my guess on the 7900XTX costing the same as 6900XT. The 7800XT is the next purchase I'm considering if price/performance are reasonable.

The 6900xt and 6800xt were on the same Navi 21 though.

Also, the 6800xt was almost twice as fast as the old flagship, the 5700xt. Will the 7800xt be twice as fast as the 6800xt? No, because not even the 7900xtx is twice as fast. Not even close.
 
  • Like
Reactions: Lodix

Aapje

Golden Member
Mar 21, 2022
1,530
2,106
106
The 5700 XT is a different tier though. I don't get why people insist on comparing different tiers.
 

amenx

Diamond Member
Dec 17, 2004
4,517
2,853
136
Sadly, with Ada and RDNA 3, tiers are losing their meaning and are now used to market to/milk consumers.
 
  • Like
Reactions: Lodix

Hans Gruber

Platinum Member
Dec 23, 2006
2,516
1,358
136
Keep an eye on Nvidia's and AMD's quarterly reports moving forward. Either they have to admit the days of cheap GPUs are coming back, or they have to cook the books. I can't see them admitting the good times are over and reporting awful profit/revenue/growth numbers.

People will ask: why do your products cost so much money? Answering with "we have to make money." Agreed, so why are your revenues down? You get the point. It's coming sooner than you think.

Intel has its next GPU coming out later this year. It's supposed to compete with high-end GPUs instead of the mid-grade 3060.
 

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
Keep an eye on Nvidia's and AMD's quarterly reports moving forward. Either they have to admit the days of cheap GPUs are coming back, or they have to cook the books. I can't see them admitting the good times are over and reporting awful profit/revenue/growth numbers.

People will ask: why do your products cost so much money? Answering with "we have to make money." Agreed, so why are your revenues down? You get the point. It's coming sooner than you think.

Intel has its next GPU coming out later this year. It's supposed to compete with high-end GPUs instead of the mid-grade 3060.
Yes, it will be interesting to see how long they can deceive shareholders.

While profits, margins, and growth were high, management could tell porkies (cough, cough, "we have minimal exposure to mining"), but now that all three are down, shareholders might start asking inconvenient questions.
 
  • Haha
Reactions: igor_kavinski

amenx

Diamond Member
Dec 17, 2004
4,517
2,853
136
I don't think Nvidia needs to deceive shareholders or "cook the books". The whole tech industry is down, caught in a recession. Shareholders can easily understand that.
 
  • Like
Reactions: DooKey
Jul 27, 2020
28,138
19,178
146
Intel has its next GPU coming out later this year. It's supposed to compete with high-end GPUs instead of the mid-grade 3060.
Link to news article???


[Attached image: 1672845253936.png]

Just a faster Arc, most likely. I will be surprised if it manages to challenge the 7900/4090. If I had to guess, it could be something that consumes 500W+ and barely beats the 3080 12GB (best case).
 
Last edited:
Jul 27, 2020
28,138
19,178
146
I don't think Nvidia needs to deceive shareholders or "cook the books". The whole tech industry is down, caught in a recession. Shareholders can easily understand that.
But what Nvidia is doing is reducing their sales even more with artificially higher prices. How will they explain the unsold 4080 inventory sitting on retailers' shelves and in their warehouses?
 

Saylick

Diamond Member
Sep 10, 2012
4,052
9,472
136
But what Nvidia is doing is reducing their sales even more with artificially higher prices. How will they explain the unsold 4080 inventory sitting on retailers' shelves and in their warehouses?
But but but Nvidia Geforce Now with RTX 4080 Superpod! Just shove any unsold 4080s there and you're good to go. If you can't afford a 4080 outright, you can rent it from Nvidia for $20/month. Win-win, baby.

Can't wait for the fun proceedings to begin. Vote of no confidence against Jensen!
Lol, it would be funny if that ever happened, but it likely never will. Even during the worst of it last year, when Nvidia pre-announced its dip in earnings after the crypto bubble burst, JHH just pivoted hard to his usual fallbacks of AI, deep learning, servers, and self-driving cars during the earnings call. Any serious question about oversupply and inventory correction was met with an answer about how Hopper was selling like hotcakes, which was obviously unrelated to the question asked.
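As a throwaway sanity check on the $20/month rental quip, assuming a $1,199 MSRP for the 4080 and that $20/month figure (both just illustrative numbers):

```python
# Back-of-the-envelope: how many months of a hypothetical $20/month
# GeForce Now tier before you'd have spent a 4080's assumed MSRP?
msrp = 1199   # assumed RTX 4080 MSRP in USD
monthly = 20  # assumed subscription price in USD
months = msrp / monthly
print(f"Break-even after about {months:.0f} months (~5 years)")
```

Roughly five years to break even on the card alone, which is why renting out unsold inventory could look attractive from Nvidia's side.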
 
Jul 27, 2020
28,138
19,178
146
But but but Nvidia Geforce Now with RTX 4080 Superpod! Just shove any unsold 4080s there and you're good to go. If you can't afford a 4080 outright, you can rent it from Nvidia for $20/month. Win-win, baby.
Sounds like something Nvidia would do in a last-ditch effort to monetize the millions of brainwashed, salivating Gollums (I got a 3090 coz it was priced just right for 24GB VRAM! Honest! :p)

I also have the ASUS LC 6800 XT, NIB with warranty (I'll see if I can sell it to someone for a little, or maybe a lot of, profit. Otherwise it will be my first AMD card since the RX 580).

Lol, it would be funny if that ever happened, but it likely never will.
Sadly true.
 
  • Like
Reactions: Saylick

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
I don't think Nvidia needs to deceive shareholders or "cook the books". The whole tech industry is down, caught in a recession. Shareholders can easily understand that.
I wasn't thinking about the future, but more about how, during this mining boom, they claimed they had minimal mining exposure despite getting into trouble the last time (when their exposure was far less). But as long as the money was coming in, nobody seemed to care, as @Saylick already pointed out.
 

biostud

Lifer
Feb 27, 2003
19,914
7,018
136
Would Nvidia have gotten the same flak if they had named the 4080 -> 4080 Ti and the 4070 Ti -> 4080 (and launched it @ $799)?
 

Mopetar

Diamond Member
Jan 31, 2011
8,489
7,736
136
Keep an eye on Nvidia and AMD's quarterly reports moving forward. They either have to admit the days of cheap GPU's are coming back or they have to cook the books.

I think the design changes as both companies move to an MCM approach. This not only gets better yields, but also allows different components to be made on different nodes tailored to those components. The V-Cache chiplet that AMD used for Zen 3D used a different set of libraries built for SRAM, which let them achieve far greater density than on the base CPU chiplets.

NVidia might also continue to design a single die for both consumer and professional markets. This lets them be more flexible with their product mix: trickle out high-end consumer dies salvaged from professional cards and let the higher margins from those customers balance out the average.

They may also hope that other fabs become competitive with TSMC which will help keep costs down.
 

Heartbreaker

Diamond Member
Apr 3, 2006
5,160
6,778
136
I think the design changes as both companies move to an MCM approach. This not only gets better yields, but also allows different components to be made on different nodes tailored to those components. The V-Cache chiplet that AMD used for Zen 3D used a different set of libraries built for SRAM, which let them achieve far greater density than on the base CPU chiplets.

MCM is a great change, but it's a benefit that creates a generational price reduction once. After that, it's priced in.

It's still not like the good old days where you doubled the transistors each new node, for essentially the same money, for many generations in a row.

Also, the benefit isn't really making itself felt at the consumer level. I remember when people here were speculating that AMD would massively undercut NVidia's pricing to win market share with its new MCM cost advantage... Not so much, apparently.