oh no that's not how any of that works.
no, GPU is the SOC tile.
I was wondering if these would end up standalone or shared die.
Sharing memory IO would suggest a shared die would be more optimal.
Need a thread for how much vram is too much vram 😉

I wonder what will be the smallest LPDDR6 chip available? IIRC the smallest LPDDR5 ones are 8Gb, but I dunno if those are even still in production, the most common ones I see around are 12Gb.
AMD might be forced to put more memory in AT3 cards than on AT2 ones?
If I grokked MLID correctly, the cpu is in the i/o die.
Then you have 2 options:
- Add another cpu die (replacement to strix point)
- Add another gpu die (medusa premium & also halo I think)
> One thing that contradicts is the XBox configuration, shown in previous videos, where there is a base monolithic CPU-SOC die and a separate GPU die.
> And this video suggests this approach may be shared with laptops.
No, the Xbox is GDDR7 — desktop mode.
Medusa Halo & Premium are LPDDR6 — laptop mode.
So the architecture changes. But still, take all this with mountains of salt, as MLID is the source.
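The GDDR7 vs LPDDR6 split matters mostly for bandwidth. A rough sketch of peak bandwidth per bus width; the per-pin data rates below are my own ballpark assumptions, not from the video:

```python
# Peak memory bandwidth = (bus width in bits / 8) x per-pin data rate (Gbps),
# giving GB/s. Data rates here are ballpark assumptions:
# GDDR7 roughly 28-32 Gbps/pin, LPDDR6 roughly 10-14 Gbps/pin.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(192, 32))  # 192-bit GDDR7-class bus -> 768.0 GB/s
print(bandwidth_gbs(192, 12))  # 192-bit LPDDR6-class bus -> 288.0 GB/s
```

Same bus width, very different bandwidth, which is why desktop parts keeping GDDR7 while laptop parts go LPDDR6 is a real architectural split.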
Any guesses on (2027) launch prices??
- AT0 (10090xt > 5090) — 384 bit bus so $1500+ ?
- AT1 (10080xt) — scrapped
- AT2 (10070xt = 5080 > xbox next) — 72 CU 192bit gddr7 so $600+
- AT3 (10060xt < 5070) — 48 CU 384bit lpddr6 so $400+
- 9060xt 16gb (=ps5 pro) ~ $300
- AT4 (10050xt > 3060 12gb in raster) — 24 CU 128bit lpddr6 so $250 ?
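For a sanity check on those bus widths, a quick sketch (my own arithmetic, nothing from MLID) of how bus width and per-chip density translate into VRAM capacity, assuming one 32-bit chip per bus slice:

```python
# VRAM capacity options: each GDDR7/LPDDR package is assumed to sit on a
# 32-bit slice of the bus, so chip count = bus width / 32.
# Densities in GB per chip: 2 GB = 16 Gbit, 3 GB = 24 Gbit, 4 GB = 32 Gbit.

def vram_options(bus_bits: int, densities_gb=(2, 3, 4)) -> dict:
    chips = bus_bits // 32
    return {f"{d} GB chips": chips * d for d in densities_gb}

for bus in (128, 192, 384):
    print(f"{bus}-bit:", vram_options(bus))
```

Clamshell mode (two chips per 32-bit slice) doubles these numbers, which is how 16gb cards exist on 128-bit buses today.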
> Why adding so much memory to AT2, AT3 and AT4? I would assume 18gb / 16gb / 12gb for these.
> I'd like to get more, but it is unlikely that we are seeing that.

never say the AI bubble didn't do anything for you
> Why adding so much memory to AT2, AT3 and AT4? I would assume 18gb / 16gb / 12gb for these.
> I'd like to get more, but it is unlikely that we are seeing that.

AT3 & AT4 are joke guesses because MLID said lpddr
> I'd be surprised if the 10070 XT or whatever they call it only ends up being a 5070 Ti at $600. That's basically the same as a 9070 XT in perf/$ but with 50% more VRAM.

My revised estimations / guesstimates (no LLMs used)
(Assuming this lpddr vram thingy is true & also assuming it works out) Imagine this line-up (in 2027)
- AT0
  - 10090xt+ — Multiple models starting at $1500 plus and huge vram like Radeon VII or titan
- AT1
  - 10080xt — scrapped (Lisa Su took her toys & went home)
- AT2 (gddr7)
  - 10070 xtx 24gb = $700 (~5080)
  - 10070 xt 18gb = $600 (~5070 ti)
  - 10070 gre 15gb = $500-$550 (~5070 super)
- AT3 (lpddr6)
  - 10060 xt 24gb = $450-$500 (~5070)
  - 10060 16gb = $400 (~5060ti 16gb)
- AT4 (lpddr6/lpddr5x)
  - 10050xt 32gb = $350 (~9060xt 16gb)
  - 10050xt 24gb = $300 (~9060)
  - 10040xt 16gb = $250 (~3060 12gb in raster)
> I'd be surprised if the 10070 XT or whatever they call it only ends up being a 5070 Ti at $600. That's basically the same as a 9070 XT in perf/$ but with 50% more VRAM.

But path tracing !!!
> AT3 & AT4 are joke guesses because MLID said lpddr
> AT2 I have to give a serious rethink.
> I am now thinking xtx will use 4gb gddr7 while xt & gre will use 3gb gddr7
> never say the AI bubble didn't do anything for you
> Not sure what will happen with AT2

Even if they are using LPDDR6, why adding more memory than useful? These things have to be cheap and 16 GByte for a mainstream GPU and 12 GByte for the Low End part seem to be reasonable.
Yes, you can add more memory. But why should AMD do that when not of big benefit for the average gamer?
The same logic applies to AT2 with GDDR7. 18 GByte are most reasonable on 192bit SI with 24 Gbit chips. And 18 GByte are also perfectly suited for a 1440p card and also fine enough for 4K with upscaling. Most gamers won't benefit from 24 GByte but would have to pay more.
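The 192-bit / 24 Gbit arithmetic checks out, and the same sketch covers the 4gb (32 Gbit) chips floated earlier in the thread (illustrative only, assuming one 32-bit chip per bus slice):

```python
# Sanity check: a 192-bit SI takes 6 chips (one per 32-bit slice).
# 24 Gbit chips (3 GB each) -> 18 GB total; 32 Gbit chips (4 GB) -> 24 GB.

def capacity_gb(bus_bits: int, chip_gbit: int) -> float:
    chips = bus_bits // 32        # one chip per 32-bit slice
    return chips * chip_gbit / 8  # Gbit -> GB

print(capacity_gb(192, 24))  # 18.0
print(capacity_gb(192, 32))  # 24.0
```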
> You are logical

I know what you are thinking of. But does that work today as well as it did in the past? Anybody can pull out ChatGPT to ask for the better GPU (and might get the correct answer - or not).
For example, the megapixel race has pretty much ended. Many new phones and cameras get released with fewer pixels than their predecessors. People either gained more knowledge (more pixels != more quality), simply don't care because it's good enough, or are not interested in technical details.
Higher VRAM amounts on lower end parts make the more expensive ones less attractive as well.
If you had these 3 options (for an entry level GPU) then which one are you buying 🤔
- 10050xt 32gb = $350 (~9060xt 16gb)
- 10050xt 24gb = $300 (~9060)
- 10040xt 16gb = $250 (~3060 12gb in raster)
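One way to frame that choice is dollars per gigabyte, using the entirely guessed prices from the list above:

```python
# $/GB for the three hypothetical entry-level options above
# (prices and capacities are the thread's guesses, nothing official).
options = {
    "10050xt 32gb": (350, 32),
    "10050xt 24gb": (300, 24),
    "10040xt 16gb": (250, 16),
}
for name, (usd, gb) in options.items():
    print(f"{name}: ${usd / gb:.2f}/GB")
```

Under these guesses the 32gb card is the cheapest per gigabyte, which is exactly the cannibalization worry: the big-VRAM entry card undercuts everything above it.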
> But now that Jensen knows of this he will be scheming up a riposte for this

He is too busy for that