Golgatha (Lifer, joined Jul 18, 2003)
Instead of waste of sand, waste of paper?
Jensen better cut checks to his AIB partners to start printing 4050 Ti boxes!
Probably Nvidia's DX12 driver overhead problems making it underperform at 1080p. AMD used to have the same problems in DirectX 11 8-10 years ago.
I would love to know how much L2 or IC you would need to compensate for half the bandwidth.
I think an extra 32MB of cache would do wonders for the 4060 Ti, but that would also mean a ~20-25mm2 bigger chip.
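As a back-of-the-envelope sketch of that question, a simple miss-rate model says the cache only has to cut DRAM traffic in half to make up for half the bandwidth. The hit rates below are made-up assumptions for illustration, not measured values:

```python
# Toy model: every L2 hit avoids a DRAM access, so DRAM traffic scales
# with the miss rate and the "effective" bandwidth the shaders see is
# raw bandwidth divided by the miss rate. Hit rates here are
# illustrative assumptions, not measured values.
def effective_bandwidth(dram_bw_gbs: float, hit_rate: float) -> float:
    return dram_bw_gbs / (1.0 - hit_rate)

# 4060 Ti: 288 GB/s of DRAM bandwidth behind a 32 MB L2.
print(effective_bandwidth(288, 0.50))  # 576.0 GB/s at an assumed 50% hit rate

# To look like a card with double the raw bandwidth (576 GB/s at the
# same 50% hit rate), the bigger cache would have to reach 75%:
print(effective_bandwidth(288, 0.75))  # 1152.0 GB/s
```

In this model the cache size itself never appears directly; how many extra MB it takes to move the hit rate from 50% to 75% depends entirely on the workload's working set.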
What I find interesting is that the 4060 Ti is only 25% faster than the RX 7600 at Full HD, but 43% faster at 4K.
Both of them have the same BW and a similar amount of cache.
He can recoup a few dollars from recycling the cardboard from the 4060 Ti boxes to pay for his blood pressure meds and, most important to him, his Viagra pills, you know, so he can reliably go from 1 inch to 2 inches.
Instead of waste of sand, waste of paper?
It's a collusion of sorts based on tacit agreement between the two big GPU players. Nvidia isn't trying to move the needle coz they want their cryptomining profits in a cryptowhining world. AMD as always is trying to copy Nvidia's moves to keep their GPU division relevant coz their real darling is their CPU business. They have given up trying to compete with Nvidia on price coz doing that before hasn't netted them more marketshare. Intel is still on training wheels and trying hard not to land nose-first on the rocky terrain leading to Mount GPU's peak.
Jensen woke up one day and just decided not to sell any more cards. Super interesting.
It's not a down market for NvidAI.
That said, I wonder if they are just sandbagging in a down market.
I'm now seeing 300 euro MSI Ventus 3060 12 GB's, which is a great deal compared to one month ago when they cost 350 euros. And I would argue that if people only want Nvidia, it is a better deal than the 4060 (cheaper, and 12 GB matters more than missing DLSS 3).
People say Nvidia is trying to sell the massive overstock of 3000-series cards, but where are they? What cards are they talking about? I see 3060's still priced at $300-400, so I know Nvidia isn't interested in selling those things either. All I can conclude is that Nvidia actually doesn't want to sell any gaming cards, be it new or old. Jensen woke up one day and just decided not to sell any more cards. Super interesting.
Yes! I already had an XFX triple-fan RX 7600 on order, but when the MSI Ventus 12GB RTX 3060 hit $289 USD @ Newegg, I had to snag one of those too.
I'm now seeing 300 euro MSI Ventus 3060 12 GB's, which is a great deal compared to one month ago when they cost 350 euros. And I would argue that if people only want Nvidia, it is a better deal than the 4060 (cheaper, and 12 GB matters more than missing DLSS 3).
The 3060 is about 10% faster in raster than the 6600 non-XT, so that, plus DLSS being a lot better than FSR when rendering low resolutions, plus the extra 4GB of VRAM make the 3060 definitely a better buy if you can go to $280 but no higher. Still, $280 is a tough sell when the 35% faster 6700 XT, which also has 12GB of VRAM, can be found for $310 to $320 regularly now. There is even an MSI RX 6750 XT sold by Amazon (no third-party crap) for $330 right now.
The 3060 12GB at $280 is the only decent deal right now from Nvidia. That said, it is basically $80-100 for Nvidia's software and that extra 4GB of RAM. The 6600 is about the same performance but going for $180-200 now.
It's not a down market for NvidAI.
In case that is not overt enough: Nvid-AI
They probably can't even keep up with demand for AI cards and don't want to sell many 'cheap' gamer cards.
That's not my point. Look at the performance difference at 4K.
Probably Nvidia's DX12 driver overhead problems making it underperform at 1080p. AMD used to have the same problems in DirectX 11 8-10 years ago.
The losses to the 3060 Ti in some games beg to differ.
It looks like BW is not such a great problem for the 4060 Ti as everyone thinks
I didn't do a deep dive, but it looks like Far Cry 6 is really hurting the overall 4K average for the RX 7600 compared to 1080p and 1440p. The game tanks at 4K in his testing. HWUB's 4K results are actually more favorable for the 7600.
That's not my point. Look at the performance difference at 4K.
288GB/s and 32MB of cache for both, yet the 4060 Ti still managed 43% higher performance in the TPU review. It looks like BW is not such a great problem for the 4060 Ti as everyone thinks, or N33 has more than it needs.
I didn't say that it doesn't affect performance at all, just not as much as some of you think.
The losses to the 3060 Ti in some games beg to differ
| | SM | GPC | Shaders | TMUs | ROPs | Frequency (median) | TFLOPs | Texture fill rate | Pixel fill rate | BW | L2 cache | VRAM |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RTX 4060 Ti | 34 | 3 | 4352 | 136 | 48 | 2790 MHz | 24.28 | 379.5 | 134 | 288 GB/s | 32 MB | 8 GB |
| RTX 4070 | 46 | 4 | 5888 | 184 | 64 | 2790 MHz | 32.86 | 513.4 | 178.6 | 504 GB/s | 36 MB | 12 GB |
| Difference | +35.3% | +33.3% | +35.3% | +35.3% | +33.3% | 0% | +35.3% | +35.3% | +33.3% | +75% | +12.5% | +50% |
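For anyone who wants to check the Difference row, it falls straight out of the spec columns. A quick sketch using a subset of the table's own numbers:

```python
# Recompute the "Difference" row from the spec table above.
# Each entry is (RTX 4060 Ti value, RTX 4070 value).
specs = {
    "SM":        (34,   46),
    "Shaders":   (4352, 5888),
    "ROPs":      (48,   64),
    "BW (GB/s)": (288,  504),
    "L2 (MB)":   (32,   36),
    "VRAM (GB)": (8,    12),
}

for name, (small, big) in specs.items():
    pct = (big / small - 1) * 100
    print(f"{name}: +{pct:.1f}%")
# SM: +35.3%, Shaders: +35.3%, ROPs: +33.3%,
# BW: +75.0%, L2: +12.5%, VRAM: +50.0%
```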
In Hogwarts it's -5% at 1080p and -11% at 1440p, and in The Last of Us it's -6% at 1080p and -10% at 1440p per Hardware Unboxed. Absolutely disgraceful GPU.
I didn't say that it doesn't affect performance at all, just not as much as some of you think.
I checked performance against the 3060 Ti, and it loses in 6 of the 25 games in the TPU review, and even then only at 4K; the worst loss is 3.3%.
BF5: -2.9%
Days Gone: -0.6%
Elden Ring: -3.3%
Far Cry 6: -0.6%
Guardians of the Galaxy: -0.7%
The Witcher 3: -1.1%
The GPU is not bad; the price is bad, and of course the 8GB of VRAM.
In Hogwarts it's -5% at 1080p and -11% at 1440p, and in The Last of Us it's -6% at 1080p and -10% at 1440p per Hardware Unboxed. Absolutely disgraceful GPU.
The GPU is not bad; the price is bad, and of course the 8GB of VRAM.
The RTX 4070 is not much stronger in Hogwarts Legacy: +26% (+27% in 1% lows) compared to the 4060 Ti, so it's not that the 4060 Ti is weak per se; rather, Ada is doing poorly in that game.
The Last of Us Part 1 is a massacre for any 8GB card, so I won't compare it at this point; I'll wait until the 16GB version is out.
RTX 3070Ti(GA104 392.5mm2) -> RTX 4080(AD103 378.6mm2) is 80% faster at 4K(TPU).
I think Ada is actually one of the strongest architectures and gen-over-gen improvements, but the amount of last-gen stock is its biggest enemy, along with AMD's incompetence.
Suppose Nvidia did not hold back and the RTX 4080 had come out as the RTX 4070.
This would make the RTX 4070 116% faster than the RTX 3070, which would be an absurd improvement (computerbase.de).
If the RTX 4070 Ti had actually come out as an RTX 4060, we would be looking at a 120% improvement gen on gen.
Similarly, with a full AD102, we would be looking at around 108% vs the RTX 3090. If the RTX 4060 Ti had come out as the RTX 4050, it would be another 100%-plus improvement gen on gen.
These are absurd gen-on-gen improvements, the biggest ever by a long shot on average, which is likely why Nvidia shifted the product line. Personally I think if Nvidia had shifted products one tier rather than two, it would have been perfect; two tiers is a bit too much considering these monstrous improvements.
Compare this to AMD's 35% improvement gen on gen and it just emphasizes AMD's incompetence in developing GPUs (both the 7900 XTX and 7900 XT are 35% improvements over the 6900 XT and 6800 XT respectively).
So if the RTX 4080 had come out as an RTX 4070 priced at $600, this, along with the shower of praise in reviews, would have forced the RX 7900 XTX to sell at $500, possibly dropping to $450 to fight the mindshare. The 7900 XT (this would definitely get branded the 7800 XT) would be selling in the $350 tier. The RX 7600 would be a $120 product. AMD's graphics division would be ground into dust, particularly their mindshare.
This forum was disappointed when the 7900 XTX only competed with the RTX 4080; imagine the disappointment if the 7900 XTX only competed with the RTX 4070 when the hype train said it would beat the RTX 4090. Nvidia's flagship would be 50% faster than AMD's and thus make AMD look like they are a generation behind.
Both companies would not be making money, certainly not enough to cover R&D, which is why I think such pricing, particularly from AMD fans about Nvidia products, is unrealistic. Moreover, Nvidia's last-gen products would be worthless, meaning a multibillion-dollar inventory write-down would be in order. The RTX 4070 would make the RTX 3090 worth $400, the RTX 3060 worth $120, and so on. Maybe even less, as everyone selling their cards to jump on the new ones would flood the used market. While gamers would love this, it is something only an incompetent CEO would do: inflict harm on his own company when it is primarily in competition with itself, and sell at a loss when it is not necessary and when there are other products, like datacenter chips, toward which he can pivot.
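The tier-shift argument above is just ratio arithmetic on a relative-performance index. As a sketch, the index values below are illustrative stand-ins chosen to be consistent with the uplifts quoted in the post, not measured data:

```python
# Relative 4K performance index, RTX 3070 = 100. These values are
# illustrative stand-ins matching the uplifts quoted above, not
# measured benchmark results.
perf = {"RTX 3070": 100, "RTX 4070": 127, "RTX 4080": 216}

def uplift_pct(new: str, old: str) -> float:
    """Gen-on-gen speedup, in percent, of card `new` over card `old`."""
    return (perf[new] / perf[old] - 1) * 100

# With the actual naming, the jump at the "70" tier:
print(f"{uplift_pct('RTX 4070', 'RTX 3070'):.0f}%")  # 27%

# If the RTX 4080 silicon had shipped under the RTX 4070 name,
# the same tier-to-tier comparison would read:
print(f"{uplift_pct('RTX 4080', 'RTX 3070'):.0f}%")  # 116%
```

The silicon is the same either way; only the name decides which two cards end up in the gen-on-gen comparison.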
The difference in average performance is 28% at 1080p, 32% at 1440p and 36% at 4K(link).
Performance at 4K scales pretty much the same as the increase in TFLOPs and texture fill rate. The much higher BW doesn't provide any significant improvement in performance.
I think comparing it to the RTX 4070 is not a bad idea.
Unless you seriously downclock the VRAM on the 4070 so that it has the same effective bandwidth, you can't say to what extent the bandwidth of the 4060 Ti is limiting performance. It's pretty clear that it matters in some titles, otherwise it wouldn't lose to a 3060 Ti, but if you dropped the 4070's bandwidth to that of a 4060 Ti, you'd probably see lower performance in some titles as the bottleneck shifts from the shaders to the memory subsystem.
Even then the 4070 would have more L2 cache, so it won't be a perfect comparison, but it's only an extra 4 MB, which is unlikely to tip the balance too much. Regardless, a 4070 with the bandwidth of a 4060 Ti probably won't be able to scale with shader count if it can't keep the shaders fed.
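For that experiment, the target memory clock falls out of the bus width. A sketch assuming the 4070's stock 192-bit bus and 21 GT/s GDDR6X:

```python
# bandwidth (GB/s) = (bus width in bits / 8) * data rate (GT/s)
bus_bytes = 192 // 8      # 4070's 192-bit bus, in bytes per transfer
stock_rate = 21.0         # GT/s, the 4070's stock GDDR6X data rate
target_bw = 288.0         # GB/s, the 4060 Ti's bandwidth

print(bus_bytes * stock_rate)   # 504.0 GB/s stock, matching the spec table
print(target_bw / bus_bytes)    # 12.0 GT/s needed to mimic a 4060 Ti
```

So the memory would have to run at roughly 12 GT/s instead of 21 GT/s, a large enough underclock that memory-latency behavior might also change, which is another reason the comparison would stay imperfect.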
RTX 3070Ti(GA104 392.5mm2) -> RTX 4080(AD103 378.6mm2) is 80% faster at 4K(TPU).
RTX 3070Ti(GA104 392.5mm2) -> RTX 4070Ti(AD104 294.5mm2) is 43% faster at 4K(TPU).
RTX 3060(GA106 276mm2) -> RTX 4070Ti(AD104 294.5mm2) is 135% faster at 4K(TPU).
RTX 3060(GA106 276mm2) -> RTX 4060Ti(AD106 190mm2) is 41% faster at 4K(TPU).
Alternatively,
RTX 3070 Ti (GA104 17.4B transistors) -> RTX 4060 (AD107 18.9B transistors) Unknown delta, but it's going to be way, way slower
RTX 3090 Ti (Full GA102 28.3B transistors) -> RTX 4070 Ti (Full AD104 35.8B transistors) 10% slower at 4K
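Another way to read those pairs is performance per die area. A rough sketch that ignores node cost and memory-system differences; the speedups are the TPU 4K figures quoted in the lines above:

```python
# (transition, old die mm^2, new die mm^2, 4K speedup from the lines above)
pairs = [
    ("3070 Ti -> 4080",    392.5, 378.6, 1.80),
    ("3070 Ti -> 4070 Ti", 392.5, 294.5, 1.43),
    ("3060    -> 4070 Ti", 276.0, 294.5, 2.35),
    ("3060    -> 4060 Ti", 276.0, 190.0, 1.41),
]

for name, old_mm2, new_mm2, speedup in pairs:
    # Gain in performance delivered per mm^2 of silicon, old -> new.
    gain = speedup * old_mm2 / new_mm2
    print(f"{name}: {gain:.2f}x perf per mm^2")
```

Every pair lands around a 1.9-2.0x gain in performance per mm2, which supports the claim that Ada's problem is pricing and naming rather than the silicon itself, though N5 wafers cost far more per mm2 than Samsung 8N.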