https://wccftech.com/nvidia-rtx-3070-3dmark-performance-leaked-actually-faster-than-an-rtx-2080-ti/
My next gpu is getting closer
> What evidence do you have to suggest they are supply limited? Ultimately, yes, there is not an infinite supply. But there isn't any evidence to suggest they won't have enough cards for a proper launch.

Evidence? None, except that supply has been limited for most of 2020, and now AMD has to make two consoles, Zen 3 chiplets, and a full-stack GPU launch all at the same time. How can they not be supply limited? Another good hint is the Ryzen 5000 series pricing and SKU lineup: high prices (the 5800X is especially overpriced) and no 5700X or 5600. It's clear that once supply improves, these two SKUs will also become available at saner prices. With Zen 3 they could offer reasonable prices and still make a big profit and take a lot of market share, if they had the supply for that.
> Evidence? None, except that supply has been limited for most of 2020, and now AMD has to make two consoles, Zen 3 chiplets, and a full-stack GPU launch all at the same time. How can they not be supply limited? Another good hint is the Ryzen 5000 series pricing and SKU lineup: high prices (the 5800X is especially overpriced) and no 5700X or 5600. It's clear that once supply improves, these two SKUs will also become available at saner prices. With Zen 3 they could offer reasonable prices and still make a big profit and take a lot of market share, if they had the supply for that.

Just because they have a lot of products doesn't mean they will be supply constrained. AMD is the 3rd largest customer that TSMC has, and they are the single largest 7nm customer.
On top of that, for the consoles they have contracts, and every Zen 3 wafer is simply more profitable than a GPU wafer. So there is no incentive for a price war vs NV, especially since, most likely, AMD has lower costs overall due to smaller dies and better yields, even if TSMC wafers cost more.
No evidence, just logical deductions, which of course can be completely wrong, which I actually hope. I mean, I would take 3080-level performance for $600 all day... but yeah, let's remain realistic.
> There is no reason for AMD to undercut Intel when they have the upper hand. This in no way denotes supply issues.

There is one reason: gain market share, or outright push your competition out of the market. But again, that only works if you have the supply. Even cheap Ryzens would make more money per wafer than Big Navi.
> 3070 Ti on GA102 incoming to fight Navi 21 XL/6800. Now we know why they cancelled the 3070 16GB and 3080 20GB: not good enough vs Navi 21. Btw, the last time NV launched an x70 card on its biggest die was back in Fermi days with the GTX 570. NV must be desperate.

If RGT is right, then the 6800 XT will match the RTX 3080 at 4K and beat it at 1440p and 1080p. Same for the RX 6900 XT vs the RTX 3090. The RX 6800 will beat the RTX 3070 soundly, and that's why Nvidia wants a GA102-based 3070 Ti. If AMD prices the RX 6800 at $500, the RX 6800 XT at $600, and the RX 6900 XT at $800, that would be fairly aggressive pricing.
NVIDIA allegedly preparing GA102-150 GPU with 7424 CUDA cores - VideoCardz.com
NVIDIA is allegedly planning a new GA102 SKU. A return of GeForce RTX 3070 Ti? Kopite7kimi, a leaker that correctly predicted NVIDIA Ampere Gaming specs months in advance, is now claiming that NVIDIA is preparing a new GA102 GPU. The device would allegedly feature 7424 CUDA cores, 1536 more than... (videocardz.com)
> If RGT is right, then the 6800 XT will match the RTX 3080 at 4K and beat it at 1440p and 1080p. Same for the RX 6900 XT vs the RTX 3090. The RX 6800 will beat the RTX 3070 soundly, and that's why Nvidia wants a GA102-based 3070 Ti. If AMD prices the RX 6800 at $500, the RX 6800 XT at $600, and the RX 6900 XT at $800, that would be fairly aggressive pricing.

Also, now we hear that the RTX 3060 is on the 104 die instead of the 106, which might tell us the performance we can expect from this product. Before, I couldn't believe that the RTX 3060 would touch RTX 2080 performance; now, it will be easy.
> So, if true, where would this be priced? $600 would seem likely, but that would make the already questionable value of the 3070 even worse.

I doubt the 3070 Ti will have GDDR6X. It's too costly right now, as nVidia is the only one using it. It's not even a ratified standard.
Videocardz is assuming GDDR6X, but the tweet didn't mention that. Nvidia has said the dies can do either memory type; is there a reason to think a 3070 Ti wouldn't use 10GB of GDDR6 on a 320-bit bus?
> Can't sell mid-rangers at high-end prices this time. "Our new GlafSHip PGU has 10GB of ram guys" GTFO

Almost glad my attempts to get a 3080 have been unsuccessful so far. 16 GB on the Radeon will make it far more future-proof.
> The VRAM discussion here is so masterfully done, I can't tell if the VRAM complaints are real or just having fun? I haven't found anything I can't run smoothly, meaning without VRAM hitching, on my TV @4K with the 10GB on the 3080.

I guess the question is: how often do you upgrade?
> I guess the question is: how often do you upgrade?
> For regular upgraders, performance here and now matters the most.
> For those who keep cards longer, the concern is that 10GB might not age well at all over the new consoles' lifetime.

Exactly this. I've gotten accustomed to keeping my 1080 Ti for 3 years now. I'd like to do the same with the next card, if it's a high-end model that will let me play at max settings like I have been used to doing.
> I guess the question is: how often do you upgrade?
> For regular upgraders, performance here and now matters the most.
> For those who keep cards longer, the concern is that 10GB might not age well at all over the new consoles' lifetime.

Well said. For folks who upgrade every time a new Nvidia flagship is launched, this does not matter. But for folks who want to keep it longer than 2 years, the VRAM discussion is important, especially for 4K gaming.
> Can't sell mid-rangers at high-end prices this time. "Our new GlafSHip PGU has 10GB of ram guys" GTFO

Sorry, wrong post quoted. I meant @undertaker101.
> Almost glad my attempts to get a 3080 have been unsuccessful so far. 16 GB on the Radeon will make it far more future-proof.

What makes you think you will get a 16 GB AMD card for MSRP?
> Well said. For folks who upgrade every time a new Nvidia flagship is launched, this does not matter. But for folks who want to keep it longer than 2 years, the VRAM discussion is important, especially for 4K gaming.

What most people expressing so much concern over VRAM capacity fail to acknowledge is that you will probably run into more raw-performance issues as your card ages than VRAM issues. If you keep a card for multiple generations, you will be turning settings down for acceptable performance. It's just a fact of life.
> What makes you think you will get a 16 GB AMD card for MSRP?

You will get any card for MSRP if you wait; the point was what the default configuration would be for the top Radeon, and there are good indicators it will be 16 GB. Techspot did an evaluation of the 6 GB 2060 in Jan 2019 and concluded that, at the time, 6 GB could max everything out at 1440p. That's not true today, even at 1440p. I have little doubt that with consoles pushing 16 GB, 10 GB will be a limiting factor sooner than most folks realize.
> What most people expressing so much concern over VRAM capacity fail to acknowledge is that you will probably run into more raw-performance issues as your card ages than VRAM issues. If you keep a card for multiple generations, you will be turning settings down for acceptable performance. It's just a fact of life.

Dropping textures easily has the greatest IQ impact out of all settings (excluding resolution). People don't want to tone down texture quality because of that. Whereas with other settings, usually just dropping from Ultra to High can net you nearly the same visuals with significant performance uplifts in the region of 15-20%.
Somehow it's only a problem to adjust settings for VRAM issues, but not for raw-performance issues? This is hypocritical.
As I posted in this thread before, a 2080 Ti is already dipping to unacceptable performance at 4K max settings, and it isn't from VRAM; it's from raw performance capability. So more VRAM isn't going to future-proof you.
https://forums.anandtech.com/threads/ampere-next-gen-gaming-uarch-speculation-thread.2572510/page-182#post-40305049