GG AMD, you had a good run...
970 SLI, with only 3.5GB of usable VRAM, sold like hot cakes to everyone from 1080P users all the way up to 3440x1440. The same goes for 980 SLI. I guess all those NV buyers didn't care about 3.5-4GB of VRAM, but if the R9 390/390X only has 4GB, it's DOOM.
970 SLI vs. 290X
- 60% faster at 1080P
- 70% faster at 1440P
980 SLI vs. 290X
- 72% faster at 1080P
- 89% faster at 1440P
I guess a $500-600 card that's 50-60% faster than an R9 290X would be worthless, then, because it has 4GB of VRAM? What if AMD repeated the HD4850/4870 or HD5850/5870 strategy? A lot of PC gamers don't have 4K monitors, which means 4GB is plenty.
So I guess we can put the "Dual Link Interposer" fantasy to death as well.
Nothing is official until AMD's launch. But it's interesting how 4GB vs. 8GB has become such a contentious issue for you, considering you paid $550 for a 980 4GB that's only 15% faster than the 290X, and yet you didn't have a problem with its 4GB of VRAM given the price. Also, I don't recall you going out of your way to not recommend 970 3.5GB SLI or 980 4GB SLI to any 1080P-4K gamers, despite both of those setups trading blows with and even beating a Titan X in FPS. I guess having 980 SLI performance for $1000 with only 4GB of VRAM was fine as of May 2015, but as of July 2015, 4GB officially becomes an outdated spec? :sneaky:
TechPowerUp uses FurMark, which is the only reliable way to get maximum power consumption out of a card.
No, it isn't. FurMark is not a reliable way to arrive at a GPU's maximum power usage; it's the most worthless synthetic GPU power test ever invented, because no real-world program can load the GPU the way a power virus can. That you continue to deny this and use FurMark to represent a GPU's maximum power usage is amazing, despite the entire forum having debated this topic years ago and agreed that FurMark is indeed a waste of time because it acts as a power virus. How you aren't grasping the basic premise, that no real-world program can stress 99.9% of the transistors inside a GPU but FurMark can, is remarkable! Unless someone designs a real-world application that PC gamers actually use regularly and that mimics FurMark's stress levels, FurMark is just a synthetic bench and nothing more, about as far from reality as it gets.
Seti@Home, MilkyWay@Home, Folding@Home, and scrypt mining are all tasks PC gamers actually run on their GPUs for some benefit. There is no measurable benefit that FurMark provides for today's GPU testing: the score in FurMark, the FPS in FurMark, are all meaningless. It can't be used reliably for OC stability testing either, because NV/AMD have built GPU thermal throttling into their drivers. There is no community of PC users that runs weekly FurMark competitions to help scientific research, to make money selling online currency, or for gaming purposes. For all intents and purposes, FurMark is an outdated test that some websites still cling to because their editors are stuck in the past, when FurMark actually had some benefits.
I am not even sure you have run FurMark lately. If you do, you will notice that as soon as the GPU's power target is exceeded, the GPU starts throttling on purpose and/or GPU load drops far below 100%. This is because NV/AMD included safety mechanisms in their drivers to prevent GPUs from failing under the unrealistic ASIC/PCB workload FurMark imposes. It's no wonder so many HD4870/GTX570/GTX590 cards failed under FurMark.
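You don't have to take my word for it either; you can watch the throttling happen yourself. Here's a minimal sketch (mine, not anything from TechPowerUp's methodology), assuming an NVIDIA card and the nvidia-ml-py (pynvml) Python bindings: it just samples power draw, core clock, and GPU load once a second while a workload runs.

```python
# Minimal monitoring sketch (assumes an NVIDIA GPU and the nvidia-ml-py
# package, i.e. "pip install nvidia-ml-py"; AMD cards need a different API).
# Run it alongside FurMark, then alongside a real game, and compare the logs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    for _ in range(60):  # sample once per second for a minute
        power_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(
            handle, pynvml.NVML_CLOCK_GRAPHICS)
        util = pynvml.nvmlDeviceGetUtilizationRates(handle).gpu  # percent
        print(f"power={power_w:6.1f} W  core={clock_mhz:4d} MHz  load={util:3d} %")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```

If the driver safety mechanism I described is kicking in, you'd expect the FurMark run to show power pinned at the board limit while the core clock sags and/or reported load drops, whereas a real game should hold its boost clocks.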
FurMark is basically known for destroying GPUs' VRM systems, and AMD/NV do not consider this program worthwhile whatsoever.
http://www.techpowerup.com/forums/threads/ati-deliberately-retards-catalyst-for-furmark.69799/