RMA approved for $740 store credit. Now what?


kawi6rr

Senior member
Oct 17, 2013
567
156
116
NVidia will get their due, but I don't have time to wait for some infernal class action lawsuit unfortunately..

And as I said before, while the fault doesn't lie with Newegg or Gigabyte, customer care is an important part of doing business.

If Amazon, Best Buy and other retailers, plus AiB manufacturer EVGA hadn't allowed returns, then I wouldn't expect such of Newegg or Gigabyte..

And how is Nvidia lying to you good customer care? The only reason you got a refund is because they got caught lying, not because they did the right thing. I don't get your logic.
 

Elixer

Lifer
May 7, 2002
10,371
762
126
These threads are pure PR gold for nvidia, and they are laughing all the way to the bank.
It makes no sense to demand a refund since nvidia lied, but, you still say you will stay with them, since "nothing else" beats them, which isn't really true.
Then you say you won't buy gigabyte again, since they didn't help you (since that is what nvidia told THEM, "they are on their own"!).

So, you punish newegg by doing the refund.
You punish gigabyte for not doing a refund.
You reward nvidia for lying about the 970, but, you are willing to pay even more $$$ for 980s.

Then when someone points out other options, you then tell them, no, because X,Y,Z.

So, why did you go through the hassle of getting a refund in the first place, when you could have just sat on the card(s), waited until the new cards came out, and made your decision then?
Sure, you would have to sell your 970's at a loss (which would have been less than spending MORE on new 980's), but you would have better performance with the new cards and, in the end, a better deal.
 
  • Like
Reactions: Grazick

iiiankiii

Senior member
Apr 4, 2008
759
47
91
These threads are pure PR gold for nvidia, and they are laughing all the way to the bank.
It makes no sense to demand a refund since nvidia lied, but, you still say you will stay with them, since "nothing else" beats them, which isn't really true.
Then you say you won't buy gigabyte again, since they didn't help you (since that is what nvidia told THEM, "they are on their own"!).

So, you punish newegg by doing the refund.
You punish gigabyte for not doing a refund.
You reward nvidia for lying about the 970, but, you are willing to pay even more $$$ for 980s.


Then when someone points out other options, you then tell them, no, because X,Y,Z.

So, why did you go through the hassle of getting a refund in the first place, when you could have just sat on the card(s), waited until the new cards came out, and made your decision then?
Sure, you would have to sell your 970's at a loss (which would have been less than spending MORE on new 980's), but you would have better performance with the new cards and, in the end, a better deal.

This thread is pure comedic gold. I am entertained.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Why do I even bother? So all those times in the past where ATI cards or NV cards ran faster in DX7/8/9.1/10/11, it was all just a random fluke and had nothing to do with architecture-specific optimizations made by ATI/NV's driver teams.

No, you're right in this instance. The driver team CAN optimize their drivers for games, the hardware and even the API. The drivers are the lowest level of software that interacts directly with the hardware, so the brunt of the optimization will take place there.

This is a direct contradiction of what you implied earlier, when you were talking about developers optimizing for architectures. That is only true for low-level APIs..

Just think, all of these recent PC ports like Dying Light, Dragon Age Inquisition, AC Unity etcetera were all developed on consoles, both of which have AMD GPUs. So if your theory was correct, then the optimizations the developers made for the GCN-based consoles should transfer to the PC, right?

Wrong. As I said, DX11 is specifically designed as an abstractive layer to preclude developers from having to program for several architectures at once..
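
For anyone unsure what "abstraction layer" means in practice, here is a minimal sketch (an illustration only, assuming a Windows toolchain with the Direct3D 11 headers and d3d11.lib): a D3D11 application only ever asks for generic hardware, and everything vendor-specific lives in the driver behind the ID3D11Device interface, which is why per-architecture tuning happens there rather than in game code.

// Minimal sketch: creating a D3D11 device. Nothing here names a GPU
// architecture; the same code runs on AMD, NVidia or Intel GPUs because
// the installed vendor driver implements the API behind these interfaces.
#include <d3d11.h>
#include <cstdio>

int main() {
    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    level   = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter: whatever GPU is installed
        D3D_DRIVER_TYPE_HARDWARE,   // use the hardware via its vendor driver
        nullptr, 0,                 // no software rasterizer, no creation flags
        nullptr, 0,                 // accept the default feature levels
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr)) {
        std::printf("Got a D3D11 device at feature level 0x%x\n", (unsigned)level);
        context->Release();
        device->Release();
    }
    return 0;
}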

All those GE/GW titles where AMD/NV tried to get better performance for specific features like Global Illumination, SSAA via DirectCompute, HBAO+, PCSS+, all done in DX11, had nothing to do with catering to the specific architectures of the said brands? R9 290X pulling away from Kepler and closing in on Maxwell but Kepler falling further behind Maxwell simultaneously - all of that is just a random fluke and has nothing to do with drivers from AMD/NV?

Again, you're confusing your own argument that you presented earlier :whistle:

Your CPU argument is going nowhere because HardOCP has already tested 290X CF vs. 970 SLI and against 980 SLI for real world gameplay smoothness at 1440P and above.

I don't need to resort to HardOCP for their own subjective tests concerning smoothness. I've used SLI for years, so I have my own sentiments about that.

When SLI works as intended, it is buttery smooth and this is for 95% of games. Only a few odd games like Watch Dogs have broken SLI implementations..

Like I said even in AC Unity you had to run FXAA on 970 SLI, so the last thing anyone needs to worry about with 295X2/970SLI or faster is CPU bottlenecks in modern games at 1440P and above on a 4.4Ghz+ Core i5/i7 system. We even have users here with 980 SLI max overclocked who report almost 0% increase in performance going from 4.0Ghz 5960X to a 4.4Ghz 5960X in games. This is not some coincidence but because most games are GPU limited today.

Not necessarily. Clock speed is only one factor; the others are core scaling and driver overhead. AMD apparently has issues scaling above two cores, as several reviewers have found.

That's why the GTX 980 with a 2600K is able to beat an R9 290X with a 5960X in Call of Duty: Advanced Warfare..

It's remarkable you keep claiming CPU bottlenecks and AMD's horrible DX11 driver overhead but after countless games tested 290X is now faster than GTX970 at higher resolutions, and the same for 295X2 vs. 970 SLI.

Not really. If you test the GTX 970 at reference clocks, perhaps, but who's running a GTX 970 at such low clock speeds? Aftermarket GTX 970s are either equal to or faster than the R9 290X whilst drawing a lot less power.

As many have repeated, what is the purpose of your thread? Do you want people to help you pick the best 970/980 cards or do you intend to keep creating flame wars about NV/AMD in your thread? It doesn't seem like you are at all asking people to really help you pick fairly between 970 SLI / 980 SLI or a single 980 as a stop-gap. There should be NO discussion about AMD at all since everyone on our forum knows you are not interested in AMD products. So why do you keep talking about them in your own thread!?

Where did I mention AMD in my OP? Other people brought up AMD, not me. I'm merely replying..

The only AMD part I would consider buying is a 390X. But since that's not due out till summer, I can't wait that long. Witcher 3 will be out this May, and there's no way I'm missing that game..
 
Feb 19, 2009
10,457
10
76
You diss AMD's DX11 driver with COD at lowered settings when that game hammers GPUs on ultra, particularly at 1440p.. then you add insult by throwing in DAI, where Mantle provides faster & smoother gameplay? Starswarm isn't a benchmark where AMD cares about DX11; it was made as a Mantle showcase, get with the program.

You hate that your beloved company was not as loyal to you as you are to them, by lying about their hardware, selling you "low-fat" when you wanted "full-fat" products. That made you angry enough to lament to Gigabyte & Newegg for a refund, as far as "never to buy from them again!"... but you will still defend NV to the death, using the same propaganda that they fed in your koolaid.

Makes about as much sense as an abused wife who stays loyal.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Blaming the AIBs is pure BS of the highest level; I can't understand doing so even in the slightest. There is no possible way they could have avoided the situation except to have had extraordinary knowledge of the GPU (essentially calling Nvidia liars from the start) and refused to make cards with said chip.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Kind sir, if you like to drink koolaid from Jen Hsun, be my guest. However, do not speak about evil such as Gameworks lightly, or try to suggest that it is nothing serious. Gameworks is a set of closed libraries. You can't possibly optimise drivers without seeing how code is written in those libraries. Yet, you're more than happy to take a pi$$ on AMD and their cards for their performance in the Gameworks title? I was genuinely about to make suggestions on going 295x2, but then was reminded by someone of your post history, and that you're possibly not here asking open ended questions seeking help. For what it is worth, that suggestion seems very true.

Ignorance at its finest. Let me educate you a bit. First off, Gameworks isn't some nefarious contract between NVidia and developers wherein the developers sign over their souls in exchange for truly exceptional NVidia performance :rolleyes:

It's the developers' responsibility to make sure their product performs well on various hardware. And since so MANY of them are using Gameworks, one really can't assume that Gameworks automatically results in a performance drop for AMD, unless you're willing to believe that NVidia and game developers are in some illicit alliance with the intent to destroy AMD.

In fact, HBAO+ runs just as fast on AMD hardware as it does on NVidia. And even more, HBAO+ looks better, and runs faster on both AMD and NVidia hardware than AMD's own ambient occlusion tech called HDAO, which is horrible in comparison to HBAO+.

But speaking of Gameworks, I'm sure you have no problem with games adopting Mantle and being part of AMD's GamingEvolved right? :sneaky:

Clearly you suggest punishing Newegg/Gigabyte is in order, when Nvidia was the one lying about the specifications of their product. Then you proceed to parrot on about the new Titan, and how that would be a good buy. IMHO you lost all credibility when you posted a comparison between the 980 and 290 at 720p. 720p with those cards? Nobody with thousands of dollars to burn on GPUs is gaming at 1080p, let alone lower resolutions. With all due respect, you, kind sir, are merely having your jollies in this thread, and are doing little else.

Those graphs showed why I am right and you are wrong. CPU overhead matters, especially in multi-GPU setups.... end of story.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Ignorance at its finest. Let me educate you a bit. First off, Gameworks isn't some nefarious contract between NVidia and developers wherein the developers sign over their souls in exchange for truly exceptional NVidia performance :rolleyes:
You have no knowledge of what the code supplied by Nvidia is really all about; in fact the devs don't know themselves, so don't pretend you know otherwise. Also, we have really clear evidence that GW sabotages AMD hardware, so you can characterize what GW is any way you like; it doesn't change the results. Even worse, GW titles have in general been a hot mess on Nvidia hardware.

As for Mantle, it does nothing to cripple performance in D3D, zero. Zip.
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
I don't understand why you care about something like CPU overhead when actual achieved performance in the vast majority of games shows the 290X nipping at the heels of the 980. It just seems like you're picking a mostly moot metric as a selling point. Maybe we should start comparing box art as the next reason to choose a card.


I'm not in the know; can somebody tell me how GameWorks differs from TWIMTBP?
 

destrekor

Lifer
Nov 18, 2005
28,799
359
126
No, you're right in this instance. The driver team CAN optimize their drivers for games, the hardware and even the API. The drivers are the lowest level of software that interacts directly with the hardware, so the brunt of the optimization will take place there.

This is a direct contradiction of what you implied earlier, when you were talking about developers optimizing for architectures. That is only true for low-level APIs..

Just think, all of these recent PC ports like Dying Light, Dragon Age Inquisition, AC Unity etcetera were all developed on consoles, both of which have AMD GPUs. So if your theory was correct, then the optimizations the developers made for the GCN-based consoles should transfer to the PC, right?

Wrong. As I said, DX11 is specifically designed as an abstractive layer to preclude developers from having to program for several architectures at once..

He may have misspoken on terminology, but the concept is still spot on.

You acknowledge that APIs and optimization are occurring in the drivers, but forget about the APIs that these driver optimizations are made for.
Developers are not specifically coding for different architectures; however, they are, at times, utilizing code libraries which are only effectively optimized for by one manufacturer's driver, which in turn means there can be performance discrepancies if the API code is not shared between driver teams.

Why does this occur even with console ports that are developed on GCN architecture? Simple: it may be the same architecture, but the APIs are low-level. Nvidia worked with developers to create their GameWorks libraries so that they are optimized for the system architecture, which means Nvidia actually wrote their libraries for the GCN architecture, or at least the low-level APIs and drivers found in the console variants.

That doesn't carry over to the PC because those libraries aren't coded for by AMD in the PC driver. I feel pretty confident in guessing that, if these GameWorks games were also using Mantle, this issue wouldn't be as big, as AMD's experience with the low-level approaches in both the XBONE and PS4 likely contributes significantly to the API design of Mantle. However, I highly doubt you'll see a game that has both GameWorks features and offers Mantle.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You diss AMD's DX11 driver with COD at lowered settings when that game hammers GPUs on ultra, particularly at 1440p.. then you add insult by throwing in DAI, where Mantle provides faster & smoother gameplay? Starswarm isn't a benchmark where AMD cares about DX11; it was made as a Mantle showcase, get with the program.

OK, time for a recap. I stated in one of my posts that one reason why I would not consider AMD at this time is that their DX11 driver overhead is much higher than Nvidia's.

I get called out on it by several forumites, including yourself. I then proceed to post several graphs which show that exactly what I said was true..

And now here we are, with people making excuses defending AMD's atrocious DX11 overhead by stating that these issues would never show up because I'd be playing at GPU bound resolutions anyway....which conveniently ignores the fact that SLI/XFire is still CPU bound at 1440p in many titles. :rolleyes:

Next time Anandtech does a CPU review, you can criticize the editor for running the CPU gaming benchmarks at 720p instead of 1600p, since that's more realistic :sneaky:

You hate that your beloved company was not as loyal to you as you are to them, by lying about their hardware, selling you "low-fat" when you wanted "full-fat" products. That made you angry enough to lament to Gigabyte & Newegg for a refund, as far as "never to buy from them again!"... but you will still defend NV to the death, using the same propaganda that they fed in your koolaid.

Where have I defended NVidia? As I said before, NVidia will get its due, hopefully in the form of a lawsuit. However, I was never ripped off at any stage by NVidia. The GTX 970 is still an excellent GPU for the money; nothing has changed about that.

But I want the full monty. If I had known the GTX 970 was stripped down this much from the beginning, I would have gone with the 980 instead at launch.

You're making a mountain out of a molehill by using language such as "beloved." This is a purely logical exercise. Why should I pay money towards AMD, when the overall experience will be worse compared to what I will get with NVidia?

Answer me that.. :colbert:
 
Feb 19, 2009
10,457
10
76
And now here we are, with people making excuses defending AMD's atrocious DX11 overhead by stating that these issues would never show up because I'd be playing at GPU bound resolutions as well....which conveniently ignores the fact that SLI/XFire is still CPU bound at 1440p in many titles. :rolleyes:

You're making a mountain out of a molehill by using language such as "beloved." This is a purely logical exercise. Why should I pay money towards AMD, when the overall experience will be worse compared to what I will get with NVidia?

Answer me that.. :colbert:

Which recent titles are CPU bound at 1440p with ultra IQ settings? Find me some.

It seems you've made up your mind, you don't have many options.

1. Admit you made an error in being angry about the false advertising and get 2x 970 again.
2. Reward NV with more $ by getting 2x 980.
3. Buy 1x 970 and wait for Titan X where you will reward NV for pricing it at >$1,000 (word is they are aiming it at $1,300).
4. Rule out AMD because you run games at lowered settings on a stock CPU.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Blaming the AIBs is pure BS of the highest level; I can't understand doing so even in the slightest. There is no possible way they could have avoided the situation except to have had extraordinary knowledge of the GPU (essentially calling Nvidia liars from the start) and refused to make cards with said chip.

Who says I'm blaming the AiBs? Where are people getting this stuff from? D:
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Who says I'm blaming the AiBs? Where are people getting this stuff from? D:
You did.
First they took a week and a half to get back to me, and when they did, they denied me and tried to pass it off to the manufacturer, which in my case is Gigabyte.

Called Gigabyte yesterday, and their customer service is absolutely terrible. D: I'll never buy a Gigabyte product ever again, of that you can be assured. :thumbsdown:

So after getting the middle finger from Gigabyte,
You will never buy from Gigabyte again because of something Nvidia screwed up. Why is it the board partner's fault, and why is the onus on them to accept a return? Your cards were not broken, were they?
 

Pneumothorax

Golden Member
Nov 4, 2002
1,182
23
81
Who says I'm blaming the AiBs? Where are people getting this stuff from? D:

Ummm, maybe read the start of this thread, where you get angry at Gigabyte for not giving you a refund when they correctly lay the blame at Nvidia's feet, the same Nvidia you will now reward by buying a $500-$1300 card.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I don't understand why you care about something like CPU overhead when actual achieved performance in the vast majority of games shows the 290X nipping at the heels of the 980. It just seems like you're picking a mostly moot metric as a selling point. Maybe we should start comparing box art as the next reason to choose a card.

And I don't understand why you keep glossing over CPU overhead as though it's some irrelevant stat.. The GPU can't deliver good performance without the CPU, and this is doubly so for multi-GPU, which is what I will be using.

Also, the latest TPU roundup shows the GTX 980 with a solid lead over the R9 290X at all resolutions including 4K, using reference clock speeds. With AiB parts, the lead will be greater, with less power draw to boot.

[Image: TechPowerUp relative performance chart at 2560x1440 (perfrel_2560.gif)]


I'm not in the know; can somebody tell me how GameWorks differs from TWIMTBP?

They're essentially the same thing; Gameworks is the TWIMTBP program with a new name..
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,992
1,284
126
So let me see if I understand. If Newegg didn't give you a refund they would lose your business forever? Gigabyte has lost your business forever? Both of these because of nVidia deceiving you about the specs on the 970? Now you are looking at buying nVidia again? If anyone has caused you this grief and deserves to be shunned over it, surely it's nVidia, isn't it? :confused:

Boom, headshot.:D
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
He may have misspoken on terminology, but the concept is still spot on.

You acknowledge that APIs and optimization are occurring in the drivers, but forget about the APIs that these driver optimizations are made for.
Developers are not specifically coding for different architectures; however, they are, at times, utilizing code libraries which are only effectively optimized for by one manufacturer's driver, which in turn means there can be performance discrepancies if the API code is not shared between driver teams.

What APIs are you talking about? The only rendering APIs I know of are Direct3D, Mantle and OpenGL, and out of all three of them, only Mantle is closed..

I think you're talking about shaders. NVidia's shader library is closed to AMD, but that doesn't automatically mean AMD can't optimize for them. Apparently, the driver teams for both AMD and NVidia optimize their drivers for most games without even seeing the source code.
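
To illustrate why that is even possible, here is a minimal sketch (an illustration only, assuming a Windows toolchain with d3dcompiler.lib; the one-line shader is a made-up example): what the driver receives from a game is compiled DXBC bytecode, not HLSL source, so a driver team can analyze and re-optimize the shaders a game actually submits without ever seeing the developer's code.

// Minimal sketch: compiling HLSL to DXBC bytecode with D3DCompile. In a real
// game this blob (or one built offline) is what ID3D11Device::CreatePixelShader
// hands to the vendor driver, which then compiles it to its own ISA; that is
// the level at which AMD/NVidia driver teams do their per-game optimization.
#include <d3dcompiler.h>
#include <cstdio>
#include <cstring>

int main() {
    // A deliberately trivial, made-up pixel shader: outputs solid red.
    const char* hlsl =
        "float4 main() : SV_Target { return float4(1.0, 0.0, 0.0, 1.0); }";

    ID3DBlob* bytecode = nullptr;
    ID3DBlob* errors   = nullptr;

    HRESULT hr = D3DCompile(hlsl, std::strlen(hlsl), nullptr, nullptr, nullptr,
                            "main", "ps_5_0", 0, 0, &bytecode, &errors);

    if (SUCCEEDED(hr)) {
        std::printf("DXBC blob: %zu bytes\n", bytecode->GetBufferSize());
        bytecode->Release();
    } else if (errors) {
        std::printf("%s\n", static_cast<const char*>(errors->GetBufferPointer()));
    }
    if (errors) errors->Release();
    return 0;
}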

Why does this occur even with console ports that are developed on GCN architecture? Simple: it may be the same architecture, but the APIs are low-level. Nvidia worked with developers to create their GameWorks libraries so that they are optimized for the system architecture, which means Nvidia actually wrote their libraries for the GCN architecture, or at least the low-level APIs and drivers found in the console variants.

Even if this were the case, it doesn't explain why so many developers are gung-ho about using Gameworks. Unreal Engine 4, for instance, has Gameworks stuff embedded deep in the engine.

If Gameworks is such a threat to AMD hardware performance, why are so many developers using it? So unless you buy into some conspiracy theory, it doesn't add up.

And if AMD is perceiving it as such a big problem, why not just develop their own shader library?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
You did.

You will never buy from Gigabyte again because of something Nvidia screwed up. Why is it the board partner's fault, and why is the onus on them to accept a return? Your cards were not broken, were they?

To me it read as if Gigabyte was ignorant and incompetent (I have had problems getting motherboard issues cleared up by them before). Not pissed because they wouldn't refund what you say is Nvidia's screw-up, but because they are idiots.

If I read that incorrectly I want Carfax83 (not anyone else) to say so.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
What APIs are you talking about? The only rendering APIs I know of are Direct3D, Mantle and OpenGL, and out of all three of them, only Mantle is closed..

I think you're talking about shaders. NVidia's shader library is closed to AMD, but that doesn't automatically mean AMD can't optimize for them. Apparently, the driver teams for both AMD and NVidia optimize their drivers for most games without even seeing the source code.



Even if this were the case, it doesn't explain why so many developers are gung-ho about using Gameworks. Unreal Engine 4, for instance, has Gameworks stuff embedded deep in the engine.

If Gameworks is such a threat to AMD hardware performance, why are so many developers using it? So unless you buy into some conspiracy theory, it doesn't add up.

And if AMD is perceiving it as such a big problem, why not just develop their own shader library?

What does any of this have to do with why you started this thread?

You never had any intention of seeking or taking the advice of any other members here, did you? You did this for no other reason than to incite another one of your flame wars, didn't you?

You're out of here. Get lost.

-- stahlhart
 
Status
Not open for further replies.