RMA approved for $740 store credit. Now what?


Meghan54

Lifer
Oct 18, 2009
11,684
5,228
136
290x crossfire isn't an option because:

1) Power consumption

2) Very high driver overhead and CPU scaling issues

3) Crossfire compatibility is worse than SLI's

If it weren't for these three things, I would go with AMD in an instant.


Yo, straight up, that's the funniest misinformed diatribe I've seen in years. I was crying right alongside that guy in the "Nvidia engineer" YouTube video.
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
Why are you people replying with suggestions?

This is a troll thread started by someone well known in the forums for trolling. If this individual were truly looking for reasonable suggestions, he/she would already be asking which R9 295X2 to buy... but no, all he/she can provide are weak excuses for why he/she wants to stay with the ones that lied to him/her.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Seriously dude, making statements like "AMD has horrible DX11 drivers and massive CPU overhead" doesn't align with reality when we pair a high-end AMD or NV card with an overclocked i5/i7. So stop stating this as gospel when it doesn't matter for most of us.

AMD's performance vs NVidia's in Call of Duty Advanced Warfare:

[gamegpu.ru CPU-scaling charts for Call of Duty Advanced Warfare: cod_proz_amd.jpg, cod_proz_intel.jpg]


A 2600K with a GTX 980 is faster than a 5960X with an R9 290X at CPU-bound resolutions. :)

Or let's look at a Gaming Evolved title like Dragon Age Inquisition:

[gamegpu.ru CPU-scaling charts for Dragon Age Inquisition: DragonAgeInquisition_proz_amd.jpg, DragonAgeInquisition_proz_proz.jpg, DragonAgeInquisition_proz_mantle.jpg]


The DX11 path on AMD hardware is significantly slower than the DX11 path on NVidia in CPU-limited circumstances. Only with Mantle does AMD manage to catch up with, and actually outperform, NVidia, and Mantle is a far lower-level API than DX11.
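
Back-of-the-envelope sketch of what that "driver overhead" argument means once a game is CPU-bound; the draw-call count and per-call CPU costs below are made-up placeholders, not measurements of either vendor's driver:

```python
# Toy model: when a game is CPU-bound, the frame rate ceiling is set by how much
# CPU time gets burned submitting draw calls each frame. All numbers are hypothetical.

DRAW_CALLS_PER_FRAME = 2000            # assumed scene complexity

per_draw_cost_us = {
    "higher-overhead driver": 25,      # hypothetical microseconds of CPU per draw call
    "lower-overhead driver": 10,
}

for name, cost_us in per_draw_cost_us.items():
    cpu_ms = DRAW_CALLS_PER_FRAME * cost_us / 1000.0
    print(f"{name}: {cpu_ms:.1f} ms of submission work per frame "
          f"-> CPU-bound ceiling of about {1000.0 / cpu_ms:.0f} fps")
```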

And then of course there's the infamous Star Swarm demo, which was created specifically to test API bottlenecks:

[Star Swarm API overhead charts: 71448.png, 71449.png]


Quite laughable. But of course no matter how many times these benchmarks are posted you'll still have your head stuck in the sand. :thumbsdown:
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
I too look at games with Mantle support for examples of how well AMD optimizes for DX11.
 

Lil'John

Senior member
Dec 28, 2013
301
33
91
Thank you for the update, and in a separate thread. :)

I'm glad to hear Newegg finally took care of you. I'm not glad to hear your Gigabyte response.

I'm in the exact same GPU situation (SLI Gigabyte), but my retailer said, "Sure, send 'em back." However, I don't have the original boxes, because the cards had been working fine under light loads for the last 3 months.

I'm waiting to hear back from my retailer about the missing box issue :(

And I agree with your attitude... customer service is highly important to me as well.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
^ Even if 290 CF AMD cards were paired with a Skylake-E @ 10GHz, they would not beat 980 SLI with a Core i7 4930K in AC Unity, and you know why that is - the game heavily favours Maxwell cards. No need to bring up CPU overhead when I already linked above that your theory doesn't work for highly overclocked i5/i7 systems.

You have no idea what you're talking about, RS. I've told you several times that PC game optimization does not specifically take architecture into account, because game developers are programming for DX11, which is deliberately designed to be abstract enough to keep developers from having to use low-level programming.

Only with low-level APIs like Mantle and DX12 can developers actually use architecture-specific enhancements.

So blaming AC Unity's poor CPU scaling performance with AMD hardware on GameWorks, or on it supposedly favoring Maxwell, just smacks of ignorance.

Instead, you decided to go ahead with a power usage, perf/watt, drivers, and CPU overhead discussion which is just going to create lengthy arguments that won't change your mind. So what's the point, other than to start arguments? Again, believe me when I say that if you state your preference for specific NV games that simply run faster on NV hardware, PC gamers would be A LOT more receptive to your brand preference, because that's the most logical argument of all. You buy PC gaming hardware for games, and if those specific games run faster on your cards, well, the choice is clear.

You make several false statements here. First off, my reasons for not going with AMD as they relate to power consumption and CPU overhead are all perfectly valid, as the evidence has shown.

Secondly, as I've repeatedly pointed out to you, there is no such thing as "specific NV games that simply run faster on NV hardware," or even specific AMD games that simply run faster on AMD hardware. Unless low-level APIs are involved, that is a false argument.

Also, since when does merely having opinions imply some ulterior motive to start arguments that would devolve into flame wars?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Quite laughable. But of course no matter how many times these benchmarks are posted you'll still have your head stuck in the sand. :thumbsdown:

Right, and now let's look at the performance in 99% of those games you linked with MSAA, highest quality settings, etc. Give me a break. If the 290X's DX11 performance were so awful, it wouldn't be within 10% of the 980 in TPU's latest benches while outperforming the 780 Ti.

And using the Star Swarm tech demo as any indication of AMD's DX11 overhead in real-world games? Wow, talk about grasping at straws.

Again, based on your examples one would think that 980 SLI would be wiping the floor with the 295X2 in games, but there is no such thing. Look, it's not my money, it's your $. As many posters have already told you, get 970 SLI or 980 SLI and stop asking us for unbiased advice.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
But the 970 is still a great card, and a great buy. It was priced extremely well for what it offered so it wasn't like I was being ripped off. If it had the 4GB of full speed memory, 64 ROPs and 2MB of L2 cache like originally advertised, it would have been priced higher I'd think...

It's like buying a Lamborghini, but the speed limit will only let you drive 75 mph. All the horsepower is meaningless....but you want those horses to show off!

I understand why some may feel slighted, but this whole thread is a trip. It should be fully apparent that the whole intention was to try to get people to switch from Nvidia to AMD, as if AMD has never done anything shady in the past.

I have no doubt Nvidia misrepresented the memory as well, but by the time you get to memory usage greater than 3.5GB, the fps drops to the point of being unplayable due to the high resolution and settings required to reach that amount... so, overall, it is a non-factor. People buy new GPUs all the time. By the time a true 4GB is actually needed, newer GPUs capable of handling those resolutions will be what people want, not the 970.

So, all the while people are crying away... the FPS from FRAPS hasn't changed one bit. The card performs just the same, and people raved about it after launch when they got it.

And here you talk about still going with 970 SLI (and others have suggested it as well) after the refund....LOL!

SMDH... Faux rage at its finest.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Let's admit it...

Right now most of us hate Nvidia, but we would STILL get a 980 over any other card because we still believe that the 980 is "the best card" at the moment.

(Sad) fact, sort-of...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Only with low-level APIs like Mantle and DX12 can developers actually use architecture-specific enhancements.

Why do I even bother? So all those times in the past when ATI or NV cards ran faster in DX7/8/9.1/10/11, it was all just a random fluke and had nothing to do with architecture-specific optimizations made by ATI/NV's driver teams? All those GE/GW titles where AMD/NV tried to get better performance for specific features like Global Illumination, SSAA via DirectCompute, HBAO+, and PCSS+, all done in DX11, had nothing to do with catering to the specific architectures of said brands? The R9 290X pulling away from Kepler and closing in on Maxwell while Kepler falls further behind Maxwell - all of that is just a random fluke and has nothing to do with drivers from AMD/NV?

Also, when console ports that were made to cater to the GCN architecture come to the PC, the x86 code in DX11 does not translate at all into any advantage for AMD's GCN hardware? Got it. I learned a lot today.

Your CPU argument is going nowhere, because HardOCP has already tested 290X CF against 970 SLI and 980 SLI for real-world gameplay smoothness at 1440p and above.

Also, how do you keep missing the part about the overclocked i5/i7 systems a lot of us have, yet use a stock i7-2600K at 3.4GHz as an example to prove your point about CPU bottlenecks? Jeez.

Like I said, even in AC Unity you had to run FXAA on 970 SLI, so the last thing anyone needs to worry about with a 295X2/970 SLI or faster is CPU bottlenecks in modern games at 1440p and above on a 4.4GHz+ Core i5/i7 system. We even have users here with max-overclocked 980 SLI who report almost 0% increase in performance going from a 4.0GHz 5960X to a 4.4GHz 5960X in games. This is not some coincidence; it's because most games are GPU-limited today.
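
A toy model makes the same point (the millisecond figures here are illustrative assumptions, not anyone's benchmark results):

```python
# Toy model of a GPU-limited game: with CPU and GPU work overlapped, frame time
# is roughly max(cpu_ms, gpu_ms), so shaving CPU time changes nothing once the
# GPU is the bottleneck. All numbers are made up for illustration.

def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 16.0                               # assumed GPU-bound frame at 1440p
print(fps(cpu_ms=9.0, gpu_ms=gpu_ms))       # "4.0GHz" CPU: 62.5 fps
print(fps(cpu_ms=8.2, gpu_ms=gpu_ms))       # "4.4GHz" CPU: still 62.5 fps
```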

It's remarkable that you keep claiming CPU bottlenecks and horrible AMD DX11 driver overhead, but after countless games tested, the 290X is now faster than the GTX 970 at higher resolutions, and the same goes for the 295X2 vs. 970 SLI.

As many have repeated, what is the purpose of your thread? Do you want people to help you pick the best 970/980 cards or do you intend to keep creating flame wars about NV/AMD in your thread? It doesn't seem like you are at all asking people to really help you pick fairly between 970 SLI / 980 SLI or a single 980 as a stop-gap. There should be NO discussion about AMD at all since everyone on our forum knows you are not interested in AMD products. So why do you keep talking about them in your own thread!?
 

lavaheadache

Diamond Member
Jan 28, 2005
6,893
14
81
Let's admit it...

Right now most of us hate Nvidia, but we would STILL get a 980 over any other card because we still believe that the 980 is "the best card" at the moment.

(Sad) fact, sort-of...

Define best.

There are more powerful cards to pick from. A 295X2 is similar in price to aftermarket 980s.
 

96Firebird

Diamond Member
Nov 8, 2010
5,746
342
126
and nothing to do with architecture-specific optimizations made by ATI/NV's driver teams.

Looks like he is talking about developers using architecture-specific optimizations, while you're talking about driver teams using architecture-specific optimizations.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Let's admit it...

Right now most of us hate Nvidia, but we would STILL get a 980 over any other card because we still believe that the 980 is "the best card" at the moment.

(Sad) fact, sort-of...

If $ is no factor to you at all, sure: 980, 980 SLI, 980 Tri-SLI. Which is why no one in this thread can understand the OP. He got a refund for his 970 SLI and has 0 interest in buying any AMD card, which means the ONLY logical upgrade is 980 SLI, 980 Tri-SLI, or... going back and re-buying 970 SLI.

Since he wasn't happy with 970 SLI, what exactly does he expect people to recommend to him?
 

HurleyBird

Platinum Member
Apr 22, 2003
2,815
1,552
136
Honestly if you have a decent secondary card I'd recommend waiting for a single GM200 or 390X.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Actually that CPU bench doesn't show what you think it shows.

[Benchmark chart: endVPhU.jpg]


Look at the 4770K at 2.5GHz running 4 cores: the R9 290X and 980 get EXACTLY the same score.

You took two isolated examples and ignored everything else. I'm betting that has to do with the extremely low clock speed. Also, AMD's drivers use two cores just as well as NVidia's; it's when you go above two cores that scaling becomes an issue. Looking at everything else though, it's obvious that what I said is true, the most poignant example being the FX-8370, an octa-core CPU with low IPC that depends on multithreading for good performance.

Because ACU is a GameWorks title, the 980 is about 20-25% faster than the R9 290X (which tops out at ~43 fps), so it ends up faster at normal CPU speeds.

People need to stop spreading FUD about GameWorks. GameWorks has nothing to do with why NVidia is faster than AMD in AC Unity. GameWorks only contributes HBAO+ and TXAA in AC Unity, and that's it.

GameWorks also cannot affect CPU usage. CPU usage is directly tied to the game engine, and the DX11 driver.

But ultimately, if you play mostly GameWorks titles, then sure, it's best to go with NV. But your options are limited. You don't want to waste much $ because you want the Titan X, so anything now is a temporary solution. So go for the cheapest bang-for-the-buck option that you can resell without losing a massive premium.

I want the Titan X, but I don't want to have to wait months to get it. And I know the Titan X is going to be much slower than GTX 980 SLI no matter what, and it might cost more as well.

So I'm still kind of debating that, to be honest.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I want the Titan X, but I don't want to have to wait months to get it. And I know the Titan X is going to be much slower than GTX 980 SLI no matter what, and it might cost more as well.

So I'm still kind of debating that, to be honest.

You bought GTX 770 4GB SLI for $900 and sold them for $550-600 or so, losing about $300-350 in, what, 12 months of owning them? What's the problem with buying 980 SLI for $1100, using them for 6-8 months, and selling them for $750-800? Let's even go further. You had 580 3GB SLI and skipped 680 SLI entirely, waiting until the 770 4GB, which is probably at least 2 years of gaming on the 580s. Based on the fact that Titan X/GM200 won't beat 980 SLI @ 1.5GHz, what's stopping you from using 980 SLI all the way until Q4 2016/Q1 2017 and upgrading to 14nm Pascal?

Finally, if you decide to go GM200, it probably won't cost $1100. That means even if you resell 980 SLI for $800, you'll likely be able to get the flagship GTX1080 for $900 anyway.
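
Rough version of that depreciation math, using the approximate prices quoted above rather than actual market data:

```python
# Back-of-the-envelope cost of ownership: (purchase price - resale price) / months kept.
# Prices are the approximate figures from the post, with midpoints taken for the ranges.

def cost_per_month(buy, sell, months):
    return (buy - sell) / months

print(cost_per_month(buy=900, sell=575, months=12))    # 770 4GB SLI kept ~12 months: ~$27/month
print(cost_per_month(buy=1100, sell=775, months=7))    # 980 SLI kept ~6-8 months: ~$46/month
```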
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
all of these points are grossly exaggerated.

1. yes the Hawaii cards consume more power but really what does that equate to overall?

More heat. Less backup time on my UPS.
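
Rough illustration of the UPS point; the wattages and usable UPS capacity are assumed round numbers, not measurements of any particular card or unit:

```python
# Runtime on battery scales inversely with load, so a higher-draw GPU setup
# shortens backup time. All figures below are assumed for illustration.

USABLE_WH = 150.0                       # assumed usable energy of a consumer UPS

def runtime_minutes(load_watts):
    return USABLE_WH / load_watts * 60.0

print(runtime_minutes(450))   # assumed load with the lower-power setup: ~20 min
print(runtime_minutes(650))   # assumed load with the higher-power setup: ~14 min
```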

Show me 1 result where AMD's high driver overhead and CPU scaling is going to be even remotely noticeable unless you're looking at raw fps counters.

I posted several examples above. Frame rate is directly affected by this, there's no mistaking it. So unless you're arguing that frame rate is an unimportant metric, I don't see what you're getting at here.

3. Multi-GPU compatibility is at the very least equal, and if anything it leans the other direction from what you state. I have extensive experience with both vendors' current iterations, and in my experience AMD has the edge.

It's admittedly been a long time since I tried Crossfire, or even AMD. The last AMD hardware I had were two 4870s back in the day, and my experience was terrible! So terrible I ended up returning them.

One card wouldn't idle down, there were flickering problems in several games, textures not showing up. I'm sure AMD's Crossfire support has grown by leaps and bounds since then, but it still left a really bad taste in my mouth. Turned me off AMD completely.

With SLI, I know what I'm getting because I've used it for years.

I have played with 3-way 290X, 295X2, 780 SLI, 780 Ti SLI, and Titan Z SLI (what a friggin' joke), and the Crossfire systems scaled better and have had better support.

You're one of the few people I'm aware of who says Crossfire is actually better than SLI. Technology-wise, Crossfire is probably superior due to XDMA.

But support-wise? Does Far Cry 4 have a Crossfire profile yet? Or Dying Light? Or Evolve?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Carfax, you have the perfect opportunity to try out 290 XDMA Crossfire for cheap. Open your mind and see what the alternative is; who knows, maybe you'll even find out that the "issues" you speak of aren't as big as you've made them out to be. In either case you plan to upgrade by summer anyway (assuming the 390X/980 Ti hit as rumored). You'd have to be extremely dedicated to NV to buy a 980 with 290/290X's at current prices, and if you are talking about a stopgap solution anyway, why not give it a shot? Since you returned the 970s based on the lies and lack of concern from NV, it makes no sense to go for the absurdly overpriced 980.

I can't wait until summer to upgrade, because my most anticipated game of all time (The Witcher 3) is due out in May of this year. I must have the hardware to play that game at launch, 1440p maxed out minus supersampling.

And spec-wise the 980 is definitely overpriced, but because it's a halo card and AMD has nothing out at the moment to compete with it, NVidia can sell it at a higher price.

When the 390 and 390X drop, the 980's price should definitely come down, especially if they put the smackdown on it performance-wise.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
ITT Anandtech users try to sell Carfax something he would never purchase.
I commend the people that tried, but some people are 100% brand-attached, so it doesn't matter what you say. Nothing wrong with that, but I don't understand the point of this thread. The OP is either going to buy something Nvidia now or something Nvidia in a few months.

Not much room for discussion.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I have no doubt Nvidia misrepresented the memory as well, but by the time you get to memory usage greater than 3.5GB, the fps drops to the point of being unplayable due to the high resolution and settings required to reach that amount... so, overall, it is a non-factor. People buy new GPUs all the time. By the time a true 4GB is actually needed, newer GPUs capable of handling those resolutions will be what people want, not the 970.

It takes a lot to make the GTX 970 bottom out due to VRAM, I agree. You basically have to be trying to do just that.

But frame latency graphs definitely show more frame time spikes for the GTX 970 with VRAM usage above 3.5GB. The card manages its VRAM extremely well, all things considered though.

The only things I've noticed are an odd pause or stutter here and there in VRAM-intensive games like AC Unity. In the past I attributed this to SLI, but it was probably due to the VRAM issue. Most of the VRAM is being used for cache, so having a segmented VRAM setup means both more, and slower, asset swapping.
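
For rough scale, using the bandwidth figures widely reported for the 970's split at the time (about 196 GB/s for the 3.5GB partition and about 28 GB/s for the 0.5GB partition, treated here as approximations):

```python
# Approximate time to swap one asset block depending on which VRAM partition it
# lands in. Bandwidths are the commonly reported GTX 970 figures (approximate);
# the asset size is a hypothetical example.

FAST_GBPS = 196.0    # ~3.5 GB partition, 7 of 8 memory controllers
SLOW_GBPS = 28.0     # ~0.5 GB partition, remaining controller

asset_gb = 0.25      # hypothetical 256 MB texture/asset block being swapped

print(asset_gb / FAST_GBPS * 1000)   # ~1.3 ms in the fast partition
print(asset_gb / SLOW_GBPS * 1000)   # ~8.9 ms if it spills into the slow partition
```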

Anyway, the main reason I'm returning the 970s is that, had I known of the differing specs, I would have just gone with the 980s from the beginning.
 

garagisti

Senior member
Aug 7, 2007
592
7
81
You took two isolated examples and ignored everything else. I'm betting that has to do with the extremely low clock speed. Also, AMD's drivers use two cores just as well as NVidia's; it's when you go above two cores that scaling becomes an issue. Looking at everything else though, it's obvious that what I said is true, the most poignant example being the FX-8370, an octa-core CPU with low IPC that depends on multithreading for good performance.



People need to stop spreading FUD about GameWorks. GameWorks has nothing to do with why NVidia is faster than AMD in AC Unity. GameWorks only contributes HBAO+ and TXAA in AC Unity, and that's it.

GameWorks also cannot affect CPU usage. CPU usage is directly tied to the game engine, and the DX11 driver.



I want the Titan X, but I don't want to have to wait months to get it. And I know the Titan X is going to be much slower than GTX 980 SLI no matter what, and it might cost more as well.

So I'm still kind of debating that, to be honest.
Kind sir, if you like to drink the Kool-Aid from Jen-Hsun, be my guest. However, do not speak about an evil such as GameWorks lightly, or try to suggest that it is nothing serious. GameWorks is a set of closed libraries. You can't possibly optimise drivers without seeing how the code is written in those libraries. Yet you're more than happy to take a pi$$ on AMD and their cards for their performance in a GameWorks title? I was genuinely about to make suggestions on going with a 295X2, but then someone reminded me of your post history, and that you're possibly not here asking open-ended questions seeking help. For what it is worth, that suggestion seems very true.

Clearly you suggest punishing Newegg/Gigabyte is in order, when Nvidia was the one that lied about the specifications of its product. Then you proceed to parrot on about the new Titan, and how that would be a good buy. IMHO you lost all credibility when you posted a comparison between the 980 and 290 at 720p. 720p with those cards? Nobody with thousands of dollars to burn on GPUs is gaming at 1080p, let alone lower resolutions. With all due respect, you, kind sir, are merely having your jollies in this thread, and are doing little else.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
To think I actually bit.

Clearly there is no question: get your 980s and be done. Why the thread when there are no alternatives? You're clearly very upset about the 970 lies.
 

Pneumothorax

Golden Member
Nov 4, 2002
1,182
23
81
I went from a 970 SLI setup to a 290X CF setup (I even had enough money left over to upgrade my son's rig!) and I'm perfectly happy playing my current faves of DAI and SoM. Since I love playing at maxed-out settings, my 290X CF setup is definitely faster than my 970 SLI was. I had my MSI 970s o/c'd to a 1450 boost, too!

I don't know why you're fixated on UPS power ratings. Do you seriously plan on gaming while on UPS backup? My UPS's only purpose is to let me save my game and quit. Besides, in my area a blackout = no internet, so online games are out anyway.

Like you, I was going to go full stupid and send Nvidia more money by going 980 SLI, but after seeing how their reps went back and edited their 'promises' and stopped helping people, that made up my mind.

I don't like being the 'battered spouse' in this relationship.
 