At least get your facts straight before you accuse TPU of bias... :thumbsdown:
In the TPU review, there are eight AMD Gaming Evolved titles:
1) Alien Isolation
2) Tomb Raider
3) Crysis 3
4) Bioshock Infinite
5) Civilization Beyond Earth
6) Dragon Age Inquisition
7) GTA V (Co-sponsored with NVidia)
8) Battlefield 4
For NVidia's Gameworks, there are eight as well:
1) AC Unity
2) Batman Arkham Origins
3) Far Cry 4
4) Witcher 3
5) Metro Last Light
6) Project Cars
7) Watch Dogs
8) GTA V (Co-sponsored with AMD)
9) CoD Advanced Warfare (Checked, and this isn't a GW title. CoD Ghosts is, but Advanced Warfare isn't.)
Yeah, that's so many.
Anyway, whether a game has GameWorks or Gaming Evolved does not determine final performance or give the sponsor an advantage or their competitor a disadvantage. There are GameWorks titles where AMD is ahead of NVidia, and Gaming Evolved titles where NVidia is leading AMD.
NVidia are constantly tweaking and optimizing their drivers. A review using a driver that is two or three revisions out of date is understandable, but TPU uses drivers that are months old. The driver they used in that review came out in June.
A game being a GE title means nothing much beyond the fact that AMD helped optimize the game. AMD doesn't do the same things NVidia does. Currently, their involvement does not seem to result in unfairly skewed framerates.
I would get the Fury Tri-X, one of the best air-cooled cards ever made.
In the skewed-result games I posted, the 970 beat the Fury X! In Project Cars it wins by double digits, in both FPS and average. Find me GE games in W1zzard's suite where the 390 beats the 980 Ti, which would be the equivalent. Then you can claim that GE titles are unfairly skewed like GW.
Cool and quiet was so last year.
You will hear no nVidia supporters wanting to talk noise when comparing Fury cards. It no longer matters. You also won't see them wanting to use a reference card ever. They know the old Titan cooler is hot and throttles. They won't talk about it though. And if anyone else does they'll play the custom cooler card unless you bring up Hawaii. Then, even a year and a half later they'll tell you it's hot and loud.
This is probably sound advice I will follow. What I should have said when I made the post is that I will naturally wait for a good price (a sale or whatever), hence the reason I asked which card to get if prices are the same. I plan to make the purchase between tomorrow and Black Friday.
You really are quite misinformed. GE and GW are basically the same type of programs with the same intent, only GW is much more successful.
Both programs try to aid developers when it comes to optimizing their games for the vendor's hardware, and both programs try to aid developers when it comes to implementing vendor/PC-specific technologies into their games.
The notion that GW is somehow unfair to AMD flies in the face of reality and fact. The only real performance advantage that manifests due to GW or GE is that the vendor has a head start when it comes to optimizing their drivers for a particular game.
PC games are not optimized for specific architectures, because they use abstract APIs.
I did not say "for their hardware", nor did I say vendor-specific technologies. That is what GameWorks does. It really can't even be the same, because if AMD puts something in a game there is a good chance the code is openly available.
Because the code is open, the improvements AMD contributes are effective for everybody, and there is no lockout of NVidia.
Ultimately we get a situation where GE titles do not result in the same effects as GW titles, thus those charts being shown here need to be looked at in that light.
Except we have examples like DiRT Showdown, which changed the lighting to use DirectCompute at a time when NVidia cards weren't good at DirectCompute performance. The lighting did not look better and wasn't any more impressive than the previous game, which did not use DirectCompute. That was a GE title. That's not really different from adding HairWorks or something: it can be used on any hardware but has a performance advantage on certain hardware. Like it or not, it has been done by both camps.
DiRT: Showdown is something of a divisive game for benchmarking. The game’s advanced lighting system, while not developed by AMD, does implement a lot of the key concepts they popularized with their Leo forward lighting tech demo. As a result performance with that lighting system turned on has been known to greatly favor AMD cards. With that said, since we’re looking at high-end cards there’s really little reason not to be testing with it turned on since even a slow card can keep up. That said, this is why we also test DiRT with advanced lighting both on and off starting at 1920x1080 Ultra.
As of a recent patch, two more have been added: Advanced Lighting and Global Illumination. The former you may recognize from AMD's “Leo” or “Forward+” demo; it offers “genuine” dynamic lighting with the assistance of DirectCompute instead of hacking it with 2D glows. Global Illumination also utilizes DirectCompute, though in this case, it's to intelligently simulate reflected light on all surfaces in a given scene for a more attractive and realistic look.
In practice, Advanced Lighting makes a significant difference, noticeably brightening up and “fleshing out” any given scene. You might say what Ambient Occlusion offers with shadows, Advanced Lighting offers with lighting. At times, though, it seems to go too far, making parts of your vehicle, for example, so bright you can't fully see them.
It is kinda messed up, right? A guy looking for advice on his new purchase, and he has to wade through the BS just to find a few pieces of genuine advice.
OP, the only advice I can give you is don't listen to the video card company fans. There are several offering you advice and it's not reliable, because they have LOVE for their preferred brand. They are very obvious in this thread.
http://www.anandtech.com/show/6774/nvidias-geforce-gtx-titan-part-2-titans-performance-unveiled/6
The developers thought there was some benefit to be had, and it happened to show some weaknesses in hardware. In, e.g., The Witcher 3, the developers didn't even write the code, nor did the excess tessellation have a benefit.
Also...
http://www.rage3d.com/articles/gaming/codemaster_dirt_showdown_tech_review/index.php?p=2
Not an insignificant effect
If AMD wants to make their gaming technologies open source, more power to them. But NVidia has no obligation whatsoever to follow suit.
A good example of this is Mantle. AMD spent millions of dollars to develop it and said it would be open sourced, but it never came to be... until DX12 stole the limelight, the podium, and the whole damn show.
Now Mantle is going to be the foundation of Vulkan, an open-source API. So basically AMD wasted their money for nothing.
And what light is that? Just because your GPU vendor isn't winning, you feel compelled to come up with excuses as to its performance (or lack thereof) without even understanding the full picture.
I said it on the previous page, and I'll say it again: GE or GW has no bearing on the final performance of a game on a vendor's hardware. We've seen it time and time again.
Metro 2033 and Metro Last Light ran better on AMD hardware when they first came out despite being GW titles. Far Cry 3 was faster on AMD hardware when it first came out, but after NVidia started optimizing its drivers, they gained the edge.
The only performance benefit that GW and GE provide is early access to the code, so that IHVs can start optimizing their drivers for the game.
I don't care about noise levels personally. If the performance is right it has to be pretty loud for me to care about that. I've always been like that because I play my games with the sound up so I don't hear it.
Is the AMD or NVidia driver more stable (i.e., no crashing)? Also, what is this Fury Nano and its release date? Last but not least, the MSI 390 G8 looks good (short enough to fit my case); the only negative I've seen with the 390 compared to the 980 and Fury is that it runs bloody hot.
I do want to stress that 1440p is about as high as I will go with regard to resolution until there is a 24-inch high-resolution screen. My general view (for gaming) is that 24-inch 1920x1200 is preferred, but I'm happy with my 27-inch. I will not be going larger, though I could see a 27-inch 1600 monitor if one were available at a good price.
What does this have to do with TPU's review?
What about throttling? I assume you care about that?
"Is the AMD or NVidia driver more stable (i.e., no crashing)?" Both companies currently produce reliable drivers.
"What is this Fury Nano and its release date?" Some time this month.
As an HD 7950 owner, I have zero issues with drivers crashing.
http://www.bjorn3d.com/2015/06/msi-r9-390-gaming-8g-amd-300-series-with-a-custom-kick-from-msi/4/
As the R9 390 is the PERFORMANCE PER DOLLAR KING, there is a REASON it gets recommended, you2. It's an all-around great card that has been revamped (the R9 290 was the previous performance-per-dollar king). What I mean is, we've talked about a LOT of cards in this thread, and it's the R9 390 that sits alone as a card most people will have a hard time saying is a bad choice.
The other thing to pay attention to, you2, is that these cards we're all talking about will all be obsolete next year. With the flurry of hardware improvements over today's cards coming next year, it is in your BEST interest to purchase the cheapest, fastest card now, and then sell it and buy something next year or the year afterwards, rather than investing a HUGE amount now and being upset next year when a $300 GPU makes your GPU look like an entry-level piece of trash.
