thesmokingman
Platinum Member
I was referring to the quote about it being mainly a driver issue.
Huh? Driver issue, whatever the issue. How can you believe what they say, given what I posted? Do they expect their users to be that freaking stupid?
I never said I believed anyone. I was responding to a post that claimed there were only 2 explanations, neither of which was the company's own explanation. I was asking if there was any factual basis behind this.
You say perhaps they were telling the truth. What's your point then??
I never said I believed anyone. I was responding to a post that claimed there were only 2 explanations, neither of which was the company's own explanation. I was asking if there was any factual basis behind the dismissal.
What the hell? How did you get lost?
I'm not your mother. Read my posts again yourself.
I get the feeling you are defending the devs? What facts are there to prove otherwise, you ask? Like it's a conspiracy or something, yeah? You say incompetent, and yet the game runs quite well on console.
Have you considered that perhaps they are telling the truth?
I don't know of any information that rules out this possibility.
Corrected.
This is what I call "optimizations".
Far Cry 4 sucked on launch day and guess what, it still sucks.
GameWorks - the new business model by Nvidia. The way consumers are meant to be played 😎
Poor AMD. They work so hard to improve their performance via drivers, yet even their own fans don't believe that. Company can't catch a break!
You're not factoring load power usage into any of these scenarios... a system with an R9 390X at load in a demanding game like Crysis 3 can draw as much as 487 watts according to AnandTech Bench; the same testbed with a GTX 980 draws 308. What's the price difference between these cards? $100? $50 if you catch a GTX 980 at $480 w/ $30 rebate?
How long do you think it'll take for the NVIDIA card's $50 extra cost to pay for itself through lower power usage? Sure, to a lot of people it just won't matter. It doesn't matter to me (clearly), but I know there are people who care about power costs even in their video cards.
In some scenarios it would absolutely make sense to get the AMD product... if you live in an apartment where electricity is covered by the rent, or in a college dorm, for instance. Otherwise, at some point, you really do need to seriously consider power cost.
Furthermore, as you said, there's no real offering from AMD that can match the GTX 980 Ti... they can be had for $609.99 from Newegg right now, and they are power sippers compared to Fury X.
If you ask me, unless you have access to free power, AMD's offerings are a tough sell if you tend to hang onto video cards for a while, since you can recoup the NVIDIA card's higher up-front cost through power savings over time.
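If you want to run the break-even math yourself, here's a rough sketch using the 487 W / 308 W figures above. The $0.12/kWh electricity rate and the $50 price premium are just placeholder assumptions; plug in your own numbers.

```python
# Rough break-even estimate: how many gaming hours until the GTX 980's
# $50 price premium is paid back by its lower load power draw.
# All inputs are assumptions -- adjust them to your own situation.

price_premium_usd = 50.0        # extra up-front cost of the GTX 980
power_delta_watts = 487 - 308   # load power gap between the two systems (AnandTech Bench figures)
electricity_per_kwh = 0.12      # assumed electricity rate in USD/kWh; varies a lot by region

savings_per_hour = (power_delta_watts / 1000.0) * electricity_per_kwh  # USD saved per gaming hour
break_even_hours = price_premium_usd / savings_per_hour

print(f"Savings per gaming hour: ${savings_per_hour:.4f}")
print(f"Break-even after roughly {break_even_hours:,.0f} hours of gaming")
# With these assumptions: ~2,300 hours, i.e. about 3 years at 2 hours/day.
```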
GCN is very strong in full PBR engines like CryEngine 3.4+ or the newest Frostbite, so this is not game specific. The primary reasons for this are the robust cache design and the memory bandwidth.
Damn, I was hoping for the next level of spin on my post but I got crap instead 😵 I am disappointed, very. :\ How about actually reading what's been posted and discussing it, instead of posting dumb sarcastic remarks that don't add any value to this thread.
You know it's just because they made such crap drivers that it's making them look better now. Look at all that time this performance was locked away and they weren't competent enough to unleash it.
Sounds like you are finally starting to understand things 3D, I'm very proud of you! :thumbsup:
Yes, I mean, we saw the increases over time with driver updates. But let's ignore all of that. It's clear the only thing that happened was Kepler just magically got worse over time.
Poor AMD. They work so hard to improve their performance via drivers, yet even their own fans don't believe that. Company can't catch a break!
Good to see AMD addressing performance issues quicker. Fury owners won't have to face the performance drought Tahiti owners did! Enjoy it 😀
Care to elaborate?
Because if I could have the "lack of performance" I had with Tahiti my whole life, I'd be happy.
Of course, I could also have shelled out $450 for a 680 and then $650 for a 780, for a grand total of $1100, to get the final performance of a $350 7970. That would have made me look awesome, because Nvidia, but it still doesn't tell me how I've been missing performance?
OK, but it still was at least 30% faster than a GTX 580, and as fast as a 680 (OK, 6% behind at base clocks).
It's not like performance was bad. The fact that performance got even higher is icing on the cake, the cherry on top. Plus, nVidia also got a huge gain in BF3 shortly after.
I ran mine at 1200 instead of 925 too. Performance was awesome.
Then it might be that English is not my mother tongue, but "drought" sounds like "bad" to me 🙂
Like, I don't know, "in the desert and thirsty".
a prolonged absence of something specified.
"he ended a five-game hitting drought"
Tahiti launched with some poor drivers. As time progressed, the performance gains were pretty big. There was the Never Settle driver in 2013 that basically put Tahiti on top. But that came almost a year after its launch.