Makaveli
Diamond Member
A Thanksgiving gift would be better. You'll get it sooner and still be able to get yourself something else for Christmas.
Hmm not a bad idea!
Have to get the girlfriend something for Christmas :\
Hardware or Love?
True, but in all fairness the 9700 Pro and 8800 GTX were both from a time period when graphics were still progressing fast. Graphics advancements in the last three years or so have definitely slowed down, increasing the longevity of a lot of GPUs.
Mostly it was people jumping on the bandwagon and not looking at things logically. Take a new generation of GPU when it is first released and compare it to the previous generation on mature drivers, and you won't be all that impressed. But this happens every single time: drivers improve, games come out that push the new architectures, and they pull away from the previous generation.
The game is great looking, but I get the impression that the textures all look muddy. Crysis still wins in my book.
The PS3/360/Wii console generation lasting 2x longer than normal is the worst thing that happened to PC gaming.
So I want to know if the same people that bitched and complained all through 2010 and 2011 that it wasn't fair when reviewers included TWIMTBP games will be crying foul when GE titles are in reviews (and perform better on AMD hardware, at least initially), OR if those same people are going to hypocritically tout how fast such-and-such GE game is on their AMD graphics card and how it is perfectly fair and normal for these games to be used to compare competing graphics cards.
I just find it highly, highly amusing how much animosity certain posters had when a game came out earlier this year or last year and performed substantially better on Nvidia hardware, but now that the tables are turned it's nothing but praise.
If you can show me games where AMD specifically cripples or even removes features via a vendor ID check, then I will cry foul. Not to mention completely disabling a software layer when your competitor's hardware is detected. But I've been on record that these dev relations could very well mean a division of PC gaming; it is a slippery slope. I hope both AMD and Nvidia adhere to DX11 and OpenGL standards at the very least, and it is ultimately up to the game devs to make the right choices that benefit all gamers.
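For anyone wondering what a "vendor ID check" actually looks like: here's a minimal sketch, assuming a Direct3D 11/DXGI game on Windows. This is purely illustrative, not code from any shipping title; the IDs are the standard PCI vendor IDs.

```cpp
// Sketch only: how a D3D11 game could detect the GPU vendor via DXGI.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) {  // first (primary) adapter
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        switch (desc.VendorId) {           // standard PCI vendor IDs
            case 0x1002: printf("AMD\n");    break;
            case 0x10DE: printf("NVIDIA\n"); break;
            case 0x8086: printf("Intel\n");  break;
            default:     printf("other (0x%04X)\n", desc.VendorId);
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

A game that branched on that VendorId to disable effects, rather than checking actual feature support, would be exactly the behavior being objected to here.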
I assume you have nothing but praise for AMD pushing devs to take full advantage of Radeon hardware?
If I am not mistaken, they probably created a vendor-specific subroutine in Dirt Showdown. I have no problems with GE titles at all, but I hate double standards.
They didn't.
Check my earlier post.
And I suggest you step very carefully around TWIMTBP. That program has done more to foster DX11 game development than any other initiative.
So there may be a shader subroutine specifically suited for AMD. If they worked with devs and poured money in, it is simple: it will run better on their hardware. I don't understand what is wrong with that?

You seem to be intentionally avoiding the key point, which I already discussed. You said, "If I am not mistaken, they probably created a vendor-specific subroutine in Dirt Showdown," which is false. Now you say, "there may be a shader subroutine specifically suited for AMD," which was not done by design, because the coding was started or even completed before they had any chance to run it on Kepler.
You actually believe AMD or NV would disclose their optimizations for a game? David would of course deny such things, just like NV would. I would take the words of any rep with a grain of salt. If the optimization is not there, how can a 660 Ti beat a 7970 in BL2? What would NV say regarding that?
There is nothing wrong with leveraging the hardware as much as possible; no one is saying that. Where it all goes south is when you intentionally make sure your competitor's hardware runs poorly or has visuals completely missing, or you disable a feature outright just because the system has a competitor's card installed. See the difference?
Does that really have to be explained all over? Here is a quick summary: the 7970 was only about 40% faster than the 6970 at launch while costing almost 50% more than the 6970 launched at. That was PISS POOR. The GTX 680 was only about 35% faster than the GTX 580, but at least it cost the same as the GTX 580 launched at. Still a VERY POOR leap. Which really makes me question some people that say this gen wasn't much of a step up.
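To put rough numbers on that, here is a quick perf-per-dollar check using the launch MSRPs as I recall them (6970 at $369, 7970 at $549, GTX 580 and GTX 680 both at $499; treat the figures as approximate):

\[
\text{7970 vs 6970:}\quad \frac{1.40}{549/369} \approx \frac{1.40}{1.49} \approx 0.94
\qquad
\text{680 vs 580:}\quad \frac{1.35}{499/499} = 1.35
\]

So at launch pricing the 7970 actually delivered about 6% less performance per dollar than the card it replaced, while the GTX 680 at least passed its full 35% gain through as better perf per dollar.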
I did, and I see no reason to believe any rep's statement like it's gospel truth.