WelshBloke
Lifer
It doesn't, but that isn't what we're talking about. We're talking about what "AMD's" answer to the 460 is. The 5850 isn't their answer.
The 5850 is an AMD card, and it's available; therefore it's an alternative to the 460.
It doesn't, but that isn't what we're talking about. We're talking about what "AMD's" answer to the 460 is. The 5850 isn't their answer.
Throwing out Wreckage's point (which is valid but complicates the point I'm trying to make), any available card is a valid answer to another if it's priced the same and performs the same.
What AMD says is an answer to the 460 is irrelevant, unless you make your purchasing decisions purely on what AMD's marketing tells you.
What you responded to explicitly said "I fail to see AMD's "answer" to the 460".
The 6850/6870 were the answer to the 460. However, price drops have changed the answer.
Ignorance is a bad foundation.
You find me a CPU destructible-architecture game where the debris doesn't disappear after 10-15 seconds... and that has more than 5,000 interactive objects.
If you can't, there's no need to post again... like I said, ignorance will only make you look like someone without a clue...
The 6850 splits the 460 768MB and 1GB in price over here.
That probably makes the 6850 a better option, then. Truth be told, at the same price point the 6850 is the better option if you plan to leave your card at stock, and the 460 has a slight edge if you plan to OC. If the 6850 is cheaper, then it gets the nod IMHO. In the US, the 6850 is usually a little more than the 460 when it comes to the best deals available. Close, though...
🙄
The only destructible-environment games I've played lately are both Havok games, RFG and BC2. I try to spend more time playing games than I do reading physics engine specs. I guess it's my fault I didn't get into any of the PhysX titles, but I certainly don't let the name of the physics engine a game uses determine my spending.
The 6850 is interesting; they went on sale for $150 recently. The problem is, you can get a 768MB GTX 460 for $120 right now. Overclocked, the two cards will perform pretty much the same. I guess it comes down to your preferences. You get 256MB of extra memory with the AMD card, but you get PhysX and CUDA with the NV card, plus you save $30.
And here come the fallacies.
I care about the amount and type of physics that can run.
GPU > CPU for that....by a MAJOR factor.
You want to run your graphics on the CPU?
Same deal.
The CPU is jack of all trades, master of none.
Drop the fallacies and stop telling me what I think/feel/want.
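For what it's worth, the "GPU > CPU for physics" claim comes down to workload shape. Here's a minimal sketch (not taken from any shipping engine; the numbers and update rule are purely illustrative) of a semi-implicit Euler step over debris particles. Each particle's update touches only its own state, so the loop is embarrassingly parallel, which is exactly the kind of work a GPU's thousands of threads chew through while a CPU runs only a handful of iterations at a time.

```python
# Illustrative only: one physics tick over N independent debris particles.
# On a GPU this loop body would run as one thread per particle.

GRAVITY = -9.81   # m/s^2 (assumed constant, straight down)
DT = 1.0 / 60.0   # one 60 Hz frame

def step(particles):
    """particles: list of (x, y, vx, vy) tuples; returns the updated list."""
    out = []
    for x, y, vx, vy in particles:
        vy += GRAVITY * DT   # integrate velocity first (semi-implicit Euler)
        x += vx * DT         # then integrate position with the new velocity
        y += vy * DT
        out.append((x, y, vx, vy))
    return out

# 5,000 interactive objects, as in the challenge above
debris = [(0.0, 10.0, 1.0, 0.0) for _ in range(5000)]
debris = step(debris)
```

Because no particle reads another particle's state within the step, the work divides cleanly across however many execution units the hardware offers; that independence, not anything PhysX-specific, is what makes the GPU the better fit here.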
I honestly don't know how many of these games use GPU Physx, you tell me:
http://www.gamespot.com/best-of/game-of-the-year/index.html
You say you care more about what GPU PhysX can add to a game than about whether the physics engine runs on the CPU or GPU. We are coming up on three years since Nvidia purchased Ageia. By now, don't you think CPU physics games should be so far outclassed that they are practically obsolete compared to GPU-based PhysX games, if GPU-based physics effects are everything you say they are? I honestly don't know how many games on that list use GPU PhysX (zero? ten? I really don't know); what I do know is that at least a few use Havok on the CPU. How can that be when the CPU is so inferior according to you?
I have no doubt that hardware-based physics engines are the future. I do have doubts that GPU-based PhysX does anything other than drop frame rates for effects nearly identical to what we have with CPU-based physics engines. How is it that AMD still sells so many GPUs without a hardware-based physics component if it is so clearly outclassed? PhysX is a gimmick. The 69xx cards will succeed or fail regardless of GPU PhysX.
In games?
Right...all PR :thumbsdown:
Nice try...shifting the goalposts.
But I guess that is what happens when you can't find me a single game using GPU physics... on anything but PhysX :thumbsdown:
And still the GPU beats the CPU by a factor of 10+.
So much for your "real serious" issue. :thumbsdown:
If, when, might... like I said, you've got nothing to show.
And you might want to look past the hype and look at the performance of the upcoming APUs... it's not like GPUs are standing still.
APUs will lag behind dedicated GPUs, just like IGPs do today.
GPU-accelerated PhysX has not done anything that hasn't already been done on the CPU, in some cases years earlier or simply better, but I guess that's because Havok & Crytek have been actively developing their engines instead of gimmicks. On top of that, it seems Nvidia has to send their own programmers out to code these features into the games for the few companies that use it.

A lot of words, some wishful thinking... but nothing concrete from your side... wake me up when you have more than empty words.
AMD seems to still be working with Havok in some areas. AMD & Nvidia both support OpenCL on their cards, as well as DirectCompute. So Havok & Bullet will be able to code hardware acceleration just once & have it work on both. It just took AMD a while to add OpenCL support to their CCC software instead of making you download the SDK to add the support.
I also want to say I'm not against Nvidia developing their own tech. I just don't think it's right the way they block software & features when AMD cards are present in the system.
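The "code it once & have it work on both" point deserves a concrete illustration. OpenCL kernels ship as plain source text that whichever vendor's driver is present compiles at run time, which is exactly what would let an engine like Bullet target AMD and NVIDIA with one code path. Below is a hypothetical kernel sketch (the name `integrate` and the update rule are my own illustration, not Bullet's actual code); the host-side setup through an API binding is omitted.

```python
# Illustration: an OpenCL kernel is just vendor-neutral source text,
# held here as a Python string. At run time the host hands this string
# to the driver (clCreateProgramWithSource / clBuildProgram), and the
# same text builds on any conformant OpenCL implementation, AMD or NVIDIA.

INTEGRATE_KERNEL = r"""
__kernel void integrate(__global float4 *pos,
                        __global float4 *vel,
                        const float dt)
{
    // One work-item per particle; each updates only its own element.
    int i = get_global_id(0);
    vel[i].y += -9.81f * dt;   // gravity on the y component
    pos[i]   += vel[i] * dt;   // Euler position update (w unused here)
}
"""
```

This runtime-compilation model is the design choice that removes the vendor lock-in: the portability lives in the specification the drivers implement, not in any one company's SDK.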
NVIDIA has paved the way, while AMD just talked.
Big difference in my book.
Hence why NVIDIA gets my money and AMD doesn't.
Much like how AMD/ATI has had a tessellator for almost a decade now? Tessellation is a pretty big topic (especially with Nvidia fans) these days. Since you tend to appreciate those who pave the way as opposed to just talk, you must be really impressed with Radeons.
Don't forget that AMD also pioneered Eyefinity, something that Nvidia has yet to incorporate in their cards.
Much like how AMD/ATI has had a tessellator for almost a decade now? Tessellation is a pretty big topic (especially with Nvidia fans) these days. Since you tend to appreciate those who pave the way as opposed to just talk, you must be really impressed with Radeons.
NVIDIA has a patent on tessellation going back more than a decade.
http://patft.uspto.gov/netacgi/nph-...50&s1=6906716.PN.&OS=PN/6906716&RS=PN/6906716
"Integrated tessellator in a graphics processing unit "
But they were all talk and did nothing with it. AMD brought it to the masses and with higher functionality :thumbsup:.

Actually, Matrox and NVIDIA Quadro pioneered that. Even a pair of old GTX 260s support it.