
Nvidia Geforce GTX 295 or ATI Radeon HD 5970?

And just how useful is CUDA to the average person or even the average gamer? Almost zero? I rest my case.

Once again you are stating your opinion as if it was fact.

CUDA offers PhysX, Folding@home, video decoding, video transcoding, Photoshop and Premiere acceleration, etc., etc.

Your bias clouds your judgment.
 
Once again you are stating your opinion as if it was fact.

CUDA offers PhysX, Folding@home, video decoding, video transcoding, Photoshop and Premiere acceleration, etc., etc.

Your bias clouds your judgment.

No, your bias clouds your judgment, not mine. I have preferences but I always look at things with a somewhat impartial eye.

You are the master of stating opinion as if it were fact. I am indeed stating an opinion but if you want to refute what I say that's fine. Just don't go around putting words in my mouth.

How much of what you listed interests the average gamer, much less the even more general average computer user? Those are apps used by power users.

While PhysX certainly requires CUDA, that is an "artificially" imposed requirement added when nVidia bought PhysX.

How many gamers, much less computer users, are salivating over Folding@Home?

Photoshop? The average user barely knows how to use a simple image editor to remove red-eye and the like; I don't expect them to even begin to know how to use Photoshop. While I'm no Photoshop wizard, I do know how to leverage some of its features, and I can say that the average gamer or user would not even know where to begin with a powerful application like Photoshop.

Premiere? An even worse example than Photoshop. This is not an application for home users; it is a professional-class app, and it shows. Powerful, but not built for home users.

How many gamers, or computer users in general, are salivating over encoding/transcoding? Heck, how many computer users even know what those two terms mean?

I evaluated all of the things you listed when I made that post and dismissed them because they are geared toward power users and content creators. They are barely a blip on the radar of the average gamer, much less the average computer user. Look at the nVidia web page describing CUDA and the uses (even experimental ones) that coders are putting it to, and you can see that 99% of it is geared toward content creation or scientific work. Almost none of it is aimed at home users. Thus I stand by my original statement: for almost all users, CUDA is irrelevant.
 
Once again you are stating your opinion as if it was fact.

CUDA offers PhysX, Folding@home, video decoding, video transcoding, Photoshop and Premiere acceleration, etc., etc.

Your bias clouds your judgment.

CUDA doesn't accelerate Photoshop. Premiere uses CUDA, but only if you have a Quadro card.
 

Again I'm sure the "average computer user" appreciates that you are the voice for them. 🙄

Seems the OP already stated an interest in CUDA/PhysX. So your "almost zero" statement is null and void.

Go enforce your biased opinion elsewhere. :thumbsdown:

Millions of people use their computers for just the sort of things that CUDA is good at. Not to mention that most people who visit this forum are exactly the type to be above average.
 
What 3D tech do you want to use?

DX11 effects can look more 3D than DX10.

I'm assuming Nvidia 3D Vision. If so, then wait for Fermi. If you really need a card now, 5970.

As for the CUDA debacle, I agree that it has incredibly specific uses and is not at all for the masses, which makes me wonder whether the original poster is: a. a professional graphic designer, b. someone overwhelmed by nVidia's marketing PR about it, or c. a programmer wanting to try it.
 
You might have an argument with choosing between the GTX 295 and the 5870, but the 5970 pretty much demolishes the best NV has right now. If you want the best, there is no competition to the 5970, other than maybe getting 2x5870s?
 
DX11 is not that big a deal atm. But what does matter is performance, and the 5970 blows away the 295.

The Alien Versus Predator game coming out shows some very impressive tech demos, and it's releasing in the spring. The difference between DX10 and DX11 is huge!
What's more, it seems like the tessellation feature of DX11 is easily added to games, so we could actually see it making a huge difference faster than DX10 did. Heck, I still have difficulty pointing out the advantages of DX10, but DX11 tessellation is night and day.

And OP, do you want CUDA, or PhysX? With OpenCL and DirectCompute out (and ATI having the fastest performance in both), I don't see much need for CUDA. PhysX is kind of neat, but DX11 is neater.
 
Yeah, unless you use professional apps, there's no point in getting a GTX 295.
A 5970 would be a more useful way to spend the money.

You could "settle" for a 5870, though; unless you're running at ultra-high resolutions, you'd be fine.
 
Look, this is very easy. DX11 is a standard, as are DirectCompute and OpenCL. With CUDA and PhysX you're depending on the largesse of various vendors implementing against an API supported by a single hardware vendor, one that locks out a large portion of the user base (even those with otherwise capable hardware).

Going forward, there can only be one resolution to this: the useful lifetime of CUDA and PhysX is going to be shorter than the life of today's high-end cards.
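[Editor's note: as a concrete sketch of the standards argument above, here is a hypothetical vector-add kernel in CUDA with its OpenCL C twin shown in a comment. The kernel and all names are illustrative, not taken from the thread; the point is only that the computation ports almost line-for-line, while the runtimes differ in which hardware may run it.]

```cuda
// Hypothetical vector-add kernel (illustrative only).
// CUDA version: compiles with nvcc, runs only on nVidia GPUs.
__global__ void vec_add(const float *a, const float *b, float *c, int n) {
    // One thread per element; guard against the tail of the last block.
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

// The OpenCL C equivalent is nearly line-for-line the same:
//
//   __kernel void vec_add(__global const float *a, __global const float *b,
//                         __global float *c, int n) {
//       int i = get_global_id(0);
//       if (i < n) c[i] = a[i] + b[i];
//   }
//
// The computation ports trivially; the difference is the runtime. OpenCL
// (like DirectCompute) runs on any vendor's conforming driver, while the
// CUDA build excludes every non-nVidia card, however capable.
```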
 
Millions of people use their computers for just the sort of things that CUDA is good at. Not to mention that most people who visit this forum are exactly the type to be above average.
If you can really find millions of "average home users" who use CUDA for Premiere, Photoshop and co., you're really good, because as we all know no current Photoshop version uses CUDA for anything, and you need a Quadro card for Premiere (and anyone with a Quadro card is, by definition, not an "average home user"). So we're left with people who decode/encode videos all day.

PhysX and the 3D stuff are nice to have for gamers, but CUDA is mostly for a completely different clientele.


Though I'd wait for the Nvidia release anyhow; at the moment the 5xxx cards are vastly overpriced, and who knows how good Fermi will be on a price/performance basis? But if you really needed to buy a card right now, the 295 is clearly inferior to the 5970.
 
Again I'm sure the "average computer user" appreciates that you are the voice for them. 🙄

Seems the OP already stated an interest in CUDA/PhysX. So your "almost zero" statement is null and void.

Go enforce your biased opinion elsewhere. :thumbsdown:

Millions of people use their computers for just the sort of things that CUDA is good at. Not to mention that most people who visit this forum are exactly the type to be above average.

I'm sure the "average computer user" appreciates me speaking for them just as much as the OP appreciates you speaking for him. I presented information on why CUDA is not important to the majority of users. I'm sure the OP can come to his own conclusion on whether CUDA is relevant to him.

In my reference to average computer users, I am speaking of them as a demographic. This is done in marketing analysis all the time, where one assigns certain tendencies to a group, just as one can say that females buying video games are a very small portion of the gaming population.

Instead of going off on a tangent, why don't you explain why CUDA should be important to the "average computer user." You didn't refute what I said, you merely went off on a side argument that had nothing to do with my statement.

Furthermore, many people who come to these forums are enthusiasts but have no need or use for the features that CUDA may accelerate. Just because they are enthusiasts does not mean they use apps that benefit from CUDA.

Yep. I'm biased. That's why I recommended the user get Fermi if he felt PhysX & CUDA was a deal breaker. That's why I say I am highly interested in Tegra, an nVidia product. That's why I have argued with at least one member who was biased against nVidia and those who like nVidia because I felt that person was bordering on personal attacks or speaking nonsense due to bias. Yeah, I'm very biased. Biased against blind stupidity.
 
Ahh, wow... I didn't expect almost 40 replies the next time I checked it..

Actually, I totally misrepresented why I wanted an Nvidia card.. It was 3D Vision, to be exact.

As per my research, ATI has never had the capability for 3D Vision-style stereoscopic 3D. I suppose I can go without it, but I was REALLY hoping to try it out this upgrade (I have an ATI card right now, so I can't really..)

CUDA/PhysX are more of a bonus feature than anything (I'm not about to grab two GTX "395"s and a GTX 285 for physics, but I might use an old card for it...)

Thanks guys for your opinions.. I think I'll wait a few months (I'm not screaming for a new card yet).
 
Ahh, wow... I didn't expect almost 40 replies the next time I checked it..

Actually, I totally misrepresented why I wanted an Nvidia card.. It was 3D Vision, to be exact.

As per my research, ATI has never had the capability for 3D Vision-style stereoscopic 3D. I suppose I can go without it, but I was REALLY hoping to try it out this upgrade (I have an ATI card right now, so I can't really..)

CUDA/PhysX are more of a bonus feature than anything (I'm not about to grab two GTX "395"s and a GTX 285 for physics, but I might use an old card for it...)

Thanks guys for your opinions.. I think I'll wait a few months (I'm not screaming for a new card yet).

Just so you know, 3D Vision with the red-blue glasses sucks hard. If you're getting a GeForce for 3D Vision, make sure you add on the $500 for the supported monitor + shutter glasses.
 
Just so you know, 3D Vision with the red-blue glasses sucks hard. If you're getting a GeForce for 3D Vision, make sure you add on the $500 for the supported monitor + shutter glasses.

There is a REALLY good chance the OP considered the cost of going 3D Vision if he is weighing his next graphics card purchase on it.
But you never know. He might have thought it was free.
 
There is a REALLY good chance the OP considered the cost of going 3D Vision if he is weighing his next graphics card purchase on it.
But you never know. He might have thought it was free.

He might have thought that the red-blue glasses would be good enough. I already know a few people who went Nvidia thinking they would be.
 
I was part of EVGA's free 3D Vision trial, and honestly I was completely underwhelmed. The performance hit is not worth it. The cost is not worth it. I'll take a 27.5-inch monitor over that any day.
 
If PhysX is such a big deal, just get a 5970 and a GTS 250. Use some hacked drivers and run the nV card alongside it. Win/win if you ask me.
 
I was part of EVGA's free 3D Vision trial, and honestly I was completely underwhelmed. The performance hit is not worth it. The cost is not worth it. I'll take a 27.5-inch monitor over that any day.

You have this:

E8400 @ 3.825GHz
2 x EVGA GTX260's in SLI

and you're worried about a performance hit? Why? Cost doesn't seem to be a problem if you'd be willing to buy a 27.5" monitor.
You being underwhelmed is the important factor though. Some people say "meh" and others say "Ooooohhh"...

@Fox5: Tiger Direct seems to have the 3D Vision kit for $429.00, a little cheaper than what you suggested. And once in a while there are sales on the shutter glasses; I've seen them go for $100.00 on sale or special.
 
Ahh, wow... I didn't expect almost 40 replies the next time I checked it..

Actually, I totally misrepresented why I wanted an Nvidia card.. It was 3D Vision, to be exact.

As per my research, ATI has never had the capability for 3D Vision-style stereoscopic 3D. I suppose I can go without it, but I was REALLY hoping to try it out this upgrade (I have an ATI card right now, so I can't really..)

CUDA/PhysX are more of a bonus feature than anything (I'm not about to grab two GTX "395"s and a GTX 285 for physics, but I might use an old card for it...)

Thanks guys for your opinions.. I think I'll wait a few months (I'm not screaming for a new card yet).

If you want that Stereoscopic thing, you are making the right choice by waiting 😉

Why buy now and regret later when you can wait a few months and have what you really want?
 
You have this:

E8400 @ 3.825GHz
2 x EVGA GTX260's in SLI

and you're worried about a performance hit? Why? Cost doesn't seem to be a problem if you'd be willing to buy a 27.5" monitor.
You being underwhelmed is the important factor though. Some people say "meh" and others say "Ooooohhh"...

@Fox5: Tiger Direct seems to have the 3D Vision kit for $429.00, a little cheaper than what you suggested. And once in a while there are sales on the shutter glasses; I've seen them go for $100.00 on sale or special.

Well the monitor I have is not exactly a bank breaker.
http://www.newegg.com/Product/Produc...ion=hanns+g+28

Also, when I tried it, they didn't have SLI profiles for 3D Vision, so it was effectively like having one GTX 260 that then took a further performance hit. I'm talking about my rig getting choppy frame rates in L4D. I think I remember a recent story about SLI support being added, but still, the fact remains that you take a performance hit (most, if not all, of the initial reviews used a single card and reported significant drops in FPS).
 