Fermi to have bigger launch than HD 5800


Seero

Golden Member
Nov 4, 2009
1,456
0
0
Comparing ATI and Nvidia cards is like comparing apples to oranges? Haha, good one. You have fallen victim to NV's marketing machine.



PhysX? No thanks. Now that DX11 and OpenCL are both out, I'll wait for physics engines that run on these APIs and support "all" hardware.

As for 3D Vision, I tried two games with it, World of Warcraft and GTA. While the effect was interesting, it was nothing I could handle playing for more than 10 minutes or so; I preferred gaming without the 3D effect. Your assessment is clearly wrong :p
Am I wrong? It appears PhysX and 3D aren't your cup of tea, but that doesn't mean no one likes them. These features contribute to the value of the product. Trying to compare FPS + power consumption + PhysX + 3D against FPS + power consumption alone is simply not valid, and therefore it really is a comparison between apples and oranges.

If PhysX is a "No thanks" to you, then why bother waiting for something that does the exact same thing? The fact is, Nvidia can accelerate PhysX on the GPU, and ATI offers nothing of that nature. Find me a game, other than a DX11 title, where an ATI video card gives you more than an Nvidia video card, if you can. It is true that no Nvidia card can handle DX11 right now; only Fermi will, and it is going to hit the market April 6. I don't see any sign of ATI supporting physics acceleration anytime soon, and Intel simply won't allow any GPU to take over its load. Good luck.

As to 3D, I think it is your personal problem that you can't handle it. My father can't handle computers and the internet; it is simply too much for him. That does not mean computers and the internet are gimmicks.
 
Last edited:

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Seriously, PhysX is a non-factor. Games that use real physics in gameplay all use Havok or other solutions anyway. PhysX is basically some eye candy that murders your FPS unless you have a second card dedicated to it. Battlefield: Bad Company 2 has destructible environments, is one of the best-looking games out today, and works fine using Havok.
 

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Am I wrong? It appears PhysX and 3D aren't your cup of tea, but that doesn't mean no one likes them. These features contribute to the value of the product. Trying to compare FPS + power consumption + PhysX + 3D against FPS + power consumption alone is simply not valid, and therefore it really is a comparison between apples and oranges.

If PhysX is a "No thanks" to you, then why bother waiting for something that does the exact same thing? The fact is, Nvidia can accelerate PhysX on the GPU, and ATI offers nothing of that nature. Find me a game where an ATI video card gives you more than an Nvidia video card, if you can.

As to 3D, I think it is your personal problem that you can't handle it. My father can't handle computers and the internet; it is simply too much for him. That does not mean computers and the internet are gimmicks.

Well, some folks like Eyefinity from a single card, some don't. Regardless, it is something that ATI offers that Nvidia does not. In a year's time there will be open physics, third parties will have 3D (it will likely get something like a DirectX standard, if not become part of DirectX), and it will come down to which colour you like best. (Mind you, Hydra will only work with R/G anaglyph methods ;) )
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Seriously, PhysX is a non-factor. Games that use real physics in gameplay all use Havok or other solutions anyway. PhysX is basically some eye candy that murders your FPS unless you have a second card dedicated to it. Battlefield: Bad Company 2 has destructible environments, is one of the best-looking games out today, and works fine using Havok.
Havok works on Nvidia cards too. What you don't know is the limitation of Havok. Say a simple robot explodes into pieces: both Havok and PhysX can handle it. Now say 1,000 robots explode into pieces: PhysX can handle it without hurting FPS, while Havok may have a small impact on FPS. What about a million robots exploding at once? PhysX can handle it with a small impact on FPS, while Havok more or less becomes a slide show.

Yes, a 16-core CPU may handle it as well as an 8800 GTS acting as a PPU, but are you really going to buy a 16-core CPU just for that?
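
The scaling argument above comes down to data parallelism: each debris fragment's update depends only on its own state, so the work maps naturally onto a GPU's thousands of threads, while a CPU has to grind through the same loop a few cores at a time. Below is a minimal toy sketch of such a per-fragment update, in Python with NumPy standing in for wide parallel hardware; it illustrates the argument only and is not Havok or PhysX code (real engines also do collision detection between fragments, which is the genuinely expensive part).

Code:
import numpy as np

def step_debris(pos, vel, dt=1.0 / 60.0, gravity=-9.81):
    """Advance every debris fragment one frame with explicit Euler.

    pos, vel: (N, 3) arrays. Each row is independent of every other
    row, which is exactly the property GPU physics exploits.
    """
    vel = vel.copy()
    vel[:, 1] += gravity * dt          # gravity acts on the y axis
    new_pos = pos + vel * dt
    below = new_pos[:, 1] < 0.0        # crude ground plane at y = 0
    new_pos[below, 1] = 0.0
    vel[below, 1] *= -0.5              # bounce with energy loss
    return new_pos, vel

# 1,000 fragments or 1,000,000 fragments: the per-fragment work is
# identical; only the width of the parallel loop changes.
pos = np.random.rand(1_000_000, 3) * 10.0
vel = np.random.randn(1_000_000, 3)
pos, vel = step_debris(pos, vel)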

PhysX is capable of handling interactive physics that affects gameplay, but that demands excessive running time and hurts various setups, even with an Nvidia video card. If PhysX is required code to run the game, then the game must be optimized to a level where it is still playable without an Nvidia card, which is where things stand now. A game simply won't sell if it requires a 480 to function. The plan should be: it runs on an 8800 GTS, but gets more from a 480. Same deal with DX11: it runs on all ATI and Nvidia cards, but gets more from the Evergreen series. If PhysX is a gimmick, then tessellation is also a gimmick. The tessellation demo showed a huge difference between DX10 and DX11, but Dirt 2 doesn't show that, and Metro 2033 doesn't show that. There are differences, but IMO the difference between tessellation on and off is the same as, if not smaller than, PhysX on and off.
 
Last edited:

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
lol, have you ever actually used PhysX? If you had, you would know how much it impacts your FPS.

Speaking of benchmarks, it means nothing to me.
That's quite hilarious as well.
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
The point is that PhysX won't work on all cards and Havok does, which makes Havok the only viable solution if you really need to integrate physics into your game. Add the fact that games seem to be getting less and less CPU-limited (see BFG10K on that), and the average gaming rig probably has more CPU power to spare than GPU power anyway. Now, if NVIDIA released a more efficient and reasonably multithreaded version of PhysX you might see it becoming integrated into games, but that would not help them sell cards, so they will never do it.
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
Well, some folks like Eyefinity from a single card, some don't. Regardless, it is something that ATI offers that Nvidia does not. In a year's time there will be open physics, third parties will have 3D (it will likely get something like a DirectX standard, if not become part of DirectX), and it will come down to which colour you like best. (Mind you, Hydra will only work with R/G anaglyph methods ;) )
Eyefinity is a fancy name for multi-display, and multi-display really isn't anything new; I have been using it for years with a PCI (not PCIe) video card. 3D is also old, and 3D Vision is another fancy name.

To my surprise, the 5870 supports multi-display + DX11 and is cheaper than the 295. I have been telling people to go get the 5870 for this simple reason, as a few more FPS doesn't mean anything. However, once Fermi comes out, the tide simply changes. This won't be true forever, as new video cards will arise. As of today, it is too soon to say whether that is the case, simply because of the NDA.

If, however, Fermi turns out to be faster, has more features, and is not too expensive, then it will take over the market. Unfortunately, I also believe 600 bucks USD is what the average Joe will need to pay for a 480.
 

brybir

Senior member
Jun 18, 2009
241
0
0
Seriously, PhysX is a non-factor. Games that use real physics in gameplay all use Havok or other solutions anyway. PhysX is basically some eye candy that murders your FPS unless you have a second card dedicated to it. Battlefield: Bad Company 2 has destructible environments, is one of the best-looking games out today, and works fine using Havok.

I'd agree with this. The physics in BFBC2 and the physics in Batman are worlds apart. I'll take the BFBC2 version any day, personally.

PhysX could be cool if it were more of a standard, but I am not interested in buying a product because of some "feature" that I have the strange feeling is not going to be around in a couple of years, once MS and others get the DirectCompute stuff more aligned with the direction of GPU-accelerated physics (or we can debate the need to have the GPU do the physics at all in most cases, given the increasing prevalence of 6+ core CPUs in the next few years).

Or wait and see what things like Fusion and Sandy Bridge bring to the table. With native shader cores brought on-package or on-die, we will see a surge in programmer interest in doing things like standardized physics and other cool things. Why then? Because then everyone going forward will have the capability, not just a smaller subset of the population as now. Think of it this way as a game developer: why continue to invest in PhysX, which works only on nVidia discrete graphics cards, when in two years every AMD and Intel CPU will have a significant number of shader processors available to it, in every computer, all driven by an industry-standard MS API? If I were a developer, I would be moving in that direction.
 
Last edited:

Daedalus685

Golden Member
Nov 12, 2009
1,386
1
0
Eyefinity is a fancy name for multi-display, and multi-display really isn't anything new; I have been using it for years with a PCI (not PCIe) video card. 3D is also old, and 3D Vision is another fancy name.

To my surprise, the 5870 supports multi-display + DX11 and is cheaper than the 295. I have been telling people to go get the 5870 for this simple reason, as a few more FPS doesn't mean anything. However, once Fermi comes out, the tide simply changes. This won't be true forever, as new video cards will arise. As of today, it is too soon to say whether that is the case, simply because of the NDA.

If, however, Fermi turns out to be faster, has more features, and is not too expensive, then it will take over the market. Unfortunately, I also believe 600 bucks USD is what the average Joe will need to pay for a 480.

You asked for a feature that ATI supports that Nvidia does not. Right now you can run three displays on all 5000-series cards, and five or six on the coming ones. Sure, it is possible for Nvidia to release something like this, and they likely will soon enough, but it is just as reasonable to assume ATI will support physics eventually, perhaps even on current and older lines.
 

brybir

Senior member
Jun 18, 2009
241
0
0
Eyefinity is a fancy name for multi-display, and multi-display really isn't anything new; I have been using it for years with a PCI (not PCIe) video card. 3D is also old, and 3D Vision is another fancy name.

To my surprise, the 5870 supports multi-display + DX11 and is cheaper than the 295. I have been telling people to go get the 5870 for this simple reason, as a few more FPS doesn't mean anything. However, once Fermi comes out, the tide simply changes. This won't be true forever, as new video cards will arise. As of today, it is too soon to say whether that is the case, simply because of the NDA.

If, however, Fermi turns out to be faster, has more features, and is not too expensive, then it will take over the market. Unfortunately, I also believe 600 bucks USD is what the average Joe will need to pay for a 480.


The downside is that many of us will never spend $600 for a video card regardless of how cool it is. I have a hard time justifying $200 on a video card most of the time.

BTW though, your old Matrox PCI TripleHead card or whatever you were using is nowhere near the same as having an Eyefinity setup (it's your own argument about features: a 5870 Eyefinity setup can do everything the TripleHead-type setup can do, but not the other way around). Besides, Eyefinity is very cool, and hopefully nvidia will develop their own method of doing it (maybe they have already?). On the next round of upgrades I am definitely going to drop the $ on a few DP monitors. Seeing Eyefinity in Supreme Commander 2 was just cool, and if that support from developers continues, multi-monitor gaming certainly has a great future.

As to taking over the market, that only works if there is supply available and the competition does not adjust its prices relative to nvidia. You don't seem interested in price/performance in standard one-monitor gaming, calling it an unbalanced comparison between feature sets, but for many of us that is exactly the metric we are looking for, because the other features are not compelling enough to spend much, if any, money on (yet).
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Since this thread is already way off topic, I figured I'd post some info for all those people who registered in the last few years and have no idea why some people are really opposed to having focus group members on AT.

http://www.google.com/search?source=ig&hl=en&rlz=&=&q=arbuthnot+nvidia

Long story short: there is more history to this than just Keys and a few other people receiving free stuff from NV and being open and honest about their affiliation with NV...

Read up on it, and make up your own mind.

I certainly don't discount what Keys has to say, because he's a smart guy, but I keep in mind that he could easily pay only about $600 for a rig the rest of us would pay $1800 for, thanks to his affiliation with NVIDIA. I'm not sure I could honestly say that, plus guaranteed availability, wouldn't 'sway' me a little...
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The point is that PhysX won't work on all cards and Havok does, which makes Havok the only viable solution if you really need to integrate physics into your game. Add the fact that games seem to be getting less and less CPU-limited (see BFG10K on that), and the average gaming rig probably has more CPU power to spare than GPU power anyway. Now, if NVIDIA released a more efficient and reasonably multithreaded version of PhysX you might see it becoming integrated into games, but that would not help them sell cards, so they will never do it.
Whether PhysX can run multithreaded or not doesn't depend on Nvidia. Think about it: why would they want to restrict it to the CPU? I mean, how hard is it to see that GPU+CPU > CPU? How many cores are in the Xbox 360 and PS3? Sure, I could optimize a game to utilize 16 cores + 4 video cards, but it won't run on a dual core with a single video card, so it isn't going to happen. Most games are optimized for consoles first, then for a dual-core system with a 512MB video card. Anything on top of that is meaningless, if not wasted. Imagine a game that needs 16 cores running on a dual core.
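
One note on the multithreading back-and-forth: when the simulated objects are independent, spreading a physics step across CPU cores is mechanically simple, so the dispute is really about incentives rather than feasibility. A hedged toy sketch in standard Python + NumPy (illustrative names only, not PhysX's or Havok's actual API):

Code:
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(pos, vel, dt=1.0 / 60.0):
    # Large NumPy operations release the GIL, so threads genuinely overlap.
    vel[:, 1] -= 9.81 * dt
    pos += vel * dt
    return pos, vel

def step_threaded(pos, vel, workers=4):
    # Split the independent objects into one slice per core; the slices
    # are views, so the updates land directly in pos and vel.
    chunks = list(zip(np.array_split(pos, workers),
                      np.array_split(vel, workers)))
    with ThreadPoolExecutor(max_workers=workers) as ex:
        list(ex.map(lambda pv: integrate_chunk(*pv), chunks))
    return pos, vel

pos = np.random.rand(100_000, 3)
vel = np.zeros((100_000, 3))
pos, vel = step_threaded(pos, vel)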
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Seriously, PhysX is a non-factor.

Seriously, PhysX is a factor to me. It's not like having PhysX disables anything... it's a great feature to have no matter what. PhysX is like having a math co-processor: it doesn't hurt to have it, and it's awesome when it works.


Given the same price and performance, I would take the Nvidia card over the ATI card because it has PhysX.

DX11 is currently less of an issue for me, as it'll take about two more years to materialize. In two years I'll likely have a new video card.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,046
548
136
Well, since you guys headed that way: the forum overlords have granted nVidia the right to populate the forums with their focus group. Here is one of their views on it. There are some interesting things to be read.

"Janooo does have a point that because it appears to him to be a certain way (and it appears that way to others as well) that someone (Anand and I) need to make a decision about the situation.

There are multiple layers here, by the way. If there are viral marketers on the board, then they are more dangerous than anyone who has a known association with a company. But members need to be smart about how much credit they give to people on a certain topic.

On factual info about NVIDIA products, these guys are a great source.

For information on the value of features or the meaning of performance results, not so much. Or not at all, really. And definitely not for information on competing products or how well they stack up to NVIDIA products. Everyone here needs to take everything these guys say wrt these topics and largely throw it out the window. And you have the information right there in their sig to know that you need to do that.

I'm not saying that NVIDIA Focus Group members themselves are even the problem. The problem is deeper than that. It's called indoctrination.

As keys says, he knows 1000x more about NVIDIA hardware than AMD hardware. He is also exposed to NVIDIA's PR team and product managers. These guys are really really good at what they do.

It is much more effective to win converts and send them into the world than to hire salesmen.

And frankly, NVIDIA's technology is impressive. It's easy to show people how awesome something is when it is awesome. Especially when they aren't as immersed in AMD hardware goodness and don't have as hands-on an experience with the under-the-hood, behind-the-scenes stuff from the competition. Not spending the same time with AMD hardware and people that they do with NVIDIA hardware and people is a detriment to Focus Group members' ability to accurately assess the market.

This should not be surprising to anyone.

Seeing and playing with demos and future games based on PhysX, without being shown comparative examples of the software running on CPUs and competing GPU solutions, will color people's opinion of the value of PhysX at this stage in the game. AnandTech's assessment is that you can't judge the real value of a feature until games ship using it. The assessment of others may be that the potential is awesome for PhysX on NVIDIA hardware ... but what if it's just as awesome on multicore CPUs (in practice) or on other GPUs? Sure, theoretically PhysX offers more, but we don't have anything "gameplay physics" related that is compelling at this point.

CUDA is great. We've had the AMD implementation of folding@home for a freaking long time though ... and AMD had CTM before NVIDIA had PTX ... and they've got CAL, which is basically the same deal. We really need a unified, standardized compute language, as I don't see language extensions created by anything less than an industry standards group actually satisfying the needs of everyone. I hope OpenCL ends up being what we need, but only time will tell.

Seeing the performance numbers, the price, the fact that AMD can support C/C++ high-level GPGPU computing as well through Brook+ or other tools, the fact that PhysX (or even Havok) support could be implemented through CAL / CTM (or a high-level language if they're lazy) on AMD hardware, and a complete lack of compelling software using these features on NVIDIA's hardware (it's all demos, add-ons, lame effects and stuff that's supposed to come out in the future) absolutely does mean that someone with a well-rounded perspective would conclude that the GTX 260 is not worth its price compared to the Radeon HD 4870.

NVIDIA Focus Group members cannot have this well-rounded perspective unless they work very, very hard to overcome the natural barriers to neutrality put in front of them.

And it's not even really their "fault" -- it is their perspective that is tainted. "
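
The quoted post's call for "a unified standardized compute language" is essentially what OpenCL set out to be: one kernel source that runs on NVIDIA, AMD, or Intel devices alike. As a rough illustration only (using the third-party pyopencl module, an assumption on my part, not something anyone in the thread used), a vendor-neutral kernel looks like this:

Code:
import numpy as np
import pyopencl as cl

# One kernel source, no vendor-specific extensions: the OpenCL runtime
# compiles it for whatever device is present (NVIDIA, AMD, Intel, ...).
SRC = """
__kernel void scale(__global const float *src,
                    __global float *dst,
                    const float k)
{
    int i = get_global_id(0);
    dst[i] = src[i] * k;
}
"""

ctx = cl.create_some_context()        # picks any available OpenCL device
queue = cl.CommandQueue(ctx)
prg = cl.Program(ctx, SRC).build()

a = np.arange(16, dtype=np.float32)
mf = cl.mem_flags
src_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
dst_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg.scale(queue, a.shape, None, src_buf, dst_buf, np.float32(2.0))
out = np.empty_like(a)
cl.enqueue_copy(queue, out, dst_buf)
print(out)  # a * 2.0, computed on whichever device the runtime chose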
 

Seero

Golden Member
Nov 4, 2009
1,456
0
0
The downside is that many of us will never spend $600 for a video card regardless of how cool it is. I have a hard time justifying $200 on a video card most of the time.

BTW though, your old Matrox PCI TripleHead card or whatever you were using is nowhere near the same as having an Eyefinity setup (it's your own argument about features: a 5870 Eyefinity setup can do everything the TripleHead-type setup can do, but not the other way around). Besides, Eyefinity is very cool, and hopefully nvidia will develop their own method of doing it (maybe they have already?). On the next round of upgrades I am definitely going to drop the $ on a few DP monitors. Seeing Eyefinity in Supreme Commander 2 was just cool, and if that support from developers continues, multi-monitor gaming certainly has a great future.
It isn't a Matrox or a TripleHead, just a simple PCI video card. It would probably fry if I tried to play games on it. However, I can run Oracle Forms across multiple displays as one window. The card is a Diamond S9250 PCI 256MB DDR.

As to taking over the market, that only works if there is supply available and the competition does not adjust its prices relative to nvidia. You don't seem interested in price/performance in standard one-monitor gaming, calling it an unbalanced comparison between feature sets, but for many of us that is exactly the metric we are looking for, because the other features are not compelling enough to spend much, if any, money on (yet).
If I am not mistaken, all video cards that allow multi-display can play games across multiple displays. The HD 4xxx can, the 8800 GT can. What they could not do is increase the resolution of the game, which is a driver issue that Nvidia has fixed (or now allows when it didn't before, for whatever reason), making all existing cards, 8800 and above, support multi-display gaming at high resolution.

I wasn't serious about the taking-over-the-market thing. If you stand still and let me punch you 10 times in the face, you will probably die; the chance of you dying from 10 direct hits to the face is extremely high, but the chance of you standing still and letting me punch you without breaking my face is near zero. ATI is not dumb. In fact, they are extremely smart. The key is that whatever they do is going to benefit the customer.
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Seeing the performance numbers, the price, the fact that AMD can support C/C++ high-level GPGPU computing as well through Brook+ or other tools, the fact that PhysX (or even Havok) support could be implemented through CAL / CTM (or a high-level language if they're lazy) on AMD hardware, and a complete lack of compelling software using these features on NVIDIA's hardware (it's all demos, add-ons, lame effects and stuff that's supposed to come out in the future) absolutely does mean that someone with a well-rounded perspective would conclude that the GTX 260 is not worth its price compared to the Radeon HD 4870. "

You are correct that the GTX 260 is not worth the price compared to the 4870.

However, PhysX does have support in a few games, and the frame rate is good enough to justify having it; you are basically getting the effect of an octo-core CPU for a fraction of the price. It's simply idiotic/sad that AMD turned its back on PhysX for absolutely no apparent reason other than saving a few bucks on licensing fees... had they adopted it (or even allowed people to hack it), we would have a much better/stronger PC gaming industry now (at least we'd have better physics than console gaming).

IMO, AMD/Microsoft are partially responsible for PC gaming losing to console gaming.
 

zerocool84

Lifer
Nov 11, 2004
36,041
472
126
lol @ the PhysX argument again. PhysX is nothing today. NVIDIA said it's going to change the way we play games, and we have yet to see anything like that. It's just some basic eye candy that can be, and has been, done on the CPU before. Come back to me when PhysX actually does something meaningful. Until then it's a non-factor.
 

at80eighty

Senior member
Jun 28, 2004
458
5
81
I figured I'd post some info for all those people who registered in the last few years and have no idea why some people are really opposed to having focus group members on AT.

http://www.google.com/search?source=...buthnot+nvidia

Long story short: there is more history to this than just Keys and a few other people receiving free stuff from NV and being open and honest about their affiliation with NV...

Read up on it, and make up your own mind.


hm...
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Eyefinity is a fancy name for multi-display, and multi-display really isn't anything new; I have been using it for years with a PCI (not PCIe) video card. 3D is also old, and 3D Vision is another fancy name.

To my surprise, the 5870 supports multi-display + DX11 and is cheaper than the 295. I have been telling people to go get the 5870 for this simple reason, as a few more FPS doesn't mean anything. However, once Fermi comes out, the tide simply changes. This won't be true forever, as new video cards will arise. As of today, it is too soon to say whether that is the case, simply because of the NDA.

If, however, Fermi turns out to be faster, has more features, and is not too expensive, then it will take over the market. Unfortunately, I also believe 600 bucks USD is what the average Joe will need to pay for a 480.
Do you know that there is no single NV card that can handle 3 monitors?
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
Since this thread is already way off topic, I figured I'd post some info for all those people who registered in the last few years and have no idea why some people are really opposed to having focus group members on AT.

http://www.google.com/search?source=ig&hl=en&rlz=&=&q=arbuthnot+nvidia

Long story short: there is more history to this than just Keys and a few other people receiving free stuff from NV and being open and honest about their affiliation with NV...

Read up on it, and make up your own mind.

I certainly don't discount what Keys has to say, because he's a smart guy, but I keep in mind that he could easily pay only about $600 for a rig the rest of us would pay $1800 for, thanks to his affiliation with NVIDIA. I'm not sure I could honestly say that, plus guaranteed availability, wouldn't 'sway' me a little...

That's an eye-opener indeed...
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Do you know that there is no single NV card that can handle 3 monitors?

Actually, there are quite a number; they are just either expensive outright or expensive for what they offer.
For instance, there are dual-GPU workstation cards with four outputs using two low-end GPUs, but they sell for a lot more than you would want to pay as a gamer for the performance.
 

ginfest

Golden Member
Feb 22, 2000
1,927
3
81
That's because he'd be gone out of here (and there) if he did, but the agenda is the same...

I applaud [H] for not letting nvidia salesmen post there...



You don't have to be in a "focus group" to be a shill or FUD spreader ;)
You shill harder for your "side" than Keys does for NV. Does that mean you should be gone out of here? :)

Don't waste your time with [H]; your team is going strong at B3D!
 

SolMiester

Diamond Member
Dec 19, 2004
5,330
17
76
Since this thread is already way off topic, I figured I'd post some info for all those people who registered in the last few years and have no idea why some people are really opposed to having focus group members on AT.

http://www.google.com/search?source=ig&hl=en&rlz=&=&q=arbuthnot+nvidia

Long story short: there is more history to this than just Keys and a few other people receiving free stuff from NV and being open and honest about their affiliation with NV...

Read up on it, and make up your own mind.

I certainly don't discount what Keys has to say, because he's a smart guy, but I keep in mind that he could easily pay only about $600 for a rig the rest of us would pay $1800 for, thanks to his affiliation with NVIDIA. I'm not sure I could honestly say that, plus guaranteed availability, wouldn't 'sway' me a little...

As opposed to those affiliated with ATI/AMD whom we don't know about!
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
You don't have to be in a "focus group" to be a shill or FUD spreader ;)
You shill harder for your "side" than Keys does for NV. Does that mean you should be gone out of here? :)

Don't waste your time with [H]; your team is going strong at B3D!

So, so true. I've seen many people in this video forum, on both the red and green sides, who make Keys look like a saint.

Most of them offer NO information; they just bash, bash, bash, whether it be Nvidia or ATI.
To me, Keys is informative; if he seems biased, I just don't listen to him.
I guess this is the difference between an adult and an adolescent.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
I don't believe it has ever been confirmed that AMD gives free stuff to people in exchange for them telling people on forums how great it all is. It's known that Nvidia does it.
 