nVidia GT200 Series Review Thread

Page 17

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I don't get why people keep saying nvidia makes terrible motherboards... they are by far the best. They have the most features (number of SATA ports, etc.), they OC well, and their drivers are damn good... With my P35 I was getting blue screens; I tried installing the drivers from the site and it was still iffy. Only when I put in the original disk that came with the board, which installed some weird things that weren't on the site, did the board start working right...
I have never had such an issue with an nvidia chipset...

AMD chipsets seem to have better driver support (though I have no personal experience with them), but their capabilities are atrocious...
1. They don't support intel CPUs, which makes them kind of useless.
2. They don't have nearly the number of ports and plugs that nvidia has.

The only bad thing I hear is that nforce6/7 has a rare video corruption bug and that it supposedly runs "hot" (I wouldn't know; the last boards I used were nforce5 and P35, and everything else is way too expensive for what it gives you).
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: taltamir
I don't get why people keep saying nvidia makes terrible motherboards... they are by far the best. They have the most features (number of SATA ports, etc.), they OC well, and their drivers are damn good... With my P35 I was getting blue screens; I tried installing the drivers from the site and it was still iffy. Only when I put in the original disk that came with the board, which installed some weird things that weren't on the site, did the board start working right...
I have never had such an issue with an nvidia chipset...

AMD chipsets seem to have better driver support (though I have no personal experience with them), but their capabilities are atrocious...
1. They don't support intel CPUs, which makes them kind of useless.
2. They don't have nearly the number of ports and plugs that nvidia has.

The only bad thing I hear is that nforce6/7 has a rare video corruption bug and that it supposedly runs "hot" (I wouldn't know; the last boards I used were nforce5 and P35, and everything else is way too expensive for what it gives you).

Um, taltamir, have you ever heard of a gigabyte 680i mobo? What about all those 680i owners who were told they could upgrade to a 45nm quad, only to hear "psyche!!" 6-9 months later? Overclocking on nvidia mobos (other than the 790i) isn't as good as on the intel equivalents at similar price points. The only reason to buy an nvidia mobo for an intel system is if you think you'll want to SLI. On the AMD side, otoh, nvidia has at least historically made the best mobos for overclocking; m2n xx sli mobos are still getting newegg awards, in fact. I don't know much about nvidia mobos for Phenom, but if last gen is any indication then I suspect they'll produce another winner.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
That has nothing to do with nvidia making bad-quality chipsets... that happened because the power circuitry made by the board maker was not good enough for those CPUs (which was probably intentional, but even if not...) and has nothing to do with the quality of the chipset.
And after getting enough flak for it, gigabyte decided to agree to replace those boards. (Again, you are blaming nvidia for making a terrible chipset because gigabyte was falsely advertising?)
Just like most 780G boards explode if you put in a 125 watt CPU or OC, because manufacturers skimped on the power circuitry. That has nothing to do with the quality of the chipset (which is superb, btw; I wish I could get it for LGA 775).

And last I checked, "intel is cheaper for similar performance" != better.
Intel boards are cheaper because they are not as feature-rich.
And a slightly higher OC on intel... ok, I can accept this, point for intel.

None of those reasons makes nvidia's chipsets the "OMFG TERRIBLE PIECES OF CRAP" everyone here is saying they are.
They OC slightly lower and they cost more, but they have many, MANY more features, and their drivers are a level above intel's (but not as good as AMD's... who doesn't make chipsets for intel anymore).
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Overall it seems to me that if they increased the price of these boards by somewhere between 300% and 400% and changed the name, people would see them as an insane deal.

No, I didn't mistype ;)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: BenSkywalker
Overall it seems to me that if they increased the price of these boards by somewhere between 300% and 400% and changed the name, people would see them as an insane deal.

No, I didn't mistype ;)

quadro?

@taltamir: how much hardcore oc'ing do you do? I have an x3350 on a $150 ip35 pro clocked at 3.6. I have two other quads at 3.4 and 3.5 on ip35e's that cost me $67 AR and $58. THAT is insane overclocking ability (on quads, too!) for insanely low prices.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
*sigh*

Take the mobo talk to a new thread in the right forum, guys; you know better. I'm getting tired of reminding you, and I don't want to ban you over something so trivial.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ViRGE
*sigh*

Take the mobo talk to a new thread in the right forum, guys; you know better. I'm getting tired of reminding you, and I don't want to ban you over something so trivial.

sorry... will behave. :)
 

JACKDRUID

Senior member
Nov 28, 2007
729
0
0
Originally posted by: bryanW1995
Originally posted by: JACKDRUID
Originally posted by: Extelleron

In general the HD 4870 will compete with the GTX 260. In certain situations it may be faster or slower. In a small number of situations, such as Crysis (which I was talking about in another thread), it will likely be about equal to, or a bit faster than, the GTX 280. I'll refer you to these benchmarks: http://www.nordichardware.com/news,7854.html
GTX 280 > HD 3870 by 50%. The HD 4870 will be at least 50% faster than the 3870, that is for sure, and thus it will equal the GTX 280 (or likely beat it).

Please indicate that you are basing these observations on pure rumors...

To the mods: can you please get Extelleron to stop spreading baseless FUD, for god's sake? There is absolutely no benchmark or data to back it up...

extelleron has been here a LOT longer than you and your ocguy buddy. I think that the mods are much more likely to give him the benefit of the doubt than people who only show up when nvidia needs some shilling.

The comparison is still baseless until the 4850/70 is actually released, regardless of how long we've been here. I suppose it's the use of his wording, "i'm sure its faster...", "4870 is faster..", that made me go ewww...

$450 for the 280 is a good price, but I won't buy it, simply because of its size/heat/power... I'd rather save the money for the 4870 (if it's better) or a die shrink.


 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91

Tesla, actually. It will do very well in the Quadro market, but the Tesla segment is going to go nuts. This thing is an absolute monster and will take a nice bite out of Intel's market share in that segment. That is also a VERY high-margin market. The closest comparison I can reasonably make that some of the laymen may understand is that Tesla would replace what something like a Cray supercomputer used to do.

The GTX 260 only has 11% more shading power than the 9800GTX

This, in all honesty, is laughable. You are looking at what amounts to a 'MIPS' rating of sorts and comparing two parts with enormous differences in capabilities. When you factor in how much of a performance impact the MUL has, you could be looking at a very large rift depending on the particular equation, not to mention DP, where the percentage change is roughly infinite ;)
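To put rough numbers on that gap (a back-of-the-envelope sketch, assuming the commonly quoted reference specs of 128 SPs at 1688 MHz for the 9800 GTX and 192 SPs at 1242 MHz for the GTX 260, and treating the dual-issue MUL as a third FLOP per clock; this is not an official NVIDIA breakdown):

```python
# Rough shader-throughput comparison, GTX 260 vs 9800 GTX.
# Assumed reference specs: 9800 GTX = 128 SPs @ 1.688 GHz, GTX 260 = 192 SPs @ 1.242 GHz.
def gflops(sps, shader_clock_ghz, flops_per_sp_per_clock):
    return sps * shader_clock_ghz * flops_per_sp_per_clock

# Counting only the MADD (2 FLOPs/clock) -- this is where the "11% more" figure comes from.
g92_madd   = gflops(128, 1.688, 2)   # ~432 GFLOPS (9800 GTX)
gt200_madd = gflops(192, 1.242, 2)   # ~477 GFLOPS (GTX 260)
print(gt200_madd / g92_madd)         # ~1.10 -> roughly 11% more

# If GT200 can actually use its extra MUL (3 FLOPs/clock) while G92 mostly cannot,
# the effective gap is much wider than the headline number suggests.
gt200_with_mul = gflops(192, 1.242, 3)   # ~715 GFLOPS
print(gt200_with_mul / g92_madd)         # ~1.66
```

And that is before double precision, which G92 has no hardware support for at all, so any DP comparison is effectively a divide-by-zero.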

Now, is any of this going to matter in games? Not likely in the timeframe this is a viable part. Most of the really big improvements the GT200 brings to the table are going to go unused in its reasonable lifetime.

GTX 280 > HD 3870 by 50%. The HD 4870 will be at least 50% faster than the 3870, that is for sure, and thus it will equal the GTX 280 (or likely beat it).

The GTX seems to be handily besting the 3870X2 in most benches. I'm not saying what you are saying isn't true, but it seems to me that even with a raw doubling of every performance metric (which the most optimistic reports don't have it at), it would still find knocking off the 280 a bit more than it could deal with. I don't think it is reasonable to try to set that part up in that light, either. If it comes in at 85% of the 280's performance for 60% of the price, then it will serve its design goal extremely well.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Azn
I don't understand why Nvidia didn't up their texture address and filtering units when they reworked the SPs for GT200. The old G92 cores had 8 texture address/filter units for every 16 SPs, but GT200 has 8 texture address/filter units for every 24 SPs. In modern games those texture units did a whole lot more for G92 than its higher SP clocks did.

GT200 has the same 8 texture address by 8 filter units per cluster as G92, but 10 clusters of 24 SPs instead of 8 clusters of 16 SPs, which works out to 80 TMUs. Texture fillrate was the biggest difference between G92 and G80, and it is why G92 was able to beat it at lower resolutions, or get very close at high resolution, with much lower memory bandwidth and fewer ROPs. If they had gone 12 by 12, which would be the exact same SP/texture ratio as G92, it would have 120 TMUs instead of 80. GT200 is inferior as far as texturing ability goes when you compare the ratio to G92.

GeForce 9800 GTX: 10.8 Gpixel/s pixel fillrate, 43.2 Gtexel/s bilinear fillrate, 21.6 Gtexel/s FP16 fillrate, 70.4 GB/s memory bandwidth

GeForce GTX 260: 16.1 Gpixel/s pixel fillrate, 41.5 Gtexel/s bilinear fillrate, 20.7 Gtexel/s FP16 fillrate, 111.9 GB/s memory bandwidth

GeForce GTX 280: 19.3 Gpixel/s pixel fillrate, 48.2 Gtexel/s bilinear fillrate, 24.1 Gtexel/s FP16 fillrate, 141.7 GB/s memory bandwidth

Games don't need all that processing power as of yet. Most game work is, for the most part, offloading to textures and back to memory (that's straight from Nvidia via nRollo). So having more fillrate makes the biggest difference when you want the performance NOW, as long as you aren't shader limited. Sure, the 280 GTX has more fillrate than the 9800 GTX, but in reality it doesn't have that much more, and the 260 GTX has even less than the 9800 GTX. This is where bandwidth comes into play: GT200 isn't so limited compared to the 9800 GTX, and that's where you see the performance gains in most games. Just look at any of the reviews. You will see that the 260 GTX isn't that far off in performance from the 9800 GTX; only when AA is applied at some uber-high resolution does it seem much faster, because of its bandwidth advantage. Nvidia made a future-looking product, like the 2900 XT tried to be. But it's still not happening.
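The ratios buried in those spec lines make the point concrete (simple arithmetic on the figures quoted above, nothing beyond them):

```python
# Ratios from the spec lines above: GTX 260 and GTX 280 vs the 9800 GTX.
specs = {  # (bilinear fillrate in Gtexels/s, memory bandwidth in GB/s)
    "9800 GTX": (43.2,  70.4),
    "GTX 260":  (41.5, 111.9),
    "GTX 280":  (48.2, 141.7),
}
base_fill, base_bw = specs["9800 GTX"]
for card in ("GTX 260", "GTX 280"):
    fill, bw = specs[card]
    print(f"{card}: {fill / base_fill:.2f}x the texture fillrate, {bw / base_bw:.2f}x the bandwidth")
# GTX 260: 0.96x the texture fillrate, 1.59x the bandwidth
# GTX 280: 1.12x the texture fillrate, 2.01x the bandwidth
```

So the GTX 260 actually textures slightly slower than a 9800 GTX and only separates itself where its extra bandwidth matters, i.e. with AA at high resolutions, which is exactly the pattern the reviews show.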

Merged into the main GT200 thread

They did not increase the texture units per cluster, but they increased the total number of clusters...
Increasing texture power by 25%... just enough to make 16x AA feasible and low-impact.
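The cluster math behind that 25% figure (a minimal sketch, assuming the commonly cited configurations of 8 TMUs per cluster, 8 clusters on G92 vs. 10 on GT200, and reference core clocks of 675 MHz for the 9800 GTX and 602 MHz for the GTX 280):

```python
# TMU count and bilinear fillrate comparison, G92 (9800 GTX) vs GT200 (GTX 280).
# Assumed: 8 TMUs per cluster on both chips; 8 clusters vs 10; 675 MHz vs 602 MHz core clock.
g92_tmus,   g92_core_mhz   = 8 * 8,  675    # 64 TMUs
gt200_tmus, gt200_core_mhz = 10 * 8, 602    # 80 TMUs

print(gt200_tmus / g92_tmus)   # 1.25 -> 25% more texture units

g92_fill   = g92_tmus   * g92_core_mhz   / 1000    # ~43.2 Gtexels/s
gt200_fill = gt200_tmus * gt200_core_mhz / 1000    # ~48.2 Gtexels/s
print(gt200_fill / g92_fill)   # ~1.12 -> only ~12% more delivered bilinear fillrate
```

So the unit count goes up 25%, but because the GTX 280's core clock is lower than the 9800 GTX's, the delivered bilinear fillrate only rises about 12%, which is exactly what the 43.2 vs. 48.2 Gtexel/s figures quoted above show.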
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: BenSkywalker
Now, is any of this going to matter in games? Not likely in the timeframe this is a viable part. Most of the really big improvements the GT200 brings to the table are going to go unused in it's reasonable lifetime.

That's kinda what I've been wondering about. The GT200 cards don't really appear to improve "game-specific" processing power all that much; they are much more focused on increasing overall computational power (CUDA), which isn't utilized (yet) for gaming.

I think, though, that this may get used before these cards become obsolete. One word: PhysX. I have a feeling that nV hopes to make PhysX an integral part of gaming that won't be available on anyone else's hardware.

EDIT: I also have to wonder how well AMD will implement Havok on their hardware. And if they succeed, which version of physics support will game developers be more likely to adopt?
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I think that we've seen a shift in intel's thinking over the past 12 months. Instead of amd being their top rival, it is now nvidia. Intel is going to work VERY closely with amd to keep them competitive against the green team for the foreseeable future. If larrabee creeps up towards the high end that will obviously change, but for now amd and intel have a common enemy: nvidia. My enemy's enemy is my friend, as they say, no? A strong amd graphics division gives intel a true option to block out nvidia chipsets completely from their hardware, and will incidentally make amd a much more competitive player in the cpu market...this whole menage a trois between intel/nvidia/amd is extremely interesting. Not as interesting to me as it is to the guy who spent $80 at a strip club and $649 on a gtx 280, but interesting nonetheless.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
Originally posted by: Denithor
EDIT: I also have to wonder how well AMD will implement Havok on their hardware. And if they succeed, which version of physics support will game developers be more likely to adopt?

Probably neither, unless one has a much greater market share than the other. I wouldn't alienate half of my potential customers by supporting physics in the game I am making only for people with one particular brand of GPU, even if it does make the game better. That would be a great way for me to lose a bunch of money. Physics probably won't become commonplace until a major DirectX release mandates that cards support a standardized way of running it.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: bryanW1995
I think that we've seen a shift in intel's thinking over the past 12 months. Instead of amd being their top rival, it is now nvidia. Intel is going to work VERY closely with amd to keep them competitive against the green team for the foreseeable future. If larrabee creeps up towards the high end that will obviously change, but for now amd and intel have a common enemy: nvidia. My enemy's enemy is my friend, as they say, no? A strong amd graphics division gives intel a true option to block out nvidia chipsets completely from their hardware, and will incidentally make amd a much more competitive player in the cpu market...this whole menage a trois between intel/nvidia/amd is extremely interesting. Not as interesting to me as it is to the guy who spent $80 at a strip club and $649 on a gtx 280, but interesting nonetheless.

lol... you have no idea how intel plans to screw over both companies. Intel is a friend to no one; they will destroy anyone and anything that stands in their path. Intel has a different plan than the one you speak of. Also, Intel did not assist AMD with its graphics department.

I am actually excited about Larrabee, since it's going to be super easy to develop on.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: tuteja1986
Originally posted by: bryanW1995
I think that we've seen a shift in intel's thinking over the past 12 months. Instead of amd being their top rival, it is now nvidia. Intel is going to work VERY closely with amd to keep them competitive against the green team for the foreseeable future. If larrabee creeps up towards the high end that will obviously change, but for now amd and intel have a common enemy: nvidia. My enemy's enemy is my friend, as they say, no? A strong amd graphics division gives intel a true option to block out nvidia chipsets completely from their hardware, and will incidentally make amd a much more competitive player in the cpu market...this whole menage a trois between intel/nvidia/amd is extremely interesting. Not as interesting to me as it is to the guy who spent $80 at a strip club and $649 on a gtx 280, but interesting nonetheless.

lol... you have no idea how intel plans to screw over both companies. Intel is a friend to no one; they will destroy anyone and anything that stands in their path. Intel has a different plan than the one you speak of. Also, Intel did not assist AMD with its graphics department.

I am actually excited about Larrabee, since it's going to be super easy to develop on.

No, I fully understand intel's plan. It is no different from what anybody else's plan would be in the same situation. All that I meant was that they (correctly) perceive nvidia as a much more imminent threat than amd, so they choose to prop up nvidia's competition. Of course they would prefer to bury both of these rivals if possible, but unfortunately for intel that is not in the cards at present. I would fully expect amd to do the same thing if the roles were reversed.

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Camped the UPS guy earlier and installed my EVGA GTX 280. No problems at all so far with PSU power, monitor outputs or anything and installed the latest WHQL 177.35 from NV's site. A few things of note:

1) It's big, but not as heavy as it looks. The PCB is as long as the 8800GTX's, but it seems larger because it's fully shrouded. The shroud itself adds about 1/4" to the end and back of the card compared to the 8800GTX. I was able to use my case's middle passthrough fan with the 8800GTX with a bit of effort, but that's no longer possible with the GTX 280.

2) The EVGA bundle, even though it seems REALLY skimpy compared to some of the other ones out there, is actually pretty good. The big thing that isn't advertised is that it includes what seems to be a full version of FRAPS 2.9.4. I've already had a registered version for a long, long time (it's well worth the $40 or whatever it is now), but for anyone who doesn't have the full version, it's a great bundle throw-in. It also includes a special edition of Precision and an EVGA bubble sticker I haven't seen with previous EVGA cards.

3) The card itself is virtually silent at idle, running at 40% fan speed, and very cool, only 53-55C. I haven't done any extended gaming, just some benches and running around various games for 10-15 minutes, and the fan speed never went above 55% while temps never exceeded 73-75C. It's cooler than my GTX, which I ran at 90% fan speed while gaming. I haven't had to create a profile or adjust fan speed manually, as the fan control, whether through the driver or BIOS, is *very* responsive, throttling up quickly as soon as temps rise.

4) I overclocked to EVGA "FTW" speeds of 670/2400, and so far so good. Certainly another reason to go with one of the better US vendors like XFX, BFG, or EVGA is their superior support and warranty. Even though EVGA does test/bin their cards, I figured they may just be validating the OC versions rather than testing everything and binning them.

Ran a few benchmarks, and so far the card is significantly faster than my 8800GTX; it lets me crank features back up to max in games where I'd had to turn settings down to run smoothly. AA up to 4x results in very little performance penalty with this card, which is a big difference, since I rarely enabled AA on my 8800GTX in newer titles.
 

MraK

Senior member
Oct 12, 2003
417
0
0
Would it be a waste, as of now, to buy two GTX 280s, or is it more realistic just to get one GTX 280 for now? The reason I'm asking about two rather than one is that I will be using a 26" IPS LCD monitor.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,176
516
126
What is the 26" IPS LCD's resolution? If it is only 1920x1200, a single card will be more than enough.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: bryanW1995
I think that we've seen a shift in intel's thinking over the past 12 months. Instead of amd being their top rival, it is now nvidia. Intel is going to work VERY closely with amd to keep them competitive against the green team for the forseeable future. If larrabee creeps up towards the high end that will obviously change, but for now amd and intel have a common enemy: nvidia. My enemy's enemy is my friend as they say, no? A strong amd graphics division gives intel a true option to block out nvidia chipsets completely from their hardware, and will incidentally make amd a much more competitive player in the cpu market...this whole menage a trois between intel/nvidia/amd is extremely interesting. Not as interesting to me as it is to the guy who spent $80 at a strip club and $649 on a gtx 280, but still interesting nonetheless.

My enemy's enemy is my enemy's enemy, nothing more, nothing less.
 

MraK

Senior member
Oct 12, 2003
417
0
0
Originally posted by: Fallen Kell
What is the 26" IPS LCD's resolution? If it is only 1920x1200, a single card will be more than enough.

So in order to make use of two GTX 280s I would need a 30" monitor? I always thought a 1920x1200 monitor would be enough for two GTX 280s.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
You can use 2 x GTX 280 with whatever monitor size you want... two of them MIGHT (but probably won't) allow you to max out Crysis at lower resolutions... (at 30 inches they definitely will not).
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: MrAK
Originally posted by: Fallen Kell
What is the 26" IPS LCD's resolution? If it is only 1920x1200, a single card will be more than enough.

So in order to make use of two GTX 280s I would need a 30" monitor? I always thought a 1920x1200 monitor would be enough for two GTX 280s.

As Taltamir said, you can use GTX 280 SLI with a monitor of any resolution. Of course, if you are gaming on a 19" 1280x1024 panel, you will not need SLI; a single 9800GTX would be enough for something like that. GTX 280 SLI is overkill for 1920x1200 in general; everything but Crysis would run fine on a single GTX 280 at that res. But if you want to run Crysis very well, then GTX 280 SLI will be good for your monitor.
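For a rough sense of why the monitor size matters so much (just illustrative arithmetic on the resolutions being discussed; frame rate does not scale perfectly with pixel count, but it is the first-order factor):

```python
# Pixels per frame at the resolutions mentioned in this thread.
resolutions = {
    "1280x1024 (19in)":    1280 * 1024,
    "1920x1200 (24-26in)": 1920 * 1200,
    "2560x1600 (30in)":    2560 * 1600,
}
base = resolutions["1920x1200 (24-26in)"]
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f} Mpixels ({px / base:.2f}x of 1920x1200)")
# 1280x1024 ~1.31 Mpixels (0.57x), 1920x1200 ~2.30 Mpixels (1.00x), 2560x1600 ~4.10 Mpixels (1.78x)
```

A 30" panel pushes nearly 80% more pixels per frame than 1920x1200, which is why a second GTX 280 starts to pay off there while remaining overkill for most games at 24-26".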

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I wonder if anyone will make and post a video showing Crysis gameplay on GTX 280s in SLI... I want to see how the game was meant to look.