GT200b to launch at Nvision 08 (Aug 25-27)


9nines

Senior member
Sep 6, 2006
215
0
0
Originally posted by: chizow



Besides, microstutter is just the top of the list of issues with multi-GPU solutions; I was actually referring to scaling issues, which are still a very valid concern. Basing performance on an ATI-mandated limit of 4 popular titles isn't exactly what I'd call an accurate representation of performance and scaling.
Something else I have noticed in reviews that report minimum FPS is that CF 4870s often have the same minimum FPS (sometimes even lower minimums) as a single 4870. Isn't that going to lower the quality compared to a more powerful single GPU that has higher minimum FPS? In other words, if both setups have higher maximum FPS than a human eye will detect, who cares about the 20% to 40% higher maximum of the 4870X2 if it has minimum FPS that cause more stutter than a single GTX 280 experiences?

 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: 9nines
Originally posted by: chizow



Besides, microstutter is just the top of the list of issues with multi-GPU solutions; I was actually referring to scaling issues, which are still a very valid concern. Basing performance on an ATI-mandated limit of 4 popular titles isn't exactly what I'd call an accurate representation of performance and scaling.
Something else I have noticed in reviews that report minimum FPS is that CF 4870s often have the same minimum FPS (sometimes even lower minimums) as a single 4870. Isn't that going to lower the quality compared to a more powerful single GPU that has higher minimum FPS? In other words, if both setups have higher maximum FPS than a human eye will detect, who cares about the 20% to 40% higher maximum of the 4870X2 if it has minimum FPS that cause more stutter than a single GTX 280 experiences?
Absolutely. That would be a prime example of where you're paying more expecting better performance where you need it most, but only getting single card performance. The AVG FPS might be very different though and show an improvement, but only because you were getting higher frame rates in less intensive sequences where you didn't need them to begin with.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: 9nines
Originally posted by: chizow



Besides, microstutter is just the top of the list of issues with multi-GPU solutions; I was actually referring to scaling issues, which are still a very valid concern. Basing performance on an ATI-mandated limit of 4 popular titles isn't exactly what I'd call an accurate representation of performance and scaling.
Something else I have noticed in reviews that report minimum FPS is that CF 4870s often have the same minimum FPS (sometimes even lower minimums) as a single 4870. Isn't that going to lower the quality compared to a more powerful single GPU that has higher minimum FPS? In other words, if both setups have higher maximum FPS than a human eye will detect, who cares about the 20% to 40% higher maximum of the 4870X2 if it has minimum FPS that cause more stutter than a single GTX 280 experiences?
Yes, the average (displayed) FPS will be higher, but perceived smoothness will be lower due to the lower minimum FPS. This issue is closely related to micro-stutter in multi-GPU "solutions".
 
Jul 6, 2008
135
0
0
Originally posted by: chizow
Originally posted by: The Odorous One
That is my experience as well, no micro-stuttering observed at all. I have only replied to this thread to quell the misinfo. Funny how almost all of the 48xx CF reviews make no mention of micro-stuttering either. This is what happens when people who are not qualified to make comments comment anyway :roll:
Originally posted by: evolucion8
If he says that he doesn't have microstutter with his setup, who can prove him wrong?
Once again, just because you guys don't "hear the music" or notice the phenomenon, for whatever reason, does not mean microstutter doesn't exist. I've already challenged someone else claiming the same to prove it doesn't exist rather simply: all it involves is using FRAPS to log FPS and posting the results so that we can take a look at them. Ideally you'd use a game/sequence that varies in FPS and drops below the refresh rate, as frame rates higher than refresh will normalize delay and mask any microstutter.
It's not perceived by me. How you can possibly continue making statements without using multi-gpu is baffling :confused:

Your psu has ripple too, can you perceive that?
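[Editor's note: the FRAPS logging challenge above can be made concrete. Below is a minimal sketch of the kind of frame-time analysis being described, with invented example numbers; FRAPS can dump per-frame times via its benchmark "frametimes" log, and the metric here (ratio of adjacent frame times) is one common way posters of this era quantified AFR micro-stutter.]

```python
# Sketch of the FRAPS-style frame-time check described above.
# Assumes a list of per-frame render times in milliseconds; the
# numbers below are made up for illustration.

def microstutter_ratio(frametimes_ms):
    """Return the average ratio of adjacent frame times.

    A value near 1.0 means evenly spaced frames; a value well above 1.0
    means alternating short/long frames -- the classic AFR micro-stutter
    signature -- even when the average FPS looks fine.
    """
    ratios = []
    for prev, cur in zip(frametimes_ms, frametimes_ms[1:]):
        longer, shorter = max(prev, cur), min(prev, cur)
        ratios.append(longer / shorter)
    return sum(ratios) / len(ratios)

# A smooth 50 FPS run: every frame takes 20 ms.
smooth = [20.0] * 10
# An AFR-style run that also averages 50 FPS, but alternates 10 ms / 30 ms.
stutter = [10.0, 30.0] * 5

print(microstutter_ratio(smooth))   # 1.0 -> even pacing
print(microstutter_ratio(stutter))  # 3.0 -> heavy alternation
```

Both runs report the same average FPS, which is exactly why a plain FPS counter can't settle the argument: only the frame-time log exposes the alternation.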
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Odorous... they accept that you do not perceive it yourself; it is clearly stated that not everyone perceives micro-stutter. But you claim that just because you don't perceive it, nobody else does either.

And I would have perceived my PSU's ripple had I electrical sensitivity (like a shark) and been looking right at it. Seeing as humans have no such sense, then no, we can't. This has nothing to do with micro-stutter, though.

Virge, you warned about getting too heated. Did you mean we should cease discussion of this topic, or did you mean that we should keep such a discussion civil?
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: keysplayr2003
Anyway..... If the current GT280, for example, can o/c to 700/1500 on a 65nm process, I think we can look forward to some nice stock clocks on the 55nm GT200b's. And Nvidia should be very competitive on the price point. I do hope they keep the 512bit bus however.
Actually, the move to 55nm does not guarantee any clock improvements at all. Even Intel, moving from 65nm to 45nm, didn't see that huge of a jump in clocks. Most people were already hitting 3.6GHz on the E6600's, and now the E8400's can generally hit around 4GHz... Forecasts predicted they would hit 4.5+ no problem, but that isn't the case. Once the 45nm process matures, it will yield better results. But if moving from 65 to 45 was that small of a jump in clock, then my guess is 65nm to 55nm will be even smaller.

If I had to guess, I'd put the new GT200B at 700Mhz (maybe 750 for O/C) core, 1600ish for Shader and around 2400/2500 for memory. The chips should run a bit cooler clock for clock, but 55nm isn't going to revolutionize the GT200 at all.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Anyway..... If the current GT280, for example, can o/c to 700/1500 on a 65nm process, I think we can look forward to some nice stock clocks on the 55nm GT200b's. And Nvidia should be very competitive on the price point. I do hope they keep the 512bit bus however.
Actually, the move to 55nm does not guarantee any clock improvements at all. Even Intel, moving from 65nm to 45nm, didn't see that huge of a jump in clocks. Most people were already hitting 3.6GHz on the E6600's, and now the E8400's can generally hit around 4GHz... Forecasts predicted they would hit 4.5+ no problem, but that isn't the case. Once the 45nm process matures, it will yield better results. But if moving from 65 to 45 was that small of a jump in clock, then my guess is 65nm to 55nm will be even smaller.

If I had to guess, I'd put the new GT200B at 700Mhz (maybe 750 for O/C) core, 1600ish for Shader and around 2400/2500 for memory. The chips should run a bit cooler clock for clock, but 55nm isn't going to revolutionize the GT200 at all.
You could be right there. A die shrink of course does not guarantee higher clocks; 9 times out of 10, though, that has been the case. I think your numbers, 700-750, aren't far off from what we'll see.
I wonder about that NVIO chip though. Will they absorb its functionality into the core, or will it remain discrete? G80 to G92 had some pretty large modifications: fewer ROPs, more texture units, smaller bus. Who knows what NV has in store for GT200.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: keysplayr2003
Originally posted by: ArchAngel777
Originally posted by: keysplayr2003
Anyway..... If the current GT280, for example, can o/c to 700/1500 on a 65nm process, I think we can look forward to some nice stock clocks on the 55nm GT200b's. And Nvidia should be very competitive on the price point. I do hope they keep the 512bit bus however.
Actually, the move to 55nm does not guarantee any clock improvements at all. Even Intel, moving from 65nm to 45nm, didn't see that huge of a jump in clocks. Most people were already hitting 3.6GHz on the E6600's, and now the E8400's can generally hit around 4GHz... Forecasts predicted they would hit 4.5+ no problem, but that isn't the case. Once the 45nm process matures, it will yield better results. But if moving from 65 to 45 was that small of a jump in clock, then my guess is 65nm to 55nm will be even smaller.

If I had to guess, I'd put the new GT200B at 700Mhz (maybe 750 for O/C) core, 1600ish for Shader and around 2400/2500 for memory. The chips should run a bit cooler clock for clock, but 55nm isn't going to revolutionize the GT200 at all.
You could be right there. A die shrink of course does not guarantee higher clocks; 9 times out of 10, though, that has been the case. I think your numbers, 700-750, aren't far off from what we'll see.
I wonder about that NVIO chip though. Will they absorb its functionality into the core, or will it remain discrete? G80 to G92 had some pretty large modifications: fewer ROPs, more texture units, smaller bus. Who knows what NV has in store for GT200.
A die shrink can only be good. So I am hoping they can really bring down the costs and increase the performance at the same time. But I am a bit wary of purchasing a card right now, especially if what nVidia hints at holds true. It seems the minute I purchase an expensive graphics card, one is released shortly after that demolishes it. Or it takes a huge price cut. I was hoping that GT200 was more than it ended up being, so in many ways I am very glad I decided not to buy it. Just look, it went from $650 to $400 in a month! Crazy times! I think I will be skipping this generation anyway... When I really sit back and honestly evaluate the performance of the 8800GT/GTS/GTX, I can't really see a compelling reason to upgrade for 16x10 or 19x12 displays...
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: ArchAngel777
When I really sit back and honestly evaluate the performance of the 8800GT/GTS/GTX, I can't really see a compelling reason to upgrade for 16x10 or 19x12 displays...
That may be true at 1680, but it's definitely not the case at 1920. Many recent titles were already taxing my 8800 GTX to the point that frames were constantly below 40 FPS, teetering on the edge of being unplayable.

Here's a quick rundown of more recent titles:

Group 1
LOTRO DX10
CoH + OF DX10
World In Conflict
Sup Comm + FA
NWN2 + MotB
Mass Effect
Assassin's Creed
The Witcher
Crysis
Gears of War

Huge improvement in gameplay experience, not only frame rates but also enabling higher levels of IQ and detail. In some cases, I'm also able to enable AA but many of these demanding titles were borderline playable on a single 8800 GTX with some settings turned down.

Group 2
Bioshock
COD4
STALKER
Titan's Quest + IT

No major improvement in frame rates; upgrading just pushed FPS to the 60 cap or to a more constant, higher frame rate. The upgrade has mainly allowed me to push up AA settings.

Group 3
FEAR
GRAW2
Sins of a Solar Empire
CS:S
Anything older than 2 years

Frame rates were already high enough that it didn't really matter, but enabling higher levels of AA is now possible without any perceived performance hit.
 

toslat

Senior member
Jul 26, 2007
216
0
76
On micro-stutter:
Micro-stutter exists in both single-GPU and multi-GPU setups, and it's generally worse in the latter. People differ in the level of micro-stutter they can discern. Stating it as a general disadvantage of multi-GPU is not really fair, else I could say that anything above 60FPS (or even 30FPS) is wasted since a lot of people couldn't discern it. People like BFG10K, who seem to be sensitive to micro-stutter, are free to shun multi-GPU and any other setups/cards they feel are not to their liking. Even people who make their buying decisions based on color should be free to do so; after all, it's their money. Guys should just let the ms issue rest: does it exist? Yes. Will everybody notice it? No. Will I notice it? The only way to know for sure is to try it. Anything else IMO is a wasted effort.

On minimum FPS:
A single stated value of minimum FPS doesn't tell you enough to make an informed decision. A better resource would be the distribution of the FPS. For all you know, the dips might occur once in the game or, at the other end, every other frame (as in micro-stutter). I think an average FPS + variance, though not ideal, is better than the average-min-max set.
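[Editor's note: the point about distributions is easy to demonstrate. A rough illustration, with invented FPS samples: two runs with identical min and max, and similar averages, can have very different distributions — which is exactly what an average-min-max set hides.]

```python
# Two synthetic per-second FPS traces with the same min/max.
import statistics

def fps_summary(fps_samples):
    """Summarize FPS samples as suggested above: average plus
    variance, alongside the usual min/max."""
    return {
        "min": min(fps_samples),
        "max": max(fps_samples),
        "avg": statistics.mean(fps_samples),
        "variance": statistics.pvariance(fps_samples),
    }

# One isolated dip to 30 FPS...
single_dip = [60, 60, 60, 60, 30, 60, 60, 60]
# ...versus oscillation that hits 30 FPS every other sample.
oscillating = [30, 60, 30, 60, 30, 60, 30, 60]

print(fps_summary(single_dip))
print(fps_summary(oscillating))
```

Both traces report min=30 and max=60, but the oscillating run has far higher variance (225 vs ~98), and would feel far rougher in-game despite near-identical headline numbers.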

To more relevant stuff:
One of the downsides to die shrinking is the reduced surface area for heat dissipation, and I feel this is what may limit the achievable clocks.

It definitely would be interesting to see how it goes for Nvidia, it being their first 55nm, though I was expecting them to jump to 45nm. As AMD has found out in the CPU biz, being a node behind your competition could increase the hurt.

All in all, it's a good time for the consumer. Now for Intel to join the fray!
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: toslat
On micro-stutter:
Micro-stutter exists in both single-GPU and multi-GPU setups, and it's generally worse in the latter. People differ in the level of micro-stutter they can discern. Stating it as a general disadvantage of multi-GPU is not really fair, else I could say that anything above 60FPS (or even 30FPS) is wasted since a lot of people couldn't discern it. People like BFG10K, who seem to be sensitive to micro-stutter, are free to shun multi-GPU and any other setups/cards they feel are not to their liking. Even people who make their buying decisions based on color should be free to do so; after all, it's their money. Guys should just let the ms issue rest: does it exist? Yes. Will everybody notice it? No. Will I notice it? The only way to know for sure is to try it. Anything else IMO is a wasted effort.
The first step is acknowledging it does exist, of course. I agree that the only way to know if it bothers you is to see it first-hand, but for people to blindly recommend CF/SLI or deny its existence is poor and irresponsible advice. If I committed to multi-GPU and plunked down $400-600 for an "upgrade" only to find I couldn't stand it, I'd be pretty pissed.

On minimum FPS:
A single stated value of minimum FPS doesn't tell you enough to make an informed decision. A better resource would be the distribution of the FPS. For all you know, the dips might occur once in the game or, at the other end, every other frame (as in micro-stutter). I think an average FPS + variance, though not ideal, is better than the average-min-max set.
I'd prefer FPS vs time graphs and an attached .txt or spreadsheet with frame dumps. Then there'd almost be no need for attached commentary as the graph itself would address many of the biggest issues you see around here. Unfortunately AT has stated they will not do this, so I'll have to continue looking elsewhere for truly relevant info.

To more relevant stuff:
One of the downsides to die shrinking is the reduced surface area for heat dissipation, and I feel this is what may limit the achievable clocks.

It definitely would be interesting to see how it goes for Nvidia, it being their first 55nm, though I was expecting them to jump to 45nm. As AMD has found out in the CPU biz, being a node behind your competition could increase the hurt.

All in all, it's a good time for the consumer. Now for Intel to join the fray!
There used to be quite a bit of discussion about die size and heat dissipation, but it really hasn't materialized in any negative manner. GT200b isn't their first move to 55nm; they've already cut their teeth with the G92-based 9800 GT and GTX+. It's also a good indication NV should be able to eke out higher clocks and hit previous OC'd model speeds even if they don't save on power or temps.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: 9nines
Something else I have noticed in reviews that report minimum FPS is that CF 4870s often have the same minimum FPS (sometimes even lower minimums) as a single 4870.
Please define what you mean by "often". Because in the reviews I see, which actually show a 4870 vs 4870CF (and also show MIN fps... there aren't too many that do), one would be hard-pressed to call what you describe as "often".

Let's look here first. They test Bioshock, COD4, UT3, and Crysis across several resolutions. The only game where a CF setup has a lower MIN fps than a single 4870 is UT3. As for CF having the same MIN fps as a 4870, that tends to happen at lower resolutions such as 16x12, where one would be making a very questionable purchase to begin with.

Now, let's look here. They test A LOT more games... 10 total. Only in 2 of them does a 4870 CF setup have lower MIN fps than a single 4870 (GRID and WIC, which they say is due to a driver bug).

Either way, I wouldn't call that happening "often". But if you have other reviews you can link to that show this happening more, please do. I haven't been able to find many reviews that show CF vs a single card AND show min fps too. Few and far between. Maybe we'll see more once the X2 is released.
 

finbarqs

Diamond Member
Feb 16, 2005
4,057
2
81
Remember guys, if the 4870X2 is ANYTHING like the 3870X2, then we're in for an awesome treat! First of all, it runs natively with standard ATI drivers, meaning that we do not need to enable "CF" mode to get the framerates of CF! Second, DUAL MONITOR support: nothing from Nvidia based on 2 GPUs can do dual-head output when SLI is enabled. ATI's can.
 

extra

Golden Member
Dec 18, 1999
1,947
6
81
Really seems like a lot of folks are grasping at straws for any reason they can to bash multi-gpu setups... Also hafta giggle when people bring up a few fluke minimum framerate issues especially when it's like 1 or 2 fps...

In the vast majority of current popular games and hit new release titles the 4870x2 does very very well. And it hasn't even been released yet so the drivers can only get better. If you want to buy a single gpu card that performs worse in the *vast majority of currently popular titles that people will be playing* because you play some old and/or obscure titles that most people don't, well, cool, do that then...but don't try to use that to bash multi-gpu solutions. That's like bashing a corvette because it doesn't have four seats or something...makes no sense.

And I'm sure at some point after the die shrink there will be a good ol' 280x2 card too...

As multi gpu becomes more and more popular, and it looks like that is the direction ATI as a company is heading, support is going to get much more mature and kinks will be worked out much faster than they are now with them being a niche product.
 

Hauk

Platinum Member
Nov 22, 2001
2,808
0
0
Specs according to an SLI forum member..

- 240 Shaders
- 80 TMUs
- 32 ROPs
- 512-bit bus (thus likely 1GB of GDDR3 RAM)
- Shader ratio back at 2.5 from 2.15
- More transistors than GT200 (with the sole purpose of increasing the shader clock)
- Shader domain works in 54MHz steps
- 1166 GFlops
- 648/1620 or 675/1674 (core/shader) - these are likely options only
- Probably 236W TDP

He won't reveal his source though..
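[Editor's note: the rumored numbers can at least be checked for internal consistency. GT200-class shaders are usually credited with 3 FLOPs per clock (dual-issue MAD + MUL); under that assumption, the quoted GFLOPS figure and the 54 MHz shader-domain step line up with the listed clocks. This is a consistency check of the rumor, not confirmation of it.]

```python
# Sanity-check the rumored GT200b spec list above.
SHADERS = 240
FLOPS_PER_CLOCK = 3   # MAD + MUL per shader per clock (GT200 convention)
STEP_MHZ = 54         # rumored shader-domain clock granularity

for shader_mhz in (1620, 1674):
    gflops = SHADERS * FLOPS_PER_CLOCK * shader_mhz / 1000
    steps = shader_mhz // STEP_MHZ
    print(f"{shader_mhz} MHz -> {gflops:.1f} GFLOPS, {steps} x {STEP_MHZ} MHz")
```

At 1620 MHz this works out to 1166.4 GFLOPS, matching the quoted 1166 figure, and both listed shader clocks are exact multiples of 54 MHz (30x and 31x), so the rumor at least hangs together arithmetically.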


 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: extra
Really seems like a lot of folks are grasping at straws for any reason they can to bash multi-gpu setups...
I don't agree. I think people want the best bang for the buck, and SLI/Crossfire will never provide that until it can scale ~95% in every game without relying on ATi or nVidia writing a driver specifically for it. I can't remember who said it here, but someone said a long time ago, when responding to Rollo, that 'SLI' and 'bang for the buck' should never be used in the same sentence. Currently, of course, they shouldn't!

The only thing that particularly annoys me is all the people who once praised SLI now singing a different tune when ATi finally has the fastest single card via a multi-GPU setup. I know who these people are, and so does the majority of the active forum. Best to just ignore their posts in general, as you know they cannot ever be honest or truthful; at least, no more than your average politician.

In any case, ATi will have the fastest single card via multi-GPU. nVidia has the fastest single GPU, and nVidia has the fastest multi-GPU setup. Those are the facts... It doesn't matter whether you want to use multi-GPU or not; those are still the facts.

Instead of this tiresome bickering between both sides, why not just admit the facts, then leave it up to your wallet as for what card you decide to go with? Seems like a no-brainer to me. Y'all have too much time on your hands if you can sit here and argue over graphics cards over and over and over...
 

toslat

Senior member
Jul 26, 2007
216
0
76
Originally posted by: chizow
There used to be quite a bit of discussion about die size and heat dissipation, but it really hasn't materialized in any negative manner. GT200 isn't their first move to 55nm, they've already cut their teeth with G92 9800 GT and GTX+. Its also good indication NV should be able to eek out higher clocks and hit previous OC'd model speeds even if they don't save on power or temps.
Quite right you are. With all the recent action, I had quite forgotten about the 9800GTX+ :)
 

toslat

Senior member
Jul 26, 2007
216
0
76
Originally posted by: finbarqs
remember guys, if the 4870X2 is ANYTHING like the 3870x2, then we're in for an awesome treat! First of all, it runs natively with standard ATI drivers, meaning that we do not need to enable "CF" mode to get it the framerates of CF! Second, DUAL MONITOR Support. Nothing from Nvida based on 2 GPU's can do dual head output, when SLI is enabled. ATI's you can.
Remember guys, this is a GT280+ thread. Such talk belongs in the 4870X2 thread.
 

bryanW1995

Lifer
May 22, 2007
11,143
32
91
Originally posted by: taltamir
so... the G200b... I literally had to look up the thread title because after reading all those last posts I completely forgot.

The G200b is now rumored for Aug 25... that is exactly 31 days from today. Pretty nifty. I got that GTX260 SC for the $225 deal, no regrets there; I might even step up later if it goes by the MSRP of the original purchase rather than the price of the original purchase.

Anyone curious as to how well it will perform?
I'm not. It will perform just like an overclocked GT200. More importantly, it will be cheaper for nvidia to produce once they get decent yields out of it. Hopefully, they'll introduce a midrange GT200 offering at the same time.
 

bryanW1995

Lifer
May 22, 2007
11,143
32
91
Originally posted by: The Odorous One
Qualified to make comments on what I use, contrary to a whole slew of people who don't own jack squat. I didn't say CF was the end-all solution, and wow, a few games have probs; is that really a surprise? I take issue with that lame bullet list Chiz made (which applies more to old CF) and the fact that he doesn't even own the CF product. Same for you Keys, what qualifies you?
It's a pretty safe bet that keys has owned a lot more video cards than you have.

I understand that you want to support ati right now, but all you're doing is making them look bad by getting into a pissing match with chizow. Also, you keep baiting mods like an out-of-control Pennsylvania bear hunter. You need to tone it down.
 

bryanW1995

Lifer
May 22, 2007
11,143
32
91
Originally posted by: keysplayr2003
Anyway..... If the current GT280, for example, can o/c to 700/1500 on a 65nm process, I think we can look forward to some nice stock clocks on the 55nm GT200b's. And Nvidia should be very competitive on the price point. I do hope they keep the 512bit bus however.
I think they'd be smarter to switch to GDDR5 and a 256-bit bus. As long as GDDR5 is plentiful, they'll get similar or better performance, and costs will be quite a bit lower.
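[Editor's note: a back-of-the-envelope bandwidth calculation shows the trade-off being proposed. The clocks below are assumptions for illustration — GTX 280-class GDDR3 at roughly 1107 MHz, and early GDDR5 at roughly 900 MHz base clock; actual parts vary.]

```python
# Peak memory bandwidth: bus width in bytes x transfers per second.
def bandwidth_gbps(bus_bits, effective_mtps):
    """Peak bandwidth in GB/s given bus width (bits) and effective MT/s."""
    return bus_bits / 8 * effective_mtps / 1000

# GDDR3 is double data rate: ~1107 MHz -> 2214 MT/s effective.
gddr3 = bandwidth_gbps(512, 2 * 1107)
# GDDR5 transfers 4x per command clock: ~900 MHz -> 3600 MT/s effective.
gddr5 = bandwidth_gbps(256, 4 * 900)

print(f"512-bit GDDR3: {gddr3:.1f} GB/s")  # ~141.7 GB/s
print(f"256-bit GDDR5: {gddr5:.1f} GB/s")  # ~115.2 GB/s
```

Under these assumed clocks the 256-bit GDDR5 board lands in the same ballpark from half the bus width, which is the cost argument: a narrower bus means fewer memory chips, fewer traces, and cheaper PCBs.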
 

evolucion8

Platinum Member
Jun 17, 2005
2,867
3
61
Originally posted by: ArchAngel777
Originally posted by: extra
Really seems like a lot of folks are grasping at straws for any reason they can to bash multi-gpu setups...
In any case, ATi will have the fastest single card via Mult-GPU. nVidia has the fastest single GPU and nVidia has the fastest multi-GPU setup. Those are the facts... Doesn't matter if you don't want to use multi-GPU or not, those are still the facts.
Actually, the HD 4870X2 outperforms GTX 280 SLI more often than not. Which is quite strange considering that a single HD 4870 cannot outperform a GTX 280; it seems the scaling in Crossfire is outstanding (when it works). So nVidia has the fastest single GPU, ATi has the fastest single card via multi-GPU, and both trade blows in multi-GPU setups.

http://www.anandtech.com/video/showdoc.aspx?i=3354&p=4
http://www.hardocp.com/article...wzLCxoZW50aHVzaWFzdA==

There are some other websites, but I'm sleepy as hell. Crysis for some reason doesn't scale well with Crossfire; ATi should fix it soon.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: finbarqs
remember guys, if the 4870X2 is ANYTHING like the 3870x2, then we're in for an awesome treat! First of all, it runs natively with standard ATI drivers, meaning that we do not need to enable "CF" mode to get it the framerates of CF! Second, DUAL MONITOR Support. Nothing from Nvida based on 2 GPU's can do dual head output, when SLI is enabled. ATI's you can.
I actually did mention that in my pro-NV rant as a negative bullet point against SLI. ;) It's been rumored, though, that NV will be fixing this and adding some other features in their Big Bang II driver in September. Personally I think they're going to push multi-GPU + multi-monitor gaming with extended resolutions like Matrox tried with Parhelia/Hydravision.

In the short-term, it looks like we'll see WHQL PhysX drivers on Aug 5 that extend support to the older 8-series G80s and G92s.
 
