
Dual GT200 samples in December

JPB

Dual GT200 samples in December

Available in early January

We've learned that the dual GT200 card, something that should finally help Nvidia dethrone the Radeon HD 4870 X2, is going to sample by the middle of this month, while it will be available in January.

We are not aware of the real performance numbers, but we've learned that it should end up faster than R700, the Radeon HD 4870 X2. This is the whole point of this card: to win the performance crown back. The card is based on 55nm GT200 chips, and despite the fact that it might be the bulkiest and most power-hungry card ever made, it also has a chance to be the fastest thing ever made in very late 2008.

ATI can only be happy, as Nvidia decided to follow its lead on dual-GPU cards, and knowing Nvidia, you should expect a dual-PCB card. The whole point is to win against ATI, and nothing else matters much.
 
Originally posted by: JPB
Dual GT200 samples in December

Available in early January

We've learned that the dual GT200 card, something that should finally help Nvidia dethrone the Radeon HD 4870 X2, is going to sample by the middle of this month, while it will be available in January.

We are not aware of the real performance numbers, but we've learned that it should end up faster than R700, the Radeon HD 4870 X2. This is the whole point of this card: to win the performance crown back. The card is based on 55nm GT200 chips, and despite the fact that it might be the bulkiest and most power-hungry card ever made, it also has a chance to be the fastest thing ever made in very late 2008.

ATI can only be happy, as Nvidia decided to follow its lead on dual-GPU cards, and knowing Nvidia, you should expect a dual-PCB card. The whole point is to win against ATI, and nothing else matters much.

Hallelujah! The Saviour is coming!


LOL, for those who expect it.

Seriously, it will be good for high-end buyers to have an alternative to the 4870X2.

If performance is higher, it will give them the option to spend more* and get more (and may well drive down the 4870X2's price some).

It will also let owners of non-SLI motherboards decide whether they prefer CrossFire or SLI drivers.

For consumers, I'd say this is good news.



*This is not a prediction of price, just a guess based on the fact that the highest-performing card always costs the most.
 
Good news for me, since I haven't used my Step-Up yet. And I still have about two months left :shocked:

EDIT: I just hope FUD is right this time, and that it will be here next month.
 
Wait, aren't you supposed to mudstomp it, nRollo? I mean, it does have a dozen inherent flaws just like the HD4870X2, making it a bad buy, does it not?

Oh wait, you're in the nvidia focus group 😀

But really, I'm curious what they will come up with. Dual GTX 260s, or GTX 280s? 55nm obviously, but what clock speeds, and how much RAM? Dual PCB though, so power consumption will be off the charts!
 
If it's anything like the 9800GX2 and 7950GX2, it's better to wait for the next-gen single-GPU card or just go native SLI. These sandwich cards, meant only to recapture performance crowns, are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2's, since you can always create your own profiles, but if you look around, the 9800GX2 is also very prone to driver issues in newly launched games. Just read over some of the threads on EVGA's forums from around the time the GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).
 
I wonder how much RAM it will have in total? 2x896MB? That will look strange in the description 😛

GeForce GTX260GX2 Core 432 1792MB? 😛

Or will they use the 192 version?
 
Um, this is from FUD; aren't we supposed to totally discredit any and all of this information and assume it is all blasphemy? Doh, I just realized it is about an Nvidia product in a positive manner... must be true and worthy of getting our panties in a bunch. :cookie: Definitely interested if this is legit.
 
Originally posted by: lavaheadache
Um, this is from FUD; aren't we supposed to totally discredit any and all of this information and assume it is all blasphemy? Doh, I just realized it is about an Nvidia product in a positive manner... must be true and worthy of getting our panties in a bunch. :cookie: Definitely interested if this is legit.
Don't confuse FUD with the Inq. FUD isn't as blatantly biased and anti-NV as the Inq, and still gets invited to all the conferences and media events. A dual GT200 on 55nm has been rumored since about 4 weeks after the GTX 280 launched, so this really shouldn't come as a surprise.
 
Originally posted by: chizow
If it's anything like the 9800GX2 and 7950GX2, it's better to wait for the next-gen single-GPU card or just go native SLI. These sandwich cards, meant only to recapture performance crowns, are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2's, since you can always create your own profiles, but if you look around, the 9800GX2 is also very prone to driver issues in newly launched games. Just read over some of the threads on EVGA's forums from around the time the GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

Driver support is excellent for the X2, imo; CF-X also appears to be a priority for AMD... perhaps you are right about Nvidia's drivers and sandwich cards
 
Originally posted by: apoppin
Driver support is excellent for the X2, imo; CF-X also appears to be a priority for AMD... perhaps you are right about Nvidia's drivers and sandwich cards
You consider having to frame-cap a new title to 30FPS in order to play it as excellent? At that point you need to seriously ask yourself why you spent $500 on a video card to begin with, when $90 cards can achieve the same thing.
 
Originally posted by: chizow
Originally posted by: apoppin
Driver support is excellent for the X2, imo; CF-X also appears to be a priority for AMD... perhaps you are right about Nvidia's drivers and sandwich cards
You consider having to frame cap a new title to 30FPS in order to play it as excellent? At that point you need to seriously ask yourself why you spent $500 on a video card to begin with when $90 cards can achieve the same thing.

I had no issues with FC2 and the X2
--just with the lesser ($300) AMD cards; for them I had to use a workaround

FC2 runs better on my $469 X2 than with a $380 GTX280

 
Originally posted by: apoppin

I had no issues with FC2 and the X2
--just with the lesser ($300) AMD cards; for them I had to use a workaround

FC2 runs better on my $469 X2 than with a $380 GTX280
That makes sense then; I probably caught a comment that didn't make the distinction, or confused it with Derek's comment that all ATI GPUs were affected except for the 1GB 4870.
 
Originally posted by: chizow
Originally posted by: apoppin
Driver support is excellent for the X2, imo; CF-X also appears to be a priority for AMD... perhaps you are right about Nvidia's drivers and sandwich cards
You consider having to frame cap a new title to 30FPS in order to play it as excellent? At that point you need to seriously ask yourself why you spent $500 on a video card to begin with when $90 cards can achieve the same thing.

It is rather ridiculous to have to frame-cap Far Cry 2 to get decent performance with the newer Radeons (even my HD3850 512 was affected by it)... the hitching is unbearable. I would like to add, though, that you don't actually have to cap at 30 fps. The method I've discovered is to find your average fps in the benchmark utility at your desired settings, drop a few extra fps to make your "average" a little lower, and cap at that amount; this provides very smooth gameplay with minimal hitching. 30 fps was just too low for me. Since my card was capable of an average of around 45 fps with all options at their highest with 2xAA @ 1920x1200, setting my frame cap at 42 fps gave me very smooth gameplay. I think the hitching is due to the way the card's drivers handle fluctuations in FPS, so if you limit to where your card seems to average, you're good.

*** Sorry for being off topic, but I just wanted to clarify my findings about that issue.
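The capping method described above boils down to simple arithmetic, which can be sketched as a small helper (the function name and defaults are hypothetical, just illustrating the rule from the post):

```python
def choose_frame_cap(benchmark_avg_fps: float, headroom: int = 3, floor: int = 30) -> int:
    """Pick a frame cap a few FPS below the benchmarked average.

    The idea from the post: capping slightly under the card's real average
    keeps the driver from chasing FPS fluctuations, which is what causes
    the hitching. The floor keeps the cap from dropping below playable.
    """
    return max(floor, int(benchmark_avg_fps) - headroom)

# A card averaging ~45 FPS in the benchmark gets capped at 42,
# matching the settings described in the post.
print(choose_frame_cap(45))  # 42
```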
 
Originally posted by: chizow
Originally posted by: apoppin

I had no issues with FC2 and the X2
--just with the lesser ($300) AMD cards; for them I had to use a workaround

FC2 runs better on my $469 X2 than with a $380 GTX280
That makes sense then; I probably caught a comment that didn't make the distinction, or confused it with Derek's comment that all ATI GPUs were affected except for the 1GB 4870.

I personally recommend that anyone with a monitor at 1680x1050, 1600x1200, or higher spend the extra and get a 1GB video card...
 
Originally posted by: apoppin
Driver support is excellent for the X2, imo; CF-X also appears to be a priority for AMD... perhaps you are right about Nvidia's drivers and sandwich cards

Originally posted by: apoppin
I had no issues with FC2 and the X2
--just with the lesser ($300) AMD cards; for them I had to use a workaround
FC2 runs better on my $469 X2 than with a $380 GTX280

I could have sworn this was the Dual GT200 thread......


As far as FUD goes, this is like any pre-NDA info: might be right, might be wrong.

Only when you see a link to an AC/DC video in my sig will you know something's burnin' in the kitchen for sure.....

😉
 
Originally posted by: AdamK47
Originally posted by: apoppin
Dual GTX260+

it is impossible to do a GTX 280 on 55nm

Why is it impossible? Did I miss something?

It's just impossible, just like lowering the cost of the GTX260/280 was, due to yields and the larger die size!

Pay attention!

LOL, apoppin is speculating that the thermals and power consumption of such a beast would prevent it; he has a 50% chance of being right. The forum "engineers" have speculated it could not be done for a while.

 
Originally posted by: AdamK47
Originally posted by: apoppin
Dual GTX260+

it is impossible to do a GTX 280 on 55nm

Why is it impossible? Did I miss something?

He may be referring to power draw or thermals. The difference between a GTX 260 and a GTX 280 under load is pretty extreme, something like 40W, requiring an 8-pin PCIe connector instead of a second 6-pin. There's certainly going to be some decrease in both heat and power consumption going to 55nm, but powering the full 10/10 clusters might just push it over the edge in terms of power consumption. NV has also historically clocked its GX2 parts lower, probably to keep thermals and power draw under control.
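Some back-of-the-envelope arithmetic makes the point. The connector limits below come from the PCIe spec; the board TDPs are the published 65nm figures, so treat them as rough upper bounds (55nm revisions would draw somewhat less):

```python
# Power a graphics card can legally draw under the PCIe spec (watts)
PCIE_SLOT_W = 75    # from the x16 slot itself
SIX_PIN_W   = 75    # per 6-pin auxiliary connector
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector

# The usual dual-GPU layout: slot + one 6-pin + one 8-pin
board_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W

# Published 65nm board TDPs (watts)
GTX260_TDP = 182
GTX280_TDP = 236

# Two full GPUs at stock 65nm power exceed the budget either way,
# which is why lower clocks and/or the 55nm shrink are needed.
print(2 * GTX260_TDP, board_budget)  # 364 300
print(2 * GTX280_TDP, board_budget)  # 472 300
```

This is why a dual card built from two full GTX 280s looks implausible at stock clocks: even with the savings from 55nm, the design has to shed well over a hundred watts to fit the 300W envelope.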
 
Originally posted by: chizow
If its anything like the 9800GX2 and 7950GX2, its better to wait for the next-gen single-GPU or just go native SLI. These sandwich cards meant only to recapture performance crowns are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2, since you can always create your own profiles, but if you look around the 9800GX2 is also very prone to driver issues for newly launched games. Just read over some of the threads over on EVGA around the time GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

I have a 9800GX2, and one of my buddies is still using my 7950GX2 to this day. I don't recall any issues personally, beyond the initial 7950GX2 quad drivers.
 
Originally posted by: nRollo
Originally posted by: chizow
If its anything like the 9800GX2 and 7950GX2, its better to wait for the next-gen single-GPU or just go native SLI. These sandwich cards meant only to recapture performance crowns are the first to be forgotten when it comes to driver updates. Driver support may never be as bad as the 4870X2, since you can always create your own profiles, but if you look around the 9800GX2 is also very prone to driver issues for newly launched games. Just read over some of the threads over on EVGA around the time GTX 280 came out. 9800GX2 owners couldn't Step-Up fast enough (maybe Step-Away is more appropriate).

I have a 9800GX2, and one of my buddies is still using my 7950GX2 to this day. I don't recall any issues personally, beyond the initial 7950GX2 quad drivers.

I have owned both in the past, and they were both great cards. But yeah, first few sets of drivers for the 7950 were horrid.
 
Originally posted by: nRollo
I have a 9800GX2, and one of my buddies is still using my 7950GX2 to this day. I don't recall any issues personally, beyond the initial 7950GX2 quad drivers.

Like chizow said, there were issues with some newer games... I can't remember which site it was, but when they did some scaling tests, the 9800GX2 was scoring lower than a 3870 due to improper scaling.
 
"its production cycle happened during the 33rd week of this year (August 11th to 17th), only 5 weeks after G200-103-A2 which was manufactured in the 28th week. Therefore, obviously NVIDIA had prepared for a long time."

I figured they'd had it for a while. It's just hard getting two monsters with all clusters active onto a single PCB, or getting clock speeds and cooling good enough for a sandwich.
 
Originally posted by: jaredpace
"its production cycle happened during the 33rd week of this year (August 11th to 17th), only 5 weeks after G200-103-A2 which was manufactured in the 28th week. Therefore, obviously NVIDIA had prepared for a long time."

I figured they'd had it for a while. It's just hard getting two monsters with all clusters active onto a single PCB, or getting clock speeds and cooling good enough for a sandwich.

Why didn't they launch the regular ones then? Just getting rid of old stock?
 