
GTX295 exposed.

Cookie Monster


VR-Zone scored the first photo of the upcoming GeForce GTX 295 card slated for launch at CES 2009. The GeForce GTX 295 again uses the sandwich design of the 9800GX2, with two 55nm GT200 GPUs. As you can see, there are two DVI ports and one DisplayPort. As for the power connectors, it uses an 8-pin plus a 6-pin, so that gives you a clue how much power this card needs. The pricing is yet to be disclosed, but card makers are speculating that Nvidia will price it competitively against AMD's 4870X2 card this time.

Possible Specs

Firstly, the GeForce GTX 295 card will feature two 55nm GT200 GPUs. The number of stream processors is 480 (240×2), not 216×2 as we reported. Its memory bus width is 896-bit (448-bit×2), and memory size is 1792MB GDDR3 (896MB×2). It won't adopt GDDR5, and its total power is 289W. The GTX 295 will be launched on Jan 8th, exactly the day CES 2009 starts.

The frequency and pricing of the GeForce GTX 295 are unknown as yet. Judging from the above specifications, it isn't a dual-GPU GTX260 or a dual-GPU GTX280. Anyway, it's obvious that the GeForce GTX 295 is about to regain its performance dominance.
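Those doubled figures are easy to sanity-check. A trivial sketch (all values are the rumored specs quoted above, nothing official):

```python
# Sanity check of the rumored GTX 295 totals: a straight doubling of
# the per-GPU figures quoted above. Rumored specs only, nothing official.
per_gpu = {
    "stream_processors": 240,  # full GT200 config (240 SPs, not the 216 of a stock GTX 260)
    "bus_width_bits": 448,
    "memory_mb": 896,
}

# Two GPUs on one card -> each figure simply doubles.
totals = {key: value * 2 for key, value in per_gpu.items()}

print(totals)
# {'stream_processors': 480, 'bus_width_bits': 896, 'memory_mb': 1792}
```

Which lines up with the 480 SPs, 896-bit bus, and 1792MB in the report.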
 
Sounds like a beast, should be an awesome card for those who got $$ 🙂
I wonder if they will announce a 55nm refresh to the current cards.. maybe a cheaper/better 280 equivalent?
 
Anyway, it's obvious that GeForce GTX295 is about to regain its performance dominance.

Sounds like whoever wrote that piece didn't put much thought into this sentence.
 
I'm not so interested in the GTX295, but judging by these figures, the new 55nm GTX260 should save a nice chunk of power.

Guess you were right cookie monster. :thumbsup:
 
Originally posted by: SSChevy2001
I'm not so interested in the GTX295, but judging by these figures, the new 55nm GTX260 should save a nice chunk of power.

Guess you were right cookie monster. :thumbsup:

Seconded. I'm not in the market for a new video card, but generally anything that has to do with SLI or XFire is more trouble than it's worth imo.
 
Well it looks like nV will have the crown back.

Not that it is that big of a deal. I don't see too many 4870X2s around here, but I bet that is because of the price. I hope the 260GX2 is not too expensive.

Personally I am waiting for the new chips, not shrinks.
 
Originally posted by: Ocguy31
Well it looks like nV will have the crown back.

Not that it is that big of a deal. I don't see too many 4870X2s around here, but I bet that is because of the price. I hope the 260GX2 is not too expensive.

Personally I am waiting for the new chips, not shrinks.

Yeah I am sick of all of these shrinks and want a new chip!
 
Glad to see this is finally confirmed. It'll definitely be a monster card, although I'll probably skip it.

Interesting in terms of NVIDIA's strategy though. It seems that NV's plan for the top end (at least for now) is to go big and monolithic, shrink, and then go multi. They did this with G70 -> G71, G80 -> G92, and now with GT200 -> GT200b. It will be interesting to see how ATI and NV's different strategies work out for them.
 
It's not Nvidia's strategy; they were "forced" by ATI to do it, for the last two generations anyway.
 
This is a bit offtopic guys, talk strategy in a different topic please.

As for the card itself, I think the only thing I could fault right now is its lack of additional ports; more DVI or an HDMI port would have been nice. I wonder how it pans out with 240 SPs and 'only' a 448-bit bus and 896MB of VRAM per GPU.
 
Originally posted by: james1701
I wonder if it will have micro stutter like the last GX2 card.

All dual(+) GPU solutions have microstutter when the framerates get low, including the X2 products.

The low framerates are the key: the higher your minimums, the less noticeable microstutter will be.

If the rumors about this card are true, it should have less microstutter than any single slot product on the market.
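To put toy numbers on that point (made-up frame times, not measurements from any card): a dual-GPU card alternating frames between chips can deliver them in uneven short-gap/long-gap pairs, and perceived smoothness tracks the longest gap, not the average.

```python
# Illustrative only: why uneven frame pacing ("microstutter") hurts more
# at low framerates. Frame times below are invented, not benchmark data.
def average_fps(frame_times_ms):
    # What a typical "average fps" benchmark number reports.
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def effective_fps(frame_times_ms):
    # Perceived smoothness tracks the worst gap between frames.
    return 1000.0 / max(frame_times_ms)

low  = [10.0, 40.0] * 4   # uneven pacing at a "40 fps" average
high = [5.0, 11.7] * 4    # same unevenness at a "~120 fps" average

print(round(average_fps(low)))     # 40
print(round(effective_fps(low)))   # 25  <- feels closer to 25 fps
print(round(average_fps(high)))    # 120
print(round(effective_fps(high)))  # 85  <- the gaps barely register
```

Same relative unevenness in both cases, but only the low-framerate one drops into territory you'd actually feel, which is why high minimums hide it.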

 
Looks like it could be a beast. I like the X2 cards for people who don't have dual PCIe slots. However, I won't be in the mood for an upgrade for another 12-16 months.
 
Anyone know the bandwidth of the interconnect bridge? I assume it is an SLI bridge. I don't like this approach, but if it works... I think ATI's architecture is more elegant, efficient, and cheap.
 
Anyone know why Nvidia goes the 'sandwich' route while AMD goes with a single-board layout? I would think two PCBs would cost a bit more to produce, but I don't know if there is an advantage other than that the sandwich could be a bit shorter in length.
 
Originally posted by: nRollo
Originally posted by: james1701
I wonder if it will have micro stutter like the last GX2 card.

All dual(+) GPU solutions have microstutter when the framerates get low, including the X2 products.

The low framerates are the key: the higher your minimums, the less noticeable microstutter will be.

If the rumors about this card are true, it should have less microstutter than any single slot product on the market.
Yeah, I've noticed this with every X-Fire and SLI setup I've had so far.
It's a shame that most benchmarks only show average framerates and rarely show minimum framerates.
The dual-card setups look great in benchmarks, but in all honesty, I feel I'm personally better off with the fastest single GPU I can get from now on, because I hate those stutters.

nRollo, can you elaborate more on exactly what causes the dual cards to stutter at low framerates when single-GPU cards don't?
And will it ever be cured, or is it always going to be a dual-GPU side effect?

And back on topic, I wonder what the card length will be.
I know a GTX 280 is pushing it in most PC cases; I can't see anything longer than a GTX 280 fitting in many people's systems.

 
So I guess I just wasted $520 on my future GeForce GTX 260 Core 216 Superclocked (what a mouthful) SLI setup?

At first, I thought this was just two 260s sandwiched together, but with 480 shaders, even with the lowered clocks to compensate for heat, it looks like this might be the card to get for peak performance.

Although, if its price is more than $520, I suppose my purchase wasn't a total waste...
 
Originally posted by: aznxk3vi17
So I guess I just wasted $520 on my future GeForce GTX 260 Core 216 Superclocked (what a mouthful) SLI setup?

At first, I thought this was just two 260s sandwiched together, but with 480 shaders, even with the lowered clocks to compensate for heat, it looks like this might be the card to get for peak performance.

Although, if its price is more than $520, I suppose my purchase wasn't a total waste...
LOL
I know the feeling.
I'm trying to keep from going dual GPU again (I'm on a single GTX 280 now).
And I'm still looking at the preview specs and pics of this card thinking, "man, I'd love to own one of those!" This card's specs look great!

 
Originally posted by: aclim
You should be fine with 260s in SLI. You could always sell them down the road...

Ahhh... you're missing the point, I think.
Just when you think, "Yeah, I finally got the ultimate setup, my system screams and it was worth every penny!"
you're stuck saying, "Well, I spent all that money and this thing will put my setup to shame, I guess maybe I didn't make the right choice."

Anyway, it's hard to understand unless you have the PC hardware addiction most on these forums have.

Some people are always searching for that "ultimate setup".
Which is what makes this hobby fun and never boring. :thumbsup:
 
Hahaha, indeed. Though he does make a good point: when my parts come in, I'll have this 8800GTX to sell already. I could use my Step-up on one of the 260s, and sell the other.

...wait, this card won't be released in 90 days, do you think? January? My plans shall be successful after all!
 
Originally posted by: MTDEW
Originally posted by: aclim
You should be fine with 260s in SLI. You could always sell them down the road...

Ahhh... you're missing the point, I think.
Just when you think, "Yeah, I finally got the ultimate setup, my system screams and it was worth every penny!"
you're stuck saying, "Well, I spent all that money and this thing will put my setup to shame, I guess maybe I didn't make the right choice."

Anyway, it's hard to understand unless you have the PC hardware addiction most on these forums have.

Some people are always searching for that "ultimate setup".
Which is what makes this hobby fun and never boring. :thumbsup:

You say you've had your desktop for over a week? Throw that junk away, man, it's an antique!
 
Originally posted by: MTDEW
Originally posted by: aclim
You should be fine with 260s in SLI. You could always sell them down the road...

Ahhh... you're missing the point, I think.
Just when you think, "Yeah, I finally got the ultimate setup, my system screams and it was worth every penny!"
you're stuck saying, "Well, I spent all that money and this thing will put my setup to shame, I guess maybe I didn't make the right choice."

Anyway, it's hard to understand unless you have the PC hardware addiction most on these forums have.

Some people are always searching for that "ultimate setup".
Which is what makes this hobby fun and never boring. :thumbsup:

But there is always something better down the road. If one were able to build the fastest machine on the planet using off-the-shelf hardware, I bet the longest it would retain that title is about a month.
 
For everybody joshing at MTDEW, he is actually right, at least in my case.

I ordered parts for a new i7 box just two days ago. Obviously, I would have stuck with my 8800GTX (or even a single 260) while keeping the rest of the upgrades had I known this card was due so soon, with 480 shaders as opposed to 432.
 