
9800GX2 Pics!

Originally posted by: nitromullet
Originally posted by: Cookie Monster
More 9800GX2 pics

- Uses two G92-400 GPUs (the ones found on the 8800 GTS)
- Release date of 14 Feb 08
- MSRP of $449
- Will implement 4-way AFR for quad SLI (see the sketch just below)
- Performance roughly 30% to 50% faster than the 8800 Ultra
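For anyone unfamiliar with the term, alternate frame rendering (AFR) hands each GPU a complete frame to render on its own, dealt out round-robin. A minimal sketch in Python of how 4-way AFR distributes frames (purely illustrative; the names are invented, not NVIDIA's driver API):

```python
# Minimal sketch of 4-way alternate frame rendering (AFR).
# Frames are dealt out round-robin; each GPU renders a whole frame by itself.
# Purely illustrative -- this is not NVIDIA's actual scheduling code.

NUM_GPUS = 4  # quad SLI: two GX2 boards with two GPUs each

def gpu_for_frame(frame_index: int) -> int:
    """Return which GPU renders a given frame under round-robin AFR."""
    return frame_index % NUM_GPUS

for frame in range(8):
    print(f"frame {frame} -> GPU {gpu_for_frame(frame)}")
```

The flip side, which comes up later in the thread, is that each GPU must hold its own full copy of the scene data in local memory.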

At $449 it really is not an unreasonable option, especially for someone like me with a decent P35 board as opposed to getting a second GTS 512 and a 780i board. It really depends on how long the gap is between this card's introduction and the next architectural release, and of course driver support.

Yeah, the news of the estimated price (if true) has changed my tone about this product a little bit... Well, we'll have to wait and see!

 
I still think it's worthless at $449 due to the SLI dependence, but it's still good news in another sense, because it is somewhat below what Nvidia has traditionally charged for their top card at release. That might suggest that they want to leave room for something at their usual $600 high-end price point.
 
Hmmm, $449 for a dual-GPU card might settle in at $399 with promos. Rumours had the R680 in the low $300s. This all begins to make sense. I guess all those 8800 GTSs (G92) are overpriced. Come to think of it, a lot of prices will find new sweet spots.
 
Some people are forgetting that MSRP matters little outside of finding the card in a B&M store. This latest round of cards from ATi/NV still suffers from higher-than-MSRP prices, if you can even find them. If you don't get one in the first few days, there is a chance it will be sold out, then priced higher when they trickle back into stock. As has happened lately, sadly.
 
If it comes out on Feb 14th at that price, I'm probably going to step up from my 8800GT. If the 9800GTX comes out at the same time and is a better solution (almost as fast but without the SLI dependence), then I'd buy that instead. My step-up period ends around Feb 16, though.
 
Will this 9800GX2 'monster' really have 1GB of usable VRAM, or is it 512MB per card but limited to 512MB of overall usable VRAM due to the SLI-based configuration?
 
Originally posted by: Ackmed
Some people are forgetting that MSRP matters little outside of finding the card in a B&M store. This latest round of cards from ATi/NV still suffers from higher-than-MSRP prices, if you can even find them. If you don't get one in the first few days, there is a chance it will be sold out, then priced higher when they trickle back into stock. As has happened lately, sadly.

I don't think anyone is forgetting that. MSRP not always being the street price works both ways, and usually it works to our benefit. The 8800GT 512 has been an exception rather than the norm in this regard because of the very high demand. I already got an 8800GTS 512MB for $339 ($319 AMIR), which is below MSRP.

I imagine that the 9800GX2 wouldn't be much over MSRP either, considering that it will be a rather niche-market product without the huge demand that a card like the 8800GT has. If it's anything like the 7950GX2, it will be relatively well stocked (but only at a few select places) and will hover pretty close to MSRP. The stores stocking the 9800GX2 will know that these cards will only be hot for so long, and after that any they don't sell will just be really expensive inventory.
 
Originally posted by: Cheex
Will this 9800GX2 'monster' really have 1GB of usable VRAM, or is it 512MB per card but limited to 512MB of overall usable VRAM due to the SLI-based configuration?

If only 512MB were usable, then they would only include 512MB; why spend extra money for no gain? The reason 1GB is present in a traditional SLI system but only 512MB is usable is that both cards have their own memory and are not designed exclusively for SLI.

I don't know how nVidia's GX2 cards work or whether this is different from the 7950GX2. I would think that each chip has its own memory and link to memory, and it would be like two processing cores having separate L2 caches.
 
Originally posted by: Extelleron
Originally posted by: Cheex
Will this 9800GX2 'monster' really have 1GB of usable VRAM, or is it 512MB per card but limited to 512MB of overall usable VRAM due to the SLI-based configuration?

If only 512MB were usable, then they would only include 512MB; why spend extra money for no gain? The reason 1GB is present in a traditional SLI system but only 512MB is usable is that both cards have their own memory and are not designed exclusively for SLI.

I don't know how nVidia's GX2 cards work or whether this is different from the 7950GX2. I would think that each chip has its own memory and link to memory, and it would be like two processing cores having separate L2 caches.

So that is to say that, essentially, you'll have 1GB of VRAM to run the higher resolutions with higher AA and AF.


Examples:
1920x1200 with 8xAA and 16xAF
2560x1600 with 4xAA and 16xAF
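For a rough sense of what the render targets alone cost at those settings, here is a back-of-the-envelope estimate (the buffer layout is simplified and the byte counts are assumptions; real drivers add padding, compression state, and extra surfaces, so treat these as loose lower bounds):

```python
# Back-of-the-envelope framebuffer cost: multisampled color + depth
# buffers, plus resolved front/back buffers. Simplified on purpose;
# driver overhead and extra render targets are ignored.

def framebuffer_mb(width, height, aa_samples,
                   bytes_color=4, bytes_depth=4):
    """Rough render-target footprint in MB for a given display mode."""
    msaa = width * height * aa_samples * (bytes_color + bytes_depth)
    resolved = 2 * width * height * bytes_color  # front + back buffer
    return (msaa + resolved) / 2**20

print(framebuffer_mb(1920, 1200, 8))  # ~158 MB
print(framebuffer_mb(2560, 1600, 4))  # ~156 MB
```

On those assumptions, either mode claims roughly a third of a 512MB pool before a single texture is loaded.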
 
Originally posted by: Cheex
Originally posted by: Extelleron
Originally posted by: Cheex
Will this 9800GX2 'monster' really have 1GB of usable VRAM, or is it 512MB per card but limited to 512MB of overall usable VRAM due to the SLI-based configuration?

If only 512MB were usable, then they would only include 512MB; why spend extra money for no gain? The reason 1GB is present in a traditional SLI system but only 512MB is usable is that both cards have their own memory and are not designed exclusively for SLI.

I don't know how nVidia's GX2 cards work or whether this is different from the 7950GX2. I would think that each chip has its own memory and link to memory, and it would be like two processing cores having separate L2 caches.

So that is to say that, essentially, you'll have 1GB of VRAM to run the higher resolutions with higher AA and AF.


Examples:
1920x1200 with 8xAA and 16xAF
2560x1600 with 4xAA and 16xAF

Not really; games these days occupy a lot of VRAM even at lower resolutions. World in Conflict, for instance, occupies 508-524 MB at 1280x1024 with 4xAA/16xAF on Ultra High.
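Running the same rough buffer math as above at this resolution shows why that number is striking (same simplified assumptions as before):

```python
# Rough buffer math at 1280x1024 with 4xAA: the render targets are only a
# small slice of the reported usage, so the 500+ MB must be mostly
# textures and geometry. Assumes 4-byte color and depth, as before.
pixels = 1280 * 1024
msaa = pixels * 4 * (4 + 4)       # multisampled color + depth
resolved = 2 * pixels * 4         # front + back buffer
print((msaa + resolved) / 2**20)  # ~50 MB of the ~508-524 MB observed
```

In other words, on these assumptions the vast majority of that 508-524 MB is assets rather than render targets, which is why resolution alone doesn't tell the whole story.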

 
It all depends on the game, but something like Crysis running at 2560x1600 with 4xAA would easily fill the GB of the GX2. It's really time for a step up in both VRAM size and bandwidth.
 
So that is to say that, essentially, you'll have 1GB of VRAM to run the higher resolutions with higher AA and AF.
No, you'll have 512 MB because it's 2 x 512 MB, not a global pool of 1 GB that any card can use as it pleases. Each card is restricted to its own 512 MB pool.

SLI/Crossfire memory is never combined, because each card needs its own copy of the data in order to render a frame.
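To make the mirroring point concrete, here is a toy model of why the effective pool under SLI is the per-GPU amount rather than the sum (the shared-pool option models a hypothetical unified design that SLI/Crossfire of this era did not have):

```python
# Toy model: under AFR SLI every GPU keeps a full copy of textures and
# geometry, so the usable pool is the per-GPU VRAM, not the sum.

def effective_vram_mb(per_gpu_mb: int, num_gpus: int,
                      shared_pool: bool = False) -> int:
    """shared_pool=True is a hypothetical unified-memory design,
    included only for contrast; real SLI/Crossfire mirrors the data."""
    return per_gpu_mb * num_gpus if shared_pool else per_gpu_mb

print(effective_vram_mb(512, 2))                    # 512 -> the GX2 case
print(effective_vram_mb(512, 2, shared_pool=True))  # 1024 -> hypothetical
```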
 
Originally posted by: BFG10K
So that is to say that, essentially, you'll have 1GB of VRAM to run the higher resolutions with higher AA and AF.
No, you'll have 512 MB because it's 2 x 512 MB, not a global pool of 1 GB that any card can use as it pleases. Each card is restricted to its own 512 MB pool.

SLI/Crossfire memory is never combined, because each card needs its own copy of the data in order to render a frame.

Well, that will really make the GX2 suck then.

1GB marketed but 512MB essentially.

ew!
 
Originally posted by: Cheex
Originally posted by: BFG10K
So that is to say that, essentially, you'll have 1GB of VRAM to run the higher resolutions with higher AA and AF.
No, you'll have 512 MB because it's 2 x 512 MB, not a global pool of 1 GB that any card can use as it pleases. Each card is restricted to its own 512 MB pool.

SLI/Crossfire memory is never combined, because each card needs its own copy of the data in order to render a frame.

Well, that will really make the GX2 suck then.

1GB marketed but 512MB essentially.

ew!


Well, it is 512MB per GPU. Saying the GX2 is a 1GB card is like saying the X2 6400+ has 2MB of L2 cache when it really has 1MB x 2. There's an advantage to having a shared pool of memory, but it is still way better to have 2 x 512MB than 1 x 512MB.
 
How much does Crysis use @ 1920x1200?

1) with no AA
2) with 2xAA
3) with 4xAA
I haven't tested it myself, but I do know that at high resolution + AA levels Crysis will flatten 512 MB video cards.

Normally G92-based 8800 cards are faster than the GTS 640 MB, but in that situation the legacy card scores victories because it has 640 MB of VRAM.

We will have to wait and see if the GX2 is indeed a 1GB card.
I don't think there's much doubt, as going with less than 512 MB per board would cripple it, but I don't see them adding more than 512 MB, for cost reasons.
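The 640 MB card winning once memory runs out has a simple mechanical explanation: whatever doesn't fit in local VRAM has to be fetched over the PCIe bus, which is an order of magnitude slower than on-card memory. A toy model (the bandwidth figures are illustrative round numbers, not measurements):

```python
# Toy model of the VRAM cliff: data that spills out of local VRAM must
# come over PCIe, which is far slower than on-board memory.
# Bandwidth numbers are illustrative round figures, not measurements.

VRAM_BW_GBPS = 60.0  # on-card bandwidth, roughly G92-class
PCIE_BW_GBPS = 4.0   # practical PCIe x16 throughput, roughly

def frame_fetch_ms(working_set_mb: float, vram_mb: float) -> float:
    """Time to pull one frame's working set, overflow going over PCIe."""
    local = min(working_set_mb, vram_mb)
    spill = max(0.0, working_set_mb - vram_mb)
    return ((local / 1024) / VRAM_BW_GBPS +
            (spill / 1024) / PCIE_BW_GBPS) * 1000

print(frame_fetch_ms(600, 640))  # fits entirely     -> ~10 ms
print(frame_fetch_ms(600, 512))  # 88 MB spills over -> ~30 ms
```

Even a small spill triples the fetch time in this model, which is the shape of the cliff the 512 MB cards fall off.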
 
Originally posted by: Extelleron

Well, it is 512MB per GPU. Saying the GX2 is a 1GB card is like saying the X2 6400+ has 2MB of L2 cache when it really has 1MB x 2. There's an advantage to having a shared pool of memory, but it is still way better to have 2 x 512MB than 1 x 512MB.

That's true, but...

How do they expect a 'high-end' card like that to run with so little memory?

I'm confused 😕.
 
Developers these days take DX10 for granted and add very high-quality textures to exploit the performance the API offers; this requires constant refreshing of GPU data, which drags down overall performance.

Alan Wake streams data constantly from the drive instead of loading everything into memory like Crysis does. But anyway, the trick is to come up with better algorithms, because the DX10 and WDDM architecture is different and requires a new level of optimization.
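As a sketch of the difference between the two loading strategies described here (file paths and the chunk size are invented for illustration; this is the general pattern, not Remedy's actual engine code):

```python
# Two asset-loading strategies, sketched. Paths and sizes are illustrative.

CHUNK_MB = 16  # assumed streaming granularity

def load_everything(path: str) -> bytes:
    """Load-it-all-up-front: peak memory footprint equals file size."""
    with open(path, "rb") as f:
        return f.read()

def stream_chunks(path: str):
    """Stream on the fly: yield fixed-size chunks so only the data
    currently needed has to be resident at any moment."""
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_MB * 2**20):
            yield chunk
```

The streaming version trades memory footprint for a dependency on disk latency, which is why the hard drive shows up in benchmarks like PCMark Vantage.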
 
How do you know that about Alan Wake?
Did I not read that piece of information somewhere?

Link please...??
 
Originally posted by: Cheex
How do you know that about Alan Wake?
Did I not read that piece of information somewhere?

Link please...??

I read in some interview that it streams data 'on the fly'. Also, in PCMark Vantage there is an Alan Wake data-streaming test that is used to test hard drive performance.

Sorry, I don't remember the link.
 
Originally posted by: Cheex
Originally posted by: Extelleron

Well, it is 512MB per GPU. Saying the GX2 is a 1GB card is like saying the X2 6400+ has 2MB of L2 cache when it really has 1MB x 2. There's an advantage to having a shared pool of memory, but it is still way better to have 2 x 512MB than 1 x 512MB.

That's true, but...

How do they expect a 'high-end' card like that to run with so little memory?

I'm confused 😕.

We're just now on the verge of moving from ~512MB on a card to ~1GB on a card (not precisely 512/1024, because of nVidia's odd bus widths), and right now 1GB is still not mainstream.

This happens every few years in graphics development. 2005 was the year we saw the first real 512MB cards, but they didn't have much use then. 2006, and especially 2007, is when 512MB became necessary and 256MB much too little. We saw the first 1GB cards in 2007. Don't expect 1GB to become common (even on high-end cards) until 2008-2009.

There's just not much of an advantage to more than 512MB of memory with current games at normal resolutions/settings. Crysis is the big one: it sees a huge increase in performance at 2560x1600 moving from 512MB to 1GB, but how many people actually play at that resolution? And the game is unplayable there even on a 1GB card, even though it is 6x faster than on a 512MB one (FiringSquad 8800GT 1GB review).

nVidia usually isn't the first to move to huge memory sizes, either. nVidia was still on 256MB with the 7800GTX while ATI went to 512MB with the X1800XT.
 
Originally posted by: Aberforth
Originally posted by: Cheex
How do you know that about Alan Wake?
Did I not read that piece of information somewhere?

Link please...??

I read in some interview that it streams data 'on the fly'. Also, in PCMark Vantage there is an Alan Wake data-streaming test that is used to test hard drive performance.

Sorry, I don't remember the link.


Oh.

I do suppose that is a better means of utilizing your hardware.
Then again, it has to be 'smart' enough to pre-load everything it needs, or else you'll have stuttering.
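A minimal sketch of that 'smart' pre-loading idea, assuming a hypothetical streamer that tracks which chunks of the world are resident and requests the ones just ahead of the player (the lookahead distance and all names are invented):

```python
# Hypothetical prefetch policy: keep the chunks just ahead of the player
# resident; any request that isn't finished in time shows up as stutter.

LOOKAHEAD = 2  # assumed number of chunks to keep loaded ahead of the player

def chunks_to_prefetch(player_chunk: int, resident: set) -> list:
    """Return chunk indices that should start loading right now."""
    wanted = range(player_chunk, player_chunk + LOOKAHEAD + 1)
    return [c for c in wanted if c not in resident]

print(chunks_to_prefetch(10, {9, 10, 11}))  # -> [12]: load it or stutter
```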
 
Originally posted by: Cheex
Originally posted by: Aberforth
Originally posted by: Cheex
How do you know that about Alan Wake?
Did I not read that piece of information somewhere?

Link please...??

I read in some interview that it streams data 'on the fly'. Also, in PCMark Vantage there is an Alan Wake data-streaming test that is used to test hard drive performance.

Sorry, I don't remember the link.


Oh.

I do suppose that is a better means of utilizing your hardware.
Then again, it has to be 'smart' enough to pre-load everything it needs, or else you'll have stuttering.

Yes, that is true. But I doubt anyone can max AW 🙁 There are rumours going around the net that no one can max it even with a quad and SLI.

 