55 nm GT200 on October 22?


rjc

Member
Sep 27, 2007
99
0
0
Originally posted by: Compddd
Why Christmas rjc for 55nm 280? Let's hope for October! :)

Yes :)

Am just skeptical because they have had so much trouble so far. Expanding my previous post:
Nvidia have admitted that stock levels of their 65nm parts are high, and that it will take the rest of the financial year to work through (i.e. end of Jan '09). They also said they had misjudged the market, concentrating too much on the mid range and not working hard enough at the low end, where the market had shifted.

See my previous post in this thread for how well they are getting on with TSMC :(

So they have some 55nm allocation. What are they going to spend it on?
G96 - The 9500
This launched a month or so ago, the first product in their new direction. It looks like a shrunk G84 (8600). They had to launch this at 65nm! Apparently later versions are supposed to be 55nm. This is to be the new high-volume chip for them.
G94 - The 9600
This is 65nm as well; haven't heard anything about die-shrinking it. Maybe it is getting squeezed too much from above by the 9800GT and from below by upmarket 9500s.
G92 - The 9800
The best seller! Again they had to launch the 9800GT with some chips at 65nm and some at 55nm. They also seemed to have trouble with the 9800GTX+, which launched at the same time as the 4850 but was hard to get for ages (..haven't checked lately, is it easy to get now?)
GT200 - GTX260, 280
One of the biggest chips ever engineered - almost like two G92s in one chip, presumably designed to return the G80 glory days to Nvidia. After the G92 shortages last year, everybody ordered heaps of these so as not to get caught out again by the craziness and lost sales of the 8800GT introduction. No engineering sample has been seen yet, nor any leaked benchmark indicating the existence of 55nm samples.

OK, so there's the menu - where do you spend your allocation? You can't have them all ;)

(The above doesn't include mobile parts, which are increasing in sales and would possibly also be considered for 55nm.)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: apoppin
i am thinking it will outperform the current one by 10-15+% - to catch the X2; and maybe introduce a 'Ultra' to beat it ..
--but the old 280GTX should definitely be worth sli'ing together with the new one - in my book [since i have one now and i still can OC it further to get close to the new GTX' (stock) performance]

I don't agree with this, 10-15% would not catch the 4870X2 in many benches that scale.

However, I don't think it has to, because there's no debate that a similar level of performance delivered by a single GPU is always preferable to multiGPU.

So if a GTX280+ (or whatever they call it) halves the difference in performance between a GTX280 and a 4870X2, it would be the only card to buy IMO. The X2 wouldn't differentiate itself in any meaningful way at that point; unless you have a 25x16 monitor, it doesn't differentiate itself all that much now. (I'm unaware of any games a GTX280 can't run fine at 19x12 4X/16X, except Crysis.)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nRollo
Originally posted by: apoppin
i am thinking it will outperform the current one by 10-15+% - to catch the X2; and maybe introduce a 'Ultra' to beat it ..
--but the old 280GTX should definitely be worth sli'ing together with the new one - in my book [since i have one now and i still can OC it further to get close to the new GTX' (stock) performance]

I don't agree with this, 10-15% would not catch the 4870X2 in many benches that scale.

However, I don't think it has to, because there's no debate that a similar level of performance delivered by a single GPU is always preferable to multiGPU.

So if a GTX280+ (or whatever they call it) halves the difference in performance between a GTX280 and a 4870X2, it would be the only card to buy IMO. The X2 wouldn't differentiate itself in any meaningful way at that point, unless you have a 25X16 monitor it doesn't differentiate itself all that much now. (I'm unaware of any games a GTX280 can't run at 19X12 4X16X fine, except Crysis)

"catch" was probably a poor choice of a single word - i know i thought about it when i originally posted it, but i left it with the addition of a '+' sign

"get closer to" or "narrow the gap" were probably more accurate as it takes about 20+% more performance out of the 280 to actually catch it - usually beyond a simple core tweak and speed bump the shrink delivers. There is a "feeling" i get that Nvidia also may bring out a very selected and highly O/C'd GTX as their "ultra" to actually attempt to take back the performance crown

You might want to *add* Warhead and Clear Sky to your list of "struggling at 19x12" [with maxed details]:p
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: apoppin

How can AMD aggressively price the X2 against GTX280?
- more importantly "why" as it is the faster card

$550 is a lot to pay for that beast
- but if you want the fastest, you are over AMD's price barrel - i got lucky; i got mine for $469 on sale

Well, because in a lot of popular games the X2 obliterates a single 280GTX. That's quite a small price to pay for an extra $150 over the GTX280 if you are playing at 25x16, as nRollo said:

Check the benches:

Age of Conan 2560x1600 4AA/16AF
280 = 22.7
X2 = 50.3 (+122%)

ET: QW 2560x1600 4AA/16AF
280 = 71.2
X2 = 100.3 (+41%)

Oblivion 2560x1600 4AA/16AF
280 = 37.7
X2 = 50.7 (+34%)

GRID 2560x1600 4AA
280 = 40.6
X2 = 82.6 (+103%)

The Witcher 2560x1600 2AA
280 = 37.2
X2 = 56.2 (+51%)

It doesn't seem to scale well in Crysis and Assassin's Creed though. But since the target market for the X2 is users who play at 2560x1600, it's the only single card that's worth buying for that resolution. Of course, GTX 260s in SLI are direct competition in that case.
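For anyone who wants to sanity-check those scaling figures, here's a quick sketch - the fps numbers are the ones quoted above; only the rounding is mine:

```python
# Percent gain of the 4870 X2 over a single GTX 280, from the fps figures above.
benches = {
    "Age of Conan": (22.7, 50.3),
    "ET: QW": (71.2, 100.3),
    "Oblivion": (37.7, 50.7),
    "GRID": (40.6, 82.6),
    "The Witcher": (37.2, 56.2),
}
for game, (gtx280_fps, x2_fps) in benches.items():
    gain_pct = round((x2_fps / gtx280_fps - 1) * 100)
    print(f"{game}: +{gain_pct}%")
```

This reproduces the +122/+41/+34/+103/+51 percentages listed above.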
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
you know what, I still wonder how the GTX280 would have performed with GDDR5 (and 512bit bus)...
 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: rjc
I would have thought nvidia would be concentrating on getting the 55nm G92 and other higher selling lines going on smaller process first.

That's my thought too. With all the excitement about high-end cards in enthusiast forums, it's easy to forget that in reality the volume is in the low-mid range.

Originally posted by: taltamir
you know what, I still wonder how the GTX280 would have performed with GDDR5 (and 512bit bus)...

Doubling the memory bandwidth? How much more performance does the GTX 280 get now from strictly memory overclocking (leaving the core alone)?
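For rough context, here's a back-of-envelope sketch of the peak bandwidth numbers involved. It assumes the GTX 280's stock GDDR3 clock (1107 MHz, 2214 MT/s effective) and, purely as a hypothetical, GDDR5 at the HD 4870's 3600 MT/s effective rate on the same 512-bit bus - illustrative figures, not any leaked spec:

```python
# Peak memory bandwidth = bus width (in bytes) x effective transfer rate.
def bandwidth_gb_s(bus_bits, effective_mtps):
    """Bus width in bits, effective rate in MT/s -> peak GB/s."""
    return bus_bits / 8 * effective_mtps / 1000

stock_gtx280 = bandwidth_gb_s(512, 2214)   # GDDR3 -> ~141.7 GB/s
gddr5_on_512 = bandwidth_gb_s(512, 3600)   # hypothetical 4870-speed GDDR5 on the same bus
print(stock_gtx280, gddr5_on_512, gddr5_on_512 / stock_gtx280)
```

At 4870-style clocks it works out to roughly +60% bandwidth rather than a clean doubling; actual GDDR5 clocks on such a card could of course differ.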
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: taltamir
you know what, I still wonder how the GTX280 would have performed with GDDR5 (and 512bit bus)...

i think we are going to find out with their Ultra .. they've got to get that magic +25% if they want to get close to the X2's performance

But since the target market for X2 are users who play at 2560x1600, it's the only single card that's worth buying for that resolution. Of course, the GTX 260 SLI are direct competition in that case.
If the target market for X2 is 25x16, AMD is going out of business
.. it is that small

19x12 is perfect for a 4870x2 if you are like me and prefer running with everything completely maxed and also love to play with CF filtering

i wonder which benches they used for ET-QW ? Mine are a lot closer at 19x12 between the 4870x2 and the 280GTX

and running FRAPS benches in Oblivion is useless .. it - along with the Witcher - is a *Real World* bench, subject to a LOT of variation and a margin of error over 10% - useless unless they spell out their criteria for benchmarking, and i don't see it in Derek's testing; perhaps you can point out the methodology he used - especially for Oblivion

Damn .. you just made quite an argument for 260 SLI over the X2 .. what are 260's? ~$225ea after rebate?

 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: apoppin

If the target market for X2 is 25x16, AMD is going out of business
.. it is that small

How? I am guessing (without checking AMD's actual financials) that most of the money is made in the <$250 graphics card market (lower contribution margin, but much higher volume offsets the loss in % profits). IMO the main purpose of the 4870 X2 is the performance crown/bragging rights. With Q3 numbers we should see ATI take market share from NV with the 4670, 4850 and 4870. For the most part I don't believe that management is that concerned about X2s (I mean, where is the 4850X2? Continued driver support for the 3870 X2?) when it comes to net earnings on the financial statements. It's an image booster, like the Dodge Viper is.

i wonder which benches they used for ET-QW ? Mine are a lot closer at 19x12 between the 4870x2 and the 280GTX

In all honesty though, 70 frames for the 280GTX is more than sufficient. But looking at the numbers alone, the X2 seems faster.

and running FRAPS benches in Oblivion are useless .. they are - along with the Witcher - *Real World* benches and subject to a LOT of variation and margin of error over 10% - useless they spell out their criteria for benchmarking and i don't see it in Derek's testing; perhaps you can point out the methodology he used - especially for Oblivion

Good point but then the same applies for both cards. Also check benches around the net and you'll see that ATI's cards are faster in Oblivion - 4870 > GTX 280 2560x1600 8AA + HDR

Damn .. you just made quite an argument for 260 SLI over the X2 .. what are 260's? ~$225ea after rebate?
$210! For anyone with an SLI board, it's hard to recommend X2 over GTX260s (except if you are running multiple monitors).
 

sourthings

Member
Jan 6, 2008
153
0
0
Originally posted by: RussianSensation
Originally posted by: apoppin

How can AMD aggressively price the X2 against GTX280?
- more importantly "why" as it is the faster card

$550 is a lot to pay for that beast
- but if you want the fastest, you are over AMD's price barrel - i got lucky; i got mine for $469 on sale

Well because in a lot of popular games the X2 is obliterates a single 280GTX. That's quite a small price to pay for extra $150 over GTX280 if you are playing at 25x16 as nRollo said:

Check the benches:

Age of Conan 2560x1600 4AA/16AF
280 = 22.7
X2 = 50.3 (+122%)

ET: QW 2560x1600 4AA/16AF
280 = 71.2
X2 = 100.3 (+41%)

Oblivion 2560x1600 4AA/16AF
280 = 37.7
X2 = 50.7 (+34%)

GRID 2560x1600 4AA
280 = 40.6
X2 = 82.6 (+103%)

The Witcher 2560x1600 2AA
280 = 37.2
X2 = 56.2 (+51%)

It doesn't seem to scale well in Crysis and Assassin's Creed though. But since the target market for X2 are users who play at 2560x1600, it's the only single card that's worth buying for that resolution. Of course, the GTX 260 SLI are direct competition in that case.

Yeah, this is pretty much the case; it's not reasonable to think any die shrink or whatever other iteration of the GTX 280 that gets released will ever outperform the 4870X2 when it's scaling properly. As it stands now, when the X2 scales it's about 40-60% faster depending on the title.

That said, I am regretting having bought one, as it's overkill for my 24" monitor; in hindsight I should have gotten a 280.

Now that said, if you are using a 30" monitor to game, multi-GPU is a veritable necessity for modern games, more so when you factor in AA/AF. Anyone gaming at 2560x1600 should only be looking at the 4870X2, or two of them, as Nvidia has nothing to compete with the card in multi-GPU. I believe the only multi-GPU bench at 30" Nvidia wins is Crysis using 3x280s, and by a sliver at that. In most benches at that res, a single X2 is faster than 2x280 in SLI. Pretty impressive, seeing as you also save $300 and don't have to use a crappy Nvidia motherboard.

The X2 is impressive in its own right, but it's such a monster, it's meant for a monster monitor.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: RussianSensation
Originally posted by: apoppin

If the target market for X2 is 25x16, AMD is going out of business
.. it is that small

How? I am guessing (without checking AMD's actual financials) that most of the money is made in <$250 graphics card market (lower contribution margin but much higher volume offets loss of % profits). IMO the main purpose of X2 4870 is for performance crown/bragging rights. With Q3 numbers we should see ATI take away market share from NV with 4670, 4850 and 4870. For the most part I don't believe that management is that concerned about X2s (I mean where is 4850X2? Continued driver support for 3870 X2?) when it comes to net earnings on the financial statements. It's an image booster like the Dodge Viper is.

i wonder which benches they used for ET-QW ? Mine are a lot closer at 19x12 between the 4870x2 and the 280GTX

In all honesty though 70 frames for 280GTX is more than sufficient. But looking at numbers alone X2 seems faster.

and running FRAPS benches in Oblivion are useless .. they are - along with the Witcher - *Real World* benches and subject to a LOT of variation and margin of error over 10% - useless they spell out their criteria for benchmarking and i don't see it in Derek's testing; perhaps you can point out the methodology he used - especially for Oblivion

Good point but then the same applies for both cards. Also check benches around the net and you'll see that ATI's cards are faster in Oblivion - 4870 > GTX 280 2560x1600 8AA + HDR

Damn .. you just made quite an argument for 260 SLI over the X2 .. what are 260's? ~$225ea after rebate?
$210! For anyone with an SLI board, it's hard to recommend X2 over GTX260s (except if you are running multiple monitors).

AMD's money maker is the 4850 .. awesome bang for buck and their bottom line; heck they were selling them in pairs!

The X2 is "status" and a very fast card .. it is one that i imagine they will continue to support so as to keep their 'crown' as long as possible
. . . the 4850x2 is a strange beast ... you are right .. i guess it's aimed directly at the 280GTX in every way including price; AMD appears to be in no hurry

X2 IS faster than the 280 in ET-QW .. i see it .. i just did not get a great difference at 19x12 and i wondered which of the 4 established benches they used

Oblivion is the hardest game i ever tried to bench repeatably ... well, BioShock is also tough to do RW benches in
-i see differences if you run 50 times in a row. You can try to load a save and count the fps the instant it launches, or do a very short 2 or 3 second run, but they all have their disadvantages and big margins of error; just an observation about Oblivion and some RW testing.


Wow! it looks like 260 SLI is the new king of bang-for-buck performance!!!
- i'd love to get a sli board .. but later .. perhaps when i get my 2nd GTX290 .. by then maybe i can run them all in x58



 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: RussianSensation
Damn .. you just made quite an argument for 260 SLI over the X2 .. what are 260's? ~$225ea after rebate?
$210! For anyone with an SLI board, it's hard to recommend X2 over GTX260s (except if you are running multiple monitors).

Actually only $190 each!

Click on the $210 link and click to show all combo deals. The deal is: if you buy a single card for $240 there is a single $30 rebate to get the $210 price. HOWEVER, if you buy two cards via the special combo link, it is $480 for the two cards, but there is a special $100 rebate for two cards on one invoice, making it $380 for two cards, or $190 per card. Heck, even if you don't need two cards, you can probably sell the second one for $200 (after figuring out which is the better overclocker to keep for yourself ;) ) and end up with a $180 GTX 260... or something like that.
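The combo-deal arithmetic, spelled out (prices as quoted in the post, nothing new):

```python
# Two-card combo math: $240/card, $100 rebate on a two-card invoice.
single_price = 240                      # per-card price at checkout
pair_cost = 2 * single_price - 100      # rebate applies once per invoice -> $380
per_card = pair_cost // 2               # $190 effective per card
after_selling_spare = pair_cost - 200   # ~$180 net if the spare sells for $200
print(pair_cost, per_card, after_selling_spare)
```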
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Compddd
When do you think GTX 280 55nm will be out nRollo?

I know the projected release, but it's NDA. (as usual)

If you start seeing links to old heavy metal songs in my sig and vague references to a good new card, you'll know it's close. (The furthest in advance I've ever received a card is 4 weeks, and it's usually 1-2.)
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
which means when translated, he doesn't have it yet so it is probably more than 1-2 weeks off

$380 for GTX 260 SLI is great bang-for-buck - if you have the MB; better performance than the X2 for $550, never mind what it does to a single 280 performance-wise .. i might have considered it with an SLI MB if i hadn't gotten lucky with my 4870x2's price - of course, the 260 was pretty expensive back when i got my X2 :p
. . . YMMV regarding the other rebates from Newegg, btw

i know what Nvidia was doing with all the cash it stashed all last year

- saving it for a day like today


You know "i am given" is kinda awkward . .
.. what about "get" ?


:D

i hear the new Metallica album is like their roots - have you heard it yet?
- i will be watching for references
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: apoppin

i hear the new Metallica album is like their roots - have you heard it yet?
I like it a lot but I was never big on their old stuff (I prefer the black album onwards).
 

rjc

Member
Sep 27, 2007
99
0
0
What I forgot to mention yesterday in my post on why the 55nm GT200 is likely not coming till Christmas was the slide leaked from ELSA in China:
ELSA hints NVIDIA's next step is 45nm

If you look, the GT206 (the 55nm GT200) is not due till the end of '08/start of '09. And it looks like they are dividing the GTX260 into a different product stream from the GTX280: the GT206 at 55nm replaces the GTX260, and the 45nm GT212 replaces the GTX280.

I guess they want GT206 to compete with ATIs products and the GT212 out on its own free from competition to make them lots of money.

Note also from the product map that now is the time for the 9800GTX+ to go all-55nm; it looks like that has been prioritised ahead of any GT200 replacement. So at the moment Nvidia is using its 55nm allocation on the G96b (9500 and 9400) and the G92b, neither of which appears to be anywhere near fully transitioned yet, as there are still a lot of 65nm products intermixed with the new stuff.

So if you're interested in the GT206, look at how well the G92b/G96b transitions are going. Surely they would want to get the simpler chips working well on the new process before they tackle the monster.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: Zap
Originally posted by: taltamir
you know what, I still wonder how the GTX280 would have performed with GDDR5 (and 512bit bus)...

Doubling the memory bandwidth? How much more performance does the GTX 280 get now from strictly memory overclocking (leaving the core alone)?
And it wouldn't even be that good. Overclocking lowers the latency (in ns) of the GDDR3, but at stock, both GDDR3 and GDDR5 would have similar latencies. So the gain from overclocked GDDR3 would overstate what you'd get out of GDDR5, if it's the improvement in latency that's affecting performance.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Zap

Actually only $190 each!

Click on the $210 link and click to show all combo deals. Deal is if you buy a single card for $240 there is a single $30 rebate to get the $210 price. HOWEVER, if you buy two cards in the special combo link, it is $480 for the two cards, but there is a special $100 rebate for two cards on one invoice

I see it. It says rebate ends Sept 25th. That means you can now buy 2xGTX260s for the price of a single GTX 280!! :shocked:

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: nitromullet
Originally posted by: apoppin

i hear the new Metallica album is like their roots - have you heard it yet?

have a listen...

http://www.metallica.com/index.asp?item=601231

he he .. i forgot i can do that now
- i still have 56K



. . . as backup to my [wild blue] Satellite connection :p


Wow .. just like their roots ..
the old geezers took 13 months to make it


I see it. It says rebate ends Sept 25th. That means you can now buy 2xGTX260s for the price of a single GTX 280!!

Like i said .. unreal bang-for-buck!!
- what i paid for a single 280; of course i plan to get a 290 to sli with it
:)