Nvidia's next-gen high end


RZaakir

Member
Sep 19, 2003
116
0
0
Originally posted by: Ylurien
And the damn thing can't even run Oblivion well (although that game needs a bit more optimizing if you ask me).

Huh? Mine runs Oblivion at 1920x1200 and everything turned up with no problem here. Performance under Vista has finally become respectable as well.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: bryanW1995
Originally posted by: ronnn
You still miss my point: if Nvidia has something faster right now, many people who already have an Ultra would buy. That's cash in hand. Assuming it will still be the fastest in 6 months is risky versus guaranteed sales. It would also command a higher premium, as its lead over the 2900 Pro (soon to be 2950) would be greater; or, for the Nvidia-only audience, the lead over the GTS.

Actually, I never said Nvidia doesn't have a high end because of difficulties; I just said it's possible. Personally, I think they are following their road map, which appears to be very similar to AMD's: multiple GPUs that are smaller and produce less heat. Heat and power seem to be the barrier at this time.

Still, there are rumours that the G92 is not the best, likely fueled by gossip sites or, as you say, by AMD or Nvidia marketing. Hopefully it will be interesting times this November, with both on the shelves.
Researching a new GPU is relatively cheap; changing over their fab production to a new GPU is NOT. They have a break-even point which they must hit to make a card upgrade worthwhile versus continuing to produce what they have right now. If they can clear $50 on an 8900 Ultra but they only sell 1,000 of them, it's definitely not worth it for them to switch. That's the beauty of midrange GPUs: their market is many times larger than the high end, so you need a much lower percentage of users to adopt your new card. Also, midrange users have been without DX10 thus far, making it the next logical battleground. We all know that halo effects will help all other cards sell if you have the fastest, but Nvidia's fortunes can turn quickly if DAAMIT can produce a better midrange card for the same or less money.

Which is what I have been saying, except that my understanding is that R&D is not exactly cheap. If Nvidia had a card twice as quick that would produce with good yields, there is money to be made; I am fairly sure they would sell more than a thousand at more than $50 profit. We all see the economic beauty of a midrange chip that can be doubled up for the high end. Like others, I doubt either company can make it work for more than a few choice benchmarks, though. But maybe if new games are coded with this in mind, things might change.
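To make the break-even math in these two posts concrete, here is a minimal sketch; the fixed-cost figure is purely hypothetical, while the $50-per-card margin and the 1,000-unit volume come from the posts above.

```python
# Rough break-even sketch with a hypothetical retooling cost; only the $50
# margin and the 1,000-unit figure come from the posts above.
fixed_cost = 5_000_000        # assumed cost of switching fab production, USD (hypothetical)
margin_per_card = 50          # profit per card, USD (the $50 figure from the quote)
units_sold = 1_000            # the low-volume scenario from the quote

break_even_units = fixed_cost / margin_per_card
recovered = units_sold * margin_per_card

print(f"Units needed just to break even: {break_even_units:,.0f}")
print(f"Recovered at {units_sold:,} units: ${recovered:,} of ${fixed_cost:,}")
```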
 

alcoholbob

Diamond Member
May 24, 2005
6,387
465
126
Originally posted by: RZaakir
Originally posted by: Ylurien
And the damn thing can't even run Oblivion well (although that game needs a bit more optimizing if you ask me).

Huh? Mine runs Oblivion at 1920x1200 and everything turned up with no problem here. Performance under Vista has finally become respectable as well.

I believe he's merely defining "well" as in: an 8800 GTX with a quad core doing 30 fps in outdoor scenes in a two-year-old game isn't that impressive given the cost. And if this were a shooter, 30 fps would be borderline unplayable at a serious difficulty setting or in a competitive online shooter.

And it also shows pretty much how unplayable it is unless you have a load of money to blow.

Of course, Oblivion isn't that bad, at least compared to Bethesda's previous attempt, the obscenity that was Morrowind (top-end hardware two years later was barely managing 20 fps in outdoor scenes).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: ronnn
You still miss my point: if Nvidia has something faster right now, many people who already have an Ultra would buy. That's cash in hand. Assuming it will still be the fastest in 6 months is risky versus guaranteed sales. It would also command a higher premium, as its lead over the 2900 Pro (soon to be 2950) would be greater; or, for the Nvidia-only audience, the lead over the GTS.

Actually, I never said Nvidia doesn't have a high end because of difficulties; I just said it's possible. Personally, I think they are following their road map, which appears to be very similar to AMD's: multiple GPUs that are smaller and produce less heat. Heat and power seem to be the barrier at this time.

Still, there are rumours that the G92 is not the best, likely fueled by gossip sites or, as you say, by AMD or Nvidia marketing. Hopefully it will be interesting times this November, with both on the shelves.

actually i didn't ... let me continue to expand on what i said:

the people who have an ultra right now are encouraged to buy another one.. i already see people considering that 2nd GTS-640 for experimenting with sli just as i am with Xfire - to get assured performance.

IF nvidia released a new card now - they'd tip their hand - *But they don't have to* ... AMD has to make a move first as they are in 3rd position with their "high-end" GPU

it all points to a wise business decision by nvidia ... only ~1% of their profit is made on the high end directly - they can afford to make that 1% wait while they *force* AMD to move with their 2950xt first .... as soon as nvidia knows "what's coming" they announce and release their long-prepared counter and their hope is to crush it.

Personally, i don't think we are going to see the Ultra most of you guys are hoping for this year - unless you think 2950xt is going to be much better than the 2900xt :p ... i am looking for the "usual" evolutionary "speed bump" that comes with a die shrink and some core improvements ... and of course DX10.1 and all the "goodies" to tempt most of you to upgrade.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Could we potentially see multi-GPU setups as the future?
Based on nVidia's track record of driver support I hope not, especially their Vista SLI support. Also, Crossfire still doesn't allow custom profiles, forcing users to resort to renaming executables; that's unacceptable.
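As an illustration of that renaming workaround (a rough sketch only; both filenames below are hypothetical placeholders, not actual driver profile names), the trick amounts to launching a copy of the game's executable under a name the driver already has a multi-GPU profile for:

```python
# Illustrative sketch: copy the game's executable to a filename the driver
# already recognizes, then launch the copy so its profile gets applied.
import os
import shutil

game_exe = "MyGame.exe"             # hypothetical: game without a Crossfire profile
profiled_name = "ProfiledGame.exe"  # hypothetical: name the driver has a profile for

if os.path.exists(game_exe):
    shutil.copy2(game_exe, profiled_name)  # keep the original; run the renamed copy instead
    print(f"Launch {profiled_name} so the driver applies its existing profile.")
else:
    print(f"{game_exe} not found; adjust the paths for your install.")
```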

Nvidia already has experience with 4-way SLI, and from what I have seen it works well.
Quad SLI has never worked well on XP and only a handful of games ever showed performance gains over regular SLI. Also it doesn't work at all on Vista.

Additionally, one of the untold problems with AFR-style rendering is the increase in input lag due to the buffering of frames, and the more GPUs you've got, the bigger the problem. This can lead to laggy input even when your framerate is high.
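A minimal sketch of that buffering argument, assuming AFR keeps roughly one extra frame in flight per additional GPU (the actual queue depth depends on the driver):

```python
# Minimal sketch: if AFR queues roughly one extra frame per additional GPU,
# the added input lag is that many frame-times on top of the single-GPU case.
def afr_extra_lag_ms(fps: float, gpus: int) -> float:
    frame_time_ms = 1000.0 / fps
    return frame_time_ms * (gpus - 1)

for gpus in (2, 3, 4):
    print(f"{gpus}-way AFR at 60 fps: ~{afr_extra_lag_ms(60, gpus):.1f} ms of extra input lag")
```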

I personally think SLI/Crossfire's best advantage is the doubling of AA and free super-sampling but unfortunately neither vendor seems to be aggressively pursuing it this generation.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Astrallite
Originally posted by: RZaakir
Originally posted by: Ylurien
And the damn thing can't even run Oblivion well (although that game needs a bit more optimizing if you ask me).

Huh? Mine runs Oblivion at 1920x1200 and everything turned up with no problem here. Performance under Vista has finally become respectable as well.

I believe he's merely defining "well" as in: an 8800 GTX with a quad core doing 30 fps in outdoor scenes in a two-year-old game isn't that impressive given the cost. And if this were a shooter, 30 fps would be borderline unplayable at a serious difficulty setting or in a competitive online shooter.

And it also shows pretty much how unplayable it is unless you have a load of money to blow.

Of course, Oblivion isn't that bad, at least compared to Bethesda's previous attempt, the obscenity that was Morrowind (top-end hardware two years later was barely managing 20 fps in outdoor scenes).
Funny, I bought Morrowind so that I could get used to it before getting Oblivion. How do I check FPS in it?

 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: apoppin
Originally posted by: ronnn
You still miss my point: if Nvidia has something faster right now, many people who already have an Ultra would buy. That's cash in hand. Assuming it will still be the fastest in 6 months is risky versus guaranteed sales. It would also command a higher premium, as its lead over the 2900 Pro (soon to be 2950) would be greater; or, for the Nvidia-only audience, the lead over the GTS.

Actually, I never said Nvidia doesn't have a high end because of difficulties; I just said it's possible. Personally, I think they are following their road map, which appears to be very similar to AMD's: multiple GPUs that are smaller and produce less heat. Heat and power seem to be the barrier at this time.

Still, there are rumours that the G92 is not the best, likely fueled by gossip sites or, as you say, by AMD or Nvidia marketing. Hopefully it will be interesting times this November, with both on the shelves.

actually i didn't ... let me continue to expand on what i said:

the people who have an ultra right now are encouraged to buy another one.. i already see people considering that 2nd GTS-640 for experimenting with sli just as i am with Xfire - to get assured performance.

IF nvidia released a new card now - they'd tip their hand - *But they don't have to* ... AMD has to make a move first as they are in 3rd position with their "high-end" GPU

it all points to a wise business decision by nvidia ... only ~1% of their profit is made on the high end directly - they can afford to make that 1% wait while they *force* AMD to move with their 2950xt first .... as soon as nvidia knows "what's coming" they announce and release their long-prepared counter and their hope is to crush it.

Personally, i don't think we are going to see the Ultra most of you guys are hoping for this year - unless you think 2950xt is going to be much better than the 2900xt :p ... i am looking for the "usual" evolutionary "speed bump" that comes with a die shrink and some core improvements ... and of course DX10.1 and all the "goodies" to tempt most of you to upgrade.
I'm very interested to see what the 2950 XT has in store for us. Supposedly ATI chose not to go after the 8800 GTX/Ultra because of heat/noise issues. Those will undoubtedly be greatly reduced with the switch to 55nm. Will it be enough to make them competitive at the high end? Tune back in during Q1, or, um, maybe Q2 or Q3 depending on how they are feeling next year...

 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: bryanW1995
Originally posted by: apoppin

. . . it all points to a wise business decision by nvidia ... only ~1% of their profit is made on the high end directly - they can afford to make that 1% wait while they *force* AMD to move with their 2950xt first .... as soon as nvidia knows "what's coming" they announce and release their long-prepared counter and their hope is to crush it.

Personally, i don't think we are going to see the Ultra most of you guys are hoping for this year - unless you think 2950xt is going to be much better than the 2900xt :p ... i am looking for the "usual" evolutionary "speed bump" that comes with a die shrink and some core improvements ... and of course DX10.1 and all the "goodies" to tempt most of you to upgrade.
I'm very interested to see what the 2950 XT has in store for us. Supposedly ATI chose not to go after the 8800 GTX/Ultra because of heat/noise issues. Those will undoubtedly be greatly reduced with the switch to 55nm. Will it be enough to make them competitive at the high end? Tune back in during Q1, or, um, maybe Q2 or Q3 depending on how they are feeling next year...

*exactly* ... no one knows ... AMD is keeping their cards close [unlike formerly leaky ATi]

That is why nvidia is waiting ... for AMD, too
--nvidia then chooses how to answer them

of course, if HD2950xtx is a *monster*, we may see another DustBuster from nvidia :p
[doubtful]
 

Hauk

Platinum Member
Nov 22, 2001
2,806
0
0
Originally posted by: IsLNdbOi
This is a good thing. It was crazy having new cards coming out every other month.

Good news for those who've had a GTX for some time; the most mileage for the money. Bad news for those who were holding off for next gen. It's time the GTX dropped to the $400 range though, especially if this news is true. nVidia, you've recouped your R&D costs. Drop the price on the G80.
 

SniperDaws

Senior member
Aug 14, 2007
762
0
0
Originally posted by: SteelSix
Originally posted by: IsLNdbOi
This is a good thing. It was crazy having new cards coming out every other month.

Good news for those who've had a GTX for some time; the most mileage for the money. Bad news for those who were holding off for next gen. It's time the GTX dropped to the $400 range though, especially if this news is true. nVidia, you've recouped your R&D costs. Drop the price on the G80.


Yep, I'm one of the suckers holding out for the next ATI/Nvidia card, whichever's faster, and I'm gutted. Still, it's not a total loss, because in my opinion the 8800/2900 series aren't running new games very well: Colin McRae: DiRT, Company of Heroes DX10, World in Conflict, Medal of Honor: Airborne. If I had an 8800 GTX or 2900 XT I would expect to be getting at least 80 fps in these games, but some of them are only managing 30-40 fps at 1600x1200 with everything maxed out, and that, as far as I'm concerned, isn't worth the money.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: BFG10K
Could we potentially see multi-GPU setups as the future?
Based on nVidia's track record of driver support I hope not, especially their Vista SLI support. Also, Crossfire still doesn't allow custom profiles, forcing users to resort to renaming executables; that's unacceptable.


Quad SLI has never worked well on XP and only a handful of games ever showed performance gains over regular SLI. Also it doesn't work at all on Vista.

This is one of the things that annoys me about the forums here.
Every topic about video cards or a CPU being fast is about gaming applications.
People can't seem to see past gaming use.

Quad SLI works fine in XP on professional applications, proving the technology works and works well. If it's not working well for gaming, that's a driver issue and not a hardware one. Applications that are designed to support it work fine.




 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: apoppin

of course, if HD2950xtx is a *monster*, we may see another DustBuster from nvidia :p
[doubtful]


Then heads will roll, as they could have been selling that card without competition for 6 months. ;)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
They are already selling the top two cards without competition; why would they up the speed/cost AT ALL right now?
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: bryanW1995
They are already selling the top two cards without competition; why would they up the speed/cost AT ALL right now?
They're not. They're making G92, which will certainly cost less to make than an 8800-based card (and probably didn't even require a large engineering investment). Even if it outperforms current cards, so long as it's cheaper for nVidia to make, they should build it; it will increase their profits.

One day they'll have to make a high end card and compete with themselves in a sense. They just need to wait until the high end market is completely saturated. Once people stop buying their graphics cards, they will be forced to do something. Look at what happened with AMD and Barcelona. :light:
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
From what I've seen/heard, G92 is going to be a midrange card (8700 GTS) that will be close but not equal to the 8800 GTS. Their intro for the high end is supposed to be Q2 '08.
 

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Originally posted by: SniperDaws
Yep, I'm one of the suckers holding out for the next ATI/Nvidia card, whichever's faster, and I'm gutted. Still, it's not a total loss, because in my opinion the 8800/2900 series aren't running new games very well: Colin McRae: DiRT, Company of Heroes DX10, World in Conflict, Medal of Honor: Airborne. If I had an 8800 GTX or 2900 XT I would expect to be getting at least 80 fps in these games, but some of them are only managing 30-40 fps at 1600x1200 with everything maxed out, and that, as far as I'm concerned, isn't worth the money.

...your expectations are entirely unrealistic. The 8800/2900 cards are good, but they aren't some miracle sent from above. I cannot think of a time when any current-generation high-end card ran the majority of the latest game engines with the eye candy on at 80 FPS. I don't even think SLI/Crossfire can boast 80+ FPS in many of the newest titles. Usually, if you want the best gaming experience, you will need to wait until the next generation of cards is launched after a game is released. This was the case for Far Cry, Doom 3, HL2, FEAR, Oblivion, and many others, and it will continue to be true for DX10 titles as well.