Serious Question: Why are people upgrading to G80?


imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Raduque
Originally posted by: Crusader

But- Yes, you can get the same visual quality as G80 on DX9. Yet you never will in practice, because the same level of effects running under DX9 will end up so much slower than under DX10 that it won't even be feasible, or will run at a very low resolution in comparison to the DX10 path.
This is eventually, though; I'm not sure about the first DX10 apps.

If DX9 could perform with this level of image quality, devs would do it. But DX10 reduces a lot of overhead, and DX10 GPUs (all unified so far) offer vastly improved performance (no wasted shader resources).

I don't think you're really understanding what I was getting at. He said the G80 had "much higher visual quality", and I took that to mean the card can make DX9 games better looking than current DX9 cards, and I was asking how that was true.

Oh, I'm guessing he meant the G80 having the best AA/AF on the market. They've made huge strides there. In that regard, G80 will make DX9 games better looking than current DX9 cards, both ATI and NV.

Originally posted by: Raduque
Also, NO DX10 games are coming out for at least 6 months, though somebody may surprise us and deliver early. Those Crysis shots in the linked thread are under DX9.

I think you all are misunderstanding me. I am excited about DX10 and DX10 games, and the basic DX10 architecture, and I can't wait for it. But what I am absolutely NOT, in any way, shape or form excited about, is this card and its price. It's certainly not worth $500 more than my current card, nor is it worth more than the entire core of my current system (inc. vidcard, it was worth $575).

The fact that those Crysis shots are under DX9 doesn't mean much; everyone knows -most- of the DX10 visual improvements can be done under DX9, with more passes.
It's so much slower that way, though, that you probably won't be playing Crysis like THAT on a DX9 card at high res.

I wasn't going to upgrade to this card, for three reasons.
1. I didn't think it was unified. If it's not unified, it won't last nearly as long as a unified GPU will.
2. I didn't think it would have such an IQ bump. Not just SM4, but the AA/AF.
And the price? It's not bad when you compare it to the 7950GX2.. the IQ/performance/DX10-ness of this card makes the ~$600 GX2 look so bad that I was happy to plop down $650 for the GTX. And the X1950XTX is going for $400 today on Newegg; an 8800GTS would be a far better choice over that.

So looking at the old high end for both red and green camps-
8800GTX > 7950GX2
8800GTS > X1950XTX

3. I didn't think it'd be spanking (or at minimum, matching) X1950 Crossfire and 7900GTX SLI as much as it does. I'm not surprised that 7900GTX > Quad SLI, or that the 8800GTX beats Quad as well.

While I was convinced to get one 8800GTX.. I probably won't be buying two. :p As amazing as two would be, I have enough power and great image quality to last me a good while.
 

OBCT

Senior member
Jul 10, 2006
236
0
0
I just bought a X1900XT in August when I built my new computer. There's no reason for me to upgrade yet. But by the looks of it, ATI and nVidia are ready to go at it, which will be awesome for us consumers. My next upgrade should be a difficult yet enjoyable decision.
 

Piuc2020

Golden Member
Nov 4, 2005
1,716
0
0
I have my upgrade route all planned out, and I don't intend to get a G80/R600 until, well... at least until I get Vista and DX10 (the actual API) is out. I'm very tempted to get an 8800GTX for my Mac Pro (quad-core Xeon + 8800GTX goodness), but that probably won't be such a great idea.

In any case, G80 is a great card to play current DX9 games at 2560x1600... I'd like to say with AA, but it's kind of pointless using AA at such a high resolution; 2X AA would be enough. Heck, 2X at 1024x768 is already jaggy free. I'd rather have someone explain to me why people need 16X AA. I can understand the need for angle-independent, high-quality 16X AF, or in AA's case better or faster rendering methods like transparency, temporal, etc... but 16 samples? Talk about overkill.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
...heck 2X at 1024x768 is already jaggy free...
That level of AA at that resolution is a far cry from "jaggy free". You're just not seeing the degree of difference between higher AA modes.
...someone explain me why people need 16X AA...
Why do you need 2xAA?
...16 samples? talk about overkill.
More like talk about perfection. 16x SSAA is unrivaled so far in terms of image quality, and the G80 seems to be able to support such eye-candy with playability. Even if you were to just use 16 samples of MSAA or CSAA, you would still have wonderful image quality that makes 2xAA look like no AA while keeping performance where we like it. To say that 16 samples of AA is overkill is to say that the G80 is overkill--which, depending on your preferences, may or may not be true.

Bottom line: if you are content with 2xAA, great. In fact, it will probably save you quite a lot of money. But if you mean to say that there is no difference between 2xAA and 16xAA except for the performance hit, you are gravely mistaken.
 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Originally posted by: Crusader
Originally posted by: MegaWorks
Originally posted by: Crusader
Originally posted by: MegaWorks
To be honest, I'm sick and tired of hearing this on the forums every time a new card is released by nVidia or ATI. Same old crap over and over again.

Sounds like you are into the wrong hobby then, mang. Second thing to consider: if you are sick of hearing "it".. stop coming to forums? :confused:

I come to the forums to get information and not to compare "my e-penis is bigger than your e-penis". :roll:

No one is saying your legacy card sucks, or is comparing "e-peens".

You said you are tired of hearing everyone rant on about the latest and greatest.. that's a part of the hobby. The rest of us LOVE the "same old crap".

I hope "same old crap" never stops. Including R600 and G81. Let the good times roll.

At least, the rest of us are having a good time. You must have a hidden fetish for a G80 or something because everyone else is just excited.

Lol! I don't think I made myself very clear. My problem is not your hobby or anybody's hobby; you can talk all you want about the latest tech. I myself enjoy the latest tech in video cards or any other hardware advancements. The only downside to all of this is that some people intend to spread FUD, and that creates fanboyism flame wars!

I come to the forums on a daily basis to get the latest information in technology, especially about video cards. I'm a hardcore gamer and I want to know what's the best money can buy to play my favorite games. But seeing you guys go at it brings me back to the days when Rollo was around. :)
 

Pugnate

Senior member
Jun 25, 2006
690
0
0
No I think you made yourself pretty clear. Some people need a bit more elaboration though haha. :p
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
I didn't think it would have such an IQ bump. Not just SM4, but the AA/AF.
The only visual bump we had was the AF. The AA is actually worse now that SSAA isn't an option at the moment. The only nice thing about the AA now is that it can have true HDR with it, as well as not giving huge performance hits (i.e. 2 ROPs instead of 4).
And the price? Its not bad when you compare it to the 7950GX2..
Which tells you just how bad of a price the 7950GX2 was.
So, do nVidia's offerings now allow HDR+AA in Oblivion?
Yes, unless you're Anandtech. Then for some reason you can't do HDR+AA, even when you have two 8800GTX's in SLI with X6800 Conroe...
 

Kromis

Diamond Member
Mar 2, 2006
5,214
1
81
Originally posted by: josh6079
I didn't think it would have such an IQ bump. Not just SM4, but the AA/AF.
The only visual bump we had was the AF. The AA is actually worse now that SSAA isn't an option at the moment. The only nice thing about the AA now is that it can have true HDR with it, as well as not giving huge performance hits (i.e. 2 ROPs instead of 4).
And the price? Its not bad when you compare it to the 7950GX2..
Which tells you just how bad of a price the 7950GX2 was.
So, do nVidia's offerings now allow HDR+AA in Oblivion?
Yes, unless you're Anandtech. Then for some reason you can't do HDR+AA, even when you have two 8800GTX's in SLI with X6800 Conroe...

HIYO!!!
 

darXoul

Senior member
Jan 15, 2004
702
0
0
I've just ordered an eVGA 8800 GTX, even though I have a 19" LCD (1280*1024 native resolution) and I'm waiting for the NEC 24WGX3, to be released in April. Overkill?

Nope.

My current rig, equipped with an X1900XTX, can't properly handle some of the newer games with all the bells and whistles. The new card will provide a silky smooth frame rate at 1920*1200 with 4-8*AA and 16*AF activated, plus HDR where possible. Oblivion is finally gonna run like it deserves, NWN2 with its screwed-up engine will probably show decent performance with shadows on, etc. etc. A huge benefit IMHO.

In a nutshell, I'm buying G80 to enjoy truly smooth frame rates in eye candy settings - a task in which my XTX sometimes failed.
 

ayabe

Diamond Member
Aug 10, 2005
7,449
0
0
Originally posted by: Matt2
Originally posted by: flexy
Originally posted by: ayabe
For me personally, the 8800 is a non-starter. I can play every game I own at 1680x1050 with pretty much everything maxed and get acceptable frames.

come on. be HONEST.

You would've rather spent that $600 on an 8800GTX now :)

But I agree.. as of now the X1900XT is a great card, still.... it might be different thinking about upgrading from an X1900XT to a GTX..... from that point of view I am "lucky" I am still sitting with an old X850XT :)

Even my XTX @ 690/800 can be brought to its knees at 1680x1050

8800GTX at the VERY least negates SLI/Crossfire.

With what game? There isn't a game out there that I can't play at pretty much max settings, except NWN2, which has problems that will hopefully be fixed with a patch.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
What is beyond my comprehension, seeing all these threads about the same issue, is why we have to justify our purchase.

I know R600 will be released in a more appropriate time for Vista.
I know R600 will be faster.
I know I can't use SM4.0 capabilities yet.
I know that there will be a G81 probably in mid-late spring..
I know that X1900XTX could do me very well with the vast majority of today's games

But you know what? I DON'T CARE. I WANTED THE FASTEST SINGLE CARD WITH THE BEST IQ POSSIBLE RIGHT NOW.. And not to wait 2-3 months for ATI this time and buy an R600, which would probably require a new $$$ PSU as well..
PERIOD.
Do what you want to do, but never for a second believe that all those who upgrade now are fools. They know what they are doing for their own benefit. That's it, bye bye..

Now let me go play Oblivion with full LOD tweaks and 4096x4096 textures and HDR+4xAA@16x12. I feel like I'm playing a whole new game with the framerates I'm getting..
 

SniperWulf

Golden Member
Dec 11, 1999
1,563
6
81
I bought a GTS mostly because I was tired of dealing with SLI issues with the GX2. SLI is good in theory, but not necessarily in practice. When it works, it works well... but when it doesn't, it makes you feel like... why the hell did I spend all this money on these cards when I can only use one without issue in <insert favorite game title here>?
 

TanisHalfElven

Diamond Member
Jun 29, 2001
3,512
0
76
Originally posted by: HeroOfPellinor
Originally posted by: Pugnate
Ummm... I gave my 9800 to my sister a while back. Trust me, it cannot play Oblivion.

Not at max settings, but it will run it fine.

Yeah, you ppl who insist on playing everything at super max settings and then complain you cannot afford the latest tech make me sick.
 

yanman

Member
May 27, 2002
40
0
0
I'm puzzled as to how there could be any doubt. I game at 1920x1200; AA would be nice, but I don't use it at the moment and won't necessarily need it. The fact is, I need as much GPU as I can get to play Gothic 3 smoothly. Why would I go for an older-gen card when I can get the 8800GTS and get much better results? It's a no-brainer.