9800 GTX, Will You Buy? With Poll!


v8envy

Platinum Member
Sep 7, 2002
Originally posted by: Anath
In all honesty I would like to see a new competitor come to the market.

You'll get your wish. Around the 2010 timeframe, Intel will be entering the high-end GPU market. Engineering samples should be ready by the end of this year.
 

Denithor

Diamond Member
Apr 11, 2004
Naw... UPS says my shiny new 9600GT should be arriving today, and that'll take care of my gaming needs for at least the next 6-12 months (I've lasted about 16 months on my current X1900GT). At $125 after MIR, I think it's a pretty good bargain for the level of performance it provides.
 

Canterwood

Golden Member
May 25, 2003
9800GTX, what a sad joke!

Nvidia needs to quit regurgitating old tech.

Wouldn't waste my money on one.
 

thegimp03

Diamond Member
Jul 5, 2004
I was considering it since I have a 7900 GTO from 10/2006, but I think I'll wait. None of the games I'm playing right now really need the power of the 9800 GTX, and while it's at a decent price point, I can't justify it with Nvidia coming out with a new card in the Fall.
 

v8envy

Platinum Member
Sep 7, 2002
I think Nvidia is demonstrating very clearly that even the company at the top needs to compete with its own products if it wants to keep sales momentum going.

A slight incremental improvement over cards enthusiasts already have in their hands (and, in the case of the Ultra, a step down in performance), combined with only ONE game out there demanding graphics power this card doesn't have anyway, makes this flagship release a total yawner. Definitely a card to keep in mind for very special EVGA Step-Up circumstances or maybe a brand-new build, but not at all a consideration for those of us with an 8800GT or better already.
 

blackangst1

Lifer
Feb 23, 2005
I'm with most people on this. I'm almost ready to pull the trigger on an 8800GTS 512, and I just can't justify $100 more for what will amount to little, if any, real-world performance gain. I don't benchmark, so I couldn't care less about 50 more 3DMarks.
 

Tempered81

Diamond Member
Jan 29, 2007
Originally posted by: ginfest
Originally posted by: SteelSix
Originally posted by: jaredpace
nice poll, you can vote for all three. hahah

My first time doing a poll. I'll have to study up next time. You get the idea though.. :)

Don't worry about it, it serves the purpose :)
That Dude constantly throws stones; hopefully one of the mods will pick up on it soon :disgust:

Yeah sorry about that.
 

Phris

Member
Jul 15, 2007
I'd be looking at an upgrade from a vanilla 8800GT... the benchmarks only seem to show ~10 FPS more than it, so is it worth the $100? Probably not.

I'm not even sure if my PSU (Antec Earthwatts 500) can support it.
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: gigahertz20
Wouldn't it be great if this was just a huge, elaborate April Fools' joke by Nvidia? Tomorrow they'll be like, "haha, we got you all," and then ship the real 9800GTXs to reviewers, ones that blow everything out of the water. That would be great: all these hardware review sites put all this work into reviewing a fake card.

HAHAHAHA. That is an awesome idea. But I doubt it will happen.


Anyway, enough with the "regurgitating old tech" ripping, people. Get a freaking grip on reality!
The G80 was an AMAZINGLY fast chip... it was also a huge power hog and cost $600-900!

Even the "budget" 8800GTS 320 STARTED at $350 a whole YEAR after it came out, and that was still far too much money!
I finally upgraded my 7900GS to an 8800GTS 512MB recently, but if I hadn't, this card (the 9800GTX) would have been the one for me.
It is an overclocked and slightly optimized 8800GTS 512 (so, a few percent faster) with new hardware video decode options, tri-SLI support, and, most importantly, actual HybridPower support!
If you have a Phenom + nForce 7 system, you can drop it in knowing the card is literally powered down and drawing 0 watts unless you are running a game. This is AWESOME!
I can't wait until the Intel mobos supporting HybridPower come out.

I am honestly VERY happy that Nvidia is now refreshing products every 2-3 months or so. Before, if you didn't buy a product within a month or two of release, you were faced with paying full price for depreciated technology. Now slightly older cards actually come down in price on a monthly basis, and new features are ALWAYS available, so there is no point in "waiting until the next release".

When the Intel HybridPower mobos come out, I will buy either this or the G200 and sell my 8800GTS 512. It's an XFX, so the lifetime warranty is transferable and I can get a better price for it...

A year and a half ago: Tick: the new G80 architecture comes out; it costs a whole lot and delivers a huge performance boost.
Half a year ago: Tock: G92 comes out, costing less than half of what G80 did, with comparable performance.
Every few months since: a slightly better G92 comes out, driving older G92 prices steadily down and offering new features and optimizations (with VERY minor speed increases).
Q4 2008: Tick: G200 comes out, a completely new architecture with probably very high performance, and cost...

Why is everyone bitching about the fact that we did NOT have to go an entire year with the exact same G92 cores, with no improvements, no price decreases, etc.? I am very happy to see that the 8800GT that was selling at $300-350 now sells for $170 instead of staying at $300 until the G200 comes out in Q4 08...
Keep whining like that and Nvidia will go back to the "once a year" cycle to avoid the "bad press", and we will be stuck with higher prices and slower hardware...


PS: when the G92 refresh of the 8800 arrived, I said it was an early release of an incomplete GF9. I was right. It had a variety of features (new PureVideo features and HybridPower) disabled because they were unfinished. Now that they have finally finished it, they're releasing it as GF9; before, they were just selling it as GF8. Yes, it is damn confusing. They SHOULD have called it the GF8900 instead of the 8800WTF... or something like that. But it is really not that big of a deal, and not as malicious as you make it out to be.
 

Phris

Member
Jul 15, 2007
Hey, I still have a computer around (in working condition) with 640K of RAM. People just can't seem to understand the raw beastliness of 12MHz.
 

CorCentral

Banned
Feb 11, 2001
I own an 8800GTS/640.
Whether it's the next gen after the 9800GTX or the one after that, I'm waiting for a nice jump up in speed.
 

taltamir

Lifer
Mar 21, 2004
Originally posted by: blackangst1
Originally posted by: Scoop
Only GPUs with one PCIe connector for me. Ever.

Don't forget "640K of RAM ought to be enough for anybody" ~Bill Gates

:p

Well... that is completely different. This isn't a matter of UPGRADING; it's a matter of scaling in size. You can expect the average person to buy smaller and smaller computers as the years go by. If someone said "I am never buying a PC the size of a fridge", well....
 

bryanW1995

Lifer
May 22, 2007
Bill Gates has just been lucky his whole life. Clearly the man is deranged. Anybody who thinks 640K is large enough needs to try running Crysis on Very High at 19x12 with an 8800GTS 640. ;)
 

v8envy

Platinum Member
Sep 7, 2002
1. Bill G denies ever saying that.
2. At the time that sentiment was expressed, the average PC had 64K of RAM or less. 256K was a monstrously well-hung business machine, and a 10 megabyte hard drive cost as much as your car. If I told you how many concurrent users we had on a PDP-11 with a meg of RAM, your jaw would break from hitting the floor.
3. It was a statement about a DOS memory addressing limit. It was only in the later days of DOS that 640K started to be a problem, and hardware and software workarounds were found. Plus, the 8088 that DOS ran on at the time had only 20 pins to address memory with, so saying 640K out of a theoretical maximum of 1 meg was enough for anyone is a perfectly valid statement in context.
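
Since we're doing ancient history anyway, here's a minimal C sketch of the arithmetic behind that limit (the A000 segment is just the well-known real-mode memory map, not something from this thread):

```c
#include <stdio.h>

int main(void) {
    /* 20 address pins on the 8088 means a 2^20-byte physical address space. */
    unsigned long address_space = 1UL << 20;            /* 1,048,576 bytes = 1 MB */

    /* Real-mode DOS forms a 20-bit physical address from a 16-bit segment
       and a 16-bit offset: physical = segment * 16 + offset. */
    unsigned int segment = 0xA000, offset = 0x0000;
    unsigned long physical = ((unsigned long)segment << 4) + offset;

    printf("Addressable memory: %lu bytes (%lu KB)\n", address_space, address_space / 1024);
    printf("Conventional memory: %lu bytes (640 KB)\n", 640UL * 1024);
    printf("A000:0000 -> 0x%05lX, where video memory begins\n", physical);
    return 0;
}
```

Everything from 0xA0000 up to 0xFFFFF was reserved for video memory, adapter ROMs, and the BIOS, which is exactly why conventional memory stopped at 640K.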

In conclusion, I have no point.