EVGA 8800 ultra for $575 AR!

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Link

I'd say this is a pretty sweet deal for a card that used to cost $799 and above.

A GTX-ish price for an Ultra is definitely hot.

Plus you know it uses the A3 core, so it's sure to be OCable (700 MHz has been achieved by quite a few with the new revision).
 

NoStateofMind

Diamond Member
Oct 14, 2005
9,711
6
76
It was never worth the $799 to begin with, and most people don't like rebates (myself included). Is it a hot deal? IMO no, but then again, that's just my opinion.
 

wanderer27

Platinum Member
Aug 6, 2005
2,173
15
81
That's a bit tempting, but I'm waiting on the G92s in hopes of somewhat smaller/cooler cards.

 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: thatolchestnut
How has the Ultra already fallen to $570 while GTXs are by and large still $500+?
/boggle

The Ultra, relative to where the regular GTX sits performance-wise, is finally priced where it belongs. It's probably worth $70.00 more than the GTX, not $300 more, that's for sure.
 

lupi

Lifer
Apr 8, 2001
32,539
260
126
Originally posted by: thatolchestnut
How has the Ultra already fallen to $570 while GTXs are by and large still $500+?
/boggle

The top-end GTXs are (from reports I've read on the forums) better than the stock Ultra.
 

chewietobbacca

Senior member
Jun 10, 2007
291
0
0
If it's the overclocked version that might be true, but you can overclock the Ultra even more easily than the GTX. Also, the Ultra's memory is different from the GTX's, and it has the revision A3 core, so it's definitely plenty fast.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
The length of time it takes to receive a rebate check isn't what I worry about; it's the fact that I might NEVER see the check!

Though the last 4 rebates I've submitted have gone through, and they were hefty ones.
 

Fallen Kell

Diamond Member
Oct 9, 1999
6,189
524
126
Originally posted by: thatolchestnut
How has the Ultra already fallen to $570 while GTXs are by and large still $500+?
/boggle

Because the Ultras are nothing more than binned GTX chips...
 

chewietobbacca

Senior member
Jun 10, 2007
291
0
0
Originally posted by: Fallen Kell
Originally posted by: thatolchestnut
How has the Ultra already fallen to $570 while GTXs are by and large still $500+?
/boggle

Because the Ultras are nothing more than binned GTX chips...

Again, that's not true at all. The core is the newer revision, so it runs cooler and draws less power than the GTX. It also has different memory IIRC, and its memory comes stock clocked at 2160 MHz effective, which is far beyond anything the GTX can OC to. Also, someone said that if the GTX is at 384 GFLOPs, the Ultra is at something like 510 GFLOPs, so there's definitely a difference.

The $800 price of the Ultra was definitely ridiculous, but at $570 it's probably a better buy now than GTXs at $500+.
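
For a rough sense of that memory gap, here's a quick back-of-the-envelope in Python (a sketch only; the 384-bit bus and the 1800 MHz / 2160 MHz effective memory clocks are my assumptions from the usual spec sheets, not measurements):

# Theoretical memory bandwidth = effective memory clock x bus width.
# The 384-bit bus and the 1800 MHz (GTX) / 2160 MHz (Ultra) effective
# clocks are assumed spec-sheet values, not measured here.

def mem_bandwidth_gb_s(effective_mhz, bus_width_bits=384):
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(mem_bandwidth_gb_s(1800))  # stock 8800 GTX   -> ~86.4 GB/s
print(mem_bandwidth_gb_s(2160))  # stock 8800 Ultra -> ~103.7 GB/s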
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: chewietobbacca
Originally posted by: Fallen Kell
Originally posted by: thatolchestnut
How has the Ultra already fallen to $570 while GTXs are by and large still $500+?
/boggle

Because the Ultras are nothing more than binned GTX chips...

Again, that's not true at all. The core is the newer revision, so it runs cooler and draws less power than the GTX. It also has different memory IIRC, and its memory comes stock clocked at 2160 MHz effective, which is far beyond anything the GTX can OC to. Also, someone said that if the GTX is at 384 GFLOPs, the Ultra is at something like 510 GFLOPs, so there's definitely a difference.

The $800 price of the Ultra was definitely ridiculous, but at $570 it's probably a better buy now than GTXs at $500+.

Pretty sure the theoretical GFLOPs figure comes down to the 'clock speed' of the chip itself. Thus, if a GTX is clocked at the same speeds as the Ultra (shader clock included), they will both put out the same amount of 'GFLOPs'. The entire thing is theoretical though, so those numbers don't mean squat :D
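
To put numbers on that, a minimal sketch (the 128-SP count, the stock 1.35 / 1.512 GHz shader clocks and the 2-flops-per-clock MADD figure are assumptions on my part; counting the extra MUL would give 3 flops per clock, and the 384/510 figures quoted above likely use different assumptions):

# Theoretical GFLOPs = shader count x shader clock x flops per clock,
# so the figure scales purely with clock speed for a fixed SP count.

def theoretical_gflops(sp_count, shader_clock_ghz, flops_per_clock=2):
    return sp_count * shader_clock_ghz * flops_per_clock

print(theoretical_gflops(128, 1.35))   # stock 8800 GTX shaders   -> 345.6
print(theoretical_gflops(128, 1.512))  # stock 8800 Ultra shaders -> ~387.1
print(theoretical_gflops(128, 1.512))  # a GTX OC'd to Ultra shader clocks: identical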

As others have said, there are some OC'd GTXs out there (water cooled, IIRC) that are clocked higher than the 'stock' Ultra in core, shader AND memory speed. Thus, they will outperform the Ultra.

I will say this though: the Ultra at $575 is a better deal than 8800 GTS 640 SLI, as it performs right about on par with it AND is a single-card solution AND is priced less than two 8800 GTS 640s...
 

lupi

Lifer
Apr 8, 2001
32,539
260
126
The 3DMark06 scores are as follows: the default 8800 GTX scored 12,515, EVGA's ACS3 board scored 13,115, while the latest baby from Nvidia scored 13,191, with the main performance difference between the eVGA and the Nvidia card being the clock of the 128 scalar shaders (even though the eVGA has a 14 MHz core clock advantage, nV has the higher shader clock: 1.456 GHz vs. 1.500 GHz).

Broken down by segment, the SM2.0 test was 5,131 for the 8800 GTX, 5,354 for EVGA's ACS3 and 5,438 for the 8800 Ultra. The SM3.0 test yielded 5,418 for both the default 8800 GTX and the 8800 Ultra, while the ACS3 scored 5,431, taking the lead here. The CPU score was practically equal, with the 8800 GTX and Ultra sharing the very same score (4,556) and the ACS3 coming in slightly behind (4,546).

Where it gets interesting is that the ACS3 board has a higher fill rate than the 8800 Ultra: the ACS3 churns out 7,541.83 MTexel/s in single- and 19,499.91 MTexel/s in multi-texturing mode. The 8800 Ultra cannot count on its higher shader clock here, so that board churned out 7,370.04 MTexel/s in single- and 19,085.74 MTexel/s in multi-texturing mode.

Things also got heated in the simple and complex Vertex Shader operations, where the 8800 GTX scored 107.29 fps, the ACS3 scored 118.12 fps and the 8800 Ultra trailed with 115.66 fps.

The 8800 Ultra's muscle showed in the Pixel Shader test with a lead of roughly 11 fps over the ACS3 (518.54 vs. 507.70 fps; the regular 8800 GTX scored 478.25 fps), and Shader Particles showed who's boss (the Ultra scored 180.90 fps, compared to 166.67 fps for the ACS3 and 161.19 fps for the default 8800 GTX).
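
That fill-rate gap lines up with the core clocks, since the theoretical texel rate is basically texture address units times core clock. A rough sketch (the 32 texture address units and the ~612 MHz Ultra / ~626 MHz ACS3 core clocks are assumed spec-sheet values, not taken from this test):

# Theoretical texel fill rate = texture address units x core clock (MHz)
# -> MTexel/s. Real multi-texturing results land a bit below this ceiling.

def fill_rate_mtexel_s(tex_units, core_clock_mhz):
    return tex_units * core_clock_mhz

print(fill_rate_mtexel_s(32, 612))  # 8800 Ultra    -> 19,584 MTexel/s
print(fill_rate_mtexel_s(32, 626))  # EVGA GTX ACS3 -> 20,032 MTexel/s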

 

CP5670

Diamond Member
Jun 24, 2004
5,660
762
126
The MSI Ultra someone posted here recently was a slightly better deal, with its 660 MHz stock core speed and a free copy of CoH.
 

lopri

Elite Member
Jul 27, 2002
13,310
687
126
I don't think the A3 revision (Ultra) showed any power consumption advantage over the GTX. My guess is that at a very high clock speed (650 MHz+) it might consume less than a GTX, but at frequencies lower than that, many tests showed it actually consumes more power than a regular GTX. I guess that's because of the tweak to the clock generator unit to accommodate the faster memory, and I also suspect the memory controller was refined for 1.5 GB of memory just in case (vs. the HD 2900 XT 1GB). Along the same lines, I wonder if the Quadros are using A3 silicon? Are there any reviews of a Quadro based on G80?
 
Jan 9, 2007
180
0
71
It still was never worth $799. By now the GTX should be at $400, the 640MB GTS at $300 and the 320MB at $200, but since they have little competition at their respective price points, they just keep charging top dollar. :( The 2900 XT could be called competition, but it is too expensive for the performance you get (AA and image quality issues). That is merely my opinion, of course (except for the Ultra costing way too much at release; I think that is pretty much a fact, considering it was scorned by most forum goers and review sites alike).