7900 GTX to run at 650 MHz with only 24 pipes?


Regs

Lifer
Aug 9, 2002
16,666
21
81
At least the game Chernobyl hasn't yet been used as an excuse not to upgrade to a newer card. :D
 

Zebo

Elite Member
Jul 29, 2001
39,398
19
81
Originally posted by: Rollo
BTW- I know I said I would not post here, but several members have asked that I do. (which surprised me pretty much)

So I'm going to try it out.

:thumbsup: Good to see. I may not always agree with you, but you're always an entertaining read.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
There are a few things wrong here. Firstly, the difference is 18%, not 20% (I know it's a little), you're not taking into account the memory being 120MHz slower, and you are comparing the old unofficial Catalyst drivers to how they perform now! Even B&W2 had a patch to counter the bug in it that made it slow on all ATI cards. Also, MHz don't translate directly into performance scaling; 20% more clock can sometimes yield nothing at all.

Those are simply rough numbers to give a general example. The rumors of a 32-pipe, 700MHz part made little sense: if they released a part with that configuration, it would almost certainly destroy the XTX in nearly everything. Even in FEAR, given no other improvements save brute force, it would be neck and neck. A 650 (or 660) MHz clocked part isn't going to give ATi an easy win, nothing like it.
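As a back-of-the-envelope check on those rough numbers, theoretical pixel fillrate is just pipes × core clock; a minimal Python sketch (the 32-pipe/700MHz config is the rumor under discussion, not a real part):

```python
# Theoretical peak pixel fillrate = pixel pipes * core clock.
# Real-world scaling is worse (memory bandwidth, CPU limits, etc.),
# which is the point being made above.
def fillrate_gpix(pipes: int, core_mhz: int) -> float:
    """Peak pixel fillrate in Gpixels/sec."""
    return pipes * core_mhz / 1000.0

configs = {
    "7800 GTX 512 (24 pipes @ 550MHz)": fillrate_gpix(24, 550),
    "rumored 7900 GTX (24 pipes @ 650MHz)": fillrate_gpix(24, 650),
    "rumored 32-pipe part (32 pipes @ 700MHz)": fillrate_gpix(32, 700),
}

for name, rate in configs.items():
    print(f"{name}: {rate:.1f} Gpix/s")
```

On paper, 24 pipes at 650MHz is only ~18% over 24 pipes at 550MHz, while 32 pipes at 700MHz would be ~70% over it, which is why the 32-pipe rumor implied a blowout.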

Where did you get your numbers from? Lately the only numbers nV fans seem to be using are from TechReport.

AT- I'm not particularly a fan of nVidia; the last nV board I purchased was a GeForce DDR back in January of 2000. I've been running ATi in my primary rig for some time now, but that doesn't make me delusional.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Zebo
Originally posted by: Rollo
BTW- I know I said I would not post here, but several members have asked that I do. (which surprised me pretty much)

So I'm going to try it out.

:thumbsup: Good to see. I may not always agree with you, but you're always an entertaining read.

Thanks dude, much appreciated!
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: DeathReborn
Games like The Elder Scrolls IV: Oblivion should be benched when available, as not everyone will judge performance by FPS titles for an RPG. I hate to say it, but we really need a good MMORPG that stresses the GPU, and a flight sim or two, as benching nothing but FPS titles is a waste of a review if you ask me.

People want playable framerates in EQ2 with all the eye candy turned on; they also want B&W2 to perform well, and just because it's not an FPS it shouldn't be discounted.

I agree with this completely. The games I am interested in for determining video card performance are B&W2, Elder Scrolls Oblivion, X3: The Reunion, AOE3, Sims 2, and Neverwinter Nights 2; I don't really have much interest in FPS titles.

I guess it's logical to cater to the majority, but the groups with less people playing in them should still be represented.

To me, F.E.A.R. performance is not completely representative of anything, as the Radeon X850 XT PE is competitive with the 7800 GT & GTX in this game.
While it is true that the Nvidia TWIWMTBP program sticker is on F.E.A.R., it was an ATI GITG title up until the beta, from what I have been told, so it was basically programmed on ATI hardware for the most part, and it runs most optimally there, and it shows. Nvidia kind of stole the baby at the end of the maturation process, so to speak.


 

FPS3lusive

Junior Member
Feb 22, 2006
1
0
0
Yeah, well, you know how many hits these internet sites are getting by saying the 7900GTX has only 24 pipes... probably in the hundreds of thousands. It seems like Nvidia has had no problem in the past making cards that are faster than ATI's, no matter the cost to the consumer. Unless they had a plan to make the 7900GTX with 32 pipes from the beginning, and after seeing what ATI was going to deliver, they decided to just keep the 24 pipes instead of 32 and still beat the X1900XTX. Maybe 32 pipes would have pwned the X1900XTX too much, and they only wanted 5-10 FPS more to keep the market in their hands and keep their title, with minimal production costs? Either way, I don't see NVIDIA giving away the performance crown until the G80. Hence the 7800GTX 512, which was basically a hype launch.

Since when is Nvidia concerned with being price competitive? That's what the GT series of cards is for. Generally, haven't the top-end GT/GTX cards in the past had more pipes, not just memory/core clock differences? And how would Nvidia double floating-point performance with a -150MHz memory drop and a +105MHz core boost? "The documents from NVIDIA also indicate that GeForce 7900GTX will be 'twice as fast as previous generation chipsets' in floating-point performance" - DailyTech

I doubt Nvidia's top-end card will retail for $499. Prices generally go up, not down. Although I wouldn't be opposed if they dropped....

I have a tendency to believe Chinese sites over US ones, seeing as that's where the cards are made. In any case, I want to see the 7900GTX with 32 pipes. But I guess time will tell; in a week or so we will see some real cards.


Here is a little number breakdown based on

http://www.dailytech.com/article.aspx?newsid=915

7900GTX (Assuming 512MB)

Core Frequency 655MHz
Memory Interface 256-bit
Memory Bandwidth (GB/sec) 52
Fill Rate (Billion pixels/sec) 15.0
Vertices/Second (Millions) 1450
Pixels per clock (peak) 24

7800GTX512

550Mhz core frequency
Memory Interface 256-bit
Memory Bandwidth (GB/sec) 54.4
Fill Rate (Billion pixels/sec) 13.2
Vertices/Second (Millions) 1100
Pixels per clock (peak) 24
RAMDACs (MHz) 400


Breakdown

7900GTX vs 7800GTX 512

Vertices/Second (Millions) +31.8% (+350 million)
Memory Bandwidth (GB/sec) -4.4% (-2.4 GB/sec)
Fill Rate (Billion pixels/sec) +13.6% (+1.8 billion)
Core Frequency +105MHz (+19.1%)
Memory Frequency -150MHz
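Those deltas can be double-checked straight from the spec tables above; a quick Python sketch, using only the DailyTech figures quoted:

```python
# Percent change between the claimed 7900GTX specs and the 7800GTX 512.
def pct(new: float, old: float) -> float:
    """Percent change from old to new."""
    return (new / old - 1.0) * 100.0

specs = {
    # metric: (7900GTX, 7800GTX 512)
    "Core clock (MHz)":        (655, 550),
    "Memory bandwidth (GB/s)": (52.0, 54.4),
    "Fill rate (Gpixels/s)":   (15.0, 13.2),
    "Vertices/sec (millions)": (1450, 1100),
}

for name, (new, old) in specs.items():
    print(f"{name}: {pct(new, old):+.1f}%")
# Core clock +19.1%, bandwidth -4.4%, fill rate +13.6%, vertices +31.8%
```

Nothing in those deltas suggests a doubling of anything from clocks alone.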

Does this hold true? "The documents from NVIDIA also indicate that GeForce 7900 GTX will be 'twice as fast as previous generation chipsets' in floating-point performance" - DailyTech. Or does it seem like the 7900GTX would need 32 pipes to double floating-point performance? Or do they mean previous chipsets as in the 440MX :p

Who's gonna calculate the average frame increase?

Anyway, I am done ranting for now. Someone respond.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: Zebo
Originally posted by: Rollo
BTW- I know I said I would not post here, but several members have asked that I do. (which surprised me pretty much)

So I'm going to try it out.

:thumbsup: Good to see. I may not always agree with you, but you're always an entertaining read.

He is now gone. Which is great news to me.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: coldpower27
Originally posted by: DeathReborn
Games like The Elder Scrolls IV: Oblivion should be benched when available, as not everyone will judge performance by FPS titles for an RPG. I hate to say it, but we really need a good MMORPG that stresses the GPU, and a flight sim or two, as benching nothing but FPS titles is a waste of a review if you ask me.

People want playable framerates in EQ2 with all the eye candy turned on; they also want B&W2 to perform well, and just because it's not an FPS it shouldn't be discounted.

I agree with this completely. The games I am interested in for determining video card performance are B&W2, Elder Scrolls Oblivion, X3: The Reunion, AOE3, Sims 2, and Neverwinter Nights 2; I don't really have much interest in FPS titles.

I guess it's logical to cater to the majority, but the groups with less people playing in them should still be represented.

To me, F.E.A.R. performance is not completely representative of anything, as the Radeon X850 XT PE is competitive with the 7800 GT & GTX in this game.
While it is true that the Nvidia TWIWMTBP program sticker is on F.E.A.R., it was an ATI GITG title up until the beta, from what I have been told, so it was basically programmed on ATI hardware for the most part, and it runs most optimally there, and it shows. Nvidia kind of stole the baby at the end of the maturation process, so to speak.

Oblivion is an ATI game (R580 especially)! I can tell you that much! It really takes advantage of the shaders!

Anyways, a lot of games will take advantage of shaders because of the Xbox 360! The same could be said about the PS3, but Sony has a lot of issues, so I doubt you will see the PS3 anytime soon.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: keysplayr2003
Turtle? is that you?

Rollo is that you?

:Q


:D

personally - with Rollo now 'banned' - i think we need to eliminate the unnecessary sarcastic/aggressive comments [including yours and mine . . . of course, mine somehow seemed necessary]. :p


in a very short time we'll know about nVidia's new offerings . . . for sure.
;)
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Originally posted by: apoppin
Originally posted by: keysplayr2003
Turtle? is that you?

Rollo is that you?

:Q


:D

personally - with Rollo now 'banned' - i think we need to eliminate the unnecessary sarcastic/aggressive comments [including yours and mine . . . of course, mine somehow seemed necessary]. :p


in a very short time we'll know about nVidia's new offerings . . . for sure.
;)

Apoppin, what in the heck are you talking about? What was sarcastic/aggressive about "Turtle, is that you?"? It "looks" like it could be, doesn't it? You manage yourself, and I will manage myself.

 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: CP5670
Rollo was banned? Did I miss something?

Yes.

http://forums.anandtech.com/messageview...atid=31&threadid=1797519&enterthread=y

Mods edited the 1st post with:
Typically, we do not get involved in kiddie fights over video cards, but this issue went much further, and was far more reaching than the usual fanboyism. From this point, until further notice, Rollo is no longer a member of these forums. We observed, and at the end of the day, we decided, that in the best interests of the forums, this member had gone too far. The action has been taken, and any further threads on this topic will be immediately locked. The issue is not resolved in its entirety, as we will still be watching, as well as conducting our own investigations. Consider this thread closed, with further action to follow.

AnandTech Moderator

Now let's just carry on with talking about video cards.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Looks like it might be priced at $600, which means it will probably sell in shops for more than the XTX currently does.
I think the X1900 may just be the best choice for a little while longer.

Release date and price
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
Originally posted by: 5150Joker
Yeah my desktop is hella messy.
:roll: Yeah, right. Maybe if you would quit playing games and do something productive, it would really get messy. :p
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
omg 72C?? Wow, you're making that thing sweat!!
My X800XL will start artifacting over ~58C. I can't imagine running a chip at 72C.
 

schtuga

Member
Dec 22, 2005
106
0
0
On the same machine:

3Dmark06

1024x768 4xAA 8xAniso, SM2 only: 79 SLI vs 78 SLI is a 29% gain
1280x768 4xAA 8xAniso: 31% gain
1280x1024 4xAA 8xAniso: 49% gain

For a single card, 79 vs 78, SM2:
21xx vs 11xx.... 50% gain

BTW, the 78 is running at 600/1800.

This was on Xtreme Systems. I don't know where the scores come from. I am assuming the 78 would be a 512 for the clock to be at 600/1800.

They also say this card runs very, very warm.


Not really looking all that spectacular for the green team, especially if they attach a hefty ransom note to it. Did I also mention this card runs very warm? 15 days.
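For what it's worth, a percentage gain from raw 3DMark scores is just (new/old - 1) × 100; a small sketch, bracketing the range implied by the truncated "21xx vs 11xx" scores in the post (the exact last digits are unknown, so these bounds are only illustrative):

```python
def gain_pct(new_score: float, old_score: float) -> float:
    """Percent improvement of new_score over old_score."""
    return (new_score / old_score - 1.0) * 100.0

# The post only gives truncated scores ("21xx vs 11xx"), so bracket the range:
low = gain_pct(2100, 1199)    # worst case for the 79
high = gain_pct(2199, 1100)   # best case for the 79
print(f"single-card SM2 gain: somewhere between {low:.0f}% and {high:.0f}%")
# Even the worst case lands well above the quoted 50%.
```

So if the truncated scores are real, the quoted 50% figure undersells the single-card gap.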
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: nib95

Wow, nice man, congrats!!

You did more than just beat it, you went 200+ over it!
I wonder how well my Sapphire X1900 XTX will OC on air.

I shouldn't look at your scores, because without watercooling I'll never even get close!


Thanks, nib. :) If you lap the cooler (there's a big thread on how to do it at the HardOCP forums) you should be able to hit close to the same clocks. The XTX is a monster overclocker.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Originally posted by: tuteja1986
Originally posted by: coldpower27
Originally posted by: DeathReborn
Games like The Elder Scrolls IV: Oblivion should be benched when available, as not everyone will judge performance by FPS titles for an RPG. I hate to say it, but we really need a good MMORPG that stresses the GPU, and a flight sim or two, as benching nothing but FPS titles is a waste of a review if you ask me.

People want playable framerates in EQ2 with all the eye candy turned on; they also want B&W2 to perform well, and just because it's not an FPS it shouldn't be discounted.

I agree with this completely. The games I am interested in for determining video card performance are B&W2, Elder Scrolls Oblivion, X3: The Reunion, AOE3, Sims 2, and Neverwinter Nights 2; I don't really have much interest in FPS titles.

I guess it's logical to cater to the majority, but the groups with less people playing in them should still be represented.

To me, F.E.A.R. performance is not completely representative of anything, as the Radeon X850 XT PE is competitive with the 7800 GT & GTX in this game.
While it is true that the Nvidia TWIWMTBP program sticker is on F.E.A.R., it was an ATI GITG title up until the beta, from what I have been told, so it was basically programmed on ATI hardware for the most part, and it runs most optimally there, and it shows. Nvidia kind of stole the baby at the end of the maturation process, so to speak.

Oblivion is an ATI game (R580 especially)! I can tell you that much! It really takes advantage of the shaders!

Anyways, a lot of games will take advantage of shaders because of the Xbox 360! The same could be said about the PS3, but Sony has a lot of issues, so I doubt you will see the PS3 anytime soon.

Correction needed: Oblivion WAS an ATI Preferred Game, but when they removed a lot of the details it lost the need for more pixel-crunching power. It'll still be pixel-hungry, but nowhere near F.E.A.R.'s standards.

As for the PS3, the components are fine, but the Blu-ray drive at present is driving the cost of manufacture to $800+. The Cell CPU is getting good yields, and the RSX costs roughly $70. The delay is mostly down to costs and partly down to a possible 2-3 million consoles-a-year limit at present.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: schtuga
On the same machine:

3DMark06

1024x768 4xAA 8xAniso, SM2 only: 79 SLI vs 78 SLI is a 29% gain
1280x768 4xAA 8xAniso: 31% gain
1280x1024 4xAA 8xAniso: 49% gain

For a single card, 79 vs 78, SM2:
21xx vs 11xx.... 50% gain

BTW, the 78 is running at 600/1800.

This was on Xtreme Systems. I don't know where the scores come from. I am assuming the 78 would be a 512 for the clock to be at 600/1800.

They also say this card runs very, very warm.

Not really looking all that spectacular for the green team, especially if they attach a hefty ransom note to it. Did I also mention this card runs very warm? 15 days.


Yeah, saw some random guy on the forums post that. Sounds like complete BS to me.
 

schtuga

Member
Dec 22, 2005
106
0
0
Seeing as it's to be released in two weeks, is it not possible there are people that have these cards? Besides, they were only SM2 tests; that's why I said it's not that impressive.
And it is unlikely that a 50MHz increase would make that much difference, unless it wasn't merely a die shrink. Or is it also not possible that they left it at 24 pipes but did some tweaking?

24 pipes is automatically believable because that's what you want to hear, but as soon as someone says there are great improvements within those 24 pipes, then it obviously must be BS.


I am now thinking they were using a 7800GTX 256 with the snot OC'd out of it; this would explain some of the difference. It wasn't stated, and from the clocks I assumed it would have to be the 512. My bad.