7900 cards are scarce because they're outselling their ATI competitor by a ratio of 4:1


5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Wreckage
Even if actual sales numbers came out supporting the article by the Tech Report, there are a few here who clearly have their own agenda.

Yeah like your own admitted agenda to be AT's biggest troll just to "piss off the ATi fans". I'm surprised the AT mods haven't tossed you out yet.

Most people can see past a few FPS here and there and pick a reliable well supported card that is better overall.

I would never buy a card based on one game or any fps for that matter. Quality, cost, reliability, support, etc. all come into play. Heck, I bet the number is higher than 4 to 1.


LMAO@above.."few fps here and there..oh and uh..the HDR+AA..oh and the HQ AF..oops".
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: 5150Joker
Yeah like your own admitted agenda to be AT's biggest troll just to "piss off the ATi fans". I'm surprised the AT mods haven't tossed you out yet.

Posting my opinion (backed by facts and links when needed) is what a forum is all about.

It's not like I go around trying to sell a company's product by posting ads in my sig.

If the mods have a problem with me, I am more than willing to discuss it with them. However, you are the last person on this forum to cry foul about how someone is posting.

My post was on topic and supported by a reputable website that I'm sure did its research. Clearly the 7900s are selling well; why this hurts you so personally is beyond me.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: deadseasquirrel
Originally posted by: Cookie Monster
(We all know a 7800GTX can kill a 6800GT SLI setup most of the time).

Which 7800GTX-- the 256 or 512? And define "most of the time"... because Anandtech's old benches don't support that:

BF2 1600x1200 4xAA
7800GTX......66.7
6800U SLI....67.4

Doom3 1600x1200 4xAA
7800GTX......54.2
6800U SLI....75.4

HL2 1600x1200 4xAA
7800GTX......119
6800U SLI....118.4

Splinter Cell: CT 1600x1200 4xAA
7800GTX......56.5
6800U SLI....53.7

I see a flaw in your benchmarks. Cookie Monster said 6800GT SLi, yet you have all Ultras?
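Either way, here is a quick, purely illustrative sketch of the percentage gaps, using only the AnandTech figures quoted above (nothing else is assumed):

```python
# Percentage gap of a single 7800GTX vs a 6800 Ultra SLI pair,
# using only the AnandTech figures quoted above (1600x1200 4xAA).
benches = {
    "BF2":               (66.7, 67.4),
    "Doom3":             (54.2, 75.4),
    "HL2":               (119.0, 118.4),
    "Splinter Cell: CT": (56.5, 53.7),
}

for game, (gtx, sli) in benches.items():
    delta = (gtx - sli) / sli * 100  # positive = single GTX ahead of the SLI pair
    print(f"{game:18s} 7800GTX {gtx:6.1f}  6800U SLI {sli:6.1f}  ({delta:+5.1f}%)")
```

That works out to roughly -1% (BF2), -28% (Doom3), +1% (HL2) and +5% (Splinter Cell) for the single GTX against the SLI pair, which is why these particular numbers don't read as the GTX "killing" the dual-card setup most of the time.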
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Cookie Monster
It's either demand or yield problems. But looking at this article, NV produces WAY more GPU dies per wafer compared to ATI, and in fact each GPU die is cheaper to produce due to cutting unwanted transistors and a smaller GPU die to boot. I don't think yields are a problem. Simply demand for the card.

That would be the common sense answer that most of us know and recognize.
Why others on this board refuse to ACKnowledge that.. I wouldn't know ;) :D
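For a rough sense of scale on the dies-per-wafer point, here is a back-of-the-envelope sketch. The die areas are my own assumptions (roughly 196mm² for G71 and 352mm² for R580, on a 300mm wafer), and it ignores yield, applying only the standard edge-loss approximation:

```python
import math

# Back-of-the-envelope dies-per-wafer estimate on a 300mm wafer.
# Die areas below are assumptions (~196mm^2 for G71, ~352mm^2 for R580);
# yield losses are ignored, only the usual edge-loss correction is applied.
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float, d: float = WAFER_DIAMETER_MM) -> int:
    gross = math.pi * (d / 2) ** 2 / die_area_mm2          # raw area ratio
    edge_loss = math.pi * d / math.sqrt(2 * die_area_mm2)  # dies lost at the wafer edge
    return int(gross - edge_loss)

print("G71  (~196mm^2):", dies_per_wafer(196))  # roughly 310 candidate dies
print("R580 (~352mm^2):", dies_per_wafer(352))  # roughly 165 candidate dies
```

By that rough math NV would get nearly twice as many candidate dies per wafer before yield even enters the picture, which is the direction of the claim, though the exact ratio depends on the real die sizes and defect densities.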

Originally posted by: akugami
Wait, I have to show you proof that you are talking out of your rear end after I ask you for proof that your words are true?!? Let me break it down in simple terms for you Crusader because even a 10 year old can understand what I'm saying.

1) I say your claims are not scientifically accurate and thus can't be taken as any sort of true measure of industry sales.

2) I say that there are not enough sales data available to the general public to support your claims that the Steam survey is a good measurement of actual sales.

3) If 2 is true, that means I can't prove you're wrong because such data is not available; but by the same measure, since such data is not available, your claims that the Steam survey is an accurate representation of sales are also unprovable, thus falsifying your statements.

4) It is then up to you to prove me wrong. By the very nature of my statements, I can't prove I'm right. But by that very same measure, I'm also saying that you can't prove your words are true because such data does not exist in the public sector. Thus making your words false.

If you can prove you're right, do so. Until you do, it's just another case of you talking out of your rear. And I won't comment again in this thread about this subject until you can provide proof. Which I am 99.99% positive you can't.

LOL, calm down.
Thing is, if you go back and look at GF6 sales and X800 sales, they correlate with the Valve statistics of the time.
There was info released on the GF6 sales by Nvidia, and I was not the one to uncover it; it was another member here. But I do remember reading it.

Let's just ask some very basic questions that are a matter of your opinion:
Do you really think the X800 series outsold the GF6? :confused:
Do you think the X1800 outsold the GF7800s??? :confused:
Do you think the X1900 is outselling the GF7900?? :confused:
Let's get real here and quit trying to argue in circles. I don't have to go dig up that thread to show your monkeyass things that have been clear and obvious to anyone watching this market.

Seriously, answer with your thoughts on this. If you answer yes to the first question, you might as well drop your tirade of covering your ears and burying your head in the sand screaming 'AHHH THERE'S NO PROOF NV IS OUTSELLING ATI!'
You have an incredibly strong intuition they are outselling ATI from all the news you have been hearing and from the popularity of the Geforces since the GF6.

It's pretty damn clear and obvious to anyone besides those who wish to remain ignorant of the GF6-7 popularity. But whatever, you are probably more interested in merely arguing over semantics and small details to "win" an argument on a forum.

I'm just telling you like it is, and anyone with open eyes can see which card lineup has been more popular since the Geforce6 release.
If you just want to argue for no reason and not admit this is more than likely the case.. then ignorance is bliss, I guess.

I'll do some digging on that thread; you think about it in the meantime.
 

Shamrock

Golden Member
Oct 11, 1999
1,441
567
136
Originally posted by: 5150Joker
Originally posted by: Wreckage
Even if actual sales numbers came out supporting the article by the Tech Report, there are a few here who clearly have their own agenda.

Yeah like your own admitted agenda to be AT's biggest troll just to "piss off the ATi fans". I'm surprised the AT mods haven't tossed you out yet.

Most people can see past a few FPS here and there and pick a reliable well supported card that is better overall.

I would never buy a card based on one game or any fps for that matter. Quality, cost, reliability, support, etc. all come into play. Heck, I bet the number is higher than 4 to 1.


LMAO@above.."few fps here and there..oh and uh..the HDR+AA..oh and the HQ AF..oops".

Nor you, Mr. roaming advertisement. Anand should charge you rent for the advertisements.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Originally posted by: Shamrock
Originally posted by: deadseasquirrel
Originally posted by: Cookie Monster
(We all know a 7800GTX can kill a 6800GT SLI setup most of the time).

Which 7800GTX-- the 256 or 512? And define "most of the time"... because Anandtech's old benches don't support that:

BF2 1600x1200 4xAA
7800GTX......66.7
6800U SLI....67.4

Doom3 1600x1200 4xAA
7800GTX......54.2
6800U SLI....75.4

HL2 1600x1200 4xAA
7800GTX......119
6800U SLI....118.4

Splinter Cell: CT 1600x1200 4xAA
7800GTX......56.5
6800U SLI....53.7

I see a flaw in your benchmarks. Cookie Monster said 6800GT SLi, yet you have all Ultras?

Well.. pretty much the same thing, 6800U or GT SLi. But the 6800GT SLi was more popular, so I used that as an example.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Actually akugami.. I'd like to place money on the sales numbers from ATI and NV.. if they are ever released or if I can dig up some, then we can do a little monetary exchange. If interested, PM me and we can work something out.

As far as my 3 questions in my above post, I'll put money down on any of those 3 questions that I am right about NV outselling ATI within those criteria.

That's how sure I am of my claims.

Are you? ;)
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I think ATI is selling a lot of high end cards. Oblivion was a major, huge win. This is a slow time of year for sales, but by Christmas ATI's lead will be apparent. Or Nvidia will catch up. With all this, one must remember a 5% shift is huge in this market.
 

pacho108

Senior member
Jul 14, 2005
217
0
0
OMG!!!
I swear that if you guys met each other in person it would end up with a lot of punching and kicking.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: Dman877
An overclocked 7900GT at 550 core is about equal to the X1800XT at stock in games like Oblivion, Fear, CoD2, etc. The GT pwns the ATI in D3/Q4 and flight sims. Still, you can OC the X1800 too. Whoever said the GT is way faster than an X1800 needs to check some reviews.

7900GT vs X1800XT 512

That's true, but for the die hard graphics/video whores out there, a volt mod will easily see that 7900 GT pass 7900 GTX levels.
 

nib95

Senior member
Jan 31, 2006
997
0
0
Originally posted by: 5150Joker
Does anyone here actually buy a lot of pancakes/hotcakes? I don't, so I'm wondering why that term is so frequently used.


That's a good point, but it's unfortunately one of those catchphrases that's lasted through the ages.
It's more of a nostalgic nod to when it was originally coined than a reflection of how much relevance it has these days.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: Shamrock
Nor you Mr roaming advertisement. Anand should charge you rent for the advertisements .

What is better?

1. Someone showing a list of good deals for video cards
2. Someone showing Shamrock's Pot O' Gold?

Stupidity 4TW!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: ST
Originally posted by: munky
How exactly is slower performance and worse IQ better?


so why again aren't you running HDR+AA for Oblivion on your X1900? o_O

"...was playing at 1280x960 with maxed out settings (HDR but no AA) and the frames would sometimes drop into the 20's outside..."

Wasn't that before the Chuck patch? Right now I am playing at 1280x960 with AAA and HDR.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: ST
Originally posted by: CaiNaM
well, no munky is certainly not me, lol... and frankly rather illogical for you to think so.

still, you cannot post comparatively, as your hardware is not capable of running HDR and AA at the same time.

also, here's a review of XFX's XXX version (560Mhz/1.32GHz) which shows the GT needs to run at a substantially higher clock than reference just to keep up with even an x1800xt (mine is clocked a bit higher, but i didn't need to void my warranty to do so, nor did i need a conductive pen!), let alone an x1900xt.

the 7900GT is certainly a nice card and it has its share of strengths, but the ati cards certainly have their strengths as well (strong performance, lower price vs nv, and more features). i just found it amusing you used one of the nvidia's weaknesses (oblivion performance, lack of ability to run both HDR & AA) to try and put down the ati card.

illogical to think it was him when you answered my question directed at munky specifically? o_O

anyhow, i'm not like most ati fanbois trying to incite any bs on other branded cards, i just wanted to understand why he, munky, was playing at such a low res without HDR + AA when he has an x1800/x1900? it's kinda funny how defensive y'all get even with a simple little query.

To answer that question, it seems like going from 1024 to 1280 did not change my performance, so while I originally thought my card could only handle it at 1024, it turns out 1280 is just as playable as 1024. And no, I'm not using any other alias on this board.
 

Dman877

Platinum Member
Jan 15, 2004
2,707
0
0
Originally posted by: nib95
Originally posted by: Dman877
An overclocked 7900GT at 550 core is about equal to the X1800XT at stock in games like Oblivion, Fear, CoD2, etc. The GT pwns the ATI in D3/Q4 and flight sims. Still, you can OC the X1800 too. Whoever said the GT is way faster than an X1800 needs to check some reviews.

7900GT vs X1800XT 512

That's true, but for the die hard graphics/video whores out there, a volt mod will easily see that 7900 GT pass 7900 GTX levels.

Ok so at stock clocks, X1800XT > 7900GT (Both in price and performance atm)

Overclocked (X1800XT@700, 7900GT@550), the performance gap is probably a bit smaller since the GT oc's a lot more as a percentage of its stock clocks, but the X1800XT will still win most of the benchmarks, especially CoD2, Fear, Oblivion.

Volt modded and oc'd to the extreme (X1800's have voltage controls, do they not?), both cards need aftermarket cooling unless you like listening to a dust buster, so the price battle remains unchanged. The performance at this stage would probably be pretty close across the board, with the 7900GT winning the typical NV benches and the X1800XT equalling or slightly besting the NV card in typical ATI strengths.

Everything I just said is just my feeling from the benches I've seen. Feel free to dispute any of it. With that said, it seems both cards offer pretty good value, but since right now the ATI card is cheaper, available, and capable of AA+HDR (is this gonna be like the 6800's vs X800's and SM 3.0 argument?), I have to give the nod to the ATI card.

I like the fact that the 7900GT is a cool running, low power card and frankly, I'd be happy with either of them right now. However, if it was my money, I'd grab a X1800XT for 260 off newegg instead of the cheapest 7900GT at 300.
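To put the "oc's a lot more as a percentage of its stock clocks" point into numbers, here is a small sketch. The stock clocks are my assumptions (roughly 450MHz for the 7900GT and 625MHz for the X1800XT); the 550MHz and 700MHz targets are the ones from the posts above:

```python
# Overclock headroom as a percentage of (assumed) stock core clocks.
# Stock clocks are assumptions: 7900GT ~450MHz, X1800XT ~625MHz.
# The 550MHz / 700MHz targets come from the discussion above.
cards = {
    "7900GT":  {"stock_mhz": 450, "oc_mhz": 550},
    "X1800XT": {"stock_mhz": 625, "oc_mhz": 700},
}

for name, c in cards.items():
    headroom = (c["oc_mhz"] - c["stock_mhz"]) / c["stock_mhz"] * 100
    print(f"{name:8s} {c['stock_mhz']}MHz -> {c['oc_mhz']}MHz  (+{headroom:.0f}%)")
```

Roughly +22% for the GT against +12% for the XT under those assumptions, which is the sense in which the GT gains more relative to stock even though the XT still ends up at the higher absolute clock.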
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: munky
To answer that question, it seems like going from 1024 to 1280 did not change my performance, so while I originally thought my card could only handle it at 1024, it turns out 1280 is just as playable as 1024.

That's one of the odd things I've noticed about Oblivion. I'm using an XTX and I don't really notice any performance difference going from 1280x960 to 1600x1200, using HDR + 2xAA, 16xHQAF, everything on/maxed. Benchmarks seem to back that up too (1280x1024: 32fps; 1600x1200: 27fps). So I said screw it and went up to 16x12. It chugs in some areas outdoors, but I prefer the higher resolution by far. I might tweak the grass a bit to improve perf. To me, less grass >>>> lower res.
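As a side note, the pixel arithmetic fits that impression. A tiny sketch using only the 32fps/27fps figures quoted above (everything else is simple arithmetic):

```python
# Pixel count vs framerate scaling for the two resolutions mentioned above.
# fps figures are the ones quoted in the post (1280x1024 ~32fps, 1600x1200 ~27fps).
res_lo, fps_lo = (1280, 1024), 32
res_hi, fps_hi = (1600, 1200), 27

pixel_ratio = (res_hi[0] * res_hi[1]) / (res_lo[0] * res_lo[1])
frame_time_ratio = fps_lo / fps_hi  # how much longer each frame takes at the higher res

print(f"Pixels: x{pixel_ratio:.2f}   frame time: x{frame_time_ratio:.2f}")
```

About 46% more pixels for roughly a 19% longer frame time, which is consistent with Oblivion not being purely fill-rate limited at those settings.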
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: Dman877

Volt modded and oc'd to the extreme (X1800's have voltage controls, do they not?), both cards need aftermarket cooling unless you like listening to a dust buster, so the price battle remains unchanged. The performance at this stage would probably be pretty close across the board, with the 7900GT winning the typical NV benches and the X1800XT equalling or slightly besting the NV card in typical ATI strengths.

While I don't dispute your stock or slightly overclocked assessment, I think you are underestimating what a volt modded, oc'd 7900GT can do. On average the mod will yield 7900GTX speeds (650MHz GPU), with the upper spectrum seeing a little beyond this performance in the 700MHz arena (http://desertshooter.nl/12103.JPG), and the insanely extreme overclocks will just kill it (http://www.vr-zone.com/index.php?i=3437&s=3). Now I do not suspect even 1% of the population will duplicate this as VR-Zone does, but that's what's unique about the 7900GT....it CAN overclock quite well, mainly limited by your cooling capacity. The versatility can extend the other way as well, with passively cooled 7900GTs on the horizon for silence/HTPC freaks!

 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: Dman877
I like the fact that the 7900GT is a cool running, low power card and frankly, I'd be happy with either of them right now. However, if it was my money, I'd grab a X1800XT for 260 off newegg instead of the cheapest 7900GT at 300.
but if you're gonna mod it and crank up the voltage and clockspeeds, the gt is not going to be a "cool running, low power card". it would still likely use a little less (both are 90nm, but the x1k has more transistors), but no longer significantly less.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: CaiNaM
Originally posted by: Dman877
I like the fact that the 7900GT is a cool running, low power card and frankly, I'd be happy with either of them right now. However, if it was my money, I'd grab a X1800XT for 260 off newegg instead of the cheapest 7900GT at 300.
but if you're gonna mod it and crank up the voltage and clockspeeds, the gt is not going to be a "cool running, low power card". it would still likely use a little less (both are 90nm, but the x1k has more transistors), but no longer significantly less.

Says who? Again you're pulling more BS from your arse:

Power Consumption/Heat Output/Chart

Geforce 7900GT @ Default voltage/Speed 64 watts
Geforce 7900GT @ 745MHz 1.5v 87 watts
Geforce 7900GT @ 780MHz 1.6v 101 watts
Geforce 7900GT @ 800MHz 1.7v 111 watts

How much power does the X1Ks draw again? o_O


 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
This whole modding thing has taken a predictable turn. Last gen a $160 x800gto could be unlocked and OC'd to x850xt pe levels, and with a volt mod it would reach over 600mhz, which would be equal to or faster than a $300 7800gt. Yet I didn't see masses of enthusiasts claiming it was as good as the 7800gt - in fact I remember the 7800gt being touted as the best card for the money, when in fact it wasn't from a pure performance perspective. And of course the issue of SM3 and HDR vs a primitive r300 rehash was being debated over and over. Now, however, hardly anyone seems to care that the 7900 series is mostly a nv40 rehash from 2 years ago, and volt modding is all of a sudden a common and easy thing to do. How the tables have turned...
 

CaiNaM

Diamond Member
Oct 26, 2000
3,718
0
0
Originally posted by: ST
Originally posted by: CaiNaM
Originally posted by: Dman877
I like the fact that the 7900GT is a cool running, low power card and frankly, I'd be happy with either of them right now. However, if it was my money, I'd grab a X1800XT for 260 off newegg instead of the cheapest 7900GT at 300.
but if you're gonna mod it and crank up the voltage and clockspeeds, the gt is not going to be a "cool running, low power card". it would still likely use a little less (both are 90nm, but the x1k has more transistors), but no longer significantly less.

Says who? Again you're pulling more BS from your arse:

Power Consumption/Heat Output/Chart

Geforce 7900GT @ Default voltage/Speed 64 watts
Geforce 7900GT @ 745MHz 1.5v 87 watts
Geforce 7900GT @ 780MHz 1.6v 101 watts
Geforce 7900GT @ 800MHz 1.7v 111 watts

How much power does the X1Ks draw again? o_O

about 110w under full load.

the gtx runs about 90w. even assuming the GT with high clocks and a volt mod draws no more power than a stock GTX, it goes from a 55w consumption advantage to about a 20w consumption advantage (a 40% increase in the GT's draw). big difference.

so how was i wrong?
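Spelling that arithmetic out as a sketch, using only the wattage figures already cited in this exchange (~64w stock 7900GT, ~90w 7900GTX, ~110w X1K under full load):

```python
# Power-gap arithmetic using only the figures cited in this exchange:
# stock 7900GT ~64w, 7900GTX ~90w, X1K ~110w under full load.
x1k_w, gtx_w, gt_stock_w = 110, 90, 64

print(f"Stock GT advantage over the X1K:      {x1k_w - gt_stock_w}w")
print(f"GTX-level GT advantage over the X1K:  {x1k_w - gtx_w}w")
print(f"GT draw increase to reach GTX level:  {(gtx_w - gt_stock_w) / gt_stock_w * 100:.0f}%")
```

By these figures the stock gap of roughly 46w shrinks to roughly 20w once the GT is pushed toward GTX-class draw, with the GT's own consumption rising by about 41%; ST's measured 87-111w numbers for heavily volt-modded cards span from a bit under that GTX level to slightly above the X1K itself.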
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Crusader, I don't get worked up in a forum. I do get irritated by stupidity, but I never ever go frothing at the mouth when patiently trying to explain to the stupid something a 10 year old should understand.

I'm not the one trying to prove a point with opinion. I asked you for facts. I broke it down so that even a 10 year old should understand it. So again, what you state is all opinion backed up by zero facts. Seriously, back it up. I'm not saying that I believe the ATI cards outsell the nVidia cards overall. However, nowhere is there data to say conclusively one way or the other. That has always been my argument. If you have proof then show it.

I don't know if the 6x00 series outsold the X800's. From all accounts both sold nearly equally well. If you have proof showing otherwise then produce it. The X1800 can't have outsold the 7800's due to the 7800's being out on the market for 6+ months longer than it. I think the 7900's have outsold the X1900's but again, this is just conjecture. I do not have facts to back this up and neither do you. If you have proof then show it. If not, all it is is just opinion and not fact. (Is anyone seeing a pattern here from the bold print?) I will never make claims of what card outsold what card because to my knowledge no such data exists in the public sector.

Again, why do I have to produce proof that would show you wrong when such proof does not exist in the public sector? I am saying that no such proof exists in the public sector that will prove me right, but it is that very same proof that would prove you wrong. If it does not exist publicly, how do I produce it? If it does exist publicly, which you insist it does, then produce it and prove me wrong. Is that too hard to understand? I'm not sure how I can make what I'm saying clearer than that. You're the one who is arguing in circles. Done with this particular argument unless you can come up with something other than evasion. Much like all your other arguments, it's full of hot air and you can't prove jack.

EDIT: Fixed some formatting and added the below.

Someone like Ronin, who works in the games industry, doesn't even have exact sales numbers. I doubt someone like you would have them. I've searched a good while on and off over the net, and the only thing I have come up with is a vague percentage breakdown; even then it's very vague because they don't break down the numbers between low end and high end. But if you want to bet, meet me over a gambling table. I don't bet over the internet because it doesn't interest me. The games I play are Pai Gow Tiles and Texas Holdem.
 

imported_ST

Senior member
Oct 10, 2004
733
0
0
Originally posted by: CaiNaM
Originally posted by: ST
Originally posted by: CaiNaM
Originally posted by: Dman877
I like the fact that the 7900GT is a cool running, low power card and frankly, I'd be happy with either of them right now. However, if it was my money, I'd grab a X1800XT for 260 off newegg instead of the cheapest 7900GT at 300.
but if you're gonna mod it and crank up the voltage and clockspeeds, the gt is not going to be a "cool running, low power card". it would still likely use a little less (both are 90nm, but the x1k has more transistors), but no longer significantly less.

Says who? Again you're pulling more BS from your arse:

Power Consumption/Heat Output/Chart

Geforce 7900GT @ Default voltage/Speed 64 watts
Geforce 7900GT @ 745MHz 1.5v 87 watts
Geforce 7900GT @ 780MHz 1.6v 101 watts
Geforce 7900GT @ 800MHz 1.7v 111 watts

How much power does the X1Ks draw again? o_O

about 110w under full load.

the gtx runs about 90w. even assuming the GT with high clocks and a volt mod draws no more power than a stock GTX, it goes from a 55w consumption advantage to about a 20w consumption advantage (a 40% increase in the GT's draw). big difference.

so how was i wrong?

are you illiterate, or did you not just see my post and quote it at that? those are ACTUAL measured numbers from a REAL 7900gt, not some numbers devised in la-la-land. since those numbers are from very HIGH overclocks, we can see a typical volt modded gt would fall within the 65W - 86W range, which is still immensely cooler and lower powered than any x1k.

again, please don't base arguments on your insinuations; it just shows your intelligence (or lack thereof).... stick to topics you do know about, like your own vid card....