Why does AT say the X1900's take the performance crown?


flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: Wreckage
People want the best performing system. Why else would NVIDIA sell millions of SLI boards?

i actually think you are WRONG !!

The main reason many people went with SLI was that SLI allowed you to put two LOWER-performing, cheaper cards on the board.

They put 6600s on it and got an "ok" price/performance ratio.....the number of people who put two HIGH-END cards like the 7800 or similar in SLI is (correct me if i am wrong) mind-bogglingly LOW.

Ask any big company (Valve etc.) that did surveys what MOST people run....MOST people run mid-range hardware, MANY people run high-end hardware and only a FEW run high-end/SLI.

SLI was BIG (i dislike it, in case you didn't notice :)) especially when there was a shortage of 6800s (as far as i remember)...because i remember the hype when they came out with SLI, and people were MOSTLY putting two low-end cards in it
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
So if you rule out SLI because of that, you pretty much rule out any card above $300.

huh?

I am willing (and always was) to spend $400-$500 on the current high-end SINGLE card when i see it's worth it. There are a bunch of people who jump and shell out $400-$500 or so for a card...but not TWICE that amount for SLI.
(Way) above $300 for a single card is pretty much normal.

We only enter the "crazy high-end niche" you're talking about when it comes to $800-$1200 just for the video cards in SLI.

 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
I was not trying to label you. But a lot of the "red team" seem to think for some strange reason that running 2 cards is a bad thing. (those were the people I was applying that label to).



yes it IS a bad thing. (And yes, i am RED and i have a crush on RUBY...hahaha). The point is it is a DUMB thing to put two high-end cards in SLI when there's a 6 month cycle with better cards coming out ANYWAY.

The current X1900XT just makes *ME* see again how dumb it is to sink so much $$$ into SLI for raw performance which will be topped anyway in 6 months, OFTEN by a SINGLE card.

It might be way different for millionaires, if $$$ doesn't play a role for you, but it is STILL dumb.
 

ectx

Golden Member
Jan 25, 2000
1,398
0
0
$$$$$$$.

I don't know how much you guys make, but I find it amazing how much $ you are willing to spend on one (or two) video card(s).
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com

Originally posted by: ArchAngel777
No one is denying that SLI has benefits. I think one has to take a step out of the video forum and ask themselves "What am I doing here? Why am I wasting my time arguing with others who feel differently? Why do I always have to have the last word? Why can't I accept the fact that others can do what they want, whether it makes sense to me or not? Who am I trying to convince?" I think if everyone took a step back from their emotionalism, this would be a much better place.
and as dull as General Hardware
:Q


:D

 

Steelski

Senior member
Feb 16, 2005
700
0
0
Ok. What i can see is some people not willing to acknowledge single card performance as the criterion.
If you include SLI and other scalable technology then there is no end. NO END. That means that even two SLI'd 512MB GTXs or anything else will be outperformed by.....wait for it.....more GTXs. Surprised?......I'm not.
I could go to Nvidia and tell them what I need and they would design it and get it built, but i am not a billionaire. Single card performance is the only way you can measure who gets the performance crown. As for the Asus card, I will admit that it is a very fast card. Unfortunately there are about as many of them as Voodoo5 6000s (slight exaggeration) and they are not in mass production.
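The "no end" point about scalable setups can be illustrated with a toy scaling model. This is purely an editorial sketch: the 0.8 efficiency factor and the 60 fps baseline are illustrative assumptions, not measured SLI figures.

```python
# Toy model: each added GPU contributes less than the previous one because
# of driver overhead and inter-card synchronization. The 'efficiency'
# value is an illustrative assumption, not a measured SLI number.

def multi_gpu_fps(single_card_fps, num_cards, efficiency=0.8):
    """Approximate fps for num_cards identical GPUs, where each extra
    card adds `efficiency` times the contribution of the previous one."""
    total = 0.0
    contribution = single_card_fps
    for _ in range(num_cards):
        total += contribution
        contribution *= efficiency
    return total

# Adding cards always helps a little, so "fastest" has no ceiling:
for n in (1, 2, 3, 4):
    print(n, "cards:", round(multi_gpu_fps(60, n), 1), "fps")
```

The model never stops improving as cards are added, which is exactly why "fastest setup" is unbounded once multi-GPU counts, while per-card returns shrink.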
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Wreckage
Originally posted by: Fenixgoon
im not an ati fanboy, im a budget consumer. i run a sempron64 2800+ OC'ed to ~3200+, my mobo was a $30 refurb, and i have an x800pro for a vid card (the single most expensive part; it was a very good deal though, and the last upgrade i will make for a while).

im not saying that two cards is a bad thing, but 2 cards is obviously an advantage over 1. other than ATI having issues with black&white 2, they're at the top overall according to multiple benchmarks. and please dont say "OMG GTX 512 FTW" because 1) they're rarer than duke nukem forever and 2) they cost $700+, which is more expensive than a 1900XTX

I was not trying to label you. But a lot of the "red team" seem to think for some strange reason that running 2 cards is a bad thing. (those were the people I was applying that label to).

If you look at the Tech Report benchmarks Dual GT's had better performance than a XTX and they cost less. So SLI can also be an affordable high end solution.

Running 2 cards is not a bad thing. However, you're mostly arguing semantics. Your "single card" is really a 2 card solution modified into a single card. Now, I didn't look at it cause it's an expensive as heck solution. At those prices you're better off buying a 7800GTX 512MB.

Again, you're only arguing semantics to make nVidia look good because you are a fanboy. And you are a fanboy because I've caught you making idiotic anti-ATI flames. And when I called you out on them you didn't answer. I especially liked how you said of the 16 pipe X1900's "It looks like ATI is a few pipes short of a video card." because the G71 was rumored to be a 32 pipe beast.

Or how about the FUD when you said ATI's AVIVO used the CPU for encoding, when ATI and Anandtech (who hosted the beta encoding software) even stated that it was a beta that was not yet using the GPU to accelerate video encoding?

I can dig up other examples, but calling someone else a fanboy when you've clearly shown yourself to be one shows you're a hypocrite on top of being a fanboy.

Either way, the ATI X1900XT takes the performance crown for single video cards. And for the fanboys arguing semantics: single GPU video card. The 7800GTX 512MB is included probably more as a comparison than any serious competition to the X1900XT. Even if its price dropped to $550'ish, the availability of these cherry picked cores is just too low to mount any type of real competition. That leaves the 7800GTX breathing fumes until the 7900 series comes out.

Heck, I'm sure that ATI will take a similar route with the X1900's and bump up the memory to 800MHz and the core to maybe 700-750MHz with cherry picked cores to combat the 7900's.
 

Regs

Lifer
Aug 9, 2002
16,666
21
81
Originally posted by: M0RPH
People are not interested in who can slap the most GPUs onto one board, they're interested in who designs the best GPU.


That was the most logical and most intelligent thing I have heard yet from a member. I'm....i'm....completely shocked.
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: akugami
Running 2 cards is not a bad thing. However, you're mostly argueing semantics. Your "single card" is really a 2 card solution modified into a single card. Now, I didn't look at it cause it's an expensive as heck solution. At those prices you're better off buying a 7800GTX 512MB.

Again you're only arguing semantics to make nVidia look good because you are a fanboy. And you are a fanboy because I've caught you making idiotic anti-ATI flames. And when I called you out on them you didn't answer. I especially liked how you called the 16 pipe X1900's "It looks like ATI is a few pipes short of a video card." because the G71 was rumored to be a 32 pipe beast.

Or how about the FUD when you said ATI's AVIVO used the CPU for encoding when ATI and Anandtech (who hosted the beta encoding software) even stated that it was a beta that was not yet using the GPU to accelerate video encoding?

I can dig up others examples but calling someone else a fanboy when you've clearly shown yourself to be one shows you're a hypocrite on top of being a fanboy.


Either way, the ATI X1900XT takes the performance crown for single video cards. And for the fanboys arguing semantics, single GPU video card. The 7800GTX 512MB is included probably more as a comparison than any serious competition to the X1900XT. Even if it's price dropped to $550'ish, the availability of these cherry picked cores is just too low to mount any type of real competition. That least the 7800GTX breathing fumes until the 7900 series comes out.

Heck, I'm sure that ATI will take a similar route with the X1900's and bump up the memory to 800mhz and the core to maybe 700-750mhz with cherry picked cores to combat the 7900's.

I don't understand why you kept calling Wreckage and me nVidia fanboys when we simply stated "ATI would do better with more texture units". It seems to me you are too sensitive with anyone who doesn't agree that ATI is perfect.

There are lots of misconceptions about the AVIVO software. Yes, it is software, and all video cards should be able to take advantage of it, though ATI specifically tailored it to only allow x1xxx series cards to use it. It is actually just software using the shader processor part of the GPU to decode video, hence reducing CPU usage. The CPU will always be required, at least for fetch operations.

The 7800GTX 512MB's core is okay; I don't see how some CPU or GPU parts would be unable to reach a speed while others can. It's more likely a problem with the memory supply or other components on the PCB that weren't able to attain the specified speed.

Yes, the X1900XT does take the performance crown for a single card, though not by much when you compare it to the 7800GTX 512MB. Many games run slower on the X1900XT, as stated in a review (xlib or something).

Yes, the X1900XT is the better bang for the buck than the 7800GTX 512MB, and probably the best bang for the buck for a high end gaming experience.

Like you said in another thread when you called me out, no one is here with a conspiracy to flame Nvidia or ATI. Please apply the same to yourself before you start calling people out.
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I'm only going to say this once: how about we wait until more modern games are released in a few months before passing judgment that Ati should have put in more texture units. Putting in more texture units would have either:
a) required having fewer pixel shaders to make room for the additional TMU's, or
b) driven up the prices of all the x1900's to make up for a larger die with more TMU's

The x1900 is not exactly suffering in any game right now from a lack of TMU's, and the future remains to be seen. I really can't explain it any other way, but to me it seems like all the peeps who claim "fillrate is king" and demand more TMU's really have no idea where the trend in modern games is going, or what some of the other consequences of having a lot of TMU's are. For example, textures, unlike ALU ops, use up memory bandwidth, and in case anyone hasn't noticed, memory clocks and bandwidth have not progressed nearly as fast as raw gpu performance. The more TMU's you stuff onto a gpu, the more likely they are to be competing for the limited available bandwidth, and after a certain point you get heavily diminishing returns. And that's only scratching the surface of the subject, so I'll just leave off with this comment: wait and see...
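The bandwidth argument above can be sketched as a back-of-the-envelope calculation. All numbers here are illustrative assumptions (a ~48 GB/s bus, a 650 MHz core, uncached 4-byte texels), and real GPUs soften the limit with texture caches and compression, so this only shows the shape of the diminishing returns, not real figures.

```python
# Toy model of TMUs competing for a fixed memory bus: texel throughput
# is capped by whichever is smaller, the rate the TMUs can request
# texels or the rate the memory bus can deliver them.
# Illustrative assumptions only; real GPUs cache and compress textures.

def effective_texel_rate(num_tmus, core_mhz, mem_bandwidth_gbs, bytes_per_texel=4):
    """Texels/s: min of TMU request rate and memory delivery rate."""
    requested = num_tmus * core_mhz * 1e6                     # one texel per TMU per clock
    deliverable = mem_bandwidth_gbs * 1e9 / bytes_per_texel   # bus-limited texel rate
    return min(requested, deliverable)

# Fixed bandwidth, rising TMU count: throughput flatlines past a point.
for tmus in (8, 16, 24, 32, 48):
    rate = effective_texel_rate(tmus, 650, 48.0)
    print(f"{tmus:2d} TMUs -> {rate / 1e9:.1f} Gtexels/s")
```

Under these assumptions every TMU past roughly 18 is starved by the bus, which is the "heavily diminishing returns" the post describes.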
 

beggerking

Golden Member
Jan 15, 2006
1,703
0
0
Originally posted by: munky

agreed. we'll have to wait and see; it all depends on the game developers now..
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: akugami
Running 2 cards is not a bad thing. However, you're mostly argueing semantics. Your "single card" is really a 2 card solution modified into a single card. Now, I didn't look at it cause it's an expensive as heck solution. At those prices you're better off buying a 7800GTX 512MB.
Semantics or not, it's still a single card. Would you leave a dual core processor out of a CPU comparison?

Again you're only arguing semantics to make nVidia look good because you are a fanboy.
One could say that they are using semantics to rule out the single card dual GT because they are an ATI fan.
I especially liked how you called the 16 pipe X1900's "It looks like ATI is a few pipes short of a video card." because the G71 was rumored to be a 32 pipe beast.
A.) It's been pointed out to me several times that a certain group here has no sense of humor whatsoever.
B.)When the G71 is benchmarked we will see if I was off base or not.

Or how about the FUD when you said ATI's AVIVO used the CPU for encoding when ATI and Anandtech (who hosted the beta encoding software) even stated that it was a beta that was not yet using the GPU to accelerate video encoding?
So you call it FUD when I was right? Now who's the fanboy? I say they are using the CPU, and you back it up with ATI and Anandtech saying the same thing. Were you drinking when you wrote this?

I can dig up others examples but calling someone else a fanboy when you've clearly shown yourself to be one shows you're a hypocrite on top of being a fanboy.
Welcome to the club.

Either way, the ATI X1900XT takes the performance crown for single video cards. And for the fanboys arguing semantics, single GPU video card. The 7800GTX 512MB is included probably more as a comparison than any serious competition to the X1900XT. Even if it's price dropped to $550'ish, the availability of these cherry picked cores is just too low to mount any type of real competition. That least the 7800GTX breathing fumes until the 7900 series comes out.
Wow that did not sound biased at all :roll:
Heck, I'm sure that ATI will take a similar route with the X1900's and bump up the memory to 800mhz and the core to maybe 700-750mhz with cherry picked cores to combat the 7900's.
Whatever keeps you warm at night.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: beggerking

I don't understand why you kept calling Wreckage and me nVidia fanboy when we simply stated "ATI would do better with more texture unit". It seemed to me you are too sensitive with any one who don't agree ATI is not perfect.

There are lots of misconception on AVIVO software. Yes it is software and all videocard should be able to take advantage of it. Though ATI specifically tailored it to only allow x1xxx series card to use it. It is actually just a software using shader processor part of GPU to decode video hence reduce CPU usage. CPU will always be required at least in fetch operation.

7800GTX 512mb's core is okay, I don't see how some CPU or GPU part is unable to get to a speed while others can. Its more likely its a problem with memory supply or other componenets on the PCB that weren't able to attain to the specified speed.

Yes, X1900xt does take the performance crown for single card though by not much when you compare it to 7800GTX 512mb. Many games run slower on x1900xt as stated in a review (xlib or something).

Yes, x1900xt is the better bang for the buck than 7800GTX 512mb, and probably the best bang for the buck for high end gaming experience.

Like you said in another thread when you called me out, no one is here with conspiracy to flame Nvidia or ATI. Please speak the same to yourself before you start calling people out.

1) I called Wreckage an nVidia fanboy because of multiple posts in multiple threads that were anti-ATI. I noted a couple of examples and if I really wanted to I could dig up more.

2) I don't believe I called you a fanboy yet but I'm leaning towards that conclusion. I also didn't mention your name in that post, you jumped in all by your lonesome.

3) I don't completely disagree with the comment that ATI would do better with more texture units. However, when someone (Wreckage) makes comments about how ATI's video cards are "a few pipes short of a video card" it implies that ATI's solutions are not as good as nVidia's. The benchmarks say otherwise.

Furthermore, I asked Wreckage in the thread where he made that comment if he was a developer, programmer, or engineer working with games or for either ATI or nVidia. I didn't receive an answer. My only conclusion is he was talking out of his rear end and just slamming ATI to slam ATI. FUD and typical fanboy crap.

ATI has stated that they believe increasing the shader power of their GPUs is what developers want (as evidenced by a couple of games out now and in development). Whether that ultimately is the right way to go is still up in the air. Give them credit for going in a direction they feel is right for them and the industry, rather than being reactionary and just saying "me too" in response to what someone else does.

4) Indeed, AVIVO will still need the CPU for some operations. However, what hardware solution completely negates the use of a CPU in a modern computer anyway? The point of AVIVO's solutions was to accelerate video encoding and decoding so that there is only a negligible load on the CPU, much like nVidia's set of video decoders and their PureVideo. But when someone says ATI's AVIVO encoders are CPU encoders, especially when everyone and their grandmother who reports on these things has stated that it's a beta preview only and currently uses the CPU, then you're just spreading FUD.

5) Called you out? Are you holding a grudge? I refuted your statements and disagreed with your opinions in a couple of threads. If that's calling you out, so be it. I never called you a fanboy that I can recall. If you can find me specifically calling you a fanboy, show it cause I don't remember it.

As for Wreckage, he has repeatedly made idiotic statements flaming ATI that can only be called fanboy crap. If you want to defend him, fine, that's your right, but I can show more than one example of him crapping on ATI for no reason except to crap on ATI. He's also arguing semantics, which is the last resort of those with only a flimsy platform to support their argument: trolls and fanboys.

 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Wreckage

Semantics: the last argument of trolls, fanboys and the generally clueless. And yes, I would leave a dual core solution out of a single-CPU comparison, because most people talking about a single CPU usually mean a single core CPU. To me it's the same as saying that an external RAID HD you bought is a single drive because, due to its RAID nature, your computer uses it as if it were a single HD.

Every time you crap on ATI it's supposedly for sarcasm and humor. Please show me where you give equal billing to nVidia. I'm curious to see it, cause all I see is you flaming ATI and then, when someone calls you out on it, it's supposedly sarcasm.

Now, you're calling me a fanboy, but show me where I post stuff slamming nVidia just to slam them. Show me where I don't temper my comments with impartiality. Please do so; I have already shown a few examples of you being a fanboy. Show me some posts where I'm being a fanboy. Hell, I posted more than once, before and after the release of the X1900, that I'm going to wait and see how the G71 will do.

FUD refers to Fear, Uncertainty, Doubt. It usually refers to a smear campaign. When you stated that ATI's AVIVO video encoders used the CPU, did you say anywhere in your post that it was beta software and not ready for distribution? Did you mention that everyone and their grandmother reporting on such has reported large gains in video encoding speed when using beta encoders that actually used the GPU? Did you mention that everyone and their grandmother reporting on these things said it was more of a preview to show the public at large how their encoders would eventually work? No. You did not. Hence why I said you were spreading FUD.

I said that the X1900XT's closest competitor is the 7800GTX. The X1900XT completely dominates the 7800GTX. I stated the truth, unlike you with your half-truths and insinuations. That's fanboy crap?

I stated that ATI will pull the same crap that nVidia did with the 7800GTX 512MB, that is, releasing a limited card with cherry picked cores to once again take the performance crown, with very limited availability. That's more fanboy crap, I suppose. For someone calling me a fanboy, you're doing a good job discrediting yourself.

So call me a fanboy; prove it. I've already shown a few posts from you where you post some very idiotic flames against ATI and then say you're only joking. It happens once too many times for me to think it's a joke, especially when I do not recall any thread of you making similar crap against nVidia.

I have posted criticism against ATI and against nVidia. I have always tried to be as impartial as possible in my tone. If you can show posts where I'm crapping on nVidia just to crap on them, then put your money where your mouth is and link them.

And btw, I've never kept it a secret that I have used ATI for the last couple of years due to their superiority; it was only with the 7800 series that nVidia retook the lead, which ATI has again retaken. I would still be using nVidia, as I did with the GeForce, GeForce 256 and GeForce 3 series, if they hadn't made those missteps moving to the GeForce 4 and 5 series. I've posted that I'm waiting on the 7900 series to see if it's better than the X1900 series.

You know what, I guess that does make me a fanboy...of whichever video card is on top at the moment.
 

d1gw33d

Junior Member
Jan 27, 2006
1
0
0
This topic is too funny.
I've always used nvidia in the past, until recently (X800GTO now).
And I'm sorry to all you blatant ATI FANBOYS, but the X1900 isn't all that it should be.
Sorry. I really am. And all the crying that SLI vs. XFire doesn't count is BS. It does count, because it's a viable graphics solution that costs more, yes, but is readily available. When someone wants the absolute fastest gfx solution, SLI is it. Cost is relative. Get over it.

The reason I say the X1900 isn't all that it should be is simply because nvidia is going to smoke it with its next-gen gpu. If you doubt that, and say SLI isn't relevant, then you are, in fact, the definition of a fanboy.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: akugami
You know what, I guess, that does make me a fanboy...

I finally agree with you. You are a fanboy. I will write you off like M0RPH and move on to those willing to look at facts. I realize my public debate with such closed-minded individuals, who refuse to post links to back their statements, is becoming spam in the forum. You are now ignored and this will be my last public acknowledgement of your posts.

Fact: AVIVO encoding relied on the CPU.
Fact: the ASUS dual GT is a single card.
Fact: the X1900XTX did not dominate the GTX512 like many had hoped.
Fact: your heavy use of terms like "idiotic", "trolls", "clueless", etc. not only removes any shred of credibility you had, but also relegates your post to an insulting retort lacking any substance.




 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Breaking AT News!!
Flaming has yet again reached a new level here in the video forum. Trips to early vacation look likely.

FACT: according to xbitlabs, the 7800GTX 512MB wins 11 benchmarks while the X1900XTX wins 9.
FACT: the 7800GTX 512MB is NOWHERE to be found. (Although many people got that card, or two, when it was released in the first couple of days, or bought it for $749~999.)
FACT: the X1900XTX uses 1.1ns RAM, so you can OC the memory to 1800MHz effective.
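The 1.1ns figure maps onto that ~1800MHz number by simple arithmetic. A quick sketch; the only assumptions are the standard inverse-of-cycle-time reading of the ns rating and the DDR doubling used in spec sheets:

```python
# RAM rated at 1.1 ns per cycle: the rated clock is the inverse of the
# cycle time, and DDR transfers twice per clock, which gives the
# "effective MHz" figure quoted for memory overclocks.

def rated_ddr_speed_mhz(cycle_time_ns):
    clock_mhz = 1000.0 / cycle_time_ns   # ns -> MHz: 1000 / cycle time
    return 2 * clock_mhz                 # DDR: two transfers per clock

print(rated_ddr_speed_mhz(1.1))   # ~1818 MHz effective, hence the ~1800MHz headroom
print(rated_ddr_speed_mhz(1.25))  # 1.25ns parts rate at 1600 MHz effective
```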


 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Wreckage

And again, I ask you for PROOF where I am blatantly flaming nVidia without reason. Not criticism but outright lies and half truths like I've quoted you doing. Can't find any can you? Because I'm quite impartial, I have criticized both companies in many posts while you only criticize ATI then laugh it off as sarcasm and humor. Heck, I was just kidding in calling you a fanboy all those times, it was all sarcasm and humor. I kid, I kid! <---Sarcasm at work for the sarcasm impaired.

I had hoped the X1900 series would dominate the 7800GTX 512MB, that is true, but only because I want the best part available. However, if we were to compare mainstream offerings, it would currently be the X1900XT vs the 7800GTX. The 7800GTX is getting whipped left and right, except in SLI, because Crossfire is simply not ready. And nowhere in any post I have made in this forum have you ever seen me say Crossfire > SLI. Because it's not at this point, and probably won't be for another 6 months to a year.

FACT: You told many half-truths on AVIVO encoding. Something you still haven't acknowledged.

FACT: I and others have caught you many times being blatantly biased against ATI.

FACT: You show classic symptoms of the troll and fanboy. 1) Half-truths and lies. 2) Slamming a company or its products for no reason. 3) Out-of-context quotes. 4) Ignoring any argument that doesn't suit your views instead of addressing it and explaining why you feel you are right.

 

Golgatha

Lifer
Jul 18, 2003
12,424
1,110
126
Originally posted by: TSS
SLI rigs are only bought by a select few who have the money :) 99% of all computers built and bought still carry a single-card solution. So that's why it's the fastest. You said so yourself ;)

Besides, go look at the X1900XTX Crossfire vs 7800GTX 512 SLI in the FEAR benchmark, WITH AA (if you run SLI without AA you're an idiot; you don't buy that horsepower for nothing).


Hmm...if you're willing to spend that kind of money on graphics, I'd say the X1900 series is the only solution, because you're going to run AA, AF, and HDR all at the same time to get your money's worth.
 

deadseasquirrel

Golden Member
Nov 20, 2001
1,736
0
0
Originally posted by: flexy
The current X1900XT just makes *ME* see again how dumb SLI is: investing so much $$$ in raw performance which will be topped anyway in 6 months, OFTEN by a SINGLE card.

One could have bought two 6800GTs and put them in SLI in January 2005, for $650 total. Six months later, the 7800GTX was released and merely more-or-less equaled the performance of the SLI'd 6800GTs (better at 2048x1536, but slower at 16x12). i.e. there wasn't much reason for SLI 6800GT users to pay for an upgrade that got them equal performance.

It could even be argued that it isn't until NOW, with the x1900xt, that there is finally a reason for the SLI 6800GT user to upgrade. 12 months later.

AND, that 6800GT user can likely sell his cards easily for $350, and spend only $150 more to get the X1900XT.

Yeah, SLI is dumb. :roll:
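The upgrade math in that post can be sketched directly; the dollar figures below are the post's own numbers (assumed street/resale prices from the argument, not verified historical data; the $500 X1900XT price is implied by $350 + $150):

```python
# Sketch of the post's upgrade-cost argument. All dollar figures are
# the post's own assumptions about 2005/2006 street prices.
sli_pair_cost = 650    # two 6800GTs in SLI, January 2005
resale_value = 350     # assumed resale value of the pair a year later
x1900xt_price = 500    # implied X1900XT street price ($350 + $150)

# Net out-of-pocket cost to step up from SLI 6800GTs to one X1900XT.
net_upgrade_cost = x1900xt_price - resale_value
print(f"Net cost to step up to an X1900XT: ${net_upgrade_cost}")
```

Which is the point being made: the SLI buyer's effective upgrade cost a year later is only $150, not another full card's price.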
 

Gamer X

Banned
Feb 11, 2005
769
0
0
Originally posted by: d1gw33d
This topic is too funny.
I've always used nvidia in the past, until recently (X800GTO now).
And I'm sorry to all you blatant ATI FANBOYS, but the X1900 isn't all that it should be.
Sorry. I really am. And all the crying that SLI vs. XFire doesn't count is BS. It does count, because it's a viable graphics solution that costs more, yes, but is readily available. When someone wants the absolute fastest gfx solution, SLI is it. Cost is relative. Get over it.

The reason I say the X1900 isn't all that it should be is simply that nvidia is going to smoke it with its next-gen GPU. If you doubt that, and say SLI isn't relevant, then you are, in fact, the definition of a fanboy.

X1900 has the performance crown NOW. Nvidia fanboys/affiliates, get over it.
 

Ackmed

Diamond Member
Oct 1, 2003
8,499
560
126
Originally posted by: deadseasquirrel
Originally posted by: flexy
The current X1900XT just makes *ME* see again how dumb SLI is: investing so much $$$ in raw performance which will be topped anyway in 6 months, OFTEN by a SINGLE card.

One could have bought two 6800GTs and put them in SLI in January 2005, for $650 total. Six months later, the 7800GTX was released and merely more-or-less equaled the performance of the SLI'd 6800GTs (better at 2048x1536, but slower at 16x12). i.e. there wasn't much reason for SLI 6800GT users to pay for an upgrade that got them equal performance.

A 6800GT was much more than $325 last January. That negates your whole point. Not to mention the only SLI board then was the Asus A8N-SLI, and it was about $230.