
If I see one more post where some stupid little nVidia fanboy...

Page 2
Originally posted by: dug777
Originally posted by: AMDZen
The new nVidia cards do pwn their ATi counterparts. SM 3.0 is simply one reason.

Which is a good thing anyway. ATI downright p0wnED the last round of nVidia cards, and so it's now nVidia's turn. As it should be. That way, both companies have hit the bottom and known defeat, only to come back with kick a$$ products the next round.

It's simple: buy 6xxx series cards if you want the best in gaming right now. You'll also be able to find them easier, and up until recently it was the only card you could get in AGP. ATi has since caught up, but not before thousands of gamers across the world either: A) bought nVidia because they had AGP, or B) bought nVidia because ATI didn't have jack in stock.

In non-SLI, the x850XT PE stomps the 6800U in most games (D3 being the obvious exception); your point was what again?

EDIT: consider 'stomps' to be replaced with 'beats by a small margin' before the flamers get me 😉

Yeah, and when you find one in stock, post a link.

Yeah, that's right. You know why? Because they don't exist.

Another point is - when/if ATI actually provides these so-called x850XT PEs on store shelves, they will be at least $600, right? So how would two 6800GTs overclocked to 6800 Ultra speeds, in SLI, compare? You can't say the non-SLI to SLI comparison is invalid; just compare price to price. 2x $300 cards is the same as 1x $600 card.

Another point is - nVidia is supposedly going to release a 6800 Ultra Extreme to compete with it head on, and it will be faster if they do. The new nVidia GPUs are simply better. This time around, that is.

You have to realize that the x850XT PE is something that a) doesn't exist, and so b) nVidia doesn't have an opponent for. If you compare cards that are supposed to be compared - like the 6600GT to the x700, or the 6800 Ultra to the x800XT - then where does ATI sit? You can't compare apples to oranges, and comparing a 6800 Ultra to that nonexistent card is apples to oranges.

Lastly - I just want to add that I own an MSI 9800 Pro, which has the r360 core (9800 XT core) and is overclocked well beyond 9800XT speeds. So basically I have the fastest of ATI's previous-gen cards, and I love it to death. It's more than adequate for me, which is why I haven't gotten a newer card. So I am in no way, at all, an nVidia fan boi.
 
Originally posted by: GTaudiophile
Originally posted by: AMDZen
The new nVidia cards do pwn their ATi counterparts. SM 3.0 is simply one reason.

Which is a good thing anyway. ATI downright p0wnED the last round of nVidia cards, and so it's now nVidia's turn. As it should be. That way, both companies have hit the bottom and known defeat, only to come back with kick a$$ products the next round.

It's simple: buy 6xxx series cards if you want the best in gaming right now. You'll also be able to find them easier, and up until recently it was the only card you could get in AGP. ATi has since caught up, but not before thousands of gamers across the world either: A) bought nVidia because they had AGP, or B) bought nVidia because ATI didn't have jack in stock.

WTF are you smoking? Can I have some?

You call this pwnage in HL2 when two 6800GTs in SLI cannot match one X850XT?

SM3.0 may have an impact on the games of tomorrow, but when I am shopping for a new videocard today, I want one that performs the best on the games of today.

No you can't. 😛

Again you are comparing apples to oranges. Where is the benchmark comparing the 6800 Ultra to the x850XT? That is the comparison to be made, and it isn't shown. SLI pretty much does nothing, for right now. And in HL2, it does do nothing. So find me benches comparing a single 6800 Ultra and the x850XT, and that is the comparison. Also, look at Splinter Cell:

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2394&p=15

Look how much two 6800 Ultras spank the x850XT. I can guarantee that a single 6800 Ultra will be close to those numbers, because like I said, SLI doesn't do anything except in a select few games right now.

When SLI does do something, what happens then?
 
Originally posted by: AMDZen
Originally posted by: dug777
Originally posted by: AMDZen
The new nVidia cards do pwn their ATi counterparts. SM 3.0 is simply one reason.

Which is a good thing anyway. ATI downright p0wnED the last round of nVidia cards, and so it's now nVidia's turn. As it should be. That way, both companies have hit the bottom and known defeat, only to come back with kick a$$ products the next round.

It's simple: buy 6xxx series cards if you want the best in gaming right now. You'll also be able to find them easier, and up until recently it was the only card you could get in AGP. ATi has since caught up, but not before thousands of gamers across the world either: A) bought nVidia because they had AGP, or B) bought nVidia because ATI didn't have jack in stock.

In non-SLI, the x850XT PE stomps the 6800U in most games (D3 being the obvious exception); your point was what again?

EDIT: consider 'stomps' to be replaced with 'beats by a small margin' before the flamers get me 😉

Yeah, and when you find one in stock, post a link.

Yeah, that's right. You know why? Because they don't exist.

Another point is - when/if ATI actually provides these so-called x850XT PEs on store shelves, they will be at least $600, right? So how would two 6800GTs overclocked to 6800 Ultra speeds, in SLI, compare? You can't say the non-SLI to SLI comparison is invalid; just compare price to price. 2x $300 cards is the same as 1x $600 card.

Another point is - nVidia is supposedly going to release a 6800 Ultra Extreme to compete with it head on, and it will be faster if they do. The new nVidia GPUs are simply better. This time around, that is.

You have to realize that the x850XT PE is something that a) doesn't exist, and so b) nVidia doesn't have an opponent for. If you compare cards that are supposed to be compared - like the 6600GT to the x700, or the 6800 Ultra to the x800XT - then where does ATI sit? You can't compare apples to oranges, and comparing a 6800 Ultra to that nonexistent card is apples to oranges.

Lastly - I just want to add that I own an MSI 9800 Pro, which has the r360 core (9800 XT core) and is overclocked well beyond 9800XT speeds. So basically I have the fastest of ATI's previous-gen cards, and I love it to death. It's more than adequate for me, which is why I haven't gotten a newer card. So I am in no way, at all, an nVidia fan boi.


http://www.newegg.com/app/ViewProductDesc.asp?description=14-102-519&depa=1

Really? More like $100 less and clearly in stock. They cost as little as 1x 6800 Ultra. Lowest price is $489 as compared to a $475 6800 Ultra.
 
Originally posted by: AMDZen

Lastly - I just want to add that I own an MSI 9800 Pro, which has the r360 core (9800 XT core) and is overclocked well beyond 9800XT speeds. So basically I have the fastest of ATI's previous-gen cards, and I love it to death. It's more than adequate for me, which is why I haven't gotten a newer card. So I am in no way, at all, an nVidia fan boi.
and hitler said in mein kampf that he liked jews
 
Originally posted by: AMDZen
No you can't. 😛

Again you are comparing apples to oranges. Where is the benchmark comparing the 6800 Ultra to the x850XT? That is the comparison to be made, and it isn't shown. SLI pretty much does nothing, for right now. And in HL2, it does do nothing. So find me benches comparing a single 6800 Ultra and the x850XT, and that is the comparison. Also, look at Splinter Cell:

http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2394&p=15

Look how much two 6800 Ultras spank the x850XT. I can guarantee that a single 6800 Ultra will be close to those numbers, because like I said, SLI doesn't do anything except in a select few games right now.

When SLI does do something, what happens then?


SLI is a band-aid fix by nVidia to boost sales. It's mostly just marketing BS.
And if 1:1 top end vs top end is so important, why didn't you post one of Anand's other benches?

ATi spanking the 6800 Ultra in HL2

nVidia spanking ATi in Doom 3

Far Cry performance is even until AA/AF is turned on

Next time research your topic before blindly following your fan-boyish ideas.
 
Originally posted by: meltdown75
Originally posted by: YOyoYOhowsDAjello
Originally posted by: AMDZen
SLI pretty much does nothing, for right now.

😕

it's good if you play games at insane resolutions w/ eye candy cranked up. other than that it's pretty much useless, no?

Who's spending $600 to $1000 on videocards that isn't going to be doing that?

ATI vs 6800ultra in HL2 btw
 
Originally posted by: meltdown75
Originally posted by: YOyoYOhowsDAjello
Originally posted by: AMDZen
SLI pretty much does nothing, for right now.

😕

it's good if you play games at insane resolutions w/ eye candy cranked up. other than that it's pretty much useless, no?

Good point! And here I was trying to get all the office pcs equipped with dual 6810472x ULTRA EXTREME BBQs. 😕
 
3dfx is the best choice. HW T&L is not needed, and the power of the T-Buffer should not be underestimated. Real-time AA and motion blur produce far superior image quality. 3dfx also adds GLIDE support, which is what Duke Nukem Forever will use. Soon SLI x 4 will be implemented; I would like to see your nVidia stand up to quad VSA-100 power.
 
Originally posted by: mwmorph
Originally posted by: AMDZen
Originally posted by: dug777
Originally posted by: AMDZen
The new nVidia cards do pwn their ATi counterparts. SM 3.0 is simply one reason.

Which is a good thing anyway. ATI downright p0wnED the last round of nVidia cards, and so it's now nVidia's turn. As it should be. That way, both companies have hit the bottom and known defeat, only to come back with kick a$$ products the next round.

It's simple: buy 6xxx series cards if you want the best in gaming right now. You'll also be able to find them easier, and up until recently it was the only card you could get in AGP. ATi has since caught up, but not before thousands of gamers across the world either: A) bought nVidia because they had AGP, or B) bought nVidia because ATI didn't have jack in stock.

In non-SLI, the x850XT PE stomps the 6800U in most games (D3 being the obvious exception); your point was what again?

EDIT: consider 'stomps' to be replaced with 'beats by a small margin' before the flamers get me 😉

Yeah, and when you find one in stock, post a link.

Yeah, that's right. You know why? Because they don't exist.

Another point is - when/if ATI actually provides these so-called x850XT PEs on store shelves, they will be at least $600, right? So how would two 6800GTs overclocked to 6800 Ultra speeds, in SLI, compare? You can't say the non-SLI to SLI comparison is invalid; just compare price to price. 2x $300 cards is the same as 1x $600 card.

Another point is - nVidia is supposedly going to release a 6800 Ultra Extreme to compete with it head on, and it will be faster if they do. The new nVidia GPUs are simply better. This time around, that is.

You have to realize that the x850XT PE is something that a) doesn't exist, and so b) nVidia doesn't have an opponent for. If you compare cards that are supposed to be compared - like the 6600GT to the x700, or the 6800 Ultra to the x800XT - then where does ATI sit? You can't compare apples to oranges, and comparing a 6800 Ultra to that nonexistent card is apples to oranges.

Lastly - I just want to add that I own an MSI 9800 Pro, which has the r360 core (9800 XT core) and is overclocked well beyond 9800XT speeds. So basically I have the fastest of ATI's previous-gen cards, and I love it to death. It's more than adequate for me, which is why I haven't gotten a newer card. So I am in no way, at all, an nVidia fan boi.


http://www.newegg.com/app/ViewProductDesc.asp?description=14-102-519&depa=1

Really? More like $100 less and clearly in stock. They cost as little as 1x 6800 Ultra. Lowest price is $489 as compared to a $475 6800 Ultra.

OK, so they do exist. Now. Seriously though, nVidia has had their cards out in quantity for some time now. ATI has been paper-launching things like mad. This card, as an example, took 6+ months to finally be sold after it was announced. Meanwhile, nVidia sold thousands of cards. ATI's problem this time around was actually having cards.

We can argue performance benchmark numbers all day and get nowhere. nVidia has already won this latest bout. They've sold tons more cards, and for a reason. I couldn't care less because I'm skipping this round of cards altogether. Or at least until the next round of cards comes out. Then I'll buy nVidia, simply because I know SLI will become worth it once games are programmed with that in mind. Of course, I may change my mind if/when ATI ever releases something like it. Only time will tell.
 
Originally posted by: dug777
a little vent 😀 (and no, it's not really a FI post, because you can't stop idiots posting just because they are nVidia (or any other company) fanbois 🙁 )

Discuss.

nVidia PWNZ JOO!!! 😀
 
Hmmm.....I'd get a GeForce 6-series just because the drivers are better.

I FREAKING HATE ATI DRIVERS!!!

I just tried going back to a Radeon to play SHIII with, since the developer didn't bother testing the game on nVidia cards, and it's kinda buggy. But, I can't keep the damn thing, the drivers are infuriating!

What's the point with the whole desktop flicker/re-init when making driver changes? I'm not making CHANGES to the desktop, why does the whole screen flicker on and off when I change anti-aliasing settings?

Worse, ATI cards have a 3d color profile and a 2d color profile - and ne'er the two shall mix. There is no WAY to get the card to use the 2d color profile in 3d apps (which is the default behavior for nVidia cards unless you override it). Net result? If you use a hardware color calibration tool, you can calibrate your monitor in the DESKTOP mode just as peachy keen as nVidia's cards....but that doesn't affect the 3d mode at all, which will still have whacked-out colors. No problem on nVidia cards!

Finally, ATI cannot scale an image to a 1280x1024 LCD properly. You have two options - unscaled, in which a 1024x768 image (max res of SHIII) kind of floats in the middle of the display, or 'scaled', where it distorts the 4:3 aspect ratio image to fit the 5:4 aspect ratio LCD.

With nVidia cards, you have an option called "fixed aspect ratio scaling", that will scale the 4:3 image up to the monitor's size without distorting it. In the case of the 4:3 resolution 1024x768, it scales it up to the 4:3 resolution of 1280x960 - leaving a small black letterbox bar above and below the image. Distortion free!
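The arithmetic behind that option is easy to sketch. This is a hypothetical helper for illustration only, not code from either vendor's drivers:

```python
def letterbox(src_w, src_h, disp_w, disp_h):
    """Fit a src_w x src_h image onto a disp_w x disp_h panel
    without distorting its aspect ratio.

    Returns the scaled image size plus the unused border on each
    axis, split evenly between the two sides (the letterbox bars).
    """
    # Scale by whichever axis runs out of room first.
    scale = min(disp_w / src_w, disp_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return out_w, out_h, (disp_w - out_w) // 2, (disp_h - out_h) // 2

# 1024x768 (4:3) on a 1280x1024 (5:4) LCD:
# scales to 1280x960 with a 32-pixel bar above and below.
print(letterbox(1024, 768, 1280, 1024))  # (1280, 960, 0, 32)
```

Same numbers as described above: the 4:3 image only grows to 1280x960, and the leftover 64 vertical pixels become the two black bars.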

And let's not even TALK about ATI's crappy anti-aliasing support or godawful support for 16-bit color games....
 
Originally posted by: dderidex
Hmmm.....I'd get a GeForce 6-series just because the drivers are better.

I FREAKING HATE ATI DRIVERS!!!

I just tried going back to a Radeon to play SHIII with, since the developer didn't bother testing the game on nVidia cards, and it's kinda buggy. But, I can't keep the damn thing, the drivers are infuriating!

What's the point with the whole desktop flicker/re-init when making driver changes? I'm not making CHANGES to the desktop, why does the whole screen flicker on and off when I change anti-aliasing settings?

Worse, ATI cards have a 3d color profile and a 2d color profile - and ne'er the two shall mix. There is no WAY to get the card to use the 2d color profile in 3d apps (which is the default behavior for nVidia cards unless you override it). Net result? If you use a hardware color calibration tool, you can calibrate your monitor in the DESKTOP mode just as peachy keen as nVidia's cards....but that doesn't effect the 3d mode at all, which will still have whacked-out colors. No problem on nVidia cards!

Finally, ATI cannot scale an image to a 1280x1024 LCD properly. You have two options - unscaled, in which a 1024x768 image (max res of SHIII) kind of floats in the middle of the display, or 'scaled', where it distorts the 4:3 aspect ratio image to fit the 5:4 aspect ratio LCD.

With nVidia cards, you have an option called "fixed aspect ratio scaling", that will scale the 4:3 image up to the monitor's size without distorting it. In the case of the 4:3 resolution 1024x768, it scales it up to the 4:3 resolution of 1280x960 - leaving a small black letterbox bar above and below the image. Distortion free!

And let's not even TALK about ATI's crappy anti-aliasing support or godawful support for 16-bit color games....

1. Of course - you are changing how the card functions, so of course the desktop is gonna flicker a bit. When I change options on the nv drivers, I have to reboot. I'd rather sustain a .1ms flicker than a 1-2 minute reboot.

2. WTF are you using color profiles at all? There is a reason your monitor has its settings. Besides, you're just lazy. 2 color profiles are vastly better. You want word documents and internet browsing to have brighter colors while DOOM III has a more muted color palette? Done.

3. nv does this also. What do you want? For the card to magically add pixels until it is a 5:4 ratio? I don't ever recall nv being able to do a "fixed aspect ratio scaling" option. If it is possible, where is it in the options so I can test it myself?

4. Why do you even play 16-bit if you have a current-gen card? ATI AA is just as good and most of the time better than nv AA.
http://www.bytesector.com/data/bs-article.asp?ID=357&page=4

Don't get me wrong - I own both ATI and NV cards, but drivers are a non-issue. All you people complaining about drivers don't know what the hell you are talking about. Sure, bootup takes maybe 10 seconds longer, but WTF are you complaining about? Some of you spoiled geeks should be forced to use a 466MHz, 96MB-of-RAM, 5400rpm PC for a week to see how good you a55hats have it.
/pwned
/end thread
 
Good god man, do you have ANY idea what you are talking about!??!

Originally posted by: mwmorph
1. Of course - you are changing how the card functions, so of course the desktop is gonna flicker a bit. When I change options on the nv drivers, I have to reboot. I'd rather sustain a .1ms flicker than a 1-2 minute reboot.

Ummm....no, no it doesn't. There. I just changed my FSAA to 4x in SHIII. No flicker. (And hey, how about that - I can set FSAA, aniso, optimization, etc. settings PER GAME with nVidia drivers! No need to activate a profile or anything; the drivers automatically detect which game is running and apply the profile.)

Sounds like something is messed up with your driver install.

Originally posted by: mwmorph
2. WTF are you using color profiles at all? There is a reason your monitor has its settings. Besides, you're just lazy. 2 color profiles are vastly better. You want word documents and internet browsing to have brighter colors while DOOM III has a more muted color palette? Done.

Because I have a Dell 19" 1905FP LCD, and the colors the monitor can put out are SHITE. And no, if you use a DVI cable, the monitor does NOT have its own color settings - that's the entire point of using DVI, the operating system controls it.

At least, it WOULD, if the ATI drivers would LET IT (like, you know, the S3 drivers and nVidia drivers and Matrox drivers do - it's just ATI that buggers it all up).

Originally posted by: mwmorph
3. nv does this also. What do you want? For the card to magically add pixels until it is a 5:4 ratio? I don't ever recall nv being able to do a "fixed aspect ratio scaling" option. If it is possible, where is it in the options so I can test it myself?

Text (and pwnage? Looks like...)

Of course it doesn't "add pixels", it just 'letterboxes' the image. A 1024x768 image scaled up to a 1280x1024 display preserving the 4:3 aspect ratio only scales up to 1280x960 (if you ran it at 1280x1024, the image would be distorted). That option scales the image up to 1280x960, leaving 64 vertical pixels unused - 32 on the top of the image, 32 on the bottom.

Originally posted by: mwmorph
4. Why do you even play 16-bit if you have a current-gen card? ATI AA is just as good and most of the time better than nv AA.
http://www.bytesector.com/data/bs-article.asp?ID=357&page=4

Well, let's see, maybe because the GAME DOESN'T SUPPORT ANYTHING ELSE!?

How about that?

Jane's F/A-18, Jane's F-15, Longest Journey, Starfleet Command 1, 2, and 'Orion Pirates', Sub Command, Fleet Command, etc....heck - even the just-released-last-month 'Dangerous Waters' only supports 16-bit color!

For ATI to not properly support such a huge section of the (granted, usually older) market is a pain.

Originally posted by: mwmorph
Don't get me wrong - I own both ATI and NV cards, but drivers are a non-issue. All you people complaining about drivers don't know what the hell you are talking about. Sure, bootup takes maybe 10 seconds longer, but WTF are you complaining about? Some of you spoiled geeks should be forced to use a 466MHz, 96MB-of-RAM, 5400rpm PC for a week to see how good you a55hats have it.
/pwned
/end thread

Beg pardon, but I started seriously PC gaming on an 80286 (that let you set the bus speed on the fly - it could run at *6*, *8*, OR *10* MHz!). From a pure software development perspective, ATI's drivers are just sloppy - period, end of story.
 
Originally posted by: dderidex
Good god man, do you have ANY idea what you are talking about!??!

Originally posted by: mwmorph
1. Of course - you are changing how the card functions, so of course the desktop is gonna flicker a bit. When I change options on the nv drivers, I have to reboot. I'd rather sustain a .1ms flicker than a 1-2 minute reboot.

Ummm....no, no it doesn't. There. I just changed my FSAA to 4x in SHIII. No flicker. (And hey, how about that - I can set FSAA, aniso, optimization, etc. settings PER GAME with nVidia drivers! No need to activate a profile or anything; the drivers automatically detect which game is running and apply the profile.)

Sounds like something is messed up with your driver install.

Originally posted by: mwmorph
2. WTF are you using color profiles at all? There is a reason your monitor has its settings. Besides, you're just lazy. 2 color profiles are vastly better. You want word documents and internet browsing to have brighter colors while DOOM III has a more muted color palette? Done.

Because I have a Dell 19" 1905FP LCD, and the colors the monitor can put out are SHITE. And no, if you use a DVI cable, the monitor does NOT have its own color settings - that's the entire point of using DVI, the operating system controls it.

At least, it WOULD, if the ATI drivers would LET IT (like, you know, the S3 drivers and nVidia drivers and Matrox drivers do - it's just ATI that buggers it all up).

Originally posted by: mwmorph
3. nv does this also. What do you want? For the card to magically add pixels until it is a 5:4 ratio? I don't ever recall nv being able to do a "fixed aspect ratio scaling" option. If it is possible, where is it in the options so I can test it myself?

Text (and pwnage? Looks like...)

Of course it doesn't "add pixels", it just 'letterboxes' the image. A 1024x768 image scaled up to a 1280x1024 display preserving the 4:3 aspect ratio only scales up to 1280x960 (if you ran it at 1280x1024, the image would be distorted). That option scales the image up to 1280x960, leaving 64 vertical pixels unused - 32 on the top of the image, 32 on the bottom.

Originally posted by: mwmorph
4. Why do you even play 16-bit if you have a current-gen card? ATI AA is just as good and most of the time better than nv AA.
http://www.bytesector.com/data/bs-article.asp?ID=357&page=4

Well, let's see, maybe because the GAME DOESN'T SUPPORT ANYTHING ELSE!?

How about that?

Jane's F/A-18, Jane's F-15, Longest Journey, Starfleet Command 1, 2, and 'Orion Pirates', Sub Command, Fleet Command, etc....heck - even the just-released-last-month 'Dangerous Waters' only supports 16-bit color!

For ATI to not properly support such a huge section of the (granted, usually older) market is a pain.

Originally posted by: mwmorph
Don't get me wrong - I own both ATI and NV cards, but drivers are a non-issue. All you people complaining about drivers don't know what the hell you are talking about. Sure, bootup takes maybe 10 seconds longer, but WTF are you complaining about? Some of you spoiled geeks should be forced to use a 466MHz, 96MB-of-RAM, 5400rpm PC for a week to see how good you a55hats have it.
/pwned
/end thread

Beg pardon, but I started seriously PC gaming on an 80286 (that let you set the bus speed on the fly - it could run at *6*, *8*, OR *10* MHz!). From a pure software development perspective, ATI's drivers are just sloppy - period, end of story.

teh buuuurrrnnnn :thumbsup:
 
WOW this thread got pretty heated.

I guess if you put either ati/nvidia/intel/amd in the title it brings out the fanboys in a nasty rash 😛

For the record, it's ATI who seem to have gotten on top of their driver support these days, while nVidia has been getting slow and sloppy... I like monthly driver updates; how long was 61.77 (I think... 😛) the official nVidia driver?

😉
 
I was gonna get an x700 but I'm getting a 6600 instead because of SM3.0 support - it's very important in the next crop of games.

Well, that and it spanks the x700 around like a kid in the projects.

😀 😀 😀
 
I used to be one of the top posters in the video forum a few years ago.

Stuff like that drove me to the OT-only creature I am now.
 
Originally posted by: UNCjigga
I was gonna get an x700 but I'm getting a 6600 instead because of SM3.0 support - it's very important in the next crop of games.

Well, that and it spanks the x700 around like a kid in the projects.

😀 😀 😀

It does spank the x700, so that's the reason to buy it, not the SM3.0 support :|

I suspect my sarcasm meter on the issue is broken 😉 but have you seen the framerates when SM3.0 is enabled in Riddick? It becomes just about unplayable on a 6800GT, let alone a 6600GT (and I hope you're talking about a GT, not the stock 6600 😛).

😀 😀 😀 indeed
 
Originally posted by: dug777
Originally posted by: Deeko
I used to be one of the top posters in the video forum a few years ago.

Stuff like that drove me to the OT-only creature I am now.

:thumbsup:

I was even a bit of a 'fanboy' myself - I was one of the biggest pushers of the Voodoo5 - but I was objective about it; I wasn't an annoying little twerp.

....well, I was 16, so yeah, I was, but that was only in OT....
 