Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX while ATi had the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.
I want to be clear on this, Joker: you are going on record saying that the GeForce FX series was in fact superior to the R3x0-based parts? That the FX5800, louder and hotter but offering vastly superior AF, should have been a no-brainer choice over the R9700 Pro? You are going on record saying that right now, right?
Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performs on par with if not slightly better than the 7900 GTX in D3D overall, is priced competitively with it, and has better availability. The only real negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?
What about the power usage? And what about the heat given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man....burn-your-hand hot...
Originally posted by: Ronin
Both of you need to go back and read the thread where it was discussed. It is a MEMORY HOG, no matter how you spin it. I did MY tests 5 separate times to verify results.
Saying I timed my screenshot is about as valid as most of the arguments you put forth for your coveted ATi.
Originally posted by: 5150Joker
Originally posted by: DeathBUA
What about the power usage? And what about the heat given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man....burn-your-hand hot...
As long as the card is functioning properly with the heat being exhausted out of the system, who cares how hot it gets? Sure, it eats up extra power, but considering most enthusiasts have more than ample PSUs, that's another non-factor.
Originally posted by: nts
Originally posted by: Ronin
Both of you need to go back and read the thread where it was discussed. It is a MEMORY HOG, no matter how you spin it. I did MY tests 5 separate times to verify results.
Except that your results don't reflect anyone else's, and you have a pretty clear agenda.
Saying I timed my screenshot is about as valid as most of the arguments you put forth for your coveted ATi.
Starting CCC is the only time mem usage will be high.
btw what's your definition of a memory HOG?
If I remember that thread correctly, someone posted 3DMark results with CCC open and closed, and it made no difference. Either buy more RAM or use NVIDIA cards; problem solved.
Anyone have Fraps or some other video recording tool? Someone could record CCC starting up with memory usage visible on screen.
Originally posted by: DeathBUA
Originally posted by: 5150Joker
As long as the card is functioning properly with the heat being exhausted out of the system, who cares how hot it gets? Sure, it eats up extra power, but considering most enthusiasts have more than ample PSUs, that's another non-factor.
Well, I was just checking, because you know that extra heat can cause problems inside your case if it's not exhausting air properly (I doubt Joe Blow has decent cooling in his system). But if you want to live with the extra heat and power, then go ahead and buy it.
Personally, I wouldn't want it in my system, especially considering that my computer is in a small room as it is, and the room gets hot with my current hardware....and that's with my Venice loading at 41C and my 7800GT loading around 65-68C. I can't imagine what it'd be like if my video card were exhausting out of my case and running at 85C.
Originally posted by: 5150Joker
Translation: Oh snap, I got caught lying (again).
Originally posted by: vaccarjm
Originally posted by: 5150Joker
Originally posted by: vaccarjm
Originally posted by: 5150Joker
Originally posted by: Ronin
Originally posted by: 5150Joker
This coming from a guy that falsified CCC memory results in order to make ATi look bad, moderates nVidia's nZone forums and rides Sony's jockstrap as a test monkey to get free nVidia cards. Yeah Ronin, you're one to be talking about neutrality. If I'm a fan of ATi cards that's all I am, a fan, not someone that like you who depends on video card welfare.
Get off your abrasive high horse, bud. My results were accurate (and validated by other people, I might add). Yeah, I moderate the nZone forums, but bfd, really. And as far as my employer is concerned (which you STILL haven't figured out, regardless of what you *think* you know), I get the same treatment from ATi as I do nVIDIA (which hasn't gotten through your thick skull, hence a perfect example of you seeing what you WANT to see).
Like I said. No leg to stand on, and the way you present yourself certainly doesn't win any points. Grow up, Joker. You're a joke, and everyone knows it.
Your results weren't validated by anyone, you tool. They were proven to be b.s. in that thread; should I dig it up and make you look like the liar you are again? I already know who your employer is and how you get your cards. You don't get any "treatment" from ATi, your employer Sony gets cards to test their games. It just happens that you're one of the test monkeys at work that gets to use them.
Edit: Ronin's CCC FUD:
His screenshot: http://server.counter-strike.net/images/misc/atimemusage2.png
Actual CCC usage: http://img436.imageshack.us/img436/2241/cccopen8vn.jpg
Conclusion: Ronin took a quick shot of CCC during its split-second startup memory spike to falsely claim it uses that much memory the whole time CCC is open. My SS proves Ronin is a liar - as expected.
How old are you?
Old enough and likely smarter than you.
Intelligent response.
So let me guess....19 and finishing your 1st year at your local JC?
Originally posted by: Ronin
Those numbers are acceptable to me. It would seem that the integrated market is where ATi pulls ahead, and I'm comfortable with that.
Thanks for spending the time to check it out.
Originally posted by: Matthias99
Ooh, geek fight! Anybody bring popcorn?
Seriously, guys, take it to PMs if you want to have a pissing match.
Originally posted by: MyStupidMouth
Originally posted by: DeathBUA
What about the power usage? And what about the heat given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man....burn-your-hand hot...
80-90C isn't a factor for the GPU. It's rated at that and higher.
Originally posted by: Ronin
Originally posted by: 5150Joker
Translation: Oh snap, I got caught lying (again).
You really should learn not to attack people when you don't have a leg to stand on. It just underscores your ignorance and immaturity. Give it up.
I'm continually looking through this thread and I'm laughing my ass off at the arguments being made.
"It's ok that it's hot" - Temps don't matter, right?
"It's ok that it uses more power" - Power consumption doesn't matter, right?
"It's ok that the standard default control panel they're using has a commit charge of over 100MB" - RAM and/or VM usage doesn't matter, right?
And the list goes on. You guys are a riot.
Originally posted by: MyStupidMouth
Originally posted by: Ronin
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.
Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.
Maybe for other cards, but I'm running mine at 690MHz/800MHz, it gets to around 80-90C, and I see no artifacts. Again, you can't compare it to other cores, as it's made to get that hot.
Originally posted by: Ronin
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.
Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.
Never thought I'd QFT a Ronin post.
Originally posted by: Ronin
Yes, because you say I have no credibility, I don't. Talk about ego. LOL. You realize people consider you a BAD version of Rollo, right? And you're in select company, lemme tell you.
Keep proving my point, Joker. You're doing a stellar job.
Originally posted by: tuteja1986
Err, the XT is the better choice, but nVidia got everyone hooked on SLI, which I think is pretty stupid :! Spend your cash on something else, I say, instead of another card.