The Inq Calls the 7900 a 7800GTX 512 Repeat


5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX and ATi the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.

I want to be clear on this, Joker - you are going on record saying that the GeForceFX series was in fact superior to the R3x0-based parts? That the FX5800, being louder, hotter and offering vastly superior AF, should have been a no-brainer choice over the R9700Pro? You are going on record saying that right now, right?


Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performance on par in D3D overall if not slightly better, competitive pricing with the 7900 GTX, and better availability. The only negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?

What about the power usage? And what about the heat that is given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man... burn-your-hand hot...


As long as the card is functioning properly with the heat being exhausted out of the system, who cares how hot it gets? Sure, it eats up extra power, but considering most enthusiasts have more than ample PSUs, that is another non-factor. Do you reach around to the back of your PC with one hand to get a feel of the heat being exhausted by a vid card while gaming?
 

nts

Senior member
Nov 10, 2005
279
0
0
Originally posted by: Ronin
Both of you need to go back and read the thread where it was discussed. It is a MEMORY HOG, no matter how you spin it. I did MY tests 5 separate times to verify results.

Except that your results don't reflect anyone else's, and you have a pretty clear agenda.

Saying I timed my screenshot is about as valid as most of the arguments you put forth for your coveted ATi.

Starting CCC is the only time mem usage will be high.

btw, what's your definition of a memory HOG?

If I remember that thread correctly, someone posted 3DMark results with CCC open and closed and it made no difference. Either buy more RAM or use NVIDIA cards; problem solved.

Anyone have Fraps or some other screen recording tool? They could record CCC starting up with memory usage visible.
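
Actually, you don't even need video: a few lines of script can sample the process over time, which separates a startup spike from steady-state usage. This is only a rough sketch, assuming Python with psutil installed, and assuming "CLI.exe" (the process name cited later in the thread) is what CCC actually runs as on your machine:

import time
import psutil

PROCESS_NAME = "CLI.exe"   # assumption: CCC's runtime host process
SAMPLES = 120              # one minute of samples at 0.5 s intervals
samples = []

for _ in range(SAMPLES):
    # Sum resident memory across every matching process at this instant.
    rss = sum(
        p.info["memory_info"].rss
        for p in psutil.process_iter(["name", "memory_info"])
        if p.info["name"] == PROCESS_NAME and p.info["memory_info"]
    )
    samples.append(rss)
    time.sleep(0.5)

mb = 1024 * 1024
print(f"peak:   {max(samples) / mb:.1f} MB")
print(f"median: {sorted(samples)[len(samples) // 2] / mb:.1f} MB")

If the peak is high but the median settles low, a "memory hog" screenshot was just the startup spike.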

 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: 5150Joker
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX and ATi the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.

I want to be clear on this, Joker - you are going on record saying that the GeForceFX series was in fact superior to the R3x0-based parts? That the FX5800, being louder, hotter and offering vastly superior AF, should have been a no-brainer choice over the R9700Pro? You are going on record saying that right now, right?


Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performance on par in D3D overall if not slightly better, competitive pricing with the 7900 GTX, and better availability. The only negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?

What about the power usage? And what about the heat that is given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man... burn-your-hand hot...


As long as the card is functioning properly with the heat being exhausted out of the system, who cares how hot it gets? Sure, it eats up extra power, but considering most enthusiasts have more than ample PSUs, that is another non-factor.

Well, I was just checking, 'cuz you know that extra heat can cause problems inside your case if it's not exhausting air properly (I doubt Joe Blow has decent cooling in his system). But if you want to live with the extra heat and power, then go ahead and buy it.

Personally, I wouldn't want it in my system, especially considering that my computer is in a small room as is and it gets hot in the room with my current hardware... that's with my Venice loading up at 41C and my 7800GT loading up around 65-68C. I couldn't imagine if my vid card was exhausting out of my case and running at 85C.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: nts
Originally posted by: Ronin
Both of you need to go back and read the thread where it was discussed. It is a MEMORY HOG, no matter how you spin it. I did MY tests 5 separate times to verify results.

Except that your results don't reflect anyone else's, and you have a pretty clear agenda.

Saying I timed my screenshot is about as valid as most of the arguments you put forth for your coveted ATi.

Starting CCC is the only time mem usage will be high.

btw, what's your definition of a memory HOG?

If I remember that thread correctly, someone posted 3DMark results with CCC open and closed and it made no difference. Either buy more RAM or use NVIDIA cards; problem solved.

Anyone have Fraps or some other screen recording tool? They could record CCC starting up with memory usage visible.



Hell, if someone doesn't want to use CCC, they can opt for Ray Adams' ATi Tray Tools, which uses very little memory and does a fantastic job.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX and ATi the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.

I want to be clear on this, Joker - you are going on record saying that the GeForceFX series was in fact superior to the R3x0-based parts? That the FX5800, being louder, hotter and offering vastly superior AF, should have been a no-brainer choice over the R9700Pro? You are going on record saying that right now, right?


Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performance on par in D3D overall if not slightly better, competitive pricing with the 7900 GTX, and better availability. The only negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?

What about the power usage? And what about the heat that is given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man... burn-your-hand hot...


As long as the card is functioning properly with the heat being exhausted out of the system, who cares how hot it gets? Sure, it eats up extra power, but considering most enthusiasts have more than ample PSUs, that is another non-factor.

Well, I was just checking, 'cuz you know that extra heat can cause problems inside your case if it's not exhausting air properly (I doubt Joe Blow has decent cooling in his system). But if you want to live with the extra heat and power, then go ahead and buy it.

Personally, I wouldn't want it in my system, especially considering that my computer is in a small room as is and it gets hot in the room with my current hardware... that's with my Venice loading up at 41C and my 7800GT loading up around 65-68C. I couldn't imagine if my vid card was exhausting out of my case and running at 85C.


Open your windows if it gets hot in your room. The X1900/X1800 HSF exhausts the heat it produces outside your case so it doesn't affect your PC.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: 5150Joker
Translation: Oh snap I got caught lying (again).

You really should learn not to attack people when you don't have a leg to stand on. It just demonstrates your ignorance and immaturity. Give it up.

I'm continually looking through this thread and I'm laughing my ass off at the arguments being made.

"It's ok that it's hot" - Temps don't matter, right?
"It's ok that it uses more power" - Power consumption doesn't matter, right?
"It's ok that the standard default control panel they're using has a commit charge of over 100MB" - RAM and/or VM usage doesn't matter, right?

And the list goes on. You guys are a riot.
 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
I live in Michigan; oftentimes it's hard to open a window without freezing my ass off and sending the heating bill sky high... or opening my window and screwing up the AC. Right now is about the only time of year I can open my window without that... except it's still 30 degrees (Fahrenheit) outside.

I know the HSF exhausts outside your case, but that heat isn't just on the heatsink; it's on the BACK of the card as well. So if the components on the backside are heating up too, there generally isn't anything to exhaust that heat. Usually it wafts up to your CPU cooler. Otherwise, yeah, it exhausts out the back.
 

Jules

Lifer
Oct 9, 1999
15,213
0
76
Originally posted by: vaccarjm
Originally posted by: 5150Joker
Originally posted by: vaccarjm
Originally posted by: 5150Joker
Originally posted by: Ronin
Originally posted by: 5150Joker
This coming from a guy who falsified CCC memory results in order to make ATi look bad, moderates nVidia's nZone forums and rides Sony's jockstrap as a test monkey to get free nVidia cards. Yeah, Ronin, you're one to be talking about neutrality. If I'm a fan of ATi cards, that's all I am - a fan, not someone who, like you, depends on video card welfare.

Get off your abrasive high horse, bud. My results were accurate (and validated by other people, I might add). Yeah, I moderate the nZone forums, but BFD, really. And as far as my employer is concerned (which you STILL haven't figured out, regardless of what you *think* you know), I get the same treatment from ATi as I do from nVIDIA (which hasn't gotten through your thick skull - a perfect example of you seeing what you WANT to see).

Like I said. No leg to stand on, and the way you present yourself certainly doesn't win any points. Grow up, Joker. You're a joke, and everyone knows it. ;)


Your results weren't validated by anyone, you tool. They were proven to be b.s. in that thread; should I dig it up and make you look like the liar you are again? I already know who your employer is and how you get your cards. You don't get any "treatment" from ATi; your employer Sony gets cards to test their games. It just happens that you're one of the test monkeys at work who gets to use them.


Edit: Ronin's CCC FUD:

His screenshot: http://server.counter-strike.net/images/misc/atimemusage2.png
Actual CCC usage: http://img436.imageshack.us/img436/2241/cccopen8vn.jpg

Conclusion: Ronin took a quick shot of CCC during its split-second memory spike at startup to falsely claim it uses that much memory the whole time CCC is open. My SS proves Ronin is a liar - as expected.


How old are you?


Old enough and likely smarter than you.


Intelligent response.

So let me guess....19 and finishing your 1st year at your local JC?

I'm sorry, but what are you adding to this thread other than an age comment?
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Ronin, the needle AF of your 512 GTXs, coupled with your green-tinted glasses, has obviously blinded you. Look around: you've been proven a liar by me, and the results were then also confirmed by others. You have no credibility.
 

akugami

Diamond Member
Feb 14, 2005
6,210
2,552
136
Originally posted by: Ronin
I can accept those numbers. It would seem that the integrated market is where ATi pulls ahead, and I'm comfortable with that.

Thanks for spending the time to check it out. :)

Wow, did you realize that in the time it took me to do my search (I did a little work in between, so it wasn't a straight 15 minutes of googling), the page count jumped from 1 to 4? So much BS posted between the post of yours I quoted and my response. Everybody needs to dial it down.

I do, however, agree with Joker in that I've never seen my ATI CCC usage as high as shown in your screenshots. Some have said it's only that high when CCC initially loads and opens, and that after that it drops to acceptable levels. I never tried checking it out, and I'm not going to bother. All I know is that I still consider ATI's CCC bloated... but just this side of manageable, and that I rarely see the three CLI.exe components take more than about 15MB of RAM total.
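
For what it's worth, that kind of point-in-time check is easy to script rather than eyeball in Task Manager. A minimal sketch, assuming Python with psutil installed, and assuming "CLI.exe" (the name cited above) is the right process name:

import psutil

NAME = "CLI.exe"  # assumption: CCC's helper processes
total_mb = 0.0
for p in psutil.process_iter(["pid", "name", "memory_info"]):
    if p.info["name"] == NAME and p.info["memory_info"]:
        # Resident set size: physical RAM the process is using right now.
        rss_mb = p.info["memory_info"].rss / (1024 * 1024)
        total_mb += rss_mb
        print(f"pid {p.info['pid']:>6}: {rss_mb:6.1f} MB")
print(f"total: {total_mb:.1f} MB")

A single run shows what the components cost at that moment; running it before and after opening the control panel shows the difference.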

Either way, I was curious as to the actual market share between ATI and nVidia, and from what I could dig up they're neck and neck. This is contrary to many people's belief (myself included) that nVidia had the majority share of the market.

Now, I'm going to attribute the initial selling out of the 7900 cards to pent-up demand for a new nVidia video card. If in a couple of weeks supply is still very tight, then I'd say nVidia might have a supply problem. I wouldn't quite go to the level of the Inq article and call it a 7800GTX 512MB repeat just yet. After all, the 512MB was so scarce as to be nearly vapor.

Originally posted by: Matthias99
Ooh, geek fight! Anybody bring popcorn?

Seriously, guys, take it to PMs if you want to have a pissing match.

QFT. Repeat QFT.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Yes, because you say I have no credibility, I don't. Talk about ego. LOL. You realize people consider you a BAD version of Rollo, right? ;) And you're in select company, lemme tell you.

Keep proving my point, Joker. You're doing a stellar job.
 

Jules

Lifer
Oct 9, 1999
15,213
0
76
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX and ATi the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.

I want to be clear on this, Joker - you are going on record saying that the GeForceFX series was in fact superior to the R3x0-based parts? That the FX5800, being louder, hotter and offering vastly superior AF, should have been a no-brainer choice over the R9700Pro? You are going on record saying that right now, right?


Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performance on par in D3D overall if not slightly better, competitive pricing with the 7900 GTX, and better availability. The only negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?

What about the power usage? And what about the heat that is given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man... burn-your-hand hot...
80-90C isn't a factor for the GPU. It's rated at that and higher.
 

Jules

Lifer
Oct 9, 1999
15,213
0
76
Originally posted by: Ronin
Originally posted by: 5150Joker
Translation: Oh snap I got caught lying (again).

You really should learn to not attack people when you don't have a leg to stand on. It really validates your ignorance and immaturity. Give it up.

I'm continually looking through this thread and I'm laughing my ass off at the arguments being made.

"It's ok that it's hot" - Temps don't matter, right?
"It's ok that it uses more power" - Power consumption doesn't matter, right?
"It's ok that the standard default control panel they're using has a commit charge of over 100MB" - RAM and/or VM usage doesn't matter, right?

And the list goes on. You guys are a riot.


Temps at 80-90C are fine; this isn't a factor.
Sure, power matters, but if your PSU can't feed the X1900XT, you shouldn't be owning a high-end card in the first place.
The Control Panel works great for me. But I'd have to say I liked my 7900GTX only because of the drivers; I didn't care much for the card itself. Then again, I went from an X1900XT to a 7900GTX and back to my old card.
 

Ronin

Diamond Member
Mar 3, 2001
4,563
1
0
server.counter-strike.net
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.

Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.
 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Originally posted by: MyStupidMouth
Originally posted by: DeathBUA
Originally posted by: 5150Joker
Originally posted by: BenSkywalker
If the roles were reversed and nVidia had the superior-IQ X1900XTX and ATi the needle-in-your-eyes-AF 7900 GTX, ATi would've been lucky to sell a dozen cards.

I want to be clear on this, Joker - you are going on record saying that the GeForceFX series was in fact superior to the R3x0-based parts? That the FX5800, being louder, hotter and offering vastly superior AF, should have been a no-brainer choice over the R9700Pro? You are going on record saying that right now, right?


Are you trying to imply the X1900 is like the FX5800? The 5800 sucked because it was an underperforming card with crappy AA (no RGMS or TRAA back then), and the AF it did have wasn't practical because of the huge performance hit it took. The loud fan + heat were just nails in its coffin. Contrast that with the X1900: superior angle-independent AF that takes a very small hit, HDR+AA, performance on par in D3D overall if not slightly better, competitive pricing with the 7900 GTX, and better availability. The only negative for the X1900 is the fan noise; the heat is a non-factor since it gets pushed out by its HSF anyway. Do you put your ear to the side of your PC with a temperature probe while playing games?

What about the power usage? And what about the heat that is given off from the back of the card? And 80-90C is a 'non-factor'? That's HOT, man... burn-your-hand hot...
80-90C isn't a factor for the GPU. It's rated at that and higher.

:confused:
 

Jules

Lifer
Oct 9, 1999
15,213
0
76
Originally posted by: Ronin
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.

Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.
Maybe for other cards, but I'm running mine at 690MHz/800MHz, it gets to around 80-90C, and I see no artifacts. Again, you can't compare it to other cores, as it's made to get that hot.

 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Ronin
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.

Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.



That's funny, because so far it's the overclocked nVidia 7900 cards that are artifacting and pissing off the many owners who expected a working card for the $500+ they paid. Prove to us that 80-90C causes X1900 cards to artifact, or is this another one of your usual b.s. nVidia-inspired lies? Like I said... zero credibility.
 

rise

Diamond Member
Dec 13, 2004
9,116
46
91
Originally posted by: Ronin
Yes, because you say I have no credibility, I don't. Talk about ego. LOL. You realize people consider you a BAD version of Rollo, right? ;) And you're in select company, lemme tell you.

Keep proving my point, Joker. You're doing a stellar job.
Never thought I'd QFT a Ronin post :p
 

Dethfrumbelo

Golden Member
Nov 16, 2004
1,499
0
0
Do any of you guys really imagine that you're convincing anyone (anyone not affiliated with some marketing company, that is) to go out and buy ATI or nVidia based on these arguments? I think you're wasting your time.

 

TraumaRN

Diamond Member
Jun 5, 2005
6,893
63
91
Well, I just wanna know if I can use Joker's X1900 to cook my eggs on in the morning, 'cuz my stove is broken.
 

Cooler

Diamond Member
Mar 31, 2005
3,835
0
0
I was hoping that after Rollo left, the FUD would stop on the video card forum. I can see now it will never stop unless the forum gets more mods.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Originally posted by: Ronin
Originally posted by: MyStupidMouth
80-90C isn't a factor for the GPU. It's rated at that and higher.

Really? You realize just because it's 'rated' higher doesn't mean anything? Most cards will artifact at that temp, and unless you missed video card school, artifacting is bad, and so is heat.

Heat is bad if the components cannot tolerate it. If the heat output is within the tolerable limits of the components used, then you'd see no issues.

My X1900XTX does not have any artifacts, and reading various boards, I haven't seen this reported as an issue. This particular core and its components handle these temps without any problem, so I'm not sure what you're basing your artifacting conclusion on.
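
If anyone actually wanted data instead of assertions, logging temps against the vendor's rated limit while gaming is straightforward to script. A minimal sketch - purely an illustration of the method, since it assumes a modern NVIDIA card with nvidia-smi available, which the 2006-era cards in this thread predate, and the 90C threshold is just a placeholder for your card's rated maximum:

import subprocess
import time

THRESHOLD_C = 90   # assumption: substitute your card's rated maximum
SAMPLES = 300      # five minutes at one-second intervals

for _ in range(SAMPLES):
    # Query the GPU core temperature; raises if nvidia-smi is missing.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    temp = int(out.stdout.strip().splitlines()[0])
    flag = "  <-- above rated limit" if temp >= THRESHOLD_C else ""
    print(f"{time.strftime('%H:%M:%S')}  {temp} C{flag}")
    time.sleep(1)

Run it during a gaming session: if the log never crosses the rated limit and there are no artifacts, the "it's too hot" argument is about comfort, not correctness.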
 
Mar 14, 2006
60
0
0
Originally posted by: tuteja1986
Err, the XT is the better choice, but nVidia got everyone hooked on SLI, which I think is pretty stupid :! Spend your cash on something else, I say, instead of another card.

'Cause there is no X1900 XL; ATI really needs a $299.99-range card.