Half-Life 2 Performance: Breaking News


element

Diamond Member
Oct 9, 1999
4,635
0
0
Originally posted by: jiffylube1024
Originally posted by: element®
Originally posted by: HigherGround
on the same note...

3. Aquamark is a DX9 game that FXs seem to run fine.

Aquamark (3) is not a game, it's a benchmark.

And what exactly is the difference between a game and a benchmark? Not polling the input devices (keyboard, mouse, joystick, wheel, whatever) for input? Collision detection? Last time I checked, video cards didn't play a part in those.

Actually there are *many* significant differences between a benchmark and a game.

1. Many/most benchmarks don't run with sound enabled. Last I checked, most games are played with sound.

What does this have to do with video cards again?

6. Benchmarks can be created with biases. For example, they can be coded more optimally to run on one architecture or another, or (in a worst case scenario) they can actually have code in them to make them run worse on another platform (on purpose).

So can games.

7. Games are created (usually) to run optimally on all video cards (assuming the company wants to maximize sales). Benchmarks are often not tweaked as well as they can be to get optimal performance out of every card.

I like how you snuck usually in there in parentheses.

8. We should not encourage video card manufacturers (or more specifically, their driver teams, marketing teams, etc) to optimize their cards for benchmarks. Nvidia has an incredible driver team that has essentially worked miracles in the past when focusing on one particular game. For example, they have optimized so well for Quake 3 that it runs at ridiculously high framerates, even on older cards. ATI also has an excellent (and ever improving) driver team that can get very good performance out of their cards. Remember how poorly the Radeon 8500 ran at launch? It was slower than the GeForce 3 despite many significant technical advantages. The Radeon 8500 is now in a class above the GF3, just because the driver team put so much effort into their Catalyst drivers.

Would you prefer that Nvidia and ATI focus their efforts on making themselves look good on all the synthetic benchmarks out there (so people buy their cards), or would you prefer that they work as hard as they possibly can on making the newest/greatest games run as fast as possible? Again, they only have so many development resources, and any resources they use to optimize for benchmarks could have been put to better use elsewhere.

I agree they should focus on games. And they do. You don't suppose NVidia might come up with a card and driver set that will eat the latest from ATI for lunch in HL2 for the next product cycle do you? :D
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Why do you need validation from internet users regarding your purchase, Rollo?
I don't need validation, I said I found the backlash I received as the owner of a 5800 amusing. People came out of the woodwork to slam me as "clueless" for buying it, foolish to give up the 9700 Pro's "superior" IQ, etc., ad infinitum. All because I was bored with my old card and wanted to try a new one.
There's a big difference between this and wanting validation.

I couldn't care less what anyone but me thinks of my hardware.

I do wish I had kept the 5800 a bit longer, though. The 9800 isn't that huge of an upgrade for my needs, and the nV40/R400 are around the corner.
 

alloutofbubblegum

Junior Member
Feb 1, 2003
2
0
0
In the news today....

Nvidia sells chips for 10 cents. High-end Nvidia cards will now sell for $80 US.

Yes, ATI is faster, but why develop games for 10 people when Nvidia chips are installed in thousands of computers?

Apparently, people are playing online solitaire.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Rollo
Where did you see Doom 4 is using DX7?

I assume you mean 3, and a lot of it is DX7-class stuff; while it does make use of a bit of DX9 tech, it is considered to be feature-complete on DX8 hardware.

 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: element®
Originally posted by: jiffylube1024
Actually there are *many* significant differences between a benchmark and a game.

1. Many/most benchmarks don't run with sound enabled. Last I checked, most games are played with sound.

What does this have to do with video cards again?

Sound processing takes CPU clock cycles that could otherwise be spent feeding the GPU with video data.
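
To illustrate the budget argument, here's a toy C++ sketch - the millisecond costs are made up, but it shows how a fixed per-frame audio cost eats into the CPU time left to prepare work for the GPU:

#include <chrono>
#include <iostream>
#include <thread>

// Toy frame loop. The costs are invented for illustration; the point
// is that audio mixing and draw-call preparation share the same
// per-frame CPU budget.
static void spend(int ms) {
    std::this_thread::sleep_for(std::chrono::milliseconds(ms));
}

int main() {
    for (bool sound : {false, true}) {
        auto t0 = std::chrono::steady_clock::now();
        if (sound) spend(3);   // CPU-side sound mixing (benchmarks often skip this)
        spend(12);             // CPU-side work that feeds the GPU
        auto dt = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::cout << (sound ? "sound on:  " : "sound off: ") << dt << " ms/frame\n";
    }
    return 0;
}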

6. Benchmarks can be created with biases. For example, they can be coded more optimally to run on one architecture or another, or (in a worst case scenario) they can actually have code in them to make them run worse on another platform (on purpose).

So can games.

Yes, but a game developer would be moronic to cripple game performance on 40-60% of the market's video cards by tampering with performance on ATI or Nvidia hardware. And don't give me the crap about next-gen games running slow on all machines - someone has to push the envelope.

7. Games are created (usually) to run optimally on all video cards (assuming the company wants to maximize sales). Benchmarks are often not tweaked as well as they can be to get optimal performance out of every card.

I like how you snuck usually in there in parentheses.

Thanks.

I agree they should focus on games. And they do. You don't suppose NVidia might come up with a card and driver set that will eat the latest from ATI for lunch in HL2 for the next product cycle do you? :D

Of course Nvidia will come up with something that will eat ATI's 9800 for lunch. And if they actually follow DX9 this time, then it will be within the usual +/- 10-15% of ATI's next-gen part. Depending on who's first out of the gate, the second one is *usually* the faster one.
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
Originally posted by: sandorski
Originally posted by: rogue1979
Originally posted by: cmdrdredd
Originally posted by: rogue1979
Originally posted by: sman789
ahhh, 9600 pro made me happy again

The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra;


Gee, according to the article the 5900 wasn't doing very well. So if the 9600Pro has about equal performance in the same game, it doesn't sound like something to be happy about.

If you check Beyond3D for benchmarks you'll see that the 9600 is much better than Nvidia at the default codepath.

I don't think I need to check benchmarks to see that this biased article says in the same sentence that the 5900 Ultra is "horrendously slower", then goes on immediately to say the 9600Pro "performs quite well", all while saying that the performance in Half-Life 2 is close. What does that mean? I don't think there are gonna be many benchmarks where a 9600Pro, overclocked or not, beats the 5900 Ultra. I own a Radeon 9500, a 9700Pro, and 3 fast GeForce4 Ti 128MB cards, so I don't really care who is faster. But it is hard not to comment on the heavily biased statement quoted above and then the incorrect conclusion from sman789 that the 9600Pro is actually in direct competition with the 5900 Ultra.

Context. The 5900U is meant to compete with the 9800Pro and is priced accordingly. The 9600Pro is meant to compete with the 5600 and priced accordingly. So when the 2x-price 5900U gets beat by the 1/2x-price 9600Pro, how can one help being unimpressed with the 5900U and impressed by the 9600Pro?



Yes, I understand the context. But one benchmark does not make a video card. This is insignificant unless Half-Life 2 is the only video game that people play, or all future games show similar results. That doesn't seem likely to happen.

To put things in perspective, the flip-chip 5600 Ultra and 9600 Pro are very close in price and performance, both good cards. While the Radeons without a doubt have a large performance lead in Half-Life 2, this obviously stems from some type of driver optimization, not any kind of superior hardware performance from ATI. Like I said before, both Radeon and GeForce are fine cards, very competitive in their particular price ranges for the most part. Half-Life 2 should in no way be considered the standard benchmark for the age-old Nvidia vs. ATI contest. If you took 100 popular gaming titles that people are still playing (both old and new) and rolled them into a giant benchmark, one game would only change the overall result by about 1%. Let's keep things in perspective.
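
A quick back-of-the-envelope check of that 1% figure (a sketch, assuming an equal-weighted average of normalized scores):

#include <iostream>
#include <vector>

// 100 titles normalized to parity (1.0); suppose one title runs 70%
// slower on a given card. With an equal-weighted average, the
// aggregate barely moves.
int main() {
    std::vector<double> scores(100, 1.0);
    scores[0] = 0.30;  // the single outlier title
    double sum = 0.0;
    for (double s : scores) sum += s;
    std::cout << "aggregate: " << sum / scores.size() << '\n';  // prints 0.993
    return 0;
}

Even a 70% deficit in one title only moves the equal-weighted aggregate by about 0.7%.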
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: rogue1979
Originally posted by: sandorski
Originally posted by: rogue1979
Originally posted by: cmdrdredd
Originally posted by: rogue1979
Originally posted by: sman789
ahhh, 9600 pro made me happy again

The 5900 ultra is noticeably slower with the special codepath and is horrendously slower under the default dx9 codepath;
- the Radeon 9600 Pro performs very well - it is a good competitor of the 5900 ultra;


Gee, according to the article the 5900 wasn't doing very well. So if the 9600Pro has about equal performance in the same game, it doesn't sound like something to be happy about.

If you check Beyond3D for benchmarks you'll see that the 9600 is much better than Nvidia at the default codepath.

I don't think I need to check benchmarks to see that this biased article says in the same sentence that the 5900 Ultra is "horrendously slower", then goes on immediately to say the 9600Pro "performs quite well", all while saying that the performance in Half-Life 2 is close. What does that mean? I don't think there are gonna be many benchmarks where a 9600Pro, overclocked or not, beats the 5900 Ultra. I own a Radeon 9500, a 9700Pro, and 3 fast GeForce4 Ti 128MB cards, so I don't really care who is faster. But it is hard not to comment on the heavily biased statement quoted above and then the incorrect conclusion from sman789 that the 9600Pro is actually in direct competition with the 5900 Ultra.

Context. The 5900U is meant to compete with the 9800Pro and is priced accordingly. The 9600Pro is meant to compete with the 5600 and priced accordingly. So when the 2x-price 5900U gets beat by the 1/2x-price 9600Pro, how can one help being unimpressed with the 5900U and impressed by the 9600Pro?



Yes, I understand the context. But one benchmark does not make a video card. This is insignificant unless Half-Life 2 is the only video game that people play, or all future games show similar results. That doesn't seem likely to happen.

To put things in perspective, the flip-chip 5600 Ultra and 9600 Pro are very close in price and performance, both good cards. While the Radeons without a doubt have a large performance lead in Half-Life 2, this obviously stems from some type of driver optimization, not any kind of superior hardware performance from ATI. Like I said before, both Radeon and GeForce are fine cards, very competitive in their particular price ranges for the most part. Half-Life 2 should in no way be considered the standard benchmark for the age-old Nvidia vs. ATI contest. If you took 100 popular gaming titles that people are still playing (both old and new) and rolled them into a giant benchmark, one game would only change the overall result by about 1%. Let's keep things in perspective.
The point is, it's NOT just HL2. It's DX9 games. HL2 is just one of them.

 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
The point is, it's NOT just HL2. It's DX9 games. HL2 is just one of them.


I don't think there are really enough DX9 games out to make that assumption, are there? If you look at the hardware breakdown, both ATI and Nvidia should be very competitive. While the ATI drivers are working better in HL2, I certainly wouldn't expect Nvidia to sit on their hands and let ATI make faster drivers for much longer. Then again, in my humble opinion, ATI's fastest drivers are always released at the expense of some compatibility. Nothing to be worried about, just not quite as easy to work with as Nvidia - remember, I have both. ;)
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: Rollo
Doom3 is not a significant DX9 game or DX at all for that matter. It is an OpenGL game.
LOL @ myself, Old Fart. Good point, I forgot. Actually I'd say my OGL games outnumber my DX games at least 5:1, if not more. They're all based on Quake engines, except for the Mechs, Unreals, and Max Payne (and even the Unreals can go OGL).

This is a VERY interesting point the more I think about it. How relevant is DX9 for the fps player? Everyone uses Carmack's game engines. Damn. I better cancel my "superior 9600 Pro" order.

HL1's graphics were pretty decent for their day.
We'll have to agree to disagree there. Everything was pretty angular, and the lighting effects were cheesy. All it really had for me was the fun of interacting with the characters. Didn't think the monsters were scary. The Fiend in Q1 was a damn fine monster; when those things first jumped you and started slicing away, it was pretty tense. The chainsaw ogres were good too, as was the shambler.

Aquamark (3) is not a game, it's a benchmark.
True, but Aquanox2 and Spellforce will be games with the same engine, so is there any difference in this case?

Returning my new unopened GeForce 5900 Ultra. Today is the last day to return this thing.
Don't do it, Parrothead! I'll trade you a superior 9600Pro for it, unopened also! You'll be all set for DX9 and better off, just like BFG says! He is wise and respected; listen to him. PM me and we'll exchange addresses and Heat, and I'll set you up right with some (POS) ATI 9600 goodness!

 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: RogerAdam
Originally posted by: Rollo
Doom3 is not a significant DX9 game or DX at all for that matter. It is an OpenGL game.
LOL @ myself, Old Fart. Good point, I forgot. Actually I'd say my OGL games outnumber my DX games at least 5:1, if not more. They're all based on Quake engines, except for the Mechs, Unreals, and Max Payne (and even the Unreals can go OGL).

This is a VERY interesting point the more I think about it. How relevant is DX9 for the fps player? Everyone uses Carmack's game engines. Damn. I better cancel my "superior 9600 Pro" order.

HL1's graphics were pretty decent for their day.
We'll have to agree to disagree there. Everything was pretty angular, and the lighting effects were cheesy. All it really had for me was the fun of interacting with the characters. Didn't think the monsters were scary. The Fiend in Q1 was a damn fine monster; when those things first jumped you and started slicing away, it was pretty tense. The chainsaw ogres were good too, as was the shambler.

Aquamark (3) is not a game, it's a benchmark.
True, but Aquanox2 and Spellforce will be games with the same engine, so is there any difference in this case?

Returning my new unopened GeForce 5900 Ultra. Today is the last day to return this thing.
Don't do it, Parrothead! I'll trade you a superior 9600Pro for it, unopened also! You'll be all set for DX9 and better off, just like BFG says! He is wise and respected; listen to him. PM me and we'll exchange addresses and Heat, and I'll set you up right with some (POS) ATI 9600 goodness!

Duh! I can't find an edit option (not used to this board layout).

Some interesting read here.

As for Aquamark, go READ their pages: you CANNOT consider it a pure DX9 benchmark, as it will use PS 1.1-1.4 depending on what it detects as capability (see the sketch after the quote below); that said, the FX WILL be using PS 1.4 (w/lower precision) to run it. Unless Nvidia can HONESTLY release a driver that runs DX9 (pure), the FX series gets killed by the Radeon 9500/9600/9700/9800s in EVERY DX9 benchmark, game, or beta to date, and there is NO DRIVER that can fix it for DX9; the ONLY fix is to run DX9 games/benchies as DX8.1, as Nvidia has done with its "optimized" drivers and as Valve did when trying to help the crippled DX9 FX series work with HL2. As per a developer:

"Note the Actual problem here lies in the Bottom line Raw Rendering capabilities of Each hardware. ATi can do 8 FP ops and 8 Texture ops in the same Clock in paralell while Nvidia cannot. Its that Simple.

This has nothing to do with one being "8 Bits" and the Other being "4bits". Nor is it a Bug in DX9. There is an Update to DX9 on the way that will help with Shader Code Generation for Nvidia hardware. But even that is not going to make up the Difference between the two. It has nothing to do with a missing "Channel" Or anything else.

Any time that Nvidia has to Face Texture and Pixel Shader Work in the Same Clock they are at an Extreme Disadvantage. Which is the case in every single game released. Keep in mind here that this article is dealing with the Idea of Pure Peak Performance. Which never, ever happens."
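
For what it's worth, this kind of capability probe is presumably what Aquamark does at startup to pick its PS 1.1/1.4/2.0 codepath. A minimal Direct3D 9 sketch (my own illustration, not Aquamark's actual code; Windows-only, link against d3d9.lib):

#include <windows.h>
#include <d3d9.h>
#include <cstdio>

// Ask the driver which pixel shader version the HAL device exposes.
// An engine picks its shader codepath from this, which is why a
// "DX9 benchmark" can end up running very different workloads on
// different cards.
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("PS 2.0 path available\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            std::printf("falling back to PS 1.4\n");
        else
            std::printf("falling back to PS 1.1\n");
    }
    d3d->Release();
    return 0;
}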


This means the NV3x can NEVER fix its shader (PS 2.0) problem with a driver; it's a hardware problem. What is really happening is per-application-specific fixes: EVERY TIME a new DX9 benchmark or title is ready to ship or goes public, Nvidia MUST "optimize", i.e. make their drivers revert to DX8.1, just to compare with a Radeon DX9 part running DX9 - and that, folks, is invalid. That is why Valve jumped all over Nvidia: if Valve had never said anything, Nvidia would've "optimized (DX8.1)" their drivers to pass ATI by a few FPS and plastered that benchmark score everywhere as a DX9 comparison, when in reality they are being smoked in pure DX9 comparisons in EVERYTHING, including HL2.

Another thing (this relates to the beloved Doom3): ATI using the ARB2 path in OGL SMOKES Nvidia. Carmack has to specially code an NV3x path because, in HIS OWN WORDS, the "NV3x is very slow using the ARB2 path". What's funny are the remarks "wait til D3" - well, just as with DX9, the NV3x NEEDS HELP for it to perform; it simply CANNOT use the default path. So Carmack is doing exactly what Valve did: he's dumbing down the API, not using advanced shaders as much, and using lower-precision FX12 w/ a little FP16, while ATI's cards simply use ARB2 (all the effects) at the higher FP24 precision. You can find this from the source (id) itself.
With the FX series the concept of "future-proofing" just is not there; it's barely "present-proof".
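
Roughly what that path selection looks like in a renderer - a hypothetical sketch in the spirit of Doom3's backend picker (assumes a current OpenGL context; the real engine also checks more extensions and user cvars):

#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// Pick a fragment path from the extension string: the generic ARB2
// path (full-precision ARB_fragment_program, which R3xx runs well),
// or a vendor path where ARB2 is known to be slow (NV3x).
static const char* pickFragmentPath() {
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext) return "fixed-function";
    bool hasNvFP  = std::strstr(ext, "GL_NV_fragment_program")  != nullptr;
    bool hasArbFP = std::strstr(ext, "GL_ARB_fragment_program") != nullptr;
    if (hasNvFP)  return "NV30 path (FX12/FP16, reduced precision)";
    if (hasArbFP) return "ARB2 path (full precision)";
    return "ARB path (register combiners era)";
}

int main() {
    // Context creation is omitted; glGetString needs a current context.
    std::printf("fragment path: %s\n", pickFragmentPath());
    return 0;
}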

THIS IS FACT:

DX9 - ATI runs it native; Nvidia is painfully slow native but can run it as DX8.1 (whether the ISV does it, or NV does it with their "*optimizations").

OGL - ATI runs the native ARB2 path; again, NV can too, but it's "very slow" (JC), so again bringing down the API is the solution.

*optimizations - What gets me about this is that in their "statement" about Valve, they ADMIT that replacing shaders and lowering precision (FP16) is OK and does NOTHING to the "experience" on their h/w. Does anyone remember the 3DMark03 controversy a few months back? Wasn't it Nvidia that COMPLAINED there was too much PS 2.0 in the benchmark, further stating that it made 3DMark invalid? Well, now they've done a complete 180 on the matter (re: HL2 statement from NV themselves about replacing 2.0 shaders w/1.4). Can you NOW see the PR damage control (using the above)?

How does anyone ACTUALLY believe it will turn around, when the only solution is to run DX9 and OGL BELOW their specs?

Why does NV need to be helped along at all? OGL and DX are STANDARDS; it should just plain run on the default paths (i.e. like it does on ATI).
 

oldfart

Lifer
Dec 2, 1999
10,207
0
0
Originally posted by: rogue1979
The point is, it's NOT just HL2. It's DX9 games. HL2 is just one of them.


I don't think there are really enough DX9 games out to make that assumption, are there? If you look at the hardware breakdown, both ATI and Nvidia should be very competitive. While the ATI drivers are working better in HL2, I certainly wouldn't expect Nvidia to sit on their hands and let ATI make faster drivers for much longer. Then again, in my humble opinion, ATI's fastest drivers are always released at the expense of some compatibility. Nothing to be worried about, just not quite as easy to work with as Nvidia - remember, I have both. ;)
Not that many, but there are a few. Each one I have seen benched shows low performance on nV HW. If nV HW is not DX9 compliant, how is any DX9 game supposed to run well on it?
 

OmegaRedd

Banned
Sep 14, 2003
143
0
0
Originally posted by: DefRef
This whole thread is proof that some people just HATE NVIDIA to the exclusion of all reason AND in defiance of any claims the Fanboys make about wanting "competition". When Nvidia ruled the roost, people swore that 3dfx, then ATI, had to be kept alive to prevent Nvidia's "monopoly" (fess up, that's what it was called) from keeping prices high and quality low.

But, now that ATI is King, these same people are now shoveling dirt on Nvidia and cackling about how they're the smartest people in the world for contributing to their "downfall". How many posts have declared that there is NO WAY Nvidia will ever come back with anything good? Do these delusional Fanboys remember that ATI was the b*tch until the 9700 came out or are they just not admitting it?
Nvidia owned ATI from 1998-2002 - that's five years. ATI's been comparable for a year and you'd think they invented the transistor for all of the giggling from the Fanboys.

ATI and Nvidia are the Ferrari and Lamborghini of the video world - while one may be faster than the other, it's not like the 2nd place car is a hooptie, that is, unless you're a rabid fan of whichever card ISN'T an Nvidia. ATI fans used to brag about the 2D quality - "Who cares if my card is 40% slower and has broken drivers that bork all the games I play? My Excel looks SHARP!" - when they couldn't compete on speed. Now that ATI can fuel their d*ck-measuring egos, they're just that much more obnoxious.

Finally, does anyone think it's odd that in most benchmarks the top cards from both finish within 10 percent of each other, but suddenly the Nvidia part is HALF as fast? Combined with the reach-around that ATI and Valve are giving each other AND the event being put on by ATI, the lack of critical questions being asked as to whether Nvidia is getting smeared shows just how uninterested in FACTS the Fanboys are. Who cares if it's untrue if it supports their beloved ATI?

1. ATI staged the event.

2. ATI and Valve are seriously in bed with each other.

3. Valve has been leaking comments that ATI was better. (Remember when Carmack used to trash ATI drivers as not following specs? At least he's independent.)

4. Nvidia has a new driver revision pending and tried to get betas used for comparison, but Valve refused, making vague accusations of "out-of-hand optimizations" without specifying what those optimizations do.

The last point is key: While the Fanboys wank themselves furiously and yelp that Nvidia has been busted for cheating, what's to say that the major crime the new drivers commit is BEING AS FAST AS THE ATI DRIVERS? God forbid Valve has to back down from all their smacktalk about Nvidia in support of their cash masters at ATI.

What's been missing from all of this is SPECIFIC DETAILS about what the heck is going on. All we get is vague murmurs about how crappy Nvidia's parts and drivers are and how ATI r00lz cuz they follow the DX9 spec, but HOW DO WE KNOW THAT? Just because a video company and a game maker who are in bed together say so?

INDEPENDENT investigation is needed and, so far, all I've seen are the Valve supplied benchmarks that were tailored to show ATI in the best light. Aren't you Fanboys just the slightest bit interested in the truth or is your Nvidia hatred so all-consuming that you can't help yourselves?

I was a die-hard fan of Nvidia, but let's keep it real. For DirectX 9 games the 5900 sucks. It's slow in Tomb Raider, HL2, and Doom 3. That's right, D3. id also had to make a special code path w/reduced quality to get the card up to speed. You are mad, like I am, because you spent hard cash on a product that's not up to spec. There is no excuse for making a $500 card that needs special attention from developers and a PR team just to keep up at lower quality settings. All dem bumbo clot wan do is a tell lies. They're bending over the gamers who are spending upgrade dollars on a DX 8.1 card in DX9 clothes. Somebody finally got the balls to tell the truth, and now the sky is falling. It takes big stones to talk bad about Nvidia and their 100 million+ users. Get your head out of the sand. You are not a DX9 game developer. If you don't have a crappy monitor, take a close look at the screen. I don't know how old you are, but I've been playing games since before the Glide API and the 3dfx Voodoo 2. It's ironic that as soon as they merged/acquired/whatever with 3dfx, this is what happens. Just like back then, there was hardware that didn't ship on time and fell behind, while all the fans (myself inc.) waited and flamed Nvidia, the underdog. I like Nvidia, but I don't like their f**ked up attitude that we are all dumb and a driver is going to fix this broken hardware - then they can sell the new improved DX9-compliant $500 wonder card that we should have had this year. I returned my 5900 for a Hercules 9800 Pro, and guess what, it does look better, plus it was cheaper and OC'd better. Would you defend a car dealer that cheated you? Hell no. Chi-chi man, wake up and smell what Nvidia is smearing in all the fanboys' faces.
 

Rogodin2

Banned
Jul 2, 2003
3,219
0
0
9800PRO

No question.

I'd buy a 5900 if I were very rich and wanted to test it out, but I wouldn't rely on it for good performance for the next 5 months.

rogo
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: Rogodin2
9800PRO

No question.

I'd buy a 5900 if I were very rich and wanted to test it out, but I wouldn't rely on it for good performance for the next 5 months.

rogo

Well, I bought the FX5800U when it came out, couldn't stand the noise, so I bought a 9800P to replace it. I still have the 5800U (back in its box) and I may keep it since they dropped the line entirely - kind of like a collector's item.

I had 3 5900U's on other machines at home and replaced ALL of them with 9800P's; they're just better IMO. The 4 9800P's, 2 9700P's, and a 9600P are only the second time EVER I've bought ATI cards; the first time was the Mach64 (which I hated, BTW). I have bought Nvidia boards for all my machines at home since the Riva128, and I still have several Ti4600's running. IMO, the FXes are turds simply because they don't deliver what they're marketed as (DX9+, OGL 1.4+); they're good to excellent (from low to high end) "DX8.1+" boards, though.

 

sodcha0s

Golden Member
Jan 7, 2001
1,116
0
0
Jesus, all this BS is getting so old. All because the supposed king of 3D is getting caught with their pants down. I remember when 3dfx was king o' da hill, and all the Nvidia fanboys were out in full force at the beginning of their demise. Now that Nvidia has fallen behind, it's the same story with the ATI fans. Face it, Nvidia has really screwed up in a big way. They released "dx9" hardware that evidently really isn't up to the task. I would be pissed if I had spent $500 on their card, only to find out that it didn't perform as advertised. I'd feel the same way with ANY vendor's card I bought. Maybe the 5900 Ultra will perform better in D3, or all OpenGL games for that matter. But I bet it won't perform 70% better.

I am not a fanboy of either side; I buy what I think is the best at the time. I couldn't care less who's making the card, I just want the best overall value for my money. Right now, ATI is making the best. Get over it.
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
You poor Nvidia people are getting so defensive when you find out that a big-name game is gonna run considerably better on another platform. It's rather amusing.

I am PISSED at nVidia, period. I am not an nVidia person. I sold my FX5800 at a huge loss due to their mishandling of the nV30, and not even bothering to return my emails about it, and wasted money on a 9800Pro.
They can go out of business today for all I care, their customer service sucks, as does their driver cheating in UT2003, my favorite game.

I am a gamer however; and this ATI circle jerk over the benchmarks of one unreleased game is ridiculous.

You still haven't shown any proof that you have a 9800 pro, and until you do, I'll call you what you are - a compulsive liar who is trying to build up credibility by claiming he owns a 9800 pro. People that show off about their "credit cards, bank accounts, boats, 2 bachelors degrees" etc. are usually insecure liars such as yourself.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
You still haven't shown any proof that you have a 9800 pro, and until you do, I'll call you what you are - a compulsive liar who is trying to build up credibility by claiming he owns a 9800 pro. People that show off about their "credit cards, bank accounts, boats, 2 bachelors degrees" etc. are usually insecure liars such as yourself.

I think I'll just call you what I think you are: a pesky dumba$$ with a computer.
Think about it, dimbulb: let's say I jump through your little hoop and post a picture of me holding my 9800 Pro and the box it came in, in front of my boat, with my BS Business and BA Psych hanging on the wall behind me. What would that be proof of? You'd just post, "That is done with Photoshop!" or "That could be anyone, how do we know it's you?" or something else of that ilk. We'd be back to square one, you calling me a liar, me saying I'm not.

It's interesting you didn't have any problem believing I bought a 5800, but you won't believe I sold it and bought a 9800. LOL, like you're in some elite club of 9800 owners that anyone with a job can't join. What a punk.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
http://www.gamespot.com/pc/action/doom3/news_2689004.html
If you're going to post a link at least read it first and understand the issue at hand.

Where did you see Doom 4 is using DX7?
I didn't say it's using DirectX 7, I said it's primarily using the DirectX 7 featureset with parts of the DirectX 8 featureset. If you can't understand the difference (like I suspect you can't) then you really have no business even trying to argue with me.

Anyone who posts something as totally inane as "the 9600 Pro is even superior to the 5900Ultra", just because it may run some games that aren't out yet better, is in no position to talk about "clueless".
You really don't have any clue, do you?

Developers, 3D editors/reviewers, GPU manufacturers - people twenty times smarter than you who actually know what they're talking about - have been analysing the entire issue for the last 6-12 months and the overwhelming evidence is that NV3x has extreme problems running PS 1.3/1.4 and even bigger problems with PS 2.0, evidence that is backed up by comprehensive technical analysis and testing.

Then up pops Rollo, the guy who "upgraded" his 9700 Pro to a 5800, and suddenly he debunks the entire situation in one fell swoop. Do you honestly expect anyone to take you seriously, and not to react negatively, when you continue to post such garbage?

Still using that 9700 eh, BFG? It's worth the cost of a 9800 to me just to have a better card than you, given how rude you are.
rolleye.gif


If you don't want to constantly get slammed then stop these asshat posts of yours and start making some sense.
 

RogerAdam

Member
Oct 14, 2001
29
0
0
Originally posted by: BFG10K
http://www.gamespot.com/pc/action/doom3/news_2689004.html
If you're going to post a link at least read it first and understand the issue at hand.

Where did you see Doom 4 is using DX7?
I didn't say it's using DirectX 7, I said it's primarily using the DirectX 7 featureset with parts of the DirectX 8 featureset. If you can't understand the difference (like I suspect you can't) then you really have no business even trying to argue with me.

Anyone who posts something as totally inane as "the 9600 Pro is even superior to the 5900Ultra", just because it may run some games that aren't out yet better, is in no position to talk about "clueless".
You really don't have any clue, do you?

Developers, 3D editors/reviewers, GPU manufacturers - people twenty times smarter than you who actually know what they're talking about - have been analysing the entire issue for the last 6-12 months and the overwhelming evidence is that NV3x has extreme problems running PS 1.3/1.4 and even bigger problems with PS 2.0, evidence that is backed up by comprehensive technical analysis and testing.

Then up pops Rollo, the guy who "upgraded" his 9700 Pro to a 5800, and suddenly he debunks the entire situation in one fell swoop. Do you honestly expect anyone to take you seriously, and not to react negatively, when you continue to post such garbage?

Still using that 9700 eh, BFG? It's worth the cost of a 9800 to me just to have a better card than you, given how rude you are.
rolleye.gif


If you don't want to constantly get slammed then stop these asshat posts of yours and start making some sense.


Hey BFG10K
You forgot to mention that currently HDR is not implemented in the FX series drivers.

If they ever bother turning on HDR and run a "PURE DX9" driver, the performance issue the FX series has gets worse.

D3 in OGL STILL needs to be "optimized (NV TM)" by Carmack to be playable, as Carmack HIMSELF has stated that the NV3x runs the ARB2 path "very slow". So he MUST do what Valve did and dumb down the API, dropping its precision to FX12 w/ a little FP16, to a point that balances IQ with performance (playability). The ATI boards run ARB2 natively at a higher precision (FP24) and won't be getting the special attention the NV FX series needs. The trade-off is that the NV FX should run a little faster, and the ATI boards should offer higher IQ.
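
To put numbers on the precision gap, here's a small illustrative calculation from the published bit layouts (FP16 = s10e5, FP24 = s16e7, FP32 = s23e8; FX12 is 12-bit fixed point) - illustration only, not vendor-measured data:

#include <cmath>
#include <cstdio>

// Relative rounding step of each floating-point shader format,
// counting the implicit leading mantissa bit. Illustration only.
int main() {
    struct Fmt { const char* name; int mantissaBits; };
    const Fmt fmts[] = {
        {"FP16 (NV3x partial precision)", 10},
        {"FP24 (R3xx full precision)",    16},
        {"FP32 (NV3x full precision)",    23},
    };
    for (const Fmt& f : fmts)
        std::printf("%-30s step ~ 2^-%d = %.3g\n",
                    f.name, f.mantissaBits + 1,
                    std::ldexp(1.0, -(f.mantissaBits + 1)));
    return 0;
}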
 

5150Joker

Diamond Member
Feb 6, 2002
5,549
0
71
www.techinferno.com
Originally posted by: Rollo
You still haven't shown any proof that you have a 9800 pro, and until you do, I'll call you what you are - a compulsive liar who is trying to build up credibility by claiming he owns a 9800 pro. People that show off about their "credit cards, bank accounts, boats, 2 bachelors degrees" etc. are usually insecure liars such as yourself.

I think I'll just call you what I think you are: a pesky dumba$$ with a computer.
Think about it, dimbulb: let's say I jump through your little hoop and post a picture of me holding my 9800 Pro and the box it came in, in front of my boat, with my BS Business and BA Psych hanging on the wall behind me. What would that be proof of? You'd just post, "That is done with Photoshop!" or "That could be anyone, how do we know it's you?" or something else of that ilk. We'd be back to square one, you calling me a liar, me saying I'm not.

It's interesting you didn't have any problem believing I bought a 5800, but you won't believe I sold it and bought a 9800. LOL, like you're in some elite club of 9800 owners that anyone with a job can't join. What a punk.

Business and Psychology? At least lie about something impressive :) Yes, I do not think you have a 9800 pro and are full of it until you show otherwise. All it would take is a simple picture of it w/ a written note like "Rollo's 9800 pro". I mean come on, you have two really tough-to-get bachelor's degrees, a boat, a big house, and two top-end video cards; surely a cheap digital camera, or hell, even a webcam, can't be that big of an investment.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Business and Psychology? At least lie about something impressive
That's the point here: you think I'm lying about "fantastic" things, when I'm just talking about average middle-class things no one WOULD lie about to look good. Why would I say my degrees are in business and psych to "impress" people? Who would be impressed? I bought a $15K fishing boat, not a $50K speed boat. Etc.

In any case, I'm not going to pay one cent for something I don't particularly want to prove anything to an unknown pest on a website. I do have a scanner. If you'd like me to scan something on the box it came in, or from the owner's manual, that is the extent of what I'll do to "prove" this.

Again, you don't seem to have any problem thinking I'd spend $339 on a 5800, because that suits you. That I'd spend $372 on a 9800 Pro, which is in many ways a better video card, seems unthinkable to you. Why would having that card "build my credibility"? I had a 9700Pro for 9 months before that, it made me no more knowledgeable, it just made me a guy that had $390 when 9700Pros came out.

I emailed you screenshots of my device manager and display properties that show I'm running a 9800 Pro. I'll do the same with anything from the 9800 Pro packaging if you like. If my computer's device manager and the packaging of the card don't convince you, you're out of luck.