Originally posted by: Rollo
You still haven't shown any proof that you have a 9800 Pro, and until you do, I'll call you what you are--a compulsive liar who is trying to build up credibility by saying he owns a 9800 Pro. People who show off about their "credit cards, bank accounts, boats, 2 bachelors degrees" etc. are usually insecure liars such as yourself.
I think I'll just call you what I think you are: a pesky dumba$$ with a computer.
Think about it dimbulb: let's say I jump through your little hoop and post a picture of me holding my 9800 Pro and the box it came in, in front of my boat, with my BS Business and BA Psych hanging on the wall behind me. What would that be proof of? You'd just post, "That is done with Photoshop!" or "That could be anyone, how do we know it's you?" or something else of that ilk. We'd be back to square one, you calling me a liar, me saying I'm not.
It's interesting you didn't have any problem believing I bought a 5800, but you won't believe I sold it and bought a 9800. LOL, like you're in some elite club of 9800 owners that anyone with a job can't join. What a punk.
Originally posted by: RogerAdam
Originally posted by: Rollo
"Doom3 is not a significant DX9 game or DX at all for that matter. It is an OpenGL game."
LOL @ myself Old Fart. Good point, I forgot. Actually I'd say my OGL games outnumber my DX games at least 5:1, if not more. They're all based on Quake engines, except for the Mechs, Unreals, and Max Payne. (and even the Unreals go OGL)
This is a VERY interesting point the more I think about it. How relevant is DX9 for the fps player? Everyone uses Carmack's game engines. Damn. I better cancel my "superior 9600 Pro" order.
"HL1 graphics were pretty decent for its day."
We'll have to agree to disagree there. Everything was pretty angular, and the lighting effects were cheesy. All it really had for me was the fun of interacting with the characters. Didn't think the monsters were scary. The Fiend in Q1 was a damn fine monster; when those things first jumped you and started slicing away, pretty tense. The chainsaw ogres were good too, as was the shambler.
"Aquamark (3) is not a game, it's a benchmark."
True, but Aquanox2 and Spellforce will be games with the same engine, so is there any difference in this case?
"Returning my new unopened GeForce 5900 Ultra. Today is the last day to return this thing."
Don't do it Parrothead! I'll trade you a superior 9600Pro for it, unopened also! You'll be all set for DX9 and better off, just like BFG says! He is wise and respected, listen to him. PM me, we'll exchange addresses and Heat, and I'll set you up right with some (POS) ATI 9600 goodness!
Duh! I can't find an edit option (not used to this board layout).
Some interesting read here.
As for Aquamark, go READ their pages: you CANNOT consider it a DX9 benchmark, as it will run PS 1.1-1.4 depending on what it detects as the card's capability. That said, the FX WILL be using PS 1.4 (w/ lower precision) to run it, unless NVidia can HONESTLY release a driver that runs DX9 (pure). As per EVERY DX9 bench/game/beta to date, the FX series gets killed by the R95/6/7/800s, and there is NO DRIVER that can fix it for DX9. The ONLY fix is to run DX9 games/benchies as DX8.1, as NVidia has done with its "optimized" drivers, and as Valve did when trying to help the crippled-at-DX9 FX series work with HL2. As per a developer:
"Note the Actual problem here lies in the Bottom line Raw Rendering capabilities of Each hardware. ATi can do 8 FP ops and 8 Texture ops in the same Clock in paralell while Nvidia cannot. Its that Simple.
This has nothing to do with one being "8 Bits" and the Other being "4bits". Nor is it a Bug in DX9. There is an Update to DX9 on the way that will help with Shader Code Generation for Nvidia hardware. But even that is not going to make up the Difference between the two. It has nothing to do with a missing "Channel" Or anything else.
Any time that Nvidia has to Face Texture and Pixel Shader Work in the Same Clock they are at an Extreme Disadvantage. Which is the case in every single game released. Keep in mind here that this article is dealing with the Idea of Pure Peak Performance. Which never, ever happens."
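To put rough numbers on the developer's point, here's a back-of-the-envelope sketch in Python. The 8 FP + 8 texture ops per clock for ATI come straight from the quote; the clock speeds and the "NV30 serializes mixed work" model are assumptions for illustration only, not official specs.

```python
# Back-of-the-envelope peak shader throughput, per the quote above.
# ASSUMED figures: 8 FP + 8 texture ops/clock for ATI is from the
# quote; the clocks (R350 ~380 MHz, NV30 ~500 MHz) and the NV30
# serialization model are illustrative assumptions, not specs.

def peak_gops(clock_mhz: float, ops_per_clock: int) -> float:
    """Theoretical peak in Gops/s, assuming every unit is busy every clock."""
    return clock_mhz * 1e6 * ops_per_clock / 1e9

# R350: 8 FP and 8 texture ops can issue in parallel in one clock.
r350 = peak_gops(380, 8 + 8)
# NV30: when FP and texture work land in the same clock, assume it
# can only issue one kind, so a mixed shader sees half the issue rate.
nv30 = peak_gops(500, 8)

print(f"R350 mixed-shader peak: {r350:.1f} Gops/s")  # ~6.1
print(f"NV30 mixed-shader peak: {nv30:.1f} Gops/s")  # ~4.0
```

Even with a big clock advantage, the serialized card loses on mixed FP/texture work, which is exactly the "same clock" disadvantage the developer describes.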
This means the NV3x can NEVER fix its shader (PS 2.0) problem with a driver; it's a H/W problem. What is really happening is PER-APPLICATION-SPECIFIC fixes, ie EVERY TIME a new DX9 benchmark or title goes public, NVidia MUST "optimize", ie make their drivers revert to DX8.1, just to compare with a Radeon DX9 part running DX9. And that, folks, is invalid. That is why Valve jumped all over NVidia: if Valve had never said anything, NVidia would've "optimized (DX8.1)" their drivers to pass ATI by a few FPS and plastered that benchmark score around as a DX9 comparison, when in reality they are being smoked in pure DX9 comparisons in EVERYTHING, including HL2.
Another thing (this relates to the beloved Doom3): ATI using the ARB2 path in OGL SMOKES NVidia. Carmack has to specially code an NV3x path because, in HIS OWN WORDS, "NV3x is very slow using the ARB2 path". What's funny are the remarks "wait til D3". Well, as with DX9, NV3x NEEDS HELP to perform; it simply CANNOT using the default path. So Carmack is doing exactly what Valve did: he's dumbing down the API, not using advanced shaders as much, and using lower precision (FX12 w/ a little FP16), while ATI's cards simply use ARB2 (all the effects) at higher precision with FP24. You can find this from the source (ID) itself.
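For anyone unfamiliar with what a "vendor path" means in practice, here's a minimal sketch of the idea in Python. The path names and precisions follow Carmack's public comments; the detection logic and identifiers are invented for illustration and are not ID Software's actual code.

```python
# Minimal sketch of renderer back-end ("path") selection, in the
# spirit of Doom3's multiple paths. Path names/precisions follow
# Carmack's public comments; structure and names are invented here.

PATHS = {
    "ARB2": "full effects, FP24+ precision, one generic codepath",
    "NV30": "vendor-specific, FX12 with some FP16, hand-tuned",
}

def pick_render_path(vendor: str, arb2_is_fast: bool) -> str:
    """Prefer the generic ARB2 path; drop to a vendor path only when
    the hardware runs ARB2 too slowly to be usable."""
    if arb2_is_fast:
        return "ARB2"          # ATI R3xx lands here
    if vendor == "nvidia":
        return "NV30"          # NV3x: "very slow" on ARB2 (JC)
    return "ARB2"              # nothing better available

for card, fast in [("ati", True), ("nvidia", False)]:
    path = pick_render_path(card, fast)
    print(card, "->", path, "|", PATHS[path])
```

The point of the sketch: ATI never leaves the default path, while NV3x only performs when someone writes and maintains a special case for it.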
With the FX series the concept of "future-proofing" just is not there, it's barely "present-proof".
THIS IS FACT:
DX9 - ATI runs it native; NVidia is painfully slow native, but can run it as DX8.1 (whether the ISV does it, or NV does it with their "*optimizations").
OGL - ATI runs the native ARB2 path; again, NV can but it's "very slow" (JC), and again bringing down the API is the solution.
*optimizations - What gets me about this is that in their "statement" about Valve, they ADMIT that replacing shaders and lowering precision (FP16) is OK and does NOTHING to the "experience" on their h/w. Does anyone remember the 3DMark03 controversy a few months back? Wasn't it NVidia that COMPLAINED there was too much PS 2.0 in the benchmark, further stating that made 3DMark invalid? Well, now they've done a complete 180 on the matter (re: HL2 statement from NV themselves about replacing 2.0 shaders w/ 1.4). Can you NOW see the PR damage control, given the above?
How does anyone ACTUALLY believe it will turn around, when the only solution is to run DX9 and OGL BELOW their specs?
Why does NV need to be helped along at all? OGL and DX are STANDARDS; it should all just plain run on the default paths (ie like it does on ATI).
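To make the per-application "optimization" point concrete, here is a toy sketch of the mechanism being described: a driver-side lookup that swaps in hand-written lower-precision shaders when it recognizes a specific executable. All names and table entries are hypothetical, purely to show why per-app fixes don't generalize to new titles. This is NOT actual driver code from any vendor.

```python
# Toy illustration of per-application shader replacement, the kind
# of driver "optimization" described above. All names and entries
# are hypothetical; not any vendor's actual driver code.

HAND_TUNED = {
    # (application, original shader id): hand-written replacement
    ("3dmark03.exe", "ps20_water"): "ps14_water_lowprec",
    ("hl2.exe", "ps20_refract"):    "ps14_refract_lowprec",
}

def select_shader(app_exe: str, shader_id: str) -> str:
    """Return a hand-tuned replacement if this exact app+shader pair
    is recognized; otherwise run the real PS 2.0 shader (slowly)."""
    return HAND_TUNED.get((app_exe, shader_id), shader_id)

# A brand-new DX9 title isn't in the table, so it gets no "fix" and
# falls back to the slow native PS 2.0 path:
print(select_shader("newgame.exe", "ps20_lighting"))  # ps20_lighting
```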
"After buying ten thousand (conservative estimate) dollars of cards over 9+ years, they wouldn't even return 1 e-mail. Not even Microsoft has stones like that. I guess I will be going over to ATI until nVidia remembers that we the gamers control their destiny."
Uh oh Omega. You said you're PO'd that nVidia wouldn't return your emails even though you've been a good customer, and now you're switching.
Originally posted by: Rage187
"Valve insists that they have not optimized the game specifically for any vendor."
cough*** BULLSHIT***cough
The whole part about them "insisting" just shows they are guilty of it.
They are a co-advertiser; they are even packaging their merchandise in the SAME bundle as a sales promotion. Yeah, no optimizations happened there!
Look, I just buy the best card, period, but customer response also means a lot to me now. Who wants to buy a $500 card and get silence from the maker? Am I being unfair to nVidia? S-it, I called ATI just for the hell of it and got a live person in 5 min. 1 (905) 882-2626. Asked stupid questions and they still tried to help me. Now I don't know about the rest of you, but when I spend my money I would like just a tiny little bit of F--KING RESPECT. (End of rant)
"How similar is ATI Radeon 9500 Pro to 9600 Pro?"
The 9600 Pro is clocked higher and has more memory bandwidth, but the 9500 Pro has the fillrate advantage because of its 8-pipeline configuration.
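If it helps, here's the arithmetic behind that comparison as a quick Python sketch. The clock and pipeline figures below are the commonly cited reference specs for these cards (9500 Pro: 275 MHz core, 8 pipes, 270 MHz DDR; 9600 Pro: 400 MHz core, 4 pipes, 300 MHz DDR), so double-check them against a review before relying on the exact numbers.

```python
# Quick fillrate/bandwidth arithmetic for 9500 Pro vs 9600 Pro.
# Clocks and pipe counts are commonly cited reference specs;
# verify against an actual review before trusting exact numbers.

def fillrate_mpix(core_mhz, pipelines):
    return core_mhz * pipelines           # Mpixels/s (1 pixel/pipe/clock)

def bandwidth_gb(mem_mhz, bus_bits=128):
    # DDR transfers twice per clock; both cards use a 128-bit bus.
    return mem_mhz * 2 * (bus_bits / 8) / 1000   # GB/s

print("9500 Pro:", fillrate_mpix(275, 8), "Mpix/s,",
      round(bandwidth_gb(270), 1), "GB/s")   # 2200 Mpix/s, 8.6 GB/s
print("9600 Pro:", fillrate_mpix(400, 4), "Mpix/s,",
      round(bandwidth_gb(300), 1), "GB/s")   # 1600 Mpix/s, 9.6 GB/s
```

So the 9600 Pro wins on memory bandwidth, but its 4 pipelines can't make up for the 9500 Pro's 8 even at a much higher core clock.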
"I know that the 9600 Pro will run HL2 well, but how about the 9500 Pro?"
Check a few reviews. AFAIK the 9500 Pro should be a bit faster, but I don't want to mislead you on a hunch.
Originally posted by: Rollo
5150:
You asked me to spend money I worked for on something I don't want to own, to prove to a stranger I'll never meet, who for some reason thinks he's in a position to judge whether I'm a liar, that I own something. Who's a borderline troll?
I sent you my desktop settings showing I have a 9800Pro/Nokia 445X Pro, as well as a screen of my device manager showing I have a 9800Pro installed. I've offered to hook up a scanner I don't use to scan the box the 9800 Pro came in, as well as pages of the manual that came with it. I wouldn't likely have the packaging if I didn't have the card now, would I? If this doesn't constitute "proof" to you, oh well. The best part of owning a 9800Pro is using it, not having strangers in CA believe you own it.
Your assumptions about my motives for selling the 5800 (which you have no trouble believing I bought) are ridiculous. Like I said, switching video cards didn't involve much money. Before my son and the expenses associated with him came along, I used to switch video cards every month or two, often to cards that didn't perform as well, just for the fun of trying new hardware.
nVidia screwed me and every other nV30 owner over. They released the nV35s earlier than expected, for less than expected. They trashed the nV30s to the media and removed them from their website. It was found they cheated in their drivers in the only game I really play, UT2003. Then their support sent me the emails of 3 people in their PR department to address my issues with, none of whom bothered to respond to me. As a person who bought an nV30, which was fast depreciating, I thought it best to cut my losses, sell it, and get the best card at the time. (9800Pro)
If this seems implausible to you, you'll have to live with it. It's the last I'll address the issue. The board can take me seriously as they see fit, you don't speak for them. Call me a troll all you like, I'm not the one who thread craps any nVidia related post with snide comments about how worthless their cards are. (and I'm still fairly annoyed with them)
"First of all, it's easy to fake a device ID, any moron can do it. I can fake my device ID to be a 5900 Ultra if I wanted; those desktop shots you sent me prove nothing."
If you say so. I'd never waste my time learning to do something like that. (students have a lot more time though)
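For what it's worth, the adapter name a Device Manager screenshot shows is just a string the OS reads back from the installed driver, which is why it's weak proof either way. A minimal Windows-only sketch of querying it with Python's standard library; the WMI class and command are real, but the exact tooling is incidental to the point:

```python
import subprocess

# Ask Windows what display adapter it thinks is installed. The Name
# returned is just a string supplied by the driver's INF file, which
# is why a screenshot of it is weak evidence of the physical card.
out = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name"],
    capture_output=True, text=True,
)
print(out.stdout)
```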
I "could" buy many, but why I would do so to convince a stranger that I own a video card is probably a better question.As a student I have no problems buying digital cameras or webcams, surely you could buy one.
"Hell, you could use your wife's regular camera and scan the pictures, it's not exactly a demanding task."
When you're out of school, working full time, caring for kids and a home, time and how you spend it will change for you. So I'm not going to yank my card, photo it, scan the photo, and email you a copy just because you think I should. If you want to think I have an S3 Virge, that's OK with me.
"nVidia didn't screw anyone over, you screwed yourself over."
If you bought a new Dodge Viper to drive, and Dodge suddenly pulled the Viper off their website, replaced it with the Joker, and told the media the Viper was a mistake and everyone should buy Jokers, would you have "screwed yourself"?
"You were foolish enough to buy a 5800 Ultra knowing full well it was a crap card--unless you were blind and didn't see all the reviews."
At the time, it was the 3rd fastest card I could buy, so I wouldn't call it "crap". I actually wish I still had it sometimes. I was bored with the 9700Pro, and the 9800 Pro is pretty much the same as having it back. The 5800 did everything I needed it to.
"It's easy not to take you seriously, because your constant endorsement of nVidia products (after claiming to have been burned by them) is illogical, which means either you're completely retarded or you are a liar. I gave you the benefit of the doubt and chose the latter."
I don't endorse nVidia products, I just have a more rational opinion of them than the ATI zealots do. A card isn't "crap" because it's not the fastest you can buy at every benchmark and detail setting. The 5900 Ultra won't be a great choice for the couple of DX9 games that come out this year, but it should be faster in any OpenGL game at up to 4X/8X, and in the ballpark in any DX8.1 game.
Rollo, you're killing me. LOL, some people need to get a girl, or maybe a life. Why is he so interested in what's in your PC? Watch out, maybe he really wants a pic of you in your boxers. LOL LOL LOL
Originally posted by: stardust
I wonder how HL2 would look if they showed the "HL_02 Source.exe" demo in DX8.1a... I'm really curious what actual difference the DirectX version makes in HL2.
Originally posted by: Rollo