GeForce 6 series video processor OFFICIAL THREAD


epking

Member
Jun 22, 2004
114
0
0
Originally posted by: MDE
epking, it's PVP not VPP.

lol, my bad, thanks. I've been saying VPP for some reason. The on-chip video processor is what I think of / what I mean. VPP is the NVDVD codec thing I've been using for DVD playback, which I find really nice, by the way.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: epking
I am the world's loneliest man. I will pretend to care very much about this issue which concerns you, if someone will talk to me. :(


Good post dude!


BTW-
I don't have to be a longtime member here to know unquestionably, that Rollo has taken up the fight for Nvidia in every circumstance over the years. Does anyone question for a second that he would take any other side, or see any other point of view than an Nvidia fanboy?

You do have to read posts other than this one, Big Chief. My Asus X800XT PE is arriving tomorrow or the next day (shipped Friday); do you think my posts about that will be "Yay nVidia!"? :roll:
LOL, dude, I have purchased almost every flagship card from every company for the last ten years. I put my money where my mouth is when I say I'm no one's "fanboy", how about you?

LOL, usually I try not to feed the trolls, but one of my buddies PM'd me about your ludicrous manifesto, so I read the first paragraph and wrote this. If you were here, I'd buy you a beer for the laughs.
:):beer:
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: epking
Originally posted by: bpt8056
Originally posted by: Rollo
I guess my "point" is that we have this 1000 post thread with people fighting amongst themselves like it was the Civil War, calls for lawsuits, demands for all sorts of free goods, over something I don't really see in use.

Rollo,

I appreciate that you were able to provide whatever information you could from your friend at nVidia about the status of the VP. However, the statement that I quoted from you does not sit well with me. First of all, this thread is not about you. We don't care if you never plan to use WMV9 encoding/decoding, or that it doesn't affect your gaming experience. If you're so happy with your GeForce 6800 and have no problems with the VP, then why are you posting to this thread? Didn't all of the posts here give you any indication that there are GeForce 6 owners who are not happy (apparently that doesn't include you) with the way nVidia handled the VP situation? If they didn't, then I don't know what will.

What we do care about is the VP's ability to do WMV9 acceleration, like nVidia stated the VP could do back in May. People in this forum are naturally venting their frustration at the lack of communication, drivers (software), and support from nVidia in regards to the VP situation. It's just that statements like the one above induce a civil war in this thread, so if you have no use for the video processor, please try to refrain from making those kinds of statements. Thanks.

exactly. He is here because he HAS to defend Nvidia, no matter the issue. That is what he does. I don't have to be a longtime member here to know, unquestionably, that Rollo has taken up the fight for Nvidia in every circumstance over the years. Does anyone question for a second that he would take any other side, or see any other point of view than that of an Nvidia fanboy? Earlier, he made the comment about the DX9 controversy, and how people were out of line criticizing it last year....lol...just as I said before, no doubt Rollo was attacking people who complained their 5900s, 5600s, and 5700s wouldn't do PS2.0 with any sort of playability, despite Nvidia's grandiose marketing campaign. So even he admits that was his position. But he fails to take into account that now, in late 2004, there are thousands of people who have paid $400, $300, whatever on these cards, and these modern PS2.0 games are now out....and they can't play them to their fullest...or to the extent they were led to believe when they trusted Nvidia and invested $400 in a card. But now, as then, Rollo takes Nvidia's side. The parallels to this current situation are clear. Thanks, Rollo, for bringing it up...lol.

And now, for this broken VPP... unbelievably, when a completely disabled feature is lacking, one that a great many in this thread bought the card over ATI for, and one that is key to their computing habits....he attacks them. He is so nearsighted, and so clouded by subjectivity and devotion to Nvidia, he cannot for a second realize other users have different computing needs...video editing and HTPC in particular..hell, even someone who watches a lot of hi-def video or online movies would be drastically affected by this. And there is also the whole DivX issue and the non-working hardware encoding/decoding. Yeah, that is a real unpopular format, Rollo..can't see how anyone would have the audacity to demand Nvidia deliver what they advertised. :disgust: People go and spend $500 bucks on a video card that advertised such revolutionary onboard features, and said feature profoundly influenced their buying decisions....and when it's broken or whatever......Nvidia fanboys come on here and attack. At first, they say it's not broken...then when proved wrong:
1. you're overreacting.
2. it's not really needed anyway, there isn't any content...and if I don't have a need for it, then nobody else possibly could
3. go and buy a new cpu for $400 and then it won't be an issue anymore ..lol
4. don't worry, it will be fixed, just you wait and see..LOL.

At first Rollo was lashing out at anyone who spoke out against his beloved Nvidia. Now, once Rollo has taken a beating from users here..lol.., his tune has changed ever so slightly to less attacking. But make no mistake, he still seeks to minimize our concerns, to say it's not a needed feature, and to claim we are overreacting. I guess it would be too much to ask, for someone like him, who only sees things through the rose-colored glasses of an Nvidia fanboy, to see things from someone ELSE's perspective....or understand that OTHER PEOPLE may have DIFFERENT NEEDS AND USES for THEIR CARDS THAN HE HAS.. But no, a self-involved, self-consumed person like Rollo can only see things from his own narrow viewpoint, i.e.: "I love Nvidia, they are the greatest graphics company ever, anyone who says any negative word doesn't know what he is talking about." By the way, he has yet to get it through his thick skull (perhaps because this is Nvidia PR spin) that this issue involves much more than .WMV... the broken VPP has had implications across the board for video playback. CPU usage is up in virtually every playback circumstance except DVD, and even that is flawed due to this broken hardware.

As for the .wmv thing, I own 4 WMV HD DVDs (Step Into Liquid, Standing in the Shadows of Motown, T2 Extreme, and a Lewis and Clark expedition one) and they won't play on this card. There is just too much stuttering. Whereas mine and others' 9500 Pros and 9800 Pros (it doesn't matter which DX9 card) played them flawlessly. Now, the 9800 series doesn't have a VPP, but it IS able to play these movies fine....the 6800 doesn't have a working VPP either..it's broken...but it can't play them at all!! What is the difference? Why can't the 6800s play without stutter? The difference is, the broken VPP has had implications down the line for video playback, VMR9 mode in particular. The 6800 series has to offload everything onto the CPU; any other DX9 card was able to utilize the V/P pipelines for assistance (in VMR9 playback mode). That is just one tangible example of the video playback issues affecting this card. No hardware accel for DivX decode/encode is another. Disproportionately high CPU usage across the board on this 6800 for virtually all video playback modes is another.

On a relatively fast processor, a user playing high quality video can only do that: play video. If one even loads a web page, it will bog down the computer...whereas any other card from the last two years handles such tasks without breaking a sweat. The 6800 appears to offload virtually everything onto the processor as a result of this broken VPP. The exception being DVD playback, but even that has higher CPU usage imo. Those few with the highest end, most modern processors may notice this less...but throw in an X800 series for comparison, hell, throw in a 5900, and the difference will be clear. These 6800s, because of the broken VPP, put a tremendous burden on the CPU; moreover, this is unique to the 6800s. A user is essentially getting a performance downgrade as far as video playback capabilities go. That is unbelievable, and entirely unacceptable, for people who bought the fecking card, at a premium price mind you, because they wanted the onboard VPP to help OFFLOAD the burden on the CPU!!!

The fact of the matter is this: due to the broken VPP, these $300, $400, $500 cards are a disaster for all-around video playback and HTPC. As has been said over and over again...virtually any other modern card, a 5800, 5900, 9500 Pro, 9800 Pro, any of these, performs better for video playback, plain and simple. :disgust: To make things worse, many of us here bought the goddamned 6800 specifically because we wanted an upgrade in video playback performance, and there is no question about it, we have gotten a downgrade...the reason is clear, the VPP is broken. To be fair, I will say this: image quality (when it's not tearing) is spectacular on this card, so image quality is phenomenal and is an upgrade; the CPU burden and performance, though, make it virtually unusable in many a circumstance.

We trusted Nvidia and gave our hard-earned dollars to them, thinking we were getting an upgrade for games and video. They have only delivered on the games part, and unbelievably, not only did we not get an upgrade in video capabilities....we got a fecking downgrade, and a broken, nonexistent feature....one that was highly touted and force-fed to us in a BS marketing campaign. Now they are in PR mode, lying their asses off trying to cover this up...and we come here to talk about it and try to get things done, and clowns like Rollo are unconditionally defending them and attacking us for wanting what we paid for. But hey, nothing new there: he's been there, done that, and apparently been doing that around here for the last couple of years.


btw, there is nothing wrong with defending Nvidia in itself; it depends on the issue, on the truth or falsehood of the situation, and on whether they are right or wrong. My point is, in Rollo's world, Nvidia is always right, and he defends them no matter how culpable they are, no matter their tactics, and no matter how they have harmed consumers. Apparently he could not see it any other way if he wanted to. He is a fanboy. When someone says something negative about his precious company, he attacks and flames them. Nvidia is infallible in his world. That is in essence what we have witnessed in this thread, and I imagine, many many more.


OOOPS! Better quote you so I don't end up on vacation like your alter ego, Hard Warrior. I wouldn't want the mods to think I'd call anyone the "world's loneliest man" without them having written a 10000 word term paper about the "Evils of Rollo, ATI Buying nVidiot". ;)


 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: HardWarrior
Vacation? More "insider information", often-wrong Rollo?

When I'm wrong, I'm wrong. I didn't think I could post in this thread without you flaming me, so I assumed you'd been temporarily banned for your antics.
 

HardWarrior

Diamond Member
Jan 26, 2004
4,400
23
81
Originally posted by: Rollo
Originally posted by: HardWarrior
Vacation? More "insider information", often-wrong Rollo?

When I'm wrong, I'm wrong. I didn't think I could post in this thread without you flaming me, so I assumed you'd been temporarily banned for your antics.

You're off-topic again and otherwise hijacking the thread again, Rollo. Anyway, 7-days and counting.

 

y0bailey

Junior Member
Nov 6, 2004
12
0
0
I will not lose faith!!!!!

It's a programmable video processor....they can program it to work, they can, they can!


please!!! ah!!!!!!
 

epking

Member
Jun 22, 2004
114
0
0
Oh, you've bought an ATI, congratulations, Rollo. Great, I'm glad for you. And damn, I thought I had you figured out based on your subjective and one-sided viewpoints supporting Nvidia unconditionally...Damn, I had witnessed such things and assumed you were an unconditional Nvidia defender no matter the circumstance, but I guess your words and viewpoints mean nothing now, since you've said you bought an ATI card and all :roll:

That's the famous line of every fanboy: "I own both companies' cards." Whatever. It doesn't change the fact that you worship Nvidia and defend them no matter the circumstance. You know it, I know it, and everyone reading this thread knows it. By the way, you're not the only one to get PMs (if you have at all). I've gotten 4 PMs from people laughing and complimenting me for ripping you to shreds, and 2 more saying not to bother with Rollo, "he has no respect around here etc. and hasn't the ability to even carry on a reasonable debate since he's so biased". Now, I don't even know anybody here; these are strangers approaching me and telling me this about you, not some of your forum buddies trying to make you feel better and cheer you up after you've been taken to the whippin' post..lol. Gosh, you'd think people would have more respect for you and be more grateful after all you've done around here for everybody....I mean, golly, you did get those Far Cry PS3.0 benches early and all.

As for me calling you a fanboy, it needs to be said, because in the context of this thread and how you've responded, it's more than pertinent. In response to me calling you a fanboy, you say "You do have to read posts other than this one.....I've just gone and bought an ATI...."

Fine, I did just that. I did a search. LOL, it took me all of two posts into your search to pick out this gem:

Not good news for ATI. The money they've put into the X700XT is lost, and the failure to bring the card to market damages their credibility.
now in itself, in response to the rumored X700 cancellation, it's a fair comment...coming from you...not so sure...so I keep reading...

Next, someone says this would be much less of a failure than the 5800. Which no one in their right mind would argue, since the 5800 is almost universally regarded as one of the most notorious and colossal failures in 3D hardware history, second only to the 3dfx Voodoo fiasco. The exception to this is Rollo, of course; in his Nvidia fanboy world he sees everything through a false lens...and he is somehow able to put a positive spin on the 5800's cancellation and never reaching retail....claiming they were available everywhere...LOL:

Of course, they actually sold thousands of those and they were pretty much available everywhere. They supposedly produced 100,000 of them. Do you think 100,000 X700XTs were produced? XT PEs? Nah.

lol, you see, ATI's X700 cancellation is a disaster, but the 5800 not so much. ...Even better, in a post a day before about the NV48 being cancelled, Rollo amazingly was singing a different tune yet again...you see, if it's Nvidia, then well.....

I don't know how much it will matter if nV48 is not released. Judging from the X850 XT PE review here, I don't think you'd be disappointed with a 6800Ultra. The only real differences were Doom 3 and HL2, and either card plays either game well anyway.
LOL>>this is priceless.

So I go back and look 2 posts into your search results, and that comes up. Come on, dude..you're an Nvidia FANBOY. I'm sorry....it colors any opinion you have. You are completely blinded. Same post, cont.:

You don't seem to be long on facts though do you Silvertrine?
I was unaware those Gideon Bible, 9368 registry entry Catalysts are now considered the gold standard of drivers?
Or that ATI cards have cornered the market on "reliability"? Or 2d IQ?
You're really reaching with the "power and noise"- links to significant difference?

It was rough about the 5900s shaders- too bad by the time any shader games worth playing came out 5900s and 9800s were low end cards, getting stomped by $200 6600GTs.

All in how you look at things, eh, Silvertrine?

BTW- I'm not "anti-ATI", I have a X800XT PE arriving this week, and am psyched!

someone with some common sense (sickbeast) had this to say:
" - Rollo I'm still shocked you're downplaying PS2.0 yet you continually stress the importance of SM3.0. Face it, the 5800/5900 cards would be much better pieces of hardware if they could run HL2 in DX9 mode in similar fashion to the 9700/9800. The 5800/5900/9700/9800 were not "low end" cards when Far Cry was released, by the way. "

Yeah, I'm shocked too. Imagine that, Rollo downplayed PS2.0....but now PS3.0 is a must-have. Real consistent and objective there, Rollo. It's got nothing to do with your Nvidia bias, I'm sure :laugh:

another gem, same post:
"Yeah, the next thing you know they will be releasing cards with faulty non functioning hardware and cancelling the next 2 generations of cards. "

(rollo's reply)
That's all well and good, but

1. You don't know the hardware is faulty, or to what extent.
2. What it's faulty at is video encode at a little used format, for that matter a little used feature on a gaming card?
3. ATI still has nothing at all to compete with the 6600GT and 6800NU? If that's not a big deal to you, I can guarantee you that it is to them.
4. One of the cards they "cancelled" was a refresh part that wouldn't have differentiated itself from the current 6800U. The other, if it's truly cancelled, is likely due to the recent announcement of nVidia's affiliation with Sony, which will sell more GPUs for them, by far, than ATIs XBox and Gamecube contracts combined.
5. You don't really need to offer a refresh part when your current situation is: $200 market - nVidia owns. $300 market - nVidia owns. $400 market - nVidia owns. $500 market - flip a coin. Big bucks, cost-no-object market - nVidia has no competition at all.
6. Did I mention nForce4 motherboards are being built as I type this, it's rumoured SoundStorm is returning, nVidia stock just went up 50%, and they inked a deal with Intel to build motherboards for Intel CPUs as well?

Talk smack all you like about nVidia, ZimZum, but they owned ATI this year. No lack of PVP for the relatively few people who want to watch movies on little monitors instead of big TVs is going to change that.

Now, I don't necessarily disagree with a lot of the above, but there is an unquestionable anti-ATI bias and pro-Nvidia slant in there, and nobody except you yourself, Rollo, could fail to see that you're a fanboy.


...for some reason, Rollo doesn't seem to be a fan of the graphics in Hl2...i wonder why?
:D

this is probably because you are running a 5200 or 5800 or 5900 that cannot properly render the DX9 path. On a proper, true DX9 card the water looks fantastic.

rollo's response:

Overstatement. I've only played on a 6800Nu and a 6800GT, but I don't think the water is that big a deal.

Actually I don't think this game is that big a deal- Doom3 held my interest for weeks, HL2 a few days.

That's a good one, I like that. Something tells me Rollo likes Carmack better than Gabe.

Here is another. Now, these are all comments from the last 20 of Rollo's posts, mind you:

Nobody with any common sense (at least that knows anything about nVidia's finances) believed this? nVidia has never been anything but in the black, making millions every quarter for years and years AFAIK. ATI bled millions in losses for years until sometime last year when they first started making a profit.
unbelievable.

here is someone the other day complaining that his new Leadtek 6800GT is loud and asking for advice; of course, Rollo doesn't like that one bit :(

Man this case is quiet however now all i hear is my eVGA 6800GT. It sounds like a freakn leaf blower. I was wondering if there is anything i can do to quiet it down.

rollo:
I sort of doubt it sounds like a leaf blower.


here we go, here is Rollo on the video processor issue; he manages, as always, to get in some shots at ATI, all the while gushing over Nvidia's card despite its flaw.

You have to remember rbV5 that nVidia isn't just rehashing the exact same core for four calendar years in a row like ATI is? [if this is unclear to any of you: I bought my 9700P in October 2002, ATI is now "introducing" the same core in 2005! (X850)] When you have years to work on drivers and core design, rbV5, you have some time to improve on things. The nV40 is a whole new core design that has been on the market less than half a year. nVidia has actually released a new core the last two years running, each of which is a more advanced design in some ways than ATIs core design they bought from ArtX.
Unbelievable. No, you're not a fanboy at all.:laugh:
same quote cont...
My Rev. 1 9700 Pro had all kinds of wacky problems, as did everyone else's, necessitating Rev. 2 (wouldn't run on many motherboards, rolling wavy lines). My Rev. 1 8500 retail couldn't run stable at default speeds; many others had the same problem. So you see, when ATI either comes up with something new (8500) or buys it (9700), they have problems too.

Or was it just not done? In any case, what do we as consumers care about what's cost-effective for the company? I suppose when they replaced the superior 9500 Pro with the inferior but cheaper to produce 9600 Pro, you were squealing, "Hurray! ATI cut costs big time on slower chips they sell us at the same price!!!"

Apparently not from the current line's triumph, though: $200 MSRP 6600GT owns. $300 MSRP 6800 owns. $400 MSRP 6800GT owns. $500 MSRP* X800 XT on parity. Woot!

You are right there: they not only acquired tech they couldn't produce, they acquired the people who invented it. Perhaps the original ATI engineers got to design the dinosaur graphic for the fans, or the chrome bull for the box!

My bad. If you're one of the 8 people that got one, the X800 XT PE is a faster card with a 2 year old feature set.

Oh, but no, you're not an Nvidia fanboy at all. There is no bias at work in your opinions. You said I should read all your posts. Well, I didn't have to; these came up within the first 20 search results for Rollo....there are people in these threads calling you Captain Nvidia, and literally every post is either a pro-Nvidia gushing love letter, or else you manage to bash ATI in one way or another. I said a long time ago you were the quintessence of a fanboy. It doesn't take a genius to know that I wasn't far off the mark there.

You say I'm lonely, or have nothing better to do than to give you a hard time..fine, for argument's sake, I concede the point. But defending Nvidia's honor appears to be your raison d'être around here, because you do little else. Just come out and admit it. You are an Nvidia fanboy through and through. So don't come into these posts and pretend to be objective or reasonable. You are anything but.
 

Malichite

Member
Feb 28, 2001
45
0
0
I don't want to get into any flame war, and I am not defending the PVP since it is obvious that it isn't doing any acceleration, but I think we need to be honest about the alternatives. Since there have been MANY comments about the 6800 being the only one to tank so hard on WMVHD, I wanted to post an opposing view. I have tested Step Into Liquid on my A64 3200+ with my GFFX 5900 and 6800 and I get almost identical results (floating around 80%). Given that those are both nVidia cards, I decided to test it on another machine, a P4 3.2E with an ATI 9700 to be exact. With Hyper-Threading ON the clip plays fine in the 40-60% region, but as soon as I disable HT, guess what? Yup, that's right, the ATI also jumps to the high 80's and dropped about 14 frames. Does this prove anything as far as the PVP feature of the 6800? Nope, it just shows that none of the DX9 test cards seem to be doing anything but leaving the heavy lifting up to the processor. I understand everyone is upset about the card not doing any acceleration, but the single biggest issue seems to be focused on WMVHD, and that doesn't appear to play any better on the two-year-old DX9 card, as some have suggested.
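If anyone wants to reproduce these numbers instead of eyeballing Task Manager, here's a rough little logger I'd suggest (just a sketch of my own, nothing official; it assumes Windows XP SP1 or later since it uses GetSystemTimes). It prints total CPU utilization once a second while the clip plays, so the HT-on vs. HT-off runs can be compared apples to apples:

// cpuload.cpp - rough total-CPU utilization logger.
// Assumption: Windows XP SP1 or later (GetSystemTimes). Build: cl cpuload.cpp
#include <windows.h>
#include <stdio.h>

// Convert a FILETIME to a 64-bit count of 100 ns ticks.
static ULONGLONG ToU64(const FILETIME &ft) {
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart;
}

int main() {
    FILETIME idle, kernel, user;
    GetSystemTimes(&idle, &kernel, &user);
    ULONGLONG prevIdle  = ToU64(idle);
    ULONGLONG prevTotal = ToU64(kernel) + ToU64(user); // kernel time already includes idle

    for (;;) {                 // Ctrl+C to stop
        Sleep(1000);           // sample once per second while the clip plays
        GetSystemTimes(&idle, &kernel, &user);
        ULONGLONG curIdle  = ToU64(idle);
        ULONGLONG curTotal = ToU64(kernel) + ToU64(user);

        ULONGLONG dIdle  = curIdle  - prevIdle;
        ULONGLONG dTotal = curTotal - prevTotal;
        double load = dTotal ? 100.0 * (double)(dTotal - dIdle) / (double)dTotal : 0.0;
        printf("CPU: %5.1f%%\n", load);

        prevIdle  = curIdle;
        prevTotal = curTotal;
    }
    return 0;
}

Run it in a console next to the player, note the steady-state number, then repeat with HT disabled in the BIOS; that takes the guesswork out of the comparison.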

-Malichite
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: epking
Oh so lonely....if only Rollo will acknowledge me....or someone will agree he's a fanboy...only this can bring meaning to my troubled existence...why won't the voices stop.....

Another good post and jolly laughs, Epking!

I don't think I'm convinced I'm a fanboy yet though- could you please look up some more posts and write another Off Topic essay?


BTW-
You say I'm lonely, or have nothing better to do than to give you a hard time..fine, for arguments sake, I concede the point.
X The Answer

 

epking

Member
Jun 22, 2004
114
0
0
Originally posted by: Malichite
I don't want to get into any flame war, and I am not defending the PVP since it is obvious that it isn't doing any acceleration, but I think we need to be honest about the alternatives. Since there have been MANY comments about the 6800 being the only one to tank so hard on WMVHD, I wanted to post an opposing view. I have tested Step Into Liquid on my A64 3200+ with my GFFX 5900 and 6800 and I get almost identical results (floating around 80%). Given that those are both nVidia cards, I decided to test it on another machine, a P4 3.2E with an ATI 9700 to be exact. With Hyper-Threading ON the clip plays fine in the 40-60% region, but as soon as I disable HT, guess what? Yup, that's right, the ATI also jumps to the high 80's and dropped about 14 frames. Does this prove anything as far as the PVP feature of the 6800? Nope, it just shows that none of the DX9 test cards seem to be doing anything but leaving the heavy lifting up to the processor. I understand everyone is upset about the card not doing any acceleration, but the single biggest issue seems to be focused on WMVHD, and that doesn't appear to play any better on the two-year-old DX9 card, as some have suggested.

-Malichite


Put your 6800 in the machine the 9700 is in now, and you will see exactly what I'm talking about. It will perform profoundly worse. You should run both cards' tests with HT disabled, and then make sure both tests are run in VMR9 mode. You will see a clear and unquestionable difference.
 

redDragon128

Senior member
Sep 28, 2004
423
0
0
Originally posted by: Rollo
Originally posted by: redDragon128
quite frankly I don't care whether it will become the standard today or next year. I don't care that there is only one vid that I stutter on. All I care about is that when something is advertised, it should work as advertised. I don't ask any more than what the company offers, but I certainly do not expect any less.

Do you even have a 6800 Red Dragon? (I don't see it noted in your sig)

I guess my "point" is that we have this 1000 post thread with people fighting amongst themselves like it was the Civil War, calls for lawsuits, demands for all sorts of free goods, over something I don't really see in use.

So my question is: specifically, how is this affecting your use of this card? When do we expect to see actual recorded content in this format beyond the tech demos on MS's page?

IMO, it's not a very big deal if an S3 card doesn't work right with their MeTal API if there are no games with MeTal?

I know a lot of you reading this are angrily preparing "Who cares?!?! We want what's ours! Show solidarity to our cause or die!" but I think the question is fair:
How are you currently damaged by this? When do you expect to see recorded movies in this format?


So you HAVEN'T been reading this thread after all. If you had read any of my posts, you'd know the answer to your first question. Also, what's with the "I don't see it noted in your sig"? I see your sig has no system, so can I assume you don't have either the 6800 or the X800? -__-
In any case, your response is the same as the fanboy response that I've read approximately a million times. Hey, I've even written one of those responses myself in the early stages of the discovery. But that was before I realized there would be an impact on my ability to even play those clips. How am I damaged by this? I can't play WMV HD without heavy stuttering and frame drops. That's enough for me. My 5900 did better than this, for reasons stated above by someone else.
 

epking

Member
Jun 22, 2004
114
0
0
Another good post and jolly laughs, Epking!

I don't think I'm convinced I'm a fanboy yet though- could you please look up some more posts and write

You're not laughing. You're plenty mad. Nice try though, :cookie: not much else you can say. You could always tell me you're gonna put me on ignore again.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,207
126
Originally posted by: epking
Next, someone says this would be much less of a failure than the 5800. Which no one in their right mind would argue, since the 5800 is almost universally regarded as one of the most notorious and colossal failures in 3D hardware history, second only to the 3dfx Voodoo fiasco.
I hesitate to tip-toe into this particular flame-fest, but I just wanted to point out that the NV-1 itself, NVidia's first product, was actually the "most notorious and colossal failure in 3D hardware history". It should tell you something when the mfg's FAQ states that the card is essentially useless for 3D, totally DirectX/Direct3D-incompatible, and to treat it as a 2D-only dumb frame-buffer. It was the premiere product that almost sank the company. The next runner-up for that position would probably be the ATI Rage MAXX. AFAIK, NT/W2K drivers were never produced that could utilize the second GPU on that card.

As for Rollo, well, I lambasted him enough earlier in the thread for being an NV fanboy. :) I can't fault anyone for allowing something to make them happy, though. If NV video cards do that for Rollo, then so be it. But I won't let that detract or distract from the focus on getting the company to make things right for their customers over this current issue. Hopefully this thread won't degrade too much further into personal issues.
 

Malichite

Member
Feb 28, 2001
45
0
0
For the last time, it WILL NOT perform worse, since it is already completely dependent on the CPU. I don't understand why you have difficulty believing this, but I have tested it multiple times with overlays, VMR, and no hardware acceleration, and I get the SAME results across multiple machines and multiple cards. The CPU is doing ALL the work with the 1080 WMVHD version of Step Into Liquid, and none of the three cards I am listing help the processor out much. This ONLY pertains to WMVHD, since I didn't test the rest of the media samples, but I have no problem believing the 6800 has higher utils on other types. But if WMVHD is the poster child for the future, currently neither ATI nor nVidia seems to have the goods. I understand ATI is working on drivers that may change this, but that's currently not the case on my test machine. As for testing my 6800 in another computer, I am not going to bother. If you don't believe me, fine. I simply wanted to post this information for anyone who finds it useful.

This doesn't mean that the PVP is working in the least. If you have proof that I am wrong I would love to see it; why not provide a screenshot of a 9800 on an A64 at 40% util, since I haven't been able to find one. Every one that I have found inevitably has HT turned on and thus isn't a valid test. I don't find my claims that outlandish, since I have seen several posts of people complaining their X800XT-PE is also using high util and dropping frames.

-Malichite
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,207
126
Originally posted by: Malichite
Since there have been MANY comments about the 6800 being the only one to tank so hard on WMVHD, I wanted to post an opposing view. I have tested Step Into Liquid on my A64 3200+ with my GFFX 5900 and 6800 and I get almost identical results (floating around 80%). Given that those are both nVidia cards, I decided to test it on another machine, a P4 3.2E with an ATI 9700 to be exact. With Hyper-Threading ON the clip plays fine in the 40-60% region, but as soon as I disable HT, guess what? Yup, that's right, the ATI also jumps to the high 80's and dropped about 14 frames. Does this prove anything as far as the PVP feature of the 6800? Nope, it just shows that none of the DX9 test cards seem to be doing anything but leaving the heavy lifting up to the processor. I understand everyone is upset about the card not doing any acceleration, but the single biggest issue seems to be focused on WMVHD, and that doesn't appear to play any better on the two-year-old DX9 card, as some have suggested.
That makes perfect sense to me. My understanding of the "VMR9" display mode was not that the GPU's shader hardware was actually doing any of the decoding work, at least not automatically, but rather, that it used DX9 surfaces to render the decoded video to (using the main CPU to do the decoding), which then allowed use of the GPU's shader hardware to further post-process the video before being displayed to the user. So various effects could be added to the video stream, like sharpen/blur/de-interlace/etc. Now, I guess, with sufficiently-powerful shader hardware on the GPU, it might be possible for the CPU to feed the incoming (compressed/encoded) video data to the video frame-buffer, and then let some custom video-decoder shader programs execute to process that data into a decoded format, that is then later post-processed and displayed. If NV is going to save face at all on the 6800 AGP PVP issue, that's probably what they are going to have to do. (Actually, if it takes an additional step in the process like that, or if there needs to be some custom way to load the compressed video data into video memory, NV very well might need MS to make some changes to WMP10 to accommodate it, but that doesn't make that much sense either, since NV has already been working with MS on the same issues for XP MCE PCs.)
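To make that distinction concrete: VMR9 basically hands the application a Direct3D-backed rendering path plus some post-processing knobs; it does not imply the GPU is decoding anything. Here's a rough sketch (my own illustration using the stock DirectShow interfaces, not anything from NV's or ATI's drivers; BoostContrast is just a hypothetical helper, and it assumes the VMR9 is already in the graph with its mixer loaded and that the driver actually implements ProcAmp):

// Sketch: tweaking VMR9 post-processing (ProcAmp) on an already-built graph.
// Assumes pVmr is the IBaseFilter* for the Video Mixing Renderer 9 and the
// mixer has been loaded (IVMRFilterConfig9::SetNumberOfStreams was called).
#include <dshow.h>
#include <vmr9.h>

HRESULT BoostContrast(IBaseFilter *pVmr)
{
    IVMRMixerControl9 *pMixer = NULL;
    HRESULT hr = pVmr->QueryInterface(IID_IVMRMixerControl9, (void**)&pMixer);
    if (FAILED(hr))
        return hr;                          // mixer not loaded, or not a VMR9

    // Ask the driver what range it supports for contrast on stream 0.
    VMR9ProcAmpControlRange range;
    ZeroMemory(&range, sizeof(range));
    range.dwSize     = sizeof(range);
    range.dwProperty = ProcAmpControl9_Contrast;
    hr = pMixer->GetProcAmpControlRange(0, &range);

    if (SUCCEEDED(hr))
    {
        // Nudge contrast a quarter of the way from the default toward the maximum.
        VMR9ProcAmpControl pac;
        ZeroMemory(&pac, sizeof(pac));
        pac.dwSize   = sizeof(pac);
        pac.dwFlags  = ProcAmpControl9_Contrast;
        pac.Contrast = range.DefaultValue +
                       (range.MaxValue - range.DefaultValue) * 0.25f;
        hr = pMixer->SetProcAmpControl(0, &pac);
    }

    pMixer->Release();
    return hr;                              // fails if the driver exposes no ProcAmp
}

Everything this touches happens after the frame has already been decoded on the CPU, which is consistent with the utilization numbers Malichite is reporting.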
 

Malichite

Member
Feb 28, 2001
45
0
0
Is there any evidence that the 6600GT's PVP is actually working? I know that was the rumor, but I haven't seen anyone post their results. Another rumor is that NV supposedly does need WMP10 support, but I take that with a grain of salt since most inside sources are far from reliable. Except about the coming 20th of Dec, let's hope. LOL

-Malichite
 

epking

Member
Jun 22, 2004
114
0
0
Malichite, we are maybe misunderstanding each other. I'm not saying the other cards have .wmv hardware acceleration. What I am saying is this: other users in this thread and elsewhere are reporting the exact same thing. In my case, Step Into Liquid plays fine in VMR9 mode on my XP3200, 1 gig, nF2 with a 9500 Pro, 9800 NP, and a 9800 Pro. With the 6800 in there, it is a slideshow. 2 other users with 5900s in this thread, that I specifically remember, have reported the exact same thing. Now, how is this possible? What could account for this? I'm by no means a hardware expert, but I know what I'm experiencing, and I know how to test it. Practically all I've been doing in my free time is fooling around with these playback modes and DVD playback. I'm not claiming any of the Radeons or the 5900s have .wmv acceleration, clearly not. But clearly, at least by my reasoning...something about the broken video processor in the 6800 appears to either over-burden the CPU, because the other cards used pixel shader assist and the 6800 bypassed that in hardware thinking it would have a working onboard video processor ;) or, I don't know, something else, maybe the drivers, or something on the 6800, is causing profoundly, and I mean very noticeably, worse VMR9 playback of .wmvhd. In fact, I'm seeing terrible performance in VMR9 across the board.


Also, it is likely the situation is much more pronounced on lesser CPUs, i.e. one is much more able to notice terrible performance if one's CPU can't handle these clips as easily. In the case of the XP3200, before, it was just over the threshold where it could play them clean with no stutter..actually in Liquid, there was one frame that stuttered in the same spot...with the Radeons...and somehow, now with the 6800, it is well below being able to play them back and the whole thing is literally stuttering its way along. I've done two clean installs of XP, tried every Forceware available more or less, and a couple of nForce2 drivers. No change whatsoever. Before, these clips played fine....same CPU. Now they play like utter shit. Again, I'm not a lone voice here experiencing this.
 

epking

Member
Jun 22, 2004
114
0
0
Originally posted by: Malichite
Is there any evidence that the 6600GT's PVP is actually working? I know that was the rumor, but I haven't seen anyone post their results. Another rumor is that NV supposedly does need WMP10 support, but I take that with a grain of salt since most inside sources are far from reliable. Except about the coming 20th of Dec, let's hope. LOL

-Malichite

Yeah, there are benches up; CPU usage is waaaay down. I forget where they are, but I think they are widely available.
 

epking

Member
Jun 22, 2004
114
0
0
that it used DX9 surfaces to render the decoded video to (using the main CPU to do the decoding), which then allowed use of the GPU's shader hardware to further post-process the video before being displayed to the user. So various effects could be added to the video stream, like sharper/blur/de-interlace/etc. Now, I guess, that with sufficiently-powerful shader hardware on the GPU, it might be possible for the CPU to feed the incoming (compressed/encoded) video data to the video frame-buffer, and then let some custom video-decoder shader programs execute to process that data into a decoded format, that is then later post-processed and displayed

This is how Virtual Larry is talking about DX9 shaders and these .wmv's... Malichite, doesn't it seem plausible that what I and others are experiencing is the older cards handling it in the shaders without issue and most likely alleviating some strain off the CPU....and that the 6800 could, well, even be likely to have problems, since it was designed for an on-chip decoder in the first place and said decoder is dysfunctional? Doesn't this make sense? Another possibility would be that, since the 6800 has, or was supposed to have, the on-chip decoder, the shader hardware on the 6800 was tasked with more sophisticated, more CPU-intensive post-processing. This may account for the superior image quality I'm experiencing as well (wish I could use it..lol). Since the on-chip decoder isn't doing its job, the more sophisticated post-processing, coupled with the added burden from no on-chip hardware, makes the clips unplayable on users' machines, where before, with other cards, they were much more playable. I would think both of these are possibilities. Regardless, this many users can't be wrong: before the upgrade they could play the clips, after the 6800 series...no go. Now, this obviously would only occur with users within a certain CPU range, but reading this and other forums, I've found that the incidents of before/after non-playability go as high as users with A64s. One claimed to have an FX that played it before the upgrade but not after.

You know, I'm just thinking, if Nvidia, or anyone in tech support or PR, happens across these threads as we try to figure out, maybe occasionally touch around the truth, but mostly stab in the dark about ultimately WTF is up with this stuff, you have to know they just snicker, if not out-and-out laugh their d*cks off.
 

Malichite

Member
Feb 28, 2001
45
0
0
I guess the reason I have a hard time believing that is that I tested a 9700 on a 3.0E that should be able to handle it, and it still dropped frames WITH full acceleration but without HT. It did NOT run at 50%, but rather in the high 80's / low 90's (hmm, seems familiar), and if what you are saying is true, that would have been WITH the GPU offloading some of the strain. Thus, why would an even slower GPU and slower CPU perform any better than the 9700 with a 3.0E? That doesn't seem to make any sense. If there are people reading this that are running 1080 WMVHD (on a 9x00 without Hyper-Threading) in the 40-50% region, just take a screen capture and you'll have proven me wrong. Like I said, I was aware of the issue before I installed my 6800, so I put my old 5900 through the tests and found it to be no better with WMVHD. Then, when you were in doubt, I decided to try a 9700 and I once again got high CPU utilization without HT.

Originally posted by: epking
One claimed to have an FX that played it before the upgrade but not after.
Well, my system is well below an FX and I never drop a frame, so I don't know. Bear in mind, if you believe anything on the Xbit page, the Forcewares were really broken before, and now maybe they just don't hinder the performance.

-Mike


 

Damolee

Junior Member
Oct 24, 2004
2
0
0
Right, after months of views, opinions, driver-fix-this, driver-fix-that, boring flaming, whining, moaning, etc.

Let's just all assume nVidia went bust yesterday and cannot do a damn thing to fix the issue. Does anyone here have the knack for creating/modifying drivers to the extent where we could perhaps fix this ourselves?

After all, if the gospel has some half-truths and a driver fix is the magic ticket, it can be done....who's to say it has to be done through official avenues? What are nVidia going to do, sue someone for tampering with their tech? ....wouldn't surprise me, but if so, you have my permission to pin the blame on me; I will gladly confess. :)

Anybody?
 

epking

Member
Jun 22, 2004
114
0
0
What happens is the CPU usage is always high, on the Radeons on my XP3200 and on the 6800. The 9800 Pro, for example, is up in the high 90's; however, it plays the video. The 6800 is in the high 90's as well, but it is a slideshow. This is mine and others' firsthand experience. I'm not sure how apples-to-apples it is with the HT on/off or the Intel/AMD difference. All I know is this is what happens: the 9800 can play it, the 6800 cannot, and it's not even close. As I said, others are reporting this all over these threads. We are in the realm of theory and what should happen with CPU usage, particularly in your argument; however, no offense...but mine and others' experience is saying otherwise. I think this divide between us is in part due to differences between your hardware and mine; mine, and that of the others reporting this, appears closer to the threshold of what can and cannot play WMV HD. Now, I see what you are saying about CPU usage, that there should be no difference. However, try to look at it from my perspective. For a second, try not to think in terms of what should happen as far as CPU numbers. Think merely in terms of whether it plays satisfactorily or not. In the same machine, these vids were playing before the 6800 upgrade. Now they are not. Whether that makes sense or not... it does appear that is exactly what is happening for a number of users across these threads. If I remember right, there was some discussion on this waay back towards the beginning of this thread..along with some comments on HT giving false positives.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,572
10,207
126
Originally posted by: epking
This is how Virtual Larry is talking about DX9 shaders and these .wmv's... Malichite, doesn't it seem plausible that what I and others are experiencing is the older cards handling it in the shaders without issue and most likely alleviating some strain off the CPU
That's what I was actually trying to point out, epking: I think that you're probably slightly wrong about that, or at least about the assumption that the potential ability to use shaders to process the video means that they somehow automatically are doing that. VMR9 means that the drivers/codecs could use shaders to process the video (either decode or post-process), but based on the CPU numbers, it appears that they aren't. (Or maybe they are? People have suggested that ATI's drivers have had some sort of HW video-playback/WMV-acceleration for a while now, but I haven't seen any concrete, testable evidence to verify that.) OTOH, perhaps the current batch of drivers for the 6800 have problems handling the VMR9 render mode, period? Have you verified whether or not FastWrites are enabled (with WCPUID or PowerStrip)? Does your 6800 still stutter in Overlay render mode as well? I guess the only reasons that I'm suggesting that are: 1) CPU usage numbers between different older cards on the same system seem similar (right?), and 2) I don't think that there are many older, pre-6800 cards with a powerful enough GPU to meaningfully offload decoding tasks to the shader hardware anyway, for WMV-HD content, save for possibly the R350 and newer GPUs. I have a feeling that something else is going on, not shaders, but perhaps the motion-compensation engine or something. That actually would seem to fit your theory slightly better, if NV took that out, intending to utilize the PVP for that feature, whereas prior cards had it as an integrated, dedicated hardware function. Supposedly the FX5900 has an excellent hardware MC engine implemented, as I'm sure ATI cards of a similar vintage do too.
Originally posted by: epking
Regardless, this many users can't be wrong: before the upgrade they could play the clips, after the 6800 series...no go. Now, this obviously would only occur with users within a certain CPU range, but reading this and other forums, I've found that the incidents of before/after non-playability go as high as users with A64s. One claimed to have an FX that played it before the upgrade but not after.
Clearly, there is something going on with the 6800, in that it seems to be not only no better, but in fact worse, than prior cards, but I don't think that you can definitely point to some sort of shader-based video-decode acceleration on prior cards. If there were, then by varying the GPU clockspeeds while keeping the CPU clockspeed constant, you should be able to directly measure the relative CPU offload by the GPU shader hardware. That would be one potential way to test.
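To put some purely hypothetical numbers on that: play the same 1080 clip at stock GPU clocks and again with the core underclocked, say, 30% in PowerStrip, keeping the CPU speed fixed. If average CPU utilization sits at roughly the same ~85% in both runs, the shaders aren't contributing to the decode; if the underclocked run climbs noticeably or starts dropping frames, then the GPU really is carrying part of the load.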

Edit: Some info about ATI WMV acceleration on the bottom of the 2nd page of this thread. It seems to indicate that, although the drivers support it, it can't be enabled without some fix from MS, and therefore hasn't worked yet without it, so they're removing the option for now? I wonder if that has any bearing on this 6800 AGP issue?
 

epking

Member
Jun 22, 2004
114
0
0
OK, thanks so much, Virtual Larry, for the explanatory post. Yeah, about the shaders, someone mentioned it earlier, or in another thread..it makes sense to me. I'm just trying to figure out why I can't play these clips anymore. If you don't think this is so, at least there appear to be several other explanations and possibilities. LOL. As for how I do VMR9 and overlay etc...yeah, I'm sure it's turned on right. I'm setting it up first through WMP, and then later, for more advanced options, through ZoomPlayer, and I have tried every possible combination of output modes in VMR9 (YUV, RGB32, and all those modes). Nothing made a difference. Also, yeah, I have tried with and without FastWrites just in case; I leave it on by default. I've even tried turning on/off sideband addressing; anything I can think of, I've tried. I've got two full WMV HD movies I haven't watched yet, the Step Into Liquid full movie and Standing in the Shadows of Motown.

About the clockspeeds and shaders you mentioned, I'm not sure if this is why or not, but over at AVS, the HTPC gurus always recommend a 9700 Pro and above as ideal for HDTV playback performance; anything else can't handle it without stuttering..which I guess is where I'm getting confused: if it's solely a CPU thing, then why would these GPUs matter for HDTV playback performance? To throw another one out there, because I honestly don't understand: the big thing now with these 6800s, if you have the system for it, is to use AA and aniso in DVD and video playback. I didn't believe it when I first heard it, but it does indeed make a difference. Do you have any idea why that works, or why/how it might work? And if that has anything to do with what we're talking about? Sorry for the ??'s, if you don't have time, no worries. I appreciate your knowledge though.

About that edit and the hardware accel in .wmv: could there be any sort of difference, or any sort of "assistance", that occurs on the Radeons, possibly the FX cards? Doesn't VMR9 mode work through the hardware (shader pipelines) in some way by its very nature? It's just not full hardware acceleration? To be clear, I'm asking, not saying..lol. I really don't know, and would like to. I was always under the impression, though, that VMR9 used some of the video card hardware, I think pertaining to color output space in particular?

I just looked it up; here is a VMR9 definition from Google, one of the first couple of entries of course:

On Windows XP, the default renderer is the VMR7, but on older Windows versions it's the Video Renderer. The main difference is performance and overlay mixing capabilities: the older renderers use different versions of the DirectDraw API (an even older API in the case of the Video Renderer), while the VMR9 is based on DirectX Graphics, so it uses the Direct3D capability of your 3D video card. The result is improved performance on recent 3D cards, better support for overlay mixing, compatibility with all Windows versions that support DirectX 9, and some new capabilities such as de-interlacing and ProcAmp support (contrast, saturation, etc.).

So the new VMR9 looks great, but it's not the default renderer, regardless of Windows version... We have to build it into the graph manually, and this is why I wrote this class.

What you need...
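For anyone wondering what "build it manually" amounts to in practice, here is a minimal sketch of my own (not the class from that article; just the stock DirectShow calls, with PlayWithVmr9 as a hypothetical helper name; it assumes COM is already initialized, the DirectX 9 SDK headers are available, and hwnd is a valid window):

// Sketch: manually adding the VMR9 to a playback graph so it gets used
// instead of the default renderer. Illustrative only; error handling trimmed.
#include <dshow.h>
#include <vmr9.h>

HRESULT PlayWithVmr9(HWND hwnd, const wchar_t *file, IGraphBuilder **ppGraph)
{
    IGraphBuilder *pGraph = NULL;
    IBaseFilter   *pVmr   = NULL;

    HRESULT hr = CoCreateInstance(CLSID_FilterGraph, NULL, CLSCTX_INPROC_SERVER,
                                  IID_IGraphBuilder, (void**)&pGraph);
    if (FAILED(hr)) return hr;

    hr = CoCreateInstance(CLSID_VideoMixingRenderer9, NULL, CLSCTX_INPROC_SERVER,
                          IID_IBaseFilter, (void**)&pVmr);
    if (SUCCEEDED(hr)) hr = pGraph->AddFilter(pVmr, L"VMR9");

    // Switch the VMR9 to windowless mode and give it a window to draw into.
    if (SUCCEEDED(hr)) {
        IVMRFilterConfig9 *pConfig = NULL;
        if (SUCCEEDED(pVmr->QueryInterface(IID_IVMRFilterConfig9, (void**)&pConfig))) {
            pConfig->SetRenderingMode(VMR9Mode_Windowless);
            pConfig->Release();
        }
        IVMRWindowlessControl9 *pWc = NULL;
        if (SUCCEEDED(pVmr->QueryInterface(IID_IVMRWindowlessControl9, (void**)&pWc))) {
            pWc->SetVideoClippingWindow(hwnd);
            // A real app would also call SetVideoPosition and repaint on WM_PAINT.
            pWc->Release();
        }
    }

    // The VMR9 is already in the graph, so RenderFile connects the decoded
    // video stream to it instead of instantiating the default renderer.
    if (SUCCEEDED(hr)) hr = pGraph->RenderFile(file, NULL);

    if (SUCCEEDED(hr)) {
        IMediaControl *pControl = NULL;
        if (SUCCEEDED(pGraph->QueryInterface(IID_IMediaControl, (void**)&pControl))) {
            pControl->Run();
            pControl->Release();
        }
    }

    if (pVmr) pVmr->Release();
    *ppGraph = pGraph;   // caller keeps the graph alive while playing, then Release()s it
    return hr;
}

Because the VMR9 is added to the graph before RenderFile, the graph builder connects the video stream to it instead of the default renderer; the actual WMV decode still happens in the codec on the CPU unless the driver adds something on top of that.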