GeForce 6 series video processor OFFICIAL THREAD

Page 23 - AnandTech Forums

imported_humey

Senior member
Nov 9, 2004
863
0
0
My comment to sheik124 was wrong, as he had a 6800 GPU too; I only read the 5700 part. NVIDIA has had some kind of hardware accel since the early GeForce 2 anyhow, or so an NVIDIA rep claims, and it's probably true; it's just that in the newer cards they're trying a new approach.
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: VirtualLarry
Originally posted by: bpt8056
Anybody see that nVidia changed the wording of the 6800GT video processor on their website? Instead of 'On-chip video processor', it has changed to 'Adaptable video processor'. Link

However, the 6600GT retained the phrase 'On-Chip video processor'. Hmm...

Interesting. Why in the world would NV need to do that, if, as the information from Rollo's "friend" suggests, the PVP hardware in the AGP 6800 cards were fully functional and only needed software to be enabled? (Yet Hexus already benchmarked a working PVP using existing software on the 6600, so you tell me what's really going on here, hmm.)

Also, it's interesting to note that when word started getting around about the IBM 75GXP drives, IBM added an additional spec to their technical spec pages: that the drive was only intended to be used for something like 36 power-on hours per month, something totally ridiculous.

I guess NV is just following the playbook from IBM on the 75GXP, about how to avoid taking responsibility for a product with problems.


When I see things like this I wonder how many people posting here actually work for video card companies, in this case ATI.

The fact of the matter is this, IMO:

1. The 6800 series VP isn't working for WMV and never will.
OR
2. The 6800 series VP will work to some extent for WMV in a few weeks after software updates
OR
3. The 6800 series VP is about to be redefined to include the shaders, and deliver some hardware acceleration of WMV in a few weeks

No one here can post anything relevant in regard to this legally, which is why you haven't seen it anywhere.
It's not worth getting sued so Virtual Larry has something factual to complain about. (or not)

There's no point in posting "Well, if they did this they better give me a new card, Miss December 2004, and a Chevy Avalanche, or I'll be PISSED!"

The documentation I've seen leads me to believe number 2 above is closest to the truth. I'm not going to go into more detail than that; I feel pretty lucky to get the info in advance occasionally, and I'm not going to break my word to my friend or his company's NDA to argue with someone on a BBS.

IMO, if 2. or 3. is the truth, this is not the "Second Holocaust" issue many of you have made it out to be; nor is it even if 1. is true, for that matter.
A. These are $300-$500 video cards, one element of their functionality, not the fate of nations.
B. If 2. is true, everyone will be happy. If 3. is true, almost everyone will be happy. (but some will surely whine,"Waah. But..but...they said THOSE circuits would accelerate, not THESE circuits")

Jesus. Just wait a couple weeks and then you'll know for sure. It's not worth this petty bickering over suppositions.
 

bpt8056

Senior member
Jan 31, 2001
528
0
0
Jesus. Just wait a couple weeks and then you'll know for sure. It's not worth this petty bickering over suppositions.

I've been waiting since July for the implementation of WMV acceleration. Since you don't really have much interest in the functionality of the PVP, I guess it's hard for you to understand the eagerness of the crowd here. Besides, it's perfectly normal to have "petty bickering", especially when nVidia was supposed to have had the PVP functional six months ago.

If nVidia touted a new filtering technology that promises to be 2x better than AF in performance and quality at the time of launch and failed to deliver on their promise after six months, don't you think there will be plenty of "petty bickering"? Just want to give you a different perspective. Cheers.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Rollo
When I see things like this I wonder how many people posting here actually work for video card companies, in this case ATI.
ROTFL.

Funny, I could swear that you, or perhaps someone in your family, must work for NV's marketing dept., based on your comments in another thread flaunting your prior early access to 6800 cards, early access to Doom3, and being able to provide benchmarks of both before either was released.

I was thinking about that when I read it, but I refrained from comment at the time. I started to think about it again when I was reading and replying to this thread - think about it - those things are normally under very heavy, strict, NDAs. Even sites like AT have to abide by their NDA blackout periods, usually, in order to even get access to the hardware ahead of time. So how did you get access to that hardware, Rollo? Care to come clean and tell us?

For the record, I work for no video-card or even computer-hardware company. I have no hidden bias here, save for promoting the technically-accurate truth, and preventing hardware companies from being able to sweep defects under the rug, at the expense of consumers and of their own corporate integrity.

I also noticed that your M.O., Rollo, is to immediately accuse and question the credibility of anyone who criticizes your deity NVidia. And I'm speaking of more than just myself here.

Originally posted by: Rollo
The fact of the matter is this, IMO:

1. The 6800 series VP isn't working for WMV and never will.
OR
2. The 6800 series VP will work to some extent for WMV in a few weeks after software updates
OR
3. The 6800 series VP is about to be redefined to include the shaders, and deliver some hardware acceleration of WMV in a few weeks

So now you are admitting that NV is (or may be) attempting to blur the issue of defective hardware here, by "re-defining" one hardware feature (believed to be defective) in terms of another?

Isn't that exactly what IBM did? When people accused their HDs of being unreliable, they "re-defined" reliability to include only a limited number of power-on hours per month, which nearly every user of their HDs exceeded! Ergo, it's not IBM's problem at all - the users simply use their HDs too much! Never mind the fact that nearly every other HD manufacturer had no such spec, and most people were using their competitors' HDs for plenty of hours per month, many of them running 24x7 under non-server use.

Also, isn't that "cheating", in much the same way as ATI's "trylinear" - the argument at the time in defense of that, was that if the resulting effect was mostly the same, why would it matter if they did it in a different way?

I'm curious, what was your stance on that matter at the time - were you defending ATI, since obviously they were doing some sort of filtering, or were you accusing them of cheating and deceptive marketing practices, along with the rest?

(Not that I'm defending that myself, I have no position on that, other than I feel that it was a deceptive practice.)

Originally posted by: Rollo
No one here can post anything relevant in regard to this legally, which is why you haven't seen it anywhere.
It's not worth getting sued so Virtual Larry has something factual to complain about. (or not)

Fact: NVidia sold these cards to consumers on the basis of promised features.
Fact: NVidia still hasn't delivered on all of those features.
Fact: NVidia has been working for some time now on shader-accelerated video processing for smoother DVD playback, which works on cards other than the 6800 - cards with no 'PVP'.

The fact that the release of such shader-accelerated DVD playback was "imminent" (as of mid-October), and coincided with your announcement, Rollo, along with some of NVidia's responses to the articles in The Inq., makes me very suspicious, very suspicious indeed. Your responses and suggestions seem to mirror the NV corporate line very closely, as a matter of fact. That alone disturbs me.

If the only hw-accelerated video processing NV offers is that shader-accelerated DVD playback, which requires their special DVD player and/or codec, then you can rest assured that the PVP hardware in the 6800 AGP cards must be dead as a doornail. That would be the only logical technical explanation.

Originally posted by: Rollo
There's no point in posting "Well, if they did this they better give me a new card, Miss December 2004, and a Chevy Avalanche, or I'll be PISSED!"

If ATI had released a card with a defective portion of the 3D-accelerator pipeline - say (hypothetically speaking) the AA/AF unit, or the T&L unit, something not enough to totally cripple the card, but enough to make a significant difference in CPU load or display-output quality - and you had to pay beaucoup bucks for that card, I know that you would be complaining too, Rollo. Please don't deny that. You wouldn't be saying, "Oh, that's nothing - who uses a hardware T&L unit anyway? You can always just emulate it in the drivers on the host CPU - games will still run, so it doesn't really matter." Because that's effectively what you have been saying: much ado about nothing, just wait, just wait. That's clearly NV's plan here: get their next product refresh out, along with some workaround drivers, and hope that this all blows over and is forgotten. At least, that's my opinion of the situation.

Originally posted by: Rollo
The documentation I've seen leads me to believe number 2 above is closest to the truth. I'm not going to go into more detail than that, I feel pretty lucky to get the info in advance occasionally, and am not going to break my word to my friend or his company's NDA to argue with someone on a BBS.
IMO, if 2. or 3. is the truth, this is not "Second Holocaust" issue many of you have made it out to be, or even 1. for that matter.
A. This are $300-$500 video cards, one element of their functionality, not the fate of nations.
B. If 2. is true, everyone will be happy. If 3. is true, almost everyone will be happy. (but some will surely whine,"Waah. But..but...they said THOSE circuits would accelerate, not THESE circuits")
Jesus. Just wait a couple weeks and then you'll know for sure. It's not worth this petty bickering over suppositions.
You're right. So why do you call those who have a difference of opinion from yours, and I quote, "nutjobs"?

I thought that you were a man of integrity, Rollo.

Once upon a time.

You've shown your true colors over this issue, that's for sure. (I'll wait for you to make more personal accusations about my motives or intentions or integrity. I'm sure that they will be forthcoming.)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Ok, on a more productive note - there may be a few possible ways to "sniff out" how much of the work is being done by the shaders in whatever new hypothetical driver set NV is going to release for 6800 AGP owners, and how much is actually being done by the PVP. One is a program that lets you intercept shader programs and replace them; I can't remember its name off-hand, but it was used by some sites during their FC 1.2/1.3 SM2.0/3.0 investigation work. Does anyone remember the name of that program, or the site?

The second possibility is getting hold of some beta NV drivers that have the "NULL" shader switches enabled in the binary; I know there have been some of those leaked in the past.

The third possibility is to run the NV drivers under emulation, log what they are doing, and compare between the 6800 and 6600 boards. If one could log how to twiddle the "magic bits" to awaken the PVP, then we could see, once and for all, whether the one in the 6800 AGP was "live" or "stillborn".

Essentially, the idea is to try shader-replacement techniques that were useful for 3D games, but in this case for video-playback, and see what is really going on here.
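For the crudest version of the comparison, you don't even need shader interception - just watch host CPU load during playback on a board with a known-working PVP (the 6600GT) versus a suspect 6800 AGP. Here's a throwaway sketch of the kind of check I mean; the threshold is a made-up illustrative number, not anything from NV:

```python
# Crude heuristic: if WMV playback barely raises CPU load above idle,
# the decode work is probably offloaded to the GPU (PVP or shaders);
# a big jump means the host CPU is doing the heavy lifting.
# The 20-point rise limit is a made-up illustrative figure, not an NV spec.

def decode_verdict(idle_pct, playback_pct, hw_rise_limit=20.0):
    """Return a rough verdict from two CPU-utilization samples (percent)."""
    rise = playback_pct - idle_pct
    if rise <= hw_rise_limit:
        return "likely hardware-assisted"
    return "likely software decode"

# e.g. a 6600GT-style reading vs. a pegged 6800 AGP reading:
print(decode_verdict(5.0, 20.0))  # prints: likely hardware-assisted
print(decode_verdict(5.0, 85.0))  # prints: likely software decode
```

You'd feed it readings from Task Manager (or any utilization logger) averaged over a full loop of the clip, on both boards, with the same player and codec.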
 

carage

Senior member
Sep 20, 2004
349
0
0
As long as they could get the PVP working without breaking other things (i.e. hurting performance in other categories), I really don't care how they do it or which circuits are actually utilized.

I really wonder how that website got those nice 30% numbers. Did they receive special production units?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
sniff...sob...

I confess, I'm Secret Agent 00n. :roll:

I'm going to load up HL2 and check it out. You guys will see what I've seen in a few weeks, along with all the review communities' tests of it.

I thought it was pretty nice of my friend to give me a look at "things to come" and the OK to post that you will be getting hardware acceleration out of your 6800s; if some of you thought the information was of no use, that is your prerogative.

I thought the baseless speculation was of no use as well, so we're probably even.

Cheers! :):beer: <<<<< "Old Rasputin Russian Imperial Stout" lately, excellent!
 

Chippy99

Member
Oct 20, 2004
30
0
0
Originally posted by: sheik124
bah, i still think mine is working

ok, then how come its now playing with zero dropped frames, and the temperature rose 4-5C?
its probably not working, but i might have gotten lucky

1. No dropped frames. So what. I have no dropped frames.

2. Let me quote you from earlier: "UPDATE: i tested with the nVidia temp monitor open on the side, and let the video loop several (like 6) times, and the GPU temp steadily rose up from 60C to 63C, after i shut Windows Media Player, it went back down to 61C, is it really working?"

63-61 = 2. Not "4 or 5".

And anyway, if it *was* working, CPU would be around 30% and temp rise about 9C.
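Do the arithmetic however you like - peak over baseline gives 3C, settled-back gives 2C - either way it's nowhere near a working decoder's rise. A throwaway sketch (the ~9C expectation is my ballpark, not an NV spec):

```python
# Sanity check on sheik124's own log: compute the observed GPU temperature
# rise and compare it with the ~9 C rise you'd ballpark for a decoder that
# is genuinely offloading the work. (9 C is a forum guess, not an NV spec.)

def temp_rise(samples_c):
    """Observed rise: peak temperature minus the starting baseline."""
    return max(samples_c) - samples_c[0]

def looks_like_hw_decode(samples_c, expected_rise_c=9.0, slack_c=2.0):
    """True only if the rise is in the neighborhood of a working decoder."""
    return temp_rise(samples_c) >= expected_rise_c - slack_c

sheik_log = [60, 61, 62, 63]            # his reported 60 C -> 63 C climb
print(temp_rise(sheik_log))              # prints: 3
print(looks_like_hw_decode(sheik_log))   # prints: False
```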

It's not working.

Chip
 

bpt8056

Senior member
Jan 31, 2001
528
0
0
Don't know if any of you read this, but this was posted in Anand's weblogs:

I promised an update on NVIDIA's Video Processor situation, so here it is: right now, on the surface, nothing has changed. There's still no solid release date for the drivers and codec that will enable the video processor, but thanks to all of the negative press lately the folks at NVIDIA are scrambling to enable its support. There's the power of the internet for you :)

Nice job guys getting nVidia's attention. We did our part and all we can do is hope and pray that nVidia will deliver.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,092
32,631
146
Originally posted by: bpt8056
Don't know if any of you read this, but this was posted in Anand's weblogs:

I promised an update on NVIDIA's Video Processor situation, so here it is: right now, on the surface, nothing has changed. There's still no solid release date for the drivers and codec that will enable the video processor, but thanks to all of the negative press lately the folks at NVIDIA are scrambling to enable its support. There's the power of the internet for you :)

Nice job guys getting nVidia's attention. We did our part and all we can do is hope and pray that nVidia will deliver.
:thumbsup:
 

carage

Senior member
Sep 20, 2004
349
0
0
3 more days until the end of November, I guess the moment of truth is approaching.
Just hope it won't be a moment of disappointment.
 

govtcheez75

Platinum Member
Aug 13, 2002
2,932
0
76
Originally posted by: carage
3 more days until the end of November, I guess the moment of truth is approaching.
Just hope it won't be a moment of disappointment.


Yep... I can't wait until we finally get this "resolved", even if that means many of us end up very disappointed.
 

Ice27181

Junior Member
Nov 4, 2004
13
0
0
First we were told we would get a working driver on November 8th, then at the end of November.
If they don't deliver this time I'll really get mad!!!
I'll remember it for sure the next time I buy a graphics card (or motherboard chipset, for that matter)! Hasta la vista, nVidia! It's your last chance!!! Just do something NOW; it doesn't have to be perfect!!!
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ice27181
First we were told we would get a working driver at November 8th, then in the end of November.
If they won't deliver this time I'll really get mad!!!
I'll remember that for sure the next times I buy a graphics card (or mainboard chipset for that matter)! Hasta la vista nVidia! It's your last chance!!! Just do something NOW, it doesn't have to be perfect!!!

This makes a lot of sense - limit yourself to one graphics-card company because one form of video decoding hasn't been accelerated for a while.

You'll be out of luck if ATI ever does something you don't like, won't you?

:roll:

BTW- LOL-
If they won't deliver this time I'll really get mad!!!

Oh no! Not that!!!! Don't get mad!!!

:roll:
 

Ice27181

Junior Member
Nov 4, 2004
13
0
0
Hey Rollo, sorry to say it, but you asked for it:
Are you paid by nVidia to spread rumors and turn every thread like this into a place where people mock one another, so the real topic won't be discussed anymore?
You already dragged your other thread on this topic through the mud; don't you think that's enough?

(Btw: I didn't say I would never buy anything from nVidia again, just that I will remember this is already the second time they have lied to us about the capabilities of their products (the first time was the 3DMark "optimizations"), and that's a strong point in favour of ATI's products!)
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Ice27181
Hey Rollo, sorry to say that but you asked for it:
Are you payed by nVidia to spread some rumors and change every thread like this into a place where people mock about one another so the real topic won't be discussed anymore?
You allready draged your other thread concerning this topic in the mud, don't you think that's enough?

(Btw.: I didn't say I wouldn't generally buy anything from nVidia anymore, just that I will remember that they lied to us the second time allready concerning the capabilities of their products (first time was the 3dMark "optimizations"), and thats a strong point in favour of ATI's products!)

LOL some more.

Errrr, no, I don't get "paid" by nVidia. Do you get "paid" by ATI to come here and post FUD about things you know nothing about?

Point being: it's not long until nVidia's answer to all this will be out. Why are you here stirring the ol' troll pot saying, "They lied once! What if they lied again?!?!"

We'll all know soon; hold your water. Sheesh. Their solution may be more or less than you hoped for, but could we just wait the few days to see it without the pointless speculation and empty threats?

"Hasta la vista" my ass. What are you? 12? Hint: I don't think quoting Gov. Schwarzenegger's old action movies is "in".
 

carage

Senior member
Sep 20, 2004
349
0
0
Originally posted by: Rollo
Originally posted by: Ice27181
Hey Rollo, sorry to say that but you asked for it:
Are you payed by nVidia to spread some rumors and change every thread like this into a place where people mock about one another so the real topic won't be discussed anymore?
You allready draged your other thread concerning this topic in the mud, don't you think that's enough?

(Btw.: I didn't say I wouldn't generally buy anything from nVidia anymore, just that I will remember that they lied to us the second time allready concerning the capabilities of their products (first time was the 3dMark "optimizations"), and thats a strong point in favour of ATI's products!)

LOL some more.

Errrr, no, I don't get "paid" by nVidia. Do you get "paid" by ATI to come here and post FUD about things you know nothing about?

Point being: It's not long till nVidia's answer to this all will be out. Why are you here stirring the ol' troll pot saying, "They lied once! What if they lied again?!?!".

We'll all know soon, hold your water. Sheesh. There solution may be more or less than you hoped for, but could we just wait the few days to see it without having the pointless speculation and empty threats?

"Hasta la vista" my ass. What are you? 12? Hint: I don't think quoting Gov Schwarzeneggers old action movies is "in".


Well, to be more exact, there are only 2 more days left in November.
I'm still waiting. It doesn't have to be perfect; it can be a half-baked quick fix, as long as it shows promise (no pun intended) and a commitment to keeping consumers happy.
True, there really isn't that much choice when it comes to consumer 3D graphics nowadays, but I have been an nVidia fanboy long enough to dismiss pretty much anything ATi offers (except their TV tuners), so I guess there isn't much that could push me into reversing polarity and becoming an ATidiot.
By the way, S3 is making a comeback. Their 3D performance is still laughable at best, but it seems like their multimedia performance is giving ATi a run for its money.
There is also XGI, if they ever manage to make their drivers capable of running anything besides 3DMark.
The Matrox Parhelia still seems to be the only way to run 3 LCDs at the same time; not that I have the desk space or the budget to get two more Apple Cinema Displays, but it looks really cool when you are running M$ Flight Simulator 2004.

 

Algere

Platinum Member
Feb 29, 2004
2,157
0
0
but I have been a nVidia fanboy long enough to dismiss pretty much anything ATi offers (except their TV tuners), so I guess there won't be anything that can stop me for reversing polarity and becoming an ATidiot.
Pfft all wrong. It's nVidiot / fanATIc :p
 

carage

Senior member
Sep 20, 2004
349
0
0
Originally posted by: Algere
but I have been a nVidia fanboy long enough to dismiss pretty much anything ATi offers (except their TV tuners), so I guess there won't be anything that can stop me for reversing polarity and becoming an ATidiot.
Pfft all wrong. It's nVidiot / fanATIc :p

I'm speaking from nVidia's POV. No one would call themselves an idiot, right?