***Official*** NV40 Benches (Updated as they go live) ANANDTECH Review Added


Alkali

Senior member
Aug 14, 2002
483
0
0
Originally posted by: BoomAM


Actually the tech used in R3xx is just to remove the blockiness from videos. It doesn't decode the video stream at all. Can't remember the name of it exactly; I think it was VideoShader or Fullstream or something like that.
Well, whatever it was called, it DIDN'T decode the video stream, it just removed some of the blockiness from it. Especially in streaming media.

Ah right, yes, I remember that name now, but it still required a separate player to get the enhancement. I remember downloading it ages ago and selecting the '9700/9800 series enhancement' box in the options....

Point is, the 6800 apparently doesn't require extra software, which is cool :) Let's just hope (so we get a good scrap) ATi decided years ago (when they introduced this other thing) that they wanted to go hardware-accelerated with video too :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Two things to clear up some confusion:

1) The 14,680 3DMark03 score was posted at 1024x768 on a highly tweaked FX-51 system with an overclocked 6800 Ultra, so these scores are legit but not in stock form. I read that in one or two reviews but don't remember which ones exactly.

2) One of the reviews ran the system with a 430W Antec TruePower, so we already know 480 watts is BS. Full system power on a 3.2GHz P4 system with hard drives and all that stuff is under 290 watts, as Tom's Hardware shows. That excludes the average 69% efficiency of a power supply, so taking efficiency into consideration you'd be looking at a true requirement of roughly 400-420 watts of rated power.
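
(To put rough numbers on that efficiency math -- a back-of-the-envelope sketch in Python, using only the 290W and 69% figures quoted above:)

    dc_load = 290       # watts -- full-system figure from Tom's Hardware
    efficiency = 0.69   # average PSU efficiency quoted above

    # Wall draw needed to push that load through a 69%-efficient supply
    print(f"~{dc_load / efficiency:.0f} W")   # -> ~420 W, i.e. the 400-420 W range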

 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Hardware decoding of video doesn't appeal to me. Any system can do that.
Hardware ENCODING on the other hand makes me wet. My system takes about 4 hours to render a DVD before I burn it and that's ridiculous. I can't wait to see the results of this thing...
 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Originally posted by: BoomAM
Originally posted by: Alkali


ATi actually did this already ...

May I remind you that the Radeon 9700 and 9800 series both contain on-chip enhancements for MPEG/AVI playback, but you need the right drivers, plus a free, but not highly publicised, player to get the enhancement.

So I won't be surprised at all if they have a similar system for video processing, especially the high-quality HDTV signal throughput stuff.
Actually the tech used in R3xx is just to remove the blockiness from videos. It doesn't decode the video stream at all. Can't remember the name of it exactly; I think it was VideoShader or Fullstream or something like that.
Well, whatever it was called, it DIDN'T decode the video stream, it just removed some of the blockiness from it. Especially in streaming media.

On THG, the 6800 doesn't draw much more power than a 9800XT or a 5950U. On a retail board, or at least after a few revisions, I think we can expect the power circuitry to be "optimised" a bit, and we'll probably just see the one power connector.
Either way, I'm covered. My 350W Chieftec PSU couldn't handle what I was throwing at it when I got it, so I bought a 480W Antec PSU! lol.


The R300 series can decode DivX in hardware using its pixel shaders. You have to use the DivX Player though...
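
(For anyone wondering what that kind of shader-based video work actually amounts to, here is a rough CPU-side sketch in Python/NumPy of a block-boundary deblocking pass. It is purely illustrative of the Fullstream idea, not ATI's actual shader code; the 8x8 block size and 0.5 blend strength are assumptions, 8x8 being the usual MPEG/DivX DCT block size.)

    import numpy as np

    def deblock(frame, block=8, strength=0.5):
        # Blend the pixel pairs that straddle block boundaries -- the seams
        # where MPEG/DivX "blockiness" shows up. This is the sort of
        # post-filter a Fullstream-style pixel shader applies to an
        # already-decoded frame; nothing here decodes video.
        out = frame.astype(np.float32)       # astype returns a copy
        h, w = out.shape[:2]
        for x in range(block, w, block):     # vertical seams
            left, right = out[:, x - 1].copy(), out[:, x].copy()
            avg = (left + right) / 2
            out[:, x - 1] = left + strength * (avg - left)
            out[:, x] = right + strength * (avg - right)
        for y in range(block, h, block):     # horizontal seams
            top, bot = out[y - 1].copy(), out[y].copy()
            avg = (top + bot) / 2
            out[y - 1] = top + strength * (avg - top)
            out[y] = bot + strength * (avg - bot)
        return np.clip(out, 0, 255).astype(np.uint8)

A GPU version does the same blend per-pixel in a fragment shader, which is why this sort of filtering is a natural fit for R3xx-class hardware.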

 

Alkali

Senior member
Aug 14, 2002
483
0
0
Originally posted by: RussianSensation


1) The 14,680 3DMark03 score was posted at 1024x768 on a highly tweaked FX-51 system with an overclocked 6800 Ultra, so these scores are legit but not in stock form. I read that in one or two reviews but don't remember which ones exactly.

Actually, that score there (14,680) is at 800x600 resolution - check out the launch pics if you don't believe me.
Plus, the system was touted to be an Athlon 3400+ with both itself and the video card at stock.... I don't know where you got your info from; sounds like you're hoping :D
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Originally posted by: AIWGuru
Hardware decoding of video doesn't appeal to me. Any system can do that.
Hardware ENCODING on the other hand makes me wet. My system takes about 4 hours to render a DVD before I burn it and that's ridiculous. I can't wait to see the results of this thing...

The recommended spec for decoding MS WM9 HD is a Pentium 4 3.0GHz or faster. Good luck trying to decode that with the system in your signature. I'm not sure if the new NVidia hardware supports WM9 HD decoding, but it sure would be a nice feature on their lower-end cards for people without top-of-the-line computers.
 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Originally posted by: Pariah
Originally posted by: AIWGuru
Hardware decoding of video doesn't appeal to me. Any system can do that.
Hardware ENCODING on the other hand makes me wet. My system takes about 4 hours to render a DVD before I burn it and that's ridiculous. I can't wait to see the results of this thing...

The recommended spec for decoding MS WM9 HD is a Pentium 4 3.0GHz or faster. Good luck trying to decode that with the system in your signature. I'm not sure if the new NVidia hardware supports WM9 HD decoding, but it sure would be a nice feature on their lower-end cards for people without top-of-the-line computers.

Yeah, I've tried the demos and they run choppy.
But....what exactly is distributed in WM9 HD?
Nothing.
When this does become mainstream, I'll have a proc that can handle it.
Actually, my system could handle most of the demos. It was just the very highest one it couldn't do. 1080p, I think.
Incidentally, the guy I was living with when those demos hit could play that demo smoothly on his dually P3 :confused:
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Sunner
I hope ATi doesn't match it, only because it would make me smile to see whatever excuses GTAudiphile can come up with :)

<Disclaimer> Owner of an R9800 </Disclaimer>
:D

<Disclaimer> Owner of an R8500 </Disclaimer>

:rolleyes:


And soon to be proud owner of a GF6800ultra. :p
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Originally posted by: AIWGuru
Originally posted by: Pariah
Originally posted by: AIWGuru
Hardware decoding of video doesn't appeal to me. Any system can do that.
Hardware ENCODING on the other hand makes me wet. My system takes about 4 hours to render a DVD before I burn it and that's ridiculous. I can't wait to see the results of this thing...

The recommended spec for decoding MS WM9 HD is a Pentium 4 3.0GHz or faster. Good luck trying to decode that with the system in your signature. I'm not sure if the new NVidia hardware supports WM9 HD decoding, but it sure would be a nice feature on their lower-end cards for people without top-of-the-line computers.

Yeah, I've tried the demos and they run choppy.
But....what exactly is distributed in WM9 HD?
Nothing.
When this does become mainstream, I'll have a proc that can handle it.
Actually, my system could handle most of the demos. It was just the very highest one it couldn't do. 1080p, I think.
Incidentally, the guy I was living with when those demos hit could play that demo smoothly on his dually P3 :confused:


The stringent hardware requirements of the format make it available to a very small audience. Hardware decoding from NVidia could bring it to a much larger market. As with all new formats, adoption has been slow, but WM9 has made enough inroads in the movie industry that it looks like a standard that will be around for a while. SMP does nothing for WM9 decoding. There's no way it would play on a P3 at full resolution.
 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Originally posted by: Pariah
Originally posted by: AIWGuru
Originally posted by: Pariah
Originally posted by: AIWGuru
Hardware decoding of video doesn't appeal to me. Any system can do that.
Hardware ENCODING on the other hand makes me wet. My system takes about 4 hours to render a DVD before I burn it and that's ridiculous. I can't wait to see the results of this thing...

The recommended spec for decoding MS WM9 HD is a Pentium 4 3.0GHz or faster. Good luck trying to decode that with the system in your signature. I'm not sure if the new NVidia hardware supports WM9 HD decoding, but it sure would be a nice feature on their lower-end cards for people without top-of-the-line computers.

Yeah, I've tried the demos and they run choppy.
But....what exactly is distributed in WM9 HD?
Nothing.
When this does become mainstream, I'll have a proc that can handle it.
Actually, my system could handle most of the demos. It was just the very highest one it couldn't do. 1080p, I think.
Incidentally, the guy I was living with when those demos hit could play that demo smoothly on his dually P3 :confused:


The stringent hardware requirements of the format make it available to a very small audience. Hardware decoding from NVidia could bring it to a much larger market. As with all new formats, adoption has been slow, but WM9 has made enough inroads in the movie industry that it looks like a standard that will be around for a while. SMP does nothing for WM9 decoding. There's no way it would play on a P3 at full resolution.

It did. My XP 1800+ choked on it. His dually did just fine.
 

Chad

Platinum Member
Oct 11, 1999
2,224
0
76
Yeah, I've tried the demos and they run choppy.
But....what exactly is distributed in WM9 HD?
Nothing.
When this does become mainstream, I'll have a proc that can handle it.
Actually, my system could handle most of the demos. It was just the very highest one it couldn't do. 1080p, I think.
Incidentally, the guy I was living with when those demos hit could play that demo smoothly on his dually P3 :confused:


BINGO!

 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
I don't really care if you say it did. There is absolutely no way a system that couldn't have been faster than a 1.4GHz P3 could play a WM9 HD file at full resolution. There is simply no way. That's like trying to convince someone you were watching DVDs in software on a dual Pentium 200.
 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Originally posted by: Pariah
I don't really care if you say it did. There is absolutely no way a system that couldn't have been faster than a 1.4GHz P3 could play a WM9 HD file at full resolution. There is simply no way. That's like trying to convince someone you were watching DVDs in software on a dual Pentium 200.

I've done that too! (233MHz) Just very, very, very choppy. :p
But the WMV9 demo was smooth. I don't have any explanation except to suggest that maybe you don't know as much about it as you think, and that SMP actually does help it.
Also, my 1.53 plays it with only SOME slowdown. It's not a stretch for a 1.4 dually to do it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Originally posted by: Alkali
Originally posted by: RussianSensation


1) The 14,680 3DMark03 score was posted at 1024x768 on a highly tweaked FX-51 system with an overclocked 6800 Ultra, so these scores are legit but not in stock form. I read that in one or two reviews but don't remember which ones exactly.

Actually, that score there (14,680) is at 800x600 resolution - check out the launch pics if you don't believe me.
Plus, the system was touted to be an Athlon 3400+ with both itself and the video card at stock.... I don't know where you got your info from; sounds like you're hoping :D

You might be right about the launch demo, but you can get 14,000 points on an FX system.

Here is the quote:

...but swap our AMD for a P4EE rig, and NVIDIA say they scored an insane 12,353 3DMarks! Though it doesn't stop there: 14,000+ is possible with an AMD FX-51, and some tweaking!!!

Anyway, I remember reading someone saying that the system that scored 14,000+ points was an overclocked FX-51 with an overclocked 6800 Ultra, but I can't find that article because I read so many.... Anyways, the card is great even doing 12,000, so I won't argue with you.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Yea, you're right, owning multiple SMP systems probably means I know nothing of their capabilities. It must just be a coincidence that trying to play a WM9 file results in choppy playback and 50% CPU usage. Do you always just make stuff up to try and look smarter?
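
(That 50% figure is itself the tell: on a two-way box, a decoder that only keeps one thread busy can never use more than half the machine. A trivial sketch of the arithmetic, where the single-thread assumption is the claim Pariah is making:)

    # Overall CPU usage is roughly busy threads divided by logical CPUs.
    busy_decode_threads = 1   # assumption: the WM9 decoder runs a single worker
    logical_cpus = 2          # e.g. a dual P3 or dual Athlon MP
    print(f"{busy_decode_threads / logical_cpus:.0%}")   # -> 50%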
 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Originally posted by: Pariah
Yea, you're right, owning multiple SMP systems probably means I know nothing of their capabilities. It must just be a coincidence that trying to play a WM9 file results in choppy playback and 50% CPU usage. Do you always just make stuff up to try and look smarter?

We've had different experiences, and we'll just leave it at that.
The 3GHz requirement is B.S.
My 1800+ (1.53GHz) plays all of them fine except 1080p, and even then it plays mostly okay except during fast motion.
A dual 1.4 does fine on it.
I obviously can't do anything to convince you of it, but it's true.
BTW: they play better in PowerDVD or the DivX player than in Media Player. We found the videos played better in one of those; I don't remember which.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
Thanx chief. Much appreciated. It's a pet peeve to see people trying to pass off utter BS as practical information for others. We all probably do it at least occasionally, but some seem to make a habit of doing it here.
 

Eug

Lifer
Mar 11, 2000
24,131
1,782
126
Originally posted by: Pariah
Originally posted by: Eug
These speeds are stupid-fast and all, but this power utilization and die size stuff is getting way too out of hand.

WTF? NV40 is approximately three to four times the size of the IBM G5 PPC 970FX 2.5 GHz CPU (66 mm2). And NV40 requires a 480 Watt power supply?

What happens with the next generation?
You're reading into this too literally. The 9700 Pro supposedly required a 300W PS when it was released, and as we all know, it can run on something much lower. 480 is just a recommendation based on 1) even a bad 480W PS will be able to handle it, eliminating any potential problems, and 2) the assumption of what kind of system this card will be in, meaning everything top of the line, multiple HDs, etc... The Apple G5, depending on configuration, comes with a 450W or 600W power supply, which certainly doesn't leave Apple in any position to be bragging about low power requirements. I don't see why NVidia thought the 6800 Ultra needed two power connectors; all the reviews point to it needing barely more than what's already out there. Looks more like NVidia playing it very safe, like Apple does with a 600W PS, than adding the second connector out of necessity.
Yeah the dual uses a 600 W power supply, but then Apple massively overbuilds its Power Macs. nVidia is of course playing it safe, but 140 W and 250 mm2 for a GPU seems excessive.

BTW, the point of the post wasn't bragging rights, but just as a comparison between a relatively small (but very fast) current generation CPU, vs. a now current generation GPU.
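
(Sanity-checking that size comparison against the figures in this thread:)

    nv40_die = 250   # mm^2, the figure mentioned above
    g5_die = 66      # mm^2, IBM PPC 970FX
    print(f"{nv40_die / g5_die:.1f}x")   # -> 3.8x, i.e. "three to four times"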

Originally posted by: ViRGE
Originally posted by: Eug
These speeds are stupid-fast and all, but this power utilization and die size stuff is getting way too out of hand.

WTF? NV40 is approximately three to four times the size of the IBM G5 PPC 970FX 2.5 GHz CPU (66 mm2). And NV40 requires a 480 Watt power supply?

What happens with the next generation?
Bigger heatsink, more power connectors. It's just like CPUs; over time, power consumption is going to increase no matter what you do.
Except it sounds like NV40 is already hotter than the hottest consumer CPU in existence, including Prescott, while the G5 and Athlon64 have decreased in power compared to the products they replaced. It's no surprise that Intel has Banias mania. These power levels are getting insane.

One can only hope ATI has a cooler option, although initial rumours aren't promising regarding the heat issue.

Maybe it's time to buy stock in power supply manufacturers...
 

AIWGuru

Banned
Nov 19, 2003
1,497
0
0
Originally posted by: Pariah
Thanx chief. Much appreciated. It's a pet peeve to see people trying to pass off utter BS as practical information for others. We all probably do it at least occasionally, but some seem to make a habit of doing it here.

Hey, 'chief.'
Have you tried any player other than Media Player, as we did? Is it not possible that some of these are SMP-aware?
Maybe you should check your factbook and get off your high horse. Your condescending, righteous, know-it-all attitude is getting tired.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
OK, instead of posting hypothetical questions, how about telling us which movie player it was?

Yeah the dual uses a 600 W power supply, but then Apple massively overbuilds its Power Macs. nVidia is of course playing it safe, but 140 W and 250 mm2 for a GPU seems excessive.

I doubt NVidia had any goal in mind with the 6800U except to build the fastest gaming card on the planet, and leave no doubt that it is. For a card which was basically designed to play games as fast as it can, power efficiency isn't of vital importance. For products like CPUs that can be used in large numbers in small spaces, power consumption and the resultant cooling requirements need more thought in the design. 450W for a single low-end G5 is still pretty exorbitant. I agree that Apple is just playing it safe and that much power really isn't needed, but I think NVidia is doing the same thing. You can't rip one company for doing it while not ripping the other.
 

Eug

Lifer
Mar 11, 2000
24,131
1,782
126
Originally posted by: Pariah
OK, instead of posting hypothetical questions, how about telling us which movie player it was?

Yeah the dual uses a 600 W power supply, but then Apple massively overbuilds its Power Macs. nVidia is of course playing it safe, but 140 W and 250 mm2 for a GPU seems excessive.
I doubt NVidia had any goal in mind with the 6800U except to build the fastest gaming card on the planet, and leave no doubt that it is. For a card which was basically designed to play games as fast as it can, power efficiency isn't of vital importance. For products like CPUs that can be used in large numbers in small spaces, power consumption and the resultant cooling requirements need more thought in the design. 450W for a single low-end G5 is still pretty exorbitant. I agree that Apple is just playing it safe and that much power really isn't needed, but I think NVidia is doing the same thing. You can't rip one company for doing it while not ripping the other.
That doesn't make sense. A G5 1.6 and Radeon 9800 Pro use what, somewhere around 70 Watts each? (Or maybe a bit more for the latter I dunno.) OTOH, a Prescott 3.0 GHz and an NV40 use what, 120 and 140 W respectively?

There's no denying the NV40 is a HUMUNGOUS chip and VERY hot. Now the question becomes what ATI brings us.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
According to Tech-Report, the 6800U uses less power than the 9800XT when idle; under load, the 6800U tops out at 206W, while the 9800XT hit 177W. It should be noted that the 5950 hit 198W, so the 6800U is only an 8W increase over the card it is replacing. Not sure why that would require a second power connector. I don't know where the top-end Prescotts are, but A64s and Northwoods are nowhere near 120W.
 

Alkali

Senior member
Aug 14, 2002
483
0
0
Originally posted by: Pariah
According to Tech-Report, the 6800U uses less power than the 9800XT when idle; under load, the 6800U tops out at 206W, while the 9800XT hit 177W. It should be noted that the 5950 hit 198W, so the 6800U is only an 8W increase over the card it is replacing. Not sure why that would require a second power connector. I don't know where the top-end Prescotts are, but A64s and Northwoods are nowhere near 120W.

If you bother to read the article, they note specifically that those numbers are for the whole system, measured at the wall, taking into account the 69% efficiency of the PSU.
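
(For a rough sense of scale: applying that quoted 69% efficiency to the wall readings gives the actual DC load behind each number -- a back-of-the-envelope sketch using the figures above:)

    efficiency = 0.69   # figure quoted in the Tech-Report article

    wall_watts = {"6800U": 206, "5950U": 198, "9800XT": 177}   # whole-system draw
    for card, wall in wall_watts.items():
        # actual DC load = wall draw x PSU efficiency
        print(f"{card}: {wall}W at the wall ~= {wall * efficiency:.0f}W of load")
    # -> about 142W, 137W, and 122W respectively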