AnandTech Forums > Hardware and Technology > Video Cards and Graphics
Old 07-04-2006, 04:51 AM   #1
Sable
Senior Member
 
Join Date: Jan 2006
Posts: 882
Default Nvidia sucks?

http://www.theinquirer.net/default.aspx?article=32809

GAME developers working on an upcoming boat racing benchmark, entitled Rydermark, alleged that one of the two graphics vendors is fudging the truth with its per-pixel precision. DirectX 9 requires you to use at least 24-bit or full 32-bit precision.
Nvidia doesn't let developers use more than 16-bit, and of course it is much faster than 32-bit precision. The only problem is that 16-bit precision is below the requirements of DirectX 9, so if you use less than 24 you are not DirectX 9 compliant.

If you want to do normal mapping, parallax mapping and water reflection/refraction, your shader requires 32-bit precision.

Nvidia doesn't leave you any choice, it's claimed. You simply cannot turn 24- or 32-bit precision on; you are always locked at 16 bit. From a developer and artistic perspective this is really unacceptable, but it will buy you a few frames here and there.

Developers have also informed us that they have no way to stop Nvidia doing this. The only way is to make the community aware, and that can change some minds. There is more to come and we will try to get you screenshots to really see the difference.

---------------------------------------------------------------------------------------------------

As always, pinch of salt with the story.

If it's only a few fps then why do it? Obviously if the screenshots (when they arrive) don't show any difference then I don't see a problem, but if it's degrading image quality then it's a blatant cheat. And one which would be bound to be found out. O_o
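To put rough numbers on the article's claim (back-of-the-envelope only, nothing from the article itself): FP16 stores 10 mantissa bits, the FP24 format ATI uses stores 16, and FP32 stores 23. The sketch below simulates just that mantissa rounding (exponent range and denormals are ignored, and the quantize helper is made up for illustration) and counts how many texture coordinates of a 4096-wide texture stay distinct in each format.

Code:
#include <cmath>
#include <cstdio>
#include <set>

// Round positive x to 'sigBits' significant binary digits (1 implicit bit
// plus sigBits-1 stored mantissa bits). This only models mantissa precision;
// exponent range and denormals are ignored.
double quantize(double x, int sigBits) {
    if (x == 0.0) return 0.0;
    int exp;
    double m = std::frexp(x, &exp);           // x = m * 2^exp, 0.5 <= m < 1
    double scale = std::ldexp(1.0, sigBits);  // 2^sigBits
    return std::ldexp(std::round(m * scale) / scale, exp);
}

int main() {
    const int texels = 4096;  // 4096 evenly spaced texture coordinates, i/4096
    struct { const char* name; int sigBits; } fmt[] = {
        {"FP16 (s10e5)", 11},  // partial precision
        {"FP24 (s16e7)", 17},  // ATI R3xx/R4xx pixel shader precision
        {"FP32 (s23e8)", 24},  // full precision
    };
    for (const auto& f : fmt) {
        std::set<double> distinct;
        for (int i = 0; i < texels; ++i)
            distinct.insert(quantize(i / double(texels), f.sigBits));
        std::printf("%-13s %zu of %d texture coordinates stay distinct\n",
                    f.name, distinct.size(), texels);
    }
    return 0;
}

At FP16 a good chunk of the coordinates in the upper half of the [0,1) range collapse onto the same value, which is exactly the sort of stepping that parallax mapping and refraction offsets make visible; FP24 and FP32 keep all of them apart.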

__________________
Fractal R3 White || Asus P8Z77-V Pro || Intel i5 2500k @ 4.8ghz 1.4v || Arctic Freezer i30 || 8GB Corsair Vengeance LP || eVGA GTX 680 2GB || 2x Samsung 830 256GB SSD || WD Caviar Black 1TB || 4x WD Green 2TB 2x RAID1 || Corsair AX750 || BenQ XL2420T

Fractal Node304 White || Asus H81I-PLUS || Intel Pentium G3420 || Arctic Freezer 7 rev2 || 8GB Crucial Ballistix Tactical || Gigabyte GTX 750 1GB || Crucial V4 64GB || WD Caviar Black 500GB || Be Quiet L8 430W
Sable is offline   Reply With Quote
Old 07-04-2006, 04:53 AM   #2
akshayt
Banned
 
Join Date: Feb 2004
Posts: 2,227
Default Nvidia sucks?

inquirer = crap
akshayt is offline   Reply With Quote
Old 07-04-2006, 04:56 AM   #3
Sable
Senior Member
 
Join Date: Jan 2006
Posts: 882
Default Nvidia sucks?

Quote:
Originally posted by: akshayt
inquirer = crap
Good excuse for a flame fest though.

Obviously I'm not gonna believe anything until I see some screenshot business.
__________________
Fractal R3 White || Asus P8Z77-V Pro || Intel i5 2500k @ 4.8ghz 1.4v || Arctic Freezer i30 || 8GB Corsair Vengeance LP || eVGA GTX 680 2GB || 2x Samsung 830 256GB SSD || WD Caviar Black 1TB || 4x WD Green 2TB 2x RAID1 || Corsair AX750 || BenQ XL2420T

Fractal Node304 White || Asus H81I-PLUS || Intel Pentium G3420 || Arctic Freezer 7 rev2 || 8GB Crucial Ballistix Tactical || Gigabyte GTX 750 1GB || Crucial V4 64GB || WD Caviar Black 500GB || Be Quiet L8 430W
Sable is offline   Reply With Quote
Old 07-04-2006, 05:05 AM   #4
josh6079
Diamond Member
 
Join Date: Mar 2006
Posts: 3,261
Default Nvidia sucks?

Quote:
Originally posted by: akshayt
inquirer = crap
I thought the X1900XTX was crap?

I do agree with you though in that the Inquirer is just as reliable as a magic 8-ball.
josh6079 is offline   Reply With Quote
Old 07-04-2006, 05:12 AM   #5
BFG10K
Lifer
 
BFG10K's Avatar
 
Join Date: Aug 2000
Posts: 20,266
Default Nvidia sucks?

That article flip-flops from texture to pixel precision so I sincerely doubt it's correct.
__________________
i5 2500K | Titan | 16GB DDR3-1600 | GA-P67A-UD3-B3 | 128GB Samsung 830 | 960GB Crucial M500 | 1TB VelociRaptor | X-Fi XtremeMusic | Seasonic X 560W | Fractal Arc R2 | 30" HP LP3065
BFG10K is offline   Reply With Quote
Old 07-04-2006, 05:24 AM   #6
Gstanfor
Banned
 
Join Date: Oct 1999
Posts: 3,307
Default Nvidia sucks?

Sounds to me like the developer is misusing the Global Partial Precision flag. FP16 is supported by DX9 as partial precision. You can either _PP hint an instruction or use the Global Partial Precision flag to run all shaders at partial precision.

3Danalyse should quickly show the true situation in any case.

NVIDIA GPU Programming Guide
Page 21
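For anyone who hasn't poked at the D3D9 toolchain, here is a minimal sketch of the two mechanisms Gstanfor describes. It assumes the legacy D3DX9 compiler (d3dx9.h, d3dx9.lib), and it is not taken from the NVIDIA guide; the kShader source and the CompileIt helper are made up for illustration. 'half' variables in the HLSL become per-instruction _pp hints, while the D3DXSHADER_PARTIALPRECISION compile flag (presumably what the guide means by the global flag) asks the compiler to hint the whole shader.

Code:
// Sketch only: assumes the legacy DirectX 9 SDK (d3dx9.h, link d3dx9.lib).
#include <cstdio>
#include <cstring>
#include <d3dx9.h>

// Toy pixel shader: one value computed at half (FP16) precision via a
// per-instruction _pp hint, one at full FP32 precision.
static const char* kShader =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0 {\n"
    "    half4  cheap = half4(uv, 0, 1);   // compiler emits _pp hints\n"
    "    float4 exact = float4(uv, 0, 1);  // full precision math\n"
    "    return exact + cheap;\n"
    "}\n";

HRESULT CompileIt(bool forcePartial) {
    LPD3DXBUFFER code = NULL, errors = NULL;
    // With the flag, every instruction in the resulting ps_3_0 shader is
    // hinted to run at partial precision; without it, only the 'half' math is.
    DWORD flags = forcePartial ? D3DXSHADER_PARTIALPRECISION : 0;
    HRESULT hr = D3DXCompileShader(kShader, (UINT)std::strlen(kShader),
                                   NULL, NULL, "main", "ps_3_0",
                                   flags, &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
    return hr;
}

int main() {
    std::printf("full precision build:    %s\n",
                SUCCEEDED(CompileIt(false)) ? "ok" : "failed");
    std::printf("partial precision build: %s\n",
                SUCCEEDED(CompileIt(true)) ? "ok" : "failed");
    return 0;
}

Either way the precision choice lives in the developer's own build, which is why the "you are always locked at 16 bit" claim reads oddly.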
Gstanfor is offline   Reply With Quote
Old 07-04-2006, 05:38 AM   #7
Stumps
Diamond Member
 
Join Date: Jun 2001
Location: Wagga Wagga NSW Australia
Posts: 7,125
Default Nvidia sucks?

Quote:
Originally posted by: akshayt
inquirer = crap
hmmm, the young Padawan is starting to learn
__________________
I'm a little cutie

Q6600@3.6ghz (400x9, 1.425v Vcore)
Gigabyte GA-EP35-DS3P rev 2.1
8gb Patriot DDR2-800@800 (4x 2gb) 4-4-4-12 2T 2.1v
2x Gigabyte Radeon HD3870 512mb
2x 1TB WD10EACS SATA2 Raid 0
LiteON SHD-16P1S DVD-ROM
LiteON DH20A4P DVD-RW
Antec TX1050B with TPII 550w PSU
ThermalTake Bigwater 745 water cooling
Windows Vista Ultimate x64
Stumps is offline   Reply With Quote
Old 07-04-2006, 05:58 AM   #8
hardwareking
Senior Member
 
Join Date: May 2006
Posts: 618
Default Nvidia sucks?

how come there isn't a mention of ATI in that article?
Is it cause they have all those features which nvidia don't?
hardwareking is offline   Reply With Quote
Old 07-04-2006, 06:22 AM   #9
Sable
Senior Member
 
Join Date: Jan 2006
Posts: 882
Default Nvidia sucks?

Quote:
Originally posted by: hardwareking
how come there isn't a mention of ATI in that article?
Is it cause they have all those features which nvidia don't?
That's what they're implying, yes.

Quote:
That article flip-flops from texture to pixel precision so I sincerely doubt it's correct.
Does it? Where does it mention the texture precision?

The thing with this article is that I can see an Inq reporter getting confused and messing up the story, but surely the benchmarking people would know what they're talking about, and they're the ones who flagged up the problem originally.

Mind you, now that I think of it, why has this never been caught by Futuremark when they put together their benchmarks?

Forget the pinch of salt, time for an entire salt shaker. O_o
__________________
Fractal R3 White || Asus P8Z77-V Pro || Intel i5 2500k @ 4.8ghz 1.4v || Arctic Freezer i30 || 8GB Corsair Vengeance LP || eVGA GTX 680 2GB || 2x Samsung 830 256GB SSD || WD Caviar Black 1TB || 4x WD Green 2TB 2x RAID1 || Corsair AX750 || BenQ XL2420T

Fractal Node304 White || Asus H81I-PLUS || Intel Pentium G3420 || Arctic Freezer 7 rev2 || 8GB Crucial Ballistix Tactical || Gigabyte GTX 750 1GB || Crucial V4 64GB || WD Caviar Black 500GB || Be Quiet L8 430W
Sable is offline   Reply With Quote
Old 07-04-2006, 06:33 AM   #10
BFG10K
Lifer
 
BFG10K's Avatar
 
Join Date: Aug 2000
Posts: 20,266
Default Nvidia sucks?

The texture thing is in the second title, right above Fuad Abazovic's name.

Quote:
The thing with this article is that I can see an Inq reporter getting confused and messing up the story, but surely the benchmarking people would know what they're talking about, and they're the ones who flagged up the problem originally.
If it's true then I can't see how nVidia could get WHQL certification for their drivers, but they appear to do so.
__________________
i5 2500K | Titan | 16GB DDR3-1600 | GA-P67A-UD3-B3 | 128GB Samsung 830 | 960GB Crucial M500 | 1TB VelociRaptor | X-Fi XtremeMusic | Seasonic X 560W | Fractal Arc R2 | 30" HP LP3065
BFG10K is offline   Reply With Quote
Old 07-04-2006, 06:49 AM   #11
Sable
Senior Member
 
Join Date: Jan 2006
Posts: 882
Default Nvidia sucks?

Quote:
Originally posted by: BFG10K
The texture thing is in the second title, right above Fuad Abazovic's name.

Quote:
The thing with this article is that I can see an Inq reporter getting confused and messing up the story, but surely the benchmarking people would know what they're talking about, and they're the ones who flagged up the problem originally.
If it's true then I can't see how nVidia could get WHQL certification for their drivers, but they appear to do so.
Christ on a bike!!! Absolutely right. I don't mind them writing utter tosh stories, but at least write consistent tosh.


Do you know if this Ryder Benchmark has a website? The only information I can find about it is from theinq or sites regurgitating theinq's story.
__________________
Fractal R3 White || Asus P8Z77-V Pro || Intel i5 2500k @ 4.8ghz 1.4v || Arctic Freezer i30 || 8GB Corsair Vengeance LP || eVGA GTX 680 2GB || 2x Samsung 830 256GB SSD || WD Caviar Black 1TB || 4x WD Green 2TB 2x RAID1 || Corsair AX750 || BenQ XL2420T

Fractal Node304 White || Asus H81I-PLUS || Intel Pentium G3420 || Arctic Freezer 7 rev2 || 8GB Crucial Ballistix Tactical || Gigabyte GTX 750 1GB || Crucial V4 64GB || WD Caviar Black 500GB || Be Quiet L8 430W
Sable is offline   Reply With Quote
Old 07-04-2006, 06:59 AM   #12
Nirach
Senior Member
 
Join Date: Jul 2005
Posts: 415
Default Nvidia sucks?

Quote:
Originally posted by: josh6079
Quote:
Originally posted by: akshayt
inquirer = crap
I thought the X1900XTX was crap?

I do agree with you though in that the Inquirer is just as reliable as a magic 8-ball.
Nothing wrong with Magic 8-balls.

You can trust them to be wrong and vague 90% of the time. The other 10% they're broken.
Nirach is offline   Reply With Quote
Old 07-04-2006, 08:28 AM   #13
F1shF4t
Golden Member
 
F1shF4t's Avatar
 
Join Date: Oct 2005
Posts: 1,575
Default Nvidia sucks?

Nvidia cards have supported full 32-bit precision since the FX days.
ATI's 9*** and X*** series cards were 24-bit precision. Shader Model 3 requires 32-bit precision, I think, to be fully compliant.

Nvidia cards can run at 16-bit precision, and it was quicker back in the FX days; I don't know if it's the same now.

Someone who knows more about this should post something.
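Not authoritative, but it's easy to put numbers on the gap between the three formats listed above, as shown below. This only looks at stored mantissa width and ignores exponent range, taking FP16 as s10e5, FP24 as the s16e7 format ATI used, and FP32 as standard IEEE single precision.

Code:
#include <cmath>
#include <cstdio>

int main() {
    // Relative precision implied by the stored mantissa width alone.
    struct { const char* name; int mantissaBits; } fmt[] = {
        {"FP16 (GeForce FX/6/7 partial precision)", 10},
        {"FP24 (Radeon 9xxx/X8xx pixel shaders)", 16},
        {"FP32 (Shader Model 3 full precision)", 23},
    };
    for (const auto& f : fmt) {
        double eps    = std::ldexp(1.0, -f.mantissaBits);        // 2^-bits
        double digits = (f.mantissaBits + 1) * std::log10(2.0);  // + implicit bit
        std::printf("%-40s eps = %.2e  (~%.1f decimal digits)\n",
                    f.name, eps, digits);
    }
    return 0;
}

Roughly 3 decimal digits at FP16 versus 5 at FP24 and 7 at FP32, which fits the usual view that FP16 is fine for colour math but marginal for texture addressing and geometry-like work.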
__________________
Laptop1: Asus G750JX, i7 4700HQ, Plextor M5 Pro 512GB, 32GB DDR3, GTX770M 3GB GDDR5, Ubuntu 12.04.4
Laptop2: Asus G73SW, i7 2630qm, Samsung 840 500GB, 24GB DDR3, GTX460M 1.5GB GDDR5, Win8.1 Pro
F1shF4t is offline   Reply With Quote
Old 07-04-2006, 08:29 AM   #14
LOUISSSSS
Diamond Member
 
LOUISSSSS's Avatar
 
Join Date: Dec 2005
Posts: 8,361
Default Nvidia sucks?

X1900XTX is crap.
If this is true then it'd come out elsewhere, besides the Inquirer. Don't really trust anything they write.
__________________
Intel 4570 || GA-Z87X-D3H || 2x4gb G.Skill RipjawX
Samsung Evo 120gb || WD 640gb Blue || Noctua DH14
Sapphire 7950 || 2x Asus VN248H || 1x Viewsonic VX2025WM
M-Audio LX4 || Shure SRH840 || Senn HD555

Lenovo X200 w/ Samsung 840
LOUISSSSS is offline   Reply With Quote
Old 07-04-2006, 08:34 AM   #15
akugami
Diamond Member
 
akugami's Avatar
 
Join Date: Feb 2005
Location: 費城, 賓夕法尼亞州
Posts: 3,892
Default Nvidia sucks?

Quote:
Originally posted by: Sable
Mind you, now that I think of it, why has this never been caught by Futuremark when they put together their benchmarks?

Forget the pinch of salt, time for an entire salt shaker. O_o
I've got a pallet of 50 lb bags of salt in the back room if you need that.

I find it highly unlikely this story is true. ATI and nVidia dissect each other's hardware and software looking for any chinks in the armor and anything they can copy. There are a lot of knowledgeable reporters out there that also scrutinize the hardware. Then there are game developers who are looking at the hardware and trying to get the most out of it. Out of all those people, you'd think someone would have noticed that the GeForce 7 series is not DX9 compliant way before now (assuming the story is true, which I doubt).

I don't mind reading the Inq. They have nuggets of truth in some of the things they report, and of course it's a rumor site. This just seems like something they wrote after smoking some Wacky Weed.
__________________
Canon 50D
Canon 16-35mm L MK1
_______________________
That was insensitive of me. I asked you to stop being stupid without considering how extremely difficult that must be for you.
akugami is offline   Reply With Quote
Old 07-04-2006, 10:06 AM   #16
apoppin
Lifer
 
apoppin's Avatar
 
Join Date: Mar 2000
Posts: 34,903
Default Nvidia sucks?

Quote:
Nvidia doesn't let developers use more than 16 bit . . .
i find that really [really] hard [no - impossible] to believe.

i need *confirmation* before i give ANY credit to this theinq BS "news".


even if i wanted to believe it
:Q



happy 4th of July!

[everyone, everywhere . . even Canada and Texas]
:laugh:
__________________
.


AMD’s Upcoming HD 7970 Exposed – a Short-lived Video card?


Core i7 920 @ 3.8 GHz
/Noctua NH-U12P SE2/3x2 GB Kingston KHX1800/GTX 590, GTX 580 SLI or HD 6990 + HD 6970 TriFire-X3/Gigabyte GA-EX58-UD3R/2xThermaltake ToughPowerXT-775W/OCZ 850W/Thermaltake Element G/Klipsch v.2 400w/128 GB Kingston VNow 100 SSD and 2x500 GB 7200.12 Seagate HDDs/640 GB WD USB 2.0/Win 7 64/HP LP 3065 2500x1600 LCD/3 x Asus VG236H 23"
- 5760x1080 120Hz
apoppin is offline   Reply With Quote
Old 07-04-2006, 10:58 AM   #17
hans030390
Diamond Member
 
hans030390's Avatar
 
Join Date: Feb 2005
Location: Colorado
Posts: 7,282
Default Nvidia sucks?

I'm pretty sure Nvidia likes developers to use 16 bit where they can. You have to admit, it's probably not practical to have 32 bit running all the time when the same effect can usually be had with 16 bit. I'm sure that with the new features in Shader Model 3 and all that, which require 32 bit, they would let developers use 32 bit.

Even if it is true, it doesn't bother me. I've never seen an IQ difference between ATI and Nvidia, so even if Nvidia is doing this 16 bit thing, it looks the same as ATI...so I can't complain. Everything looks good to me.
__________________
"What the heck are you gonna do if you're on a picnic and have an ice cream, and the ants crawl on the ice cream? What are you gonna do? You're gonna eat the ants, because it's made out of protein. For your health!" - Dr. Steve Brule
hans030390 is offline   Reply With Quote
Old 07-04-2006, 11:04 AM   #18
LittleNemoNES
Diamond Member
 
LittleNemoNES's Avatar
 
Join Date: Oct 2005
Posts: 4,138
Default Nvidia sucks?

Quote:
Originally posted by: LOUISSSSS
X1900XTX is crap.
If this is true then it'd come out elsewhere, besides the Inquirer. Don't really trust anything they write.
WHAT? Let the flames begin!
LittleNemoNES is offline   Reply With Quote
Old 07-04-2006, 12:15 PM   #19
Nelsieus
Senior Member
 
Join Date: Mar 2006
Posts: 330
Default Nvidia sucks?

Quote:
Developers have also informed us that they have no way to stop Nvidia doing this. The only way is to make the community aware, and that can change some minds. There is more to come and we will try to get you screenshots to really see the difference.
Why would developers go to theINQ if they expected to be taken seriously, and why would they have waited all these years to finally say something?

Most of the people at B3D call this BS, so I'm inclined to go along with them.
http://www.beyond3d.com/forum/showthread.php?t=31786

Nelsieus is offline   Reply With Quote
Old 07-04-2006, 12:20 PM   #20
fierydemise
Platinum Member
 
Join Date: Apr 2005
Posts: 2,034
Default Nvidia sucks?

Waiting for confirmation from a more credible source
fierydemise is offline   Reply With Quote
Old 07-04-2006, 12:22 PM   #21
otispunkmeyer
Lifer
 
Join Date: Jun 2003
Posts: 10,442
Default Nvidia sucks?

If this came out around the time of the FX series I would have believed it.

Those cards could do either 16-bit or 32-bit and they were horrendous at 32-bit; this, along with other pieces of misguided design, gave the FX series ****** DX9 performance. For good speed you had to go to 16-bit... hence having to run in DX8.1 mode instead of DX9 mode in many games.

I thought the NV40 was designed from the ground up with 32-bit speed in mind? And of course G7x is a progression of that rather stellar design.
otispunkmeyer is offline   Reply With Quote
Old 07-04-2006, 12:46 PM   #22
ronnn
Diamond Member
 
ronnn's Avatar
 
Join Date: May 2003
Posts: 3,916
Default Nvidia sucks?

Fuad is very good at getting your attention and he certainly covers all bases - so he can get at least some of his stories right. This one is highly unlikely, but it does further erode the public perception of Nvidia IQ. Nvidia's famous PR better get cracking.
__________________
orange barrel
ronnn is offline   Reply With Quote
Old 07-04-2006, 01:49 PM   #23
josh6079
Diamond Member
 
Join Date: Mar 2006
Posts: 3,261
Default Nvidia sucks?

To answer, assuming it was true:

Quote:
why would they have waited all these years to finally say something?
Because G80 is almost here and the 7 series has already had a good run. If they had let this be known at its launch, I don't think it would have been pretty.

I still think this is BS. (I had two 7800s; it better be BS. Otherwise it's "five across the eyes" if I ever see an Nvidia rep.)
josh6079 is offline   Reply With Quote
Old 07-04-2006, 03:26 PM   #24
Elfear
VC&G Moderator
 
Elfear's Avatar
 
Join Date: May 2004
Posts: 6,064
Default Nvidia sucks?

Quote:
Originally posted by: josh6079
Otherwise it's "five across the eyes" if I ever see an Nvidia rep
Lol. That struck my funny bone for some reason. Thanks josh6079.

The story sounds doubtful to me too. Like someone else mentioned, both graphics companies scrutinize each other very closely and I can't imagine this hasn't come to light earlier if it were true.
__________________
4770k@4.7Ghz | Maximus VI Hero | 2x290@1150/1450 | 16GB DDR3 | Custom H20
Elfear is offline   Reply With Quote
Old 07-05-2006, 01:54 AM   #25
otispunkmeyer
Lifer
 
Join Date: Jun 2003
Posts: 10,442
Default Nvidia sucks?

Quote:
Originally posted by: Elfear
Quote:
Originally posted by: josh6079
Otherwise it's "five across the eyes" if I ever see an Nvidia rep
Lol. That struck my funny bone for some reason. Thanks josh6079.

The story sounds doubtful to me too. Like someone else mentioned, both graphics companies scrutinize each other very closely and I can't imagine this hasn't come to light earlier if it were true.

http://www.thebestpageintheuniverse.net/c.cgi?u=beat

otispunkmeyer is offline   Reply With Quote