Rumor of possible yield problems with the G71


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Rollo
Originally posted by: munky
Nice try there, might wanna try harder next time. That quote was dated 04/25/2005 10:58 AM, well before the introduction of the 7 series, and at that time the fastest Nv card (6800u) was slower than the x850xt pe.
I don't have to try harder Munky. I exposed your nature quite clearly.
On 4/25/2005, you were saying the features of the nV40 weren't worth $50. The nV40 offered SM3, HDR, SLI, and soft stencil shadows at that time over primitive ATI parts. Four features. Now you say two features, one of which is used in only two games, should make us join your "pimp the X1900" crusade.

You have no game Munky.

No, you just don't have reading comprehension. The x1900 does not need my pimping, and people are buying it because it offers better performance for the price. The extra features are just a bonus. Is that too hard to understand?
 

Blackjack2000

Originally posted by: Ronin
They're buying it because it's new. The performance is negligible.

Performance compared to what? The $700 GTX 512 vaporware?
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Drayvn
Rollo why do you say that HDR was pioneered by nVidia....

When it clearly wasn't.

Care to elaborate on that by linking us to some articles about other mainstream gaming cards that predate the nV40 and support EXR HDR?

I didn't say nVidia invented it; I said they gave developers cards and a user base of card owners. There would be no EXR HDR games at all now if nVidia did not exist.

ATI didn't have cards that would do it till three months ago.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Drayvn
Rollo why do you say that HDR was pioneered by nVidia....

When it clearly wasn't.

I think his point was that Nvidia was the first to bring the feature to the mainstream market as a viable option at multiple price points (6600GT, 6800GT, 6800U, and everything in between).

EDIT: Dangit Rollo, you beat me to it
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Umm, pioneering and inventing are practically the same. And no, you specifically said nVidia pioneered HDR. Well, anyway, you made yourself clearer on the subject, so it's all cool now.

 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: Blackjack2000
Originally posted by: Ronin
They're buying it because it's new. The performance is negligible.

Performance compared to what? The $700 GTX 512 vaporware?

How about 7800GT SLI, which is on par if not faster in games?

 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
Features... hmm
Both cards have:

PureVideo (when FW 85 is released) = AVIVO
SM 3.0
Multi-GPU capability (SLI > CrossFire, however)
TSAA = AAA
HDR

ATI cards, however, have HDR plus AA and angle-independent AF.

In terms of features, ATI wins by two.
However, NV has a really strong advantage when it comes to OpenGL, stencil shadows (probably due to their UltraShadow 2 tech), and easier driver controls (even easier with the new upcoming NV control panel in FW 85).

I just think people are exaggerating the X1 series because it's a new architecture and the X series didn't have any of these features. To me, both companies have good feature sets.

But... aren't we supposed to be talking about the G71 and the rumors of possible yield issues?
 

linkgoron

Platinum Member
Mar 9, 2005
2,598
1,238
136
Originally posted by: munky
No, you just don't have reading comprehension. The x1900 does not need my pimping, and people are buying it because it offers better performance for the price. The extra features are just a bonus. Is that too hard to understand?

QFT

 

fstime

Diamond Member
Jan 18, 2004
4,382
5
81
Originally posted by: linkgoron
Originally posted by: munky
No, you just don't have reading comprehension. The x1900 does not need my pimping, and people are buying it because it offers better performance for the price. The extra features are just a bonus. Is that too hard to understand?

QFT

Yup, munky is correct.

Rollo, no one cares about the 6 series anymore, just like you pointed out so many times that no one cared about the 9700 Pro since it was "old news".

 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Here is what I think will happen; the 7800 will not be discontinued completely.

Geforce 7900 GTX (full-yield core: 32 pipes / 8+ VS / 24 ROPs)
G71 core, 32 pipelines, 32 pixel shaders, 8 or more VS
700 MHz / 1800 MHz
256-bit memory interface
512MB
MSRP $649 US

Geforce 7900 GT (Geforce 7800 GTX 512 discontinued immediately)
G71 core, 24 pipelines, 24 pixel shaders, 8 or more VS (2 quads disabled)
600 MHz / 1600 MHz
256-bit memory interface
256MB & 512MB
MSRP $549 US

Geforce 7900 GS (will come in after the Geforce 7800 GTX 256 is discontinued; the 7800 GTX 256 will drop to this price)
G7x core, 20 pipelines, 20 pixel shaders, 7 VS (full-yield core: 20 pipes / 7 VS / 16 ROPs)
550 MHz / 1400 MHz
256-bit memory interface
256MB
MSRP $449 US

Geforce 7900 (will come in after the 7800 GT is discontinued; the 7800 GT will drop to this price)
G7x core, 16 pipelines, 16 pixel shaders, 6 VS (1 quad, 1 VS disabled)
525 MHz / 1200 MHz
256-bit memory interface
256MB
MSRP $349 US

Geforce 7900 XT (PCI-Express, replaces the 6800 GS)
G7x core, 16 pipelines, 16 pixel shaders, 6 VS (1 quad, 1 VS disabled)
450 MHz / 1000 MHz
256-bit memory interface
256MB
MSRP $249 US

Geforce 7600 GT
G73 core, 12 pipelines, 12 pixel shaders, 5 VS (full-yield core: 12 pipes / 5 VS / 8 ROPs)
600 MHz / 1500 MHz
128-bit memory interface
256MB
MSRP $199 US

Geforce 7600 GS (replaces the Geforce 6600 GT)
(Lower clock speeds and partially defective G73 cores.)
G73 core, 8 pixel pipelines, 8 pixel shaders, 5 VS
500 MHz / 900 MHz
128-bit memory interface
256MB & 512MB
MSRP $149 US

Geforce 7300 GT (dunno what the specs could be, or if there is room for such a product; replaces the Geforce 6600 DDR2)
MSRP $119 US

Geforce 7300 GS
G72 core, 4 pipelines, 4 pixel shaders, 3 VS (full-yield core: 4 pipes / 3 VS / 2 ROPs)
550 MHz / 800 MHz GDDR2
64-bit memory interface
128MB & 256MB TurboCache
MSRP $99 US for the 128MB version

Geforce 7300 LE
G72 core, 4 pipelines, 4 pixel shaders, 3 VS (full-yield core)
450 MHz / 650 MHz GDDR
64-bit memory interface
64MB, 128MB, 256MB TurboCache
MSRP $79 US for the 128MB version
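
(Side note, purely as a sanity check on the rumored clocks above: a rough Python sketch converting effective memory clock and bus width into bandwidth. The specs themselves are speculation, so the numbers are only illustrative.)

# Rough memory-bandwidth estimate from the rumored specs above (speculative numbers).
# bandwidth (GB/s) = effective memory clock (MHz) * bus width (bits) / 8 / 1000
def mem_bandwidth_gbs(effective_mhz, bus_bits):
    return effective_mhz * bus_bits / 8 / 1000

rumored = {
    "7900 GTX (1800 MHz effective, 256-bit)": (1800, 256),
    "7900 GT  (1600 MHz effective, 256-bit)": (1600, 256),
    "7600 GT  (1500 MHz effective, 128-bit)": (1500, 128),
}
for name, (clock, bus) in rumored.items():
    print(name, "~%.1f GB/s" % mem_bandwidth_gbs(clock, bus))
# ~57.6, ~51.2, and ~24.0 GB/s respectively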
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Call this wild speculation, but maybe... just MAYBE... NV will add more shader ALUs instead of more of the same pipes. For example, right now they have 2 shader ALUs in each pixel shader unit. Maybe instead of adding more pipes, they'll keep it at 24 but have 3 ALUs per shader. If they do, you heard it from me first! :)
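
(A rough way to picture the trade-off Munky is describing; the figures below are illustrative assumptions, not confirmed specs.)

# Illustrative comparison: scaling width (more pipes) vs. depth (more ALUs per pipe).
def shader_alus(pipes, alus_per_pipe):
    return pipes * alus_per_pipe

wider  = shader_alus(pipes=32, alus_per_pipe=2)  # the "more of the same pipes" route
deeper = shader_alus(pipes=24, alus_per_pipe=3)  # Munky's "keep 24 pipes, add an ALU" guess
print(wider, deeper)  # 64 vs 72 shader ALUs available per clock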
 

Alaa

Senior member
Apr 26, 2005
839
8
81
Does Unreal Engine 3 use shaders like F.E.A.R. does? I think many games are going to be built on that engine! It's good to know now.
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Most new games use shaders, hence the focus on shader performance.

Munky, Xenos actually has 32 texture units: 16 bilinear-filtered (typical pixel implementation) and 16 point-sampled (meant for VTF). You can probably use either type for anything, but that's what I understand their main usage to be.

Edit: FP blending was apparently the key feature for HDR, as it enabled devs to keep using blending effects with partially transparent textures in the higher-range FP16 format. So it seems fair to say NV40 pushed that for gamers. I guess Source's HDR implementation is an exception, but, again, I'd like to see IQ comparisons, as I don't believe ATI and NV cards are using the same precision.
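
(To make the FP-blending point concrete, here's a toy Python sketch -- not real GPU code, and the numbers are made up -- of why blending into an 8-bit fixed-point target loses overbright detail while an FP16 target keeps it.)

import numpy as np

def blend(src, dst, alpha):
    # standard blend: result = src*alpha + dst*(1 - alpha)
    return src * alpha + dst * (1.0 - alpha)

overbright = 4.0    # hypothetical HDR intensity, well above displayable white (1.0)
background = 0.25

# FP16 render target: the extended range survives the blend, so tone mapping can still use it.
print(blend(np.float16(overbright), np.float16(background), np.float16(0.5)))  # ~2.125

# 8-bit fixed-point target: the source is clamped to [0, 1] on write,
# so the overbright detail is gone before blending even happens.
print(blend(min(overbright, 1.0), background, 0.5))  # 0.625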
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Gstanfor
So HDR + AA doesn't look that out there, especially when using ATI's new (and so far exclusive) FX10 mode

This is the most amusing part of the whole "ATi has HDR and AA" argument to me. [...] it's less than FP32 [...], it's less than FP24 [...], it's less than FP16 [...], it's even less precision than the FX12 mode found in nV30! It isn't an industry defined standard that has trickled down to consumers either.
I cut out the flamebait and will ignore the irrelevant-to-"HDR" mentions of FP32, FP24, and FX12. I hope you don't mind.

I'm not about to put "HDR" in quotes every time I mention it, but much of what we see touted as such seems to be a bloom effect. You're right that apparently anything less than FP16 is considered "MDR." Curious that AoE3 shots don't show an obvious difference between FX10 and FP16, but that may be due to bloom being its primary "HDR" feature. Same with SS2, I guess.

SC:CT seems to show a clear difference between NV's FP16 and ATI's FX16, so that's something to consider. Then again, FX16 appears to be a concession to ATI's R300-R480 architecture, seeing as it was added to SC:CT's SM2 patch. I'd like to see someone use 3DAnalyze or something to trick SC:CT into running R520+ with FP16 HDR, as if it were an NV40+ card.

Yeah, EXR's FP16 format is an "industry defined standard," which I suppose helps dev and artist uptake. Ultimately being an industry standard isn't as interesting to me (as a gamer) as is what you can do with the formats you use. If FX10 shows the same IQ as FP16 but with half the bandwidth and the ability to use AA on top, I'll take it. Obviously it's not worth squat if it limits you to the point that you're not showing a difference much beyond the current FX8 standard. But it's nice to have as a higher-performing option when it won't impact IQ too severely (similar to FP16 vs. FP32). It's a compromise for performance. I'm not saying games should use it instead of FP16, but surely it can be helpful for bandwidth- or ROP-limited cards.

But I'll agree that calling FX10 an HDR format might be a stretch. Again, I'd like to see an article on HDR in current games to clarify this. I suppose SC:CT is the best example that FP16 is a minimum for the "HDR" moniker, as even FX16 (greater precision but *much* smaller range) doesn't seem to cut it.
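
(For anyone trying to keep the formats straight, a rough Python sketch of the trade-offs being discussed. Treating FX10 and FX16 as normalized fixed-point formats is an assumption based on the posts above; actual hardware behavior may differ.)

import numpy as np

# FP16 (the OpenEXR "half" float): large range, roughly 3 decimal digits of precision.
fp16_max = float(np.finfo(np.float16).max)   # 65504.0

# FX10 / FX16: fixed point, assumed normalized to [0, 1] -> no overbright range,
# but a uniform step size (finer for FX16 than for FX10).
fx10_step = 1.0 / (2**10 - 1)                # ~0.00098
fx16_step = 1.0 / (2**16 - 1)                # ~0.000015

# Bandwidth per RGBA pixel: FP16 needs 64 bits, while a 10:10:10:2 target packs
# into 32 bits -- the "half the bandwidth" point above.
fp16_bits, fx10_bits = 4 * 16, 10 + 10 + 10 + 2
print(fp16_max, fx10_step, fx16_step, fp16_bits, fx10_bits)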
 

nRollo

Banned
Jan 11, 2002
10,460
0
0
Originally posted by: Pete
Originally posted by: Gstanfor
So HDR + AA doesn't look that out there, especially when using ATI's new (and so far exclusive) FX10 mode

This is the most amusing part of the whole "ATi has HDR and AA" argument to me. [...] it's less than FP32 [...], it's less than FP24 [...], it's less than FP16 [...], it's even less precision than the FX12 mode found in nV30! It isn't an industry defined standard that has trickled down to consumers either.
I cut out the flamebait and will ignore the irrelevant-to-"HDR" mentions of FP32, FP24, and FX12. I hope you don't mind.

I'm not about to put "HDR" in quotes every time I mention it, but much of what we see touted as such seems to be a bloom effect. You're right that apparently anything less than FP16 is considered "MDR." Curious that AoE3 shots don't show an obvious difference between FX10 and FP16, but that may be due to bloom being its primary "HDR" feature. Same with SS2, I guess.

SC:CT seems to show a clear difference between NV's FP16 and ATI's FX16, so that's something to consider. Then again, FX16 appears to be a concession to ATI's R300-R480 architecture, seeing as it was added to SC:CT's SM2 patch. I'd like to see someone use 3DAnalyze or something to trick SC:CT into running R520+ with FP16 HDR, as if it were an NV40+ card.

Yeah, EXR's FP16 format is an "industry defined standard," which I suppose helps dev and artist uptake. Ultimately being an industry standard isn't as interesting to me (as a gamer) as is what you can do with the formats you use. If FX10 shows the same IQ as FP16 but with half the bandwidth and the ability to use AA on top, I'll take it. Obviously it's not worth squat if it limits you to the point that you're not showing a difference much beyond the current FX8 standard. But it's nice to have as a higher-performing option when it won't impact IQ too severely (similar to FP16 vs. FP32). It's a compromise for performance. I'm not saying games should use it instead of FP16, but surely it can be helpful for bandwidth- or ROP-limited cards.

But I'll agree that calling FX10 an HDR format might be a stretch. Again, I'd like to see an article on HDR in current games to clarify this. I suppose SC:CT is the best example that FP16 is a minimum for the "HDR" moniker, as even FX16 (greater precision but *much* smaller range) doesn't seem to cut it.

WOW.

So Gstanfor is right, and ATI's "HDR" + AA is another hacky workaround, like the one they used to get "HDR" in SC:CT?!?!?

Maybe ATI should change all their ads to "MDR+AA", and I wonder if nVidia cards could do similar tricks if they opted to lower precision as well?

I guess nts needs to delete that line about "partial precision" in his signature, or else he's mocking his own card. :(

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Rollo
Munky on advanced features April, 2005
I'm not saying SM3 is bad to have, but I would not pay $50-100 more for a video card just to have it.
As far as visual features go (like HDR and soft shadows), I've already mentioned before that most if not all of them can be done using SM2 if the developer chooses to do so. And having HDR at the expense of AA doesn't seem like a good trade-off.
So back then Munky didn't think it was worth $50 more to have SLI, soft stencil shadows, HDR, and SM3.

Now, some of us are supposedly "hypocrites" because we don't think HDR+AA and a method of AF nVidia used to offer make the X1900 the only card to have.
Munky is shifting the buying decision back to price/performance, which I think was the big issue with X1800XT vs. GTX-256. It's not quite the same situation with X1900XT vs. 7800GTX. Yeah, a quick NewEgg check shows the former at $519 and the latter at $454, but the two most definitely don't perform the same. Cast your net wider and I think the GTX still sells for around $445 minimum, vs. around $493 for an X1900XT. $50 more "just to have" potentially better IQ, but also to have definitely better performance.

These ATI fanboys who are telling the world "Your card is bad without angle-independent AF and HDR+AA," after spending the last year and a half saying multi-card, soft stencil shadows, HDR, SM3, and transparency AA were "not necessary," are the ones who need to learn what hypocrite means. (Not to mention how to spell it.)
SLI is orthogonal to single-card IQ enhancing features. Transparency AA is not an NV exclusive, and in fact is an ATI exclusive in first-gen DX9 cards. That leaves FP blending (HDR) and SM3, which are valid NV advantages. Are they necessary? That's irrelevant, as the crux of the argument is and always has been price/performance, hasn't it?

I guess you're arguing about the high end again, though, thus SLI. You're right, in that case.

You and munky are arguing about different facets of the same cards, so maybe agree to disagree on your focuses.

When I said angle-independent AF was important on nV30s, I was told it didn't matter. Now it's the whole reason to buy a card.
Yeah, you championed IQ over performance. I guess that's why you kept switching back to the 5800 series. Come on, now. Nicer IQ is obviously preferable, but "nothing in 3D is for free," hence both NV and ATI switching to angle-dependent AF algorithms as well as brilinear/trylinear compromises. Heck, hence NV including FP16 as well as FP32!

When nVidia had better transparency AA on EVERY game, it didn't matter. Now AA+HDR on two games is the reason to buy a card.
Better to compare NV's apparently superior transparency AA to ATI's apparently superior AF. HDR+AA might be better positioned against VTF or a similar feature that can only be enabled on the other card via expensive workarounds.

It seems to me the argument for X1900 over 7800 is the same as it was for 6800 over X800: same price for similar performance but with bonus IQ features that could prove useful in due time.
 

M0RPH

Diamond Member
Dec 7, 2003
3,302
1
0
I don't know anything about these different HDR modes, but from what I'm reading here, ATI's version is a lower-precision, higher-performing one?

Well if that's the case, all I can say is I'd rather have AA+(lesser precision HDR) than AA+(no HDR at all).

And actually it seems like a smart move. The NV supporters here are always saying that AA+HDR won't be feasible on these cards for performance reasons. A higher-performing form of HDR may ensure that it is feasible in upcoming games. ATI can always implement AA+(full-precision HDR) in future-gen parts when the needed performance is there.
 

RobertR1

Golden Member
Oct 22, 2004
1,113
1
81
Sounds like the Splinter Cell developers could release a patch that allows the X1K cards to use FP16 HDR along with AA, if they cared to.

The feature is supported by the card, just not enabled by the game. It's a developer issue, not an issue with the card not being able to support the feature.

 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Originally posted by: Rollo
I'll retract if you're saying that developers could do the form of displacement mapping touted as a new feature in SM3, and geometry instancing as done on an nV40, with other cards prior to its existence, and post links to back you.
Why is the burden of proof on me? I honestly don't know enough about the specifics of either implementation to say if they're equivalent. IIRC, Parhelia performed DM via texture lookups on a heightmap. I'm guessing NV40 can do the same, courtesy of VTF. Actually, that particular DM use only covers creating fixed geometry (not sure if it could be subsequently modified, or re-displaced), and I *believe* Parhelia's DM only allowed for displacing 2D geometry in one dimension (so, no caves under mountains with their DM approach, just creating hills and valleys on a plane). Does NV40's VTF allow for deformable geometry, like broken walls and girders and such? In that case, it's a superior and so at least partially new implementation. But I can't say for sure that's what SM3's DM via VTF allows for, and the only in-game example we have is wave creation in PF. That's essentially the canyon Parhelia touted--1D displacement of a 2D surface--but maybe the mere fact that the waves are animated indicates a (key) new feature.
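
(As a rough illustration of "1D displacement of a 2D surface" via a heightmap lookup, here's a toy CPU-side Python sketch; the array names and sizes are made up, and real VTF-based displacement runs in the vertex shader.)

import numpy as np

heightmap = np.random.rand(64, 64).astype(np.float32)   # stand-in height texture

def displace(grid_uv, scale=1.0):
    # grid_uv: (N, 2) UV coordinates in [0, 1]; returns (N, 3) displaced vertices.
    h, w = heightmap.shape
    texels = (grid_uv * [w - 1, h - 1]).astype(int)      # nearest-texel "vertex texture fetch"
    z = heightmap[texels[:, 1], texels[:, 0]] * scale    # push each vertex up by the sampled height
    return np.column_stack([grid_uv, z])

uv = np.stack(np.meshgrid(np.linspace(0, 1, 16), np.linspace(0, 1, 16)), -1).reshape(-1, 2)
print(displace(uv).shape)   # (256, 3): hills and valleys on a flat plane, as described above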

AFAIK soft stencil shadows began with nV40s, again link me and I'll retract.
I just realized you're talking about Riddick's soft shadows, right? (I can't think of another game that offered soft shadows as an NV-only option.) I was thinking of 3DM05, where ATI could also produce "soft" shadow edges. (Actually, I got it reversed, as NV's DST+PCF "hardware" implementation produces harder shadow edges than ATI's shader implementation--but at higher speeds, natch.) I'm not sure if they count as "soft stencil shadows," though. The answer is probably in here. A skim leads me to guess 3DM05 uses shadow maps rather than stencil shadows. Not sure if NV using "Depth Stencil Textures" is in any way related to stencil shadows.

(But, man, what a terrific performance hit. Has this improved with newer drivers or G70?)

You'll have to do better than just saying this stuff: I don't care about ATI's driver workarounds that would have had to be custom-coded, and no one did.
Yes, well, you just said this stuff too, no? Or did I miss your technical explanations?

That's my whole point though, isn't it? You know as well as I do that every "new" feature in every game you play this year was made possible by the nV40, not some BS SM2b workaround ATI cobbled together while scrambling to catch up for the last year. (And for sure not because of the Parhelia.)
We're basically talking about HDR as the only new feature with widespread adoption, right?

I'd call my arguments better than yours so far. I flat-out don't believe the Parhelia's displacement mapping is the same as what's made available in the SM3 feature set; I don't care if SM2b can emulate instancing, because I only remember one game where the developer bothered to do the patch; and I don't believe any hardware prior to the nV40 could do soft stencil shadows at all.
I was referring mostly to your first two posts, arguments you continued with munky about price/perf vs. perf-uber-alles.

You're probably right about VTF being an expanded version of Parhelia's DM. I was focusing merely on DM. It's likely VTF allows for a lot more flexibility, basically putting it in another league.

I don't know enough about Riddick's "PS2+" stencil shadows to say either way. Apparently they can't be enabled with an R520, but Riddick is OGL (and so may use NV-specific extensions) and an older game (thus not much financial incentive), so even if R520+ and its drivers are capable of SSS, the devs may never get around to accommodating it.

As for GI, AFAIK, ATI doesn't "emulate" it with R300; it just can't tell D3D it supports it via "official channels" because MS deemed it an SM3-only feature (similar to that early centroid-sampling HL2 AA issue, I think). So, rather than check the defined D3D GI cap bit, a dev has to check for ATI's FOURCC GI cap bit--that's the only workaround I'm aware of. I don't think there's any emulation (vs., say, FP filtering, which NV40+ has in hardware but R5x0 has to emulate via shaders).

This is a really long way to say HDR+AA is an issue on one good two-year-old game and one bargain-bin game. (SS2 is nice silly fun for little kids, but it's hardly the sort of thing guys considering a $500 card are going to say "OMFG! AA+HDR in Serious Sam 2! When I release a bomb-toting parrot at a buck-toothed viking while doing a bad Schwarzenegger impersonation, it will look better!")
If you think anyone will be playing UT2007 with HDR+AA, heh, I sort of doubt it.
How many games use DM/VTF or soft stencil shadows? PF and Riddick, respectively?

I'm not a mod, nor do I play one on TV.
Yet you're playing at being one, no? :p

The man who calls others hypocrites for not acknowledging the rare-as-Bigfoot and useful-as-a-floppy-disk HDR+AA, after spending a year and a half saying the many advantages of nVidia during that time were "not necessary," needed to be called on it.
Glass houses. (Yeah, that applies to me, too.)