NV: Everything under control. 512-Fermi may appear someday. Yields aren't under 20%


Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Look guys, this conversation would most likely be entirely different had AMD been the one to implement AA in Batman and done an ID check. You'd be defending it til doomsday, much like you're condemning Nvidia for doing it and will continue to do so.

This is absolutely untrue and misses the point of much of the discussion in this thread. Defending nVidia by questioning the integrity of the users who are upset about the negative ramifications of nVidia's actions is a weak defense at best. :thumbsdown:

Your argument reads as though you understand the issues some forum members here have with nVidia, namely that nVidia's actions are bad for gamers. Given that you agree nVidia's actions are bad for gamers, you defend these actions simply by saying the actions themselves don't matter and that only the user's allegiance is meaningful. Which is of course wrong. Hurting a gamer's experience through shady tactics does not magically become a non-issue, as you suggest, simply by switching the identity of the perpetrator.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
This is absolutely untrue and misses the point of much of the discussion in this thread. Defending nVidia by questioning the integrity of the users who are upset about the negative ramifications of nVidia's actions is a weak defense at best. :thumbsdown:

Your argument reads as though you understand the issues some forum members here have with nVidia, namely that nVidia's actions are bad for gamers. Given that you agree nVidia's actions are bad for gamers, you defend these actions simply by saying the actions themselves don't matter and that only the user's allegiance is meaningful. Which is of course wrong. Hurting a gamer's experience through shady tactics does not magically become a non-issue, as you suggest, simply by switching the identity of the perpetrator.

Umm, I hope you enjoyed that conversation you just had with yourself. :D
 

pyroluv

Junior Member
May 10, 2010
6
0
0
I wouldn't have a problem with ATI locking out Nvidia cards from any code ATI writes that enhances the experience for ATI consumers.

Tbh the only way for game developers to be neutral between ATI/Nvidia when implementing code is the example of the PS3 and Xbox 360: you can only play PS3 games on a PS3 and Xbox 360 games on an Xbox 360. Game developers would let ATI/Nvidia do the implementing and wouldn't need to pick sides, so it's a win-win situation.
I for sure would be getting AMD/ATI hardware because it would be way cheaper to game, like how PS3 games are more expensive than Xbox 360 games.
All in all, this nonsense of not implementing features in games would go out the door, since it would be up to the vendors to add features. And if this happens it would be good news and would show who is actually working hard to perfect the game and its performance. And no one would be to blame but themselves, because that game can only work with their hardware.
It would be fair game for both sides.... :D
 
Last edited:

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Umm, I hope you enjoyed that conversation you just had with yourself. :D

So you're just gonna make ignorant statements and claim no responsibility for them? :p

I'm not one of the guys who think you're biased, but for some reason you're thinking the people in here just hate nV rather than the idea of effectively turning the PC into an nVidia console or an ATI console.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
So you're just gonna make ignorant statements and claim no responsibility for them? :p

I'm not one of the guys who think you're biased, but for some reason you're thinking the people in here just hate nV rather than the idea of effectively turning the PC into an nVidia console or an ATI console.

Well, exactly how far did you intend to take this never-ending argument? Do you have something to add that hasn't been said 7500 times already from both sides of the argument?
I really don't. But I will say that you have your views, and I have mine. And the above was kind of a joke, as it appeared that Attic was carrying out a conversation between himself and me, without me being there. He even answered for me. Get it? Ha ha? Smiley face and all?

I know, it loses something in translation over the web, right?

K.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Well, exactly how far did you intend to take this never-ending argument? Do you have something to add that hasn't been said 7500 times already from both sides of the argument?
I really don't. But I will say that you have your views, and I have mine. And the above was kind of a joke, as it appeared that Attic was carrying out a conversation between himself and me, without me being there. He even answered for me. Get it? Ha ha? Smiley face and all?

I know, it loses something in translation over the web, right?

K.

Weird, I guess. I read Attic's post as an analysis of what you'd already said; I don't see how you could have participated in his response anyway, so it seems like you're just blowing it off :p

I agree there's at least 2 good sides to the argument, and I'm not trying to participate in it because everyone involved pretty much knows the score and no one else is reading this, I just was amazed that you think the people against proprietary exclusive systems are just nV haters because they own ATI.

If AMD were doing that shit I wouldn't be happy about that either. If Samsung, who makes something like 30% of DRAM modules, required a Samsung hard drive or Samsung monitor to clock the RAM at full speed for no explicable reason I wouldn't be happy about that shit. If you can't give me some reason why your hardware doesn't work with another piece of hardware in such a standards heavy environment as the PC then it's bullshit marketing and nothing annoys me more than marketing.

And what ever happened to the mentality that gave us Sound Blaster compatible clones...
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Try popping an i7 into an AMD motherboard or a Phenom II into an Intel one. How's that for standards heavy? No standard "requires" a game to have built-in AA. No standard "requires" a game to have PhysX.
Anyway, nuff said.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Try popping an i7 into an AMD motherboard or a Phenom II into an Intel one. How's that for standards heavy? No standard "requires" a game to have built-in AA. No standard "requires" a game to have PhysX.
Anyway, nuff said.

What? That post just blew my mind. It's like the Uncharted 2 analogy.

Guess the signature says it all haha.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Try popping an i7 into an AMD motherboard or a Phenom II into an Intel one. How's that for standards heavy? No standard "requires" a game to have built-in AA. No standard "requires" a game to have PhysX.
Anyway, nuff said.

Like I said, if you can't give me a good reason the hardware won't work together... socket type is an obvious reason hardware isn't gonna work together. The CPUs require very different things from the chipset. I think it's stupid that nVidia can't make Intel chipsets due to licensing as well, btw, but at least they can make AMD chipsets.

A PCI-e card not functioning (PhysX) because a PCI-e card in a different slot with an entirely unrelated function isn't labeled nVidia is bullshit. And yes, PhysX processing is unrelated to the graphics processing; it just returns the resolved physics computations the CPU needs to set up the scene.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Like I said, if you can't give me a good reason the hardware won't work together... socket type is an obvious reason hardware isn't gonna work together. The CPUs require very different things from the chipset. I think it's stupid that nVidia can't make Intel chipsets due to licensing as well, btw, but at least they can make AMD chipsets.

A PCI-e card not functioning (PhysX) because a PCI-e card in a different slot with an entirely unrelated function isn't labeled nVidia is bullshit. And yes, PhysX processing is unrelated to the graphics processing; it just returns the resolved physics computations the CPU needs to set up the scene.

I gave you an example that not everything is a standard. Moot point.

You say it's bullsh*t, and I say it's a way to make one company's hardware more attractive than the competition's. Probably why ATI doesn't have shutter glasses like Nvidia's for 3D gaming (although I hear they are working on their own), or why Nvidia doesn't have Eyefinity (but came up with their own way of doing it). In the PC graphics card world, a company has to do something to draw attention to its own products. Make them more desirable when all else is so close. Somebody said it was anti-competitive before, but it's actually ultra competitive. Make them want yours, not theirs, whomever they are.
It's why everybody chanted up DX10.1 when Nvidia couldn't run it and ATI/AMD could.
It's why everybody chanted up PhysX when ATI couldn't run it and Nvidia could.
It's why everybody chanted up Glide when 3Dfx could and ATI/Nvidia could not.
Yin, yang. Here, there. It's just a continuing cycle.
 

Rebel44

Senior member
Jun 19, 2006
742
1
76
Another epic fail from Fermi:
NV promised to have the NV Surround driver available in April. It's now May and it looks like users will have to wait a loooooong time. NV didn't even bother to post "Sorry we are late, drivers will be out in X weeks".

From Hardforum:
Stole this from another thread. Looks like nvidia surround won't make it into the first 256 driver release.

QUOTE (Freelancer852 @ May 3 2010, 09:01 PM)
Just out of curiosity ManuelG, will the non-3D Surround features (to clarify, we're wondering about the ability to use our three monitors with SLI enabled) be included in the first 256 series drivers? I know that myself and a good friend of mine have been eagerly awaiting these new drivers in hopes that we can retire our SoftTH hacks in favor of an officially supported solution. If you're allowed to comment on this subject it would really put our minds at ease, we have no problem waiting a couple more months for this either."

^^ random kid on the forum.


"With the initial launch driver, this will not be supported but it will with a future driver. There will be two modes, NVIDIA Surround and NVIDIA 3D Vision Surround. Obviously the difference is one will use NVIDIA 3D Stereo technology and have hardware requirements similar to the current NVIDIA 3D Vision hardware requirements. The other is NVIDIA Surround which is what you are referring to which will support three displays or projectors synchronized together in landscape or portrait mode. All three displays must run at the same resolution, refresh rate and sync polarity. "

^^ manuelG ( from nvidia )

Sucks :/



 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Another epic fail from Fermi:
NV promised to have the NV Surround driver available in April. It's now May and it looks like users will have to wait a loooooong time. NV didn't even bother to post "Sorry we are late, drivers will be out in X weeks".

From Hardforum:





You know what AMD should do to be "ultra competitive"? Lock Eyefinity resolutions to their own cards and keep nV cards running under 1920x1080 across 3 monitors. That would be so good for the consumers :p
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Look guys, this conversation would most likely be entirely different had AMD been the one to implement AA in Batman and done an ID check. You'd be defending it til doomsday, much like you're condemning Nvidia for doing it and will continue to do so.
BS. Complete and utter BS. If AMD were doing this, everybody would be condemning them for it, you included. But because it's nVidia, you're forced to try and spin it to make AMD look like the bad guy instead.


Batman had no in game AA. Nvidia added it. Now NV card users have the option in game.
And NV card users only. Unlike the DX10.1 and DX11 effects AMD has helped developers with. Those can be displayed on everybody's cards. But we don't hear you complaining about getting those effects for free on your nVidia cards.


And also, something I don't quite understand: how long has it been since Batman: AA came out and this whole AA debacle got underway? Tell me, have you seen, in all this time, AMD following through to offer their own AA code in this game for their loyal customers? Haven't you wondered about that? Did you not once ask yourselves why AMD hasn't ALSO offered this feature for you?
From what I've read, one reason is because it's a standard bit of coding that NV is claiming as proprietary.

We have asked our "panel of independent judges" i.e. developers about this method to ensure objectivity of this article and in both cases our developers told us that the description above is a standard technique advocated to gamers by both AMD and nVidia. With the description above being explained to us as "standard", it turns out that the code that was labeled "MSAA Trademark NVIDIA Corporation" was identical to the one recommended by both sides with one very important difference: nVidia's code features a Vendor ID lock that allows for AA menu inside the game only if nVidia hardware is detected.
Another reason is that AMD shouldn't have to submit identical code, sans the vendor ID lockout. They freely allow all the graphics they assist developers with to be displayed on everybody's video cards. NV is the one trying to bring that sort of interoperability to an end.
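To make the vendor ID lock described above concrete, here is a toy Python sketch of what such a gate amounts to. The PCI vendor IDs are the real, public constants; the function name and structure are made up for illustration and are not the actual game code:

```python
# PCI vendor IDs are public constants; everything else here is illustrative.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def aa_menu_enabled(adapter_vendor_id: int) -> bool:
    """Mimics the reported lockout: the in-game AA option only appears
    when the detected GPU reports NVIDIA's vendor ID, even though the
    MSAA code path itself is described as vendor-neutral."""
    return adapter_vendor_id == VENDOR_NVIDIA

# An ATI card capable of running the exact same MSAA path is refused:
print(aa_menu_enabled(VENDOR_NVIDIA))  # True
print(aa_menu_enabled(VENDOR_ATI))     # False
```

The point of the sketch is that nothing about the check tests capability; it only tests identity.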


I think you know what questions you should be asking. Why haven't you pummeled AMD with emails asking them for in game AA in Batman? Did you not want it for yourselves?
Oh, come on... Did you get that little gem from Rollo? You're better than that, Keys. Don't sink to his level.


There is a TON of "I don't get it(s)" whenever we have this conversation.
And I suspect the majority of them you profess to have because of that "Nvidia Focus Group member" signature you're forced to display.

Really Keys, the members here understand the situation and what it potentially means to gamers from both camps. That's why the majority of us are against this sort of behavior.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I gave you an example that not everything is a standard. Moot point.

You say it's bullsh*t, and I say it's a way to make one company's hardware more attractive than the competition's. Probably why ATI doesn't have shutter glasses like Nvidia's for 3D gaming (although I hear they are working on their own), or why Nvidia doesn't have Eyefinity (but came up with their own way of doing it). In the PC graphics card world, a company has to do something to draw attention to its own products. Make them more desirable when all else is so close. Somebody said it was anti-competitive before, but it's actually ultra competitive. Make them want yours, not theirs, whomever they are.
It's why everybody chanted up DX10.1 when Nvidia couldn't run it and ATI/AMD could.
It's why everybody chanted up PhysX when ATI couldn't run it and Nvidia could.
It's why everybody chanted up Glide when 3Dfx could and ATI/Nvidia could not.
Yin, yang. Here, there. It's just a continuing cycle.

What I said was 'If you can't give me some reason why your hardware doesn't work with another piece of hardware in such a standards heavy environment as the PC then it's bullshit marketing'. I didn't say everything had to comply with the same standards; I said they needed a reason not to.

Socket type IS NOT BULLSHIT MARKETING and doesn't piss me off. You gave me an example of exactly what I said was not bullshit marketing. So... yay. Good example of nothing.

Adding new features is a great way to make hardware more attractive. 3d vision, Eyefinity, Surround, DX10.1, DX11.. all great.

Disabling those features because you don't like what the person is spending their money on is marketing run amok. I bought a Radeon so I could use Eyefinity.. if I couldn't use an nForce board and Eyefinity I'd call bullshit on that too.

I understand nVidia can gamble and win big by vendor-locking this stuff. That's great for them. I'm not nVidia... I don't win too. I would win if more than one game a year used GPU PhysX, and not just for floating leaves.
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
Like I said, if you can't give me a good reason the hardware won't work together... socket type is an obvious reason hardware isn't gonna work together. The CPUs require very different things from the chipset. I think it's stupid that nVidia can't make Intel chipsets due to licensing as well, btw, but at least they can make AMD chipsets.

A PCI-e card not functioning (PhysX) because a PCI-e card in a different slot with an entirely unrelated function isn't labeled nVidia is bullshit. And yes, PhysX processing is unrelated to the graphics processing; it just returns the resolved physics computations the CPU needs to set up the scene.

You are right, but some people just don't want to see it and give silly examples that they think support their side ("diesel cars don't run on normal petrol either!" :D). The scenario in question, with an nVidia card doing PhysX and an ATi card doing the rendering, works fine. It's been tested by users and there are hacked drivers to allow it. There's only a business reason for blocking that pairing: they do not care about progress or their customers, they just want to force you to go all-green, without any technical reason for doing so.

It's the same with SLi and X58-based motherboards. The only thing stopping some of them from using SLi is an nVidia license; there are no technical reasons for it (nVidia provides the mobo makers with a small bit of code in the BIOS that enables SLi in the drivers). Even now people flash motherboards with "SLi enabled BIOSes" from the same line and it works fine. We have been fed the same bullshit with the older chipsets too.

It's all about the money :)
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
You are right, but some people just don't want to see it and give silly examples that they think support their side ("diesel cars don't run on normal petrol either!" :D). The scenario in question, with an nVidia card doing PhysX and an ATi card doing the rendering, works fine. It's been tested by users and there are hacked drivers to allow it. There's only a business reason for blocking that pairing: they do not care about progress or their customers, they just want to force you to go all-green, without any technical reason for doing so.

It's the same with SLi and X58-based motherboards. The only thing stopping some of them from using SLi is an nVidia license; there are no technical reasons for it (just a small bit of code in the BIOS that enables SLi in the drivers). Even now people flash motherboards with "SLi enabled BIOSes" from the same line and it works fine. We have been fed the same bullshit with the older chipsets too.

It's all about the money :)

Good example... I don't think you need a license to have Crossfire on a mobo, do you?
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Good example... I don't think you need a license to have Crossfire on a mobo, do you?

No license required.. the SLI license is about $5 per mobo from what I've read. I think nVidia would still have SLI restricted to nForce boards if they had been able to get a license to make Intel chipsets, but since they couldn't it's the only way they could make money from Intel boards.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
Oh, come on... Did you get that little gem from Rollo? You're better than that, Keys. Don't sink to his level.

Cool bait. And, Not. It's a perfectly valid question. Have you yourself petitioned AMD to add AA to Batman?
Have others? Asked them why they haven't yet?
Or is it possible that you really don't even care?
You can fall back to my focus group affiliation as often as you like, as some often do when they don't like my opinions. However, like you, my opinion on this is pretty clear. I respect yours, and I expect you to reciprocate. And we agree to disagree.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Cool bait. And, Not. It's a perfectly valid question. Have you yourself petitioned AMD to add AA to Batman?
Have others? Asked them why they haven't yet?
Or is it possible that you really don't even care?
You can fall back to my focus group affiliation as often as you like, as some often do when they don't like my opinions. However, like you, my opinion on this is pretty clear. I respect yours, and I expect you to reciprocate. And we agree to disagree.


EDIT: Another pointless ramble from me, it's late here ;) In any event, you have lots of useful information to provide, which you would not be able to provide unless you were a member of said group. That is clearly something of considerable value to people on this forum, particularly those who are undecided, and those using nVidia cards with queries or issues or feedback for you :)
 
Last edited:

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Although it definitely "is" in the best interests of their customers, or potential customers.
I would think so anyway. And as for the best interests of PC gaming and gamers, Well, as a PC gamer, wouldn't you buy the cards that gave you the most? Or is it that it's not in the best interest of those gamers who would just refuse to buy Nvidia products? I don't really know.

How is it in the best interest of their customers? Let's say I'm running an AMD 5870 as my main card and a 9800GTX+ for PhysX. Nvidia advertised the 9800GTX+ as a PhysX-capable part. Then they release a driver that cuts off that ability. My card is now a paperweight in my PC; it's worthless. Nvidia has my money. I bought the card to use it as it was advertised to do (and yes, Nvidia did create the part to be usable as a PPU only, otherwise why would they have the option in the driver to select the card as a PhysX-only part?).

How is that in the best interest of their customers? They have my money. I bought a part to use it as it was made to function. Now it doesn't work because Nvidia had a hissy fit.

You never answered this even though I mentioned it in several posts. Tell me, how would you feel about AMD's business tactics if their next driver release disabled 3D capabilities and HTPC abilities when an Intel chipset/CPU is detected? Also, keep in mind it worked fine for months. The Radeon is advertised to have certain functions but AMD just disables those functions when an Intel chipset/CPU is detected... would that be something that is in the best interest of their customers?

Why wouldn't Nvidia just leave that as an unsupported configuration? As in, it may work, but you're on your own if you have problems? I don't see how you can possibly say this is in the best interest of their customers... all they have done is hurt some of their customers and tarnish their reputation in the gaming world.
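The driver-side lockout described above is a policy check, not a capability check. A hypothetical Python sketch of the logic users are describing (the function name and list representation are made up; the vendor IDs are the real PCI constants):

```python
# Illustrative only: a policy gate of the kind users describe, where a
# fully functional PhysX card is refused because of what ELSE is installed.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def physx_available(installed_gpu_vendors) -> bool:
    """The 9800GTX+ works as a dedicated PhysX card on its own, but the
    driver reportedly refuses the same card once any non-NVIDIA display
    adapter is detected in the system."""
    return all(v == VENDOR_NVIDIA for v in installed_gpu_vendors)

print(physx_available([VENDOR_NVIDIA]))              # True: all-green system
print(physx_available([VENDOR_ATI, VENDOR_NVIDIA]))  # False: 5870 + 9800GTX+
```

Note that the second call fails even though the exact same NVIDIA card is present and working; only the neighbor changed.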
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
No license required.. the SLI license is about $5 per mobo from what I've read. I think nVidia would still have SLI restricted to nForce boards if they had been able to get a license to make Intel chipsets, but since they couldn't it's the only way they could make money from Intel boards.

I bet nVidia was thinking about licensing "running a different vendor as a 3D renderer for a PhysX-enabled environment" and since nobody was interested, they came up with that laughable excuse they're feeding us right now.

CrossFire is free; SLi is a few bucks per mobo sold. There are no technical reasons any X58-based motherboard can't run SLi. It's just an on/off switch from nVidia that's needed in the BIOS. It was the same with the older-generation chipsets: those did CrossFire fine. But since nVidia was making their own Intel chipsets, allowing it on other boards was a no-no. And they were also feeding us the technical-limitation BS.

EDIT: Hell, before the Intel 3-series chipsets, when ATi was separate from AMD, the red camp was making their own chipsets too. But, in contrast to today, you could have CrossFire only on an ATi-based motherboard. The Intel i945+ chipsets had two PCI-e x8/x16 slots and supported neither SLi nor CF; I think the only one that was CF-enabled was the enthusiast i975. So it's not like ATi didn't do it either. However, they have since changed their approach and are more consumer friendly: every board supports CrossFire now without any additional limitations, be it Intel-based or AMD (the owner of ATi and the CF technology now). That's a consumer-friendly approach, something that cannot be said about nVidia.
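To make the "on/off switch in the BIOS" concrete: as I understand it, the driver essentially looks for an NVIDIA-approved SLi marker in the board's firmware and unlocks SLi if it is found; no hardware capability is involved. A toy Python sketch, where the marker string and function name are invented for illustration (the real mechanism is an NVIDIA-supplied blob in the BIOS, not this literal string):

```python
# Hypothetical marker; the real thing is an NVIDIA-supplied blob embedded
# in the motherboard BIOS, not this literal string.
SLI_CERT_MARKER = b"NVIDIA_SLI_CERT"

def sli_allowed(bios_image: bytes) -> bool:
    """A licensing gate, not a capability check: the board's PCI-e
    wiring is identical whether or not the marker is present."""
    return SLI_CERT_MARKER in bios_image

licensed = b"...firmware..." + SLI_CERT_MARKER + b"...more firmware..."
unlicensed = b"...identical firmware, minus the certificate..."
print(sli_allowed(licensed))    # True
print(sli_allowed(unlicensed))  # False
```

Which is exactly why flashing a "SLi enabled BIOS" from a sibling board works: the hardware never needed anything, only the marker did.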
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Cool bait. And, Not. It's a perfectly valid question. Have you yourself petitioned AMD to add AA to Batman?
Have others? Asked them why they haven't yet?
Or is it possible that you really don't even care?
You can fall back to my focus group affiliation as often as you like, as some often do when they don't like my opinions. However, like you, my opinion on this is pretty clear. I respect yours, and I expect you to reciprocate. And we agree to disagree.

You know why they haven't. You keep asking questions that have already been answered, and you don't answer any that you are asked, like what SlowSpyder asked above.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Cool bait. And, Not. It's a perfectly valid question. Have you yourself petitioned AMD to add AA to Batman?
Have others? Asked them why they haven't yet?
If you'll look above, I believe I have already answered these questions.

How about if I throw a couple at you now?

-Do you feel that AMD should block Nvidia users from any DX10.1 and DX11 content they've helped developers implement?

-Is it better for consumers as a whole to have video card companies paying developers to block content from each others cards? And if so, how is it better for us?


Or is it possible that you really don't even care?
Obviously, since I'm taking the time to make my viewpoint known, I care.

You can fall back to my focus group affiliation as often as you like, as some often do when they don't like my opinions. However, like you, my opinion on this is pretty clear. I respect yours, and I expect you to reciprocate. And we agree to disagree.
I'm not sure what 'respecting each other's opinions' has to do with anything. Does that mean we cannot debate a subject? We are both free to create our own responses to any posts made within these forums.

The reason I brought up your Nvidia Focus Group affiliation is that anybody's opinion regarding a company they work for (or receive compensation from) is always going to be suspect. I'm sorry if that bothers you, but you had to know that was going to happen the minute you joined the Nvidia Focus Group. In this case, defending the nVidia vendor ID lockout while simultaneously attempting to say that AMD is the one to blame is, to me, beyond ridiculous and makes me look for the real motivation behind making such a claim.
 

Martimus

Diamond Member
Apr 24, 2007
4,490
157
106
How is it in the best interest of their customers? Let's say I'm running an AMD 5870 as my main card and a 9800GTX+ for PhysX. Nvidia advertised the 9800GTX+ as a PhysX-capable part. Then they release a driver that cuts off that ability. My card is now a paperweight in my PC; it's worthless. Nvidia has my money. I bought the card to use it as it was advertised to do (and yes, Nvidia did create the part to be usable as a PPU only, otherwise why would they have the option in the driver to select the card as a PhysX-only part?).

In this case, it would likely be best to open up a class action. Suing by normal means wouldn't be cost effective, since the legal fees would be more than you would get in a settlement. That's the reason they are doing this illegal activity: the cost of calling them on it is prohibitive. If you can get enough people to join you, you make it less cost prohibitive, and you might even get a larger sum from the lawsuit. It would also make nvidia less likely to do such a thing in the future.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
The reason I brought up your Nvidia Focus Group affiliation is that anybody's opinion regarding a company they work for (or receive compensation from) is always going to be suspect. I'm sorry if that bothers you, but you had to know that was going to happen the minute you joined the Nvidia Focus Group. In this case, defending the nVidia vendor ID lockout while simultaneously attempting to say that AMD is the one to blame is, to me, beyond ridiculous and makes me look for the real motivation behind making such a claim.
It is, but good luck ever getting a straight answer from him. Excellent reply though.