A call to boycott Nvidia games for ATI/AMD owners

2 chips that are barely faster than 1 chip..... uhmazin!!! 😱

is that really an argument?

So we should just disregard anything that uses 2 chips? Eh, so SLI and XFire shouldn't even be mentioned ever. They have 2 chips.

Quad cores are faster in parallel applications than dual cores!! HAZZAAA



The lack of logic in this entire thread is outstanding. How is this not locked yet.
 

Oh, haven't you heard? ATI's dual-GPU cards don't count as much as NV's dual cards.
I still remember all the talk about how the GTX 295 was faster than the 5870.
 

Well, talk about double standards here: everyone was screaming out loud that the GTX 295 could outperform the HD 5870 in some games. Now suddenly the HD 5970 beating the GTX 580 is no longer valid 🙄

Lol, c'mon now, why is it Nvidia's problem that AMD can't optimise and work with developers like they do? There are reasons AMD cards are cheaper.

They are engineered better, are smaller to manufacture, and require far less circuitry on the PCB to run the GPU. The HD 6970 and the GTX 570 perform almost the same, yet the HD 6970 consumes less power, offers more bang for the buck, and doesn't dive when you game at ultra-high resolutions.
 
Yeah, they'll hear you all right.

"Nvidia, we will not buy your products until you reduce the performance of your cards in games to be equivalent to AMD's offerings and performance."

How many weird looks do you think that would snag?
Honestly guys, this is the dumbest idea I've heard in a long time.

Ok, you guys have fun now. This is all the keystrokes I'll put into this topic.

/cheers.
You say one thing and do another. After saying the above, I saw more keystrokes from you put into this topic. Can't do what you say you'll do?


Member callouts are not productive and violate the posting guidelines.

Please read and reflect upon the following comments taken directly from our AnandTech Forum Guidelines:
We want to give all our members as much freedom as possible while maintaining an environment that encourages productive discussion. It is our desire to encourage our members to share their knowledge and experiences in order to benefit the rest of the community, while also providing a place for people to come and just hang out.

We also intend to encourage respect and responsibility among members in order to maintain order and civility. Our social forums will have a relaxed atmosphere, but other forums will be expected to remain on-topic and posts should be helpful, relevant and professional.

We ask for respect and common decency towards your fellow forum members.

These kinds of castigating diatribes against your fellow forum colleagues need to stop. They are not acceptable.

Moderator Idontcare
 

It really, REALLY shouldn't be a concern of yours what I do. If the conversation proves interesting to me, I'll post as I like, regardless of whether I previously said I was done posting. Understand?
 
Wellllll.... if they were innovating in a way that benefited everyone, people would probably see that as a positive and buy more of their cards to support the true innovators. Instead they use that power to stifle PC gaming and try to limit people to just their products. I think it'd work out better if they weren't trying to destroy PC gaming but rather made it better for everyone, even competitors' products.

As of now, it's more in PC gamers' interests to see Nvidia go away and be replaced by another company with a different mindset than to encourage the behavior.

How exactly can they do that? Go beyond the standards and yet have to support everyone too? Does nVidia look like gaming welfare to you, and everyone is simply entitled?
 
Yeah, I understand..... a poster who can't keep his word. Amazing.


Personal attacks are not acceptable.

Moderator Idontcare
 
Wow, almost a decade ago. Although it seems a lot longer. Back then it seemed that the NV3x architecture played a major part in its weaker performance. That could just as well hold true today with AMD's architecture, although probably not as severe an example as NV3x.
 
Because it's cheaper that way I'd wager. The more layers you stack up the more expensive it gets.

Yup, that's the norm lately as GPUs get more complex. My HD 6970 barely fits my Antec 900 case :biggrin:


AMD's architecture doesn't have the weaknesses that the NV3x had. nVidia's NV3x was their first and last attempt at launching a pure superscalar architecture, and they failed miserably. AMD, on the other hand, has been very successful with it since the HD 2900 XT, which was very competitive with the 8800 GTS 640. It allowed AMD to make small chips that can be as competitive as their competitor's much larger chip. Of course, nVidia's approach isn't bad either, as it means their GPUs don't rely too much on driver optimizations and will have predictable performance. In the end, both approaches have their strong points and weaknesses.
 
I really wanted the HD 6990, but I couldn't wait for it and probably wouldn't have been able to afford it.

I don't buy Ubisoft games for a variety of reasons, starting with their DRM.

I won't buy Nvidia cards again until they stop disabling PhysX GPU acceleration when AMD products are present in the system. There are several things wrong with Nvidia doing this.
 

Dude, you won't believe how much trouble I had trying to get that 9600GT working in my system. I did get it working, but I was so pissed off I just ripped the card out afterwards. PhysX wasn't impressive enough to be worth it.
 

From the other side of the coin, it's seamless here, with nice gains in Mafia 2, but the key issue is the lack of new content. Where is it? I'd like to see nVidia rethink their strategy and make an attempt to port it to OpenCL. From a gamer's perspective, division, chaos, and fragmentation suck, but gamers shouldn't have to make tough decisions. I realize I don't have the data nVidia has, but how much more mature is CUDA PhysX than an OpenCL PhysX would be? Would a port be worth doing for nVidia's customers? How much would it cost? Differentiation is important to nVidia, one imagines; how does that play in?

It's hard to follow sometimes: nVidia wants their GPUs in everything and will support anything that offers GPU processing, yet they won't support their own PhysX GPUs depending on which card is rendering. On the surface it's tough to understand. I've tried to use logic and can guess at their reasoning, but I still have problems with it. Combine that with the lack of GPU PhysX content, and maybe it's time to move forward and port it now.
 