Originally posted by: Qbah
Originally posted by: Keysplayr
Originally posted by: Elfear
The whole thing just leaves a sour taste in my mouth with the way Nvidia does business. I've had just as many Nvidia cards over the years (maybe more) as I have ATI cards, since bang for the buck trumps brand loyalty in my book. However, with the way I see Nvidia running things lately, if both Nvidia and ATI offered cards that performed and cost the same, I'd probably pick the ATI card. I'm probably not alone in that sentiment either, and it strikes me that Nvidia isn't winning many customers with the business strategy they've taken.
Hehe, I see Intel's business practices are cool with you. Nice i7 you have there.

:evil:
What has Intel got to do with a topic about PhysX? You can run SLI on Intel, and CF too - if anything, nVidia should learn from them! This was such blatant trolling and such a bad attempt at a rebound that it's sad.
I think it's more about Elfear's comments on the business practices of nVidia. Though to be fair, Elfear did say that with everything else being roughly equal between ATI and nVidia, he'd go nVidia. Elfear didn't say, "OMFG, burn nVidia! I'm going ATI only because your business practices suck rocks!" I'd take that to mean that if nVidia provided a superior product (i7 vs. whatever AMD has now, for example) he'd go with that.
Originally posted by: Genx87
Originally posted by: BFG10K
Artificial vendor lock-in, when done for no good reason other than to restrict the competition, is bad for the customer.
There are lock-ins in every market. It is a tool manufacturers use to keep you as a customer. I can't buy an AMD chip and use an Intel chipset. I can't buy an Audi and get OnStar. My wife uses SanDisk Rhapsody, which only works with SanDisk MP3 players. Apple's iTunes works best with Apple products. If we got up in arms about every lock-in we would lose our minds. This isn't uncommon, and it won't be the last time a vendor does this to keep customers.
Some of us remember when you could buy any motherboard and drop in a CPU from a few different vendors. We all remember how expensive CPU prices got after Intel locked out the competition. Aside from AMD, where is the competition now? Regardless of industry, lock-ins rarely (if ever) benefit consumers. Consumers are the ones who lose in most of these cases.
Originally posted by: Genx87
It sucks for people using an ATI card, but it doesn't surprise me in the least. What does Nvidia get out of you paying ATI top dollar for a graphics card and then buying a sub-$100 Nvidia card to generate physics?
And I won't be surprised in the least when/if Intel delivers Havok on a GPU that locks out the competition as well, possibly even ATI.
Well, nVidia gets a larger potential market for PhysX, as I've posted previously. This would make good business sense. Make your products more enticing and customers are more likely to buy them. Simple concept.
In the first scenario, we have a person who bought an ATI card. He'd like to enable PhysX but can't, because nVidia doesn't allow it. In this case, nVidia makes no money. nVidia loses.
In the second scenario, we have an ATI card along with an nVidia card for PhysX. nVidia makes money, probably not as much as if the person had bought an nVidia card as the primary GPU, but they still made something, and it reinforces PhysX as a standard. Not the best-case scenario, but at least they still made money.
In the third scenario, we have an nVidia + nVidia combo. nVidia just plain wins in this case.
Obviously the third scenario is what nVidia would love, but if I were running a business, the second scenario would sure beat the first. And the second scenario is what they'd get.
Originally posted by: Genx87
You ever take a business or marketing class? Lock-ins are part of the curriculum to advance your business. From a business perspective, right now it makes sense for Nvidia to disable PhysX alongside a competitor's card when that competitor has no plans to support your standard. Until Microsoft gets off their ass or Nvidia's market share plummets to ATI levels, we will see these types of marketing games.
There are two views on this. You can lock your customers in early and make it so that anyone who wants your products or technologies must use them on your terms. The potential pitfall is that the market finds your terms restrictive and decides not to adopt your products. You lose. An example of it working is the iPhone, but the iPhone built on the success of the wildly popular iPod. An example of it not working is the Mac.
Alternatively, you can try to get as many people as possible to use your products/tech before you find ways to lock them in. An example is the iPod after Apple released a version with PC connectivity. Yes, the iPod has an iTunes lock-in, but that wasn't a huge issue for the iPod's uptake; you can get music from pretty much anywhere onto an iPod. Apple has since used the iPod/iTunes dominance as a vehicle for its other products.
Originally posted by: SirPauly
nVidia already has strong leverage with GPU PhysX and the ability to offer it with a single GPU -- ATI doesn't care about it and enjoys the word "death," hehe! They seem to not be lifting a finger. It would make more sense to leverage PhysX for ATI platforms with a discrete PhysX card, to make more end-users aware and to get more systems out there that may offer this for games. But how much would it cost nVidia to do this? Is it easy to do or difficult? Does anyone really know? I have many questions.
I disagree. If one were to say nVidia has strong leverage in GPGPU, then I wouldn't be able to argue; ATI's GPGPU offerings have been anemic compared to nVidia's. However, in the physics-acceleration space, everything is at such an early stage that things could go either way. Some of the Havok tech demos (for what it's worth) have shown great potential, certainly no less than that of PhysX.
Good games typically take 2-3 years to develop. That means it could potentially be another 3-5 years before we see a real winner start to emerge in the Havok vs. PhysX wars. Yes, I know that games using PhysX are being developed even now, but I think for physics acceleration to truly shine it'll take a ground-up approach, where a game is built specifically with physics acceleration in mind rather than tacking it on as added fluff, as in current games.
The whole X-factor in all this is how many resources Intel will bring to push Havok. After all, nVidia has provided resources to developers to help push their own technology (the "The Way It's Meant To Be Played" program), as has ATI ("Get in the Game"). Intel has a vested interest in pushing Larrabee and Havok. While Intel has a really tough battle ahead, they do have a lot of money on their side.