"AMD calls nVidia's marketing bluff"


Genx87

Lifer
Apr 8, 2002
Originally posted by: Idontcare
Originally posted by: Kakkoii
Originally posted by: Schmide
Originally posted by: Kakkoii
Nvidia is changing the game once Fermi arrives. They aren't going to need to create their own new market; the market already exists, from game/video production to science to the internet. There is a huge market for fast and efficient server farms. Pixar, for example, will be able to render their movies a hell of a lot faster if they switch to a Fermi-based cluster and port their code over.

So Nvidia isn't creating a new market, but changing up one of the biggest and oldest ones.

Are you sure changing is the right word? From what we have all read, Fermi seems to be bigger, a bit more efficient (threading), and yes, more powerful. The type of processing will remain the same: massively parallel, yet largely devoid of control logic.

In the end the true test will be the power/performance numbers.

Well, I meant changing from the status quo. Currently server farms mostly use massive numbers of power-hungry CPUs. They'll change the game with a much better price/performance ratio and lower energy costs. And yeah, the processing will stay the same for the most part.

It will be interesting to see whether Nvidia can gain traction faster than Intel did with Itanium in the market spaces that required displacing COTS (commodity off-the-shelf) x86-based clusters and servers. (Perhaps one could even include the DEC Alpha desktop processor with the FX!32 translator, etc.; essentially the same barrier to entry.)

From a high-level perspective, the same barriers to entry surround Fermi/Tesla as did Itanium: both can scale select applications like mad, but you must make serious investments recoding applications for the new architecture to extract that performance; otherwise the performance/watt and scaling advantages fail to fall through to the bottom line.

If the marketing material is right and they have real-world experiences on their side, I think a lot of farms will give Nvidia a serious look. What was one of the examples they used? Some lab consolidated its servers down to 1/10th the physical size and 1/10th the electrical requirement, with far less heat, all for the same cost? One would think that if real-world experiences are like that, people would have to be brain dead not to take advantage of it. Some farms could literally see orders of magnitude more performance for the same cost.

Itanium had an issue in that it required virtually the same power and infrastructure as x86 to achieve the same performance, on top of the recoding needed to take advantage of its architecture. In other words, it had/has the same problems RISC processors did against x86.
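
For context on the recoding barrier discussed above, here is a minimal sketch of what porting a CPU loop to Fermi-class hardware involves, assuming CUDA C. The scale example and every name in it are hypothetical, not taken from any actual Pixar or lab codebase; the point is simply that the GPU version needs a kernel plus explicit device memory management and host/device transfers, which is where the porting investment goes.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// GPU version: each thread scales one element (the "recoded" inner loop).
__global__ void scale_kernel(float *data, float factor, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        data[i] *= factor;
}

// Original CPU version: a plain serial loop, kept for comparison.
void scale_cpu(float *data, float factor, int n) {
    for (int i = 0; i < n; ++i)
        data[i] *= factor;
}

int main(void) {
    const int n = 1 << 20;
    float *host = (float *)malloc(n * sizeof(float));
    for (int i = 0; i < n; ++i)
        host[i] = 1.0f;

    // Porting to the GPU means allocating device memory and copying data explicitly.
    float *dev;
    cudaMalloc((void **)&dev, n * sizeof(float));
    cudaMemcpy(dev, host, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough threads to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_kernel<<<blocks, threads>>>(dev, 2.0f, n);

    // Copy the results back to the host before using them.
    cudaMemcpy(host, dev, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("host[0] = %f\n", host[0]);  // expect 2.0

    cudaFree(dev);
    free(host);
    return 0;
}

Even for a loop this trivial, the structure of the program changes; for a real renderer or simulation code, that restructuring is the "serious investment" being weighed against the performance/watt gains.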