Originally posted by: SickBeast
Originally posted by: Keysplayr
Originally posted by: SickBeast
Originally posted by: OCguy
Originally posted by: SickBeast
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.
And I bet you actually believed it. :laugh:
Some of it made sense, yes. I personally don't see CUDA or PhysX as a complete waste of time, though, and I'm not going to write off the GT300 before it even comes out. I can see their point that it may do very poorly in terms of performance per transistor, but I wouldn't count NV out, given that they have had the fastest overall GPU for the past several generations.
The problem is that NV has not executed on midrange derivatives of the GT200, and AMD will surely make more money on their next-gen part right off the bat thanks to their superior strategy.
Not if it's the same architecture they won't. IMHO, people will want the new core on GT300. Simply because it's a complete change. New tech. MIMD. If AMD changes their architecture, they have a good shot, but I don't think they're doing much more than doubling the shaders and adding ROPs. Sure, it'll perform great in games, but how will it perform in OpenCL, DirectX Compute in Windows 7, or Snow Leopard? Ah, but I'm getting ahead of things here. I know. Wait and see.
The thing is, apparently AMD's GPUs have a bunch of features that DX10 in its current form never used; AMD thought the DX10 spec would go much further than it actually did. The hardware tessellator is the obvious example, and it means today's AMD GPUs are already more or less DX11 GPUs. In essence, it would probably be wasteful for AMD to reinvent the wheel at this point just for the sake of doing so.
In terms of GPGPU performance, I'm going to reserve judgment for quite some time. In all likelihood it won't matter much for a while, because we probably won't see great applications that benefit from it for at least the next two years. Hopefully I'll be proven wrong. Of course, there will be scientific and server applications that use it; I'm talking about killer apps for the consumer.
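Just to put the GPGPU talk in concrete terms, here's a minimal sketch of the kind of data-parallel code these compute APIs are built for. I'm using CUDA syntax because it's what the thread keeps coming back to, but the kernel and names are purely my own illustration, not taken from any real app:

// Minimal CUDA sketch (illustrative only): add two float arrays on the GPU.
#include <cuda_runtime.h>
#include <stdio.h>

// Each thread handles exactly one element of the output array.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main(void) {
    const int n = 1 << 20;                     // 1M elements
    size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes), *hb = (float *)malloc(bytes), *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 256 threads per block; enough blocks to cover all n elements.
    int threads = 256, blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);              // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Consumer software only gets a win when a big chunk of its work can be rewritten in that per-element style, which is exactly why the killer-app question matters more to me than raw compute specs right now.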