Wreckage
Originally posted by: Paratus
It seems NV only wants to improve their GPGPU capability, while gaming prowess and cost are secondary.
Secondary? They have the gaming performance crown.
Originally posted by: Paratus
It seems NV only wants to improve their GPGPU capability, while gaming prowess and cost are secondary.
Originally posted by: Keysplayr
A few things to note from the posts in this thread:
Architecture scalability: How is ATI's architecture any more or less scalable than their competitor's?
ATI will get crushed if they don't change their arch: While I don't think they will get crushed, and will probably just double or triple the shaders and implement DX11 support, they still end up with an arch that nobody wants to code for. Even if technically ATI's architecture is superior (subjective) on paper, it won't be realized in real-world apps and games. But then again, we don't know how NV's MIMD arch will perform either. Total wait and see.
NV more focused on GPGPU than on gaming: It would appear that this statement has no teeth. NV has been extremely focused on the CUDA architecture since G80, and has been leading ATI in gaming performance ever since. It doesn't appear they have lost sight of the gaming aspect of their GPUs. Big die, small die, transistor budgets really shouldn't matter. I think that "Idontcare" did a rough calculation of what the die size should be for the number of rumored transistors of the GT300 core based on 40nm. Correct me if I'm wrong IDC, but I think you said somewhere around 220-250mm2.
ATI isn't interested in competing in the high end: This is a pleasant yet nonsensical spin on the real statement, "We can't best them, so we'll say we never intended to. Yeah, we'll go with that. And also, fellow board members, we have opted to adopt Havok as our physics method of choice. This will give us about two years to actually create a GPU that can be programmed for efficiently and can actually run GPU physics. It saves us the embarrassment of the public finding out that we can't run a whole lot more than just games well on our current architecture. Meeting adjourned. Sushi, anyone?"
By the way, anyone planning on getting insulted by these comments needs to understand that they are directed at AMD/ATI. Getting personally insulted over it would be kind of silly. Don't let it happen to you.
Originally posted by: Keysplayr
A few things to note from the posts in this thread:
Total wait and see.
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.
Originally posted by: Scali
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.
Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.
Originally posted by: Wreckage
If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra?
Originally posted by: Keysplayr
NV more focused on GPGPU than on gaming: It would appear that this statement has no teeth.
Originally posted by: Henrah
Originally posted by: Scali
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.
Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.
Linky please ^_^
. . . nVidia decided to stick with a 512-bit memory controller for the GT300 architecture. If that unconfirmed information is true, this would mean that nVidia has a playing field of eight 64-bit memory controllers connecting to multiple 32-core clusters for a total of 512 cores . . .
512-bit GPU connected to 1-2 GB of GDDR5 memory at 1000 MHz QDR [Quad Data Rate], we are looking at memory bandwidth of 256 GB/s per single GPU, more than the dual-GT200b GPU in the form of the GeForce GTX295 [896-bit x 1GHz DDR = 224 GB/s]. That would be a conservative estimate, but given the development of GDDR5 memory, if the GT300 chip ends up connected to GDDR5 memory at 1050 or 1150 MHz [alleged 5870/5870X2 clocks], we are looking at memory bandwidth anywhere between 268.8 and 294.4 GB/s.
. . .
512-bit GPU with 512 cores? If these numbers are true, we're looking at one beast of a GPU.
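To sanity-check the arithmetic in the quoted article, here is a minimal sketch in Python of the formula it relies on: peak bandwidth is bus width times the effective per-pin data rate, divided by eight to convert bits to bytes. The figures plugged in below are only the rumored numbers from the quote, not confirmed GT300 specifications.

# Minimal sketch of the bandwidth arithmetic in the quoted article:
# peak bandwidth (GB/s) = bus width (bits) x per-pin data rate (GT/s) / 8 bits per byte
def memory_bandwidth_gbs(bus_width_bits, per_pin_rate_gtps):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * per_pin_rate_gtps / 8

# Rumored/quoted figures, not confirmed specs:
print(memory_bandwidth_gbs(512, 4.0))  # GT300 rumor: 512-bit, GDDR5 at 1000 MHz QDR (4 GT/s) -> 256.0 GB/s
print(memory_bandwidth_gbs(896, 2.0))  # GTX 295: 2 x 448-bit GDDR3 at ~1 GHz DDR (2 GT/s)    -> 224.0 GB/s
print(memory_bandwidth_gbs(512, 4.2))  # GDDR5 at 1050 MHz (4.2 GT/s)                         -> 268.8 GB/s
print(memory_bandwidth_gbs(512, 4.6))  # GDDR5 at 1150 MHz (4.6 GT/s)                         -> 294.4 GB/s

For reference, the then-current single-GPU flagship GTX 285 sits at roughly 159 GB/s (512-bit GDDR3 at ~2.5 GT/s), which is consistent with Scali's "roughly twice the bandwidth of today's high-end cards" estimate.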
Originally posted by: Henrah
Originally posted by: Scali
Originally posted by: nitromullet
Although, I imagine that GT300 will use GDDR5.
Rumours say that GT300 will be GDDR5 AND 512-bit, which would amount to roughly twice the bandwidth of today's high-end cards.
Linky please ^_^
Originally posted by: GundamSonicZeroX
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
What about the FX-55?
Originally posted by: alyarb
You guys speak as if CUDA will take center stage next year rather than OpenCL.
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
Originally posted by: jandlecack
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
As I said, that was because AMD had the upper hand in CPUs at the time. It had nothing to do with AMD and ATI actively pursuing anything. The next generation, it could have gone either way and AMD + nVidia could be the better combo.
It's always just a matter of whether Intel or AMD and ATI or nVidia have the faster chips at any given point, and AMD hasn't had that in a long time.
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
Originally posted by: nitromullet
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
I doubt you were running an ATI or AMD chipset though. The best performing chipset for AMD at that time was nForce 2:
http://www.anandtech.com/mb/showdoc.aspx?i=1872&p=10
Originally posted by: ochadd
Originally posted by: nitromullet
Originally posted by: ochadd
Originally posted by: jandlecack
Originally posted by: ochadd
I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
When was the last time that happened? From my recollection, any Intel + ATI combo beats any AMD + ATI combo.
When the ATI Radeon 9800 and 9700 cards were around I was running an all AMD/ATI system that was trouncing any other combination.
I doubt you were running an ATI or AMD chipset though. The best performing chipset for AMD at that time was nForce 2:
http://www.anandtech.com/mb/showdoc.aspx?i=1872&p=10
I was probably running my old Asus A7V333, but I don't recall. VIA? My goodness, people are relentless. Only wishing AMD CPUs and ATI GPUs were the top dogs, that's all I'm getting at.
Originally posted by: ochadd
I don't think AMD is willing to do what it takes to beat them on the high end. Nvidia will unleash an unprofitable card to keep the single-card performance crown, while AMD has the common sense not to. I'd give both pinky toes to go back to an all AMD system that rocked Intel and Nvidia.
Originally posted by: ochadd
Originally posted by: nitromullet
I was probably running my old Asus A7V333, but I don't recall. VIA? My goodness, people are relentless. Only wishing AMD CPUs and ATI GPUs were the top dogs, that's all I'm getting at.
Originally posted by: Stoneburner
When the 9700 came out, I believe Intel was on the Pentium 4 "B" series (Northwood?) and it was before AMD started overtaking Intel with the A64. The XP series were good at gaming but were a bit behind Intel, and those came after the 9700.
Even the 9 series wasn't a clear win for ATI. They had better D3D performance, but Nvidia had the advantage in OpenGL. Lots of games like Doom 3, Call of Duty, Jedi Outcast, Jedi Academy, and Wolfenstein used OpenGL, so there was still a reason to buy Nvidia products. I had a Radeon 9600XT at the time and it had some major problems with Doom 3.
Now, the 9700/9800 ran roughshod over Nvidia. Initially people thought the FX was competitive with the 9 series, but then there was a scandal about NVIDIA rigging tests and cheating in games.
AMD is the only company keeping Intel and Nvidia in check. Without AMD, we wouldn't have $50 Intel processors or $100 Nvidia graphics cards.
I must be too young to understand where this die-hard hate for anything not AMD comes from.
Originally posted by: ShawnD1
Originally posted by: nitromullet
Originally posted by: ShawnD1
I had to put my spare Nvidia card into a server because Catalyst Control Center fucks with Remote Desktop.
Just curious, why are you installing CCC on a server?
The computer is occasionally used as an extra desktop computer. Even basic things like web browsing are incredibly laggy without installing drivers.
Originally posted by: BFG10K
Originally posted by: Wreckage
If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra? Or how about the 2900? ATi got back in the game by refining it into the 4xxx series, not by shipping a brand new architecture.
Originally posted by: Wreckage
Originally posted by: BFG10K
Originally posted by: Wreckage
If it's not based on a new architecture (as rumors suggest) they will get crushed.
You really can't make that inference. Remember the 5800 Ultra? Or how about the 2900? ATi got back in the game by refining it into the 4xxx series, not by shipping a brand new architecture.
I remember the 8800; it seemed pretty good. Nvidia has only really had one bad launch, and it actually managed to recover fairly well during that generation.
ATI's track record has not been as stellar, and the 4800 could not take the crown from NVIDIA's old architecture, so it seems logical that it won't stand a chance against a new one.
