secretanchitman
Diamond Member
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.
I'm sure Nvidia knows everything it needs to know about R600.
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.
I'm sure Nvidia knows everything it needs to know about R600.
Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.
Originally posted by: munky
I wasn't expecting anything more from an 8800u; I said before that it's most likely just a speed bump. I would be surprised, however, if it sold for almost a grand and someone actually bought one (or two... :laugh: )
Originally posted by: golem
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.
I'm sure Nvidia knows everything it needs to know about R600.
Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.
What Matt2 said actually makes sense. It would be logical that your closest competitor would be the one that spends the most time and effort learning about your product and trying to come up with a counter for it.
If the 8800u has no chance against R600 using GDDR3, why bother making it, especially if the rumours of its limited availability and pricing are true?
Originally posted by: BFG10K
"BFG10K, I would not say that Stalker really taxes the card that badly."
Oh really? Turn the details on full and run the game at 2560x1600.
"Sure, the FPS can hover in the low 40s, but that is really with everything maxed. I find 40 more than playable."
At what resolution/settings?
"Can't speak for Call of Juarez though, as I never tried it."
You don't need to "speak" for it, just look at the benchmarks.
Those scores are without AA too - imagine what they would look like with AA.
Granted, games can be run at lower detail settings but it's foolish to claim "you don't need more power" when you can always take advantage of it in any game by raising detail levels, resolution and/or AA.
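As a rough illustration of why AA on top of 2560x1600 is such a load, here is a minimal sketch of just the render-target memory for 4x MSAA, assuming 4 bytes of color plus 4 bytes of depth/stencil per sample and ignoring driver tricks like compression (those assumptions are mine, not something stated in the posts above):

```python
# Rough render-target footprint at 2560x1600 with 4x MSAA.
# Assumes 4 bytes/pixel color + 4 bytes/pixel depth-stencil per sample; ignores compression.
width, height, samples = 2560, 1600, 4
bytes_per_sample = 4 + 4  # color + depth/stencil
total_mb = width * height * samples * bytes_per_sample / (1024 * 1024)
print(f"4x MSAA render targets at 2560x1600: ~{total_mb:.0f} MB")  # roughly 125 MB
```

That is before textures or any other buffers, so memory and bandwidth pressure climb quickly once AA is added at that resolution.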
Originally posted by: terentenet
@jim1976: what's "the missing MUL"?
Originally posted by: thilan29
Maybe Dell accidentally called it an 8800 Ultra on their website??
Originally posted by: bunnyfubbles
Originally posted by: golem
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.
I'm sure Nvidia knows everything it needs to know about R600.
Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.
What Matt2 said actually makes sense. It would be logical that your closest competitor would be the one that spends the most time and effort learning about your product and trying to come up with a counter for it.
If the 8800u has no chance against R600 using GDDR3, why bother making it, especially if the rumours of its limited availability and pricing are true?
That logic is far from bulletproof. What would they do if R600 proved to be twice as fast as the 8800GTX? Magically pull a G90 out of their ass?
Bumping clock speeds as they have is just about the only short-term response available to them. And if you had read it, you would have understood that achieving those clock speeds on current 8800 hardware practically requires water cooling - meaning the speed bump pushes the hardware to the limit of air cooling, so I wouldn't call it a feeble increase.
And as far as the memory is concerned, is 2160MHz GDDR3 not fast? That provides bandwidth on par with the rumored XT version of the R600 and its 1600MHz RAM. Of course, it's true that if there is any flavor of R600 with RAM as fast as 2160MHz, it would have a decisive memory bandwidth advantage.
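A quick back-of-the-envelope check of that bandwidth comparison, assuming the 8800's 384-bit memory bus and the 512-bit bus rumored for R600 (the bus widths are assumptions on my part, not something stated in the post):

```python
# Memory bandwidth = effective data rate (MHz) * bus width (bits) / 8 bits per byte.
def bandwidth_gb_s(effective_mhz: int, bus_bits: int) -> float:
    return effective_mhz * 1e6 * bus_bits / 8 / 1e9

print(f"2160MHz GDDR3 on a 384-bit bus: {bandwidth_gb_s(2160, 384):.1f} GB/s")  # ~103.7
print(f"1600MHz RAM on a 512-bit bus:   {bandwidth_gb_s(1600, 512):.1f} GB/s")  # ~102.4
print(f"2160MHz RAM on a 512-bit bus:   {bandwidth_gb_s(2160, 512):.1f} GB/s")  # ~138.2
```

Which is why "on par" is about right, and why a 2160MHz flavor of R600, if it exists, would pull well ahead.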
What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is bringing and are thus always prepared. Well then, WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?
It doesn't matter whether R600 turns out strong, on par, or poor; the only thing nVidia can do to preempt it is exactly what they're doing with this supposed 8800 Ultra.
Originally posted by: jim1976
Originally posted by: terentenet
@jim1976: what's "the missing MUL"?
Simply put, MADD and MUL are mathematical operations that the ALUs inside a GPU can perform, especially for shader computations. MADD means multiply+add (a multiply and an add in one operation), while MUL is a simple multiplication. So a MADD does more work per clock than a MUL, and combined they can deliver higher throughput. G80 has scalar ALUs, and MADD is the most common operation, but the extra MUL can in theory be used too for greater performance, especially in the shader part. The "missing" part refers to the fact that this extra MUL has so far not shown up in general shading performance (it reportedly gets tied up with special-function and interpolation work), so G80's quoted peak throughput hasn't materialized in shader tests.
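To put numbers on that, here is a minimal sketch of the arithmetic behind the "missing MUL" talk, assuming the commonly cited 8800 GTX figures of 128 scalar stream processors at a 1.35GHz shader clock, counting a MADD as two floating-point ops per clock and the co-issued MUL as one (those figures are assumptions on my part):

```python
# Peak shader throughput sketch for G80, assuming 128 SPs at a 1.35 GHz shader clock.
# A MADD counts as 2 ops/clock per SP; the co-issued MUL would add 1 more if usable.
sps = 128
shader_clock_ghz = 1.35

madd_gflops = sps * shader_clock_ghz * 2                  # MADD only: ~345.6 GFLOPS
with_mul_gflops = madd_gflops + sps * shader_clock_ghz    # MADD + MUL: ~518.4 GFLOPS

print(f"MADD only:  {madd_gflops:.1f} GFLOPS")
print(f"MADD + MUL: {with_mul_gflops:.1f} GFLOPS")
```

The gap between those two numbers is essentially what people mean when they say the MUL is "missing": the higher figure is quoted on paper, but shader tests tend to show only the lower one.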
Originally posted by: chizow
Originally posted by: bunnyfubbles
What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is bringing and are thus always prepared. Well then, WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?
There are lots of ways to know what your competition is up to. At the end of the day there are people behind these big companies; they're not just faceless entities. People talk, and NV and ATI share a few AIB partners and also compete for a lot of the same OEM business.
They've also, up to this point (this may change if AMD starts fabbing its own chips, as rumored), shared the same chip-maker in TSMC. The people who work for these companies are smart folks; you could probably transplant any group of engineers from either camp and they could transition over in a short amount of time. Hell, they could probably even simulate or virtually re-create a competitor's part if they wanted to, the same way they model their own chips before taping out.
I'm not implying any wrongdoing from either camp, but it'd be naive to suggest either company is operating with blinders on in regards to what the other is doing.
With NV30 vs. the 9700 Pro, NV was already too far into NV30 to scrap it and start over. They made a gamble and lost, thinking their 1x2 rendering path would pay off and that game programmers would bend over backwards to make their chips run well. They quickly scrapped it and moved on.
AMD has always been slower in bringing new chip architectures to market. They don't have the R&D and fab prowess that Intel has, so when they have a winning part (which they did for 2-3 years before Core 2) they tend to squeeze every last bit out of it. That's not to say they don't know they're beat. Spin it as you like, but there's little doubt G80's amazing performance was a major reason for R600's delays. It simply wasn't ready to compete with G80, and ATI/AMD knew it.
Originally posted by: terentenet
So let me get this right. G80 has MUL but it's disabled? What's the reason for that? Maybe they ran into issues using it and decided to disable it. Or did they disable it on purpose just to keep an ace in the hole if they need it?
Back when G80 was launched, they said the shaders are scalar and more can be added without too much hassle. I wonder why they won't add another 32 alongside the core and memory speed bump. We need innovation, not just a speed bump that can also be obtained by overclocking.
I for one already feel the need for more power, as I just got my 30" LCD and gaming at 2560x1600 taxes even 8800GTX SLI. I need to back down on AA/AF to keep things smooth at that res.
30" LCDs have been on the market for 1-2 years, prices have dropped, and more people will adopt the large-LCD trend. We need significantly more powerful GPUs.