Dell leaks 8800 Ultra specs


Munky

Diamond Member
Feb 5, 2005
9,372
0
76
I wasn't expecting anything more from an 8800u; I said before that it's most likely just a speed bump. I would be surprised, however, if it sold for almost a grand and someone actually bought one (or two... :laugh: )
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.

I'm sure Nvidia knows everything it needs to know about R600.

Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.
 

terentenet

Senior member
Nov 8, 2005
387
0
0
Just a speed bump? With watercooling you can get those speeds on the existing 8800GTXs. I was hoping for more shaders plus a core+mem speed bump. I mean, will this small bump be enough to match the R600?
Maybe Dell got it wrong and it's just another factory OCed card.

@jim1976: what's "the missing MUL"?
 

golem

Senior member
Oct 6, 2000
838
3
76
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.

I'm sure Nvidia knows everything it needs to know about R600.

Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.

What Matt2 said actually makes sense. It would be logical for your closest competitor to be the one that spends the most time and effort learning about your product and trying to come up with a counter to it.

If the 8800u has no chance against R600 using GDDR3, why bother making it, especially if the rumours of its limited availability and pricing are true?

 

Aquila76

Diamond Member
Apr 11, 2004
3,549
2
0
www.facebook.com
Originally posted by: munky
I wasn't expecting anything more from an 8800u; I said before that it's most likely just a speed bump. I would be surprised, however, if it sold for almost a grand and someone actually bought one (or two... :laugh: )

sonz70 already has clientèle lined up.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: golem
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.

I'm sure Nvidia knows everything it needs to know about R600.

Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.

What Matt2 said actually makes sense. It would be logical for your closest competitor to be the one that spends the most time and effort learning about your product and trying to come up with a counter to it.

If the 8800u has no chance against R600 using GDDR3, why bother making it, especially if the rumours of its limited availability and pricing are true?

That logic is far from bulletproof. What would they do if R600 proved to be twice as fast as the 8800GTX? Magically pull the G90 out of their ass?

Bumping clock speeds as they have is just about the only thing they can do as a short-term response. And if you had read the thread, you would have understood that achieving those clock speeds on current 8800 hardware practically requires water cooling - meaning the speed bump pushes the hardware to the limit of air cooling, so I wouldn't call that a feeble bump in speeds.

And as far as the memory is concerned, 2160MHz GDDR3 is not fast? That provides bandwidth on par with the rumored XT version of the R600 and its 1600MHz RAM. Of course, it's true that if there is any flavor of R600 with RAM as fast as 2160MHz, it would have a decisive advantage in memory bandwidth.
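A quick back-of-the-envelope check of that claim - a rough sketch in C, assuming the 8800 series keeps its 384-bit memory bus and the R600 XT ships with the rumored 512-bit bus (the bus widths are assumptions, not leaked Ultra/R600 specs):

#include <stdio.h>

/* Rough peak-bandwidth estimate: effective transfers per second
 * times the bus width in bytes. */
static double bandwidth_gb_s(double effective_mhz, int bus_bits)
{
    return effective_mhz * 1e6 * (bus_bits / 8.0) / 1e9;
}

int main(void)
{
    /* 2160MHz effective GDDR3 on an assumed 384-bit bus (8800 Ultra rumor) */
    printf("8800 Ultra: %.1f GB/s\n", bandwidth_gb_s(2160.0, 384)); /* ~103.7 */
    /* 1600MHz effective RAM on the rumored 512-bit R600 XT bus */
    printf("R600 XT:    %.1f GB/s\n", bandwidth_gb_s(1600.0, 512)); /* ~102.4 */
    return 0;
}

Both work out to right around 100 GB/s, which is what "on par" means here.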


What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is going to bring and thus are always prepared. Well then WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?

It doesn't matter whether the R600 performs stronger, on par, or poorer; the only thing nVidia can do to preempt it is exactly what they're doing with this supposed 8800 Ultra.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
BFG10K, I would not say that Stalker really taxes the card that badly.
Oh really? Turn the details on full and run the game at 2560x1600.

Sure, the FPS can hover in the low 40s, but that is really with everything maxed. I find 40 more than playable.
At what resolution/settings?

Can't speak for Call of Juarez though, as I've never tried it.
You don't need to "speak" for it, just look at the benchmarks.

Those scores are without AA too - imagine what they would look like with AA.

Granted, games can be run at lower detail settings but it's foolish to claim "you don't need more power" when you can always take advantage of it in any game by raising detail levels, resolution and/or AA.
 

terentenet

Senior member
Nov 8, 2005
387
0
0
Originally posted by: BFG10K
BFG10K, I would not say that Stalker really taxes the card that badly.
Oh really? Turn the details on full and run the game at 2560x1600.

Sure, the FPS can hover in the low 40s, but that is really with everything maxed. I find 40 more than playable.
At what resolution/settings?

Can't speak for Call of Juarez though, as I've never tried it.
You don't need to "speak" for it, just look at the benchmarks.

Those scores are without AA too - imagine what they would look like with AA.

Granted, games can be run at lower detail settings but it's foolish to claim "you don't need more power" when you can always take advantage of it in any game by raising detail levels, resolution and/or AA.

You couldn't be more correct, BFG. I also argued with people saying 8800GTX SLI is overkill. No, it's not. Once you crank up the resolution and enable the eye candy, performance will drop drastically and you'll find that even 8800GTX SLI will struggle to keep a decent framerate.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: terentenet
@jim1976: what's "the missing MUL"?

Simply put, MADD and MUL are mathematical operations that an ALU can perform for specific purposes inside the GPU, especially for shader computations. MADD means multiply+add, while MUL is a simple multiplication. As you can understand, a MADD is more complex and counts for more ops per cycle than a MUL, but combined they can provide greater throughput. The fact is that G80 has scalar ALUs and MADD-only computations are more common, but the MUL can be used too for extra performance, especially in the shader part.
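A tiny illustration of the two operations - plain C rather than shader code, with fmaf() standing in for the hardware MADD:

#include <math.h>
#include <stdio.h>

int main(void)
{
    float a = 2.0f, b = 3.0f, c = 1.0f;

    float mul  = a * b;         /* MUL: a single multiply (1 op)                 */
    float madd = fmaf(a, b, c); /* MADD: multiply and add fused together (2 ops) */

    /* An ALU that could issue a MADD plus an extra MUL each clock would
     * retire 3 floating-point ops per cycle instead of 2 - that extra MUL
     * is the "missing MUL" being discussed. */
    printf("MUL = %.1f, MADD = %.1f\n", mul, madd);
    return 0;
}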
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: thilan29
Maybe Dell accidentally called it an 8800 Ultra on their website??

or nvidia ordered dell to "accidentally" release these specs, so everyone would be UNDERwhelmed
:Q

i wouldn't rule anything out ... with nvidia
:p
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Originally posted by: bunnyfubbles
Originally posted by: golem
Originally posted by: munky
Originally posted by: Matt2
The fact that Nvidia didn't even bother to put GDDR4 on the board doesn't speak highly of what they think R600 will be.

I'm sure Nvidia knows everything it needs to know about R600.

Either that, or putting ddr4 on a g80 would require changes to the core itself, and that would be expensive. I know Nv said it's technically 'ddr4-ready' but this wouldn't be the first time they lied.

What Matt2 said actually makes sense. It would be logical for your closest competitor to be the one that spends the most time and effort learning about your product and trying to come up with a counter to it.

If the 8800u has no chance against R600 using GDDR3, why bother making it, especially if the rumours of its limited availability and pricing are true?

That logic is far from bulletproof. What would they do if R600 proved to be twice as fast as the 8800GTX? Magically pull the G90 out of their ass?

Bumping clock speeds as they have is just about the only thing they can do as a short-term response. And if you had read the thread, you would have understood that achieving those clock speeds on current 8800 hardware practically requires water cooling - meaning the speed bump pushes the hardware to the limit of air cooling, so I wouldn't call that a feeble bump in speeds.

And as far as the memory is concerned, 2160MHz GDDR3 is not fast? That provides bandwidth on par with the rumored XT version of the R600 and its 1600MHz RAM. Of course, it's true that if there is any flavor of R600 with RAM as fast as 2160MHz, it would have a decisive advantage in memory bandwidth.


What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is going to bring and thus are always prepared. Well then WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?

It doesn't matter whether the R600 performs stronger, on par, or poorer; the only thing nVidia can do to preempt it is exactly what they're doing with this supposed 8800 Ultra.

A. If R600 is twice as fast as the G80, they will release a GX2 variant.
B. Outside of a few instances, companies in the tech industry are very close to each other performance-wise.


 

terentenet

Senior member
Nov 8, 2005
387
0
0
Originally posted by: jim1976
Originally posted by: terentenet
@jim1976: what's "the missing MUL"?

Simply put, MADD and MUL are mathematical operations that an ALU can perform for specific purposes inside the GPU, especially for shader computations. MADD means multiply+add, while MUL is a simple multiplication. As you can understand, a MADD is more complex and counts for more ops per cycle than a MUL, but combined they can provide greater throughput. The fact is that G80 has scalar ALUs and MADD-only computations are more common, but the MUL can be used too for extra performance, especially in the shader part.

So let me get this right: G80 has the MUL but it's disabled? What's the reason for that? Maybe they ran into issues using it and decided to disable it. Or did they disable it on purpose, just to keep an ace in the hole in case they need it?
Back when G80 was launched, they said the shaders are scalar and more can be added without too much hassle. I wonder why they won't add another 32 besides the core & mem speed bump. We need innovation, not just a speed bump that can also be obtained by overclocking.
I for one already feel the need for more power, as I just got my 30" LCD and gaming at 2560x1600 taxes even 8800GTX SLI. I need to back down on AA/AF to keep things smooth at that res.
30" LCDs have been on the market for 1-2 years, prices have dropped, and more people will adopt the large LCD trend. We need significantly more powerful GPUs.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: bunnyfubbles
What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is going to bring and thus are always prepared. Well then WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?

There are lots of ways to know what your competition is up to; at the end of the day there are people behind these big companies, not just faceless entities. People talk, and NV and ATI share a few AIB partners and also compete for a lot of the same OEM business.

They've also, up to this point (which may change if AMD starts fabbing their own, as rumored), shared the same chip maker, TSMC. The people who work for these companies are smart folks; you could probably transplant any group of engineers from either camp and they could transition over in a short amount of time. Hell, they could probably even simulate or virtually re-create a competitor's part if they wanted to, the same way they model their own chips before taping out.

I'm not implying any wrongdoing from either camp, but it'd be naive to suggest either company is operating with blinders on in regards to what the other is doing. With NV30/9700 Pro, NV was already too far into NV30 to scrap it and start over. They made a gamble and lost, thinking their 1x2 rendering path would pay off and that game programmers would bend over backwards to make their chips run well. They quickly scrapped it and moved on. AMD has always been slower at bringing new chip architectures to market. They don't have the R&D and fab prowess that Intel has, so when they have a winning part (which they did for 2-3 years before Core 2) they tend to squeeze every last bit out of it. That's not to say they don't know they're beat. Spin it as you like, but there's little doubt G80's amazing performance was a major reason for R600's delays. It simply wasn't ready to compete with G80 and ATI/AMD knew it.

 

chrismr

Member
Feb 8, 2007
176
0
0
Originally posted by: BFG10K
BFG10K, I would not say that Stalker really taxes the card that badly.
Oh really? Turn the details on full and run the game at 2560x1600.

Sure, the FPS can hover in the low 40s, but that is really with everything maxed. I find 40 more than playable.
At what resolution/settings?

Can't speak for Call of Juarez though, as I've never tried it.
You don't need to "speak" for it, just look at the benchmarks.

Those scores are without AA too - imagine what they would look like with AA.

Granted, games can be run at lower detail settings but it's foolish to claim "you don't need more power" when you can always take advantage of it in any game by raising detail levels, resolution and/or AA.

Hmm, yes, I did forget about the uber resolutions - but then again, not that many people use them.

I am running at 1680 x 1050, so yeah, I guess on anything 24" or up it might be a little taxing.

But people who can afford 24" monitors or larger can generally afford SLI too, so let's just say my arguments were based on general usage for the average PC user.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: chizow
Originally posted by: bunnyfubbles
What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is going to bring and thus are always prepared. Well then WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?

There are lots of ways to know what your competition is up to; at the end of the day there are people behind these big companies, not just faceless entities. People talk, and NV and ATI share a few AIB partners and also compete for a lot of the same OEM business.

They've also, up to this point (which may change if AMD starts fabbing their own, as rumored), shared the same chip maker, TSMC. The people who work for these companies are smart folks; you could probably transplant any group of engineers from either camp and they could transition over in a short amount of time. Hell, they could probably even simulate or virtually re-create a competitor's part if they wanted to, the same way they model their own chips before taping out.

I'm not implying any wrongdoing from either camp, but it'd be naive to suggest either company is operating with blinders on in regards to what the other is doing. With NV30/9700 Pro, NV was already too far into NV30 to scrap it and start over. They made a gamble and lost, thinking their 1x2 rendering path would pay off and that game programmers would bend over backwards to make their chips run well. They quickly scrapped it and moved on. AMD has always been slower at bringing new chip architectures to market. They don't have the R&D and fab prowess that Intel has, so when they have a winning part (which they did for 2-3 years before Core 2) they tend to squeeze every last bit out of it. That's not to say they don't know they're beat. Spin it as you like, but there's little doubt G80's amazing performance was a major reason for R600's delays. It simply wasn't ready to compete with G80 and ATI/AMD knew it.

it's pretty *clear* they know what is going on in each other's camp ... more than "generally"

look at the Dustbuster debacle ... nvidia *knew* r300 was a monster and they slapped a 'spoiler and exhaust' on it to give the appearance they were keeping up

i think r600 became AMD's new toy ... they wanted *more* from it and so added 'bells and whistles' which may or may not prove *compelling* to gamers ... they just took way too long, imo

i think the 8800 ultra is a *ruse* by nvidia ;)



 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
Originally posted by: chizow
Originally posted by: bunnyfubbles
What really gives me a good laugh is how some of you guys actually buy into the idea that companies always know what the competition is going to bring and thus are always prepared. Well then WTF was nVidia doing when ATI released the 9700 Pro? WTF has AMD been doing since the release of Core 2?

There are lots of ways to know what your competition is up to; at the end of the day there are people behind these big companies, not just faceless entities. People talk, and NV and ATI share a few AIB partners and also compete for a lot of the same OEM business.

They've also, up to this point (which may change if AMD starts fabbing their own, as rumored), shared the same chip maker, TSMC. The people who work for these companies are smart folks; you could probably transplant any group of engineers from either camp and they could transition over in a short amount of time. Hell, they could probably even simulate or virtually re-create a competitor's part if they wanted to, the same way they model their own chips before taping out.

I'm not implying any wrongdoing from either camp, but it'd be naive to suggest either company is operating with blinders on in regards to what the other is doing. With NV30/9700 Pro, NV was already too far into NV30 to scrap it and start over. They made a gamble and lost, thinking their 1x2 rendering path would pay off and that game programmers would bend over backwards to make their chips run well. They quickly scrapped it and moved on. AMD has always been slower at bringing new chip architectures to market. They don't have the R&D and fab prowess that Intel has, so when they have a winning part (which they did for 2-3 years before Core 2) they tend to squeeze every last bit out of it. That's not to say they don't know they're beat. Spin it as you like, but there's little doubt G80's amazing performance was a major reason for R600's delays. It simply wasn't ready to compete with G80 and ATI/AMD knew it.

There were two parts to my statement; I don't think you took it as a whole. I never said that they wouldn't have an idea of what was going on. My point was that you can have an idea of what they're going to bring to the table, but that isn't going to change whether or not YOU are ready YOURSELF.

It's like any heavyweight fight - you might know the other guy's strengths and weaknesses like the back of your hand, and you might even have a really good idea of what strategy he'll use to try to take you down. However, none of that matters if you simply aren't ready to fight - you're going to get knocked the f*** out. That's what has happened, is happening now, and will happen well into the future.
 

jim1976

Platinum Member
Aug 7, 2003
2,704
6
81
Originally posted by: terentenet
So let me get this right: G80 has the MUL but it's disabled? What's the reason for that? Maybe they ran into issues using it and decided to disable it. Or did they disable it on purpose, just to keep an ace in the hole in case they need it?
Back when G80 was launched, they said the shaders are scalar and more can be added without too much hassle. I wonder why they won't add another 32 besides the core & mem speed bump. We need innovation, not just a speed bump that can also be obtained by overclocking.
I for one already feel the need for more power, as I just got my 30" LCD and gaming at 2560x1600 taxes even 8800GTX SLI. I need to back down on AA/AF to keep things smooth at that res.
30" LCDs have been on the market for 1-2 years, prices have dropped, and more people will adopt the large LCD trend. We need significantly more powerful GPUs.

No, G80 does not have MADD+MUL. They cannot simply hide or disable this and later enable it with a magic driver. It will probably appear with the 8900GTX, coupled with 32 more SPs. Why don't they add another 32 SPs to the Ultra? The question should be: why would they? If the X2900XTX cannot "overtake" the 8800GTX by a significant margin and all they need is a clock speed bump, there's no point for them to throw an asset into the battle. I'm not sure if they can add 32 more SPs at 90nm (rumours said they could), but they will surely be able to do it at 80 or 65nm later on.
Well, gaming at 2560x1600 is very taxing. What G80 managed, making a resolution like this playable with a single GPU, is phenomenal to say the least, but you cannot simply expect that to happen constantly; it is very difficult to push so many pixels onto the screen with a single card in newer games. You have to bear in mind that an SLI configuration may come in handy for you. Big boys need more juice :D
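As a rough illustration of the pixel loads being discussed - a sketch only, since real performance also depends on shaders, AA and memory bandwidth, and 1920x1200 is just assumed here as the typical 24" panel of the time:

#include <stdio.h>

int main(void)
{
    /* Resolutions mentioned in this thread, with 1680x1050 as the baseline. */
    const int res[][2] = { {1680, 1050}, {1920, 1200}, {2560, 1600} };
    const double base = 1680.0 * 1050.0;

    for (int i = 0; i < 3; i++) {
        double pixels = (double)res[i][0] * res[i][1];
        printf("%dx%d: %.2f million pixels (%.2fx the 1680x1050 load)\n",
               res[i][0], res[i][1], pixels / 1e6, pixels / base);
    }
    return 0;
}

A 30" panel pushes roughly 2.3 times the pixels of a 1680x1050 screen, which is why even 8800GTX SLI starts to strain up there.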