
8800Ultra for $999??

Originally posted by: Auric
Meow meow perspective meow.

You could have spent $800-850 (in adjusted dollars) on a couple of cheesy Voodoo2s when they came out in the late 1990s. Given an MSRP of $999, even 10% less at retail is all of a sudden only $900. Think of the exponentially better experience that can be had for your money today.

"MEOW"... You seem like a person who watches to much A!N!M3

You know what i mean. Stop watching the crap with cherry blossom trees and cats and stupid Issy saying "Meow Meow"
 
That's ridiculous 😕 And then the R600 gets released at $500 and this gets its price halved? Huh... Why doesn't Nvidia make the card cost $5000? Those who buy a $999 card have way too much money already; I'm sure they would buy it whatever it cost.
 
Can't something "single chip" still be "dual-core"? Just curious.
The other possibility is that these things are designed to be loaded up with stream processors, bandwidth, and memory without overloading a single core, and if so, maybe they doubled those items compared to a single 8800GTX.
 
Can't something "single chip" still be "dual-core"? Just curious.
Yes, but in a different way to how you are thinking. You pretty much determine the power of a GPU by how many quads are present on the die (and also, of course, by clock speed).
For instance, a 6200 has one quad; a 6800 Ultra has four quads. Other than that, the two chips are pretty much identical.
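A back-of-the-envelope way to read that quad comparison (clock speeds here are approximate, and it assumes the usual simplification of 4 pixels per quad per clock):

```python
# Rough pixel fillrate from quad count. Each quad is assumed to handle
# 4 pixels per clock (a simplification; real throughput depends on workload).
def fillrate_mpixels(quads, core_mhz, pixels_per_quad=4):
    """Theoretical fillrate in megapixels per second."""
    return quads * pixels_per_quad * core_mhz

# 6200 (1 quad, ~300MHz) vs 6800 Ultra (4 quads, ~400MHz)
print(fillrate_mpixels(1, 300))  # 1200 Mpixels/s
print(fillrate_mpixels(4, 400))  # 6400 Mpixels/s
```

Same basic silicon, but roughly 5x the theoretical pixel throughput just from quads and clocks.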
 
Here's some speculation for ya!!!

G80 core (90nm), cherry-picked at 800MHz
1.5GB of 2200MHz 512-bit GDDR4

Sounds possible, but not likely.

 
$1000 is a lot, yes, but I know plenty of people who have the means to drop $1000, even a few who wouldn't think twice about laying down that amount.

There are definitely people out there capable of paying for something like this; it's just that these people aren't even remotely interested in computers lol

lol just thinking about it, where I live I have noticed a lot of super nice cars

a couple of Bentley Continental GTs, Porsche 911s, Cayennes, an Aston DB9 and DB9 Volante, a V8 Vantage, numerous Range Rovers, and the best one yet...... a Mercedes-Benz SL65 AMG bi-turbo: 6-litre V12 twin-turbo two-seater with 612bhp and 760+ lb-ft of torque

if people can afford these £100,000+ machines, $1000 is pocket change.
 
Originally posted by: jim1976
Originally posted by: Imyourzero
Originally posted by: 5150Joker
$999? Hahahahahahaha...breathes...hahahahahahaha....

That was my reaction when Intel was selling the PIII 1GHz for over $1k

The price of this GPU, if true, is outrageous, but the specs are yet to be seen.. The fact is, though, that spending $1k on a GPU is "less insane" than spending $1k on a CPU.. Manufacturing costs are higher as well..

Originally posted by: Stumps
hmmm, I wonder if this will have a 768-bit memory bus instead of a 512-bit or the standard 384-bit... since the RAM has been doubled from 768MB to 1.5GB, they could have simply added a second memory bus, effectively offering twice as much memory bandwidth as the GTX.

Just a thought.

This is impossible.. How exactly do you expect them to "simply add" a second memory bus to the SKU/PCB? Besides that, even a single 768-bit memory bus is simply daydreaming.. We are too early in terms of technology for that, and even if they could do this (which they 100% can't), it would make the PCB gigantic..

It would be possible by using a "bridge" chip containing a memory bus "crossbar"...similar in "idea", not "operation", to the one used in the old GF4 MX series, which had two 64-bit memory buses linked together via a "crossbar" interface...this allowed the GF4 MX series to utilise either a 64-bit or "128-bit" memory bus.

Just a thought.
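For what it's worth, the bandwidth arithmetic behind the doubled-bus idea is straightforward (the ~1800MHz effective GDDR3 figure for the GTX is approximate):

```python
# Peak memory bandwidth: (bus width in bits / 8) bytes per transfer,
# times the effective data rate in MHz, converted to GB/s.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gb_s(384, 1800))  # 8800GTX-like: 86.4 GB/s
print(bandwidth_gb_s(768, 1800))  # doubled bus, same DRAM: 172.8 GB/s
```

Doubling the bus width at the same data rate does double bandwidth on paper; the objection in the thread is about the routing and PCB cost of all those extra traces, not the math.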
 
I'm not paying that much for a video card.

Nvidia has done quite well with the 8800GTX and the GTS. I like the fact that they haven't really taken advantage of not having competition.
Really? Have you seen the state of their drivers?
 
Originally posted by: Stumps
Originally posted by: jim1976
This is impossible.. How exactly do you expect them to "simply add" a second memory bus to the SKU/PCB? Besides that, even a single 768-bit memory bus is simply daydreaming.. We are too early in terms of technology for that, and even if they could do this (which they 100% can't), it would make the PCB gigantic..

It would be possible by using a "bridge" chip containing a memory bus "crossbar"...similar in "idea", not "operation", to the one used in the old GF4 MX series, which had two 64-bit memory buses linked together via a "crossbar" interface...this allowed the GF4 MX series to utilise either a 64-bit or "128-bit" memory bus.

Just a thought.

The G80 is already working from a crossbar memory controller with 6x 64-bit partitions, and we only just made the move to wider interfaces; it will be some time before there is another increase in bus width on the Nvidia side.

All the modern cards have the ability to scale down; the problem is scaling up. 12x 64-bit partitions is hard to believe at this point in time.

We're probably going to get more bandwidth in the meantime by using higher-clocked GDDR4, as opposed to increasing the bus width.
 
I certainly don't think we'll see the number of partitions double, but I wouldn't be terribly surprised to see 2 extra partitions added, bringing G80 up to a 512-bit interface.

You have to remember, G80's biggest advantage has always been time to market: it was buildable in November 2006 at 384 bits where R600 was not. It's possible that nearly 6 months on, a combination of refining the chip layout plus 90nm yields maturing (or a move to 80nm) means those two extra partitions can be bolted on (or enabled, if already present and disabled).
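Putting rough numbers on the two options being discussed, extra partitions versus faster GDDR4 on the existing six (the effective data rates below are illustrative, not confirmed specs):

```python
# G80-style bandwidth from 64-bit memory partitions: either add partitions
# (wider bus) or raise the effective memory data rate.
def partition_bandwidth_gb_s(partitions, effective_mhz, partition_bits=64):
    return partitions * partition_bits / 8 * effective_mhz / 1000

print(partition_bandwidth_gb_s(6, 1800))  # stock 384-bit GDDR3: 86.4 GB/s
print(partition_bandwidth_gb_s(6, 2200))  # same bus, 2200MHz GDDR4: 105.6 GB/s
print(partition_bandwidth_gb_s(8, 1800))  # two extra partitions (512-bit): 115.2 GB/s
```

Faster GDDR4 gets a decent chunk of the gain without touching the die or PCB layout, which is why it reads as the likelier short-term route.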
 
Originally posted by: coldpower27
Originally posted by: Stumps
Originally posted by: jim1976
This is impossible.. How exactly do you expect them to "simply add" a second memory bus to the SKU/PCB? Besides that, even a single 768-bit memory bus is simply daydreaming.. We are too early in terms of technology for that, and even if they could do this (which they 100% can't), it would make the PCB gigantic..

It would be possible by using a "bridge" chip containing a memory bus "crossbar"...similar in "idea", not "operation", to the one used in the old GF4 MX series, which had two 64-bit memory buses linked together via a "crossbar" interface...this allowed the GF4 MX series to utilise either a 64-bit or "128-bit" memory bus.

Just a thought.

The G80 is already working from a crossbar memory controller with 6x 64-bit partitions, and we only just made the move to wider interfaces; it will be some time before there is another increase in bus width on the Nvidia side.

All the modern cards have the ability to scale down; the problem is scaling up. 12x 64-bit partitions is hard to believe at this point in time.

We're probably going to get more bandwidth in the meantime by using higher-clocked GDDR4, as opposed to increasing the bus width.

ahhh yes, now I see. I hadn't looked all that much into the G80's specs.
 
Originally posted by: otispunkmeyer
$1000 is a lot, yes, but I know plenty of people who have the means to drop $1000, even a few who wouldn't think twice about laying down that amount.

There are definitely people out there capable of paying for something like this; it's just that these people aren't even remotely interested in computers lol

lol just thinking about it, where I live I have noticed a lot of super nice cars

a couple of Bentley Continental GTs, Porsche 911s, Cayennes, an Aston DB9 and DB9 Volante, a V8 Vantage, numerous Range Rovers, and the best one yet...... a Mercedes-Benz SL65 AMG bi-turbo: 6-litre V12 twin-turbo two-seater with 612bhp and 760+ lb-ft of torque

if people can afford these £100,000+ machines, $1000 is pocket change.

Next on "pimp my boxen". We drop a pair of GeForce 8800ultras into this dudes old socket 939. Woot!
 
Yeah, but my 3000+ might be a little bit of a bottleneck.

No worries though, I am going to prostitute myself for 5000 WoW gold, eBay it, and then I will be 9/10 of the way there!
 
Originally posted by: AVP
Yeah, but my 3000+ might be a little bit of a bottleneck.

No worries though, I am going to prostitute myself for 5000 WoW gold, eBay it, and then I will be 9/10 of the way there!

I'll give you half a nickel for one night.

I hope this video card does retail for $1000. Makes it all the sweeter, or all the more laughable.
 
Originally posted by: keysplayr2003
Originally posted by: Ackmed
Originally posted by: yacoub
Originally posted by: Ackmed
Originally posted by: yacoub
Originally posted by: Modular
pffft, give it 10 years. We're already over $500 as the norm for just-released top-of-the-line cards.

And over 99% of buyers never buy a $500 GPU, so no.
And yes, $999 is STUPID. Anyone who buys one at that price is wasting their money no matter how much of it they have.

While your point is made... let's try to be a little realistic. It's not even close to 99%.

So you want to talk about realistic but you don't think that for every $500 videocard sold, 99 cheaper ones are sold? Okay, true, it's probably closer to 1000:1. (That'd be 99.9% though, which means I'm not all that far off percentage-wise.)

Sure it is, if you talk about $400 Dells. Which we aren't. In this topic alone, it's much less than 99 to 1. You tried to make a point by over-exaggerating; it's OK. It's not a big deal. I just pointed it out.

The thing is, we all know he was trying to make a point anyway. But you are the only one who said something that didn't really need to be said. I know you think everyone is a doofus except you. Try to remember that we have brains also, although none that work like yours. 😉

There were only two posters between him and me, so of course others aren't going to say it after I brought it up. I simply brought it up, and wasn't even that serious, as you can tell from the italics. Then he tried to back up his statement, which didn't work. That was that. Then of course you had to add your two cents' worth, as you always try to with me. Which didn't work, as usual.

Don't put words in my mouth; I didn't say anyone here was a doofus. You sure implied it to me, though, with your last comment. Funny, that.

And now to get back on topic. $1000 is not too much to me, if it's fast enough. I really doubt it will be faster than two 8800GTXs in SLI for about the same price, though, or even cheaper. I don't really believe the MSRP will be $999, but I guess we'll see. Hopefully it's not going to be another 7800GTX 512, with next to zero availability and a highly inflated price tag. $150 or so more than the 8800GTX seems about right, depending on performance.
 
Originally posted by: Ackmed

The thing is, we all know he was trying to make a point anyway. But you are the only one who said something that didn't really need to be said. I know you think everyone is a doofus except you. Try to remember that we have brains also, although none that work like yours. 😉

There were only two posters between him and me, so of course others aren't going to say it after I brought it up.
>You don't know that. Just speculating.

I simply brought it up, and wasn't even that serious.
>Why was it even necessary?

As you can tell from the italics. Then he tried to back up his statement, which didn't work.
>Back up his exaggeration? Alrighty then....

That was that. Then of course you had to add your two cents' worth, as you always try to with me. Which didn't work, as usual.
>Of course it worked. LOL.

Don't put words in my mouth; I didn't say anyone here was a doofus. You sure implied it to me, though, with your last comment. Funny, that.
>I really don't care if you said it or not. It was implied. Fact. Done.

My whole point was............. get a life....... By life, I don't mean material wealth (trophy wives, big house, fancy cars, yachts, etc.). It's what is inside that counts.



 
Originally posted by: BassBomb
A $300 CDN video card is where I still sit!!!

The market should be geared more to that area

It is. Nvidia and ATI just want to have the fastest cards on the market. I can't imagine that the extreme high-end is where they make their money.

That said, they need to hurry up and release the 8900s.
 
I hope Sony or Microsoft make their consoles compatible with mouse + keyboard; consoles would become the best gaming machines and no one would go crazy about PCs anymore.
$1000 for a GPU??????
Next Intel will go crazy and announce a $2000 CPU; is it worth it???????
Your $5000 PC will become outdated in a couple of months because someone will announce a $3000 GPU and a $5000 CPU...........
It's just a FUC*ING $50 game, nothing more.....

But this is the BUSINESS: someone's going to have 10 Ferraris in the garage, and someone's going to just keep saving money for Subway........
 
$1000 for a video card? That's way too expensive, I think. I remember when you could buy a really decent-performing video card for $130 (3dfx Voodoo3 2000).
 
Originally posted by: GEOrifle
I hope Sony or Microsoft make their consoles compatible with mouse + keyboard; consoles would become the best gaming machines and no one would go crazy about PCs anymore.
$1000 for a GPU??????
Next Intel will go crazy and announce a $2000 CPU; is it worth it???????
Your $5000 PC will become outdated in a couple of months because someone will announce a $3000 GPU and a $5000 CPU...........
It's just a FUC*ING $50 game, nothing more.....

But this is the BUSINESS: someone's going to have 10 Ferraris in the garage, and someone's going to just keep saving money for Subway........

Soon the technology will get cheaper and you'll be able to get that $1000 level of performance for $39.99.
 
Thinking about the 7800 GTX 512, I remember that I bought one the day it came out for $649 with Call of Duty 2, and sold it for $800 on eBay about 2 months later. (And COD2 was sold separately for another $20 or so) 😀
 