AMD to sell millions of GPUs.


dreddfunk

Senior member
Jun 30, 2005
358
0
0
Matt2 - yeah, LOL, I realized as I was posting that I was taking this probable nonsense for fact and running with it.

I'll rephrase. If AMD has the good sense to do this (release the R600 at a much lower price point) then I'll be impressed and say it was a better decision than releasing it at $600.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
that is part of the main heatsink block Thilan and it doesn't cover every last inch of the card either
 

thilanliyan

Lifer
Jun 21, 2005
12,085
2,281
126
Originally posted by: Gstanfor
that is part of the main heatsink block Thilan and it doesn't cover every last inch of the card either

Here is an even better pic...the copper part IS just an insert but has a heatpipe connecting the insert to the aluminum fins. What is so much more different in the R600 cooler?? Only difference I see is that there are actually copper fins and heatpipes (much better IMO) which are actually a part of the core cooler, and hence should cool the core better anyway, which would be the hottest part of the card.

The aluminum "main" heatsink covers a good proportion of the card on the 8800GTX...even moreso(percentage wise) on my GTS....and is the same on the R600 pic.

Here are some more pics.


OT, I can't wait for some of the new NVidia and ATI midrange cards to come out; I wanna try my hand at some volt mods and see what they can do. There are already some voltmods for the 8600GTS cards over at VR-Zone, I think.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Instead of editing my own post I'll add one caveat: it's a good decision to sell at a lower price point if they have the room in the card's profitability to do so and/or the reserves necessary to sell at a loss for a short time.

Given AMD's position I can't imagine the latter to be the case; the CPU side is already doing it. Given AMD's cash position, I'd say that they probably shouldn't do it if they're going to lose serious cash.
 

coolpurplefan

Golden Member
Mar 2, 2006
1,243
0
0
Originally posted by: ShadowOfMyself
So... Does this mean AMD is actually in a much better position than we all thought? :confused:

AMD did have losses. And may continue with losses for a little while. But, I'm watching the stock drop and getting hot and horny. :) I can't wait to see it lower so I can get in! :D

EDIT: Whoops, maybe I shouldn't say that. It would be viewed as a stock tip and therefore kind of unethical. Seems like a tempting move though.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
A friend and I were discussing the AMD situation via email and he mentioned something the head of his company (one of the large, publicly traded financial services firms) said a few years back. Basically, the CEO said, "we're nowhere near as smart as everyone thought back during the days of the dot-com boom, and we're nowhere near as dumb as they think now."

Wall Street overreacts to current trends. A lot of traders are concerned about shorter-term profitability. It's just the nature of the beast. nVidia, AMD, Intel, they just all need to keep running a good race, keep their business units focused on making good products, keep their sights set on the long term and do their best to weather any storms that come up along the way.

That's not to say that AMD isn't in a dicey situation right now, just that the race being run is closer to a marathon, even if we all want folks to sprint to the finish line when it comes to launching those shiny new GPUs that we can't seem to get enough of.
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: thilan29
Originally posted by: Gstanfor
G80 has nothing that remotely resembles this

Look at this pic of the 8800GTX cooler. You see the copper insert under the TIM...same idea as your pic.

Here's another pic.
QFT.

The card's aluminum plate isn't what bothers me though. (Heck, nVidia's is pretty similar). It's the way the fan is mounted onto the card like it was almost an after-thought.

Why make that much of a physical difference between the OEM and the Retail? My X1900XTX was OEM and looked exactly the same as the retail versions of its time.
God, I swear ATi are the biggest idea thieves I've ever come across.
That's why the 8800's heatsink and fan blow air out of the back of the case now, right?
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: BladeVenom
Originally posted by: josh6079
Argh....ZSTREAM!!!

You could have at least posted the info from that place so others wouldn't give the Inq. site hits...

Well there is this little thing called copyright law. You can't copy entire stories.
LoL, whatever. I'm sure the OP was thinking about that when he posted it.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
It would certainly be wonderful news to have GPUs return to more realistic pricing. $999 GPU...ahahahaha wow we will all have a great laugh about that if R600s top out around $400.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: RussianSensation
1000/0.9ns RAM = ~2200MHz effective * 512-bit bus = ~140GB/sec bandwidth :) That's a significant jump from 8800GTX.

Memory bandwidth is not the be all, end all at the high end.

Just because it has that much bandwidth doesn't mean that AMD will be able to saturate that entire bus.

Core efficiency is going to play a big role in this battle. Although Nvidia can make the point that if they lose out, so what?

They were top dawg for more than 6 months. If R600 bests them, then they will just regroup and refresh G80 and then it's a whole new ball game.
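For what it's worth, the quoted bandwidth arithmetic works out if the units are read as a 512-bit bus and GB/sec. A minimal sketch of the calculation, assuming the rumored R600 figures (0.9ns GDDR4, ~2200MHz effective, 512-bit) against the 8800GTX's known 900MHz (1800MHz effective) GDDR3 on a 384-bit bus:

```python
def mem_bandwidth_gb_s(effective_mhz, bus_width_bits):
    """Peak memory bandwidth: transfers/sec times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# Rumored R600: 0.9ns GDDR4 -> ~1111MHz base, ~2200MHz DDR effective, 512-bit bus
r600 = mem_bandwidth_gb_s(2200, 512)   # ~140.8 GB/s
# 8800GTX: 900MHz GDDR3 -> 1800MHz effective, 384-bit bus
gtx = mem_bandwidth_gb_s(1800, 384)    # ~86.4 GB/s

print(f"R600 (rumored): {r600:.1f} GB/s, 8800GTX: {gtx:.1f} GB/s")
```

So on paper the rumored figure is roughly a 60% jump over the 8800GTX, which is Matt2's point: peak bandwidth only matters if the core can actually saturate it.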
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
If R600 bests them, then they will just regroup and refresh G80 and then it's a whole new ball game.
While I personally agree that that will happen, it's not *always* the case.

For instance, the X1900XTX came out before the 7900s (7800 refresh) and fared quite well. Hell, the X1950XTX still competes with the 8800GTS 640MB depending on the title and settings.

I'm not going to make any predictions other than that the 8800 refresh (8900's) will have me quite interested.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Matt2
Originally posted by: RussianSensation
1000/0.9ns RAM = ~2200MHz effective * 512-bit bus = ~140GB/sec bandwidth :) That's a significant jump from 8800GTX.

Memory bandwidth is not the be all, end all at the high end.

Just because it has that much bandwidth doesn't mean that AMD will be able to saturate that entire bus.

Core efficiency is going to play a big role in this battle. Although Nvidia can make the point that if they lose out, so what?

They were top dawg for more than 6 months. If R600 bests them, then they will just regroup and refresh G80 and then it's a whole new ball game.

It may not be the Be All and End All, but its darn nice to have if you can get it (and use it).
 

Sable

Golden Member
Jan 7, 2006
1,130
105
106
Originally posted by: Gstanfor
God, I swear ATi are the biggest idea thieves I've ever come across. They've gone and stolen the SLI header fingers to try and get backfire working now also.

backfire fingers

I'm going to love to see them explain how backfire is so superior to the competition that they felt compelled to steal the competitors long established ideas (after running out of their own apparently...)

Jesus tittyfvcking christ, is that the best you can come up with? WAH WAH, THEY COPIED HOW TO CONNECT THE CARDS. How the fvck else did you want em to do it? If it's the most efficient way then that's what they're gonna do.

MAYBE they'll explain how CROSSFIRE is superior by having a driver that works?

Who knows, it might NOT be superior. Your rampant ATI bashing is just so tedious though.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: BFG10K
God, I swear ATi are the biggest idea thieves I've ever come across.
You mean like SLI AA copying Super AA?

Since when has nvidia resorted to slapping a "compositing engine" chip on their cards? No, NVIIO isn't your answer -- my G71's handle SLI-AA just fine without it.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Since when has nvidia resorted to slapping a "compositing engine" chip on their cards?
This question isn't really relevant.

What is relevant is that Super AA was introduced first into Crossfire and then nVidia followed suit with SLI AA.

We can also list things like 16xAF, gamma-corrected AA, SGMS, etc which ATi had first and then nVidia implemented.

The nature of competition is that vendors will copy the better ideas from each other.
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
I'd rather say nvidia migrated SLI-AA down from Quadro platforms (from whence ATi copied nvidia...)
 

AnotherGuy

Senior member
Dec 9, 2003
678
0
71
/me puts more gasoline on the fire...

How about ATI that stole the SLI idea from nVidia making their Great CROSSFIRE huh? how about that?







:D:D
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
Gstanfor, your arguments are a bit silly. Many examples of different companies (some even by yourself) using ideas from the competition and modifying them for their own use have been posted in this thread. Yet you single out AMD/ATI as idea thieves because they now have a connector that looks similar to the one Nvidia uses for SLI. It seems especially odd that you'd single out AMD/ATI for doing the same things Intel, Nvidia, Microsoft, etc. do when you have this in your sig:

"Jack Tramiel:
Business is war, I don't believe in compromising. I believe in winning."
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Well, Jack Tramiel didn't generate Commodore's wins by imitating his competition, that's for sure... (as you'd know if you knew anything at all of Commodore's history).
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
56
91
He's just craving the attention fellas. Don't mind him. Hell, you shouldn't even be responding to his bile at this stage of the game.
 

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
Originally posted by: keysplayr2003
He's just craving the attention fellas. Don't mind him. Hell, you shouldn't even be responding to his bile at this stage of the game.

You have changed a lot in a few months. Keep it up :)