The New and Improved "G80 Stuff"


imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Ulfhednar
Originally posted by: Crusader
Uhhh SURE!
I'll eat someone else's words for you, toolbag. Do you not have reading comprehension? That was pulled from Chinese forums.
Toolbag? Way to keep up your credibility there; not that you ever had any.

Hmm, your source is someone on a Chinese forum, eh? I'll take the word of AMD over that any day, and AMD say that the ATI brand and Radeon product family is going nowhere.

:)

Yeah... I'll go with AMD as well: http://www.amd.com/us-en/0,,3715_14197_14198,00.html?redir=goBG01

Notice: no mention of high-end GPUs.

No one really knows for sure, Elfdinner, but there's not much of a chance that AMD cares enough to go up against both Intel (on CPUs) and Nvidia (on GPUs), considering ATI was losing money on high-end GPUs just to stay in the market and was feeding that effort with cash from its other, more profitable areas. That's a recipe for bankruptcy, and ATI wasn't making any headway against Nvidia even when it stood alone.

It makes sense, but I'm sure that doesn't exactly make YOU happy that ATI might be out of the high-end game after R600.

So, taking the official word from AMD plus the Chinese forums, who pretty much tell it like it is? I'd wager I'm correct and you are not.

Feel free to hold me to this.
You ARE right that the Radeon/ATI brand aren't ultimately going "anywhere," but you won't be getting too many more high-end "R600"-class products. They'll probably sell a lot of midrange and integrated parts, and might even continue the high end as you're suggesting, if it's profitable.
But everyone besides YOU realizes that the market is shifting, and will keep shifting, towards a unified GPU/CPU, so that's what AMD is probably working on: both that and reorganizing their company (which is enough work on its own, I know).
So you get to play with yourself one last time as AMD gives you what is probably one of the last few high-end GPUs from them.

They'll be more than happy in the future to plaster ATI or Radeon on any product they think suckers like yourself will buy simply because it carries the name.
That's what they probably meant.

But spending millions fighting a losing battle with Nvidia in the high end when Intel wants AMD dead? Good luck with that.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
Originally posted by: Crusader
Notice: no mention of high-end GPUs.
No mention of CPUs either; I guess by your logic that means AMD and ATI are pulling out of all markets bar the ones on that page? You're such a muppet. :laugh:

Originally posted by: DailyTech
"AMD has no plans to drop the ATi brand name or ATi's product brands. The ATi name will live on at AMD as our leading consumer brand, and so will the Radeon brand and other ATi product brands."

http://www.dailytech.com/article.aspx?newsid=3719

Yup, I guess I will take AMD at their word over you and your anonymous friend on a random Chinese forum any day since AMD have said that the Radeon brand is going nowhere. :)

Originally posted by: Crusader
Elfdinner
P.S. It's spelt "Ulfhednar"; surely you can't be so mentally incapable as to fail at simply copying a word that's already on your screen. Then again, this is the nationalist fanboy we're talking about, and every time I read one of your posts I can picture you slamming your fists on the keyboard and drooling on your desk, raging with a crumpled expression like a rabid bulldog chewing on a wasp, swearing under your breath with short breaks to growl the words "MUST KILL ATI! MUST KILL ATI!"
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Elfear
Originally posted by: Crusader

I was going to wait for G90, but since G80 has fully unified shaders, I'm onboard and ready to shell out $$ for this.


Where did you get this from? All the rumors I've seen have said that G80 will either have the standard pipes again (i.e. not unified) or the pipes will be partially unified.

The Chinese! :p But DailyTech also said it here: http://www.dailytech.com/article.aspx?newsid=4441
Originally posted by: Anh Huynh
GeForce 8800GTX graphics cards are equipped with 128 unified shaders clocked at 1350 MHz.
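(For a rough sense of scale on that figure, here is a back-of-the-envelope sketch; the one-MADD-per-clock assumption is mine, not something stated in the quote.)

```python
# Back-of-the-envelope check of the shader spec quoted above.
shaders = 128            # unified shaders, per the DailyTech quote
shader_clock_ghz = 1.35  # 1350 MHz shader clock, per the quote
ops_per_shader_per_clock = 2  # assume one MADD (2 floating-point ops) per clock

peak_gflops = shaders * shader_clock_ghz * ops_per_shader_per_clock
print(f"theoretical peak: {peak_gflops:.1f} GFLOPS")  # -> 345.6 GFLOPS
```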


G81
-June '07
-65nm
-GDDR4
-512bit MC

I highly doubt this is accurate. The leap from 90nm to 65nm would be huge, and I doubt Nvidia will be able to do that in ~6-7 months. From what I know, ATI is ahead in jumping to smaller process nodes, and I doubt they will have a high-end 65nm card out by June. Maybe someone who knows more about this stuff can chime in.

I was also skeptical on this point.
I'm not sure; that's what they were claiming.

But one point I think you're wrong on is that ATI jumped ahead to smaller process sizes before NV; actually it's NV who beat ATI to 80nm:
http://www.digitimes.com/news/a20060918A5027.html

They are both using TSMC (even for R600) because ATI barely has enough production capacity for themselves.

Which is another reason why I find it somewhat amusing that people think AMD is going to focus on the high-end GPU market when they can barely fight off Intel.
It doesn't make any business sense, especially considering ATI was losing money for years on the high end just to remain in the market with Nvidia.
But whatever. People don't want to admit ATI's shortcomings, and I'd understand that if I were on that side of the fence.
I happen to know that software > hardware when it comes down to it, within the overall product, which is why I tend to lean green. They have better driver support.

I won't even get into the "Nvidia has godlike drivers" debate with you. We obviously have very different opinions on that topic.

Fair enough.
But not godlike, just better overall support. A quick example is ATI dropping support for pre-R300 cards in Vista; NV won't be doing such a thing with their older products. Some people don't care about stuff like this, but for me, if I happen not to upgrade, or want to pop older NV hardware into an older rig someday, I want the best support on the market, and I've seen enough to know that NV provides that.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
I think it is a bit naive for anyone to assume they know what AMD/ATI's plans are for the future. Who knows? I'll say this: Intel and AMD/ATI simply ceding the high-end GPU market to nVidia is rather unlikely. There is money to be made in the high end, if the company can leverage its assets appropriately. While most of the money is not in the high end, as we've seen from Intel, AMD, nVidia, and ATI alike, there is a lot of branding that revolves around creating a viable high-end competitor. AMD really took off, as a 'quality' brand, with the introduction of the A64 at the high end. Intel still dominated many parts of the market, but the cachet that AMD gained through the A64 brand was incredibly valuable to the company. I wouldn't assume that anyone is going to just leave the high end to nVidia. Though it isn't terribly profitable, it is immensely important from a marketing standpoint.

Not to mention from an R&D standpoint, as technological advances at the high end trickle down.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: dreddfunk
I think it is a bit naive for anyone to assume they know what AMD/ATI's plans are for the future. Who knows? I'll say this: Intel and AMD/ATI simply ceding the high-end GPU market to nVidia is rather unlikely. There is money to be made in the high end, if the company can leverage its assets appropriately. While most of the money is not in the high end, as we've seen from Intel, AMD, nVidia, and ATI alike, there is a lot of branding that revolves around creating a viable high-end competitor. AMD really took off, as a 'quality' brand, with the introduction of the A64 at the high end. Intel still dominated many parts of the market, but the cachet that AMD gained through the A64 brand was incredibly valuable to the company. I wouldn't assume that anyone is going to just leave the high end to nVidia. Though it isn't terribly profitable, it is immensely important from a marketing standpoint.

Good point. But it's stretching it a bit to think that AMD can fight both NV and Intel in an all-out two-front war when ATI wasn't really holding its own against Nvidia in a single-front war.

http://www.bit-tech.net/hardware/2006/07/24/how_it_all_shakes_out_for_4/2.html

ATI has been losing money on its graphics card sales to buy market share, whilst propping its business up with TV chip and mobile phone chip sales.

Maybe AMD can turn that around?

But I can hardly fathom the engineering needed to build:
-A new CPU/GPU Fusion chip
-A Fusion platform
-Business chipsets/platforms
-Mobile devices
-New Athlon CPUs in the meantime
-Radeon high-end GPUs (which, under ATI's previous business model, were losing money; and AMD is limited in fab space currently and will be for years if it keeps fighting off Intel)
-Whatever else AMD is working on

Granted, they have a lot of engineers now, but it doesn't make a ton of sense that they'd purchase ATI just to continue ATI's former business.

I'm not a CEO yet, but I think it makes a lot of sense to stick with profitable areas.
Not ones where Nvidia was eating ATI's lunch and doing it more profitably on the high end.

Guess we'll have to see, but it's a pretty long stretch to assume AMD is going to fight off Intel plus Nvidia. I think there's a lot more money in having a "total platform," as Intel has had for years, and in creating something innovative and fresh, like this Fusion chip.

That's the ticket: not staying back in 1996 trying to win the standalone GPU wars that Nvidia has already won from a business standpoint.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Ulfhednar
Originally posted by: Crusader
Notice: no mention of high-end GPUs.
No mention of CPUs either; I guess by your logic that means AMD and ATI are pulling out of all markets bar the ones on that page? You're such a muppet. :laugh:

*cue Bill Lumbergh voice*
Hmmmm yeeahhhh ... see.. I'm going to have to disagree with you a bit there....

I think what AMD is describing there is what ATI's integration into AMD is going to mean for the company.

Probably not what AMD's overall market plan is. Notice there's no mention of Fusion either. ;)
Reading comprehension!

Originally posted by: DailyTech
"AMD has no plans to drop the ATi brand name or ATi's product brands. The ATi name will live on at AMD as our leading consumer brand, and so will the Radeon brand and other ATi product brands."

http://www.dailytech.com/article.aspx?newsid=3719

Yup, I guess I will take AMD at their word over you and your anonymous friend on a random Chinese forum any day since AMD have said that the Radeon brand is going nowhere. :)

"You ARE right that the Radeon/ATI brand arent ultimately going "anywhere" but you wont be getting to many more high end "R600" class products. They'll probably sell a lot of midrange and integrated parts. And might even continue the high end as you are suggesting, if its profitable.
But everyone besides YOU realizes that the market is and will shift towards a unified GPU/CPU.. so thats what AMD is probably working on.. both this, and reorganizing their company (which is enough work alone, I know).
So you get to play with yourself one last time as AMD gives you what is probably one of the few last high end GPUs from them.

They'll be more than happy in the future to plaster ATI or Radeon on any product they think suckers like yourself will buy simply because it says that.
Thats what they probably meant.

But millions in fighting a losing battle with Nvidia in the high-end when Intel wants AMD dead? Good luck on that. "
Originally posted by: Crusader
Elfdinner
P.S. It's spelt "Ulfhednar"; surely you can't be so mentally incapable as to fail at simply copying a word that's already on your screen. Then again, this is the nationalist fanboy we're talking about, and every time I read one of your posts I can picture you slamming your fists on the keyboard and drooling on your desk, raging with a crumpled expression like a rabid bulldog chewing on a wasp, swearing under your breath with short breaks to growl the words "MUST KILL ATI! MUST KILL ATI!"

I apologize, mulchdigger; it was an honest typo.
 

WelshBloke

Lifer
Jan 12, 2005
33,112
11,292
136
But it's stretching it a bit to think that AMD can fight both NV and Intel in an all-out two-front war when ATI wasn't really holding its own against Nvidia in a single-front war.

But can NV fight both Intel and AMD in the future?

But everyone besides YOU realizes that the market is shifting, and will keep shifting, towards a unified GPU/CPU, so that's what AMD is probably working on.

And you think NV are well placed in this situation because...?
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
So it doesn't mention Fusion, yet they're working on Fusion. But, according to you, because it doesn't mention graphics cards, they must be ditching them.

Do you see how obviously oblivious you are to even the simplest logic here?

ATI graphics cards, be they budget or enthusiast level, are going nowhere. Only a few months until we all get to see you eat your words.

As I keep saying: the more you believe all the propaganda and FUD you are spreading, the more disappointed you will be when you come to realise you were talking crap.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
A simple word, to me, dictates the merits of the merger: convergence. CPU development has begun to converge with GPU development to a point where a natural synergy is going to be reached soon (in the next 3-5 years). CPU development hit a wall not long ago and turned to parallelism to drive performance gains. GPUs had already been driven by a largely parallel-demanding market for some time.

I think AMD/ATI see a new market emerging: CPU/GPU on the same die (low-cost, low-energy solutions), and CPU/GPU combinations on the motherboard (mid-range to upper-end solutions, depending on implementation). I think AMD wanted ATI's intellectual capital, not their business model. Five years out, we may (and I stress may) not see discrete GPU solutions anymore. nVidia is already a major manufacturer of chipsets for both Intel and AMD; they are also a prominent GPU maker. Intel already manufactures chipsets and integrated GPU solutions (in fact, they dominate the IGP market). Now AMD/ATI manufacture CPUs, GPUs, and chipsets. The synergy that I spoke of is something of which all of these major players are aware. I think graphics are headed towards, for lack of a better term, an all-IGP market, whether as an on-die CPU/GPU or in coprocessor mode (via HyperTransport with AMD/ATI, for example).

AMD needs two things from ATI to compete in this emerging chipset/CPU/GPU market: the chipset and GPU knowledge. I think this is why you've seen nVidia looking towards CPU production. The remaining player is the biggest--Intel--and it remains to be seen just when and how they will address an emerging high-end, integrated chipset/CPU/GPU market but, make no mistake, I think it's going to happen sooner rather than later.

Heck, we have ATI & nVidia talking about using a second or third GPU for physics processing--what happens when you buy a quad-core system that can dynamically allocate cores to CPU or GPU functions?
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
Truthfully, my guess would be that AMD/ATI will specifically target the mid- to high-end market with these solutions. Intel is a behemoth when it comes to low-end IGP. AMD may be able to leverage ATI's intellectual property to provide performance/watt solutions in the mid/high end with which nVidia can't compete.

This is all rampant speculation, but I don't see AMD/ATI giving up the high-end sector. If their choice is either a) going after Intel, or b) going after nVidia, I think that 'b' is a much more likely option. We'll see if nVidia can ramp up CPU development quickly enough to become a player in the future, largely integrated market.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Ulfhednar
So it doesn't mention Fusion, yet they're working on Fusion. But, according to you, because it doesn't mention graphics cards, they must be ditching them.

Do you see how obviously oblivious you are to even the simplest logic here?

ATI graphics cards, be they budget or enthusiast level, are going nowhere. Only a few months until we all get to see you eat your words.

As I keep saying: the more you believe all the propaganda and FUD you are spreading, the more disappointed you will be when you come to realise you were talking crap.

Dude... you don't have a clue.
There's no logic in business beyond profit, profit, profit.

When you grow up, and if you ever work in the business world, you might understand.
I'm going to reply to the other guy, cuz you're a waste of time.

Originally posted by: WelshBloke
But it's stretching it a bit to think that AMD can fight both NV and Intel in an all-out two-front war when ATI wasn't really holding its own against Nvidia in a single-front war.

But can NV fight both Intel and AMD in the future?

But everyone besides YOU realizes that the market is shifting, and will keep shifting, towards a unified GPU/CPU, so that's what AMD is probably working on.

And you think NV are well placed in this situation because...?

Both really good questions... there are even worse "answers" out there for these than for the fate of ATI's high end.
In the end, it doesn't matter much whether ATI's high end exists or not; it won't really benefit AMD to continue it at all.
They need and want Intel's HUGE market, not Nvidia's relatively small market.

But Intel has been buying significant chunks of Nvidia; something is up there. NV likely won't be going anywhere, and if they do, it of course looks like Intel would gobble them up.
As far as one can tell, they do intend to create a "fusion" CPU/GPU. But that's interesting, because they are a fabless company! So I've been giving this some thought.

If worst comes to worst? They ride the AMD/Intel line, make chipsets, and profit from both, while dominating the standalone GPU market. Not a horrible position to be in!
They sit around and fight off AGEIA with expensive SLI solutions! :p

But the best-case scenario (as if that one above is BAD) is that they get purchased by Intel.
The likely scenario is that Intel and NV get closer, rather than an outright purchase.
That's what it appears to be shaping up as.

I think all three are now on stable ground, as the article suggests (http://www.bit-tech.net/hardware/2006/07/24/how_it_all_shakes_out_for_4/2.html).
Intel/AMD/NV are all sitting pretty now.

Before, ATI wasn't really looking so hot.

I can't really give as good a thesis on the future of Nvidia as I can on the AMD/ATI merger, because we know so much more about AMD's intentions with ATI, as they are clearly stated.
But NV has nothing but a bright future and plenty of options.

They have their Sony alliance, certainly nothing to sneeze at (nor would Sony be a bad company to pair up or merge with), their IBM connection, their Intel connection, their AMD connection, and they stand on their own feet pretty damn well (beyond being fabless).

So I don't think it's time to pull your money out of Nvidia stock, by any stretch. They are highly desirable to all the aforementioned companies, as a partner at the very least.

They are kind of the "golden nugget" stuck in the middle of everyone right now, and better off today, stability- and business-wise, than they ever have been.

There's little incentive for all those big companies to put little Nvidia out of business. They are too busy fighting each other, and Nvidia will continue to become more and more niche, yet remain as desirable to the big boys as ever.

AMD would have LOVED to purchase Nvidia instead of ATI.
But ATI was not of the same caliber or performance as Nvidia, hence the lower valuation; Nvidia would have come at a far more expensive price.

Lots of bang for the buck with the ATI purchase. Just because they had a failing and lackluster business model doesn't mean they didn't have a ton of great technology and talent.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: dreddfunk
Truthfully, my guess would be that AMD/ATI will specifically target the mid- to high-end market with these solutions. Intel is a behemoth when it comes to low-end IGP. AMD may be able to leverage ATI's intellectual property to provide performance/watt solutions in the mid/high end with which nVidia can't compete.

This is all rampant speculation but I don't see AMD/ATI giving up the high-end sector. If their choice is either a) going after Intel, or, b) going after nVidia, I think that 'b' is a much more likely option. We'll see if nVidia can ramp up CPU development quickly enough to become an player in the future, largely integrated market.

This makes some sense, but Intel's market is far more desirable than Nvidia's: it's larger and more stable, since it's been around far longer.
Sell a CPU + chipset + integrated GPU in a Dell? Or sell just a GPU in a Dell, with the rest being someone else's equipment?
I know which market I'd want.

And AMD was making headway against Intel before... imagine now!
It's exciting, because this purchase was bad news for Intel.

I'm not so sure about Nvidia's CPU development. As I said in my other post, they are a fabless company; they need a partner, whether IBM, Sony, Intel, TSMC, or AMD. None of those are going to tell Nvidia to take a hike. It will be interesting, to say the least.

AMD doesn't care about the former hate between ATI and NV. ATI as we knew it is dead; it's highly likely they'll trade technology or partner up on projects.

It's certainly not out of the realm of possibility now.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
A couple of things. First, ATI was fabless. They aren't fabless anymore. Last time I checked, AMD (now AMD/ATI) had a number of fabs and the know-how to use them. They may not be using AMD fabs for a little while but it will come, eventually, in one form or another. nVidia, in the long run, will need a partner with fabs, in all likelihood.

Second, I'm fairly confident that, unless something drastically changes about how companies are profitable, there won't be a large 'discrete' graphics market as we understand it 10 years from now. At the very least, GPUs are going to move onboard; more likely still, they'll become dynamically allocated resources within a multipurpose CPU/GPU core. It only makes sense. If we look historically at the development of chipsets and motherboards (as well as CPUs), we see the gradual integration of other components, whether interfaces (USB, IEEE 1394), audio, graphics, processors (FPU), controllers (memory), and so on, onto either the motherboard or the 'CPU' itself.

Integration provides for greater profitability. You will have a lot of manufacturers providing the designs for discrete components that are then integrated into the chipset/CPU/GPU setup. nVidia has great intellectual property in two of these areas (chipset and GPU), but the question will inevitably be how they, as a company, will address the changing conditions of the market.

In the short term, I think they're in a great position. In the long term, if CPUs and GPUs truly converge, they'll need to make some drastic adjustments, as they will be competing with companies many times larger than themselves. Personally, I think nVidia has to hope that Intel doesn't try to develop its own high-end GPU intellectual capital (if it hasn't already). In that scenario, nVidia is redundant to the market's needs.

In any event, speculating is useless and, quite frankly, these last posts have added nothing to our understanding of the market except that there are a lot of people that have ATI/nVidia brand loyalty.
 

dreddfunk

Senior member
Jun 30, 2005
358
0
0
To answer you, Crusader, the question isn't really about which market is most profitable, at least initially. Initially, it is about which market gives your company the best chance to grow and gain brand awareness. AMD has proven very flexible: when Intel had the high end sewn up, AMD played in the low-end market. When AMD produced a winner in the high-end market, they made Intel pay (to the degree that they could). Of course, I think AMD/ATI will compete in the low-end market, sooner or later. I just think that we have to look at where the convergence is going: to completely integrated CPU/GPUs. When that happens, the 'integrated' market will be one and the same with the high-end market. In this situation, nVidia stands to lose a lot, unless they partner up or get really big, really fast.
 

WelshBloke

Lifer
Jan 12, 2005
33,112
11,292
136
Originally posted by: Crusader
Originally posted by: Ulfhednar
So it doesn't mention Fusion, yet they're working on Fusion. But, according to you, because it doesn't mention graphics cards, they must be ditching them.

Do you see how obviously oblivious you are to even the simplest logic here?

ATI graphics cards, be they budget or enthusiast level, are going nowhere. Only a few months until we all get to see you eat your words.

As I keep saying: the more you believe all the propaganda and FUD you are spreading, the more disappointed you will be when you come to realise you were talking crap.

Dude... you don't have a clue.
There's no logic in business beyond profit, profit, profit.

When you grow up, and if you ever work in the business world, you might understand.
I'm going to reply to the other guy, cuz you're a waste of time.

Originally posted by: WelshBloke
But it's stretching it a bit to think that AMD can fight both NV and Intel in an all-out two-front war when ATI wasn't really holding its own against Nvidia in a single-front war.

But can NV fight both Intel and AMD in the future?

But everyone besides YOU realizes that the market is shifting, and will keep shifting, towards a unified GPU/CPU, so that's what AMD is probably working on.

And you think NV are well placed in this situation because...?

Both really good questions... there are even worse "answers" out there for these than for the fate of ATI's high end.
In the end, it doesn't matter much whether ATI's high end exists or not; it won't really benefit AMD to continue it at all.
They need and want Intel's HUGE market, not Nvidia's relatively small market.

But Intel has been buying significant chunks of Nvidia; something is up there. NV likely won't be going anywhere, and if they do, it of course looks like Intel would gobble them up.
As far as one can tell, they do intend to create a "fusion" CPU/GPU. But that's interesting, because they are a fabless company! So I've been giving this some thought.

If worst comes to worst? They ride the AMD/Intel line, make chipsets, and profit from both, while dominating the standalone GPU market. Not a horrible position to be in!
They sit around and fight off AGEIA with expensive SLI solutions! :p

But the best-case scenario (as if that one above is BAD) is that they get purchased by Intel.
The likely scenario is that Intel and NV get closer, rather than an outright purchase.
That's what it appears to be shaping up as.

I think all three are now on stable ground, as the article suggests (http://www.bit-tech.net/hardware/2006/07/24/how_it_all_shakes_out_for_4/2.html).
Intel/AMD/NV are all sitting pretty now.

Before, ATI wasn't really looking so hot.

I can't really give as good a thesis on the future of Nvidia as I can on the AMD/ATI merger, because we know so much more about AMD's intentions with ATI, as they are clearly stated.
But NV has nothing but a bright future and plenty of options.

They have their Sony alliance, certainly nothing to sneeze at (nor would Sony be a bad company to pair up or merge with), their IBM connection, their Intel connection, their AMD connection, and they stand on their own feet pretty damn well (beyond being fabless).

So I don't think it's time to pull your money out of Nvidia stock, by any stretch. They are highly desirable to all the aforementioned companies, as a partner at the very least.

They are kind of the "golden nugget" stuck in the middle of everyone right now, and better off today, stability- and business-wise, than they ever have been.

There's little incentive for all those big companies to put little Nvidia out of business. They are too busy fighting each other, and Nvidia will continue to become more and more niche, yet remain as desirable to the big boys as ever.

AMD would have LOVED to purchase Nvidia instead of ATI.
But ATI was not of the same caliber or performance as Nvidia, hence the lower valuation; Nvidia would have come at a far more expensive price.

Lots of bang for the buck with the ATI purchase. Just because they had a failing and lackluster business model doesn't mean they didn't have a ton of great technology and talent.


If Intel were to buy NV it would be awful (except for the shareholders). Intel need a lot less from NV than AMD needed from ATI.

Intel would acquire NV's intellectual property and trash the company; they don't need much of what NV has.

I don't know how much profit they would get out of chipsets if it wasn't for SLI. They never sold that many Intel chipsets, and now, with AMD making their own as well...?

I think NV are well placed for the near future; in the long term... dodgy ground.

 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Sorry to pull the subject away from the political side of things, but does anyone know what the 8800GTX will draw in amps? Kinda curious what's the minimum amperage needed on a 12V rail for one of these beasts. Or for two beasts in SLI? This inquiring mind wants to know.

Thanks,

Skott
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: Ulfhednar
Originally posted by: Crusader
ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales: it's the last like this (high end) from them. After this, everything moves to a pro-AMD stance.
Hahahahahahaha, I can't wait to see you eat those words when the refreshes are out and whatever next generation comes after R600 and G80 is over and done with. AMD have said over and over that the ATI brand and Radeon product family is going nowhere, and is going to continue on as the flagship graphics product of AMD. :laugh:

Of course there will be R600 refreshes and minor chips - what do you think "AMD" have been/are working on up until R600 launches? If you think it's R600, then you are a fool and don't have the first clue how the industry works...
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: Ulfhednar
Originally posted by: Crusader
ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales: it's the last like this (high end) from them. After this, everything moves to a pro-AMD stance.
Hahahahahahaha, I can't wait to see you eat those words when the refreshes are out and whatever next generation comes after R600 and G80 is over and done with. AMD have said over and over that the ATI brand and Radeon product family is going nowhere, and is going to continue on as the flagship graphics product of AMD. :laugh:

Of course there will be R600 refreshes and minor chips - what do you think "AMD" have been/are working on up until R600 launches? If you think it's R600, then you are a fool and don't have the first clue how the industry works...

we already know with absolute certainty that you don't know how it works . . . and you are annoying the hell out of everyone by pretending you do have a clue

the only one getting fooled is yourself
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Originally posted by: Skott
Sorry to pull the subject away from the political side of things, but does anyone know what the 8800GTX will draw in amps? Kinda curious what's the minimum amperage needed on a 12V rail for one of these beasts. Or for two beasts in SLI? This inquiring mind wants to know.

Thanks,

Skott

It's got to be under the ATX 2.0 standard, which specifies 18 amps per 12V rail. The wattage requirement, though, is 40 or 50 watts higher than for the previous GeForce generation: the 7950GX2 called for a 400-watt PSU in total, while the 8800GTX requires a 450-watt PSU.

Edit to add:

I missed the SLI part. You probably won't be able to run SLI on a standard ATX 2.0-compliant PSU; a large single-rail or otherwise special PSU will probably be needed (I'm guessing, but I find it hardly likely that a standard unit would cut it).

So really you are getting 7900GTX SLI / X1950 CrossFire performance out of a single card (with a ton more features like DX10, 128-bit HDR, etc.). But for SLI, I can't imagine you could use anything on the market today besides the Quad SLI-certified PSUs or a large single-rail PSU (45 amps or more?).

As an aside, if you don't have a large single rail already, I'd suggest getting one of the Nvidia-certified PSUs when the GeForce 8 is launched.
Nvidia is working with PSU manufacturers as we speak... or so I'm told.
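(To put those rail numbers in perspective, here is a rough sketch of the watts-to-amps arithmetic; the per-card 12V power draws below are illustrative assumptions, not figures from this thread.)

```python
# Rough watts-to-amps arithmetic for a 12V rail (P = V * I, so I = P / V).
def rail_amps(watts_on_12v, rail_volts=12.0):
    """Current drawn from the 12V supply for a given 12V power draw."""
    return watts_on_12v / rail_volts

# Illustrative card draws (assumptions, not numbers from the post):
single_card_amps = rail_amps(145)   # one card pulling ~145 W from the 12V rail
sli_pair_amps = rail_amps(2 * 145)  # two cards in SLI

print(f"single card: ~{single_card_amps:.1f} A on 12V")  # ~12.1 A
print(f"SLI pair:    ~{sli_pair_amps:.1f} A on 12V")     # ~24.2 A
# Against ATX 2.0's 18 A per-rail guideline, one card fits on a single rail
# with headroom, while an SLI pair wants multiple 12V rails or a beefier
# single rail, which lines up with the advice in the post above.
```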
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: apoppin
Originally posted by: Gstanfor
Originally posted by: Ulfhednar
Originally posted by: Crusader
ATI R600
-This is ATI's last attempt, so it's going to be good. And that's why they don't care about missing Christmas sales: it's the last like this (high end) from them. After this, everything moves to a pro-AMD stance.
Hahahahahahaha, I can't wait to see you eat those words when the refreshes are out and whatever next generation comes after R600 and G80 is over and done with. AMD have said over and over that the ATI brand and Radeon product family is going nowhere, and is going to continue on as the flagship graphics product of AMD. :laugh:

Of course there will be R600 refreshes and minor chips - what do you think "AMD" have been/are working on up until R600 launches? If you think it's R600, then you are a fool and don't have the first clue how the industry works...

we already know with absolute certainty that you don't know how it works . . . and you are annoying the hell out of everyone by pretending you do have a clue

the only one getting fooled is yourself

whatever you reckon, batshit.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: Gstanfor
Originally posted by: apoppin


we already know for an absolute certainty that you don't know how it works . . . and you are annoying the hell out of everyone pretending you do have a clue

the only one getting fooled is yourself

whatever you reckon, batshit.
that was a very clever response . . . for you

you show the emotional maturity of a 4th grader: when you have nothing to reply with, you call names

i am SO hurt

:D





 

lopri

Elite Member
Jul 27, 2002
13,314
690
126

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: secretanchitman
Mmm, I'm hoping the Inq is right about the G80 demo of the Playboy model (Adrianne Curry is on the cover - February 2006).

G80 looks nice.

I have seen a pic of her render... not so good-looking :(