So seems like AMD/ATI totally missed it this time

Gomce

Senior member
Dec 4, 2000
812
0
76
And who can blame them? They wanted to play the good capitalist (talking about AMD here), sitting on their good product (the Athlon 64) for far too long with virtually zero innovation, enjoying rising sales, profits, and share prices.

The K7 architecture is what, 5+ years old now?

I've been using AMD since the 166 MHz days, then a Duron 700@900, a T-bird 1400, an A64 3000+, and so on. The performance was good, although the accompanying chipsets weren't so stable, and you had to choose carefully what you bought.

But AMD, IMO, abused the underdog card for far too long. The A64 ended up at $400+, and AMD quickly became the new Intel, a period that lasted almost 18 months!

-

I'm glad Intel went with the price-reduction strategy, first with the Pentium D and then with the awesome-performing C2D. I now have four systems, all with Intel processors, which run without any problems with all sorts of RAM (DDR, DDR2, value and enthusiast alike).

------

Same thing with ATI. They had a great run with the 9700 Pro!

But what happened afterwards? 9800 Pro > X800 > X1800 (lol) > X1900.

Did we see any innovation or breakthrough? No; same mistake as AMD!

One is led to conclude that the A64 and 9700 gems were just extremely lucky shots by these two companies.


I'm saddened to think that there will be nothing in the next 3-4 years to challenge Intel/Nvidia.

The 8800 series is an amazing breakthrough, and the same goes for the C2D. Sadly, without competition, Nvidia's prices will now remain stagnant, and you can't expect anything more interesting than an 8800 GTX in the next two years.


 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
The K8 and the R9700 weren't the only awesome products made by those two companies. The original Athlon was great during the P3/P4 days, and ATI's X800 line competed very well with what nVidia had at the time.

All early signs point to the R600 *competing* with nVidia, at the very least with the 8800 GTS, which isn't too far off the GTX.
 

MadBoris

Member
Jul 20, 2006
129
0
0
"Sadly, without competition, Nvidia's prices will now remain stagnant, and you can't expect anything more interesting than an 8800 GTX in the next two years."

Where do you get this information?
The 8800 has been available for 7 months. Are you saying the next-gen GPU will take 2.5 years?

Actually, it will likely be a little less than another year:
the 6800 came out in April 2004,
the 7800 GTX around June 2005,
and the 8800 GTX in November 2006.

So we are looking at roughly 15 months between generations. I would imagine Nvidia will try to hit the November/December 2007 holiday window.
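For what it's worth, a quick back-of-the-envelope check of that cadence (month-level launch dates; the 8800 GTX actually shipped in November 2006):

```python
# Rough gap, in months, between successive NV flagship launches.
# Dates are approximate, month-level figures.
launches = [
    ("6800 Ultra", 2004, 4),   # April 2004
    ("7800 GTX",   2005, 6),   # June 2005
    ("8800 GTX",   2006, 11),  # November 2006
]

gaps = [
    (y2 - y1) * 12 + (m2 - m1)
    for (_, y1, m1), (_, y2, m2) in zip(launches, launches[1:])
]

print(gaps)                   # [14, 17]
print(sum(gaps) / len(gaps))  # 15.5 -> "about 15 months" holds up
```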

Nvidia was king in the early days, ATI stepped up for a while (and finally got their drivers in better shape), and now Nvidia is stepping up again. It's good competition.
 

nanaki333

Diamond Member
Sep 14, 2002
3,772
13
81
AMD also pulled an Intel with the sockets. Intel has been using Socket 775 for a while now for their last couple of chips. AMD took what, three sockets to get their A64s right? S754, S939, AM2. They didn't have the A64 in S940 or Socket F, did they? I think those were Opteron-only. Intel used to be the king of socket changes, while AMD rode out Socket A for the Athlon, T-bird, XP, and Sempron.
 

Extelleron

Diamond Member
Dec 26, 2005
3,127
0
71
Originally posted by: nanaki333
AMD also pulled an Intel with the sockets. Intel has been using Socket 775 for a while now for their last couple of chips. AMD took what, three sockets to get their A64s right? S754, S939, AM2. They didn't have the A64 in S940 or Socket F, did they? I think those were Opteron-only. Intel used to be the king of socket changes, while AMD rode out Socket A for the Athlon, T-bird, XP, and Sempron.

Intel may be using the same socket, but older LGA 775 motherboards will not run Pentium Ds, and (with a few exceptions) boards from the P4/PD era will not run C2Ds. The fact that Intel uses the same socket is irrelevant when you keep needing to change motherboards with each new CPU release.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
I never understood the whole socket debacle.

Why AMD needed both Socket 939 and Socket 940 never made sense to me; they're pretty much the same thing.

DDR2 wasn't even necessary for the A64, and AMD should have let the dust settle a bit before jumping the gun.
 
May 8, 2007
86
0
0
Originally posted by: MadBoris
"Sadly, without competition, Nvidia's prices will now remain stagnant, and you can't expect anything more interesting than an 8800 GTX in the next two years."

Where do you get this information?
The 8800 has been available for 7 months. Are you saying the next-gen GPU will take 2.5 years?

Actually, it will likely be a little less than another year:
the 6800 came out in April 2004,
the 7800 GTX around June 2005,
and the 8800 GTX in November 2006.

So we are looking at roughly 15 months between generations. I would imagine Nvidia will try to hit the November/December 2007 holiday window.

Nvidia was king in the early days, ATI stepped up for a while (and finally got their drivers in better shape), and now Nvidia is stepping up again. It's good competition.



The reason innovation would stagnate is that, without a competitor, there is not nearly as much incentive to invest large amounts of money in improving the product, not to mention there is almost NO incentive to lower prices.

A market without competition is a bad market.
 

Gomce

Senior member
Dec 4, 2000
812
0
76
Greed: to make you change CPU and motherboard every time you plan on upgrading. Either that, or incompetence.

 

tatteredpotato

Diamond Member
Jul 23, 2006
3,934
0
76
I'd say AMD has done much more innovation in the CPU department over the past several years than Intel. While Intel decided to pursue an architecture that would let them put big numbers on boxes, AMD did it right and made an efficient processor. AMD implemented the first x86-compatible 64-bit processor and launched the first dual-core processor. It's also looking like AMD will have the first native quad-core processor on the market. I realize it makes no difference to people whether the quad core is native or two chips, but it's a push forward in CPU design. Also, let's not forget HyperTransport, which allows for massive bandwidth in 2P and 4P systems. And typically, AMD only needs a socket change for a memory change, especially now that they have a unified socket.

As far as Intel goes, they made a crappy architecture for four years, got tired of getting beaten by AMD, and threw their massive R&D weight at retaking the performance crown. The only real innovation I can think of from Intel is Hyper-Threading, and that was so good that Intel decided to drop it from the C2D line of chips.

Also, why criticize AMD for trying to sell processors at Intel prices? They operate on about 1/4 to 1/5 of the market, and they have to sell their products for less than Intel. This is why I stick with AMD; they do things right, and they manage to do it on a tiny budget compared to Intel. You criticize AMD for sticking with a design for too long, but the C2D is based on the Pentium III architecture. You don't just redesign a chip for the hell of it; if you have a good design, you refine it and improve upon it, and that's what we're going to see from AMD this fall when they launch the Barcelona and Agena processors.

I'd like to know what amazing CPU innovations Intel has brought forth in recent history.
 

defiantsf

Member
Oct 23, 2005
132
0
0
One notable incentive for AMD to design a highly efficient (high-IPC) CPU: they know their fab process will always lag Intel's. The brute-force, high-frequency design approach is not an option for them. Necessity is the mother of most, if not all, inventions.

 

aka1nas

Diamond Member
Aug 30, 2001
4,335
1
0
Originally posted by: SickBeast
I never understood the whole socket debacle.

Why AMD needed both Socket 939 and Socket 940 never made sense to me; they're pretty much the same thing.

DDR2 wasn't even necessary for the A64, and AMD should have let the dust settle a bit before jumping the gun.

They should not have produced Socket 754 at all, and instead ensured that the 939 memory controller could handle single-channel configurations for budget setups. 940 was still necessary for the ECC/workstation platform.

They were kind of stuck with the AM2 situation. AMD doesn't have the market share to force the memory manufacturers to stick with DDR. Intel had started the move two years prior and was at that point completely converted to DDR2, which meant the vast majority of OEMs were on DDR2.

If they had managed to get some architectural improvements into the first AM2 chips on top of the die shrink, the platform would likely have been better received.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: Gomce
And who can blame them? They wanted to play the good capitalist (talking about AMD here), sitting on their good product (the Athlon 64) for far too long with virtually zero innovation, enjoying rising sales, profits, and share prices.

The K7 architecture is what, 5+ years old now?

I've been using AMD since the 166 MHz days, then a Duron 700@900, a T-bird 1400, an A64 3000+, and so on. The performance was good, although the accompanying chipsets weren't so stable, and you had to choose carefully what you bought.

But AMD, IMO, abused the underdog card for far too long. The A64 ended up at $400+, and AMD quickly became the new Intel, a period that lasted almost 18 months!

-

I'm glad Intel went with the price-reduction strategy, first with the Pentium D and then with the awesome-performing C2D. I now have four systems, all with Intel processors, which run without any problems with all sorts of RAM (DDR, DDR2, value and enthusiast alike).

------

Same thing with ATI. They had a great run with the 9700 Pro!

But what happened afterwards? 9800 Pro > X800 > X1800 (lol) > X1900.

Did we see any innovation or breakthrough? No; same mistake as AMD!

One is led to conclude that the A64 and 9700 gems were just extremely lucky shots by these two companies.


I'm saddened to think that there will be nothing in the next 3-4 years to challenge Intel/Nvidia.

The 8800 series is an amazing breakthrough, and the same goes for the C2D. Sadly, without competition, Nvidia's prices will now remain stagnant, and you can't expect anything more interesting than an 8800 GTX in the next two years.

Let's see:

NV40 vs R4xx
Fastest GPU, 1st round: 6800 Ultra
Fastest GPU, 2nd round: X800 XT PE
Fastest GPU, 3rd round: X850 XT PE
High-end bang-for-buck GPU, 1st round: 6800 GT
High-end bang-for-buck GPU, 2nd round: X800 XL
High-end bang-for-buck GPU, 3rd round: X800 GTO2 softmodded to X850 XT PE
Midrange bang-for-buck GPU, 1st round: 6600 GT
Midrange bang-for-buck GPU, 2nd round: X800 GTO
Fastest GPU before the 7800 GTX: X850 XT PE
Fastest GPU at the end of the war: X850 XT PE
Best high-end bang-for-buck GPU at the end of the war: X800 XL
Best midrange GPU at the end of the war: X800 GTO2

G70 vs R5xx
Fastest GPU, 1st round: 7800 GTX
Fastest GPU, 2nd round: X1900 XTX
High-end bang-for-buck GPU, 1st round: X1800 XT 512MB for $300
High-end bang-for-buck GPU, 2nd round: X1900 XT 512MB for $300
High-end bang-for-buck GPU, 3rd round: X1950 XT 256MB for $250
Midrange bang-for-buck GPU, 1st round: none at $200 in the R5xx or G70 series
Midrange bang-for-buck GPU, 2nd round: X1950 Pro 256MB for $200
Fastest GPU before the 8800 GTX: Nvidia 7950 GX2
Fastest GPU at the end of the war: dual X1950 Pro, which can compete with the 8800 GTX
Best high-end bang-for-buck GPU at the end of the war: X1950 XT 256MB
Best midrange GPU at the end of the war: X1950 Pro 256MB

Also, the 8800 wasn't an amazing breakthrough... maybe to the eyes of a newbie.
 

RayvinAzn

Member
May 10, 2007
30
0
0
People trash AMD for jumping from socket to socket, but Intel really isn't all that different. The multitude of Intel chipsets that determine compatibility between the Core 2 Duo and the Pentium D (and Pentium 4) is just as annoying, and still requires a new motherboard. At least with a new socket you can't accidentally drop an incompatible processor into your machine and hope it works.

That being said, I agree that DAAMIT needs to get their act together. Hopefully May 14th will be the start of that, but there's no way to know for sure until four days from now. Here's hoping that R600 and Barcelona keep the competition healthy.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Originally posted by: BlameCanada
I'd say AMD has done much more innovation in the CPU department over the past several years than Intel. While Intel decided to pursue an architecture that would let them put big numbers on boxes, AMD did it right and made an efficient processor. AMD implemented the first x86-compatible 64-bit processor and launched the first dual-core processor. It's also looking like AMD will have the first native quad-core processor on the market. I realize it makes no difference to people whether the quad core is native or two chips, but it's a push forward in CPU design. Also, let's not forget HyperTransport, which allows for massive bandwidth in 2P and 4P systems. And typically, AMD only needs a socket change for a memory change, especially now that they have a unified socket.

As far as Intel goes, they made a crappy architecture for four years, got tired of getting beaten by AMD, and threw their massive R&D weight at retaking the performance crown. The only real innovation I can think of from Intel is Hyper-Threading, and that was so good that Intel decided to drop it from the C2D line of chips.

Also, why criticize AMD for trying to sell processors at Intel prices? They operate on about 1/4 to 1/5 of the market, and they have to sell their products for less than Intel. This is why I stick with AMD; they do things right, and they manage to do it on a tiny budget compared to Intel. You criticize AMD for sticking with a design for too long, but the C2D is based on the Pentium III architecture. You don't just redesign a chip for the hell of it; if you have a good design, you refine it and improve upon it, and that's what we're going to see from AMD this fall when they launch the Barcelona and Agena processors.

I'd like to know what amazing CPU innovations Intel has brought forth in recent history.

Hyper-Threading wasn't dropped from the Core 2 Duo; the C2D never had it, AFAIK. I believe Intel is bringing it back (another incarnation of it) with Nehalem.

BTW, Intel "thought" they had a breakthrough with the Pentium 4 NetBurst microarchitecture. It didn't turn out that way. Intel's only REAL flaw is that they beat it to death and waited too long to change the way they do things.

AMD "did" do the right thing with the Athlon. For what they had to work with, they really came up with impressive hardware (though ONLY in comparison to the P4; anything looked good compared to the P4). If the P4 had been a stellar success, where would AMD be right now? They would never have had the top product, and history would have been different. But as the old saying goes, "What will be will be."

They can come back with Barcelona, hopefully. Stockholders are hoping like there's no friggin' tomorrow. ;)
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
I can see Barcelona being a killer CPU, but I don't think AMD can compete with Intel in late Q4 2007 if Intel does that crazy price drop. Intel will reduce the prices of its quad-core and dual-core CPUs to crazy, stupid levels where it loses money on every CPU it sells to OEMs and big resellers like Newegg. AMD can't take a price war to that extreme, as AMD isn't loaded with cash.

Intel's strategy is simple: bleed AMD dry with a price war and push AMD to the point of no recovery. Intel's CPU division isn't very profitable at the moment because of the price drops on the current dual-core CPUs. So the future holds doom and gloom for consumers: we'll be back to the state of the P4 NetBurst era, with no competition, by 2010.

And before you people start telling me how profitable Intel is, look at Intel's profits in 2000, 2001, 2003, and 2004, and compare them against Q1 and Q2 2007.
 

Pabster

Lifer
Apr 15, 2001
16,986
1
0
Originally posted by: keysplayr2003
Hyper-Threading wasn't dropped from the Core 2 Duo; the C2D never had it, AFAIK. I believe Intel is bringing it back (another incarnation of it) with Nehalem.

Yep.

BTW, Intel "thought" they had a breakthrough with the Pentium 4 NetBurst microarchitecture. It didn't turn out that way. Intel's only REAL flaw is that they beat it to death and waited too long to change the way they do things.

Yep. But then Chipzilla awoke...

AMD "did" do the right thing with the Athlon. For what they had to work with, they really came up with impressive hardware (though ONLY in comparison to the P4; anything looked good compared to the P4). If the P4 had been a stellar success, where would AMD be right now? They would never have had the top product, and history would have been different. But as the old saying goes, "What will be will be."

One can always say "what if...". There is no question that AMD is responsible for the current state of affairs, that is, top-of-the-line Intel chips for under (way under) $1000. Without the stiff competition AMD has offered over the last several years, we'd all be paying about $1200 for an E4300.

They can come back with Barcelona, hopefully. Stockholders are hoping like there's no friggin' tomorrow. ;)

I'll wait and see. :D
 

Gstanfor

Banned
Oct 19, 1999
3,307
0
0
Originally posted by: SickBeast
I never understood the whole socket debacle.

Why AMD needed both Socket 939 and Socket 940 never made sense to me; they're pretty much the same thing.

DDR2 wasn't even necessary for the A64, and AMD should have let the dust settle a bit before jumping the gun.

Exactly. And on top of that, AMD had no real Celeron equivalent (the S754 Sempron doesn't count in my book) for people to buy. So you had a socket debacle, Intel-style pricing with no sensible budget alternative, and zero innovation afoot. Little wonder things have turned out the way they have.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76

I would say AMD is doing a good thing with AM2+ having backward compatibility with AM2: an AM2+ processor can plug into an AM2 socket, with the HT 3.0 and power-saving features disabled.

Regarding socket changes, both companies are going to have issues. While Socket A was a long-running socket, you would still need to upgrade to support new FSB speeds and newer memory technologies anyway: 266/333/400 FSB and DDR266/333/400, as well as dual-channel DDR.

The same argument applies on the LGA775 side: 915/925 for the initial Pentium 4 with DDR2 and PCI-E, 945/955 for the Pentium D with DDR2-667 and the 1066 FSB, 965/975 "Bad Axe" for the Core 2 Duo and DDR2-800, and P35/X38 for 45nm Core 2s and DDR3.

The major advantage of keeping the same socket is that you can upgrade your motherboard and get these new features while keeping the same processor; the 965 series has the greatest processor support list yet: Celeron D 90nm/65nm 533, Pentium 4 90nm/65nm 533/800, Pentium D 90nm/65nm 800, Core 2 65nm 800/1066/1333.

It's a shame, though, that Bearlake won't support NetBurst, as that would have let it span three process nodes' worth of product lines.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: BlameCanada
I'd say AMD has done much more innovation in the CPU department over the past several years than Intel. While Intel decided to pursue an architecture that would let them put big numbers on boxes, AMD did it right and made an efficient processor. AMD implemented the first x86-compatible 64-bit processor and launched the first dual-core processor. It's also looking like AMD will have the first native quad-core processor on the market. I realize it makes no difference to people whether the quad core is native or two chips, but it's a push forward in CPU design. Also, let's not forget HyperTransport, which allows for massive bandwidth in 2P and 4P systems. And typically, AMD only needs a socket change for a memory change, especially now that they have a unified socket.

As far as Intel goes, they made a crappy architecture for four years, got tired of getting beaten by AMD, and threw their massive R&D weight at retaking the performance crown. The only real innovation I can think of from Intel is Hyper-Threading, and that was so good that Intel decided to drop it from the C2D line of chips.

Also, why criticize AMD for trying to sell processors at Intel prices? They operate on about 1/4 to 1/5 of the market, and they have to sell their products for less than Intel. This is why I stick with AMD; they do things right, and they manage to do it on a tiny budget compared to Intel. You criticize AMD for sticking with a design for too long, but the C2D is based on the Pentium III architecture. You don't just redesign a chip for the hell of it; if you have a good design, you refine it and improve upon it, and that's what we're going to see from AMD this fall when they launch the Barcelona and Agena processors.

I'd like to know what amazing CPU innovations Intel has brought forth in recent history.

Intel launched the first dual-core processor; they beat AMD to the punch on that one by 2-3 days. It's something most AMD fanboys seem to forget.

AMD was first to a native dual core, and will take the native quad-core crown as well.

Intel used a tactic that worked, given its reputation and marketing prowess: if you can make someone believe something, it doesn't matter whether that thing is actually good; only the perception matters.

Unlike AMD, Intel innovates in other departments, which you conveniently ignore; Intel has businesses in other things besides CPUs, whereas AMD, until recently, has been concentrating its efforts almost solely on CPUs.

Intel introduced a lot of good technologies of its own as well: it pushed the DDR2 transition, helping bring higher-bandwidth, lower-power (at the same frequency) memory to the masses, and it pushed PCI-E, which is a great interconnect and allows higher power delivery as well as multi-GPU solutions.

They are also the makers of the Centrino brand, a great mobile solution combining chipset, CPU, and wireless from a single vendor.

In the end it doesn't matter whether Intel innovates or not; the eventual goal is to make tons of moola, and how you get there isn't important as long as you do.

Unlike Intel, which can afford to stay with a design for a while and still make billions of dollars, AMD isn't in good shape if it makes such an error; as the last two quarters show, staying with the K8 as long as they did is now taking its toll, and they don't have a marketing engine to weather the storm like Intel does.

Considering Intel stayed with NetBurst for a good 5.5 years, it did pretty well. But those days are over, and with Intel's tick-tock model, major architectural improvements will happen every two years instead of on the old five-year cycle.

AMD stayed with the K8 for four years, which is a little better than Intel's old cycle, but still a bit too long.
 

coldpower27

Golden Member
Jul 18, 2004
1,676
0
76
Originally posted by: tuteja1986
Let's see:

NV40 vs R4xx
Fastest GPU, 1st round: 6800 Ultra
Fastest GPU, 2nd round: X800 XT PE
Fastest GPU, 3rd round: X850 XT PE
High-end bang-for-buck GPU, 1st round: 6800 GT
High-end bang-for-buck GPU, 2nd round: X800 XL
High-end bang-for-buck GPU, 3rd round: X800 GTO2 softmodded to X850 XT PE
Midrange bang-for-buck GPU, 1st round: 6600 GT
Midrange bang-for-buck GPU, 2nd round: X800 GTO
Fastest GPU before the 7800 GTX: X850 XT PE
Fastest GPU at the end of the war: X850 XT PE
Best high-end bang-for-buck GPU at the end of the war: X800 XL
Best midrange GPU at the end of the war: X800 GTO2

G70 vs R5xx
Fastest GPU, 1st round: 7800 GTX
Fastest GPU, 2nd round: X1900 XTX
High-end bang-for-buck GPU, 1st round: X1800 XT 512MB for $300
High-end bang-for-buck GPU, 2nd round: X1900 XT 512MB for $300
High-end bang-for-buck GPU, 3rd round: X1950 XT 256MB for $250
Midrange bang-for-buck GPU, 1st round: none at $200 in the R5xx or G70 series
Midrange bang-for-buck GPU, 2nd round: X1950 Pro 256MB for $200
Fastest GPU before the 8800 GTX: Nvidia 7950 GX2
Fastest GPU at the end of the war: dual X1950 Pro, which can compete with the 8800 GTX
Best high-end bang-for-buck GPU at the end of the war: X1950 XT 256MB
Best midrange GPU at the end of the war: X1950 Pro 256MB

Also, the 8800 wasn't an amazing breakthrough... maybe to the eyes of a newbie.

Oh, the 8800 GTX is an impressive leap: nearly 2x as fast as the old 7900 GTX, with DX10 support, not to mention higher-quality AF and new AA modes. That kind of leap hasn't been seen since the jump from the 5950 Ultra to the 6800 Ultra on the NV side.

Most of the time it's one or the other: either much-improved functionality with a more modest performance increase, or the majority of the budget spent on performance.

In both the R4xx and R5xx generations, ATI was at a disadvantage to NV in production cost: the X800 GT to match the 6600 GT, and the X1800 GTO to go against the 7600 GT and, later still, the X1650 XT.

The only reason ATI typically has the faster GPU "at the end of the war" is that they have been late to each new generation, so they needed more stopgaps than Nvidia did. You're also failing to account for a few factors: Nvidia had SLI in the GeForce 6 generation, so it held the overall crown against the X850 XT PE anyway and didn't care about single-card performance, given its richer feature set.

The X1950 Pro Dual comes far too late, as it is already competing against Nvidia's 8800 GTX, and it suffers not only from relying on CrossFire but from an inferior feature set as well. Against the 7950 GX2 it would have been a great product, but not against the 8800 GTX. It also seems to be a vendor-only product, not an officially supported SKU like the 7950 GX2.

The only thing going for it is that it sits about where it should on the price/performance ladder: roughly 20% slower than the 8800 GTX for about 20% less, at $430. It does well in certain scenarios like Splinter Cell 3 and X3.
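As a rough sketch of that price/performance point (the 8800 GTX price here is an assumption, roughly $540 street at the time; the X1950 Pro Dual figures are from the post):

```python
# Rough perf-per-dollar comparison. The 8800 GTX price is an assumed
# street price (~$540 at the time); the X1950 Pro Dual numbers
# (about 20% slower, $430) are from the post above.
gtx_perf,  gtx_price  = 1.00, 540.0   # normalize the 8800 GTX to 1.0
dual_perf, dual_price = 0.80, 430.0

gtx_value  = gtx_perf  / gtx_price    # performance per dollar
dual_value = dual_perf / dual_price

# A ratio above 1 would mean the dual card is the better value per dollar.
print(round(dual_value / gtx_value, 2))  # ~1.0: essentially price-proportional
```

In other words, at those numbers the dual card is priced almost exactly in proportion to its performance, not at a discount.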

ATI has been providing bang for the buck at the expense of its own costs, which is not necessarily a good thing, as it has been eating into their profits.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Originally posted by: coldpower27
Oh, the 8800 GTX is an impressive leap: nearly 2x as fast as the old 7900 GTX, with DX10 support, not to mention higher-quality AF and new AA modes. That kind of leap hasn't been seen since the jump from the 5950 Ultra to the 6800 Ultra on the NV side.

Most of the time it's one or the other: either much-improved functionality with a more modest performance increase, or the majority of the budget spent on performance.

In both the R4xx and R5xx generations, ATI was at a disadvantage to NV in production cost: the X800 GT to match the 6600 GT, and the X1800 GTO to go against the 7600 GT and, later still, the X1650 XT.

The only reason ATI typically has the faster GPU "at the end of the war" is that they have been late to each new generation, so they needed more stopgaps than Nvidia did. You're also failing to account for a few factors: Nvidia had SLI in the GeForce 6 generation, so it held the overall crown against the X850 XT PE anyway and didn't care about single-card performance, given its richer feature set.

The X1950 Pro Dual comes far too late, as it is already competing against Nvidia's 8800 GTX, and it suffers not only from relying on CrossFire but from an inferior feature set as well. Against the 7950 GX2 it would have been a great product, but not against the 8800 GTX. It also seems to be a vendor-only product, not an officially supported SKU like the 7950 GX2.

ATI has been providing bang for the buck at the expense of its own costs, which is not necessarily a good thing, as it has been eating into their profits.

Don't bother trying to reason with him.

G80 was a very impressive leap compared to past GPUs. It fixed all the IQ problems of G7x and managed to double its performance, all without even a die shrink.
 

kmmatney

Diamond Member
Jun 19, 2000
4,363
1
81
Originally posted by: Gstanfor
Originally posted by: SickBeast
I never understood the whole socket debacle.

Why AMD needed both Socket 939 and Socket 940 never made sense to me; they're pretty much the same thing.

DDR2 wasn't even necessary for the A64, and AMD should have let the dust settle a bit before jumping the gun.

Exactly. And on top of that, AMD had no real Celeron equivalent (the S754 Sempron doesn't count in my book) for people to buy. So you had a socket debacle, Intel-style pricing with no sensible budget alternative, and zero innovation afoot. Little wonder things have turned out the way they have.

I totally agree. I ended up getting a Socket 754 motherboard way back, when there were no Semprons for S939. I would have preferred an S939 motherboard, but all the CPUs were priced too high, and I wanted a total budget system. It was just plain nonsense that AMD didn't have any budget CPUs for 939.

 

terentenet

Senior member
Nov 8, 2005
387
0
0
Originally posted by: tuteja1986

Let's see:

NV40 vs R4xx
Fastest GPU, 1st round: 6800 Ultra
Fastest GPU, 2nd round: X800 XT PE
Fastest GPU, 3rd round: X850 XT PE
High-end bang-for-buck GPU, 1st round: 6800 GT
High-end bang-for-buck GPU, 2nd round: X800 XL
High-end bang-for-buck GPU, 3rd round: X800 GTO2 softmodded to X850 XT PE
Midrange bang-for-buck GPU, 1st round: 6600 GT
Midrange bang-for-buck GPU, 2nd round: X800 GTO
Fastest GPU before the 7800 GTX: X850 XT PE
Fastest GPU at the end of the war: X850 XT PE
Best high-end bang-for-buck GPU at the end of the war: X800 XL
Best midrange GPU at the end of the war: X800 GTO2

G70 vs R5xx
Fastest GPU, 1st round: 7800 GTX
Fastest GPU, 2nd round: X1900 XTX
High-end bang-for-buck GPU, 1st round: X1800 XT 512MB for $300
High-end bang-for-buck GPU, 2nd round: X1900 XT 512MB for $300
High-end bang-for-buck GPU, 3rd round: X1950 XT 256MB for $250
Midrange bang-for-buck GPU, 1st round: none at $200 in the R5xx or G70 series
Midrange bang-for-buck GPU, 2nd round: X1950 Pro 256MB for $200
Fastest GPU before the 8800 GTX: Nvidia 7950 GX2
Fastest GPU at the end of the war: dual X1950 Pro, which can compete with the 8800 GTX
Best high-end bang-for-buck GPU at the end of the war: X1950 XT 256MB
Best midrange GPU at the end of the war: X1950 Pro 256MB

Also, the 8800 wasn't an amazing breakthrough... maybe to the eyes of a newbie.



Riiiiiiiiight. tuteja, you're such a fanboy. What would a breakthrough be? Ten times the performance of the last series? Naaah; just twice the performance, unified shaders, DX10, more AA modes, and improved IQ.
Quit it and stop posting; you're sounding like a broken record already.
"Fastest GPU at the end of the war: dual X1950 Pro, which can compete with the 8800 GTX." Hear yourself: you're comparing two cards with a single card. A single card that runs cooler, consumes less power than one 1950 Pro, is DX10-compatible, and has much better IQ. And I have a slight impression that that single card beats the two 1950 Pro cards.

/me is happy with one 8800 GTX performing better than two 7900 GTXs in SLI or two 1950 Pros in CrossFire.

Nvidia power:
http://www.crazypc.ro/forum/attachment.php?attachmentid=21161&d=1178821594
http://www.crazypc.ro/forum/attachment.php?attachmentid=21162&d=1178821594
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: coldpower27
Originally posted by: tuteja1986
Let's see:

NV40 vs R4xx
Fastest GPU, 1st round: 6800 Ultra
Fastest GPU, 2nd round: X800 XT PE
Fastest GPU, 3rd round: X850 XT PE
High-end bang-for-buck GPU, 1st round: 6800 GT
High-end bang-for-buck GPU, 2nd round: X800 XL
High-end bang-for-buck GPU, 3rd round: X800 GTO2 softmodded to X850 XT PE
Midrange bang-for-buck GPU, 1st round: 6600 GT
Midrange bang-for-buck GPU, 2nd round: X800 GTO
Fastest GPU before the 7800 GTX: X850 XT PE
Fastest GPU at the end of the war: X850 XT PE
Best high-end bang-for-buck GPU at the end of the war: X800 XL
Best midrange GPU at the end of the war: X800 GTO2

G70 vs R5xx
Fastest GPU, 1st round: 7800 GTX
Fastest GPU, 2nd round: X1900 XTX
High-end bang-for-buck GPU, 1st round: X1800 XT 512MB for $300
High-end bang-for-buck GPU, 2nd round: X1900 XT 512MB for $300
High-end bang-for-buck GPU, 3rd round: X1950 XT 256MB for $250
Midrange bang-for-buck GPU, 1st round: none at $200 in the R5xx or G70 series
Midrange bang-for-buck GPU, 2nd round: X1950 Pro 256MB for $200
Fastest GPU before the 8800 GTX: Nvidia 7950 GX2
Fastest GPU at the end of the war: dual X1950 Pro, which can compete with the 8800 GTX
Best high-end bang-for-buck GPU at the end of the war: X1950 XT 256MB
Best midrange GPU at the end of the war: X1950 Pro 256MB

Also, the 8800 wasn't an amazing breakthrough... maybe to the eyes of a newbie.

Oh, the 8800 GTX is an impressive leap: nearly 2x as fast as the old 7900 GTX, with DX10 support, not to mention higher-quality AF and new AA modes. That kind of leap hasn't been seen since the jump from the 5950 Ultra to the 6800 Ultra on the NV side.

Most of the time it's one or the other: either much-improved functionality with a more modest performance increase, or the majority of the budget spent on performance.

In both the R4xx and R5xx generations, ATI was at a disadvantage to NV in production cost: the X800 GT to match the 6600 GT, and the X1800 GTO to go against the 7600 GT and, later still, the X1650 XT.

The only reason ATI typically has the faster GPU "at the end of the war" is that they have been late to each new generation, so they needed more stopgaps than Nvidia did. You're also failing to account for a few factors: Nvidia had SLI in the GeForce 6 generation, so it held the overall crown against the X850 XT PE anyway and didn't care about single-card performance, given its richer feature set.

The X1950 Pro Dual comes far too late, as it is already competing against Nvidia's 8800 GTX, and it suffers not only from relying on CrossFire but from an inferior feature set as well. Against the 7950 GX2 it would have been a great product, but not against the 8800 GTX. It also seems to be a vendor-only product, not an officially supported SKU like the 7950 GX2.

The only thing going for it is that it sits about where it should on the price/performance ladder: roughly 20% slower than the 8800 GTX for about 20% less, at $430. It does well in certain scenarios like Splinter Cell 3 and X3.

ATI has been providing bang for the buck at the expense of its own costs, which is not necessarily a good thing, as it has been eating into their profits.

If it wasn't for ATI, you wouldn't have seen the outcry for higher-quality AF or the new AA modes. Also, the only richer feature the 6800U had was Shader Model 3.0, which meant crap, since SM 3.0 shaders were supposed to run more efficiently and faster than SM 2.0; look at the Far Cry benchmarks and they'll tell you how much Shader 3.0 actually mattered.

Even when ATI has the bang-for-buck GPU, it sells nowhere near as well as the inferior GPU from NVIDIA.