How could BD pull AMD up?

Status
Not open for further replies.

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I must be missing something, because I don't see how so many people can think that BD (the $290 offering) is going to be faster/better than Intel's 2500K.

The reasons why I think it will be a flop are as follows:
It's got to run hotter, since the cores are clocked so much higher.
It costs $70 more than Intel's comparable offering.
It's late to the market.
AMD's chipsets have traditionally been inferior to Intel's.
It has 2x as much L3 cache as the 2500K, but that cache is clocked only 20% faster than the Phenom II's L3, and there is only 33% more L3 than the PII has.

Given all that, why do some people think it will be faster or a better value than the 2500K? What am I missing? The modules share the L2 cache, but that only gives 512KB per core if you divide it up.

Correct me if I'm wrong, but if AMD wants to succeed, they should either just copy Intel, or stick to making products that aren't quite as fast as Intel's but are a lot cheaper.

In addition to their hotter-running, poorly designed CPUs (it would have been faster if they had gone with a design with less L3 cache clocked the same as the cores; the 6-core Thuban, for example, was a really stupid idea, IMO; they could've just made a well-binned PII X4 with a 3.8-4 GHz core clock and 2.5-2.8 GHz L3 cache), AMD's GPUs and their technologies are pretty inferior to NVIDIA's, so I don't see how they've managed to survive so long.

Thread locked due to many reports of insults, which are against the rules. Be careful, guys; the next step is handing out infractions.

Anandtech Admin
Red Dawn
 
Last edited by a moderator:

Wizlem

Member
Jun 2, 2010
94
0
66
Copying another company is a horrible business idea unless you think you can copy them and then make their product more efficiently. AMD's GPUs are completely competitive with NVIDIA's, and their CPUs are currently targeting throughput over single-threaded performance. Saying their 6-core is a horrible idea is like saying public transportation is a horrible idea: it really depends on how you use it.
 

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Anarchist420 said:
I must be missing something, because I don't see how so many people can think that BD (the $290 offering) is going to be faster/better than Intel's 2500K. [...]

Reasons AMD's Bulldozer will be a success:
  • It's got to run cooler thanks to SOI
  • Throughput has got to be higher since it has 2x more cores
  • Costs $709 less than Intel's comparable offering
  • Launching two quarters before Ivy Bridge
  • AMD's chipsets have traditionally been a much better value than Intel's
  • It has 2x as much L3 cache as the 2500K


Given all of those, I don't see how people could think BD won't be a success. What am I missing? The modules share the L2 cache, which should greatly increase the speed at which threads that share data execute.

Correct me if I'm wrong, but if AMD wants to succeed, they should continue pushing products that focus on what they're good at, like market-leading graphics.

--------

See? I can do it too :biggrin: I think the best reason to believe the $290 FX-8xxx will be competitive with the 2500K and 2600K is that, as a primarily enthusiast CPU (Llano is their mainstream part for 2011), AMD would be downright retarded to try selling it for $290 if it couldn't compete with a $200 CPU from Intel. Yes, Phenom IIs are currently overpriced, but that's because if you already have an AM2+/AM3 mobo you're kind of stuck if you don't want to upgrade the whole thing. Nobody is going to buy into AM3+ if the pricing is that out of whack.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
To counter your argument for why BD will be a failure: AMD has maintained some market share with its current Phenom II architecture despite CPU performance that is two generations behind. If BD is faster, it will at least gain some market share back and become MUCH more competitive in the lucrative server space. BD doesn't have to be faster than SB to be successful.

AMD's GPUs and their technologies are pretty inferior to NVIDIA's, so I don't see how they've managed to survive so long.

That's pretty funny and completely false.

- 9700Pro/9800Pro/XT series > 5800/5900 series. In fact, it wasn't even close that generation.

- X800XT/X850XT series > 6800U

- X1800XT/X1900XT/X1950XT series > 7800/7900/7950 series

- HD4850/4870 set a new benchmark for price/performance vs. the 9800GTX+/GTS250/GTX260. I also recall the HD4850 X2 and HD4870 X2 being the fastest single cards of that generation.

- You could also say the HD6970 is a much better value on the high end today, given that it's less than 5% slower than a GTX580 at 2560x1600 but costs $150 less (napkin math after this list).

- The HD6990 is easily competitive with a GTX590, and an HD6950 2GB Tri-Fire setup has no equal on the NV side.

- Can 1 NV card power 5-6 monitors? :biggrin:
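
Napkin math on that value claim (prices are rough street prices, performance from the 5% figure above; all approximate):

# perf-per-dollar sketch; prices are approximations, not quotes
gtx580_price, hd6970_price = 500.0, 350.0   # the ~$150 gap
gtx580_perf, hd6970_perf = 1.00, 0.95       # GTX580 normalized to 100%
ratio = (hd6970_perf / hd6970_price) / (gtx580_perf / gtx580_price)
print(round(ratio, 2))   # ~1.36x the performance per dollar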

Troll
 
Last edited:

PCboy

Senior member
Jul 9, 2001
847
0
0
AMD's GPUs and their technologies are pretty inferior to NVIDIA's, so I don't see how they've managed to survive so long.

Not sure if serious... and before you pull the "Nvidia-drivers-are-better card":
http://www.zdnet.com/blog/hardware/warning-nvidia-19675-drivers-can-kill-your-graphics-card/7551
http://www.engadget.com/2010/03/05/nvidia-pulls-196-75-driver-amid-reports-its-frying-graphics-car/
http://www.anandtech.com/show/3605
http://www.tomshardware.com/news/Nvidia-196.75-drivers-over-heating,9802.html
http://www.maximumpc.com/article/news/nvidia_yanks_19675_drivers_investigate_overheating_reports
http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/26.html
http://www.tweaktown.com/news/19192...590_why_some_have_gone_up_in_smoke/index.html

I still have a Hercules 3D Prophet III (my favorite card), a 9800 GTX+, and a GTX295 all still running strong, so I am by no means an AMD/ATI fanboy, but to call them inferior is ridiculous. You were probably not around the scene when the R300 GPU first came out and completely dominated the 3D graphics arena.

Then again, I feel like I've been successfully trolled. :hmm:
 

ThatsABigOne

Diamond Member
Nov 8, 2010
4,422
23
81
PCboy said:
Not sure if serious... and before you pull the "Nvidia-drivers-are-better" card... [...]

You were... He needs to be banned from starting new threads.
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
Here's my take when taking a step back and looking at it from a business perspective.

AMD is targeting a very specific market segment with its BD CPU design. That market is not the general market or the gaming market: it's CPUs for workloads that require high-precision floating point and strong FPU performance. Doubling the FPU width will give it a significant advantage in a narrow set of applications... applications used by people who pay very steep margins on their products.
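
To put rough numbers on the FPU-width point, a napkin sketch (assumes the disclosed module design of two 128-bit FMACs per module; the clock is purely my guess, since nothing is confirmed):

# hypothetical peak double-precision throughput for a 4-module BD chip
modules = 4
dp_flops_per_cycle = 8   # 2x 128-bit FMAC = 4 DP lanes, x2 for fused multiply-add
clock_ghz = 3.5          # pure guess; AMD hasn't announced clocks
print(modules * dp_flops_per_cycle * clock_ghz, "GFLOPS peak DP")   # 112.0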

When you can't outright beat your competitor, you have to target a specific niche and own a segment of the market. Llano and Brazos are targeting a pretty broad segment with large volume and low margins (pretty much any machine without a discrete graphics card). BD is taking the opposite tack, targeting a narrow niche with high profit margins. They will market it to the general public and squeeze what they can from that, but the real 'meat' for BD will be simulation, modeling, and similar applications that are, for whatever reason, better suited to processing on the CPU than to GPGPU.

From what I've seen of the Radeon 7xxx series, they are also significantly pumping up compute performance, but that is really early info and we have no idea whether it's just hype or reality.

Now roll all that together, and what do you see? A full-on attack on the simulation and modeling marketplace. Pushing GPGPU into the average computer gives people a taste and gets OpenCL more widely accepted, so apps can be more easily adapted to GPGPU. 7xxx GPUs that might be competitive with NVIDIA architectures for GPGPU. CPUs that are better than Intel's at the CPU-bound side of physics and modeling... they're targeting that market space from all sides. It looks to me like in 5 years they want to be THE name in that space.

They need a small, high-margin, steady segment of the market to help insulate them from general PC volatility and float them through a cycle like K10 vs. C2D. They tried to own servers with Opteron, and it didn't work. To me it looks as if they are trying a different high-margin segment now. At this point it's somewhat speculation; we'll see how the story unfolds when Radeon 7xxx and BD sampling start happening.
 
Last edited:

dac7nco

Senior member
Jun 7, 2009
756
0
0
Considering the OP's name begins with "Anarchist", I'll let this one lie. The last part of his name is 420, so I just assume he's a pot smoking punk bitch with nothing better to do than troll.

Daimon
 

LOL_Wut_Axel

Diamond Member
Mar 26, 2011
4,310
8
81
Anarchist420 said:
I must be missing something, because I don't see how so many people can think that BD (the $290 offering) is going to be faster/better than Intel's 2500K. [...]


So many things wrong with this that I don't even know where to start. First, you don't know the clock speeds of the Bulldozer CPUs; at this point, no one outside AMD does.

Second, you don't know whether it's going to run hot either, because we haven't seen proof of it. Given that it uses SOI technology, that's probably false.

Third, based on pricing, the FX-4110 (and perhaps the FX-6110) competes with the Core i5.

Fourth, you should learn not to make blanket statements like "AMD's chipsets are worse". What exactly are you referring to? Given that most of what you've posted is false, I don't think it's anything substantial.

Fifth, you clearly don't know much about CPU cache: you claimed that Bulldozer (the FX-8110 specifically) has 2x the L3 cache of the 2500K, which is completely false. It has 8MB of L3 cache, as opposed to 6MB for the 2500K. That's not 2x more. Bulldozer does have more L2 cache, though, at 2MB per module (1MB per core, if you want to look at it that way). Sandy Bridge has only 256KB of L2 cache per core.
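
Quick napkin math on those cache numbers (using the leaked FX-8110 specs, which could still change):

# L3: leaked FX-8110 vs. i5-2500K
print(8 / 6)              # ~1.33, i.e. 33% more L3, nowhere near 2x
# L2 per core: 2MB shared per two-core module vs. SB's 256KB per core
print((2048 / 2) / 256)   # 4.0, i.e. 4x the L2 per core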

Sixth, it's hilarious, and an insult, to say that AMD makes "poorly designed" CPUs. Newsflash: the microprocessor is the most complex device mankind has ever invented. It takes an enormous amount of money and extremely skilled engineers to bring out a new CPU architecture.

Seventh, AMD is just as good as (if not better than) NVIDIA when it comes to GPUs right now. They have the fastest graphics card on the market (the Radeon HD 6990), and from an architectural standpoint both sides have their pluses and minuses. Not only that, but as of now their GPUs are immensely more efficient than NVIDIA's, especially in the Performance and Enthusiast categories.

From reading all of the mindless drivel you've posted, I can only come to two conclusions: you're trying to be a troll, or you're clueless.
 
Last edited:

Ares1214

Senior member
Sep 12, 2010
268
0
0
It's got to run hotter, since the cores are clocked so much higher.

Wrong. From the leaks we've seen (accurate or not), BD looks like one cool cat. That is likely due to the module design and SOI.

It costs $70 more than Intel's comparable offering.

Why would it compete against the 2500K? It's competing against the 2600K, which happens to cost the same. It's an 8-core part against an 8-thread part, and I believe the 6-core and 4-core are priced just above and below the 2500K, respectively:

[image: leaked Bulldozer/Llano price list]


It's late to the market.

This is true, but that won't have much of an effect on performance or on how much of a "flop" it is.

AMD's chipsets have traditionally been inferior to Intel's.

The 990FX supports NVIDIA SLI now and has native USB 3.0. HyperTransport was also just upgraded. Not to mention SB has a measly 24 lanes while the 990FX has 32; with more and more powerful GPUs coming out, x8 is actually starting to become a noticeable bottleneck. I'm not saying the 990FX is better than Z68, just that it is not "traditionally inferior" to it.
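
Quick napkin math on the x8 point (assuming PCIe 2.0 at ~500MB/s usable per lane per direction):

# per-direction bandwidth for common slot widths
mb_per_lane = 500   # PCIe 2.0, after 8b/10b encoding overhead
for lanes in (8, 16):
    print("x%d: %.1f GB/s per direction" % (lanes, lanes * mb_per_lane / 1000.0))
# x8 = 4.0 GB/s; two cards on SB run x8/x8, while 990FX can feed both at x16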

It has 2x as much L3 cache as the 2500K, but that cache is clocked only 20% faster than the Phenom II's L3, and there is only 33% more L3 than the PII has.

Oh, and you know exactly how this will affect performance... how? How about waiting for the benchmarks on this one. For all you know, the advantage of 2x the cache might win it for BD. Perhaps not.

Correct me if I'm wrong, but if AMD wants to succeed, they should either just copy Intel, or stick to making products that aren't quite as fast as Intel's but are a lot cheaper.

...Copy Intel? That's a good way to get sued. Being mediocre? That's a good business plan? The only way AMD can beat Intel is to not play Intel at their own game, so: exactly the opposite of copying them. Intel has a massive R&D lead, way better manufacturing capabilities, and is a half node ahead of AMD for the foreseeable future. The only way AMD can make better CPUs than Intel is to design them entirely differently, and for an entirely different purpose. First there was the clock speed race, then the monolithic core race, then the core count race. AMD can't beat Intel in all of these races. AMD needs to pursue different areas, which is exactly what they have done: Zacate, hitting Intel where they are weak; Llano, pairing a good-enough CPU with a much faster iGPU than Intel's; and now BD, where the module design is meant to be a different approach from Intel's Hyper-Threading. In the future, Intel will likely continue to do what they are good at, making fast CPUs and manufacturing. AMD, however, wants to move on to heterogeneous computing, a field where they could establish a lead. And then ARM comes in and knocks them all over :D

AMD's GPUs and their technologies are pretty inferior to NVIDIA's, so I don't see how they've managed to survive so long.

Really? Really? Wait, you're serious? At this very moment, NVIDIA and AMD are in a dead heat in the GPU race, though I'd give the nod to AMD because they have the better low end. Economically, however, NVIDIA is in a world of hurt: their CFO just resigned, their stock is falling, market share is falling, and they are posting in the red. They will get out of it, I'm sure, but while they aren't doing well economically right now, their products are good.

So basically, you are either horribly misled, or this thread is a poor attempt at trolling... then again, I did respond :hmm:
 

Terzo

Platinum Member
Dec 13, 2005
2,589
27
91
Has there been any more news about Bulldozer? Maybe from the Fusion Developer Summit? I haven't heard anything about it since the E3 announcements.
 

waffleironhead

Diamond Member
Aug 10, 2005
7,068
572
136
I think before anyone else spends time responding to this thread, you should do yourself a favor and look up the OP's other threads. Come to your own conclusions.
 

john3850

Golden Member
Oct 19, 2002
1,436
21
81
AMD just doesn't have the R&D cash to build anything good.
We need AMD to keep Intel and NVIDIA in line.
Only once did AMD have a good offering, and that was because Intel got greedy with their P4.
The only reason we see good AMD video is due to the ATI purchase.
To me the 2500K is just a mid-range chip, or a defective 2600K.
Yes, AMD can make a CPU that matches Intel at lower speeds, but never at the mid or high end.
Intel's new 22nm process is going to hurt AMD's low-end market big time.
 

Spikesoldier

Diamond Member
Oct 15, 2001
6,766
0
0
Oh, boy. Another Anarchist420 thread.

Looks like another idiot who has been hypnotized by Intel and NVIDIA.

Nothing to see here.
 

TakeNoPrisoners

Platinum Member
Jun 3, 2011
2,599
1
81
Anarchist420 said:
I must be missing something, because I don't see how so many people can think that BD (the $290 offering) is going to be faster/better than Intel's 2500K. [...]

LOL the 6950 says Hi.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Considering the OP's name begins with "Anarchist", I'll let this one lie. The last part of his name is 420, so I just assume he's a pot smoking punk bitch with nothing better to do than troll.

Daimon

Did you make it to the Harmony Festival last week?
 
Feb 19, 2001
20,155
23
81
RussianSensation said:
To counter your argument for why BD will be a failure... That's pretty funny and completely false. [...]

Uhh, to be honest I'd say NVIDIA's 6800 and 7800 series were seriously good. The X1800 was disappointing, and I was glad I jumped on the 7800GT on launch day. Even the X1900 was ehhh. What ended up working out were ATI's steep discounts; it felt like every other Slickdeal featured one, and that made the cards worth it.

The AMD 2xxx series was still disappointing and was never really a choice for any card upgrader, but the 3xxx series started turning heads. It wasn't until the 4xxx series that I'd say AMD won the crown back. That's when I jumped back into AMD territory and bought the 4850. The AMD 6xxx series makes it interesting because NV woke up a bit and fixed a few things. I'd say it's even right now, but the 4xxx and 5xxx series helped AMD maintain a decent advantage for 2-3 years.
 

BababooeyHTJ

Senior member
Nov 25, 2009
283
0
0
- HD4850/4870 set a new benchmark for price/performance vs. the 9800GTX+/GTS250/GTX260. I also recall the HD4850 X2 and HD4870 X2 being the fastest single cards of that generation.

Yeah, AMD caught NVIDIA with their pants down that round as far as launch prices go. But it's not like AMD kept pushing price vs. performance forward when they had no competition with the 5xxx series; I recall the 5770 launching at a higher price than the 4870 had been going for for the longest time.

I also owned both a 4870 X2 and a GTX280, and the 280 was by far the better card. I remember Morrowind with a few mods running like poo on the X2 while it ran like a champ on the 280; I remember Crysis running smoother on the 280 thanks to microstutter on the X2; I remember a 65nm 9800GTX outperforming the X2 in FSX.

You can throw whatever slant you want on things.

- You could also say the HD6970 is a much better value on the high end today, given that it's less than 5% slower than a GTX580 at 2560x1600 but costs $150 less.

Like I said in the other thread, that's in select games; on average it's far from the case. I'll say it again: for every Crysis (which artifacts; I have videos) or Metro 2033 where the 6970 is within 5%, there is a GTA4, NWN2, or Sims 3 that barely runs. Yeah, Cayman is very good for its price and is the king of the multi-monitor support I'll never use, but you are delusional if you think there are no advantages to NVIDIA's 5xx series.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
RussianSensation said:
To counter your argument for why BD will be a failure... That's pretty funny and completely false. [...]

AMD has much worse image quality, runs hotter, and has a lesser feature set for people who don't need 3 monitors. And it's not all about performance, either. It's very debatable whether the X850 XT PE was better than the 6800 Ultra. I had a 6800GT and it was pretty good, except that it couldn't do AA with FP render targets, and the anisotropic filtering quality was poor enough that I just turned AF off. The 6800 Ultra ran a little cooler than the 6800GT, and the 6800GT ran quite a bit cooler than the X800 XT PE, which lacked SM3.0.

The 5900 Ultra was much slower than the 9800 series, but ATi had awful driver problems back then (okay, I'll admit some of the compatibility issues were due to TWIMTBP, but ATi made no attempt to counter that), and ATi's feature set and IQ were worse. I think the worst thing NVIDIA did was forcing texture filtering optimizations on the FX series in the 50-series drivers, which were the latest drivers for most of the time I had the card, until I upgraded to the 6800GT.

As for the 4850, it would have sucked even at $1, because like I said, NVIDIA's IQ is much better. ATi uses some depth range optimization (lossy Z compression that they call lossless anyway? I don't know, but NVIDIA's depth range is longer) and NVIDIA has always had better filtering and a better feature set.
LOL_Wut_Axel said:
So many things wrong with this I don't even know where to start. [...]

My PII was manufactured with SOI tech and it runs a lot hotter than the 2500K does, even though it's a lot less powerful. And it's the C3 stepping, too.

Wikipedia posted the clock speeds, and they were higher for the cores but lower for the L3 cache.

I had thought the 2500K had 4MB of L3 cache, so that was a simple error on my part.
 

dac7nco

Senior member
Jun 7, 2009
756
0
0
Did you make it to the Harmony Festival last week?

LOL, no, I missed it. I don't actually have anything against the herb - this thread pissed me off for no rational reason. I fell for the troll like a young'un.

Daimon
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Uhh, to be honest I'd say NVIDIA's 6800 and 7800 series were seriously good. The X1800 was disappointing, and I was glad I jumped on the 7800GT on launch day. Even the X1900 was ehhh. What ended up working out were ATI's steep discounts; it felt like every other Slickdeal featured one, and that made the cards worth it.

Maybe the X1800XT was disappointing from a noise perspective, but it was faster than the 7800GTX 256MB outside of OpenGL games. It took the mighty 7800GTX 512MB, which cost an arm and a leg at the time, to surpass it.

X1900 was ehhhh? I am too lazy to look for old reviews. All I know is that the X1900 series easily smoked the 7800/7900 series once more modern games started to ship. The 7800/7900 series simply fell apart in modern, shader-heavy DX9 games like Call of Duty 4: Modern Warfare. Even NV themselves shifted with GF8 towards a shader-heavy (not texture-heavy) architecture after they realized how unbalanced GF7 was. Even the lowly 8600GTS became faster than the mighty 7900GTX in modern games thereafter.

The AMD 2xxx series was still disappointing and was never really a choice for any card upgrader, but the 3xxx series started turning heads. It wasn't until the 4xxx series that I'd say AMD won the crown back.

Agreed. The HD2xxx and HD3xxx series were inferior to GF8/GF9. Either way, both firms traded blows over the years, yet he implied that AMD/ATI GPUs were always inferior.

As for the 4850, it would have sucked even at $1, because like I said, NVIDIA's IQ is much better.

Your comment makes no sense. It took GF8 to fix NV's horrible trilinear filtering quality and its AF, which was far inferior to ATI's. If anything, AMD had superior image quality all the way until GF8. Even back in the 9700Pro days, ATI already had superior AA image quality.

there is a GTA4, NWN2, or Sims 3 that barely runs. Yeah, Cayman is very good for its price and is the king of the multi-monitor support I'll never use, but you are delusional if you think there are no advantages to NVIDIA's 5xx series.

Can you provide proof that AMD cards can't run the 3 games you mentioned? I already linked you a review showing the HD6870 outperforming the GTX560 Ti in GTA4.

I never said the GTX 5xx series doesn't have any advantages over AMD. But a GTX580 won't be any more future-proof than an HD6970. If you can afford a $500 graphics card every GPU generation, of course get the GTX580 if you want the fastest single-GPU card. But for the rest of us, it might make more sense to put the $150 in savings towards a new GTX670 or HD7950, etc. Those cards will spank the 580. The point is that both the HD6970 and the GTX580 are viable alternatives. It's not like comparing the GeForce 8800 vs. the HD2900.
 
Last edited: