AMD FX 9590 Price Drop on Aria.co.uk


JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
I like everything about how AMD has gone about exploring this market niche opportunity. Intel certainly wasn't going to do it.

At least AMD was willing to try and see what kind of market demand there would be for such a product.

Big kudos to AMD for at least trying. Intel is too cheap to even use proper solder these days, but it doesn't matter, because they're too lazy to go above ~80W TDP @ stock anyway. With the manufacturing process optimized for <40W TDP, they've removed the headroom that made their previous processors so attractive. Intel isn't even trying to push the boundaries any more. They're very similar to how AMD was before Core 2 launched. Remember when Brisbane launched at 65nm? It used less power than the 90nm CPUs but didn't overclock any better. Core 2 owned it in every benchmark. That's what happens when you get lazy.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
@IDC:

It's too easy to call it the "testing the market" option.

The QX chips were the first of their kind - and did offer top-end performance nothing else could rival.


Short of the 5GHz novelty - the GHz race died out years ago - and it's not like it's anywhere near a 3930K.

And if you're a tinkerer (which I imagine most in this bracket are) - there's more performance to be gained from a 3930K or 3970X.


It's a sign that they wanted to play in the "I will pay 90% extra for 10%" club - because while volume is low, margin is high.
But they were clearly deemed vastly inferior even by the people in that club.

That's not positive.


Well, we'll see if it's not some super deal - it kinda does look like it is, though :)
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
@IDC:

It's too easy to call it the "testing the market" option.

I consider it as "testing the market" by virtue of its TDP and power consumption.

Gamers have long been accustomed to putting 300W+ GPU products (and in multiples too) in their computers...but outside of OC'ing your CPU to silly high clocks and volts, your CPU isn't going to come close to those power levels.

AMD is testing the market to see if the market is equally willing to put a 220W+ stock CPU into their rigs.

Regardless of the performance/watt or absolute performance, it is pushing the boundaries of the status quo no less than those first GPUs that technically "broke" the PCI-E spec by going over 300W power consumption.

I hope this opens up a new market of high-end mobos and air coolers that can cheaply and effectively deal with the heat output, so that the next round of this type of CPU has an easier time of it.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I hope this opens up a new market of high-end mobos and air coolers that can cheaply and effectively deal with the heat output, so that the next round of this type of CPU has an easier time of it.

Isn't there a reasonable number of MBs and coolers able to deal with 220W+ CPUs?
 

Rvenger

Elite Member <br> Super Moderator <br> Video Cards
Apr 6, 2004
6,283
5
81
Isn't there a reasonable number of MBs and coolers able to deal with 220W+ CPUs?


Yes, but not many that will go over that for overclocking without throttling.

I bet this SKU is no longer in production, so they are starting to clear stock. That would be my guess. As IDC said, they were testing the market.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
I consider it as "testing the market" by virtue of its TDP and power consumption.

Gamers have long been accustomed to putting 300W+ GPU products (and in multiples too) in their computers...but outside of OC'ing your CPU to silly high clocks and volts, your CPU isn't going to come close to those power levels.

AMD is testing the market to see if the market is equally willing to put a 220W+ stock CPU into their rigs.

Regardless of the performance/watt or absolute performance, it is pushing the boundaries of the status quo no less than those first GPUs that technically "broke" the PCI-E spec by going over 300W power consumption.

I hope this opens up a new market of high-end mobos and air coolers that can cheaply and effectively deal with the heat output, so that the next round of this type of CPU has an easier time of it.


Wait what?

They're not pushing any boundaries - you're saying you believe they sat down and said:
"Hey guys - let's create a massive high-end product that pushes the thermal envelope!"
(And that they did it at a time when perf/watt is gaining momentum, too.)

I'd love 300W CPUs with 16 cores @ mainstream clockspeeds - but that's not what AMD did.
And as for the GPU remark - the people who put 300W GPUs to work do it for ABSOLUTE performance.

In the high end, performance within a given envelope / absolute performance is the ONLY goal.

Which the 9590 fails at miserably - mind you.


The 9590 only pushes the thermal envelope and the stock setting for GHz.
Which...does nothing for the power user.


You're kinda sounding weird to me here, IDC :p
Normally I feel so small next to your great vast wisdom :p
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Wait what?

They're not pushing any boundaries - you're saying you believe they sat down and said:
"Hey guys - let's create a massive high-end product that pushes the thermal envelope!"
(And that they did it at a time when perf/watt is gaining momentum, too.)

I'd love 300W CPUs with 16 cores @ mainstream clockspeeds - but that's not what AMD did.
And as for the GPU remark - the people who put 300W GPUs to work do it for ABSOLUTE performance.

In the high end, performance within a given envelope / absolute performance is the ONLY goal.

Which the 9590 fails at miserably - mind you.


The 9590 only pushes the thermal envelope and the stock setting for GHz.
Which...does nothing for the power user.


You're kinda sounding weird to me here, IDC :p
Normally I feel so small next to your great vast wisdom :p

Consider for a moment the answer to the following question - why didn't AMD release a 220W TDP Bulldozer SKU when Bulldozer first launched?

Or for that matter, why did they wait so long after Piledriver was released before they decided to release the 9370 and 9590 SKUs?

Go back in time to the era in GPUs before the first 330W SKU was released...what kept Nvidia and AMD from releasing a 330W SKU even earlier?

Anytime you want to take a pre-existing supply chain - from mobo makers to heatsink providers, case designers, and power supply manufacturers - and push that entire supply chain out of its existing space to a new level, it takes a lot of time and effort.

Intel is doing the same thing at the other end of the power spectrum. Look at what it takes for them to enable, at the platform level, a Haswell system with 0.5W consumption. That pushed the envelope of what the market could do as a matter of the "economically routine".

Every time a new product has come to exist at a previously unoccupied position on the power-consumption curve, it is not that it could not have been done before (220W CPUs could have been done before, 0.5W platform consumption could have been done before, 330W GPUs could have been done before)...the mere existence of the product is not the achievement. Rather, it is an indication that a lot of people have geared up and pushed the entire supplier ecosystem to move together towards expanding the envelope of what the consumer can expect to see in stores as a purchasing option.

In a world where we want to get from Point A to Point D, seeing us get to Point B is exciting (to me personally) because it heralds progress in moving towards Point D.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
Consider for a moment the answer to the following question - why didn't AMD release a 220W TDP Bulldozer SKU when Bulldozer first launched?

Or for that matter, why did they wait so long after Piledriver was released before they decided to release the 9370 and 9590 SKUs?

Go back in time to the era in GPUs before the first 330W SKU was released...what kept Nvidia and AMD from releasing a 330W SKU even earlier?

Anytime you want to take a pre-existing supply chain - from mobo makers to heatsink providers, case designers, and power supply manufacturers - and push that entire supply chain out of its existing space to a new level, it takes a lot of time and effort.

Intel is doing the same thing at the other end of the power spectrum. Look at what it takes for them to enable, at the platform level, a Haswell system with 0.5W consumption. That pushed the envelope of what the market could do as a matter of the "economically routine".

Every time a new product has come to exist at a previously unoccupied position on the power-consumption curve, it is not that it could not have been done before (220W CPUs could have been done before, 0.5W platform consumption could have been done before, 330W GPUs could have been done before)...the mere existence of the product is not the achievement. Rather, it is an indication that a lot of people have geared up and pushed the entire supplier ecosystem to move together towards expanding the envelope of what the consumer can expect to see in stores as a purchasing option.

In a world where we want to get from Point A to Point D, seeing us get to Point B is exciting (to me personally) because it heralds progress in moving towards Point D.


Because the number of chips from GF that could withstand the normal validation of a 5GHz 220W chip was minuscule?

Because they realized the "world's first 8-core" ploy failed - and now they're trying the old GHz race?

It heralds progress - but we're not arguing that, IDC.
If the progress is only power consumption within a given market, well - that's a fail?

We're discussing the viability of the 9590 as a product - and it fails on every metric except "Well, it consumes a lot of power!".


I think I'm like "swooooosh" to what you're saying.
Because AMD did - and I'll gladly be quoted guesstimating this - NOT release a SKU just to PUSH the POWER envelope.


You can't seriously be saying that?
And you can't seriously be saying "It's kewl for pushing the power envelope - but it doesn't really offer more performance than other SKUs with LESS TDP on the market"?
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I'm really happy AMD would release something like this. It may open the door for other enthusiast parts. GPUs steadily went up in power use and no one complained, because performance was better generation after generation. AMD's performance has gone up generation to generation as well, but not in every metric, and not compared to Intel, who leapfrogged them years ago and hasn't looked back. I guess I don't see why some people seem so offended that this part exists.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Be careful about jumping to conclusions; Aria.co.uk is a bit dodgy. They have a reputation for pulling stunts when it comes to prices, and they often don't honour them. Their ResellerRatings score isn't as bad as overclockers.co.uk's, but it's still very bad (http://www.resellerratings.com/store/Aria_Technology_Ltd).

I don't know if this is a genuine price drop, an error that Aria will soon correct, or another stunt to get people to buy from them instead of elsewhere and then switch the price. I just know this company can't be trusted.
 

Durp

Member
Jan 29, 2013
132
0
0
I think I'm like "swooooosh" to what you're saying.
Because AMD did - and I'll gladly be quoted guesstimating this - NOT release a SKU just to PUSH the POWER envelope.

You can't seriously be saying that?
And you can't seriously be saying "It's kewl for pushing the power envelope - but it doesn't really offer more performance than other SKUs with LESS TDP on the market"?

Exactly my opinion on this processor release. IDC has posted amazingly informative things here - so much so that I made an account. But since the 9590 came out, he has been trying to find SOMETHING positive to say about it, when there really isn't anything positive to say.

I think it's safe to assume a coworker is trolling us when IDC is out of the office. :p

Yeah, the price drops haven't made it to any US sites that I can see.

For anyone who purchased this chip at full price a month ago, I do hope this was just Aria.co.uk trying to push their stock out the door and not something that will happen at other retailers.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
In a world where we want to get from Point A to Point D, seeing us get to Point B is exciting (to me personally) because it heralds progress in moving towards Point D.

The crux of the issue is that there is a tangible benefit to going to 300W GPUs, as the workloads there are extremely parallel. Sure, the scaling is not linear, but it's close. GPU companies can say that with a straight face to their customers; they can show the benefits of going 300W. It's a fair deal. But what is AMD offering us? You get some 60% increase on top of an already outrageous power consumption for some 15% performance improvement.
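A quick back-of-the-envelope check of that trade-off, using the 60%/15% figures quoted above (mrmt's estimates, not measured data):

```python
# Relative perf/watt implied by the 60% power / 15% performance figures above.
power_ratio = 1.60  # 9590 power draw relative to the baseline part
perf_ratio = 1.15   # 9590 performance relative to the baseline part

efficiency = perf_ratio / power_ratio
print(f"relative perf/watt: {efficiency:.2f}")      # ~0.72
print(f"perf/watt given up: {1 - efficiency:.0%}")  # ~28% worse
```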

Where is this D that AMD wants to take us to? Most of the industry is moving to mobile, so D isn't there. Say "60% increase in power consumption for 15% more performance" and any datacenter manager will die laughing, so no D there either. And the guy looking for absolute performance will look elsewhere, so also no D. Maybe AMD is planning the budget gamer king, selling a PCB with an APU for the price of a dGPU, but they wouldn't need to stress the supply chain for that, as 300W PCBs have been around for quite some time. So, where or what is this D?

I don't know whether we should care about AMD's D. If B was a 60% increase in power for 15% extra performance, what's C? A two-fold increase in power consumption on top of B for some 10% extra performance? Sorry, I'll pass, and most people will too. Maybe there's no D at all and this was simply a marketing stunt - something that doesn't make sense commercially, but that a new marketing team decided to give a go just to shake things up a little and make some money at the expense of the company's fans. It's even worse than Intel's Extreme line, as the 9590 doesn't come with a performance crown and the corresponding bragging rights.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I understand you guys think I'm drinking the Kool-Aid here, that I'm hopped up on Uncle Bob's krazy juice, but I'll keep trying to make my point, because based on the replies I don't think you guys are really seeing it yet.

(But I admit that even if/when you do see my point, you may still deride me as hitting the bottle too much ;))

Because the number of chips from GF that could withstand the normal validation of a 5GHz 220W chip was minuscule?

While possible, I see you are getting hung up on the clockspeed binning rather than the specific TDP tier.

It is true, of course, that binning is a function of power consumption.

But ignore the clockspeed and just look at the power consumption bin.

What kept AMD from opening up the binning window for the existing 8150 Bulldozer chips from day one to include all chips that consumed enough power to require a TDP not of 125W but of 220W?

It is true that with the 9590, AMD used the extra TDP headroom to also boost clockspeeds in their binning process.

But something prevented them from opening up that TDP tier earlier, even with existing clockspeed bins, even though it would have meant higher yields for them.
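A toy sketch of that yield argument (the die power numbers and bin values here are invented for illustration; this is not AMD's actual binning flow):

```python
# Hypothetical illustration: opening a higher TDP bin recovers dies that
# would otherwise be unsellable at the target clock.
def tdp_bin(power_w, bins):
    """Return the smallest TDP bin covering a die's measured power draw,
    or None if it exceeds every bin."""
    for tdp in sorted(bins):
        if power_w <= tdp:
            return tdp
    return None

# Measured watts at the target clock for five imaginary dies:
dies = [118, 131, 160, 205, 240]

for bins in [(125,), (125, 220)]:
    sellable = sum(1 for d in dies if tdp_bin(d, bins) is not None)
    print(f"bins {bins}: {sellable}/{len(dies)} dies sellable")
# bins (125,): 1/5 dies sellable
# bins (125, 220): 4/5 dies sellable
```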

Because they realized the "world's first 8-core" ploy failed - and now they're trying the old GHz race?

Actually, if that were their conclusion then they would not have pushed the 9590 and the 9370 as 8-cores.

They could have cut the core count to six or four cores, kept the 4.7/5GHz bins, and kept the TDP at a more conventional status-quo upper limit.

Allowing for higher power consumption while binning has always been an option; there is a reason the bins turn out to exist at the values they do.

95W instead of 100W or 90W for the 2600K, for example.

Creating a 220W bin has always been an option, technically, but there is a reason no one has pushed the envelope that far until now. The same goes for GPUs.

Go back as far as you like prior to the first 330W GPU, and any earlier GPU product you want to point at could have been a 330W GPU as well...if only the supply chain could have supported it in an economically viable fashion at that earlier timeframe.

It heralds progress - but we're not arguing that, IDC.
If the progress is only power consumption within a given market, well - that's a fail?

We're discussing the viability of the 9590 as a product - and it fails on every metric except "Well, it consumes a lot of power!".

I don't understand your logic and I'll tell you why - it isn't self-consistent.

Your stated position on why a 220W TDP 9590 SKU is "fail" could equally be applied to literally every other CPU that AMD and Intel offer, simply by ratcheting down the threshold of what you personally consider "unacceptably high power consumption".

That 150W hex-core $1020 extreme 3970X CPU from Intel? Total fail. What does it offer that a $300 77W quad-core 3770K doesn't offer?

That $300 77W quad-core 3770K from Intel? Total fail. What does it offer that a $200 77W quad-core 3570K doesn't offer?

That $200 77W quad-core 3570K from Intel? Total fail. What does it offer that a $130 55W dual-core i3-3220 from Intel doesn't offer?

You see how this works, pick any arbitrary TDP value and you'll find one lower than it - lower in price, lower in TDP, and lower in performance - and yet all those products exist because there is a spectrum to the demand of the market.

So how is going the other direction suddenly a total fail? How is a 130W or 150W Intel chip a fail compared to the lower-performing, lower-costing, lower-TDP SKUs offered by Intel?

And by extension, how is a 9590 a total fail compared to the lower-performing, lower-costing, lower TDP SKUs offered by AMD?
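The reductio can be written out mechanically, using the prices and TDPs quoted above:

```python
# IDC's reductio, run mechanically: under the "a cheaper, cooler, slower SKU
# exists" test, every part except the bottom one is a "total fail".
skus = [  # (name, price_usd, tdp_w), from the figures quoted above
    ("i3-3220", 130, 55),
    ("i5-3570K", 200, 77),
    ("i7-3770K", 300, 77),
    ("i7-3970X", 1020, 150),
]
for lower, higher in zip(skus, skus[1:]):
    print(f"{higher[0]}: 'total fail' - {lower[0]} exists at "
          f"${lower[1]} / {lower[2]}W")
```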

I think I'm like "swooooosh" to what you're saying.
Because AMD did - and I'll gladly be quoted guesstimating this - NOT release a SKU just to PUSH the POWER envelope.

Of course they didn't. Nor did Intel when they released their 150W TDP 3970X, which was literally nothing more than a 0.2GHz speed bump over the existing 130W TDP 3960X.

I remember when Intel was about to release that chip - I had a little bird telling me about it in advance, and about the ground-breaking (for their industry) efforts they were going through to make sure that turn-key third-party cooling solutions on the market would be able to reliably and economically handle the heat-dissipation requirements.

Internally, it took a LOT of engineering time and effort to validate the viability of a product that merely pushed the existing envelope from 130W to 150W.

AMD made no less of an effort to ensure their 220W chips would be viable - that the infrastructure was in place to enable such products to work correctly, and without serious failure (a la the 1.13GHz P3 situation) should the cooling solutions not be up to the task (considering the cost-cutting measures AMD knows cooling suppliers are under constant pressure to take).

You can't seriously be saying that?
And you can't seriously be saying "It's kewl for pushing the power envelope - but it doesn't really offer more performance than other SKUs with LESS TDP on the market"?

I do think it is cool. I think the entire effort and outcome is cool.

While all you see at this time is a singular product with two SKUs to come of it, I see the initial footsteps - the required first steps - towards what will likely become a whole new high-performance, high-power tier of products, no different from what happened in the GPU space once the AIBs saw that the world did not implode into a black hole just because they violated the PCI-E spec and created a whole new tier of >300W TDP GPU products.

AMD has laid the groundwork; the first steps have been taken. I am not specifically excited by the products that embody those first steps - I am not excited by the 9590 itself - but I am excited to see what round 2 and round 3 bring.

I am excited for what the future may bring thanks to this new direction.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Cliffs:
-AMD opened up a new tier.

-This new tier sports higher TDP and performance, instead of a new tier at the low end (i.e., even lower power for slightly less performance).

-This was not as feasible before as it is now, due to the ecosystem (not the CPU design or fab process), including power delivery on mobos and cooling performance.

-Once a new tier is opened up, with the removed restrictions being mobo power delivery and cooling, it is effectively available to ALL vendors - not just AMD. Intel can, theoretically, take advantage of the newly developed ecosystem, opening up more higher-performing options for enthusiasts in the future.
/cliffs


Commentary
This is what's good about this. It's a positive step for high-end enthusiasts like us. We now have a chance that we will have higher-TDP (and performance) options available in the future, regardless of vendor (AMD/Intel) because it seems there already is feasible support in the surrounding ecosystem.

What if high-end enthusiasts like us now had a higher tier beyond 130/140/150W? What if every SKU today suddenly had an additional SKU with a 50W-higher TDP, offering anything from 20-30% more performance? Is this not a win - that we would have the choice of opting for truly extreme performance, vs. a more balanced one (lower tier), vs. the cheapest/slowest/most efficient (absolute lowest tier available)?

Some of you have expressed your disappointment in the focus on lowering TDP vs. increasing performance from the enthusiast's POV (e.g., Haswell). AMD did exactly the opposite: screw TDP, just to offer yet another tier of performance. Would you not rather have various enthusiast options ranging from 100W to 200W, rather than, say, just topping out at 130W or 150W? If it were available, wouldn't you rather have a 200W TDP, 20% faster i7 E-series part if you are after top performance?

If you are an enthusiast who bemoaned the focus of Haswell because it left you out in the cold, dumped for a more lucrative mobile market, then all the more should you welcome the opening of a new tier. It is a portent of things that may come - good things - for enthusiasts. Why settle for an SKU with only a 77W TDP, or even 150W, if Intel also offers a higher-performing 200W TDP part and you are looking for absolute top-end performance per generation?

Forget about performance/$ or performance/watt. I doubt any of the highest-end parts (130W-150W Intel E SKUs) actually sport reasonable performance/$ or performance/watt compared to a nominal i5 quad, or maybe even an i7 quad+HT. If your concern is performance/$ and/or performance/watt, then you don't belong in the high-end enthusiast tier being targeted - stop complaining and stick to your energy-efficient i5. But if you do belong to it, possibly having extra tiers beyond the traditional 130/150W to choose from is good. What's 200W anyway? Most high-end rigs sport a way-overpowered PSU (we know this, since they are the ones being targeted by all the 1kW+ PSUs). With a 4770K + Titan, you won't even need over 500W. We (high-end enthusiasts) already have the power and cooling required; it's just that the vendors (AMD/Intel) won't sell us even faster parts, despite us sporting 1kW silver/gold-efficiency PSUs, giant 1kg air coolers or custom water loops, and crazy-expensive overclocking boards that laugh at a mere 200W power draw for two years straight. So, AMD and Intel: go crazy. Stop with the arbitrary 130/140/150W limit. For the high-end SKUs, go crazy. Give us the absolute bleeding-edge performance available in your current archs, drive the TDP up to 200W, and we'll worry about the power and cooling.
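As a sanity check on that power-budget claim, a rough tally (the component figures are ballpark public TDP/board-power numbers plus assumed platform overheads, not measurements):

```python
# Back-of-envelope load for the rig described above vs. typical enthusiast PSUs.
components = {
    "i7-4770K (stock TDP)": 84,
    "GTX Titan (board power)": 250,
    "motherboard + RAM + drives": 60,  # assumed
    "fans and misc": 30,               # assumed
}
load = sum(components.values())
print(f"estimated full load: {load} W")            # ~424 W, under 500 W
print(f"headroom on a 1 kW PSU: {1000 - load} W")  # ~576 W to spare
```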

That's the possible future you are poo-pooing when you dismiss 200W CPUs as crazy. And the enthusiast in me just can't reconcile that attitude with how this forum is supposed to be filled with more "crazy enthusiasts who consider PCs a hobby instead of a mere tool" than "lame non-enthusiast Excel users who wouldn't know a Molex if it bit them". The 9590 is crazy expensive, and not even 20% faster than the regular-tier 8350, and I think that's kind of punching you in the eye and making you miss the forest for the trees. So forget about the 9590 itself, and instead imagine if Intel, seeing how AMD has now opened up the feasibility of much-higher-than-150W TDP SKUs, were to offer a 200W TDP Haswell-E aside from the expected 130W and 150W varieties.
 
Aug 11, 2008
10,451
642
126
Cliffs:
-AMD opened up a new tier.

-This new tier sports higher TDP and performance, instead of a new tier at the low end (i.e., even lower power for slightly less performance).

-This was not as feasible before as it is now, due to the ecosystem (not the CPU design or fab process), including power delivery on mobos and cooling performance.

-Once a new tier is opened up, with the removed restrictions being mobo power delivery and cooling, it is effectively available to ALL vendors - not just AMD. Intel can, theoretically, take advantage of the newly developed ecosystem, opening up more higher-performing options for enthusiasts in the future.
/cliffs


Commentary
This is what's good about this. It's a positive step for high-end enthusiasts like us. We now have a chance that we will have higher-TDP (and performance) options available in the future, regardless of vendor (AMD/Intel) because it seems there already is feasible support in the surrounding ecosystem.

What if high-end enthusiasts like us now had a higher tier beyond 130/140/150W? What if every SKU today suddenly had an additional SKU with a 50W-higher TDP, offering anything from 20-30% more performance? Is this not a win - that we would have the choice of opting for truly extreme performance, vs. a more balanced one (lower tier), vs. the cheapest/slowest/most efficient (absolute lowest tier available)?

Some of you have expressed your disappointment in the focus on lowering TDP vs. increasing performance from the enthusiast's POV (e.g., Haswell). AMD did exactly the opposite: screw TDP, just to offer yet another tier of performance. Would you not rather have various enthusiast options ranging from 100W to 200W, rather than, say, just topping out at 130W or 150W? If it were available, wouldn't you rather have a 200W TDP, 20% faster i7 E-series part if you are after top performance?

If you are an enthusiast who bemoaned the focus of Haswell because it left you out in the cold, dumped for a more lucrative mobile market, then all the more should you welcome the opening of a new tier. It is a portent of things that may come - good things - for enthusiasts. Why settle for an SKU with only a 77W TDP, or even 150W, if Intel also offers a higher-performing 200W TDP part and you are looking for absolute top-end performance per generation?

Forget about performance/$ or performance/watt. I doubt any of the highest-end parts (130W-150W Intel E SKUs) actually sport reasonable performance/$ or performance/watt compared to a nominal i5 quad, or maybe even an i7 quad+HT. If your concern is performance/$ and/or performance/watt, then you don't belong in the high-end enthusiast tier being targeted - stop complaining and stick to your energy-efficient i5. But if you do belong to it, possibly having extra tiers beyond the traditional 130/150W to choose from is good. What's 200W anyway? Most high-end rigs sport a way-overpowered PSU (we know this, since they are the ones being targeted by all the 1kW+ PSUs). With a 4770K + Titan, you won't even need over 500W. We (high-end enthusiasts) already have the power and cooling required; it's just that the vendors (AMD/Intel) won't sell us even faster parts, despite us sporting 1kW silver/gold-efficiency PSUs, giant 1kg air coolers or custom water loops, and crazy-expensive overclocking boards that laugh at a mere 200W power draw for two years straight. So, AMD and Intel: go crazy. Stop with the arbitrary 130/140/150W limit. For the high-end SKUs, go crazy. Give us the absolute bleeding-edge performance available in your current archs, drive the TDP up to 200W, and we'll worry about the power and cooling.

That's the possible future you are poo-pooing when you dismiss 200W CPUs as crazy. And the enthusiast in me just can't reconcile that attitude with how this forum is supposed to be filled with more "crazy enthusiasts who consider PCs a hobby instead of a mere tool" than "lame non-enthusiast Excel users who wouldn't know a Molex if it bit them". The 9590 is crazy expensive, and not even 20% faster than the regular-tier 8350, and I think that's kind of punching you in the eye and making you miss the forest for the trees. So forget about the 9590 itself, and instead imagine if Intel, seeing how AMD has now opened up the feasibility of much-higher-than-150W TDP SKUs, were to offer a 200W TDP Haswell-E aside from the expected 130W and 150W varieties.

While that may be true, if Intel wanted to produce a 200-watt CPU, they wouldn't need AMD to pave the way for them. If Intel had produced such a chip, the motherboards and other supporting devices would have followed. I give AMD credit for trying to push gaming and the performance envelope, I guess. But this chip is like putting out a sports car with a huge gas-guzzling engine that is slower than an efficient one, and charging twice the price for it. I am not sure that is any kind of progress.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
While that may be true, if Intel wanted to produce a 200-watt CPU, they wouldn't need AMD to pave the way for them.
Yes, but that's not the point - which you missed, if this remark is any indication.

The point is, they could have, but they didn't. Just as they (Intel) could have let upper-tier Haswell chips have higher frequency at the expense of TDP and power consumption while letting the mobile parts retain all the power-saving goodness. But they also didn't. And we know why, and it isn't because the upper-tier desktop parts don't need it. It simply didn't make financial sense to do so, which is why we enthusiasts seem rather ho-hum about Haswell. Yep, it beats all AMD chips, but that's a low bar in the first place; from the ridiculously low TDP of the highest-end non-E i7, you know Intel could easily have pumped out faster chips at stock.

Let's get into the "why it didn't happen" for the >150W SKUs at Intel. It's not just a matter of "hey guys - MSI, Gigabyte, ASUS, whoever else - we now have 200W TDP CPUs. So, good luck, hope your enthusiast-class mobos are good to go! Oh, and Cooler Master, Noctua, I hope those copper-based coolers of yours are any good, haha!". That's kind of what could be inferred from your rather curt "yeah, mobos and other devices would have followed suit". But it isn't like that: Intel's various relevant engineers would be coming up with specs, coordinating with the component makers, and allotting significant time for testing and validation. It's an added cost. And since the parameters needed by those mobos and coolers may not exist yet, even just coming up with the parameters - without even meeting a single component manufacturer - already entails cost.

An unknown market (>150W CPUs) within an already niche market (enthusiasts like us) - you can understand why bean counters wouldn't be too excited at the prospect of starting this endeavor. It's undeniably good for enthusiasts, but it's questionable for Intel's profit estimates; ergo, it never gets the green light.

Now here's AMD. Somehow, they crunched the numbers (or threw darts - who knows what they do at management there, right? ;) ) and said "yep, it would be worth all the costs involved to make these chips and ensure the ecosystem is in place for all customers". And so we have this.

And now the barrier to entry for Intel has gone down too, since they don't have to be the one to initiate such an endeavor with the component makers (though the cost isn't zero, of course). Even more, the bean counters will now have a better input for their decision tree - sales of AMD's 200W chips. If this AMD experiment proves profitable enough, that's a data point the bean counters at Intel can consider, and it gets us (enthusiasts) closer to what we want: higher performance out of the box, even at the expense of TDP.

Pretty cool, right? :)


But this chip is like putting out a sports car with a huge gas-guzzling engine that is slower than an efficient one, and charging twice the price for it. I am not sure that is any kind of progress.
Indeed, but that's because all of AMD's chips are inferior to Intel's in the first place, so it's no surprise that even their new highest-tier chips aren't better than the Intel alternatives.

But the effect of opening up a new tier is clear enough: compared to the 8350 (their previous highest-tier part), the new tier does give better performance - comparable to the highest overclocks achieved, but out of the box and VALIDATED by AMD to work as expected, not just via our dinky, unreliable "validation" methods c/o Prime or IBT or OCCT.

And, as stated in the earlier section, Intel's product chiefs can now consider whatever data AMD gets, and we can only hope that AMD's stunt pays off - not only because it will make AMD more inclined to make this a regular feature of their product lines, but also because it might make Intel consider a new 200W tier for their E-series, something like an i7-5990XX Extreme.

How can that not be good for enthusiasts? It's the exact opposite of the trend we don't like - sacrificing performance for the sake of power consumption, even on the desktop. I like the lower power consumption, but what I don't want is for ALL vendors to dictate that we can only ever have SKUs that focus primarily on power consumption, with performance secondary. We have different needs. It's good if we get a choice of SKUs that offer performance first and power consumption second.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I don't think 220W can be sustained as a product - by Intel, AMD, or both combined. The world is moving towards energy conservation, so the demand in the first place is very low. 375W/450W graphics cards failed for the same reason, and it's getting harder and harder to sell 225W+ graphics cards as well.

125/130W CPUs are already a niche. If you want to sell anything in volume, it needs to be 100W or below.

There is also the TCO factor, the climate in your room, and so on.
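For a sense of scale on that TCO point, a rough sketch (the electricity price and usage hours are assumptions, not figures from the post):

```python
# Hypothetical yearly running-cost delta of a 220W part vs. a 100W part.
delta_w = 220 - 100   # extra draw under load (W)
hours = 4 * 365       # assumed 4 h/day at full load
price_kwh = 0.20      # assumed electricity price ($/kWh)

extra_cost = delta_w / 1000 * hours * price_kwh
print(f"extra energy cost: ~${extra_cost:.0f}/year")  # ~$35/year at these rates
```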
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
I don't think 220W can be sustained as a product - by Intel, AMD, or both combined. The world is moving towards energy conservation, so the demand in the first place is very low. 375W/450W graphics cards failed for the same reason, and it's getting harder and harder to sell 225W+ graphics cards as well.
Indeed, it's a very hard sell - an unknown market within a niche market itself, as I described it.

But if no vendor tried, then we wouldn't even have a chance at enthusiast-class products like this. The general populace - regular consumers - is perfectly served by really low-power SKUs, true. But it would be nice if our options (as enthusiasts) weren't limited to just what the general populace prefers, or to SKUs not too far off from that.

It's nice someone tried. Now there's a chance, no matter how small, that it might just pan out. If it fails, we'll be no worse off than where we started anyway. It damages none of us and can only possibly benefit us in the future, even though the chances of that are very small.
 

Abwx

Lifer
Apr 2, 2011
12,034
4,995
136
I don't think 220W can be sustained as a product - by Intel, AMD, or both combined. The world is moving towards energy conservation, so the demand in the first place is very low. 375W/450W graphics cards failed for the same reason, and it's getting harder and harder to sell 225W+ graphics cards as well.

125/130W CPUs are already a niche. If you want to sell anything in volume, it needs to be 100W or below.

There is also the TCO factor, the climate in your room, and so on.


They would be stupid not to take the money they are in dire need of; they will be better off selling it than not, whatever the quantities. If there's demand, then fill it - those who say otherwise are just talking irrationally with respect to corporate management.

Btw, I live in France - how was the climate this year in Denmark?..
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Indeed, it's a very hard sell - an unknown market within a niche market itself, as I described it.

But if no vendor tried, then we wouldn't even have a chance at enthusiast-class products like this. The general populace - regular consumers - is perfectly served by really low-power SKUs, true. But it would be nice if our options (as enthusiasts) weren't limited to just what the general populace prefers, or to SKUs not too far off from that.

It's nice someone tried. Now there's a chance, no matter how small, that it might just pan out. If it fails, we'll be no worse off than where we started anyway. It damages none of us and can only possibly benefit us in the future, even though the chances of that are very small.

It actually hurts the chances every time one of these 1000-unit-run products fails, because the next vendor will be less keen on trying, and such products will end up facing automatic rejection at the companies that are supposed to support them. AMD didn't try to sell you 220W CPUs to test a new segment. They only did it as a last ditch for some free PR with the 5GHz checkbox. And their already-planned future products reflect this: they have no intention of even trying to follow it up, no matter the outcome of the product.

I don't see these as enthusiast products either. They're simply factory-OCed to the brim of what they can take. The border between "dumb person with too much money" and "enthusiast" is quite thin.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
They would be stupid not to take the money they are in dire need of; they will be better off selling it than not, whatever the quantities. If there's demand, then fill it - those who say otherwise are just talking irrationally with respect to corporate management.

Btw, I live in France - how was the climate this year in Denmark?..

Of course they should take the money. But it was more a PR stunt than anything, and most likely a huge disaster financially. Then again, PR is cost upfront and return later. And AMD had no intention of any kind of follow-up on the product, so I don't buy the notion that this was the supposed testing of a new segment.

Climate? Average summer, I guess. Well, as average as it can be with the continual climate changes.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
It actually hurts the chances every time one of these 1000-unit-run products fails, because the next vendor will be less keen on trying, and such products will end up facing automatic rejection at the companies that are supposed to support them. AMD didn't try to sell you 220W CPUs to test a new segment. They only did it as a last ditch for some free PR with the 5GHz checkbox. And their already-planned future products reflect this: they have no intention of even trying to follow it up, no matter the outcome of the product.

I don't see these as enthusiast products either. They're simply factory-OCed to the brim of what they can take. The border between "dumb person with too much money" and "enthusiast" is quite thin.

What you say is true. Look at Skulltrail - it totally killed our chances of having an honest-to-goodness dual-CPU board for enthusiasts that doesn't literally cost more than just buying two separate computers outright.

It was a great experiment, but it totally flopped, and with it went the dual-socket enthusiast market.

So yeah, there is that risk here too; nothing is risk-free. But it's a start.

And in terms of global warming, the carbon footprint of these processors is almost entirely captured by their energy-intensive production. Whether your CPU burns 200W or 100W over the course of its lifetime pales in comparison to how much fossil fuel had to be burned to mine the ores, refine the silicon, develop the process, and power the people who show up to work to build the chips, plus the energy used to actually build them.
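To put a number on the operating-energy side of that comparison (the usage assumptions here are hypothetical, and the post gives no embodied-energy figure to set against them):

```python
# Lifetime operating-energy delta between a 200W and a 100W CPU.
delta_w = 200 - 100   # extra draw under load (W)
hours = 6 * 365 * 4   # assumed 6 h/day over a 4-year service life

extra_kwh = delta_w * hours / 1000
print(f"lifetime extra energy: {extra_kwh:.0f} kWh")  # 876 kWh
# IDC's claim is that the energy sunk into mining, refining and fabbing the
# chip dwarfs even this number; no embodied-energy figure is given in the post.
```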

I'm all for energy conservation - you know that about me, with my Prius and wind-powered electricity and so forth - but I'm not about to feel that a 200W processor is all that much of a crime against nature versus a 100W one. The bulk of the crime against nature was committed in the making of the CPU in the first place.

But I am absolutely with you on the TCO angle. These chips - be they Intel's 150W behemoth, AMD's 220W guzzler, or Nvidia's 300W Titan - all represent an egregious level of energy consumption that doesn't do the wallet any justice.
 
Aug 11, 2008
10,451
642
126
Yes, but that's not the point - which you missed, if this remark is any indication.

The point is, they could have, but they didn't. Just as they (Intel) could have let upper-tier Haswell chips have higher frequency at the expense of TDP and power consumption while letting the mobile parts retain all the power-saving goodness. But they also didn't. And we know why, and it isn't because the upper-tier desktop parts don't need it. It simply didn't make financial sense to do so, which is why we enthusiasts seem rather ho-hum about Haswell. Yep, it beats all AMD chips, but that's a low bar in the first place; from the ridiculously low TDP of the highest-end non-E i7, you know Intel could easily have pumped out faster chips at stock.

Let's get into the "why it didn't happen" for the >150W SKUs at Intel. It's not just a matter of "hey guys - MSI, Gigabyte, ASUS, whoever else - we now have 200W TDP CPUs. So, good luck, hope your enthusiast-class mobos are good to go! Oh, and Cooler Master, Noctua, I hope those copper-based coolers of yours are any good, haha!". That's kind of what could be inferred from your rather curt "yeah, mobos and other devices would have followed suit". But it isn't like that: Intel's various relevant engineers would be coming up with specs, coordinating with the component makers, and allotting significant time for testing and validation. It's an added cost. And since the parameters needed by those mobos and coolers may not exist yet, even just coming up with the parameters - without even meeting a single component manufacturer - already entails cost.

An unknown market (>150W CPUs) within an already niche market (enthusiasts like us) - you can understand why bean counters wouldn't be too excited at the prospect of starting this endeavor. It's undeniably good for enthusiasts, but it's questionable for Intel's profit estimates; ergo, it never gets the green light.

Now here's AMD. Somehow, they crunched the numbers (or threw darts - who knows what they do at management there, right? ;) ) and said "yep, it would be worth all the costs involved to make these chips and ensure the ecosystem is in place for all customers". And so we have this.

And now the barrier to entry for Intel has gone down too, since they don't have to be the one to initiate such an endeavor with the component makers (though the cost isn't zero, of course). Even more, the bean counters will now have a better input for their decision tree - sales of AMD's 200W chips. If this AMD experiment proves profitable enough, that's a data point the bean counters at Intel can consider, and it gets us (enthusiasts) closer to what we want: higher performance out of the box, even at the expense of TDP.

Pretty cool, right? :)



Indeed, but that's because all of AMD's chips are inferior to Intel's in the first place, so it's no surprise that even their new highest-tier chips aren't better than the Intel alternatives.

But the effect of opening up a new tier is clear enough: compared to the 8350 (their previous highest-tier part), the new tier does give better performance - comparable to the highest overclocks achieved, but out of the box and VALIDATED by AMD to work as expected, not just via our dinky, unreliable "validation" methods c/o Prime or IBT or OCCT.

And, as stated in the earlier section, Intel's product chiefs can now consider whatever data AMD gets, and we can only hope that AMD's stunt pays off - not only because it will make AMD more inclined to make this a regular feature of their product lines, but also because it might make Intel consider a new 200W tier for their E-series, something like an i7-5990XX Extreme.

How can that not be good for enthusiasts? It's the exact opposite of the trend we don't like - sacrificing performance for the sake of power consumption, even on the desktop. I like the lower power consumption, but what I don't want is for ALL vendors to dictate that we can only ever have SKUs that focus primarily on power consumption, with performance secondary. We have different needs. It's good if we get a choice of SKUs that offer performance first and power consumption second.

I think I do see the point. I am sure Intel has investigated a higher-end chip with a high TDP and a high price tag. In fact, they already have such a chip in the Extreme Edition 3970X, or whatever it is.

I suppose if the 9590 were a great sales success, and they sold every one at full list price, it could encourage Intel to come out with higher-TDP, higher-performance chips. If they have difficulty selling them, have to cut the price, or are so supply-constrained that they only have a few chips to sell no matter what, I don't see what effect it will have on Intel's strategy.

What AMD needs to do to really push Intel is to come out with a $250 chip that is faster than an i7 (not just in a few selected benchmarks) and uses 100 watts or less. If the rumors of AMD going to an all-APU strategy are true, that would be exactly the opposite of what is needed to push Intel toward higher performance. And actually, considering the heat issues and relatively poor overclocking of Ivy and Haswell, I am not sure there is as much easy performance gain left on the table as many, myself included, used to assume.