Official AMD Ryzen Benchmarks, Reviews, Prices, and Discussion

Page 233 - AnandTech community forums.

dahorns

Senior member
Sep 13, 2013
550
83
91
Ok, so we can both agree then that Intel's HEDT isn't dead or obsolete, right? Because that's really all I was saying. Other people jumping on that because they seem to want Intel to 'die' doesn't change that fact.



I never disputed that Ryzen offers better performance per dollar for a very large number of users. To imply that that's what I suggested is nonsense.



Again, I never disputed that AMD offers better performance per dollar for a huge number of users. Didn't dispute that at all. I was just saying that there are still users out there who need something that AMD currently doesn't offer, and that keeps the Intel CPUs from being dead and obsolete.

I'm not really sure how you guys go from such a simple statement to reading all this other nonsense into it.

Give it up, Matti. The mob has spoken. What you actually said or meant doesn't really matter.
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
If it's the number of PCI-E lanes that bothers you the most, then just wait for Naples to release in H2.

I think that's certainly an option. I suppose the only concern at that point is the speed of those upcoming chips. If they're server chips it again leaves some users with a tradeoff they might not like.

Hi, I don't post that often, but I have been reading all of your posts, and the thing that really comes to mind is this: the cost savings of buying a Ryzen vs. a 6900K platform can be put towards Radeon's new professional GPU with the SSD bolted onto it, no? Thus negating the need for tons of PCIe lanes, removing any storage bottlenecks and whatnot.

If I'm correct these cards are aimed at this and similar types of work?

Every argument you have made has basically been countered, and you have consistently moved the goalposts, so I am intrigued to see how you move the goalposts with my comment.

Well, at the risk of being annoying, your last sentence is quite adversarial. It makes me think that regardless of what I say - unless it's an unequivocally pro-AMD/it'sthebestestintheuniversenoexceptions post - you'll ignore it.

But to answer you: If what you state above gives a user more performance for equal or less money, then of course that's an interesting possibility. Part of my argument is that these users don't like change because it's not worth bothering with a new platform and possible issues (that's a 'perception thing'), and part of it is that some of their other concerns are legitimate, the lane count impacting connectivity and the number of GPUs, etc., being examples.

At the end of the day I still maintain that Intel's high end HEDT chips aren't obsolete or dead. Anyone can feel free to show the numbers of how that's the case. A year from now? That's a different issue.
 
  • Like
Reactions: Sweepr

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Professionals do write their own code that hooks into commercial software like Resolve for this.

Certainly professionals that actually might be sensitive to the impact of PCIe lanes.

Hmmm.... I haven't met a single colorist that writes his own code to do correction/grading. I'd say most users don't. You're sure you're not thinking of CGI work? Because that's not what I'm talking about.

Last week it was the AVX2 guy, this week the HEDT guy, and the thread gets hijacked for 3-4 pages just because one guy can't let it go and ten others are feeding him.

Irony is so 1990's.....
 

sushukka

Member
Mar 17, 2017
52
39
61
I think that might be true, depending on what consumers choose. I guess time will tell, but currently I don't see how Intel's CPUs are "obsolete". If they were I think we'd have seen cuts in pricing to compete already.
I think one reason why there haven't been any price drops is that they have enough assets to pretend that everything is OK. If Intel dropped their prices now, that would be a significant indication that they're worried, which is something Intel would avoid at all costs.

I think you're not seeing the specific use case I'm talking about. I don't think it scales as you imply above, because you're probably not going to save by getting another operator of a computer over getting a better performing computer. The tasks are highly different (design by a human versus rendering) and the savings won't translate. SLI isn't really something used in at least editing/color grading, and as far as I know all grading is done on single computers with multiple GPUs at the high end.
I'm looking at this from an IT management perspective, where things are more pure mathematics. They are also the people making the bigger mass decisions about where the money is invested. If you're running a big company and need hundreds or thousands of HEDT units, the decision between a minor couple-of-percent speed loss and total savings of around 30% per workstation is just a no-brainer. With the CPU cost savings you could buy a better GPU, a faster SSD, etc., which would provide a better overall speed boost than just sticking with Intel.
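As a rough sketch of that fleet-level math (all figures are hypothetical, chosen only to mirror the roughly 30% per-workstation saving versus a couple-percent speed loss described above):

```python
# Hypothetical illustration of the fleet-purchase tradeoff described above.
# None of these numbers are benchmarks; they just mirror the post's ratios.
def fleet_value(units, unit_cost, relative_speed):
    """Total cost and aggregate throughput for a fleet of identical workstations.

    relative_speed is normalized so 1.0 = the baseline platform's per-unit speed.
    """
    total_cost = units * unit_cost
    throughput = units * relative_speed  # arbitrary units
    return total_cost, throughput

# 1000 workstations: baseline platform vs. one 30% cheaper and 2% slower.
base_cost, base_perf = fleet_value(1000, 3000, 1.00)
alt_cost, alt_perf = fleet_value(1000, 2100, 0.98)

print(round(alt_cost / base_cost, 2))  # 0.7  -> 30% saving across the fleet
print(round(alt_perf / base_perf, 2))  # 0.98 -> 2% aggregate throughput loss
```

At fleet scale the percentages simply multiply out, which is why the per-unit saving dominates the decision.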

There is though. There's everything from TB-connected drives to displays to chassis etc. It's pretty widely used. The issue isn't "is this needed to do this job or could it get done differently", the issue is are you going to say "Sorry, no TB" to a client in order to save a few hundred dollars? It is what it is. Even irrational reasons for adopting a technology becomes a rational reason for others to adopt it, if you know what I mean.
My wall of text about TB was meant to say that TB is, like the case you profiled, a very minor area too. Thunderbolt is pretty much an Apple-ecosystem solution. You have TB ports on Windows/Linux computers too, but to be honest the main reason is backwards compatibility with USB3; they will most probably very rarely be used with actual TB devices. The thing is that everything USB3-related costs less, and you get real universal compatibility compared to TB. If you need, e.g., two 4K displays, why not use the ports the GPU provides? If you have many external SSDs, TB can give some peak-performance benefits, but again, even the fastest consumer SSDs are well within USB3's ballpark. Then you have, of course, SATA, M.2, etc. for connecting these. If you really need fast mass-storage connections, there are all sorts of enterprise solutions, and then we're moving purely to the server side. It's just hard to find any real reason why TB would get bigger mass coverage. It's no wonder Thunderbolt is often mentioned alongside FireWire, which in its time provided the same kind of technological superiority but was never a commercial success.

As for "too many sources": Just for the record - my skepticism is based on my experience of very often finding that all of these "many sources" really boil down to the same one or two sources. In other words you go to Anandtech and WCFTCSFTech or whatever it's called and Hardforum or whatever and they all provide links to sources, but often those sources are other blogs/news outlets and eventually it boils down to just one or two "original" sources that are no more than rumors. So the volume of reports is one thing, and the volume of sources is another.

I agree with this. My opinion was also based on the real facts around Naples: 32c/64t per socket, 8 memory channels per socket, and 128 PCIe 3.0 lanes show the scalability of Zen. Of course there will be mid-level variants between this and Ryzen, and the prices will surely continue the disruptive path AMD has always followed. So there will be a very interesting middle tier between Ryzen and high-end Naples, which could be quite a neat fit in the segment you presented.
 

FalcUK

Junior Member
Mar 1, 2017
7
39
51
I think that's certainly an option. I suppose the only concern at that point is the speed of those upcoming chips. If they're server chips it again leaves some users with a tradeoff they might not like.



Well, at the risk of being annoying, your last sentence is quite adversarial. It makes me think that regardless of what I say - unless it's an unequivocally pro-AMD/it'sthebestestintheuniversenoexceptions post - you'll ignore it.

But to answer you: If what you state above gives a user more performance for equal or less money, then of course that's an interesting possibility. Part of my argument is that these users don't like change because it's not worth bothering with a new platform and possible issues (that's a 'perception thing'), and part of it is that some of their other concerns are legitimate, the lane count impacting connectivity and the number of GPUs, etc., being examples.

At the end of the day I still maintain that Intel's high end HEDT chips aren't obsolete or dead. Anyone can feel free to show the numbers of how that's the case. A year from now? That's a different issue.

Unfortunately, the minute AMD introduced their 8/16 chips, Intel's 8/16 was dead in the water. The platform is also dead in the water, given the amount you need to invest in X99, which is about to become obsolete when it's superseded by Intel's next cash-cow platform.

If, as you are saying, you are looking at it from a purely non-gaming scenario, as a workstation type of platform, then the savings on the AMD platform will entice far more people to adopt it. Say, for instance, a typical Intel X99 6900K build costs £3000 and the equivalent AMD 8/16 build comes in around £2500: even if the AMD platform only offers 90% of the performance, it's roughly 83% of the cost. That right there will sway a lot of people away from the Intel platform. Then you add in that the AMD socket will allow you to discard the CPU in the future for a better performing one, whereas with Intel you're pretty much at a dead end and need to swap the entire rig out.
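The price-to-performance arithmetic can be sketched in a few lines (the £3000/£2500 figures and the 90% performance estimate are the post's own illustrative numbers, not benchmarks):

```python
# Illustrative figures only, taken from the post above: a £3000 Intel build
# vs. a £2500 AMD build assumed to deliver 90% of the performance.
intel_price, intel_perf = 3000, 1.00
amd_price, amd_perf = 2500, 0.90

price_ratio = amd_price / intel_price        # AMD build costs ~83% as much
perf_per_pound_intel = intel_perf / intel_price
perf_per_pound_amd = amd_perf / amd_price

print(round(price_ratio, 2))                      # 0.83
print(perf_per_pound_amd > perf_per_pound_intel)  # True
```

In other words, on these assumed numbers the cheaper build loses 10% of the performance but costs 17% less, so it still wins on performance per pound.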

Then add in the savings you get with the AMD platform: you go and put that £500 towards a professional M-series AMD GPU with an SSD bolted onto it, and that alone pretty much pays for itself with regard to the performance you achieve, especially in the types of tasks you are talking about.

The fact is, Intel's X99 platform was massively overpriced, and AMD has shown this to be true. Now many X99 owners are trying to justify their overpriced purchases on forums like this by continually grasping at different straws each time their current straw is disproved.

Soon you will be strawless, stuck on a dead platform and have no one left but an echo chamber to complain to my friend.

My advice? Quietly exit through the side door while you still have a modicum of respect and integrity left, before you totally lose that too.
 

imported_jjj

Senior member
Feb 14, 2009
660
430
136
Hmmm.... I haven't met a single colorist that writes his own code to do correction/grading. I'd say most users don't. You're sure you're not thinking of CGI work? Because that's not what I'm talking about.

Irony is so 1990's.....

You have a two-week-old account with 23 posts, 19 of them in this thread on the HEDT topic in the last few pages.
You've been at it for about a day now; do you think that's healthy and sane?
 

Roger Wilco

Diamond Member
Mar 20, 2017
4,770
7,156
136
The following is my inexpert breakdown of my perceived value with AMD and Intel going head-to-head, based on the PRICES I've gleaned from surfing the web over the past few weeks. Please feel free to correct me, as I am sure most of this is nonsense.

This is my estimated perception, and it's largely from a gaming perspective (these prices take third-party coolers into consideration for non-bundled CPUs):

$65–120
i3-7100 - best value single-core/multi-core/multi-thread mix
Pentium G4650 - best value multi-core/multi-thread mix
Ryzen 3 - ?

$125–180
i3-7350K - best value single-core
i5-7400 - best value multi-core
Ryzen 5 1400 - best value multi-core/multi-thread mix

$195–225
i5-7600 - best value single-core/multi-core mix
Ryzen 5 1500X - best value single-core/multi-core/multi-thread mix
Ryzen 5 1600 - best value multi-core/multi-thread mix

$235–275
i5-7600K - best value single-core/multi-core mix
Ryzen 5 1600X - best value multi-core/multi-thread mix

$285–315
i7-7700 - best value single-core/multi-core mix
Ryzen 7 1700 - best value multi-core/multi-thread mix

$350–450
i7-7700K - best value single-core/multi-core mix
Ryzen 7 1700X - best value multi-core/multi-thread mix

$450 and beyond
Ryzen 7 1800X - best single-core/multi-core/multi-thread mix
Absurdly priced Intel stuff - no value wins
 
  • Like
Reactions: lightmanek

FalcUK

Junior Member
Mar 1, 2017
7
39
51
I'm looking at this from an IT management perspective, where things are more pure mathematics. They are also the people making the bigger mass decisions about where the money is invested. If you're running a big company and need hundreds or thousands of HEDT units, the decision between a minor couple-of-percent speed loss and total savings of around 30% per workstation is just a no-brainer. With the CPU cost savings you could buy a better GPU, a faster SSD, etc., which would provide a better overall speed boost than just sticking with Intel.

As an IT buyer for a global corporation, I have to totally agree. If I am asked to purchase a lot of equipment, I need to provide alternatives and at least 3 quotes for each for comparison, and 99% of the time, if the performance difference is within 10% but the price difference is, say, 20% or something, we go with the cheaper but slightly lower performing product.

Never has anyone said to me "It must be Intel" or "Must be Nvidia"; I am told "It must be on AutoCAD's approved list" or "Must be able to run xxx".

The company I work for is a Dell Premier customer, and all our laptops are Dell. I'm fairly certain that soon, once AMD gets Zen into laptops, all our laptops going forward will be AMD-based, as they will be cheaper; that will be the driving force.

The only time I've been told what we can and can't buy is mobile phones: we only use Samsung Android or Apple iPhone, that's it. I've tried to argue against this, but someone further up the chain obviously likes those devices ;)
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Ok, so we can both agree then that Intel's HEDT isn't dead or obsolete, right? Because that's really all I was saying. Other people jumping on that because they seem to want Intel to 'die' doesn't change that fact.



I never disputed that Ryzen offers better performance per dollar for a very large number of users. To imply that that's what I suggested is nonsense.



Again, I never disputed that AMD offers better performance per dollar for a huge number of users. Didn't dispute that at all. I was just saying that there are still users out there who need something that AMD currently doesn't offer, and that keeps the Intel CPUs from being dead and obsolete.

I'm not really sure how you guys go from such a simple statement to reading all this other nonsense into it.

Perhaps reading comprehension is not your thing. Try reading your post that I quoted; in it you essentially say you are spending money no matter what, regardless of which system/CPU you go with. People with a fixed budget do not have the option to spend endless amounts of $$ and have to make a choice/compromise.

You are trying so hard at moving the goalposts here that it's obvious to anyone watching that you are simply here to troll.

Intel HEDT is DEAD, except maybe in the 0.01% of corner cases that actually need the PCIe lanes, but IMO being 99.99% dead is still dead, even if you are clinging to the 0.01%.
 

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
But then your argument is sort of like saying that Ferraris are obsolete because only a fraction of users buy them and now there are BMWs that are really really good. If people still buy Ferraris and BMWs aren't Ferraris, then Ferraris aren't "obsolete" or "dead".

Ferraris don't need economies of scale. In fact they are usually limited on purpose.
 

zinfamous

No Lifer
Jul 12, 2006
111,844
31,336
146
Ferraris don't need economies of scale. In fact they are usually limited on purpose.

Also, the relative performance of a Ferrari vs. a Porsche vs. a BMW vs. a Lambo on the same track is determined by driver skill, not by relatively simple and straightforward quantitative analysis (math) using the same tools. The value of high-performance cars isn't solely tied to their guts, unlike a CPU; it's tied to user need/want. (Actually, this is the same historic defense of Apple products and, it seems... Intel. Marketing!) :p
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Unfortunately, the minute AMD introduced their 8/16 chips, Intel's 8/16 was dead in the water. The platform is also dead in the water, given the amount you need to invest in X99, which is about to become obsolete when it's superseded by Intel's next cash-cow platform.

Crystal balls are cool. I guess we'll see how everything turns out.

Fact is, Intels X99 platform was massively overpriced, AMD have shown this to be true,

Not really. This is capitalism. Pricing something as high as possible isn't "overpricing"; "overpricing" is pricing it higher than the market can bear. I agree that for a lot of users X99 CPUs have become "overpriced" since AMD released Ryzen, and that's where I agree that sooner or later part of Intel's CPU lineup will have to move down in cost. But the top end probably won't for a while as far as I can see, even though I hope it does.

My advice? quietly exit through the side door while you still have a modicum of respect and integrity left before you totally lose that too.

Well, given the attitude of some posters here, I'm not really sure I care about getting the respect of some here. Suffice it to say I'm on an old AMD 9950 quad and I'd prefer to stay with AMD. I'm not the one justifying an old X99 system I felt I overpaid for. I'm merely pointing out what I've seen discussed on forums in content-creation industries. While many love the value Ryzen provides, there are some who don't. They're either not into switching platforms and having to deal with learning what that means, or they have demands they state can't be met by Ryzen. Either way, I'm fine if you don't respect me. If that's your attitude, you should put me on ignore right away.
 

unseenmorbidity

Golden Member
Nov 27, 2016
1,395
967
96
Crystal balls are cool. I guess we'll see how everything turns out.



Not really. This is capitalism. Pricing something as high as possible isn't "overpricing"; "overpricing" is pricing it higher than the market can bear. I agree that for a lot of users X99 CPUs have become "overpriced" since AMD released Ryzen, and that's where I agree that sooner or later part of Intel's CPU lineup will have to move down in cost. But the top end probably won't for a while as far as I can see, even though I hope it does.



Well, given the attitude of some posters here, I'm not really sure I care about getting the respect of some here. Suffice it to say I'm on an old AMD 9950 quad and I'd prefer to stay with AMD. I'm not the one justifying an old X99 system I felt I overpaid for. I'm merely pointing out what I've seen discussed on forums in content-creation industries. While many love the value Ryzen provides, there are some who don't. They're either not into switching platforms and having to deal with learning what that means, or they have demands they state can't be met by Ryzen. Either way, I'm fine if you don't respect me. If that's your attitude, you should put me on ignore right away.
Capitalism is just an excuse. It doesn't change the fact that it was overpriced.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Has anyone tested gaming on the new 1.0.0.4 AGESA version?

So far users report massive latency reductions, and I'd imagine games are pretty sensitive to that. I'm most interested in Tomb Raider and Watch Dogs 2.

I have a beta BIOS with AGESA 1.0.0.4a. I got a little bit of gaming in last night on HotS. Definitely faster; maybe 5-10 fps. Latency in AIDA was in the 70ns range. Other bonuses: POST time is MUCH faster, a reasonable 10s or so instead of like 20-30.
 
  • Like
Reactions: lightmanek

french toast

Senior member
Feb 22, 2017
988
825
136
The 6900K is a better processor than the 1800X overall (not by much), and X99 is technically a better platform than AM4, on both counts if based solely on outright performance and features averaged over many workloads.

Unfortunately it's not a big difference, and there are more things to consider, mainly the price paid for such performance; in that area the 6900K gets crushed by the 1800X, which is the least cost-effective of the three Ryzen 8-cores.
The 6900K is not obsolete... yet. It's just grossly overpriced; when Skylake-X and X399 arrive, it will be.

Once again it will come down to pricing as to which future HEDT processor excels. Intel had better not copy Broadwell-E pricing with Skylake-X, else that will be rendered obsolete in quick time.
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,247
16,108
136
Capitalism is just an excuse. It doesn't change the fact that it was overpriced.
I think I can be objective when I say Intel's HEDT and server CPUs are overpriced. First, everyone has seen the reviews. The 1800X can compete very nicely (not always winning) against the 8-core and 10-core Intel offerings in computing tasks, at 1/2 to 1/3 the price. The platforms they sit on are very similar in price. So Ryzen wins that one easily, and Intel is overpriced.

Next, you have the server chips, the Xeon line. I have FOUR E5-2683s (14 cores, 28 threads) that I got for $350 each. In my DC projects, they beat the Ryzen by a small margin. But if you paid retail (or even wholesale), it would be $1500 for the cheapest one! Big businesses (like the company I just retired from) can't buy off eBay. Ryzen wins again.

Power usage: they are at least very close. Both my E5-2683 running WCG and my Ryzen 1800X (at stock) draw about 183 watts for the full system, and the Ryzen is running a 550 Ti (just until my 1080 Ti comes in), so the Ryzen most likely uses less.

So to summarize, FROM PERSONAL EXPERIENCE I can say that Ryzen is a much better value for HEDT tasks in all respects. Intel is over-priced.

Edit: and BTW, I also have a dual-6276 Opteron system with 32 cores. The Ryzen wipes the floor with it in every respect: CPU power and power usage. The E5-2683 wipes the floor with it. It's not turned on anymore, since it uses too much power. And trying to sell it? I gave up.
 
Last edited:

KompuKare

Golden Member
Jul 28, 2009
1,228
1,597
136
@mattiasnyc
I understand that the PCIe scaling tests using games that people linked to from TPU aren't useful to you, but I would like to know if there are any actual benchmarks to back up your need for a minimum of two x8 slots. What vendors recommend and what is actually required are not always the same thing.

For instance, if one GPU is being used for rendering (GPU compute) and the other for display and preview will both of them need to have lots of PCIe bandwidth? The rendering / encoding GPU most likely will need the x8 PCIe 3.0 speed, but the display/preview one is what I question.

For just 2D display, x1 PCIe 2.0 is probably enough. What kind of acceleration does the display GPU do? It's not 3D work (Autodesk etc.), which requires a Pro card (Quadro/FireGL), so does it help with colour conversion and the like? After all, an x4 PCIe 3.0 slot provides the same bandwidth as an x8 PCIe 2.0 slot.
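That bandwidth equivalence follows from the per-lane rates in the published specs: PCIe 2.0 signals at 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s with the much leaner 128b/130b encoding. A quick sketch of the arithmetic:

```python
# Per-lane usable PCIe bandwidth from the spec'd signaling rate and encoding:
# PCIe 2.0: 5 GT/s, 8b/10b encoding (20% overhead)
# PCIe 3.0: 8 GT/s, 128b/130b encoding (~1.5% overhead)
def lane_bandwidth_gbs(gen):
    """Usable bandwidth per lane in GB/s (decimal gigabytes)."""
    if gen == 2:
        return 5.0 * (8 / 10) / 8      # 0.5 GB/s per lane
    if gen == 3:
        return 8.0 * (128 / 130) / 8   # ~0.985 GB/s per lane
    raise ValueError("unsupported PCIe generation")

x8_gen2 = 8 * lane_bandwidth_gbs(2)    # 4.0 GB/s
x4_gen3 = 4 * lane_bandwidth_gbs(3)    # ~3.94 GB/s

print(round(x8_gen2, 2), round(x4_gen3, 2))  # 4.0 3.94
```

So an x4 Gen3 slot delivers within about 2% of an x8 Gen2 slot, which is why the two are treated as equivalent in practice.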
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
The fact that we are having this discussion at all means competition is back in the market... we actually have a choice in the entry-level workstation market (or distributed-computing crunching rigs like some of us have).

I am very interested in what AMD's HEDT platform will bring to the market in 2H2017, as that will give us a more direct competitor to Intel's HEDT lineup, with more cores at each equivalent price point.
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
@mattiasnyc
I understand that the PCIe scaling tests using games that people linked to from TPU aren't useful to you, but I would like to know if there are any actual benchmarks to back up your need for min 2 times x8 slots? What vendors recommend and what is actually required are not always the same thing.

Sure. You can read up on what users use when operating Premiere Pro doing heavy lifting, or Davinci Resolve etc.


Some will run more than one card for realtime processing of effects. That puts you at x8/x8 minimum.


People use x4 Blackmagic Design cards to get uncompressed and/or realtime compressed/decoded video out through SDI connections to full-res professional monitors. It's a bit similar to me running a dedicated card with dedicated outputs for audio as opposed to the motherboard's built-in audio i/o.
 

w3rd

Senior member
Mar 1, 2017
255
62
101
Like I said, "I haven't seen a single thing come out of AMD about this."

You're listing Naples which is a server platform if I understand correctly, and then you're listing speculation not from AMD - it even says "rumor" in the headline.

So, my point exactly.

No, I am saying that I (and everyone else) am aware of AMD's enterprise chips and technology based on the Zen architecture, and that AMD already possesses what you claim not to know or understand, or to be unable to connect the dots on.

If Intel can release a high-end desktop based on server boards and chips, then AMD can too. The inference is there, but it's odd how you don't know a single thing about it? Or the possibility?

*laugh*
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Did I write that x8 PCIe negatively penalizes performance? I don't think I did. You tell me.

You made the argument that X99 has more PCIe lanes. Why make that argument if you are not implying that those lanes make a difference to anyone's real world performance?
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
I think you're all trolling Mattias. His message is simple: Intel HEDT is not dead. I suggest that anyone who wants to drag this on should counter that argument, instead of making this a mainstream-computing debate where saving $500 to invest in your graphics or faster storage is a no-brainer.

This discussion has actually made me realize something: where does Ryzen sit in the CPU hierarchy? I think the speed with which AMD is rumoured to be bringing X399 online should be testament to what Mattias is saying. Ryzen is not there yet as an HEDT CPU. You can't use it in any serious, mission-critical stuff just yet. It's a CPU for those enthusiasts long thirsting for Intel's many-core computing party, who have been locked out because they either can't or are unwilling to pay to play. Oh, and Ryzen is also the second coming of Christ for most die-hard AMD fans. It's a high-end mainstream CPU that is also a throughput monster, so it can do well in certain areas of HEDT computing but lacks in other areas. Mattias has already touched on some of these, so no need to go there.

I also laugh at people who talk about drop-in upgrades on the AM4 platform. Well, if you're moving from 4/8 to 8/16, fine; but you're high if you think you can drop a 16c/32t chip into your $120 motherboard. Well, you could, but you'd have to put the fire department on alert. Intel's HEDT boards are expensive, as they should be. A 16c/32t chip should be well built and cost significantly more than your $80 board. That's the only way you'd know you're getting top-of-the-line, quality parts. I don't even know how people can buy a CPU for $500, put it in an $80 board, and go to sleep without keeping their noses, ears, and eyes wide open.