I'm certain that, if you look hard enough, you'll find a game or two with more than 4 highly pressured, time-sensitive threads that will perform better on Hawk Point. The VAST majority of games tend to have 1-2 performance-critical threads and an additional 1-2 threads that are very latency sensitive. Beyond that, it's more about running background tasks or off-screen work like unpacking graphics, etc. As long as those threads get to run, it doesn't much matter where they are. Given the improvement in ST performance that Strix has, though, there's not going to be much that suffers from running on Strix over Hawk.

One can wonder if this will be a problem in newer games that might use more than 4 cores. I mean, according to AMD, X3D parts get only one V-cache die because cross-CCD latency would be prohibitive, and here we have cross-CCX latency that doesn't look too good. Could it be that Hawk Point may do better in some games than Strix Point?
I always suspect companies of hiding stuff. Have some low-margin products that won't go down well with the stock market? Hide them among the high-margin stuff.

If we forget about other markets for a second, Ian Cutress did a BOM analysis for the 7950X when it released in 2022, estimating the total BOM to be around $70. For a $700 MSRP product. I have no idea how a 90% profit margin could be considered "very crappy profitability".
Based on his breakdown, the single CCD chips would be ~$45-50 BOM at the time.
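For what it's worth, the 90% in the quote is just the markup over the estimated BOM, not a real gross or operating margin. A quick back-of-the-envelope check, using only the $70 and $700 figures quoted above:

```python
# Back-of-the-envelope check of the per-unit figure quoted above.
# $70 BOM and $700 MSRP are the quoted estimates, not official AMD numbers.
bom_usd = 70
msrp_usd = 700

markup_over_bom = (msrp_usd - bom_usd) / msrp_usd
print(f"Markup over BOM: {markup_over_bom:.0%}")  # -> 90%
```

That number excludes the retailer cut, packaging, shipping, R&D and every other operating cost, which is what the gross-vs-operating-margin posts below get into.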
Launch prices are higher now, but the 7600X fell below 230€ about half a year after launch and is available for 195€ right now. Also, the main problem with AM5 at launch wasn't the processors, but the ridiculous mainboard prices.

I still remember buying the 6600K for 230€.
Do you maybe have an idea where to find such a breakdown of how well threaded modern games are? Another thing that might bite Strix is the Windows scheduler: if it decides to move a time-sensitive thread to the Zen 5c CCX, then, well... [Of course, in an ideal world that would not happen, but even 3 years after Alder Lake came to market it's still not uncommon to find software with issues handling the E cores.]

I'm certain that, if you look hard enough, you'll find a game or two with more than 4 highly pressured, time-sensitive threads that will perform better on Hawk Point. The VAST majority of games tend to have 1-2 performance-critical threads and an additional 1-2 threads that are very latency sensitive. Beyond that, it's more about running background tasks or off-screen work like unpacking graphics, etc. As long as those threads get to run, it doesn't much matter where they are. Given the improvement in ST performance that Strix has, though, there's not going to be much that suffers from running on Strix over Hawk.
Back in the day, when the iPhone had a $200 BOM and retailed for $800, that was considered a handsome margin. Somehow today a $70 BOM with a $700 retail price is a company barely scraping by.

Please guys, don't act like materials are the only costs a company has. As if they don't have to pay thousands of employees. And marketing costs. And electricity. And shipping costs. And so forth and so on.
It seems like AMD is doubling down on pushing the 12-core SKU this time. The 9900X is seen much more often in the leaks, and in their slides from the AMD Tech Day they are placing the 14900K against the 9900X, not the 9950X.

Because no one buys the 12-core Zen.
It's an unquestionable position, irrespective of conflicting data points. Not knowing the accounting processes at AMD, saying that a division is losing money is, at best, unwise. Where they assign costs is crucial.

If we forget about other markets for a second, Ian Cutress did a BOM analysis for the 7950X when it released in 2022, estimating the total BOM to be around $70. For a $700 MSRP product. I have no idea how a 90% profit margin could be considered "very crappy profitability".
Based on his breakdown, the single CCD chips would be ~$45-50 BOM at the time.
Did you forget that the sales of Apple devices led to other strategic gains, such as OS and ecosystem market share and buy-in, which resulted in a closed ecosystem where Apple could take 30% from most software sales? A quarter of Apple's total revenue comes from software and services.

Back in the day, when the iPhone had a $200 BOM and retailed for $800, that was considered a handsome margin. Somehow today a $70 BOM with a $700 retail price is a company barely scraping by.
With a CPU you're paying almost entirely for the R&D, at least I would think.

Back in the day, when the iPhone had a $200 BOM and retailed for $800, that was considered a handsome margin. Somehow today a $70 BOM with a $700 retail price is a company barely scraping by.
The R&D is (or at least could be) amortized across all product lines using the CCD they designed for that generation. IIRC, BOM typically does not include those costs; specifically, the $70 figure was 2x CCD + 1x IOD + packaging (substrate).

With a CPU you're paying almost entirely for the R&D, at least I would think.
Plus retailer margins, packaging, shipping, etc. Not sure if that was worked into the BOM.
The R&D is (or at least could be) amortized across all product lines using the CCD they designed for that generation. IIRC, BOM typically does not include those costs; specifically, the $70 figure was 2x CCD + 1x IOD + packaging (substrate).
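A toy illustration of the amortization point above; every number except the ~$45-50 single-CCD BOM estimate quoted earlier is invented purely for the example:

```python
# Hypothetical illustration of spreading one generation's CCD R&D across
# every product that reuses the die. All figures below are made up.
rnd_budget_usd = 1_000_000_000   # assumed R&D for the CCD generation
units_using_ccd = 50_000_000     # assumed desktop + server + other parts reusing it
single_ccd_bom_usd = 45          # low end of the estimate quoted above

rnd_per_unit = rnd_budget_usd / units_using_ccd
print(f"Amortized R&D per unit: ${rnd_per_unit:.2f}")                       # $20.00
print(f"BOM + amortized R&D:    ${single_ccd_bom_usd + rnd_per_unit:.2f}")  # $65.00
```

The point being that the per-unit R&D share depends entirely on how many products end up reusing the design, which is why it normally isn't folded into a BOM figure.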
Then you don't know how semiconductor margins work. Most top-end CPU/GPU parts have very high margins, and even mid-range parts have decent margins, but the cost of R&D is so high that companies need these margins to continue designing and making new chips. AMD is only somewhat profitable at its current prices and doesn't have the same margins as Nvidia, or as Intel did 10 years ago.

I have no idea how a 90% profit margin could be considered "very crappy profitability".
Guys, there's gross margins and then there's operating margins...
Revenue - COGS = gross profit, and gross margin is gross profit as a percentage of revenue. COGS = cost of goods sold, i.e. raw production costs.
But then you have operating expenses, which include R&D, rent, electricity, salaries, and basically all the other overhead items.
From AMD's last earnings statement, gross margin was 47% and operating margin was 1% (both averaged across all segments). Note: this includes an expense item from acquiring Xilinx, which dilutes the EPS.
If you want it broken down by segment, you'll have to look more closely at the tables. Make of this what you will.
View attachment 103592
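To make the gross-vs-operating distinction concrete, here is a minimal sketch with a round, made-up revenue figure; only the 47% and 1% percentages come from the post above:

```python
# Gross margin vs. operating margin, illustrated with made-up round numbers.
revenue = 5_000_000_000              # illustrative quarterly revenue, USD
cogs = revenue * 0.53                # cost of goods sold, chosen to give ~47% gross margin
operating_expenses = revenue * 0.46  # R&D, SG&A, amortization, etc. (illustrative)

gross_profit = revenue - cogs
operating_income = gross_profit - operating_expenses

print(f"Gross margin:     {gross_profit / revenue:.0%}")       # ~47%
print(f"Operating margin: {operating_income / revenue:.0%}")   # ~1%
```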
Correct, hence my one note regarding Xilinx.

AMD is still writing off huge amounts of money from their Xilinx and other acquisitions over the last few years to save on taxes. Their non-GAAP results are a much better reflection of how they are actually doing overall. Their operating income goes from $36M to $1.1B for the latest quarter, with an operating margin of 21%; the vast majority of the difference is in depreciation/amortization accounting.
View attachment 103595
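As a sketch of how that GAAP-to-non-GAAP bridge works: start from the reported operating income and add back the non-cash items, the largest of which is amortization of acquired intangibles. The split of the add-backs below is a placeholder; only the roughly $36M and $1.1B endpoints come from the post:

```python
# GAAP -> non-GAAP operating income bridge (conceptual sketch).
# Endpoints are from the post above; the individual add-back amounts are placeholders.
gaap_operating_income = 36_000_000

add_backs = {
    "amortization_of_acquired_intangibles": 900_000_000,  # placeholder split
    "stock_based_comp_and_other_items":     164_000_000,  # placeholder split
}

non_gaap_operating_income = gaap_operating_income + sum(add_backs.values())
print(f"Non-GAAP operating income: ${non_gaap_operating_income / 1e9:.1f}B")  # ~$1.1B
```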
I don't know of any really good technical write-ups on it, but from living on the forums for years and watching Task Manager and programming tools monitor the modern games I play, it's quite evident that rarely more than 2 physical cores are pushed to near 100% usage in any modern game. Single-threaded performance is VERY important for the first two cores, and still somewhat important for the next few. After that, they spend most of their time doing little more than waiting for system events or decompressing data.

Do you maybe have an idea where to find such a breakdown of how well threaded modern games are? Another thing that might bite Strix is the Windows scheduler: if it decides to move a time-sensitive thread to the Zen 5c CCX, then, well... [Of course, in an ideal world that would not happen, but even 3 years after Alder Lake came to market it's still not uncommon to find software with issues handling the E cores.]
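On the scheduler worry: if an engine (or the user, via third-party tools) really wanted to keep a latency-critical thread off the Zen 5c CCX, thread affinity is the blunt instrument for it. A minimal Windows-only sketch using ctypes; the assumption that logical processors 0-7 map to the full Zen 5 CCX is mine and would vary by SKU and SMT configuration:

```python
# Pin the calling thread to an assumed set of "big"-core logical processors (Windows only).
# The 0-7 mask is an illustrative assumption; actual Zen 5 vs. Zen 5c core numbering
# depends on the specific SKU, SMT, and firmware.
import ctypes

kernel32 = ctypes.windll.kernel32
kernel32.GetCurrentThread.restype = ctypes.c_void_p
kernel32.SetThreadAffinityMask.restype = ctypes.c_size_t
kernel32.SetThreadAffinityMask.argtypes = [ctypes.c_void_p, ctypes.c_size_t]

BIG_CCX_MASK = 0x00FF  # logical processors 0-7, assumed to be the Zen 5 CCX

def pin_current_thread_to_big_ccx() -> None:
    thread = kernel32.GetCurrentThread()  # pseudo-handle for the calling thread
    previous = kernel32.SetThreadAffinityMask(thread, BIG_CCX_MASK)
    if previous == 0:
        raise OSError("SetThreadAffinityMask failed")

if __name__ == "__main__":
    pin_current_thread_to_big_ccx()
    # ... latency-sensitive work loop would run here ...
```

In practice you would hope the Windows scheduler plus AMD's driver hints handle this on their own, which is exactly the doubt raised above.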
For the second bullet, would that mean a Zen 6 release in like 12-18 months from now, so a mid-life kicker is not needed?
Hmm, I'm expecting vanilla Zen 5 and X3D this year and maybe a refresh on 3nm in 2025, but not Zen 6 in 2025...

I came across this article saying a 2025 release for Zen 6, which would be within 18 months of Zen 5's release. They say their source is AMD, but I have not seen any direct quotes from AMD about that.
AMD Zen 6 is "on track" for 2025 release - report claims - OC3D
Rumour has it that AMD Zen 6 CPUs are on track to release in 2025, giving the Ryzen 9000 series a shorter than expected shelf life. (overclock3d.net)
Not quite. They say that, according to TechRadar, AMD says Zen 6 is on target for 2025.

I came across this article saying a 2025 release for Zen 6, which would be within 18 months of Zen 5's release. They say their source is AMD, but I have not seen any direct quotes from AMD about that.
AMD didn’t have anything of note to say at the LA event about its next-generation Zen 6 processors first mentioned at Computex 2024. We still don’t know much about the Zen 6 and Zen 6c architectures at the moment, only that they are still on track for a release in 2025.
I agree. I think Zen 6 will tie into the release of DDR6.

Zen 6 in 2025 makes no sense. AMD announced extended support for the AM5 platform to 2027. They can easily launch Zen 6 in 2026 and launch refreshes/X3D in 2027 to support that claim. Why would they as a company want to hurry things up?
View attachment 103624