1: The APUs must be produced before Microsoft can assemble the XB1.
2: A large-volume launch in 13 countries on 22 November 2013.
3: For that to happen, the silicon must already be final; the XB1 is in full production now.
4: It will take TSMC 2-3 months to produce the APUs, so counting back from the launch date, the final clock decision cannot have been made recently.
I don't believe that's how it works. AMD has to know how many chips can run at 1.75GHz, and yields at that clock will differ from yields at 1.6GHz. AMD also needs to validate the chips at 1.75GHz before TSMC can produce them in volume.
Also, changing the specs will have an impact on thermals and power consumption. Remember they raised both the CPU and GPU clocks. Keeping the same heatsink/fan means more dB of noise, or else you have to redesign the HSF, and so on.
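For rough intuition on why the thermals change: dynamic power scales roughly with frequency times voltage squared (P ≈ C·V²·f). A back-of-envelope sketch in Python, with purely illustrative voltage numbers, not real XB1 values:

```python
# Back-of-envelope dynamic power scaling: P_dyn ~ C * V^2 * f.
# All voltage values here are illustrative assumptions.

def dynamic_power_ratio(f_old_ghz, f_new_ghz, v_old=1.0, v_new=1.0):
    """Ratio of new to old dynamic power for a clock (and voltage) bump."""
    return (f_new_ghz / f_old_ghz) * (v_new / v_old) ** 2

# Clock-only bump from 1.6GHz to 1.75GHz at the same voltage:
print(dynamic_power_ratio(1.6, 1.75))             # ~1.09, i.e. +9% power

# If the higher clock also needs, say, 5% more voltage, it compounds:
print(dynamic_power_ratio(1.6, 1.75, 1.0, 1.05))  # ~1.21, i.e. +21% power
```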
The original "decision" on clockspeeds may have entailed nothing more than a target range, with the final decision on locking in a specific value left open until the ship date was closer.
They may have, for example, left themselves a target of, say, 1.4 to 1.7GHz, provided the power consumption fit inside the defined TDP window.
Now that they've seen a few thousand wafers' worth of yields and bin-outs, they have much more confidence in the reliability of a specific bin coming out of the fabs.
Good time to lock in the final clockspeeds, now that they have more data to drive that decision.
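To make that concrete, here's a toy sketch of that kind of lock-in decision; the bins, yields, and thresholds below are all invented for the example, not real fab data:

```python
# Toy bin-out table: fraction of dice that validate at each clock while
# staying inside the TDP window. All values are hypothetical.
bin_yields = {1.4: 0.99, 1.5: 0.97, 1.6: 0.95, 1.7: 0.88, 1.75: 0.80}

def lock_in_clock(bin_yields, min_yield):
    """Highest clock whose measured yield still meets the volume target."""
    viable = [f for f, y in bin_yields.items() if y >= min_yield]
    return max(viable) if viable else None

# With little wafer data you hold a conservative target; with thousands of
# wafers binned, you can commit to the best bin the data supports.
print(lock_in_clock(bin_yields, min_yield=0.90))  # 1.6
print(lock_in_clock(bin_yields, min_yield=0.75))  # 1.75
```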
Great explanation. Unlike the posts blasting AMD over some kind of "conspiracy", your post, IDC, actually makes sense.
And especially, MS hasn't exactly been lucky with its design goals.
ShintaiDK, could you please elaborate on that? Who's to blame in that scenario?
I don't know who to blame except MS, but it is just beyond my comprehension how MS can deliver a less powerful system and charge more money for it. Obviously someone at MS thought their system had some advantages that justify the cost, but I don't see what they are. Or maybe someone at MS just totally screwed the pooch in setting the price.
"Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU," Goosen reveals. "Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console."
Expect to see sub-1080p or dynamic resolutions in next-gen games. Crytek's Ryse is confirmed as running at 900p.
"What we're seeing in titles is adopting the notion of dynamic resolution scaling to avoid glitching frame-rate. As they start getting into an area where they're starting to hit on the margin there where they could potentially go over their frame budget, they could start dynamically scaling back on resolution and they can keep their HUD in terms of true resolution and the 3D content is squeezing. Again, from my aspect as a gamer I'd rather have a consistent frame-rate and some squeezing on the number of pixels than have those frame-rate glitches."
http://www.eurogamer.net/articles/digitalfoundry-vs-the-xbox-one-architects
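A minimal sketch of the dynamic resolution scaling loop Goosen describes; the thresholds and step size here are my assumptions, not actual engine values:

```python
# Minimal dynamic-resolution controller: squeeze the 3D render target when
# frame time nears the budget, grow it back when there is headroom.
# Thresholds and step size are assumptions, not real engine values.

FRAME_BUDGET_MS = 1000 / 30

def update_resolution_scale(last_frame_ms, scale):
    if last_frame_ms > 0.95 * FRAME_BUDGET_MS:    # close to the margin
        scale = max(0.7, scale - 0.05)            # squeeze the 3D content
    elif last_frame_ms < 0.80 * FRAME_BUDGET_MS:  # comfortable headroom
        scale = min(1.0, scale + 0.05)
    return scale

# Each frame: render 3D at (1920*scale) x (1080*scale), upscale the result,
# then draw the HUD at full native resolution so text and UI stay crisp.
scale = update_resolution_scale(last_frame_ms=33.0, scale=1.0)
print(scale, int(1920 * scale), int(1080 * scale))  # 0.95 1824 1026
```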
Yet more info, and it's not good.
Heavily CPU-limited before release.
And the resolution can't be kept up either.
How long are these consoles expected to last? 1-2 years?
If they really want, 5+ years, I think.
When the PS3 was released (Q4 2006), Intel was launching the Core 2 Quad and Nvidia the 8800 GTX, so the PS3 was already quite weak compared to high-end desktops.
The topics discussed here (adaptive scaling, performance level), the shared base components of the XBone and PS4 SoCs, and the easy scalability of the graphics make me think there is a chance we'll see upgraded variants with higher graphics performance, enough to serve 4K or 3D screens.
But yeah, it is a bad sign when they start showing screenshots to argue that 900p looks as good as 1080p.
100% true... That weak Jaguar is not able to handle 1080p rendering. The next step will be to convince us that 10FPS is absolutely playable.
I don't think they are too weak, and in any case the Jaguar cores are much faster than the older PPC cores.
