
Wow, Ars Technica roasts Kaby Lake

Dayum, son.

 
If Intel had stopped trying, we would all still be buying new 2500Ks.

We basically are... we haven't seen a performance jump like the one from Gen 1 to Gen 2 since, well, going from Gen 1 to Gen 2. I haven't seen a new processor since the 2500K with any tangible performance increase worth buying at full price, not to mention needing a new board and maybe new RAM too.
 
Intel is not stupid. They have enough IP aces up their sleeves, and it is good for business to have a healthy competitor. Monopoly lawsuits aside, slowly dishing out improvements one after the other is good for business because you can keep selling to OEM manufacturers who need a bit of an edge over their competition. It is not good business practice to give everything you have at once, only to find out the OEM hardware manufacturers find it too expensive and software support is lacking. Intel keeps a sharp eye on AMD. AMD comes up with great ideas only to get burned or stabbed in the back by cheapskate OEM manufacturers (single-channel APU notebooks).
I'll be happy when AMD comes out with Vega and Ryzen and hopefully enters the deep learning market as well. That means contracts for years to come.
With a bit of luck, 8 cores will become mainstream within 2 years.
Believe me, Intel will come out with something nice if Ryzen really delivers.
 
Every reviewer is roasting it. It has nothing going for it, and if you have a 3770 or newer there is no reason whatsoever to upgrade.

There is zero reason to upgrade. These new CPUs are for new builds. There is no reason for anyone to upgrade.

The days of software driving PC sales are over. This ain't like the old days where you had to upgrade to keep up with software.

If Intel thinks they won, they are mistaken. Once the upgrade cycle becomes an 8-year cycle, they are going to be in a world of hurt.
 
At least in my opinion, the last few generations have been largely lackluster with regard to the CPU itself. The biggest upgrades tend to come from the features built into the chipset, which advance each year. Although I think Z270 is just 4 more PCIe lanes (chipset, not CPU) and USB 3.1.
 
At least in my opinion, the last few generations have been largely lackluster with regard to the CPU itself. The biggest upgrades tend to come from the features built into the chipset, which advance each year. Although I think Z270 is just 4 more PCIe lanes (chipset, not CPU) and USB 3.1.

Yeah, that's the only reason I went from a 2600K to a 6700K: better PCIe/USB and M.2.
 
This makes sense to me as well. It will maybe be hard to come by within half a year. And the 7700K has an HEVC codec in hardware. With good software support from streaming sites that provide an app, such as Netflix, that is quite the energy saver for people who are interested in such features.

http://www.anandtech.com/show/10610...six-notebook-skus-desktop-coming-in-january/3
https://en.wikipedia.org/wiki/High_Efficiency_Video_Coding
Not only that, Netflix REQUIRES 7xxx series or later for DRM reasons for 4K. It won't even play at all on 6xxx series chips.

This is why I thought it was braindead for (some) people interested in video playback to buy a 6xxx series CPU in late 2016, unless they upgrade their CPUs every 2 years.

This is a big step forward for me, esp. since video playback matters to me, and I usually wait 5 years or longer before significant hardware upgrades. Upgrading to 6xxx would have been much, much less of an upgrade, because the things I can't do on older CPUs I still wouldn't be able to do on 6xxx due to feature differences and DRM support issues.
 
Meh on all of this. I've been happy with Sandy Bridge for a while now. It runs Titanfall 2 great.

And meh on 4K. 720p video is more than enough for me.
 
I'm already on a 6 to 8 year upgrade cycle. The reality is that if you don't do real gaming, there just isn't much reason to upgrade these days. A fast SSD, plenty of memory, and your 4-year-old CPU will serve most people just fine.
 
I don't share people's optimism that AMD competing with Intel will lead us to a golden age. I think they'll compete, but they'll be able to in part because of how processor performance is stuck. The Intel of 5 years ago is becoming a progressively harder competitor for today's Intel, and there's no way Intel wouldn't have stepped up their game if it were easy.
 
I don't share people's optimism that AMD competing with Intel will lead us to a golden age. I think they'll compete, but they'll be able to in part because of how processor performance is stuck. The Intel of 5 years ago is becoming a progressively harder competitor for today's Intel, and there's no way Intel wouldn't have stepped up their game if it were easy.
Perhaps, but it might be enough to bring the 6-8 core prices down. I'll be a happy guy if AMD manages to do that.
 
Perhaps, but it might be enough to bring the 6-8 core prices down. I'll be a happy guy if AMD manages to do that.

That's likely enough to happen, or at least to bring 6 cores to more affordable platforms. I'm similarly hopeful that we might get some improvements in how the market is segmented; I'm just not hopeful that some previously unseen technology will suddenly spring fully formed from Intel's head and revolutionize processors. We might get something cool in memory technology, though. Basically, any hope I've got is for an unexpected synergy between processors and a new technology somewhere else in the system.
 
The most exciting processor of the past few years for me has been Core M. I really wish they would work more on these low power parts.

Btw, remember when the government restricted export of like the PS2 because it was so powerful it might be used for nuclear simulations?
 
I'm already on a 6 to 8 year upgrade cycle. The reality is that if you don't do real gaming, there just isn't much reason to upgrade these days. A fast SSD, plenty of memory, and your 4-year-old CPU will serve most people just fine.
Most people are honestly fine with a 10-year-old CPU...

Gave the mother-in-law my old Phenom II with my old 256GB SSD and 8GB of RAM, and she thinks it's the fastest computer in the world.
 
I'm already on a 6 to 8 year upgrade cycle. The reality is that if you don't do real gaming, there just isn't much reason to upgrade these days. A fast SSD, plenty of memory, and your 4-year-old CPU will serve most people just fine.
My main PC is on a 2nd-gen i5 running Win 10 with an SSD, and it is just recently starting to feel long in the tooth. I'm not sure why that is; all I do is word processing and web browsing.
 
The most exciting processor of the past few years for me has been Core M. I really wish they would work more on these low power parts.
Kaby Lake + Y-class is a match made in heaven IMO, given the addition of hardware HEVC 10-bit decode capabilities (and of course the built-in DRM support), along with some CPU performance boosts.

However, it's not going to be called Core M anymore. Well, most of them aren't anyway. The top end one will now be called Core i7 and the mid range ones Core i5. Only the lowest end model will be called Core M. Or more accurately, Core m3.

So, Kaby Lake Y is:

Core i7 7Y75: 1.3 GHz base / 3.6 GHz turbo, TDP 4.5 W
Core i5 7Y57: 1.2 GHz base / 3.3 GHz turbo, TDP 4.5 W
Core i5 7Y54: 1.2 GHz base / 3.2 GHz turbo, TDP 4.5 W
Core m3 7Y30: 1.0 GHz base / 2.6 GHz turbo, TDP 4.5 W

Not sure why both 7Y54 and 7Y57 exist.

I will likely buy a Core i5 Y or Core i7 Y laptop this spring, although I haven't completely given up on the 15 Watt U series yet. Not interested in the higher wattage mobile parts.
 
Looks like single-threaded performance is once again essentially flat with Kaby Lake. I looked at the game performance reviews, and it turns out there really isn't a reason to move up even from my 3770K. If it had 8 cores and 16 threads like the upcoming Ryzen will have, at least there would be an encoding advantage to crow about, but stuck at 4 cores with essentially just some minor clock speed improvements for normal desktop use, I can't see retiring the old 3770K in favor of it yet. Waiting to see what real-life Ryzen does at this point.
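The encoding advantage from extra cores can be sketched with Amdahl's law. The 95% parallel fraction below is an illustrative assumption, not a measured figure for any real encoder:

```python
def speedup(cores: int, parallel_fraction: float) -> float:
    """Amdahl's law: overall speedup on `cores` cores when only
    `parallel_fraction` of the work can run in parallel."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Assume video encoding is ~95% parallelizable (illustrative guess).
p = 0.95
quad = speedup(4, p)   # 4C part like a 3770K (SMT ignored here)
octa = speedup(8, p)   # 8C part like the upcoming Ryzen

print(f"4 cores: {quad:.2f}x, 8 cores: {octa:.2f}x, "
      f"8-core advantage: {octa / quad:.2f}x")
```

Even under this generous assumption, doubling cores buys well under 2x; real encoders also depend on SMT and memory bandwidth, so treat this as a ballpark only.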
 
But that's just one item...
To get to 4K from netflix, you need...
- this CPU
- New mobo
- new RAM

To go along with -
- Win10
- 4k TV
- [You gonna play 4K on the TV's crappy speakers? Didn't think so...] AV receiver with HDMI 2.0a
- Uber-nice WiFi with 100% bars?
- Or, Gigabit Ethernet
- Oh, and that Internet connection needs to be 45 Mbps at least
(Yes, yes... I know the googlenet says Netflix only needs 15, but who are we kidding?)

So looking at all the 4K Netflix requirements... we're looking at thousands of dollars just for Netflix shows, which are only a tenth (or much less) of all the programming we can get online or over the air.

They should just introduce the new industry standard: $k TV
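Those bandwidth numbers translate into data usage pretty directly. A quick back-of-the-envelope, assuming a steady stream at the two rates quoted above:

```python
def gb_per_hour(mbps: float) -> float:
    """Data consumed per hour at a steady bitrate of `mbps` megabits/s."""
    return mbps * 1e6 / 8 * 3600 / 1e9  # bits/s -> bytes/s -> GB per hour

# The "googlenet" figure vs. the more pessimistic 45 Mbps above.
print(f"15 Mbps: {gb_per_hour(15):.2f} GB/hour")   # 6.75 GB/hour
print(f"45 Mbps: {gb_per_hour(45):.2f} GB/hour")   # 20.25 GB/hour
```

At the higher rate, a couple of movies a night will chew through a typical monthly data cap fast, which is its own hidden cost on top of the hardware.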

Not only that, Netflix REQUIRES 7xxx series or later for DRM reasons for 4K. It won't even play at all on 6xxx series chips.

This is why I thought it was braindead for (some) people interested in video playback to buy a 6xxx series CPU in late 2016, unless they upgrade their CPUs every 2 years.

This is a big step forward for me, esp. since video playback matters to me, and I usually wait 5 years or longer before significant hardware upgrades. Upgrading to 6xxx would have been much, much less of an upgrade, because the things I can't do on older CPUs I still wouldn't be able to do on 6xxx due to feature differences and DRM support issues.
 
But that's just one item...
To get to 4K from netflix, you need...
- this CPU
- New mobo
- new RAM

To go along with -
- Win10
- 4k TV
- [You gonna play 4K on the TV's crappy speakers? Didn't think so...] AV receiver with HDMI 2.0a
- Uber-nice WiFi with 100% bars?
- Or, Gigabit Ethernet
- Oh, and that Internet connection needs to be 45 Mbps at least
(Yes, yes... I know the googlenet says Netflix only needs 15, but who are we kidding?)

So looking at all the 4K Netflix requirements... we're looking at thousands of dollars just for Netflix shows, which are only a tenth (or much less) of all the programming we can get online or over the air.

They should just introduce the new industry standard: $k TV
1. I don't build my own PCs anymore. I stopped doing that over a decade ago. I just buy pre-built systems, although I do occasionally do peripheral upgrades.
2. Pre-builds are now all Win 10. And all my old PCs are Win 10 anyway. Win 10 works way better on old hardware than Win 7, and I hated Win 8.
3. Two of my receivers are already HDMI 2.0a with HDCP 2.2. Such receivers are inexpensive these days, as it has become a standard feature. However, with my computers I generally use headphones, and watch video on the computer screen while I work at the computer.
4. The computer screen for my next desktop will likely be 5K.
5. Gigabit Ethernet throughout my house, but I also have a mesh of 3 WiFi routers (802.11ac and 802.11n) giving me full bars everywhere.
6. Currently have a 25 Mbps VDSL2 connection, rock solid, but I'm testing out 60 Mbps cable service. BTW, 25 Mbps is the slowest connection out of all of my friends.
7. My primary "TV" watching these days is actually Netflix.

But even if I didn't have this stuff, if I were buying or building a new computer today, I think it would often be foolish to do it with the 6xxx series, at least for a decent chunk of people. 7xxx isn't much more, but adds features, some of which are make-or-break video features important for some people. Sure, if you bought a computer 2 years ago, there may not be a good reason to upgrade, but that's a different scenario.
 
But that's just one item...
To get to 4K from netflix, you need...
- this CPU
- New mobo
- new RAM

To go along with -
- Win10
- 4k TV
- [You gonna play 4K on the TV's crappy speakers? Didn't think so...] AV receiver with HDMI 2.0a
- Uber-nice WiFi with 100% bars?
- Or, Gigabit Ethernet
- Oh, and that Internet connection needs to be 45 Mbps at least
(Yes, yes... I know the googlenet says Netflix only needs 15, but who are we kidding?)

So looking at all the 4K Netflix requirements... we're looking at thousands of dollars just for Netflix shows, which are only a tenth (or much less) of all the programming we can get online or over the air.

They should just introduce the new industry standard: $k TV

I would also argue that 4K really only makes sense for projectors or when you sit really close, as with computer monitors...
I mean, if you sit 7 feet or more back from a 60-inch screen, 1080p and 4K will look the same.
4K projection is EXPENSIVE.
JVC's pixel-shifted "faux" 4K starts at $5K, Sony's around $10K for true 4K (though the JVC arguably has better overall quality thanks to better black levels).
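The "looks the same from 7 feet" claim can be checked against the usual 20/20-vision rule of thumb (roughly 60 pixels per degree). A sketch of the geometry, assuming a flat 16:9 60-inch screen and measuring density at the screen center:

```python
import math

def pixels_per_degree(diagonal_in: float, horiz_px: int,
                      distance_in: float) -> float:
    """Angular pixel density at the center of a 16:9 display."""
    # Horizontal screen size from the diagonal (16:9 aspect ratio).
    width = diagonal_in * 16 / math.hypot(16, 9)
    # Horizontal field of view subtended by the screen, in degrees.
    fov = 2 * math.degrees(math.atan(width / 2 / distance_in))
    return horiz_px / fov

d = 7 * 12  # 7 feet in inches
print(f"1080p: {pixels_per_degree(60, 1920, d):.0f} ppd")
print(f"4K:    {pixels_per_degree(60, 3840, d):.0f} ppd")
# 20/20 acuity resolves roughly 60 ppd, so 1080p is already near the
# limit at this distance and most of 4K's extra detail is invisible.
```

By this estimate, 1080p at 7 feet already sits close to the acuity limit, which supports the intuition above; sit closer or go bigger and the gap reopens.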
 
I'm already on a 6 to 8 year upgrade cycle. The reality is that if you don't do real gaming, there just isn't much reason to upgrade these days. Fast ssd, plenty of memory and your 4 year old cpu will serve most people just fine.
Unless you're a gamer or a developer, desktop computing is pretty much dead. I've been running a Dell Vostro Core 2 Quad (Yorkfield) as my main rig since 2009. That's like a lifetime ago in computer years. It's maxed out at 4GB DDR2 RAM. I upgraded to an SSD about 4 years in to speed up boot times, and then a larger SSD when I installed Windows 10. I've probably upgraded my GPU twice, and currently run a Radeon 7770. I hardly do any gaming--the last game I played was BioShock Infinite and it ran fine at 1080p and medium settings.

It's pretty much used for web browsing, MS Office, and some basic Adobe Lightroom work when I offload photos from the digital camera. Occasionally I may rip a CD/DVD/BD since it's the only machine I have with a physical drive. I used to run a few "server" apps but those have all shifted over to my NAS. I probably use it about an hour a day but most of my computing is done on a laptop--a Haswell i5 (Y series) that is barely any faster for basic productivity stuff. I think my next desktop upgrade will probably be picking up whatever Dell desktop I can get for $500 to replace my old Dell when it dies. My wife likes the new Retina iMac, so she may get that or a new MacBook Pro and external monitor for the photo editing work. Not sure Kaby Lake on the desktop side will be necessary.
 