Core i7-4770K is performance crippled


Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,634
4,562
75
No, you don't remember correctly. That was an actual quote. Here. Look at Item #3.

He said "worth it" and "look for buried treasure", strongly implying that "yep, gonna be sweet for all you overclockers". I distinctly remember it because it's what got my hopes up about Haswell OC. And then we have reality, which is pretty much the opposite, because there is zero boon for overclockers.

This is pretty much the only reason I hold out hope that the BCLK straps aren't limited to the K-series. And I'll continue holding out hope until I see someone say definitively that they tried everything they could to use a BCLK strap on a non-K-series processor and failed. Because this would be a boon to overclockers who want to buy cheaper chips - though not to those who want the fastest speeds.

If the straps are limited to the K series, then you're right, there is zero boon for overclockers. :(
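Just to put numbers on why that would matter: here's a toy calculation of what the reported 100/125/167 MHz straps would mean for a hypothetical non-K chip with its multiplier locked at 35x (illustrative only, nothing confirmed about non-K support):

Code:
#include <stdio.h>

/* Toy math: effective clocks from the reported 100/125/167 MHz BCLK
   straps on a hypothetical non-K chip with its ratio locked at 35x. */
int main(void) {
    const double straps_mhz[] = { 100.0, 125.0, 167.0 };
    const int multiplier = 35;                /* hypothetical locked ratio */

    for (int i = 0; i < 3; i++)
        printf("%.0f MHz x %d = %.2f GHz\n", straps_mhz[i], multiplier,
               straps_mhz[i] * multiplier / 1000.0);
    return 0;
}

Same locked multiplier, wildly different clocks (3.50 / 4.38 / 5.85 GHz), which is exactly why straps on cheaper parts would be a boon.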
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
- I don't see it. The need for total computing power will always be on the incline, driven by games, porn, or other virtual realities enabling people in their day-to-day tasks.

But what if form factor becomes a limitation?
 

Revolution 11

Senior member
Jun 2, 2011
952
79
91
I think Intel is aware of the parallels, and their recent refocus on mobile is a sign that they are trying to avoid the same fate. They're finally bumping Atom up to the leading edge and bringing it out on 14nm within the process's first year (instead of lagging behind, as has been standard).

The risk to them is from disruptive "good enough" cores in vast numbers - whether it is a phalanx of ARM microservers or a few GPGPUs. Their dedication to bringing the Xeon Phi to market, as well as to making Atom aggressive once again, seems like a promising sign. I just hope that they can learn to suck up lower margins and stop playing these damn market segmentation games.

The fact that they are playing these market segmentation games kind of means they are joining the ranks of Fairchild, Cray, DEC, and Sun.

The focus on high margins is actually a great strategy for companies that can succeed at it. But too strict a focus on high margins can blind a company to a competitor rising from below. I don't think Intel is ignoring ARM, but they are not doing as well as they could be.
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,760
136
I still don't see desktops going away. Mobile is just limited in terms of performance and storage space. Show me a tablet with the 6 TB of storage I have in my desktop. Now you can argue that processing power and storage will go into the cloud, but the latter, forget it. The limitation is the network and internet connection. While download is somewhat OK (but still too slow for TB amounts), upload bandwidth is usually much lower and basically unusable for large files. And in 5-10 years 4K will be more common, and those files just keep getting bigger and bigger...
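Just to put numbers on the upload problem, a rough sketch with a hypothetical 5 Mbps residential uplink:

Code:
#include <stdio.h>

/* Rough upload-time math: a 1 TB backup over a hypothetical
   5 Mbps residential uplink. */
int main(void) {
    double bits = 1e12 * 8.0;                 /* 1 TB expressed in bits */
    double uplink_bps = 5e6;                  /* 5 Mbps upload          */
    double days = bits / uplink_bps / 86400.0;
    printf("upload time: %.1f days\n", days); /* ~18.5 days */
    return 0;
}

About 18.5 days for a single terabyte, which is what I mean by basically unusable for large files.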
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I still don't see desktops going away. Mobile is just limited in terms of performance and storage space. Show me a tablet with the 6 TB of storage I have in my desktop. Now you can argue that processing power and storage will go into the cloud, but the latter, forget it. The limitation is the network and internet connection. While download is somewhat OK (but still too slow for TB amounts), upload bandwidth is usually much lower and basically unusable for large files. And in 5-10 years 4K will be more common, and those files just keep getting bigger and bigger...

And how many people do you know who need/use 6TB of space? And I assume you don't travel much, because nobody travels with 6TB of data in their backpack.
 

cytg111

Lifer
Mar 17, 2008
25,723
15,207
136
But what if form factor becomes a limitation?

- That is where better perf/watt comes into play. For total compute output to keep climbing, wattage has to decrease per core.
In an ironic sense, AMD is helping pave the way with the new Xbox and PS deals. Parallel computing is on the rise.
 

cytg111

Lifer
Mar 17, 2008
25,723
15,207
136
Yes, but companies now want to go back to the rental model with the "cloud". So their goal is that eventually the fastest chips will be out of reach of most individuals and will live mainly in their servers.

The cloud delivering me frames at a 500ms delay? Put on Oculus Rift-style headgear and 90% of the population will need sick bags. The "cloud" is good for a number of things, but up-front computing is IMO not one of them.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
The cloud delivering me frames at a 500ms delay? Put on Oculus Rift-style headgear and 90% of the population will need sick bags. The "cloud" is good for a number of things, but up-front computing is IMO not one of them.

I didn't say you would like their cloud rental vision of the future.

I suppose their answer would be improving internet connections such that the delay is less noticeable. FiOS and Google Fiber come to mind, as well as the efforts to deliver wireless along a broader spectrum. It's what many companies are interested in bringing to fruition, and it may actually reach an adoption rate where high-end consumer computing is priced beyond many people's reach.
 

cytg111

Lifer
Mar 17, 2008
25,723
15,207
136
I didn't say you would like their cloud rental vision of the future.

I suppose their answer would be improving internet connections such that the delay is less noticeable. FiOS and Google Fiber come to mind, as well as the efforts to deliver wireless along a broader spectrum. It's what many companies are interested in bringing to fruition, and it may actually reach an adoption rate where high-end consumer computing is priced beyond many people's reach.

I tried googling it a bit without getting a definitive answer, but what is an acceptable response time? For movies it doesn't really matter, but take a game, an FPS. Suppose you've got fiber, and suppose the cloud is deployed Akamai-style (at the backbone of your ISP). What can we expect?
1. You react to something in the game: x ms to the server
2. y ms of server handling
3. x ms on the fiber back to you
Add the delay of whatever device you'd be using to interact.

I don't know, it might just be possible? You'd probably miss the 100Hz mark, but that doesn't mean much.

But you'd be subject to outages and primetime bottlenecks and whatever else follows. But hey, maybe?
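A quick back-of-the-envelope version of the above; every number here is made up, just to put the pieces together:

Code:
#include <stdio.h>

/* Back-of-the-envelope input-to-photon latency for cloud gaming.
   Every figure is hypothetical: fiber to an Akamai-style node
   sitting at the ISP's backbone. */
int main(void) {
    double x_uplink   = 10.0;  /* ms: your input to the server         */
    double y_server   = 20.0;  /* ms: server-side handling + rendering */
    double x_downlink = 10.0;  /* ms: encoded frame back over fiber    */
    double device_ms  = 40.0;  /* ms: local input, decode, display     */

    double total = x_uplink + y_server + x_downlink + device_ms;
    printf("input-to-photon: %.0f ms\n", total);          /* 80 ms */
    printf("100 Hz budget:   %.0f ms\n", 1000.0 / 100.0); /* 10 ms */
    return 0;
}

Even with those generous numbers you land around 80 ms, miles past the 10 ms a 100Hz experience would want.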
 

BenchPress

Senior member
Nov 8, 2011
392
0
0
Yes, look for 'mutex' in the Mesa source code. And that's probably very straightforward compared to NVIDIA and AMD's drivers.
So you don't know, therefore don't have any proof.
Mesa is proof enough. I was merely adding that NVIDIA's and AMD's drivers are more complex and therefore highly likely to have even more multi-threading going on where TSX would help.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I don't think any of the companies offering or looking to offer these services (Microsoft Xbox One, anyone?) has published any studies. I did find a decent examination here:

http://enterthesingularity.blogspot.com/2010/04/latency-human-response-time-in-current.html

~50ms for UI interaction
~100-150ms for regular console-like play

Keep in mind that would need to be the round-trip time. That blogger is assuming single-player gaming as well. For multi-player, any server delay would eat into that latency allocation. In my experience, cable internet would be hit or miss; the FiOS I had for a while would have been fine the vast majority of the time.

Also note that competitive and similarly demanding gamers would need tighter tolerances. John Carmack has said he thinks there would be a noticeable benefit to reducing latency even with direct PC hardware.
 

guskline

Diamond Member
Apr 17, 2006
5,338
476
126
Nothing dies; there is still a market for horse buggies and the necessary horse whips.

So saying the desktop won't die isn't really taking a strong stand, because no one is really saying its death is coming.

But we all know what happens once a form factor or product falls out of favor. It stops getting priority and exposure, and R&D begins to dry up.

Very little R&D is going into developing better and better vinyl records, landline telephones, and leaded gasoline. But you can still buy all of those products, as stagnant as they are.

That is where the desktop is headed, eventually, IMO. There will always be a need for them, but the market volumes that represent that base need are going to be silly small. Engineering stations, design, etc.

But things have peaked, as they inevitably do, and that makes other things become the priority for today's R&D dollar.

IBM, Sun, and HP show that dedicated server CPUs have a place in low-volume, high-performance server markets. That market isn't going away just because mainstream consumer PCs are in decline.

But in my opinion the consumer desktop is this decade's landline, and it is falling out of favor big-time with this decade's kids, who are going to be young adults in 10 yrs and won't give a damn about buying a desktop (just as I haven't paid for a landline these past 6 yrs).

For the 5-6% of them who go on to become engineers, they will still have a desktop at work; the other 95% will be on something mobile.

Agree. How do you overclock an iPhone? :confused: (P.S. I'm using my wife's iPad more and more.)
 

Durp

Member
Jan 29, 2013
132
0
0
Actually, if I remember correctly, those weren't his words at all. He characterized overclocking on Haswell as "interesting" due to the new BCLK overclocking feature, but didn't really go further than that. Of course, everyone interpreted that to mean Haswell would be what Sandy Bridge was a few years back.

No. He (Intel) posted those exact words during the AMA on reddit. I said HIS WORDS for a reason.

http://www.reddit.com/r/IAmA/comments/15iaet/iama_cpu_architect_and_designer_at_intel_ama/

Edit: It seems jvroig already corrected you. Sorry for the spam; I didn't see his post.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Mesa is proof enough. I was merely adding that NVIDIA's and AMD's drivers are more complex and therefore highly likely to have even more multi-threading going on where TSX would help.

Again you're guessing. That's not proof.
 

cytg111

Lifer
Mar 17, 2008
25,723
15,207
136
I can't believe anyone would waste time overclocking their phone.

- It makes a sort of sense, since there are a lot of different devices out there with different performance profiles. It's not like overclocking your Xbox or PS; that wouldn't make much sense. But having a game lag on your phone because it just *kind* of meets the requirements? I can see overclocking happening.
 

Aikouka

Lifer
Nov 27, 2001
30,383
912
126
I can't believe anyone would waste time overclocking their phone.

Overclocking the HP TouchPad was pretty common among techies because the tablet's default configuration was just too slow. It's practically a night and day difference after you get all the necessary PreWare software installed.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Overclocking the HP TouchPad was pretty common among techies because the tablet's default configuration was just too slow. It's practically a night and day difference after you get all the necessary PreWare software installed.

This is why I also don't buy cheap junk when it comes to phones and tablets. ;)
 

BenchPress

Senior member
Nov 8, 2011
392
0
0
Mesa is proof enough. I was merely adding that NVIDIA's and AMD's drivers are more complex and therefore highly likely to have even more multi-threading going on where TSX would help.
Again you're guessing. That's not proof.
Guessing? What more proof could you want? You know Mesa is open source, right? You asked whether the conditions under which TSX helps occur in a graphics driver. I pointed out a driver where they do, and you can check it for yourself. If you refuse to accept the proof that's right in front of you, or you don't know how to verify it, I can't help you.
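For anyone wondering what "TSX helping with a mutex" looks like in practice, here's a minimal lock-elision sketch. To be clear, this is hypothetical code, not something from Mesa; it needs a TSX-capable CPU and gcc's -mrtm:

Code:
#include <immintrin.h>   /* RTM intrinsics: _xbegin(), _xend(), _xabort() */
#include <stdatomic.h>
#include <stdio.h>

/* A toy spinlock standing in for a driver's mutex. */
static atomic_int lock_taken = 0;
static long shared_counter = 0;      /* state the lock protects */

static void lock_acquire(void) {
    while (atomic_exchange(&lock_taken, 1)) { /* spin */ }
}
static void lock_release(void) {
    atomic_store(&lock_taken, 0);
}

/* Lock elision: run the critical section as a hardware transaction and
   only take the real lock if the transaction aborts. Uncontended calls
   never serialize on the lock. */
static void increment_elided(void) {
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        if (atomic_load(&lock_taken))  /* someone holds the real lock:  */
            _xabort(0xff);             /* abort and take the slow path  */
        shared_counter++;              /* critical section, transactional */
        _xend();                       /* commit */
    } else {
        lock_acquire();                /* fallback: conventional locking */
        shared_counter++;
        lock_release();
    }
}

int main(void) {
    increment_elided();
    printf("counter = %ld\n", shared_counter);
    return 0;
}

The point is just that any mutex-guarded section like this is a candidate for elision; whether NVIDIA's and AMD's drivers have more of them than Mesa is the part I'm inferring.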
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
One person's cheap junk = someone's great bargain = another's bang for buck = someone else's maximum budget

I'm not old but I've lived long enough to have touched all the corners of that box :(...rags to riches and back again, twice over :D
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
I still don't see desktops going away. Mobile is just limited in terms of performance and storage space. Show me a tablet with 6 TB of storage I have in my desktop? Now you can argue processing power and storage goes into the cloud but the later, forget it. The limitation are the networks and internet connection. While download is somewhat ok (but still too slow for TB amounts, the upload bandwidth usually is much lower and basically unusable for large files. And in 5-10 years 4k will be more common and those files just keep getting bigger and bigger...

Why are people still making crappy arguments like this? Your needs != most people's needs.

And right, in 5-10 years phones will still be using Snapdragon 600s with only 32GB of storage, and that's not taking into account whether people even want 4K video. I don't see people complaining about 360p-only videos on YouTube while 720/1080p are already mainstream, so why would they care then? "Good enough" strikes back again.