First complete review of Haswell i7-4770K


aigomorla

CPU, Cases & Cooling Mod / PC Gaming Mod / Elite Member
Super Moderator
Sep 28, 2005
21,080
3,582
126
Coolaler's comment says it all in that thread:

Translated: "5GHz voltage is very ugly".

AHAHAHAHAHAHAHAHAHAAHAHAHAHAHAHAHAHAHAHAHAHAHA :whistle:

*goes off looking at dual ivy cpu boards*

Guys, I've known coolaler for a while now...
If something was wrong, he probably would have said something is wrong with your CPU.

Instead he's saying it's UGLY... lolol... I pretty much trust his assessment.
 

Khato

Golden Member
Jul 15, 2001
1,303
380
136
But what resolution was your display? 768p? 1080p? That MBP was pushing 2560x1600 around with an HD4000.

Please stop attempting to blame Intel graphics for an Apple software issue. The 15" rMBP with its GT 650M suffers the exact same problem in OS X.

A desktop HD4000 (running at its idle frequency, by the way) does just as well as my GTX 670 at driving a Dell 2713HM at full resolution under Windows; that is to say, no perceptible issues whatsoever. I've been pondering the idea of doing a hackintosh install, but it likely wouldn't do any good in convincing those who want to bash Intel in any fashion possible, so why bother?
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
If you're going to present an argument, at least have your facts correct. The 13-inch Retina MacBook Pro uses the HD4000. The native resolution is 2560x1600. Specifications for you:

http://www.apple.com/macbook-pro/specs-retina/

Every ultra-portable Apple product, except the $2,799 configuration of the MacBook Pro Retina and the 15-inch standard Pro, uses the HD4000. The 15-inch MBP uses the GT 650M. I can tell you that if Intel matches or comes close to the 640M with the next-gen ULV parts, Apple won't use discrete in the 2013 MBP. Already, nearly all of Apple's products use the HD4000. Now why would they use discrete when nearly all of their stuff (sans Mac Pro and iMac) uses the HD4000 _already_? They won't. Except perhaps in the high-end configuration of the iMac, which oddly enough costs nearly $3,000; they would likely use a 780M for that (if they even release an iMac this year; rumors indicate they won't).

And I know you'll jump to the gaming argument. I've seen it here all too much. The thing is, the mass market doesn't care. MacBooks are not gaming machines, OS X isn't a good OS for gaming, and Macs have never been designed for gaming. Period. So don't bother citing gaming performance of the 650M. Apple merely wants a display processor that can make the UI and experience snappy; apparently they thought it (HD4000) wasn't enough for 2880x1800. But this will change with Haswell.

I was simply stating the fact that the HD4000 in the MacBook Pro Retinas was not enough and that there is indeed a need for better iGPU performance outside of gaming. Also, when did I state that OS X is good for gaming? I couldn't care less about the gaming performance of my laptop, which is why I have a desktop too.
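For a rough sense of how much heavier those Retina panels are to drive than the more common laptop screens mentioned above, here is a quick pixel-count comparison. A back-of-the-envelope sketch only; pixel count is just a first-order proxy for UI compositing load:

```python
# Pixel counts of the panels discussed in this thread, relative to 1080p.
panels = {
    "1366x768 ultrabook": (1366, 768),
    "1080p": (1920, 1080),
    "13-inch rMBP": (2560, 1600),
    "15-inch rMBP": (2880, 1800),
}

base = 1920 * 1080
for name, (w, h) in panels.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP ({px / base:.1f}x 1080p)")
```

The 15-inch panel works out to roughly 2.5x the pixels of 1080p, which is why UI smoothness at native Retina resolution is a meaningful iGPU workload even with zero gaming involved.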
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I was simply stating the fact that the HD4000 in the MacBook Pro Retinas was not enough and that there is indeed a need for better iGPU performance outside of gaming. Also, when did I state that OS X is good for gaming? I couldn't care less about the gaming performance of my laptop, which is why I have a desktop too.

Sorry, perhaps I was reading into things too much and anticipated that response; gaming performance is the first counter-argument most people present on this topic. It really is irrelevant when discussing ultra-portables, which are a different category of machine.

Anyway, most MBPs still use the HD4000, and honestly I think it's fine for general use, but I can see how some would rather have discrete. I still feel that Apple will not use discrete even in their high-end MBPs if they don't have to. They're mainly going for UI responsiveness and a generally good user experience, so if Haswell ULV gets reasonably close to the 640M/650M, I could see them dropping Nvidia. We shall see, though.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Sorry, perhaps I was reading into things too much and anticipated that response; gaming performance is the first counter-argument most people present on this topic. It really is irrelevant when discussing ultra-portables, which are a different category of machine.

Anyway, most MBPs still use the HD4000, and honestly I think it's fine for general use, but I can see how some would rather have discrete. I still feel that Apple will not use discrete even in their high-end MBPs if they don't have to. They're mainly going for UI responsiveness and a generally good user experience, so if Haswell ULV gets reasonably close to the 640M/650M, I could see them dropping Nvidia. We shall see, though.

Yeah, it's totally irrelevant with small-form-factor laptops. I almost always have gfxCardStatus set to integrated only. I'll take battery life and cooler running temperatures over graphics performance, unless I absolutely need the dedicated GPU.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The GT630 is 30-130% faster (in actual games, AA or no AA) than the HD4600 @ 1350MHz with DDR3-2600(!). Imagine the HD4600 using normal 1600/1866MHz memory...

Those tests are all at 4x or 8x AA. That will kill any iGPU (especially at 1080p). Also consider that Intel's drivers tend to suck at high resolutions, AA, and high settings more than Nvidia's or AMD's (performance tapers off faster).
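To see why heavy AA hits a bandwidth-starved iGPU so hard, here's a minimal back-of-the-envelope sketch. The buffer layout is simplified and the dual-channel DDR3 figures are illustrative assumptions, not measurements from any review:

```python
# Illustrative numbers only: how MSAA inflates framebuffer size, versus
# how little peak bandwidth a dual-channel DDR3 iGPU has to spend on it.
width, height = 1920, 1080  # 1080p
bytes_per_pixel = 4         # 32-bit color; depth/stencil ignored for simplicity

for samples in (1, 4, 8):
    mb = width * height * bytes_per_pixel * samples / 2**20
    print(f"{samples}x MSAA color buffer: ~{mb:.0f} MB")

# Dual-channel DDR3 peak = transfer rate (MT/s) * 8 bytes * 2 channels,
# and the iGPU shares it with the CPU, unlike a discrete card's VRAM.
for mts in (1600, 1866, 2600):
    print(f"DDR3-{mts} dual channel: ~{mts * 8 * 2 / 1000:.1f} GB/s peak")
```

At 8x MSAA the color buffer alone balloons to roughly 63 MB, and every extra sample multiplies fill traffic across memory that the iGPU shares with the CPU.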
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
The GT630 gets more than 30fps across all the games in that review, even with those high AA settings.

It has GDDR5, which is really uncommon for that GPU and probably results in a large performance gain at 1080p with 8x AA. The DDR3 version (which Intel's slides probably refer to) is likely significantly slower.

(Look at the 640 vs the 650. At low resolutions the 640 and 650 perform similarly, but at 1080p with high levels of AA the 650 pulls significantly ahead; both are unplayable, but the 650 is quite a bit faster.)
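A rough bandwidth sketch shows why the GDDR5 card would pull ahead at 1080p with 8x AA. The transfer rates below are approximate retail GT 630 configurations, assumed here for illustration rather than taken from the review:

```python
# Assumed, approximate GT 630 configurations: same 128-bit bus,
# different memory type/speed. Bandwidth = MT/s * bus width in bytes.
bus_bytes = 128 // 8
ddr3_gbs = 1800 * bus_bytes / 1000   # ~28.8 GB/s
gddr5_gbs = 3200 * bus_bytes / 1000  # ~51.2 GB/s

print(f"GT 630 DDR3 (~1800 MT/s):  ~{ddr3_gbs:.1f} GB/s")
print(f"GT 630 GDDR5 (~3200 MT/s): ~{gddr5_gbs:.1f} GB/s")
print(f"GDDR5 advantage: ~{gddr5_gbs / ddr3_gbs:.1f}x the bandwidth")
```

Under these assumptions the GDDR5 card has nearly 1.8x the bandwidth, which matters most exactly where the review tested: high resolution with heavy AA.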
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Not sure where to put this... but lol at prices


http://notebookitalia.it/sony-vaio-p11-p13-11-13-pollici-intel-haswell-17349

they still think ultrabooks... command 'apple' margins

You're looking at a Vaio. They are always premium-priced at launch, without exception.

Just give it a few months. If I can use an analogy, what you will see is the natural selection process applied to portable computers: the best ultrabooks/portables will maintain their high prices (rMBP, maybe a few others) while the cheap junk will quickly be relegated to the bargain bin. This is precisely what happened with 2012 ultrabooks, many of which were originally priced fairly high at launch.

I've seen quite a few 2012 ultrabooks around the $500-600 mark. I expect the same to happen with the low-quality 2013 models. Say what you will about Apple, but I have to say I'm happy that they upped the ante with ultra-portable computers; PC makers are finding it tough to compete on price alone nowadays. The low-end 1366x768 ultrabooks don't really sell well to my knowledge; many users are willing to pay more for great features.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
The Vaio series is not premium... the Z series was...

The MacBook Air actually costs less...

Well, it will quickly depreciate to what the market is willing to pay. The great thing about the internet is that more users are informed, and a lot of the time they know what proper pricing should be; I think the machines you linked will quickly come down in price.

Launch prices are always higher; this isn't unique to Sony. The market will auto-correct that.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Needed to ask, who is "we", "us"?

Seriously? Did you not see the first noun in the sentence?

Okay. We are the business people that buy PCs for enterprises. I buy a couple hundred every day.

Are you done being pedantic yet?
(No, I will not define that for you)
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Well, here is a mobile Haswell CPU bench (looks like an entry-level Haswell i7):

[benchmark screenshot: aezUFcC.jpg]


~8400 CPU score.

Can't say if it's accurate.

(Also a 780M GPU score; nice boost.)
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Seriously? Did you not see the first noun in the sentence?

Okay. We are the business people that buy PCs for enterprises. I buy a couple hundred every day.

Are you done being pedantic yet?
(No, I will not define that for you)

I asked you for statistics and you are not giving any, only a vague "we this", "we that". OK, confirmed that you have no verifiable data to offer us here.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I asked you for statistics and you are not giving any, only a vague "we this", "we that". OK, confirmed that you have no verifiable data to offer us here.

Prove otherwise.

But here's a clue for free... if CPU manufacturers' customers didn't want low power consumption, then said manufacturers wouldn't produce such products.

See how logic works?

Or, are you going to claim to know more about enterprise computing than I do?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
The GT630 is 30-130% faster (in actual games, AA or no AA) than the HD4600 @ 1350MHz with DDR3-2600(!). Imagine the HD4600 using normal 1600/1866MHz memory...

Here's the GT 630 GDDR5 going against HD 4000 using DDR3-1600: http://technewspedia.com/intel-hd-graphics-4000-vs-nvidia-geforce-gt-630/

The average (excluding 3DMark) shows a 77% gain for the GT 630 GDDR5. The latest drivers bring a 10-20% gain in games (and also fix the low-clock bug), while Haswell GT2 will add a further 30% or so on top of that.

Enigmoid said:
It has GDDR5, which is really uncommon for that GPU and probably results in a large performance gain at 1080p with 8x AA. The DDR3 version (which Intel's slides probably refer to) is likely significantly slower.

Also, the GT 630 has a 25% or so higher fillrate, and even in otherwise identical setups, discrete GPUs are 20-30% faster than shared-memory setups like iGPUs.
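Compounding those rough percentages gives a feel for how much of that 77% gap Haswell GT2 might close. A quick ballpark sketch; all three figures are the approximate numbers quoted in this post, so treat the output as an estimate, not a measurement:

```python
# Ballpark only: compounding the rough percentages quoted above.
gt630_lead = 1.77        # GT 630 GDDR5 vs. HD 4000, average excluding 3DMark
driver_gain = 1.15       # midpoint of the 10-20% driver improvement
haswell_gt2_gain = 1.30  # the ~30% attributed to Haswell GT2

projected = driver_gain * haswell_gt2_gain
print(f"Projected Haswell GT2 vs. launch-driver HD 4000: ~{projected:.2f}x")
print(f"GT 630 GDDR5 lead that would remain: ~{gt630_lead / projected:.2f}x")
```

On those assumptions the drivers plus Haswell GT2 compound to roughly 1.5x, leaving the GT 630 GDDR5 with only around a 20% lead.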
 

RampantAndroid

Diamond Member
Jun 27, 2004
6,591
3
81
Prove otherwise.

But here's a clue for free... if CPU manufacturers' customers didn't want low power consumption, then said manufacturers wouldn't produce such products.

See how logic works?

Or, are you going to claim to know more about enterprise computing than I do?

Where I work, we do care about consumption. We also have domain policies that force systems to shut down at night.

Also, lower power consumption means more battery life in laptops...but I guess that's not important, eh galego?
 

galego

Golden Member
Apr 10, 2013
1,091
0
0
Also, lower power consumption means more battery life in laptops...but I guess that's not important, eh galego?

If you go back in the thread, you will notice that it all started with a claim about the "majority of desktop users". Nobody was mentioning laptops... before you.
 

ThePeasant

Member
May 20, 2011
36
0
0
Yeah, I understand the ISAs in question fairly well, as I've had to work with SSE2 assembly and intrinsics (and I've glanced at the AVX instructions, though nothing in depth). What I don't get is where those large multipliers come from, but it's OT anyway, so forgive the distraction.
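For what it's worth, the headline multipliers usually fall straight out of vector width. A minimal sketch of the peak-theory arithmetic (Haswell's AVX2 adds FMA on top; none of this says anything about delivered performance):

```python
# Where big SIMD multipliers come from: element lanes per instruction.
# Peak-theory arithmetic only; real code rarely sustains these numbers.
widths = {"scalar": 32, "SSE2": 128, "AVX": 256}  # register width in bits

for name, bits in widths.items():
    lanes = bits // 32  # 32-bit single-precision floats per register
    print(f"{name}: {lanes} SP operations per instruction")

# AVX2 + FMA: a*b+c counts as 2 FLOPs per lane, doubling peak again.
print(f"AVX2 + FMA: {256 // 32 * 2} SP FLOPs per instruction vs. 1 for scalar")
```

So the widely quoted figures come from stacking lane count and FMA: 8 single-precision lanes times 2 FLOPs per FMA gives the theoretical 16x over scalar.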
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
If you go back in the thread, you will notice that it all started with a claim about the "majority of desktop users". Nobody was mentioning laptops... before you.

Enterprises and corporations don't use desktop PCs? News to me.

Look, some home users don't care about efficiency, and I would be one of those. But the system in question would have to offer something very compelling to compensate, without being a complete furnace 24/7, and I can't think of anything compelling in AMD's CPU offerings. Sure, I like their GPUs, but I simply can't think of a good reason to like the processors. Now, if the FX chips were close to Intel in performance, as they were during their K6 days..., I absolutely wouldn't mind the additional power consumption. AMD did some great things during that time, and their hubris and bloated corporate culture obviously caught up with them. As things are, you have worse efficiency with the FX and worse performance in many workloads. So what's the compelling reason to buy one here? 30 bucks? Come on. That's like a day's lunch and a tank of gas. Who cares.

And then you have enterprise, and efficiency absolutely matters for enterprise, corporate and data center sales.

You keep suggesting "High power consumption? SO WHAT?" Well, my question to you is: what do the FX chips offer as compensation for that loss of efficiency? It sure isn't performance. Now, the 7970 has worse efficiency than its primary competitor, but it differentiates itself with better performance in many games when overclocked, comes with a free game bundle, and is hundreds of dollars cheaper. To me that isn't only compelling for a lot of consumers, but a darn good deal. You really can't say anything similar about the FX chips... you basically trade worse efficiency for... nothing?
 