aaksheytalwar
Diamond Member
What will happen in Q3? Either buy today or wait for Haswell?
Dunno about the 3570K, but the Q6600 in my machines runs in the mid-60s C at stock speeds under full load. I don't know if that's good or not, but it's well under the 90C max temp for the CPU. I don't have a fancy enclosure or anything (ventilation is mediocre at best, "silent" fans, etc.), so I'd expect some people would be running cooler.

I never said IB is better than SB... just that it's better than the old C2D architecture.
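The numbers in the post above lend themselves to a quick back-of-the-envelope check. Here's a minimal Python sketch of the thermal headroom they imply; the mid-60s load figure and 90C ceiling are taken from the post itself, not from a spec sheet:

```python
# Thermal headroom, using the figures quoted in the post above:
# a Q6600 loading in the mid-60s C against a stated 90 C maximum.
# These numbers come from the post, not from official documentation.

def thermal_headroom(load_temp_c: float, max_temp_c: float) -> float:
    """Degrees C between the observed load temperature and the CPU's max."""
    return max_temp_c - load_temp_c

print(thermal_headroom(65.0, 90.0))  # 65 C load vs 90 C max -> 25.0 C of headroom
```

So even with mediocre ventilation, that setup is sitting on roughly 25C of margin at full load.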
Question: which ran hotter at stock under load, the Q6600 or the 3570K?
You've got to realize something: these "enthusiasts" around here buy these outrageously overpowered PCs not for functionality, but to express themselves to others through what they buy.
That explains why inferior PC components would make them feel that others will view them as inferior as human beings. D:
They are so born and bred to support the "latest and greatest" product being promoted their way that they actually develop hostility toward older technology, as you can see demonstrated in this very thread.
I am still using my only C2D laptop even though I have a new desktop. I didn't develop any hostility toward it over how fast it is, because it is still relevant for maybe 60% of what I do daily. I find what you've said partially untrue, as I am in the middle of considering getting a Raspberry Pi, which isn't powerful by any standard, just to satisfy my curiosity about what it is capable of.
No. You and the OP support unsubstantiated and dubious claims comparing apples to oranges. I have no problem with older tech. That said, C2D is not really 'old tech' anyway.
Just because you don't understand computing needs doesn't mean nobody else does. Keep that in mind.
For browsing etc., a C2D is okay, but I wouldn't go any older than that. And for hardcore gaming, getting the latest and best is wise if your budget permits.
Even for gaming, especially where the CPU (vs. the GPU) is concerned, getting the latest is only worthwhile if you are more than one generation behind.
No, I do understand computing needs... which is why I don't understand why anybody would need the type of computing power people like you are insisting on, unless they are using their computers for very nontraditional, professional-grade, NASA-like workloads. Or maybe you guys just enjoy flattering yourselves.
You didn't list an i7 and 32 gigs of RAM in your signature just for the hell of it... you did it because you assembled that rig partly for snob reasons, in the hope that others would envy what you have... which is why so few people around here can resist boasting about such over-bought builds in their sigs. You've even decided to tell us what cars you're driving!
Whenever I ask what you people with your overclocked i7s, thousand-watt power supplies, and 32GB of RAM are actually doing to warrant those specs, you can never answer without resorting to some theoretical BS scenario... and suddenly you're all professional video editors who need to encode in the background while gaming, or something equally absurd.
The problem I have with the enthusiasts here is that they tend to "look down" on inferior equipment, even when it's not really inferior... it's just not currently receiving heavy product promotion from sites like this one.
They develop very unrealistic expectations for how much computing power we ALL really need, and allow manufacturers to hike up prices based on how much hype they can generate. They also neglect to realize that the gaps between this generation and the last are smaller than they have ever been, and that the i5 and i7 chips only beat out last generation's favorite chips (like the Q6600 and E8400) in a select few poorly optimized games that don't even LOOK as good as a well-tweaked Crysis from 2007-2008.
Even then it's a small difference, really. There are endless debates over "max settings", 60 FPS, 120 FPS, and so on. But if everyone dropped just one setting (AA, shadows, or textures) down a notch, suddenly a card that would "barely work" excels. If you started doing IQ tests on each individual setting, people would notice there is barely any difference between them. Yet people are ingrained with the idea that you're not experiencing the game correctly if you aren't running at max video settings. People are throwing insane amounts of money at maybe a 5% difference in IQ.
I have a new card in my machine, but that's because I built my machine recently, and I like to set it up and forget it for several years. I do this knowing that in 2015 a game will come out whose settings I might have to lower. Who knows by how much. If it's a game I like and it ends up looking bad, maybe I do need to upgrade at that point. But I doubt it. Really, the 4870 in my last machine would have been playable in BF3, but probably at settings where the IQ difference would have been measurable. Not enough to make me upgrade just for that, but noticeable.
I'm running an E2140 (a gift from VirtualLarry) and its load temperature is only 35C.
Ivy Bridge now runs 50-65C at load.
Gotta love old tech.
Gotta go for that sweet 100% overclock. E2140 rules.
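For anyone curious what a "100% overclock" works out to: the E2140 ships at a 1.60 GHz stock clock, so doubling it lands at 3.20 GHz. A trivial Python sketch of that arithmetic:

```python
# Overclock percentage: how far a target clock sits above stock.
# The E2140's stock clock is 1.60 GHz; a 100% overclock doubles it to 3.20 GHz.

def overclock_percent(stock_ghz: float, target_ghz: float) -> float:
    """Percentage increase of target_ghz over stock_ghz."""
    return (target_ghz / stock_ghz - 1.0) * 100.0

print(overclock_percent(1.6, 3.2))  # -> 100.0
```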
I've got an E4400 sitting around. I may turn it into a hackintosh. It was a Linux server for a little bit, but I never used it.