Let's talk about the incoming increase in system requirements

Aug 11, 2008
10,451
642
126
Well, 50% sounded absurd to me at first glance too. But if you look at it closely, at stock clocks it is not so outrageous. You have about 6% higher clocks plus about 15% higher IPC, which compounds to roughly 22%. If Hyper-Threading adds another 20 to 30 percent on top, there you go.
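Back-of-envelope, the compounding works out like this (a quick sketch of the arithmetic above, not a benchmark; the percentages are just the rough figures I quoted):

```python
# Rough compounding of the claimed per-factor gains (illustrative numbers only).
clock_gain = 1.06                        # ~6% higher stock clocks
ipc_gain = 1.15                          # ~15% higher IPC
ht_gain_lo, ht_gain_hi = 1.20, 1.30      # Hyper-Threading, if the game scales

base = clock_gain * ipc_gain             # ~1.22x before Hyper-Threading
print(f"clocks + IPC: {base:.2f}x")
print(f"with HT: {base * ht_gain_lo:.2f}x to {base * ht_gain_hi:.2f}x")
# -> roughly 1.46x to 1.58x, so ~50% at stock is at least plausible
```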

However, realistically that is not a very likely scenario. Once both chips are overclocked, a lot of the overall difference disappears unless you have a golden Haswell chip, and the gain from Hyper-Threading in games is usually minimal. That *could* change in Watch Dogs, I suppose.
 

Tequila

Senior member
Oct 24, 1999
882
11
76
The difference is, the 4770 is about twice as powerful as your 2500K.

Not in gaming. Even my i5-760 was holding its own. Just look at the benchmark links in my sig and check the differences in Metro 2033 and Metro: Last Light. It's not that big, about a 9 fps difference in Metro: LL at max settings. I ran those with the same video cards before and after the mobo/CPU swap.

For raw CPU power and video editing, sure, I love the 4770, but I suspect people can and will be gaming on Lynnfield/Ivy/Sandy for a long time.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
i7s are starting to show a decent margin in performance (mostly in SLI/CrossFire setups) over the i5 in select games: Crysis 3, Battlefield 4, etc. But 50% (talking gaming)? Are you serious?

Yes. An i7 is basically seen as an 8-core. The i5 is a 4-core.

The i7 has twice as many cores available. This will make a difference because next-gen games are going to rely on multi-threading, since the consoles have weak single-core performance. It barely made a difference last gen because games rarely used more than 2 CPU cores. If games only used 2 cores on next-gen systems, that would mean 2 × 1.6 GHz, which simply wouldn't be enough to drive next-gen game simulation. They have to use more cores.
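To illustrate what "using more cores" means in practice, here's a toy sketch (not any engine's actual code, and Python's GIL makes it illustrative only; real engines do this with a C++ job system): the worker pool is sized to the logical processors the OS reports, so an i7's 8 hardware threads all get work while a 4-thread i5 gets 4.

```python
# Hypothetical job-system sketch: size the worker pool to the logical
# processor count, so simulation jobs spread across every available thread.
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_entity(entity_id):
    # Stand-in for per-entity game simulation work (AI, physics, etc.).
    return entity_id * 2

workers = os.cpu_count() or 4            # 8 on a 4C/8T i7, 4 on a 4C/4T i5
with ThreadPoolExecutor(max_workers=workers) as pool:
    results = list(pool.map(simulate_entity, range(1000)))
print(f"{workers} workers simulated {len(results)} entities")
```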
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Problem is defining what's genuinely required to play & enjoy the game vs what's theoretically "required" simply to stuff everything on Ultra for 2560p @ 120fps e-peen contests, which simply loads the CPU more with optional extra shadows, etc., that are often barely perceptible in static side-by-side screenshot comparisons, let alone gameplay. In fact, some games actually manage to look worse at higher settings. E.g., Skyrim High vs Ultra vs Max comes to mind, with the grass / ground looking sharper on High @ 81fps than Max @ 44fps (and far worse on Ultra than High):
http://www.techspot.com/articles-info/467/images/High_02.jpg
http://www.techspot.com/articles-info/467/images/Ultra_02.jpg
http://www.techspot.com/articles-info/467/images/Max_02.jpg


Thief recently "recommended" an i7, yet runs just fine on i3s, with barely a 10% difference vs an i7. In this case, "it needs an i7" = an i7-920 @ 2.66GHz. CoD Ghosts "required" 6GB RAM until it didn't. Some games will inevitably be "heavier" than others (Watch Dogs), but if the core game engine is too heavy (the baseline engine, not just "extra shadows for Ultra", etc.), then how is it going to even work on the PS4 / XB1, whose 8 tablet cores (of which only 6 are usable for games) barely match an i3-4340 even in 100% perfectly threaded games? I.e., no matter what optional extra eye-candy you load the CPU down with on Ultra for PCs, the core game will still have to run on the typical console-equivalent "Med" (sometimes even Low) setting in order to, well, run on consoles! I can see 8GB RAM + 64-bit (and, as always, the GPU) making far more of a difference than the CPU alone. If you have the extra horsepower, you can turn it up. If not, you don't HAVE to run everything on High / Ultra to enjoy the game. For half the games I own, I often can't tell High from Ultra during actual gameplay, and my i5-3570 @ 4.2GHz isn't even an unlocked K chip and isn't close to being maxed in any game. It's still the GFX card that counts, even for "next gen" games.

There's a huge gap between tablet CPUs and the parts in the XB1/PS4. Downplaying them as netbook CPUs would be more accurate, but still off base.


With Call of Duty, Infinity Ward is just a bunch of idiots who simply don't know what RAM is. But other developers do need to be clearer about those requirements. Listing an i7, for example, when there have been 4 generations under that brand, is misleading: an i7-860 performs similarly to an i3-3220.
 

Majcric

Golden Member
May 3, 2011
1,409
65
91
Yes. An i7 is basically seen as an 8-core. The i5 is a 4-core.

The i7 has twice as many cores available. This will make a difference because next-gen games are going to rely on multi-threading, since the consoles have weak single-core performance. It barely made a difference last gen because games rarely used more than 2 CPU cores. If games only used 2 cores on next-gen systems, that would mean 2 × 1.6 GHz, which simply wouldn't be enough to drive next-gen game simulation. They have to use more cores.

There is a chance games in the future will utilize more than 4 cores, most likely 6, which is the number of cores PS4 devs have to work with. But that is a prediction. Right now IPC is all that matters, and the difference between Sandy and Haswell for gaming, to me, is small. There is also a good chance Skylake will be here by the time more than 4 cores are needed, making the i7-4770 vs i5-2500K comparison irrelevant.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Heck I've been getting by on a Core 2 Duo and an AMD 5850 for so long now that I don't mind upgrading at all. I am going to try and hold out until Star Citizen is released before building an entirely new top of the line rig.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
Yes. An i7 is basically seen as an 8-core. The i5 is a 4-core. The i7 has twice as many cores available. This will make a difference because next-gen games are going to rely on multi-threading, since the consoles have weak single-core performance. It barely made a difference last gen because games rarely used more than 2 CPU cores. If games only used 2 cores on next-gen systems, that would mean 2 × 1.6 GHz, which simply wouldn't be enough to drive next-gen game simulation. They have to use more cores.

No, the i7 is a CPU with 4 physical cores and 8 logical processors.
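You can check the distinction on your own machine with something like this (a small sketch; psutil is a third-party package, while os.cpu_count() counts logical processors):

```python
# Physical cores vs logical processors on a Hyper-Threaded CPU.
import os
import psutil  # third-party: pip install psutil

print("logical processors:", os.cpu_count())               # 8 on an i7-4770
print("physical cores:", psutil.cpu_count(logical=False))  # 4
```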
 

Skott

Diamond Member
Oct 4, 2005
5,730
1
76
Here's the thing: at some point PC gamers will have to upgrade their rigs. Just do it and don't whine about it. Save your pennies and upgrade when the time comes, or don't. Simple as that. Not trying to sound harsh, but that is how it is. Get used to it.
 


Bateluer

Lifer
Jun 23, 2001
27,730
8
0
Heck I've been getting by on a Core 2 Duo and an AMD 5850 for so long now that I don't mind upgrading at all. I am going to try and hold out until Star Citizen is released before building an entirely new top of the line rig.

And that's perfectly fine. I'm calling out the people with comparable specs who are raging on Internet communities that the requirements for modern games have left them behind.

I lingered with a C2D E8500 from 2008 until 2011, and a 4870 from Launch Day until 2011. Didn't feel the need to upgrade from those at 1200p until Witcher 2. But I wasn't raging on the Internet that modern games needed to scale back because I didn't have modern parts.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
Good.

The Witcher 2 was way too small for a traditional RPG.

I finished it in less than a week.

The maps were way too small, and they could have used more side content.

The acts themselves were not that short, but they should have made like 8 or 9 or 10 of them.
 

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
Good.

The Witcher 2 was way too small for a traditional RPG.

I finished it in less than a week.

The maps were way too small, and they could have used more side content.

The acts themselves were not that short, but they should have made like 8 or 9 or 10 of them.

Compared to what? It is an action/adventure game with RPG elements. It was just fine in length for what it was.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
Compared to what? It is an action/adventure game with RPG elements. It was just fine in length for what it was.

Any old-school RPG.

Yeah, it was not so mainstream, but it was still way too short compared to any of the gold-standard RPGs.
 

Staples

Diamond Member
Oct 28, 2001
4,953
119
106
I am only disappointed that the new consoles did not have better specs. Most PC gamers on forums are idiots. They do not understand that the games we have been getting recently are largely limited by the console they are made for. We can run PC games at higher resolutions and higher detail (mostly shadows and other stuff like particles; settings that, unlike textures, didn't take the developers a lot of time to create), which uses a lot more horsepower than the consoles have, but the textures and AI are mostly fixed to what the consoles can handle.
 

Staples

Diamond Member
Oct 28, 2001
4,953
119
106
And for those talking about lack of optimization for PC: this gen of consoles is more PC-like than any before, up there with the original Xbox. Porting should be easier than ever, and most of the optimizations from the console versions will transfer over to the port running on your desktop.
 

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
I am only disappointed that the new consoles did not have better specs. Most PC gamers on forums are idiots. They do not understand that the games we have been getting recently are largely limited by the console they are made for. We can run PC games at higher resolutions and higher detail (mostly shadows and other stuff like particles; settings that, unlike textures, didn't take the developers a lot of time to create), which uses a lot more horsepower than the consoles have, but the textures and AI are mostly fixed to what the consoles can handle.

You have that wrong. It's not the PC gamers that are idiots; they completely understand, which is why they hate that the consoles are so weak. It's the console gamers that don't get it or care.

While yes, these consoles are pretty much locked-down PCs, that doesn't mean the ports will be any better just because porting is easier, sadly.
 

Staples

Diamond Member
Oct 28, 2001
4,953
119
106
It will just be interesting to see if this generation lasts as long as the last. If MS does poorly and the PS4 runs away with sales, MS may feel compelled to bring out a new system in five years (or less). But it seems CPUs and GPUs are not making huge gains every year like they used to, so that may compel another 8-year cycle.

It feels so crazy talking about the next generation since this one hasn't even started. And what do I mean by started? When there are compelling games out there to make me even want to buy the system.
 

Fire&Blood

Platinum Member
Jan 13, 2009
2,333
18
81
As the gap between PC and consoles widens, some studios will start to take advantage of better hardware, but overall, the majority of games that get PC ports will be restrained by console limitations. I'm optimistic that a Skylake/DDR4 + GTX 980 + Win9/DX12 combo will eventually drive devs to spend more on the PC.

I'm used to seeing my computers get old. I agree with the OP, as long as high specs aren't the result of a poor port.
 

toughtrasher

Senior member
Mar 17, 2013
595
1
0
mysteryblock.com
I am only disappointed that the new consoles did not have better specs. Most PC gamers on forums are idiots. They do not understand that the games we have been getting recently are largely limited by the console they are made for. We can run PC games at higher resolutions and higher detail (mostly shadows and other stuff like particles; settings that, unlike textures, didn't take the developers a lot of time to create), which uses a lot more horsepower than the consoles have, but the textures and AI are mostly fixed to what the consoles can handle.

I hate how the consoles limit what game companies can do with technology on PCs, but I also don't want these new consoles to have a short life span.

I have both a gaming PC and a PS4, and it's a struggle lol. But I try not to think about it and just game.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
The sad truth is the new system requirements aren't harsh; this is not some massive jump, and it never is with consoles. The new consoles have extremely bad GPUs and mediocre CPUs, but then they are budget gaming platforms at the end of the day; that's what they're there for.

If you're complaining about the PC spec increase, it's probably best that you just go buy a console; they have long upgrade cycles, so you don't need to worry about upgrading, and they're cheap, so it won't really cost you much to buy in.

The PC space is really at its best for hobbyists who are willing to put in time, effort and money to get a superior gaming experience.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
The sad truth is the new system requirements aren't harsh; this is not some massive jump, and it never is with consoles. The new consoles have extremely bad GPUs and mediocre CPUs, but then they are budget gaming platforms at the end of the day; that's what they're there for.

If you're complaining about the PC spec increase, it's probably best that you just go buy a console; they have long upgrade cycles, so you don't need to worry about upgrading, and they're cheap, so it won't really cost you much to buy in.

The PC space is really at its best for hobbyists who are willing to put in time, effort and money to get a superior gaming experience.

That's your opinion. I know a lot of PC gamers who are happy to play mostly indie games on what you and I would consider antique hardware. Also, the console route isn't always a money saver. The main reason I hesitate to buy a PS4 is that I'd be stuck buying $60 games for the foreseeable future, whereas on PC there is a huge backlog of games that often go on huge sales; you can amass a library very quickly and cheaply.
 
Aug 11, 2008
10,451
642
126
That's your opinion. I know a lot of PC gamers who are happy to play mostly indie games on what you and I would consider antique hardware. Also, the console route isn't always a money saver. The main reason I hesitate to buy a PS4 is that I'd be stuck buying $60 games for the foreseeable future, whereas on PC there is a huge backlog of games that often go on huge sales; you can amass a library very quickly and cheaply.

Not only that, there is a much wider variety of games available on PC, especially strategy games of various types and indie games. As far as Watch Dogs and Mordor go, I disagree with those who think it is only a small step up, and that those who are concerned just want to game forever on ancient hardware. A 3770K is pretty close to the fastest CPU available, and that is for only the "recommended" specs, not Ultra. So to me that seems a pretty huge jump, and it makes one wonder what is left to upgrade to if future PC games become even more demanding.

I also don't think people should expect PC gamers to be only those who are willing to dump huge amounts of money into their systems. Personally, I have a low-end Sandy i5 and an HD 7770 and am perfectly happy with that setup. I play on PC because of the cheap games, the wider variety of games, and the ability to use a mouse and keyboard. I expect the PS4 will be able to beat my PC for graphics, but that still will not make me want to switch. Also, I can use the PC for normal productivity tasks when not gaming on it.
 

werepossum

Elite Member
Jul 10, 2006
29,873
463
126
Problem is defining what's genuinely required to play & enjoy the game vs what's theoretically "required" simply to stuff everything on Ultra for 2560p @ 120fps e-peen contests, which simply loads the CPU more with optional extra shadows, etc., that are often barely perceptible in static side-by-side screenshot comparisons, let alone gameplay. In fact, some games actually manage to look worse at higher settings. E.g., Skyrim High vs Ultra vs Max comes to mind, with the grass / ground looking sharper on High @ 81fps than Max @ 44fps (and far worse on Ultra than High):
http://www.techspot.com/articles-info/467/images/High_02.jpg
http://www.techspot.com/articles-info/467/images/Ultra_02.jpg
http://www.techspot.com/articles-info/467/images/Max_02.jpg


Thief recently "recommended" an i7, yet runs just fine on i3s, with barely a 10% difference vs an i7. In this case, "it needs an i7" = an i7-920 @ 2.66GHz. CoD Ghosts "required" 6GB RAM until it didn't. Some games will inevitably be "heavier" than others (Watch Dogs), but if the core game engine is too heavy (the baseline engine, not just "extra shadows for Ultra", etc.), then how is it going to even work on the PS4 / XB1, whose 8 tablet cores (of which only 6 are usable for games) barely match an i3-4340 even in 100% perfectly threaded games? I.e., no matter what optional extra eye-candy you load the CPU down with on Ultra for PCs, the core game will still have to run on the typical console-equivalent "Med" (sometimes even Low) setting in order to, well, run on consoles! I can see 8GB RAM + 64-bit (and, as always, the GPU) making far more of a difference than the CPU alone. If you have the extra horsepower, you can turn it up. If not, you don't HAVE to run everything on High / Ultra to enjoy the game. For half the games I own, I often can't tell High from Ultra during actual gameplay, and my i5-3570 @ 4.2GHz isn't even an unlocked K chip and isn't close to being maxed in any game. It's still the GFX card that counts, even for "next gen" games.

^ This. PC gaming needs more of a developer attitude upgrade than anything else.

Edit: And that's not just big-name developers. As much as I love indies, they have their fair share of problems too. "Sir, You Are Being Hunted" somehow manages to get lower fps than Crysis with visuals worse than Morrowind / late-1990s / early-2000s FPSes. And it "recommends" 8GB RAM too - for a game which, if written in an older engine more suited to its GFX, like LithTech 2.0 or Unreal 1.0-2.0, would basically consume 300-750MB RAM. No amount of "throwing hardware at it" will solve discrepancies like that.
Excellent points. Far too often developers throw out insane recommended specs. I think, as much as anything, this is to make their games seem high-tech and bleeding-edge, but either way I'd really like to see some guidance on what (if anything) one gives up with lower-than-recommended specs. Am I losing the ability to run above 1080p? Not a problem. Cutting my draw distance? Might be worth an upgrade.

Luckily the gaming press is all over that - as long as one is willing to wait a few months after launch.

And for those talking about lack of optimization for PC: this gen of consoles is more PC-like than any before, up there with the original Xbox. Porting should be easier than ever, and most of the optimizations from the console versions will transfer over to the port running on your desktop.
Agreed, and that's good news. But with Microsoft pushing less powerful PCs rather than more powerful ones, I wonder how much longer developers will assume they need to accommodate more powerful PC hardware. If a basic port can be done quickly and easily and 75% of PCs are tablets, why would a developer spend scarce resources on higher-spec PCs?
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Compared to what? It is an action/adventure game with RPG elements. It was just fine in length for what it was.

The first Witcher was a touch over 50 hrs, and it had 4 acts from what I remember, and they were rather large. The major blemish on 2 was that it ran for 35 hrs and was too linear. That and no sex cards. :awe: