Let's talk about the incoming increase in system requirements

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
So, back in November of 2013, the new consoles dropped. I won't rehash the full details; we all know what that hardware is. When the XB1 and PS4 dropped, even PC gamers started to get excited, because games and engines would no longer be chained to the eight-year-old hardware in the 360 and PS3.

Finally, we thought, we would see a massive increase in visuals and capabilities as games became truly multithreaded and 64-bit, taking full advantage of modern high-end graphics cards, gobs of system RAM, and VRAM.

Now, multiple games have announced their system requirements. Most notably, Watch Dogs, Wolfenstein, and Galactic Civilizations 3. And people are bitching that the games require 6GB of system RAM, i7s, and 64-bit operating systems.

I know the majority of PC gamers know full well that their hardware won't play the latest titles forever, and sooner or later they'll be forced to either upgrade or drop the detail settings. But it seems like a fairly vocal group, which I hope is a minority, honestly thought they'd be able to game on their 32-bit Windows XP, Core 2 Duo machines indefinitely.

Now, let's be fair and assume there are going to be sloppy ports and poorly optimized games that make poor use of RAM, GPU power, and CPU threading. Those will never go away. But we are also going to see games that do make full use of modern Core iX CPUs, 8GB of system RAM, 3+GB of VRAM, etc.

In the GalCiv3 thread in this forum, there are people bitching that the game requires a 64-bit OS and a DX10-compatible card. These aren't high system requirements at all, and I hate to break it to you, but there probably isn't going to be a 32-bit version of Windows 9. Almost every video card made since 2007 has been DirectX 10 ready, and today every single integrated solution is DX11 ready. Referencing the Steam Hardware Survey, a full 52.5% of Windows users on Steam are on Windows 7 64-bit. Add 14.6% for Win 8.1 64, 2.4% for Vista 64, and 9.3% for Win 8 64, and that's 78.8% of users on a 64-bit OS. And I didn't even add the 0.3% of people on XP 64 into that tally.

Any PC gamer who's even slightly serious about their games already goes above and beyond these requirements.

We're finally starting to leave the old 360 & PS3 generation in the retirement home, where it should have been left back in 2010. I will say wholeheartedly, I can't wait to see developers fully utilize my quad-core Haswell and 290X. There are recently released games that only load 1 or 2 cores and don't even break the 32-bit (4GB) RAM barrier. That's not acceptable in 2014.
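Just to make the point concrete, here's a minimal C++ sketch of what "load all the cores" means: a toy worker-per-core job split. The workload and every name in it are made up for illustration; this isn't any particular engine's code.

#include <algorithm>
#include <atomic>
#include <cstdio>
#include <thread>
#include <vector>

// Toy job split: give every hardware thread the OS reports a slice of
// the work, instead of pinning everything to 1 or 2 cores.
int main() {
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    const long long items = 100000000; // pretend workload, arbitrary size
    std::atomic<long long> checksum{0};

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&, w] {
            long long local = 0;
            // Each worker sums an interleaved slice of the items.
            for (long long i = w; i < items; i += workers)
                local += i % 7;
            checksum += local; // atomic, so workers can't trample each other
        });
    }
    for (auto& t : pool) t.join();
    std::printf("%u workers, checksum %lld\n", workers, checksum.load());
}

On a quad core this spins up 4 workers (8 with HyperThreading) instead of leaving three-quarters of the chip idle, which is the whole complaint in a nutshell.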

We're going to see a rapid jump in system requirements over the next year before they level off again, once every ounce has been scraped from the PS4.

If you're one of that minority still clinging to that XP32 and Core 2 machine, you need to either shut your pie hole or purchase a new setup.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I agree 200%. Eventually the same people will be moaning that their old 4770s aren't sufficient and that you need the future 4930K equivalent, which gamers who actually game would have long since upgraded to. That part won't change.

Developers should drop the last-gen consoles. If they develop for those old boxes of junk plus next gen (where they don't have as much experience) plus PC, the game ends up cross-gen, and I doubt it will really push anything unless it's a sloppy port. Hopefully Watch Dogs doesn't go down this path. It's about time my 4770 starts to squeal and 8GB of RAM gets swallowed. And you forgot Shadow of Mordor:

Recommended:
OS: 64-bit: Win 7, Win 8
Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 670 | AMD Radeon HD 7970
DirectX: Version 11
Network: Broadband Internet connection
Hard Drive: 40 GB available space

What I'm really looking forward to is The Witcher 3. That should REALLY shove requirements forward, assuming they don't dumb it down. The developers did say they want it to look awesome on next gen and PC, but PC is leaps and bounds ahead. And 32-bit? Eh, let it die.
 

Lil Frier

Platinum Member
Oct 3, 2013
2,720
21
81
I'm curious to see where this goes myself. I bought a 4670K during the winter (late December/early January), but I've yet to find myself wanting to play any games on my PC--I elected to get Watch Dogs on the One instead, since I don't have anyone to play with on PC but have friends/family who might get the game on The Bone. I'm also allegedly going to get my dad's second 290X when he's done mining, so I should have a solid CPU+GPU combo for the next few years (though I need to get an SSD sometime to improve load times).

What I'm curious about is how quickly these requirements rise and for how long. As you say, we're past the crap hardware from the previous gen, for the most part, but the new consoles (particularly the One) aren't powerhouses of any sort. Developers will have to decide how much they care to utilize PC hardware to the fullest when doing so means extra work to separate that part of development from the console versions.
 

Lil Frier

Platinum Member
Oct 3, 2013
2,720
21
81
I agree 200%. Eventually the same people will be moaning that their old 4770s aren't sufficient and that you need the future 4930K equivalent, which gamers who actually game would have long since upgraded to. That part won't change.

Developers should drop the last-gen consoles. If they develop for those old boxes of junk plus next gen (where they don't have as much experience) plus PC, the game ends up cross-gen, and I doubt it will really push anything unless it's a sloppy port. Hopefully Watch Dogs doesn't go down this path. It's about time my 4770 starts to squeal and 8GB of RAM gets swallowed. And you forgot Shadow of Mordor:

Recommended:
OS: 64-bit: Win 7, Win 8
Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 670 | AMD Radeon HD 7970
DirectX: Version 11
Network: Broadband Internet connection
Hard Drive: 40 GB available space

What I'm really looking forward to is The Witcher 3. That should REALLY shove requirements forward, assuming they don't dumb it down. The developers did say they want it to look awesome on next gen and PC, but PC is leaps and bounds ahead. And 32-bit? Eh, let it die.

I'd love it if requirements were explained better. Is the 3770 recommended because of the clock speed or the HyperThreading? Would an i5-3570K do better, worse, or the same? If it's worse, how high does it need to be overclocked to match the non-K i7 from the same generation? How does something from Haswell compare?

There's a part of me that loves consoles because I don't have to figure these things out. I don't have to compare parts or read benchmarks. What I see is what I get, and that's that.
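(For what it's worth, part of the answer is visible from software. Here's a rough Windows-only C++ sketch, using the stock GetLogicalProcessorInformation API, that counts physical cores vs. logical processors; HyperThreading is exactly the gap between the two numbers. Purely illustrative, not something out of any game.)

#include <windows.h>
#include <cstdio>
#include <vector>

int main() {
    // First call intentionally fails and reports the buffer size needed.
    DWORD bytes = 0;
    GetLogicalProcessorInformation(nullptr, &bytes);
    std::vector<SYSTEM_LOGICAL_PROCESSOR_INFORMATION> info(
        bytes / sizeof(SYSTEM_LOGICAL_PROCESSOR_INFORMATION));
    GetLogicalProcessorInformation(info.data(), &bytes);

    int cores = 0, logical = 0;
    for (const auto& entry : info) {
        if (entry.Relationship == RelationProcessorCore) {
            ++cores;
            // Each set bit in the mask is one logical processor on this core.
            for (ULONG_PTR m = entry.ProcessorMask; m != 0; m &= m - 1)
                ++logical;
        }
    }
    std::printf("%d physical cores, %d logical processors\n", cores, logical);
    return 0;
}

An i7-3770 prints 4 and 8; an i5-3570K prints 4 and 4. Whether a given game actually cares about the second number is exactly the thing publishers never tell us.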
 

sze5003

Lifer
Aug 18, 2012
14,195
634
126
It will be a while before I upgrade anyway. By that time the new 20nm cards will be out, and all I will need is a new processor and GPU.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
So how much RAM will games be able to use?

12GB?

More?

Well, a 64-bit engine should be able to use anything above 3.2GB. You have to remember OS limitations, though: Windows 7 Home Premium is capped at 16GB, for example. Exceeding 16GB means you need Win 7 Pro.
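If you want to see the 32-bit barrier for yourself, here's a trivial C++ sketch (block sizes picked arbitrarily for the demo): built as a 32-bit binary it throws std::bad_alloc partway through, while the same code built as 64-bit sails past 4GB, provided the machine and OS edition actually back it with RAM.

#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::printf("this is a %zu-bit process\n", sizeof(void*) * 8);

    // Allocate six 1GB blocks. A 32-bit process exhausts its address
    // space (~2-3GB usable) partway through; a 64-bit process is
    // limited only by the OS edition and the installed RAM.
    std::vector<std::vector<char>> blocks;
    for (int i = 0; i < 6; ++i) {
        blocks.emplace_back(std::size_t(1) << 30); // 1GB, zero-filled
        std::printf("allocated %d GB so far\n", i + 1);
    }
    return 0;
}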
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
Well, a 64-bit engine should be able to use anything above 3.2GB. You have to remember OS limitations, though: Windows 7 Home Premium is capped at 16GB, for example. Exceeding 16GB means you need Win 7 Pro.

Yes, I know how much RAM a 64-bit computer can actually use and support.

What I'm asking is how much RAM a next-generation game will actually support in software.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
I believe the new consoles are limited to about 5GB for games, including VRAM (the OS reserves the rest of the 8GB). So, after the video-memory share is taken out, games will probably use 3-4GB of system RAM.
 

Majcric

Golden Member
May 3, 2011
1,386
48
91
OP, I welcome the change and am ready to start upgrading. Hell, it's half of the fun.
 
Aug 11, 2008
10,451
642
126
I'd love it if requirements were explained better. Is the 3770 recommended because of the clock speed or the HyperThreading? Would an i5-3570K do better, worse, or the same? If it's worse, how high does it need to be overclocked to match the non-K i7 from the same generation? How does something from Haswell compare?

There's a part of me that loves consoles because I don't have to figure these things out. I don't have to compare parts or read benchmarks. What I see is what I get, and that's that.

This. So a requirement for a 3770K, does that mean a 4670K doesn't meet the recommended requirements because it lacks HyperThreading? If that's true, and you need an 8-core AMD or a new HyperThreaded Intel quad, then probably only 20 percent or so of systems, even newly built ones, will meet the recommended requirements. Somehow I don't see this as a good thing. I am not against getting rid of XP and 32-bit OSes, and maybe even dual cores, but this seems a bit much. In contrast, look at the Frostbite 3 engine, which gives outstanding visuals but so far at least seems to scale very well from lower-end to high-end systems.
 

ImpulsE69

Lifer
Jan 8, 2010
14,946
1,077
126
Keep in mind there is a big difference between high requirements due to pushing technology and high requirements due to being too lazy to optimize, and we know it's the latter most of the time, not the former. The whole point of PCs is to see how far you can take your current hardware, not buying new hardware because the coding is horrendous and the only way past it is sheer horsepower.
 

futurefields

Diamond Member
Jun 2, 2012
6,470
32
91
Keep in mind there is a big difference between high requirements due to pushing technology and high requirements due to being too lazy to optimize, and we know it's the latter most of the time, not the former. The whole point of PCs is to see how far you can take your current hardware, not buying new hardware because the coding is horrendous and the only way past it is sheer horsepower.

This.

I care more about well-optimized software than about just ramping up system requirements to deal with poorly coded games.

We haven't yet seen how PS4/XB1-only games translate to PC. Correct me if I'm wrong, but all the multiplatform games so far have targeted the PS3 and Xbox 360 as well, which means the lowest-common-denominator effect is still in play. That's why I'm curious how Arkham Knight will run on PC, as it is PS4/XB1/PC only.
 

Lil Frier

Platinum Member
Oct 3, 2013
2,720
21
81
This. So a requirement for a 3770K, does that mean a 4670K doesn't meet the recommended requirements because it lacks HyperThreading? If that's true, and you need an 8-core AMD or a new HyperThreaded Intel quad, then probably only 20 percent or so of systems, even newly built ones, will meet the recommended requirements. Somehow I don't see this as a good thing. I am not against getting rid of XP and 32-bit OSes, and maybe even dual cores, but this seems a bit much. In contrast, look at the Frostbite 3 engine, which gives outstanding visuals but so far at least seems to scale very well from lower-end to high-end systems.

Thing is, they're asking for the non-K i7-3770. I don't know how a Haswell i5 (or even an i3) would stack up against the Ivy i7. Are they asking for Ivy for the IPC gain over Sandy? Are they asking for the i7 because they need the threads? If it were about four cores at 3.4GHz, you'd think they would ask for the i5-3570K, which says to me that it uses/needs HyperThreading, but I've yet to see anything seriously use that on the gaming front.
 

mizzou

Diamond Member
Jan 2, 2008
9,734
54
91
The PC has been in a rut when it comes to computing performance gains. You have to spend top dollar to get anything that will knock the lights out in games.

Remember the days when you upgraded your CPU/video card and could play all the current games, and even some future games, at "Ultra"-esque settings and the highest resolution without even a hiccup?

Nowadays, you buy the midrange card and maybe you will play last year's games at Ultra levels.

:(
 

ShreddedWheat

Senior member
Apr 3, 2006
386
0
0
I'm sorry, but I can't see the difference between my 2500K OC'd to 4.3GHz and a 4770 OC'd to the same as amounting to a significant difference in processor power. I doubt many people will upgrade for an insignificant CPU power increase. I understand and agree with your argument on 64-bit OSes and GPU power completely.

Just my 2 cents.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
The problem is defining what's genuinely required to play and enjoy the game vs. what's theoretically "required" simply to stuff everything on Ultra at 2560x1440 @ 120fps for e-peen contests, which just loads the CPU with optional extra shadows, etc., that are often barely perceptible in static side-by-side screenshot comparisons, let alone during gameplay. In fact, some games actually manage to look worse at higher settings. E.g., Skyrim High vs. Ultra vs. Max comes to mind, with the grass/ground looking sharper on High @ 81fps than on Max @ 44fps (and far worse on Ultra than on High):
http://www.techspot.com/articles-info/467/images/High_02.jpg
http://www.techspot.com/articles-info/467/images/Ultra_02.jpg
http://www.techspot.com/articles-info/467/images/Max_02.jpg


Thief recently "recommended" an i7, yet it runs just fine on i3s, with barely a 10% difference vs. an i7, and in that case "it needs an i7" meant an i7-920 @ 2.66GHz. CoD Ghosts "required" 6GB of RAM until it didn't. Some games will inevitably be heavier than others (Watch Dogs), but if the core game engine is too heavy (the baseline engine, not just "extra shadows for Ultra", etc.), how is it going to work at all on the PS4/XB1, whose 8 tablet cores (only 6 of which are usable by games) barely match an i3-4340 even in perfectly threaded workloads?

In other words, no matter what optional extra eye candy you load the CPU down with on Ultra for PCs, the core game still has to run at typical console-equivalent "Medium" (sometimes even Low) settings in order to, well, run on consoles. I can see 8GB of RAM + 64 bits (and, as always, the GPU) making far more of a difference than the CPU alone. If you have the extra horsepower, you can turn things up. If not, you don't HAVE to run everything on High/Ultra to enjoy the game. In half the games I own I can't reliably tell High from Ultra during actual gameplay, and my i5-3570 @ 4.2GHz isn't even an unlocked K chip and isn't close to being maxed in any game. It's still the graphics card that counts, even for "next gen" games.

Keep in mind there is a big difference between high requirements due to pushing technology and high requirements due to being too lazy to optimize, and we know it's the latter most of the time, not the former. The whole point of PCs is to see how far you can take your current hardware, not buying new hardware because the coding is horrendous and the only way past it is sheer horsepower.

^ This. PC gaming needs a developer attitude upgrade more than anything else.

Edit: And that's not just the big-name developers. As much as I love indies, they have their fair share of problems too. "Sir, You Are Being Hunted" somehow manages to get lower fps than Crysis with visuals worse than Morrowind or late-1990s/early-2000s FPSes. And it "recommends" 8GB of RAM too, for a game which, if written in an older engine better suited to its graphics like LithTech 2.0 or Unreal 1.0-2.0, would consume maybe 300-750MB of RAM. No amount of throwing hardware at it will solve discrepancies like that.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is a wide gulf between the recommended specs and the minimum specs. Most of the minimum specs are lower than the current consoles', and the recommended specs are quite a bit higher. It's not like anyone needs to buy an i7 for games; most games still don't show much improvement going from a dual core to a quad core.

When it comes to DX10 and 64-bit, these things have been a long time coming. We have had 64-bit operating systems for 10 years; if you haven't made the upgrade to 64-bit in a decade, well, that is lunacy. DX10 isn't much younger either. These really aren't unreasonable things for games to require today. You just can't moan about this; it's been obvious for the last couple of years that these requirements were going to start appearing.
 

mikeymikec

Lifer
May 19, 2011
18,403
11,020
136
But it seems like a fairly vocal group, which I hope is a minority, honestly thought they'd be able to game on their 32-bit Windows XP, Core 2 Duo machines indefinitely.

The Amiga market had a similar problem back in its day; people were chuffed to bits with the idea of a cheap computer that could play high-end games (compared to anything else on the market) and couldn't see why it couldn't always be that way. A basic half-meg RAM upgrade was the most popular upgrade for the Amiga 500, and games were designed to take advantage of it, but very few games designed for high-end Amigas were financially successful.

The problem affected Commodore as well. If the community had been generally more enthusiastic about upgrading, more R&D would have gone into newer chipsets. The AGA chipset (a half-arsed job compared to what was originally planned) needed to be revolutionary rather than evolutionary, and by the time it arrived it was too late IMO (that and a bunch of other problems).

The usual business notion applies, "if we build it, they will come", but these days developing a game is an expensive business and therefore a much greater gamble.

The PC gaming market is suffering from a serious lack of creativity from what I can see. Games like "Broken Age" might help break it out. We don't need wave after wave of FPSes, just as people are starting to tire of wave after wave of comic-book film reboots that are being done purely so studios can hold onto the rights to certain film names and make a pile of cash in the process.

Back to the topic: just as the AGA chipset needed to be revolutionary, revolutionary demos and games would have been needed to market it effectively, just as with the A500 back in its day. I don't think wave after wave of FPSes is going to make me think "ooh, I need this new game". It isn't necessarily about visuals, although those are the simplest way to make people think "ooh!".

Furthermore, a revolutionary game or application that catches the public's attention could make Intel/AMD a lot more interested in desktop hardware again.
 
Last edited:

Majcric

Golden Member
May 3, 2011
1,386
48
91
The difference is the 4770 is about twice as powerful as your 2500k.


i7s are starting to show a decent margin in performance over the i5 in select games (mostly in SLI/Crossfire setups): Crysis 3, Battlefield 4, etc. But 50% (talking gaming)? Are you serious?
 
Last edited: