Let's talk about the incoming increase in system requirements


Super56K

Golden Member
Feb 27, 2004
They aren't raising it to what enthusiasts want. If they were, the system requirements would say 'an i7 3770, 16GB of RAM, and a GTX 770/780 or R9 280X/290X' and not 'an i7, 6GB, and a GTX 460'.

Take Galactic Civilizations 3, for example. They require a 64-bit OS and a DirectX 10-compatible GPU, both readily available since 2007; that's 7 years those parts have been commonly available. Realistically, nobody should have bought a 32-bit version of Windows 8 at all. Ironically, I've talked a few people out of just that at /r/buildapc. People trying to save 10 dollars, sheesh. :colbert:

Well, to make it clearer, when I'm mentioning AAA ports I'm definitely not talking about Galactic Civilizations 3. I have no sympathy for people moaning about a lack of 32-bit support or requiring DX10. It's a big oversight these days to purchase or build a system using a 32-bit OS.
 

Rhezuss

Diamond Member
Jan 31, 2006
I don't know about you, but I miss the days when my games lagged lol.
It was the best of times, when everything was possible and you could only get better.

Can't wait to feel that again!
 

monkeydelmagico

Diamond Member
Nov 16, 2011
I think that's kind of a simplistic way of looking at it, because the next-gen engines are all designed to be very scalable to hardware and to utilize each platform's strengths. Plus, I've noticed PC gamers typically want to max out a game's settings, and when their hardware isn't capable of doing so, the game is automatically labeled a bad port with shoddy optimization.

A perfect example is Watch Dogs.

The NeoGAF forum is rife with complaints about the CPU specs being too high, and lots of "gamers" are complaining that Ubisoft must have done a poor job on optimization.

While we won't know until next week whether Ubisoft has done a great or a poor job optimizing the game, by all accounts the game was always going to be very CPU intensive by virtue of its attributes.

The game is a massive open world and features a LOT of simulation, plus supposedly very sophisticated A.I., to the point where even the NPCs show a striking level of variety.

Whatever is found in the PS4 and Xbox One versions will also be found in the PC version, but potentially to a much greater degree and at higher resolution. For example, the PC version on ultra should offer more complex simulations, and you will be able to have more of them on screen at any one time. Draw distance and NPC count should also be much higher. All of this is what's driving the CPU specs, not just developers talking out of their ass.

So what I'm saying is that you can't state that high specs are automatically due to the game being a poor port. Some games are just going to be very hardware intensive due to their attributes.

But the advantage of the PC is that you can fiddle with the settings. If you want a console-like experience on PC, you can get one, especially if you're running older hardware. However, PC gamers always seem reluctant to lower their settings and instead opt to bash the game for being unoptimized.

The best example of that is AC IV. Although reviews and tests showed that the game could take advantage of quad-core CPUs, lots of PC gamers erroneously bashed it for being CPU bound because most of the processing was done on two threads.

Very well stated. :thumbsup:

Alot of "should" and possible "potential" described though. The scale-ability you are describing, in theory, sounds great. The end results have left something to be desired. Hence the general skepticism. Either A) The system recommended has been more powerful than actually needed or B) The results were not considered significant enough to warrant the extra $$$ spent.

I realize some of the problem may be our high expectations, but the developers need to be able to deliver a product that meets or exceeds them.
 

Carfax83

Diamond Member
Nov 1, 2010
Very well stated. :thumbsup:

Alot of "should" and possible "potential" described though. The scale-ability you are describing, in theory, sounds great. The end results have left something to be desired.

It's a new thing for developers, but this will be how they do business from now on because it makes sense, both fiscally and practically.

Developers will be using the PC as the lead platform for their multiplatform games and downscaling them appropriately for the consoles. The scalable nature of these new engines reflects this new philosophy. Why? Because it's easier to downscale assets than to upscale them.

Upscaling may have been the order of the day in the Xbox 360 and PS3 era because those platforms weren't very similar to the PC, and there was an order-of-magnitude difference in performance and capability between the "shitbox"/"pees3" and the PC. So it made sense for developers to make assets for the lowest common denominator first, rather than vice versa as they do now.

For Ubisoft, AC IV and Watch Dogs both had the PC as the lead platform. For DICE's BF4, the PC was the lead platform. For upcoming games, the PC will be the lead platform for Dragon Age Inquisition, Batman Arkham Knight, and the Witcher 3 :cool:

So the days of PC gamers being treated as second-class citizens should hopefully be behind us. Death to the shitbox and pees3!
 

Vdubchaos

Lifer
Nov 11, 2009
GPU hardware specs have quadrupled and then some over the past 4-5 years.

Games' video quality has improved slightly.

The games themselves are the same old same old, and they suck you dry of money with little BS extras that should be included in the game.

The gaming industry SUCKS and hardware is irrelevant.

sorry

I can no longer play newer games without feeling like a complete sucker / ripped off.

But I do play older games.
 

toughtrasher

Senior member
Mar 17, 2013
GPU hardware specs have quadrupled and then some over the past 4-5 years.

Games' video quality has improved slightly.

The games themselves are the same old same old, and they suck you dry of money with little BS extras that should be included in the game.

The gaming industry SUCKS and hardware is irrelevant.

sorry

I can no longer play newer games without feeling like a complete sucker / ripped off.

But I do play older games.

There have to be some modern games you enjoy, or have enjoyed in the past. I know a lot of them seem like ripoffs, but I've bought some awesome games that are actually worth it.
 

ImpulsE69

Lifer
Jan 8, 2010
Well, the easy way to look at it is to never pay full retail price for a game. I know that seems to piss some people off, but whatever. It used to be that there was a limited supply to go around, which could sometimes be used to justify prices. These days, with everyone more or less forcing digital and no resale, there is no shortage. The market dictates price. Just because you FEEL your item should cost X doesn't mean the consumer wants to spend X. Early adopters who throw money at everything deserve to get ripped off. Honestly? Who really believes it takes $100 million+ to make a video game? There is no reason it should, except for wasteful management. Say what you want about non-AAA titles, but they are still some of the best games, and they cost a fraction of the price to make.

This expectation that every game should be released at $59 regardless of... well, practically ANY REASONING? Blatant disregard and disrespect for the consumer.
 

Captante

Lifer
Oct 20, 2003
Considering that my 3rd backup rig (an X2 6400+ with a GTX 550 Ti) runs most newer games decently at reduced graphics settings, I seriously doubt anyone sporting a 4770K with a recent high-end GPU will be having problems any time soon.

The bottom line is that any game that runs badly on lower-end PCs isn't going to sell very well at all, since, relative to AT'ers, that's what most folks run.
 

escrow4

Diamond Member
Feb 4, 2013
I have a suspicion that Wolfenstein 2014 actually can utilise an i7 (scroll down):

http://gamegpu.ru/action-/-fps-/-tps/wolfenstein-the-new-order-test-gpu.html

at the "separate test scene" an i5 hits 50/57, while an i7 hits 55/59. Inconsequential that may be, but in a year or 2 I wouldn't be surprised if a modern i7 is a requirement. Not even overclocked, engines can use all eight threads. Plus with a system with 16GB of RAM, Wolf swallowed 4GB . . . . .
 

spat55

Senior member
Jul 2, 2013
Yes. An i7 is basically seen as an 8-core; the i5 is a 4-core.

The i7 has twice as many cores available. This will make a difference because next-gen games are going to rely on multi-threading, since the next-gen consoles have weak single-core performance. It barely made a difference last gen because games rarely used more than 2 CPU cores. If games only use 2 cores on the next-gen systems, that means 2 x 1.6GHz, which simply wouldn't be enough to drive next-gen game simulation. They have to use more cores.

But an i7 isn't a real 8-core, since it uses hyperthreading.
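
If you want to see that distinction on your own box, here's a minimal Python sketch of the idea (it assumes the third-party psutil package is installed; os.cpu_count() by itself only reports logical cores):

# Minimal sketch: physical cores vs. logical (hyperthreaded) cores.
# psutil is a third-party package ("pip install psutil").
import os
import psutil

logical = os.cpu_count()                    # e.g. 8 on a quad-core i7 with HT
physical = psutil.cpu_count(logical=False)  # e.g. 4 on that same i7

print("Logical cores: ", logical)
print("Physical cores:", physical)
print("Hyperthreading:", "yes" if physical and logical > physical else "no")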
 

BrightCandle

Diamond Member
Mar 15, 2007
People have been claiming we would need these super beefy desktop CPUs and high-end hardware for a really long time, and it just hasn't happened yet. I would say this is the first year where I would argue a quad core is beginning to be worth it. That isn't to say it's really necessary, because the gains are normally relatively minor, but for mid-range and above GPUs there are extra fps to be had in some games from a quad core.


I can't see there being a need for hyperthreading (which at absolute best gives 20% extra speed) any time soon, nor for 6 cores. There is a good reason for this. Contrary to how most people think game development works, developers write their games to run on hardware that is available, or soon will be. They have to do that, because otherwise they can't actually run the product they are building themselves, so they can't test it. So in practice they target machines that many gamers will have, and most gamers don't have 6/8 real cores, 4.4GHz quad cores, or even i7s. Instead, developers target more mainstream dual cores.
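
To put rough numbers on why those gains stay minor, here's a back-of-the-envelope Amdahl's-law sketch in Python. The 60% parallel fraction and the idea that an HT sibling is worth roughly 0.2 of a real core are purely illustrative assumptions on my part, not measurements from any actual game:

# Amdahl's law sketch with made-up numbers, not benchmarks.
# Assume 60% of a frame's CPU work parallelizes cleanly; the rest stays serial.
PARALLEL_FRACTION = 0.60

def speedup(effective_cores):
    serial = 1.0 - PARALLEL_FRACTION
    return 1.0 / (serial + PARALLEL_FRACTION / effective_cores)

# Treat each hyperthreaded sibling as worth ~0.2 of a real core
# (the "20% at best" figure above).
configs = {
    "2 cores":      2.0,
    "4 cores":      4.0,
    "4 cores + HT": 4.0 + 4 * 0.2,
    "6 cores":      6.0,
}

for name, cores in configs.items():
    print("%-12s -> %.2fx vs. a single core" % (name, speedup(cores)))

With those assumptions, going from 4 cores to 4 cores + HT only moves you from roughly 1.8x to 1.9x over a single core, which is why the extra threads barely show up in frame rates.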
 

norseamd

Lifer
Dec 13, 2013
People have been claiming we would need these super beefy desktop CPUs and high-end hardware for a really long time, and it just hasn't happened yet. I would say this is the first year where I would argue a quad core is beginning to be worth it. That isn't to say it's really necessary, because the gains are normally relatively minor, but for mid-range and above GPUs there are extra fps to be had in some games from a quad core.

You must be kidding.
 

Bateluer

Lifer
Jun 23, 2001
I read some posts from people looking at Watch Dogs screen caps allegedly from a 360 and PS3, and they're complaining it doesn't look 'next gen'. No shirt, Sherlocks. The 360 and PS3 are nearly a decade old. They've been heavily antiquated for years; all the games for them have looked like trash since 2010. What did you expect the 360 and PS3 to do?

The only reason Ubisoft is even making a version for the 7th-gen consoles is so they can sell you one copy for 60 dollars, then sell it to you again when you finally do buy an XB1 or PS4.