Let's talk about the incoming increase in system requirements


Bateluer

Lifer
Jun 23, 2001
27,730
8
0
The first Witcher was a touch over 50hrs, and it had 4 acts from what I remember, and they were rather large. The major blemish on 2 was that it ran for 35hrs and was too linear. That, and no sex cards. :awe:

But you did have almost hardcore sex scenes in TW2. :)

Discussion on reddit's PCgaming sub about this same issue. Wasn't started by me, but I've said much the same as I said here.
http://www.reddit.com/r/pcgaming/comments/261d9l/discussion_ridiculous_requirements/

There are a few people over there who were under the belief that their 4-year-old gaming machines would last indefinitely. Rude awakening time.
 

biostud

Lifer
Feb 27, 2003
19,938
7,041
136
What I hope they will use the hardware for:
-higher polygon count
-better surfaces
-better physics
-better AI

Hopefully CF/SLI will get even better support in upcoming titles. I'm planning on upgrading my motherboard/CPU/memory to Haswell-E when it's released, so I'm all for pushing the hardware.
 

norseamd

Lifer
Dec 13, 2013
13,990
180
106
physics and ai first

dx 11 already looks perfect for what we need

also larger gameworlds and more objects should also be first on their priority list
 

Super56K

Golden Member
Feb 27, 2004
1,390
0
0
I think part of the problem is that there's a very vocal minority of PC gamers that believe everyone else is also using powerful systems with expensive cpu and gpu combos, and those that have never kept up aren't real gamers. I partially blame it on the recent narrative that gaming on a pc means you're instantly in 1080/high/60fps+ territory for all games and if you're not you're doing something wrong.

For me, the great advantage of PC isn't superior hardware, it's hardware flexibility. Developers putting artificial restrictions on even launching a game (like 6+ gb ram) that could run it otherwise isn't acceptable. That's a perfectly justified outcry.
 

monkeydelmagico

Diamond Member
Nov 16, 2011
3,961
145
106
I'm used to seeing my computers get old, so I agree with the OP, as long as high specs aren't the result of a poor port.

Unfortunately this appears to be the case. Games that can run smoothly on PS4/One should simply require a GPU upgrade for increased PC resolutions. We know what the CPU specs are for the consoles, yet we are being told the "recommended" spec is an i7. Either the developers have failed to code/port for comparable resources or they are flat-out lying to drive up PC component sales.
 

VashHT

Diamond Member
Feb 1, 2007
3,353
1,434
136
I'm probably going to see how my i5 works before upgrading anything. I knew it wouldn't last forever, and I got a good 4 years out of it, which is better than most other processors I've used. I don't understand why people put so much stock in these recommended specs anyway; have they ever really been that accurate in the past?
 

sze5003

Lifer
Aug 18, 2012
14,319
682
126
I'm probably going to see how my i5 works before upgrading anything. I knew it wouldn't last forever, and I got a good 4 years out of it, which is better than most other processors I've used. I don't understand why people put so much stock in these recommended specs anyway; have they ever really been that accurate in the past?

That's what I'm doing too. I've had my i5 since 2011 and it's been great. Many times the recommended requirements don't hold true and the games work fine anyway. As new titles come out, we'll be able to see the differences soon.
 

davie jambo

Senior member
Feb 13, 2014
380
1
0
I think part of the problem is that there's a very vocal minority of PC gamers that believe everyone else is also using powerful systems with expensive cpu and gpu combos, and those that have never kept up aren't real gamers. I partially blame it on the recent narrative that gaming on a pc means you're instantly in 1080/high/60fps+ territory for all games and if you're not you're doing something wrong.

For me, the great advantage of PC isn't superior hardware, it's hardware flexibility. Developers putting artificial restrictions on even launching a game (like 6+ gb ram) that could run it otherwise isn't acceptable. That's a perfectly justified outcry.

1080p 60fps is so last year

It's all about 4k and 120 fps this year
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Unfortunately this appears to be the case. Games that can run smoothly on PS4/One should simply require a GPU upgrade for increased PC resolutions. We know what the CPU specs are for the consoles, yet we are being told the "recommended" spec is an i7. Either the developers have failed to code/port for comparable resources or they are flat-out lying to drive up PC component sales.

I think that's kind of a simplistic way of looking at it, because the next-gen engines are all designed to be very scalable to hardware and to utilize each platform's strengths. Plus, I've noticed PC gamers typically want to max out a game's settings, and when they aren't capable of doing so, the game is automatically labeled a bad port with shoddy optimization.

A perfect example is Watch Dogs.

The NeoGaf forum is rife with complaints that the CPU specs are too high, and lots of "gamers" are complaining that Ubisoft must have done a poor job on optimization.

While we won't know till next week whether Ubisoft has done a great or poor job at optimizing the game, by all accounts the game was always going to be very CPU intensive by virtue of its attributes.

The game is a massive open world and features a LOT of simulation, plus supposedly very sophisticated A.I., to the point where even the NPCs show a striking level of variety.

Whatever is found on the PS4 and Xbox One versions will also be found on the PC version, but to a potentially much greater degree and at higher resolution. For example, the PC version on ultra should offer more complex simulations, and you will be able to have more of them on screen at any one time. Draw distance and NPC count should also be much higher. All of this is what is driving the CPU specs, not just developers talking out of their ass.

So what I'm saying is that you can't state that high specs are automatically due to the game being a poor port. Some games are just going to be very hardware intensive due to their attributes.

But the advantage of PC is that you can fiddle with the settings. If you want a console-like experience on PC, then you can get one, especially if you're running older hardware. However, PC gamers always seem reluctant to lower their settings, and instead opt to bash the game for being unoptimized.

The best example of that is AC IV. Although reviews and tests showed that the game could take advantage of quad-core CPUs, lots of PC gamers erroneously bashed it for being CPU bound because most of the processing was done on two threads.
 

Sohaltang

Senior member
Apr 13, 2013
854
0
0
I think part of the problem is that there's a very vocal minority of PC gamers that believe everyone else is also using powerful systems with expensive cpu and gpu combos, and those that have never kept up aren't real gamers. I partially blame it on the recent narrative that gaming on a pc means you're instantly in 1080/high/60fps+ territory for all games and if you're not you're doing something wrong.

For me, the great advantage of PC isn't superior hardware, it's hardware flexibility. Developers putting artificial restrictions on even launching a game (like 6+ gb ram) that could run it otherwise isn't acceptable. That's a perfectly justified outcry.


If your goal is 1080p 60 fps or less, just buy a PS4 and be done with it.
 

sze5003

Lifer
Aug 18, 2012
14,319
682
126
If your goal is 1080p 60 fps or less, just buy a PS4 and be done with it.

But currently even the PS4 can't handle 1080p 60fps in all games. Watch Dogs runs at 900p on PS4. This may change later, as the consoles are still new.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
If your goal is 1080p 60 fps or less, just buy a PS4 and be done with it.
So you can game at 900p & 30fps?... The biggest advantage of PCs is not the resolution & performance, but the mods for various games and the ergonomics (keyboard + mouse). Something consoles will never have.
 

Super56K

Golden Member
Feb 27, 2004
1,390
0
0
If your goal is 1080p 60 fps or less, just buy a PS4 and be done with it.

You're echoing the attitude of that vocal minority I'm talking about that's doing more harm than good. There's way more to PC gaming than obsessing over graphics, resolution and FPS.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I think part of the problem is that there's a very vocal minority of PC gamers that believe everyone else is also using powerful systems with expensive cpu and gpu combos, and those that have never kept up aren't real gamers. I partially blame it on the recent narrative that gaming on a pc means you're instantly in 1080/high/60fps+ territory for all games and if you're not you're doing something wrong.

You can get 1080p60 with an i3 and a ~150 dollar GPU. This isn't a hard goal to reach.



1080p 60fps is so last year

It's all about 4k and 120 fps this year

And those console games are going to look really bad at 720p30 against the upcoming 4K displays.


If your goal is 1080p 60 fps or less, just buy a PS4 and be done with it.

The PS4 can't do 1080p60 either; the best it's done so far is 1080p30. And we'll see a slow, steady progression downwards as developers 'optimize' for it by lowering resolutions and frame targets.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The PC gives you the option to go well beyond 1080p@60 with medium or high settings, but it's not necessary to do so. I like higher refresh rates and FPS than most, so I get two cards to give me twice the frame rate without having to turn down the settings too much. This year I'll probably go to 1440p@144, and that will require even more hardware to do well in games, but I want the higher pixel density as well. Some gamers are going for 2160p@60 to get even higher-density images. But to play a PC game you don't need any of that; you can likely exceed a console's level of graphics with a simple 7870 and a mid-range i5. A machine comparable to the consoles can be built for around $600, and you can probably play a game on a lot less.

It's not uncommon to find most games playable on Intel's integrated graphics on low settings at 1080p, because this is the lowest obvious point for game companies to aim at. It's not the best graphics, obviously, but it will work for most games.

The requirements for games have been rising steadily; historically they always have. Nothing has really changed, other than that a bundle of games that were mostly console-focused and limited by the console hardware are now getting a snapshot upgrade. In a couple of years they will be back down to being playable on low-end cards, and in 4 years they will be playable at crazy resolutions on mid-range hardware. That is just what the consoles and their games do: they stagnate graphically. But the PC-targeted games continue to progress, and there are a lot of PC-based games.
 

Super56K

Golden Member
Feb 27, 2004
1,390
0
0
You can get 1080p60 with an i3 and a ~150 dollar GPU. This isn't a hard goal to reach.

Those obsessing over resolution and FPS make the erroneous assumption that everyone who plays games on PC is also a hardware enthusiast. A 2-4GB, i3-dual-core-equivalent, sub-$150-GPU system describes a huge portion of PC gamers.

If game developers raised the baseline requirements to the levels hardware enthusiasts wanted, there would barely be a PC gaming industry. The outcry is over artificial limitations developers are tacking onto AAA ports. It's developers doing the usual: shortchanging PC gamers on ports.

I know that a 4GB RAM + i5 4430 + 7870XT setup could play any one of those games that 'need' 6GB+ RAM just fine. 4K displays, high-end GPUs, unlocked i5s and i7s, whatever: that's all niche. That's hardware enthusiasts.
 

SlitheryDee

Lifer
Feb 2, 2005
17,252
19
81
The Watch Dogs requirements are certain to be tested extensively by reviewers after the game comes out. I'm very interested in those results. In my experience, my modestly overclocked 2500K has been overkill for every game I've thrown at it. I have serious doubts about the necessity of an i7, but we'll find out soon enough.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't think 8GB of RAM is that unreasonable; we have, after all, had 64-bit operating systems for nearly a decade at this point, and 8GB has been a normal amount of RAM for 3 years or so. There are a lot of XP holdouts who didn't buy lots of RAM, but those that upgraded to Vista and Windows 7 should already have 8GB of RAM.
 

Sonikku

Lifer
Jun 23, 2005
15,901
4,927
136
Well crap, guess my X800XL isn't going to cut the mustard anymore. Damn you bastards that told me it was good!!!
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
If game developers raised the baseline requirements to the levels hardware enthusiasts wanted, there would barely be a PC gaming industry. The outcry is over artificial limitations developers are tacking onto AAA ports. It's developers doing the usual: shortchanging PC gamers on ports.

They aren't raising it to what enthusiasts want. If they had, the system requirements would say 'i7 3770, 16GB of RAM, and a GTX 770/780 or R9 280X/290X', not 'i7, 6GB, and GTX 460'.

Take Galactic Civilizations 3, for example. It requires a 64-bit OS and a DirectX 10 compatible GPU, both readily available since 2007; that's 7 years those parts have been commonly available. Realistically, nobody should have bought a 32-bit version of Windows 8 at all. Ironically, I've talked a few people out of just that at /r/buildapc. People trying to save 10 dollars, sheesh. :colbert:


The Watch Dogs requirements are certain to be tested extensively by reviewers after the game comes out. I'm very interested in those results. In my experience, my modestly overclocked 2500K has been overkill for every game I've thrown at it. I have serious doubts about the necessity of an i7, but we'll find out soon enough.

As has been said multiple times, a first-generation i7 performs like a modern i3 does today. Not exactly the greatest wording from those developers, though.

I'm pretty sure I'm not the only person who will be reading the reviews and benchmarks for several of these high profile games.


I don't think 8GB of RAM is that unreasonable; we have, after all, had 64-bit operating systems for nearly a decade at this point, and 8GB has been a normal amount of RAM for 3 years or so. There are a lot of XP holdouts who didn't buy lots of RAM, but those that upgraded to Vista and Windows 7 should already have 8GB of RAM.

Exactly. I said nearly the same thing a paragraph up. I've also been trying to talk people at /r/buildapc into going with 16GB, budget permitting, for exactly this reason. Even if they have to go with just 8, I try to keep them on a mainboard or setup where they can easily drop in another DIMM.
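For anyone taking that advice, it's worth confirming what's actually installed first. Here's a minimal sketch (Python; it assumes the Linux /proc/meminfo format, and the sample data below is made up):

```python
# Minimal sketch: read total installed RAM from /proc/meminfo-style text.
# Assumes the Linux "MemTotal: <size> kB" format; the sample data is hypothetical.

def total_ram_gb(meminfo_text):
    """Return total RAM in GB from /proc/meminfo-style text, or None if absent."""
    for line in meminfo_text.splitlines():
        if line.startswith("MemTotal:"):
            kb = int(line.split()[1])  # second field is the size in kB
            return kb / (1024 ** 2)    # kB -> GB
    return None

# Hypothetical 8 GB machine (8 * 1024 * 1024 kB):
sample = "MemTotal:        8388608 kB\nMemFree:         1024000 kB"
print(total_ram_gb(sample))  # → 8.0

# On a real Linux box, pass open("/proc/meminfo").read() instead of the sample.
```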
 

Super56K

Golden Member
Feb 27, 2004
1,390
0
0
I don't think 8GB of RAM is that unreasonable; we have, after all, had 64-bit operating systems for nearly a decade at this point, and 8GB has been a normal amount of RAM for 3 years or so. There are a lot of XP holdouts who didn't buy lots of RAM, but those that upgraded to Vista and Windows 7 should already have 8GB of RAM.

The Steam hardware survey shows about 20% holding out on 32-bit operating systems, and it might be lower if I'm not reading some of it right. That sounds about right at this point, even if it's 5% lower. No sympathy for 32-bit XP holdouts.

For memory, 6GB+ is ~48% and 1-4GB is ~50%. Singling out 4GB and 8GB shows that more people run 8 than 4 (27% vs 21%), but that still doesn't mean it's OK for developers to artificially cut out a big chunk of gamers because they don't want to optimize for it.
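To put those quoted shares in concrete terms, here's a tiny sketch (Python; the percentages are the rounded figures quoted above, illustrative rather than exact survey data):

```python
# Rounded RAM shares quoted from the Steam hardware survey above (illustrative only).
ram_shares = {
    "1-4 GB": 0.50,  # falls below a hard 6 GB launch requirement
    "6+ GB": 0.48,   # meets a hard 6 GB launch requirement
}

# A hard 6 GB minimum locks out roughly the entire 1-4 GB group:
excluded = ram_shares["1-4 GB"]
print(f"~{excluded:.0%} of surveyed machines excluded")  # → ~50% of surveyed machines excluded
```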