Ultra HD Gaming! What Can PC Learn From PS4 Pro? [Digital Foundry]

tential

Diamond Member
May 13, 2008
I would have agreed with him, until I turned on 4K on YouTube and was able to CLEARLY see the difference on my 4K screen between 1800p and native 4K in his own video. I'm not even wearing my glasses. Sure, if you can't run native 4K, that sucks, but definitely try to run native 4K.

Edit: Funny that he mentions Battlefield 1 getting 60 fps at 4K with an "80% resolution scale". I'd hope to scale resolution above 100% on a 4K monitor in a game like Battlefield.
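For reference, here's a rough Python sketch of the arithmetic behind a resolution-scale slider, assuming the percentage applies per axis (which is how I understand Battlefield 1's slider to work); the helper names are made up for illustration:

[CODE]
# Sketch: map a per-axis resolution-scale percentage to an internal render
# resolution and compare its pixel count against native 4K (3840x2160).
# Assumption: the slider scales each axis, not the total pixel count.

def internal_resolution(output_w, output_h, scale_percent):
    """Internal render resolution for a given per-axis scale."""
    factor = scale_percent / 100.0
    return round(output_w * factor), round(output_h * factor)

def pixel_ratio(w, h, ref_w=3840, ref_h=2160):
    """Pixel count relative to native 4K."""
    return (w * h) / (ref_w * ref_h)

for scale in (80, 100, 150):
    w, h = internal_resolution(3840, 2160, scale)
    print(f"{scale:>3}% scale -> {w}x{h} ({pixel_ratio(w, h):.0%} of native 4K pixels)")

print(f"1800p (3200x1800) -> {pixel_ratio(3200, 1800):.0%} of native 4K pixels")
[/CODE]

At an 80% per-axis scale the game shades roughly 64% of the pixels of native 4K, which is in the same ballpark as 1800p (about 69%), while a 150% slider would be supersampling at 225% of native.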
 

Pottuvoi

Senior member
Apr 16, 2012
tential said:
I would have agreed with him, until I turned on 4K on YouTube and was able to CLEARLY see the difference on my 4K screen between 1800p and native 4K in his own video. I'm not even wearing my glasses. Sure, if you can't run native 4K, that sucks, but definitely try to run native 4K.

Edit: Funny that he mentions Battlefield 1 getting 60 fps at 4K with an "80% resolution scale". I'd hope to scale resolution above 100% on a 4K monitor in a game like Battlefield.
Battlefield is quite a demanding game, so the 1800c rendering resolution was chosen to get a more stable 60 fps.

It's good to remember that checkerboarding and other temporally enhanced or otherwise resolution-independent shading methods are still fairly early.
We should see a lot of improvement over what we have seen so far.
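To illustrate the basic idea of checkerboarding, here's a toy Python sketch. Real implementations reproject the previous frame with motion vectors and use depth/ID buffers to reject stale samples, so treat this purely as a picture of the sampling pattern, not any particular engine's method:

[CODE]
# Toy checkerboard rendering sketch: each frame shades only half of the
# pixels in an alternating checker pattern, and the missing half is filled
# from the previous frame. Real engines reproject with motion vectors; this
# naive version just copies the old pixels straight across.

import numpy as np

def checker_mask(h, w, phase):
    """Boolean mask of the pixels shaded this frame (half of the grid)."""
    ys, xs = np.indices((h, w))
    return (xs + ys) % 2 == phase

def naive_resolve(current, previous, phase):
    """Combine this frame's shaded half with last frame's complementary half."""
    mask = checker_mask(*current.shape, phase)
    return np.where(mask, current, previous)

# Per-frame shading cost on an 1800-line grid:
h, w = 1800, 3200
shaded = checker_mask(h, w, 0).sum()
print(f"pixels shaded per frame: {shaded:,} ({shaded / (h * w):.0%} of the full 1800p grid)")

# Tiny 4x4 demo: 1s are newly shaded pixels, 0s are carried over from last frame.
print(naive_resolve(np.ones((4, 4)), np.zeros((4, 4)), phase=0))
[/CODE]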
 

dogen1

Senior member
Oct 14, 2014
Pottuvoi said:
Battlefield is quite a demanding game, so the 1800c rendering resolution was chosen to get a more stable 60 fps.

It's good to remember that checkerboarding and other temporally enhanced or otherwise resolution-independent shading methods are still fairly early.
We should see a lot of improvement over what we have seen so far.

Battlefield doesn't let you use any checkerboard-type rendering on PC, so it was 1800p. It might be 1800c on console, though; I'm not sure of the exact measurement.

But yeah, there's a lot of refinement going into these things. You can see it in games like Horizon, Ratchet & Clank, and IIRC Infinite Warfare. Those games look very near 4K, with minimal artifacts.
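Some back-of-the-envelope numbers for the 1800p vs 1800c distinction, assuming "1800p" means a full 3200x1800 grid and "1800c" shades half of that grid per frame via checkerboarding (that assumption is mine; adjust if the console does something different):

[CODE]
# Per-frame shaded pixel counts under the assumption that "1800p" means a
# full 3200x1800 grid and "1800c" shades half of that grid each frame.

modes = {
    "1080p":     1920 * 1080,
    "1800c":     3200 * 1800 // 2,   # checkerboard: half the grid per frame
    "1800p":     3200 * 1800,
    "native 4K": 3840 * 2160,
}

native = modes["native 4K"]
for name, pixels in modes.items():
    print(f"{name:>10}: {pixels:>9,} pixels/frame ({pixels / native:.0%} of native 4K)")
[/CODE]

By that measure 1800p on PC is roughly twice the per-frame shading work of 1800c on console, which is why the two shouldn't be lumped together.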
 

tential

Diamond Member
May 13, 2008
dogen1 said:
Battlefield doesn't let you use any checkerboard-type rendering on PC, so it was 1800p. It might be 1800c on console, though; I'm not sure of the exact measurement.

But yeah, there's a lot of refinement going into these things. You can see it in games like Horizon, Ratchet & Clank, and IIRC Infinite Warfare. Those games look very near 4K, with minimal artifacts.

My comment had nothing to do with that; it was just about how easy Battlefield 1 is to run, meaning faster GPUs could push above 4K resolution, which is what I care about. It was just a side comment in an edit, not meant to be the focus. My only interest in the game is to pick it up and see it on my single/dual Vega setup with the resolution slider above 100%.

Although I think not running at native 4K is a MASSIVE compromise after watching that video, it's still nice to see that "weak" GPUs can handle 4K with settings turned down, or by selecting a resolution that still looks nice compared to plain 1080p, while giving the gamer the option to grow into a 4K monitor with faster GPUs.

If a gamer picked up an RX 480 plus my monitor and their plan was to play at sub-4K resolutions until truly affordable 4K GPUs arrive? Good for them. Smart move, rather than blindly going 1080p Ultra.

Ideally I'd have a setup like this:
https://www.youtube.com/watch?v=0Jvav9ue_IU

One can dream, though, right?

Edit: Why can't Google Fiber and a relatively high-end desktop stream 8K YouTube? Jesus Christ, I didn't even know 8K YouTube existed....
 

Guru

Senior member
May 5, 2017
GPUs should become more powerful so we can at least run 1440p native. What is the point of fake 4K if you are running at 30 fps like the PS4 Pro does?

That is a worse gameplay experience, and to me it even looks worse at 30 fps vs 60 fps. I can actually see more at 60 fps.

Thing is, the majority would have even lesser GPUs than a 1060 6GB or RX 480/580, and my 1060 6GB sure as hell can't hold 60 fps at my 1050p in all games. Some of it is my CPU, but even with higher-end CPUs, the benchmarks I've looked at show issues holding a constant 50 fps.

Sure, upscaling a game that runs at 80 fps on average makes sense and will improve quality, especially if, like me, you are stuck on a 1050p (or, more commonly, a 1080p) monitor. That said, even 1440p monitors are still quite expensive, at least where I'm from, so it's not fully worth upgrading when I can't even play at 60 fps at 1080p, let alone 1440p.

I feel like the better thing would be for 90 Hz, 120 Hz, or 144 Hz monitors to come down in price and for GPUs to become more capable. I think playing at 90/96 Hz or 120/122 Hz is a lot better for gaming than marginal increases in detail from upscaling, or even a higher native resolution.

To me, in gameplay I can't tell the difference between 1080p and 1440p. I haven't checked 1440p vs 4K, but if I stopped and looked at the same image on both screens I would probably notice a tiny difference; in terms of actual gameplay, though, I wouldn't be able to tell.

That said, I can easily notice the difference between 60 Hz and 120 Hz.

So to me, a better development would be 90/120/144 Hz monitors becoming cheaper and more available.
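As a rough illustration of that trade-off, here's a quick Python sketch of raw pixel throughput at a few resolution/refresh combinations; shading cost doesn't scale perfectly with pixel count, so treat these as ballpark figures only:

[CODE]
# Ballpark pixel throughput (pixels shaded per second) for a few
# resolution/refresh-rate targets, to compare where the GPU budget goes.

configs = [
    ("1080p @ 144 Hz", 1920, 1080, 144),
    ("1440p @ 90 Hz",  2560, 1440, 90),
    ("1440p @ 144 Hz", 2560, 1440, 144),
    ("4K @ 60 Hz",     3840, 2160, 60),
]

for name, w, h, hz in configs:
    print(f"{name:>15}: {w * h * hz / 1e6:,.0f} Mpix/s")
[/CODE]

By that crude measure, 1080p at 144 Hz (~299 Mpix/s) and 1440p at 90 Hz (~332 Mpix/s) both ask less of the GPU than 4K at 60 Hz (~498 Mpix/s), so chasing refresh rate instead of resolution isn't an unreasonable way to spend the performance.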

Also, I think 1080p with ALL settings maxed gives better graphical quality than upscaled 4K with a mix of ultra and medium settings, etc. Every time you lower the graphics settings you get significantly lower quality, even if textures are still set to max.

There are games where you can set most settings to low or medium with textures at max and the game looks almost exactly the same as everything maxed, but the large majority of games do look significantly worse at lower quality settings.
 