Panino Manino
"They gave more data."
I'm not able to watch it right now. Could you summarize?
Just to "confirm what we already knew and others are testifying".
"I'm not able to watch it right now. Could you summarize?"
From a very brief skim through the video, it seems they tried to match "realistic" configurations (e.g. a mid-range CPU with a mid-range GPU) and showed that there is still an issue for Nvidia with those configurations. The name of the video gave me the impression that they'd uncovered some new or important details.
"From a very brief skim through the video, it seems they tried to match 'realistic' configurations ..."
I guess then it's kind of what was concluded here, but it's good to have testing and verification of this.
"... the type of people who can afford to buy a high quality 4K display are the same people who can afford a high-end GPU to render a native-resolution image on it."
This hypothesis neglects that many of us use a PC instead of a console to couch game on inexpensive 4K TVs.
"This hypothesis neglects that many of us use a PC instead of a console to couch game on inexpensive 4K TVs."
Upscale 1080p to 4K then. It's basically what the consoles do. However, even if you have a less powerful GPU, the inexpensive 4K TV likely caps out at a 60 Hz refresh rate, which a mid-range GPU can actually hit in a number of titles.
"I'm not able to watch it right now. Could you summarize?"
More data, and even more shocking results. With my CPU, in some tests even an old RX 580 was keeping up with the GeForce product line.
"If you look at the TPU reviews for the 3060 ... they still average right around 50 FPS at 4K across the 23 games that they benchmark ..."
For me this would not be true. All of the games I play on the couch are controller / reaction-time games, and 60 fps is required. The TV is only capable of 60 Hz with black frame insertion.
"Upscale 1080p to 4K then. It's basically what the consoles do. ..."
If the 3080 cannot even maintain 60 fps minimums in some games at 1080p medium, it does not matter how GPU-bound you make the settings; you are not going to get smooth 60+ fps gameplay.
If you look at the TPU reviews for the 3060, which has an MSRP of $330 (yeah, I know that's a pointless number right now), it still averages right around 50 FPS at 4K across the 23 games that they benchmark, and only a small handful of titles drop below 30 FPS, which could probably be alleviated by turning down the settings.
Unless you have a newer Nvidia GPU (or a newer AMD GPU that gets support for their DLSS equivalent), you wouldn't get that capability anyways. Even the bargain-basement 4K televisions cost around the same price as a mid-range GPU, or theoretically would, if miners weren't paying two or even three times MSRP for practically any card.
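To make the kind of review summary quoted above concrete, here's a minimal sketch of how an average like TPU's could be derived from per-game results. The titles and FPS numbers below are hypothetical placeholders, not TPU's actual 23-game data:

```python
from statistics import mean

# Hypothetical per-game 4K average FPS for a mid-range card; a real
# review's full benchmark suite and numbers would go here instead.
results = {
    "Game A": 72, "Game B": 55, "Game C": 48, "Game D": 61,
    "Game E": 27, "Game F": 44, "Game G": 58, "Game H": 25,
}

overall = mean(results.values())
below_30 = [title for title, fps in results.items() if fps < 30]

print(f"Average across {len(results)} games: {overall:.1f} FPS")
print(f"Titles under 30 FPS: {', '.join(below_30) or 'none'}")
```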
"If the 3080 cannot even maintain 60 fps minimums in some games at 1080p medium ..."
That has to be one hell of a CPU bottleneck. Either that, or there's a spiritual successor to Crysis I'm not aware of.
"That has to be one hell of a CPU bottleneck. Either that, or there's a spiritual successor to Crysis I'm not aware of."
A 4790K in Watch Dogs: Legion, and given the 9400F had a 51 fps minimum in CP2077, I expect it would be the same there too.
"... so some of their older CPUs are a bit 'older' than their year of release would suggest."
Yeah, no. When Zen released it was up against Intel's 7-series. Zen has aged much better in the respective price ranges, weaker IPC or not. And it is silly to compare it to a 2600K for a number of reasons, but most importantly price at release. Back to the timeline: the 7700K, 4 cores / 8 threads for $350 at release, is considered the worst CPU purchase of 2017. The i3 and i5 have aged even worse, particularly given how quickly Intel rushed the 8-series out in response to Zen. Anyone that bought the 7-series got seriously boned.
"I'm making the comparison against the 2600K because that's about where it landed in terms of gaming performance at the time of launch. Generally the original Zen CPUs weren't as good in terms of either IPC or clocks when they were matched up against the Intel equivalents of the time. Where they did shine was being able to get an 8C/16T CPU for under $1,000 and being able to run highly threaded applications quite well because their SMT was better than Intel's."
This is one of those replies where I feel it muddies the waters too much. The entire premise of this thread is the overhead with Nvidia cards. Then you cite examples of a 1400 w/Nvidia and Intel w/AMD?
From the tables I posted on the first page, the 7700K won't bottleneck in the newest AC game, whereas the 1400 shouldn't be paired with anything newer than a GTX 1080 for that particular title. The table on the second page shows that a 7700K paired with a 6900XT is still good enough not to be CPU bound in Forza Horizon 4. If you bought a 7700K for gaming, it's still holding up well enough.
You can add price in there and make arguments related to that, but that becomes a bit more subjective when arguments about value enter into the mix. But the GameGPU data suggests that going from a 4770K to a 7700K makes a big difference, at least for the titles that I compiled the data for. It gives a 27.8% uplift for a 6900XT and a 31.3% uplift for a 3090 in Forza. The 3090 gets a 26.6% bump in AC by going from a 4770K to a 7700K. I think the reason the 4-series was tested is because the 7-series doesn't bottleneck nearly as much.
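As a side note on how an uplift percentage like those falls out of the raw numbers, a sketch like the one below is all it takes. The FPS values are made-up examples, not the actual GameGPU figures:

```python
def percent_uplift(fps_old: float, fps_new: float) -> float:
    """Relative FPS gain from a CPU upgrade with the same GPU and settings."""
    return (fps_new / fps_old - 1.0) * 100.0

# Hypothetical example: 100 FPS average on a 4770K vs 128 FPS on a 7700K.
print(f"{percent_uplift(100, 128):.1f}% uplift")  # -> 28.0% uplift
```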
"This is one of those replies where I feel it muddies the waters too much. ... I confess you lost me. There may have been a great point in there, but I am too dumb to understand it."
It's just some data that I'd posted earlier in the thread that someone else linked from a review. Since the charts are dynamically generated, I copied the data into a table and did some basic statistics. The conclusion is that both Nvidia and AMD GPUs will see bottlenecks on older CPUs, but Nvidia more so than AMD. First-generation Zen with low core counts bottlenecks extremely hard because of lower clock speed and IPC on top of this. Anything older than Skylake for Intel will also cause issues for both to some degree.
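For what it's worth, the "copy the chart data into a table and do some basic statistics" step could look roughly like this. The CPU/GPU pairs and FPS values are invented placeholders just to show the shape of the comparison, with each CPU expressed relative to the fastest CPU tested on the same card:

```python
from collections import defaultdict

# Hypothetical average-FPS figures per (CPU, GPU) pair, standing in for
# numbers copied out of a review's dynamically generated charts.
fps = {
    ("Ryzen 5 1400", "RTX 3090"): 62,    ("Ryzen 5 1400", "RX 6900 XT"): 78,
    ("Core i7-4770K", "RTX 3090"): 90,   ("Core i7-4770K", "RX 6900 XT"): 101,
    ("Core i7-10700K", "RTX 3090"): 142, ("Core i7-10700K", "RX 6900 XT"): 139,
}

# Group results by GPU so each CPU can be compared against the best CPU
# on that same card; a low percentage means a large CPU bottleneck.
by_gpu = defaultdict(dict)
for (cpu, gpu), value in fps.items():
    by_gpu[gpu][cpu] = value

for gpu, cpu_results in sorted(by_gpu.items()):
    best = max(cpu_results.values())
    for cpu, value in sorted(cpu_results.items(), key=lambda kv: kv[1]):
        print(f"{gpu:11s} {cpu:15s} {value:4d} FPS  ({value / best:6.1%} of best CPU)")
```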
"Testing on a 2600K and Core i3-7100 CPU, the Polaris 10 based cards benefit immensely from 8GB of VRAM in Shadow of the Tomb Raider only because the 8GB allows DX12 to be used without spilling out of VRAM, and DX12 under that scenario is seriously faster than DX11."
If this is the case, then how valid are these claims?
"If this is the case, then how valid are these claims?"
VRAM has nothing to do with it. Look at RTX 3090 benchmarks: it is sometimes beaten by a 5600 XT or an RX 580 with a third or a quarter of the VRAM when paired with an older CPU.
The game benchmarks can fit inside the 8GB and never have to swap out, but in actual gameplay how realistic is it for a game not to have to swap out? And even if it wouldn't have to, doesn't the different memory structure of the consoles force the games to stream in data constantly anyway?
"Well, Digital Foundry did talk a little about this issue, but it seems to me that they are throwing shade at Hardware Unboxed's testing and are trying to minimize AMD's advantage here."
Heh, not a big issue for many, they say??
"Well, Digital Foundry did talk a little about this issue, but it seems to me that they are throwing shade at Hardware Unboxed's testing ..."
RT on AMD in CP2077 is out now. You can bet that for this they will have a video with all the details on how Nvidia is faster.
"RT on AMD in CP2077 is out now. You can bet that for this they will have a video with all the details on how Nvidia is faster."
Of course. From now on I will just ignore all NVfoundry, ehm, "Digital Foundry" GPU reviews. They are the biggest NV shills out there.
"Of course. From now on I will just ignore all NVfoundry, ehm, 'Digital Foundry' GPU reviews. They are the biggest NV shills out there."
Really? I've found their card reviews spot on. They have an obvious bias toward rendering-technology advances, so things like RT and DLSS techniques are worth a premium to them, and it shows in their reviews. For example: