This is misleading, or a complete misunderstanding. The game code itself and the libraries it uses need to load into memory; that's how programs run. There's also tons of memory usage outside of graphics: all the AI, the calculations, basically everything happening in the game requires some sort of variable in memory representing it. Bring up Task Manager and look at the memory usage of your programs and games. The game itself will be using multiple GBs, and the OS also uses memory and keeps some in reserve. Bad things happen when the system runs out of memory.

The 16 GB of memory in the console will never be fully used by the GPU. In fact, the Series X has an unusual memory setup where 10 GB has 560 GB/s of bandwidth and the other 6 GB has 336 GB/s. That tells me the design is 10 GB reserved for graphics and 6 GB for the game and OS. I'm not even sure the full 10 GB will actually go to graphics; some games take up a lot of memory, and I'm not sure 6 GB will be enough for both the game and the OS.

As long as the PC has at least 16 GB of VRAM and 16 GB of main RAM, it will not need to prefetch any more than a console.
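To see the point about non-graphics memory for yourself, here's a minimal sketch (Unix-only, using Python's standard `resource` module; the "game state" records are invented for illustration) showing that allocating plain data variables grows the process footprint in ordinary RAM, not VRAM:

```python
# Rough illustration: game-state variables live in ordinary system RAM.
# Allocating plain data visibly grows this process's memory footprint.
import resource  # Unix-only stdlib module; on Windows, use Task Manager instead


def rss_kb():
    # Peak resident set size of this process (KiB on Linux).
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss


before = rss_kb()
# Stand-in for "game state": one million AI/entity-style records.
world_state = [{"x": i, "y": i, "hp": 100} for i in range(1_000_000)]
after = rss_kb()

print(f"RSS grew by roughly {(after - before) / 1024:.0f} MiB")
```

None of that allocation touches the GPU at all, which is why a game can use multiple GBs of system RAM on top of whatever VRAM it fills.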
No, the stuttering is reported on new GPUs too because of the memory configuration. The default 0.8 (I think this is 80% of something, perhaps the total GPU memory) is too high, and it artificially causes unnecessary memory swaps. Besides, if the stuttering goes away because of a memory config change, the power of the GPU doesn't matter.

^ Can't you also fix the stuttering by getting a real GPU instead of one with 4-year-old specs?
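If that 0.8 really is a fraction of total GPU memory, the arithmetic behind the stutter is easy to sketch. This is a hypothetical model (the game's actual config key and semantics are unknown): a budget set too high overlaps with VRAM already used by the OS and other apps, and the overlap has to be swapped.

```python
# Hypothetical sketch of a fractional VRAM budget like the ".8" mentioned
# above; the real config key and its exact meaning in the game are unknown.

def spill_mb(total_vram_mb: int, other_usage_mb: int, fraction: float = 0.8) -> int:
    """MB the game tries to use beyond what is actually free in VRAM.

    budget: what the fractional cap lets the game allocate.
    free:   what remains after the OS / other apps take their share.
    Anything over 'free' spills to system RAM and gets swapped -> stutter.
    """
    budget = total_vram_mb * fraction
    free = total_vram_mb - other_usage_mb
    return max(0, int(budget - free))


# 8 GB card with ~2.2 GB already used by the OS and other apps:
print(spill_mb(8000, 2200, 0.8))  # 600 -> ~0.6 GB being swapped
print(spill_mb(8000, 2200, 0.7))  # 0   -> a lower fraction avoids the spill
```

Which would explain why lowering the setting removes the stutter regardless of how powerful the GPU itself is.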
Thanks for the link, Digital Foundry is awesome.
The video shows that 2060 Super cards have better raytracing than the consoles on this particular game. This is fine, Watch Dogs Legion is a ray-tracing showcase for nVidia where every surface is reflective on purpose.
Those reflections are really beautiful on PC, but overall the game is not, just look at those ugly pipes everywhere. It doesn't hold a candle to the Unreal 5 demo, and it shows: as said in the DF video, WD:L is built on top of the same old engine as the previous games.
It's certainly not using Mesh Shaders and Sampler Feedback like the UE5 demo. Those, combined with a fast SSD on an optimized path to GPU memory, are the real game changers, exclusive to the consoles for now. Just watch the Mark Cerny PS5 presentation again, where he talks about asset streaming. Once this kind of technique becomes common, current PCs will have to adapt somehow.
If you're interested in Unreal Engine 5 games on PC, you should invest in a powerful GPU — like something from our best graphics card roundup — and an NVMe SSD later in 2021, when the engine launches and games may become available. An NVIDIA RTX 2080 laptop managed to run the demo at 40 frames per second.
Let me put it another way: the launch of the new consoles is one of those very few events where we are guaranteed to have games using more VRAM going forward, and the best nVidia could do at the 700 USD price point and below was LESS memory than the 3-year-old 1080 Ti.
Now, about saying that consoles are second-rate hardware... maybe Nintendo consoles, because the Series X and the PS5 are quite something. The Series X consumes about as much power as a 2060 Super alone (around 160 W), while including 8 Zen 2 cores (not the latest, but quite recent considering that game consoles have a long gestation period), plus a really fast SSD, the chipset, and a GPU that, as far as we know, loses to a 2060S on RT but is pretty close to a 2080 on rasterisation. Not bad at all for a fixed target that will be optimized to death in the coming years.
So, the question then is: how are PCs going to implement this same kind of asset streaming? Having more VRAM and prefetching data more aggressively? Having PCIe SSDs do direct memory transfers to GPU memory using SAM? Using lower-quality assets?
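One way to picture the trade-off: a streaming system keeps a budget of assets resident and prefetches what the player is about to need, evicting old assets when the budget runs out. A toy sketch (asset names and sizes are invented; real engines are far more sophisticated):

```python
from collections import OrderedDict


class AssetCache:
    """Toy VRAM-style asset cache: least-recently-used assets are
    evicted when a prefetch would exceed the memory budget."""

    def __init__(self, budget_mb: int):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def used_mb(self) -> int:
        return sum(self.resident.values())

    def prefetch(self, name: str, size_mb: int) -> None:
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used assets until the new one fits.
        while self.resident and self.used_mb() + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb


# A bigger budget (more VRAM) means fewer evictions and fewer SSD reads;
# a smaller one means more aggressive streaming or lower-quality assets.
cache = AssetCache(budget_mb=10)
cache.prefetch("street_block_a", 4)
cache.prefetch("street_block_b", 4)
cache.prefetch("street_block_c", 4)  # evicts street_block_a
print(list(cache.resident))          # ['street_block_b', 'street_block_c']
```

The console approach sidesteps much of this by pulling assets straight from the SSD fast enough that the resident set can be small; on PC, either the budget (VRAM) grows or the eviction/refill path has to get much faster.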
If you can upgrade your computer every year then this is not a problem, and I'd be glad for you. Unfortunately I know I can't upgrade so often, so I'd be really p***ed to spend 700USD on a video card just to end up using lower quality assets.
That's what I kind of thought... but even with the hard evidence I've posted, I don't understand this kind of mentality. Whatever.

I think he was being facetious, suggesting that NVidia's VRAM capacities for everything below the 3090 are akin to GPUs from four years ago.
I see Tarkov mentioned on this page. What are the other 2, please? Just curious.
So what's the consensus? If you want price for performance go with a 6800xt?
But if you want to run ray tracing go with a 3080 or wait until January for 3080ti.
16 GB is beneficial in Ghost Recon
In Ghost Recon Breakpoint, the 10 GB of the GeForce RTX 3080 is not enough for maximum performance; here the 16 GB of memory on the two Radeon graphics cards is a clear advantage. How does that show? In average FPS the Radeon graphics cards do not fare well: the Radeon RX 6800 XT is 10 percent slower than the GeForce RTX 3080. In percentile FPS, however, the Radeon RX 6800 XT suddenly performs 2 percent better, because only in 16 GB of memory does the game's highest texture level fit. That is also why the Radeon RX 6800 can expand its 9 percent average-FPS advantage over the GeForce RTX 3070 to a full 26 percent in percentile FPS. The 8 GB on the Nvidia graphics cards is even less than the 10 GB of the larger model. The Radeon can at least maintain a stable 30 FPS in Ultra HD; the GeForce RTX 3070 cannot.
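The average-vs-percentile gap quoted above is easy to reproduce from frame times: a card with a few long VRAM-swap stutters can win on average FPS and still lose badly on the percentile metric. A sketch with made-up frame times:

```python
def avg_fps(frame_times_ms):
    """Average FPS over a run of frame times (in milliseconds)."""
    return 1000 * len(frame_times_ms) / sum(frame_times_ms)


def percentile_fps(frame_times_ms, pct=99):
    """FPS implied by the pct-th percentile (near-worst) frame time."""
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * pct / 100) - 1]
    return 1000 / worst


# Card A: fast on average, but a few 50 ms stutter spikes (e.g. VRAM swaps).
card_a = [12] * 95 + [50] * 5
# Card B: slightly slower per frame, but perfectly consistent.
card_b = [14] * 100

print(f"A: avg {avg_fps(card_a):.0f} FPS, 99th pct {percentile_fps(card_a):.0f} FPS")
print(f"B: avg {avg_fps(card_b):.0f} FPS, 99th pct {percentile_fps(card_b):.0f} FPS")
```

Card A wins the average (about 72 vs 71 FPS) yet collapses to 20 FPS at the 99th percentile, which is exactly the pattern the review describes when textures overflow VRAM.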
AMD Radeon RX 6800 and RX 6800 XT review: clock rates, benchmarks in Full HD, WQHD and Ultra HD, plus ray tracing / test system and methodology (www.computerbase.de)
Personally, based on VRAM, I'd be more comfortable with a 6800 XT. But I do want the ray-tracing performance, and a 3080 is a no-go for me with only 10 GB. I'm going to keep whatever card I end up with for a while.

The 3080 Ti would most likely be a 6900 competitor, not a 6800 XT competitor.
I'm tempted to try a 6800 XT with my G-Sync monitor and see how it fares. Although I would definitely go with a 3080 Ti for the next 3 years.

Buy whatever you can get, try it out, and sell it at cost if you're not satisfied? Keep the free game(s) for your troubles?
That's how I'm leaning, so I guess I'll wait and see if the speculated 3080 Ti is a thing. If not, I can put myself in the EVGA shopping queue for a 3080.

If you have a G-Sync monitor, I'd go with an NVidia card for that reason alone.
None of the cards released now are going to have good enough RT performance for the future. We probably need another two generations of doubling performance before we start to get there.
I'm skeptical of the 6800 XT's ray-tracing performance going forward. On performance compared to a 3080, it's super close, and if ray tracing weren't a thing I'd pick it right now.
I think the consensus is just buy whatever one might happen upon in stock, and thank whatever deity you feel appropriate for said good fortune.
Having wasted so much time and effort to finally get a PS5 console... I really am so tired of 2020 and this scarcity of products. I hate constantly fighting the bots/scalpers.

Do you know any that I can make sacrifices to in order to improve my chances of getting one?
I know someone who raises goats, but I'm not necessarily above going on a raid for some human captives if utterly necessary.
I doubt stock will be any better in January, but I feel like I should put myself on EVGA's purchase list for a 3080. I just want to play Cyberpunk maxed out with no compromises. Then I'll sell that 3080 as soon as the 3080 Ti announcement comes out... I wonder if I'll lose some money.