DrMrLordX (Lifer, joined Apr 27, 2000):
> More Nvidia sponsored crippleware offering enough justification to buy the 7090

Neither one looks especially better than the other.
> We should have an option to enable traditional AA such as SSAA and MSAA.

How do I do this in deferred without doing GPU-nuking levels of overdraw?
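For reference on why that's painful: MSAA in a deferred renderer means every G-buffer attachment has to be multisampled, and the lighting pass then has to shade per sample rather than per pixel. A minimal OpenGL sketch of just the allocation side, assuming a current GL 4.3+ context (names and format choices are illustrative, not any engine's actual setup):

```cpp
// Sketch: allocating a multisampled G-buffer for deferred MSAA.
// Assumes an active OpenGL 4.3+ context; error checking omitted.
#include <GL/glew.h>

GLuint makeMsaaGBuffer(int w, int h, int samples /* e.g. 4 */) {
    GLuint fbo, albedo, normal, depth;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    // Albedo target: costs 'samples' times a non-MSAA buffer in memory/bandwidth.
    glGenTextures(1, &albedo);
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, albedo);
    glTexStorage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples,
                              GL_RGBA8, w, h, GL_TRUE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D_MULTISAMPLE, albedo, 0);

    // Normals usually want a wider format, multiplying the cost again.
    glGenTextures(1, &normal);
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, normal);
    glTexStorage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples,
                              GL_RGBA16F, w, h, GL_TRUE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT1,
                           GL_TEXTURE_2D_MULTISAMPLE, normal, 0);

    // Depth has to be multisampled too.
    glGenTextures(1, &depth);
    glBindTexture(GL_TEXTURE_2D_MULTISAMPLE, depth);
    glTexStorage2DMultisample(GL_TEXTURE_2D_MULTISAMPLE, samples,
                              GL_DEPTH_COMPONENT32F, w, h, GL_TRUE);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_2D_MULTISAMPLE, depth, 0);

    // The lighting pass must then texelFetch() every sample of every texel,
    // i.e. shade N samples per pixel instead of one.
    return fbo;
}
```

Which is exactly why most deferred engines dropped MSAA and moved to post-process or temporal AA instead.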
> Threat Interactive

He's an idiot.
> Nvidia is taking over the entire graphics pipeline one piece at a time to keep its monopoly, forcing competitors to catch up. Since Nvidia comes up with it first, the focus is always on them, so no matter how well AMD/Intel does with the catch-up, in the end it's just that: Nvidia, but late and worse.

They're the only ones trying to push the envelope.
> They're the only ones trying to push the envelope.

I don't know if I'd call the current ideas of frame generation, or the latest ray tracing gimmick that barely runs on their own "top" end hardware, pushing the envelope.

> I don't know if I'd call the current ideas of frame generation, or the latest ray tracing gimmick that barely runs on their own "top" end hardware, pushing the envelope.

They technically are, irrespective of their actual IQ/perf/whatever contribution.
Threat Interactive made a comment that matches what I'd been suspecting for a while: Nvidia is taking over the entire graphics pipeline one piece at a time to keep its monopoly, forcing competitors to catch up. Since Nvidia comes up with it first, the focus is always on them, so no matter how well AMD/Intel does with the catch-up, in the end it's just that: Nvidia, but late and worse.

I know Intel invests in graphics R&D, and I read somewhere that in terms of graphics patents they hold at least as many as AMD, if not more. Both vendors need to ignore what Nvidia is doing and push their own ideas. One thing I noticed after being away from new games for so long is that modern games are so noisy. I disabled all the upscaling features, because even without upscaling it's noisy anyway. We should have an option to enable traditional AA such as SSAA and MSAA.
I'm pretty sure Matrox Parhelia's Edge AA could be implemented in modern silicon at a very acceptable die cost, with much better performance and zero blurring, better than even SSAA/MSAA.
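As I remember it, Parhelia's trick (marketed as 16x Fragment AA) was to spend samples only on the fragments along geometric edges and leave interiors at 1x. A rough image-space analogue of that idea, flagging depth-discontinuity pixels for extra samples (a hypothetical helper, not Matrox's actual method):

```cpp
// Sketch: build a mask of edge pixels from depth discontinuities, so only
// those pixels get the expensive multi-sample treatment. The threshold
// value is an arbitrary placeholder.
#include <vector>
#include <cmath>

std::vector<bool> edgeMask(const std::vector<float>& depth, int w, int h,
                           float threshold = 0.01f) {
    std::vector<bool> mask(w * h, false);
    for (int y = 1; y < h - 1; ++y)
        for (int x = 1; x < w - 1; ++x) {
            float c  = depth[y * w + x];
            float dx = std::fabs(depth[y * w + x + 1] - c);
            float dy = std::fabs(depth[(y + 1) * w + x] - c);
            // Interior pixels stay 1x; only discontinuities get supersampled.
            mask[y * w + x] = (dx > threshold || dy > threshold);
        }
    return mask;
}
```

Since edge pixels are typically only a few percent of the frame, the extra sample cost stays small, which is why the die-size argument isn't crazy.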
> Isn't supersampling just part of "Resolution Scaling" these days? Where you can render at 100%+ of your monitor's resolution and then it is displayed at your normal resolution? Hell, Battlefield even describes it that way. I turn off the crap AA and bump up the resolution to 110-120% or higher, whatever runs well enough.

It's actually different, because higher resolution on a monitor means better quality without sacrificing clarity. When you supersample for AA, you get an overall blurrier result.
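Mechanically, the resolution-scale slider is ordered-grid supersampling: render at scale times native, then filter back down to the display resolution. A minimal sketch of that downsample step (grayscale image, box filter, illustrative names), which is also where the slight softening comes from:

```cpp
// Sketch: 120% resolution scale = render 1.2x native, box-filter back down.
#include <vector>
#include <algorithm>
#include <cstdio>

struct Image { int w, h; std::vector<float> px; }; // grayscale for brevity

Image downsample(const Image& hi, int outW, int outH) {
    Image out{outW, outH, std::vector<float>(outW * outH, 0.f)};
    float sx = float(hi.w) / outW, sy = float(hi.h) / outH;
    for (int y = 0; y < outH; ++y)
        for (int x = 0; x < outW; ++x) {
            // Average every high-res texel that maps onto this output pixel.
            int x0 = int(x * sx), x1 = std::max(x0 + 1, int((x + 1) * sx));
            int y0 = int(y * sy), y1 = std::max(y0 + 1, int((y + 1) * sy));
            float sum = 0.f;
            for (int j = y0; j < y1; ++j)
                for (int i = x0; i < x1; ++i)
                    sum += hi.px[j * hi.w + i];
            out.px[y * outW + x] = sum / float((x1 - x0) * (y1 - y0));
        }
    return out;
}

int main() {
    // 120% scale on a 2560x1440 display: render 3072x1728 internally.
    const int nativeW = 2560, nativeH = 1440;
    const float scale = 1.20f;
    int renderW = int(nativeW * scale), renderH = int(nativeH * scale);
    std::printf("internal render target: %dx%d\n", renderW, renderH);
    Image hi{renderW, renderH, std::vector<float>(renderW * renderH, 0.5f)};
    Image native = downsample(hi, nativeW, nativeH);
    (void)native;
    return 0;
}
```

The averaging is what kills both the jaggies and a little of the sharpness, so both posters are right: it is supersampling, and it does soften slightly compared to a natively higher-resolution panel.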
4K is at the point where you don't really need more, and 1000-series cards were starting to be too powerful for it, so they had to introduce a new feature that could continue to sell more cards: ray tracing. But because the performance hit was so high and Moore's Law gains were crashing, they had to make it up with ML rendering techniques, which is basically 180 degrees opposite of SSAA, reversing the better-quality trend of the past 40 years. Add to that lazy and incompetent DEI-driven developer teams that can't code even if their lives depended on it, and now games "need" ML-rendered subsampling AA just to perform reasonably, plus literally fake frames. I know the trend isn't really going to change, because average people are deaf, dumb, and blind, sprinkled with apathy.

Mind you, games are NOWHERE near photorealism. Carefully and artistically rendered marketing shots might convince you otherwise, but in actual gameplay they aren't. That makes an artificial limitation like ray tracing pointless in the first place, because a labor of love such as good art direction lasts: countless Mario games, or more modern examples like the Warcraft series. But a "photorealistic" title looks like crap 10 years later, because it isn't actually photorealistic.
> 4K is at the point where you don't really need more

hard disagree

> and 1000-series cards were starting to be too powerful for it

Literally the last two games I bought don't use RT, and I have to use FSR to make them run at/above 60 fps at 4K (5800X3D and 6900 XT).

> Mind you, games are NOWHERE near photorealism. Carefully and artistically rendered marketing shots might convince you otherwise, but in actual gameplay they aren't.

Sure, polycounts and textures haven't massively increased over the years, but they're only one part of a scene. Photorealistic lighting is one (very important) part of how we get to photorealistic real-time rendering.
> hard disagree

If you aren't slowly walking around, you won't notice most of those details. Also, the gains are so minuscule when there are other areas to improve.

> Literally the last two games I bought don't use RT, and I have to use FSR to make them run at/above 60 fps at 4K (5800X3D and 6900 XT).

Because they're wasting resources due to an abundance of them. There are countless examples of people wasting when given a lot; innovation and efficiency come from constraints. Also, the lack of efficiency really started accelerating after frame generation debuted. Enough games now list FG in their minimum/recommended system requirements!

> Sure, polycounts and textures haven't massively increased over the years, but they're only one part of a scene. Photorealistic lighting is one (very important) part of how we get to photorealistic real-time rendering.

And they still look mostly like cartoons, except in the first year or two after release. I know, I grew up with them. I read articles where people thought games in 1995 were realistic. I thought Quake 3 looked so good too.
> And they still look mostly like cartoons, except in the first year or two after release.

Kinda wrong. Most (all?) PBR titles hold up.

> Kinda wrong. Most (all?) PBR titles hold up.

Cartoons with fancy lighting and shadows are still cartoons. None of today's games look anywhere near real life, and if one does look semi-real, that'll fade in a few years. You can fool gamers, who subconsciously hold a separate standard for games versus real life, but objectively it's nowhere near "photorealistic".
> Cartoons with fancy lighting and shadows are still cartoons.

I admire your autism, but PBR mats are PBR.
They're correct. It's why modern games look same-y to begin with!
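For anyone lost in the acronym fight: here PBR means physically based rendering, i.e. materials described by albedo/roughness/metalness and pushed through an energy-conserving BRDF. Because nearly every engine converges on roughly the same model, a minimal sketch of the usual Cook-Torrance/GGX form looks like this (illustrative code, not any particular engine's):

```cpp
// Sketch of the Cook-Torrance/GGX BRDF behind "PBR materials"
// (metal/roughness convention). Illustrative only.
#include <cmath>
#include <algorithm>

struct Vec3 { float x, y, z; };
static float dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3  add(Vec3 a, Vec3 b) { return {a.x+b.x, a.y+b.y, a.z+b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x*s, a.y*s, a.z*s}; }
static Vec3  mulv(Vec3 a, Vec3 b) { return {a.x*b.x, a.y*b.y, a.z*b.z}; }
static Vec3  normalize(Vec3 v) { return mul(v, 1.f / std::sqrt(dot(v, v))); }

constexpr float PI = 3.14159265f;

// GGX normal distribution: how tightly microfacet normals cluster around H.
static float distributionGGX(float NdotH, float roughness) {
    float a2 = roughness * roughness * roughness * roughness;
    float d  = NdotH * NdotH * (a2 - 1.f) + 1.f;
    return a2 / (PI * d * d);
}

// Smith/Schlick-GGX geometry term: microfacet self-shadowing and masking.
static float geometrySmith(float NdotV, float NdotL, float roughness) {
    float k  = (roughness + 1.f) * (roughness + 1.f) / 8.f;
    float gv = NdotV / (NdotV * (1.f - k) + k);
    float gl = NdotL / (NdotL * (1.f - k) + k);
    return gv * gl;
}

// Schlick Fresnel: reflectance climbs toward grazing angles.
static Vec3 fresnelSchlick(float cosTheta, Vec3 F0) {
    float f = std::pow(1.f - cosTheta, 5.f);
    return add(F0, mul(Vec3{1.f - F0.x, 1.f - F0.y, 1.f - F0.z}, f));
}

// Reflected radiance factor for a single light direction.
static Vec3 cookTorrance(Vec3 N, Vec3 V, Vec3 L,
                         Vec3 albedo, float roughness, float metalness) {
    Vec3 H = normalize(add(V, L));
    float NdotV = std::max(dot(N, V), 1e-4f);
    float NdotL = std::max(dot(N, L), 0.f);
    float NdotH = std::max(dot(N, H), 0.f);

    // Dielectrics reflect ~4%; metals tint F0 with their albedo.
    Vec3 F0 = add(mul(Vec3{0.04f, 0.04f, 0.04f}, 1.f - metalness),
                  mul(albedo, metalness));
    float D = distributionGGX(NdotH, roughness);
    float G = geometrySmith(NdotV, NdotL, roughness);
    Vec3  F = fresnelSchlick(std::max(dot(H, V), 0.f), F0);

    Vec3 specular = mul(F, D * G / (4.f * NdotV * NdotL + 1e-4f));
    // Energy not reflected specularly goes diffuse (none for metals).
    Vec3 kD = mul(Vec3{1.f - F.x, 1.f - F.y, 1.f - F.z}, 1.f - metalness);
    Vec3 diffuse = mul(mulv(kD, albedo), 1.f / PI);
    return mul(add(diffuse, specular), NdotL);
}
```

Same inputs, same math, similar look: that's the kernel of truth in the "same-y" complaint.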
What is PBR other than a crappy beverage or a patrol boat?
> What is PBR other than a crappy beverage or a patrol boat?

I thought it was drinking a Pabst Blue Ribbon while watching Professional Bull Riding.
> I admire your autism, but PBR mats are PBR.

I admire your constant personal attacks that are finely tuned to stay just under the moderator radar while living in a personal reality bubble that rivals Apple's RDF in size and strength.
> I admire your constant personal attacks that are finely tuned to stay just under the moderator radar while living in a personal reality bubble that rivals Apple's RDF in size and strength.

You still haven't refuted my point on PBR mats.
ROCm vs ML2CODE:

> ML2CODE is included in AMD's GPU computing (GPGPU) platform, ROCm 6.1 and later. Rather than executing trained AI models at runtime, it optimizes them into existing compute-shader code and enables native execution.
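If that description is accurate, the idea would be ahead-of-time lowering: bake the trained weights into ordinary compiled compute code so no ML runtime is involved at execution time. A purely conceptual sketch (hypothetical weights, no relation to whatever ML2CODE actually emits):

```cpp
// Conceptual only: a trained 4->2 dense layer "lowered" to plain code,
// with the learned weights frozen in as constants at build time.
#include <cstdio>

constexpr float W[2][4] = {{0.12f, -0.70f,  0.33f, 0.05f},
                           {0.91f,  0.08f, -0.44f, 0.27f}};
constexpr float B[2] = {0.10f, -0.20f};

void denseLayer(const float in[4], float out[2]) {
    for (int o = 0; o < 2; ++o) {
        float acc = B[o];
        for (int i = 0; i < 4; ++i) acc += W[o][i] * in[i];
        out[o] = acc > 0.f ? acc : 0.f; // fused ReLU
    }
}

int main() {
    float in[4] = {1.f, 0.5f, -0.25f, 2.f}, out[2];
    denseLayer(in, out);
    std::printf("%f %f\n", out[0], out[1]);
    return 0;
}
```

The same lowering applied to compute-shader source instead of C++ would match the "native execution, no runtime" claim.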
> I admire your autism, but PBR mats are PBR.

This is not true.
Does anyone remember this?