What about calling it "GOIA" Get over it already.I suggest Artificial Super Resolution since that's what DLSS is.![]()
My first 5700 XT had a fan-stop bug that didn't even start the fans until 95 °C, which led to all sorts of issues - and then ran them at 100% indefinitely trying to cool the card back down to something like 55 °C. I had been running an AMD blower 5700 without issues before that (my Dad's), and I had to install special drivers, endure green screens and do all sorts of stupid stuff on my way to trying to get the card to work. Well, I got fed up with this stock situation and the miners buying everything, so I ended up getting a used Asus Dual RX 5700 that I know was not used for mining, literally two days before all GPU prices increased by around 100% in my... let's call it country. Best decision I could have made. The stock fan curve is BS; it allows the card to get to 95 °C with the fan running at 1100 rpm, and that was causing it to crash in some games like Cyberpunk. I wonder if that is why so many people are complaining about crashes and black screens with Navi.
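For anyone stuck with the same stock behavior, the practical fix is just a steeper custom fan curve. A minimal sketch of the idea, with linear interpolation between set points (the temperature/duty values below are invented for illustration, not AMD's defaults):

```python
# Illustrative fan curve: linearly interpolate fan duty between set points.
# The points are made up for the example, not the card's stock values.
import bisect

CURVE = [(40, 20), (60, 40), (75, 65), (85, 85), (90, 100)]  # (temp °C, % duty)

def fan_duty(temp_c):
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        return CURVE[0][1]
    if temp_c >= temps[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(temps, temp_c)
    (t0, d0), (t1, d1) = CURVE[i - 1], CURVE[i]
    return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

for t in (50, 70, 80, 88, 95):
    print(f"{t} °C -> {fan_duty(t):.0f}% fan")
```

The point is simply that the fan should already be ramping hard well before 95 °C, instead of sitting at 1100 rpm and letting the card throttle or crash.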
Hey fun police, next are you going to tell us we can't complain about the complete lack of Gigastabs that RDNA2 provides? What about calling it "GOIA": Get Over It Already.
Seriously, this amount of hate for what DLSS does needs to stop.
"Smart Resolution" goes with Smart Access Memory, or "RAGE Resolution" to go with RAGE Mode. Yeah, my bad, the naming is terrible. It could be something as trivial as "Intelligent Upscaling," but that's probably way too little marketing speak to get anywhere.
"Infinity Upscaling" to go with Infinity Cache. "Radeon Image Reconstruction." "Advanced Macro Display"... I'll show myself out. "Smart Resolution" goes with Smart Access Memory, or "RAGE Resolution" to go with RAGE Mode.
No thanks. What about calling it "GOIA": Get Over It Already.
Seriously, this amount of hate for what DLSS does needs to stop.
Clearly you haven't seen it in action... At least in Control there's zero over-sharpening going on. It does a wonderful job of keeping the image stable in motion without aliasing - much better than the game's default anti-aliasing. Also, in that game it's possible to run DLSS at native resolution simply by changing the render resolution in the config file. Film grain and motion blur don't work when DLSS is on, but that's hardly a big loss... No thanks.
I'll keep calling quality-reducing fake resolution what it is: fake resolution.
Just because over-sharpened fake 4K looks sharper than overdone AA at native doesn't make fake 4K better.
For years any quality-reducing trickery would rightly face a big backlash from everyone, but now that Nvidia has figured out what to do with the tensor units on their consumer hardware, their marketing department has convinced people that quality-reducing trickery is a good thing?
I get it, people like over-sharpened images, but if it throws out that much fidelity, I don't.
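For what it's worth, the "sharper than native" impression is easy to reproduce with no reconstruction at all: plain upscaling followed by an unsharp mask. A toy sketch of that baseline trick (this is not how DLSS works; the image here is just a random placeholder):

```python
# Crude upscale + sharpen, to show how sharpening can masquerade as detail.
# Placeholder data stands in for a real low-resolution render.
import numpy as np

rng = np.random.default_rng(1)
low_res = rng.random((540, 960))             # pretend 960x540 render

# Nearest-neighbour 2x upscale to "1080p": no new information is added.
upscaled = np.kron(low_res, np.ones((2, 2)))

# Unsharp mask: boost the difference between the image and a 3x3 box blur.
H, W = upscaled.shape
padded = np.pad(upscaled, 1, mode="edge")
blurred = sum(padded[dy:dy + H, dx:dx + W] for dy in range(3) for dx in range(3)) / 9
sharpened = np.clip(upscaled + 1.5 * (upscaled - blurred), 0, 1)

print("output:", sharpened.shape, "- edges look crisper, but the detail is still interpolated")
```

Edge contrast goes up, so the result can look "sharper" in a side-by-side, even though every pixel is still derived from the lower-resolution input.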
Like a veggie burger! Not real, fake, a look-alike, etc.... lol. Also, let's put it this way:
DLSS+RT is much better looking than no DLSS and no RT.
It's not really an argument per se, but AMD just has to have a better implementation of it than Nvidia and then the fanboys on both sides will change their tune. One need look no further than the various arguments about the importance of certain benchmarks, etc. in the CPU space with the most recent Zen 3 CPUs to see this in action. I see no argument for either side that'll sway the other's opinion.
Still better! Like a veggie burger! Not real, fake, a look-alike, etc.... lol
It's not just about the horsepower but also the software side. I really don't care about DLSS or ray tracing. Maybe in a couple of generations, when multi-chip GPUs are perfected, there will be enough horsepower to actually build a game engine in DXR from the ground up.
Actually, the noise and denoising artifacts in games (Control in particular) were the most shocking thing for me about RT when I was finally able to try it out. Compressed YouTube videos don't really show this off very well, but it's very visible in gameplay. Bear in mind that even with DLSS, Nvidia cards still need to denoise every frame to get rid of the Monte Carlo sampling noise that is seen in most kinds of RT rendering when you only use very limited samples per pixel for real-time gaming.
*As opposed to indirect lighting, aka global illumination.
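A toy illustration of that sampling noise and why a denoiser is unavoidable at real-time sample counts. This is just a box-filter stand-in on synthetic data, nothing like the actual denoisers games ship:

```python
# Toy Monte Carlo noise demo: few samples per pixel -> high variance,
# and a naive 3x3 spatial average knocks that variance back down.
import numpy as np

rng = np.random.default_rng(0)
H, W, SPP = 64, 64, 2        # 2 samples per pixel, like cheap real-time RT
TRUE_RADIANCE = 0.5          # assume a flat, evenly lit surface

# Each sample is a noisy estimate of the true radiance (the spread stands in
# for random light/bounce directions); averaging only 2 of them stays noisy.
samples = rng.normal(TRUE_RADIANCE, 0.3, size=(H, W, SPP))
noisy = samples.mean(axis=2)

# Naive "denoiser": average each pixel with its 3x3 neighbourhood.
padded = np.pad(noisy, 1, mode="edge")
denoised = sum(padded[dy:dy + H, dx:dx + W] for dy in range(3) for dx in range(3)) / 9

print("pixel std, noisy    :", float(noisy.std().round(3)))
print("pixel std, denoised :", float(denoised.std().round(3)))
```

Real denoisers are temporal and edge-aware, which is where the smearing and ghosting artifacts mentioned above tend to come from.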
The time frames for movie VFX and for game production are different. It's going to take some time for developers to figure out how to best use RT and design around it as well. Realistic lighting is something that a lot of games aim for, but in certain scenes the lighting effects are cheated to get a specific look. If Hollywood could fake lighting as well as games can to achieve an exact result, they'd do it in almost every film.
This is the biggest source of frustration and clashes when programmers have to work with visual designers (the kind that have no technical knowledge but offer their services for interface design and the like). A bit off topic, but that just sounds like an absolutely idiotic way to contract out work. If I pay someone to do some landscaping, they aren't going to let me keep changing my mind about where I want everything.
Not so sure about the middle SKU there; fairly sure it's just 40 CUs + 192-bit bus and 36 CUs + 160-bit bus. That being said, the performance there for the top-end SKU sounds legit.
N22 - trading blows with, and beating, the RTX 3060 Ti.
TBP - even over 200 W.
The GPU appears to have a similar or even the same MSRP as the RTX 3060 Ti.
Cut-down N22:
There appear to be two cut-down models.
One has 12 GB of VRAM, the second has 10 GB of VRAM (the bus-width arithmetic sketched below is why those capacities pair with 192-bit and 160-bit buses).
For now, the 12 GB one is the model which might be priced above $300.
There is always room for a 10 GB or even 6 GB budget version of this GPU.
So in essence the lineup might look like this:
RX 6700 XT - 40 CUs / 192-bit bus / 12 GB VRAM - up to $399.
RX 6700 - 36 CUs / 192-bit bus / 12 GB VRAM - above $300 MSRP.
RX 6600 XT - 36 CUs / 160-bit bus / 10 GB VRAM - below $300 MSRP.
Doesn't this lineup remind us of anything...?
It mimics exactly the SKU stack that Navi 10 had:
40 CUs/full VRAM, 36 CUs/full VRAM, 36 CUs/cut-down VRAM.
This might suggest that the RX 6500 XT and non-XT are going to be based on Navi 23.
Which is by far the most exciting GPU of them all.
Either N23 won't be on par with the RTX 3060, or it won't be much, much more efficient, if the above info about N22 is true. Because AMD simply "can" do that. A 32 CU/128-bit GPU will be the same performance as the RTX 3060 while being much, much more efficient.
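For what it's worth, the VRAM capacities in the lineup above fall straight out of the bus widths, since GDDR6 hangs one 32-bit chip off each memory channel. A quick back-of-the-envelope sketch (the 16 Gbps data rate and 2 GB chips are my assumptions, not part of the leak):

```python
# Why 192-bit pairs with 12 GB and 160-bit with 10 GB: one 32-bit GDDR6 chip
# per channel. Data rate and chip capacity below are assumed, not leaked.
GDDR6_GBPS = 16        # assumed per-pin data rate (Gbps)
CHIP_CAPACITY_GB = 2   # assumed 16 Gb (2 GB) modules

def memory_config(bus_width_bits):
    chips = bus_width_bits // 32                      # one chip per 32-bit channel
    capacity_gb = chips * CHIP_CAPACITY_GB            # total VRAM
    bandwidth_gbs = bus_width_bits * GDDR6_GBPS / 8   # GB/s
    return chips, capacity_gb, bandwidth_gbs

for bus in (192, 160, 128):
    chips, cap, bw = memory_config(bus)
    print(f"{bus}-bit bus -> {chips} chips, {cap} GB VRAM, {bw:.0f} GB/s")
# 192-bit -> 6 chips, 12 GB, 384 GB/s
# 160-bit -> 5 chips, 10 GB, 320 GB/s
# 128-bit -> 4 chips,  8 GB, 256 GB/s
```

A hypothetical 128-bit Navi 23 part follows the same rule, which is why the rumored capacities line up so neatly with the rumored bus widths.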
A few games use it very well, but others don't. I played through Control and it makes a big difference there. DLSS is needed with RT, and even with a 3090 I had to play at 1080p to get 80+ fps consistently, with 2560x1440 running at 45-60 fps. The image looks soft and has sampling noise, as mentioned earlier, but it's much nicer than playing at 4K without RT. I didn't care about RT/DLSS when buying the card, but now I think it's an important feature even today.
Also, let's put it this way:
DLSS+RT is much better looking than no DLSS and no RT.