Saylick
Diamond Member
LOL. C'mon Lisa Su. All you gotta do is just walk up on stage, hold up the 7900XT, and say the magical words, "Nine ninety nine." Basically, all AMD (or even Intel) has to do now is a repeat of this:
Then drop the mic, not the 7900XT.
Oh what the heck, smash it on the stage and tell us you’ve literally got boatloads of them ready to go, we don’t need that one.
Assuming the 7900XT gets within 10% of the raster performance of the 4090 at 350W or less and costs $999, that's a day-one purchase for me. Instantly. I don't even care so much about RT performance, which shouldn't even be an issue considering the 7900XT should have noticeably better RT performance than the 3090 Ti, but raster and power come first for me. I will definitely underclock it by 15% to reduce power by 40% so that I can get essentially 1.5x 3090 Ti performance at <250W.

You say that jokingly (maybe), but damn, wouldn't that be something if she did that? That would win over some minds for sure.
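For what it's worth, the "underclock by 15% to cut power by ~40%" figure above roughly checks out under the usual dynamic-power approximation (P ∝ f·V²), assuming you can undervolt by about as much as you underclock. A quick sketch with illustrative numbers only, ignoring static/leakage power:

```python
# Rough back-of-envelope for the "15% underclock -> ~40% less power" claim.
# Uses the common dynamic-power approximation P ~ C * f * V^2 and assumes the
# voltage can be lowered roughly in step with frequency (undervolting).
# Static/leakage power and board losses are ignored, so treat this as a sketch.

def relative_power(freq_scale: float, volt_scale: float) -> float:
    """Return power relative to stock for the given frequency/voltage scaling."""
    return freq_scale * volt_scale ** 2

freq_scale = 0.85   # 15% underclock
volt_scale = 0.85   # assumes an undervolt of similar magnitude is stable

p = relative_power(freq_scale, volt_scale)
print(f"Relative power: {p:.2f} (~{(1 - p) * 100:.0f}% reduction)")
# -> Relative power: 0.61 (~39% reduction), in line with the ~40% figure above.

# Applied to a hypothetical 350 W board power:
print(f"Estimated draw: {350 * p:.0f} W")   # ~215 W, under the <250 W target
```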
What's even the point of the good 4080? The people who will spend $1300 on a GPU are a pretty price-inelastic demographic who might as well just buy the 4090 at $1600.

It's there to make the 4090 look enticing.
God would I love to see another RX 480 kind of release. That was really something at $200.
There's precedent for that too actually. See the short video below about "reviewing" the iPad (a/v out of sync unfortunately):
Every so often we used to get cards like these. Like the 8800GT. Legendary. Unfortunately those days seem to be over.
My last two GPUs were legendary. ATI Radeon 9700 Pro, followed by an 8800GT, which I'm still using!
The downside of DLSS is decreased quality. All the testing has been with DLSS Performance, which is basically 1080p upscaled. Plus, I don't see how having a frame inserted that has no CPU-generated game data attached to it cannot be perceived as lag if it occurs during a button press. I don't think this technology will be particularly desirable in all cases, especially if it interprets the frame incorrectly for some reason.

Yeah, I was looking over his replies on Twitter to see if he was just giving the Nvidia marketing spiel or if he truly believes what he says, and it seemed to lean towards the former. Too much use of marketing terms for me to think otherwise.
Regardless, "illusion of higher fps" is an apt way to put it. All I know is this, moving forward: Not all fps is created equal. Will there be scenarios where it is preferable to have 40 fps without DLSS 3.0 over 80 fps with DLSS 3.0 at the same input lag? The selling point, if done right, is that there would be zero cost to enabling DLSS 3.0. We'll have to see in-depth testing to find out...
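To make the "illusion of higher fps" point concrete, here's a toy sketch of why interpolated frames smooth motion without making the game more responsive. It assumes input is only sampled on real, CPU-simulated frames and that interpolation has to wait for the next real frame before presenting; the numbers are illustrative, not measurements of DLSS 3:

```python
# Toy arithmetic for the "illusion of higher fps" point. Assumption: the game
# only samples input on real, CPU-simulated frames, and the interpolator has to
# hold presentation until the next real frame exists. Numbers are illustrative.

real_fps = 40.0                        # frames the game logic actually simulates
real_frame_ms = 1000.0 / real_fps      # 25 ms between real frames
displayed_fps = 2 * real_fps           # frame generation doubles what's displayed

# Very simplified latency model: half a frame waiting to be sampled, plus one
# frame to render and present.
native_lag_ms = 0.5 * real_frame_ms + real_frame_ms

# Interpolation delay: assumed here to be one real-frame interval
# (set to 0 for the absolute best case).
interp_hold_ms = real_frame_ms
generated_lag_ms = native_lag_ms + interp_hold_ms

print(f"{real_fps:.0f} fps native:    ~{native_lag_ms:.0f} ms input lag")
print(f"{displayed_fps:.0f} fps generated: ~{generated_lag_ms:.0f} ms input lag")
# Even with interp_hold_ms = 0, the generated 80 fps only matches the lag of
# native 40 fps; it never feels like native 80 fps (~19 ms in this model).
```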
Buy the 4090, have to upgrade PSU to 1,200+ watts!
But we all know that if you spend that kind of money, in two years you are going to buy the 5090 for $2000 as well, with DP 2.0 and PCIe 5.0 😛

I agree. I run a PCIe 4 card in a Gen 3 motherboard. But if you're spending that kind of money, you would expect the best, not to mention new specs like DisplayPort 2.0. Looks to me like the safe money is to wait.
Pick up a 30-series at a super discount, use FSR to upscale if needed, and wait it out.
R81Z3N1
I also think that if you read the small print of Nvidia's slides, the up-to-4x performance claim and the general-ish 2x claim are only achieved when using DLSS 3, which I don't see enabled on the comparison card. It will be interesting to see an apples-to-apples comparison, because it seems to me that tripling the transistor count hasn't done a huge benefit for performance, relatively speaking.
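Rough math on that last point: GA102 and AD102 are commonly cited at roughly 28.3B and 76.3B transistors, so the count did roughly triple. Plugging in an assumed (not measured, since apples-to-apples numbers aren't public yet) non-DLSS-3 uplift shows why performance per transistor looks like it went down:

```python
# Quick "performance per transistor" sanity check for the point above.
# Transistor counts are the commonly cited figures for GA102 (~28.3B) and
# AD102 (~76.3B); the non-DLSS-3 performance uplift is an assumed placeholder.

ga102_transistors = 28.3e9
ad102_transistors = 76.3e9

assumed_raster_uplift = 1.7   # hypothetical 4090 vs 3090 Ti, no DLSS 3

transistor_ratio = ad102_transistors / ga102_transistors
perf_per_transistor = assumed_raster_uplift / transistor_ratio

print(f"Transistor ratio:      {transistor_ratio:.2f}x")
print(f"Assumed raster uplift: {assumed_raster_uplift:.2f}x")
print(f"Perf per transistor:   {perf_per_transistor:.2f}x of Ampere")
# With these assumptions, performance per transistor actually drops, which is
# the "tripling transistors hasn't paid off proportionally" point above.
```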