1660Ti Reviews Thread


SMOGZINN

Lifer
Jun 17, 2005
Are you talking about Turing as a whole? I don't think anyone was reasonably expecting the 1660 Ti to be a capable 4K card.

Yes, pretty much. I'm talking about its total market position with regard to all of Turing. None of these are the cards we really wanted. Instead we got a new premium feature that no one asked for and isn't even included in this card.
 

coercitiv

Diamond Member
Jan 24, 2014
Instead we got a new premium feature that no one asked for and isn't even included in this card.
We have to choose though: either we don't want the feature or we criticize the card for not having the feature. Blaming Nvidia for both would be weird, to say the least.
 
Reactions: Qwertilot

LTC8K6

Lifer
Mar 10, 2004
With the FP16 cores, the 1660ti crushes all of the 10xx cards at FP16, if that's any use to anyone.
 

SMOGZINN

Lifer
Jun 17, 2005
We have to choose though: either we don't want the feature or we criticize the card for not having the feature. Blaming Nvidia for both would be weird, to say the least.

Two things: first, I didn't say we didn't want it. I actually think that ray tracing will be a good upgrade when they finally get it implemented; it is just that we didn't ask for it. I think most of us can agree that we would rather have had 4K gaming. As good as ray tracing will look, it will not make nearly as much difference as an upgrade in resolution.
Second, I can complain about both if we assume that Nvidia prioritized implementing ray tracing in the 2000 series instead of pursuing technologies that would have made 4K gaming possible. It has an opportunity cost.
 
Reactions: OTG

VirtualLarry

No Lifer
Aug 25, 2001
With the FP16 cores, the 1660ti crushes all of the 10xx cards at FP16, if that's any use to anyone.
That might be useful going forward, if PC games start using "reduced precision" rendering/shaders. I've read that mobile games use FP16 shaders and whatnot to reduce processing and memory requirements for phones. If that gets to be a thing (PCs getting mobile-phone game ports), then I can see the use for this.

For more "traditional" PC gaming, I don't see it gaining much traction, though. Image quality is a thing for PC games. I imagine FP16 might cause some banding or whatnot, especially in motion.

Honestly, I see the 1660ti as similar in many ways to NVidia's GTX 460 GPUs. I think that it will be a hot seller, because it hits several "sweet spots".
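A quick way to sanity-check the banding worry: IEEE-754 half precision has a 10-bit mantissa, so values just below 1.0 step in increments of about 0.001, still finer than the roughly 0.004 step of an 8-bit display output; the spacing only gets coarse at large HDR magnitudes or after many accumulated operations. A minimal CUDA sketch (the kernel name is illustrative) that prints the FP16 step size at a few sample values:

Code:
#include <cstdio>
#include <cuda_fp16.h>

// Print the gap between adjacent FP16 values at a few sample points and
// compare it to the quantization step of an 8-bit display output.
__global__ void fp16_step_demo() {
    float samples[4] = {0.125f, 0.25f, 0.5f, 1.0f};
    for (int i = 0; i < 4; ++i) {
        __half h = __float2half(samples[i]);
        // Next representable FP16 value: bump the low mantissa bit.
        unsigned short bits = __half_as_ushort(h);
        float step = __half2float(__ushort_as_half(bits + 1)) - __half2float(h);
        printf("x=%.4f  fp16 step=%.6f  (8-bit step=%.6f)\n",
               samples[i], step, 1.0f / 255.0f);
    }
}

int main() {
    fp16_step_demo<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}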
 

Hitman928

Diamond Member
Apr 15, 2012
That might be useful going forward, if PC games start using "reduced precision" rendering/shaders. I've read that mobile games use FP16 shaders and whatnot to reduce processing and memory requirements for phones. If that gets to be a thing (PCs getting mobile-phone game ports), then I can see the use for this.

For more "traditional" PC gaming, I don't see it gaining much traction, though. Image quality is a thing for PC games. I imagine FP16 might cause some banding or whatnot, especially in motion.

Honestly, I see the 1660ti as similar in many ways to NVidia's GTX 460 GPUs. I think that it will be a hot seller, because it hits several "sweet spots".

Some games already use FP16 (sometimes called rapid packed math). I know the Far Cry games starting with Primal (IIRC) use FP16 calculations, and the latest Wolfenstein game uses it too. Not all calculations are done in FP16, but a significant portion of them don't need more than 16-bit precision, and you can get a nice speed bump by doing those calculations at FP16 accuracy, without any quality loss, on cards that support increased FP16 rates. Lower-bit calculations for certain use cases in games are being explored as well.
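For flavor, here is roughly what that packed-FP16 path looks like. A hedged CUDA sketch, not code from any shipping game: on double-rate FP16 hardware, each half2 instruction applies one operation to two FP16 values packed into a 32-bit register, so a shading-style computation handles two pixels per instruction.

Code:
#include <cstdio>
#include <cuda_fp16.h>

// Reinhard-style tonemap x/(1+x) on packed FP16 pairs: the packed add and
// packed divide each process two pixels' worth of data per instruction.
__global__ void tonemap_fp16_demo() {
    int i = threadIdx.x;
    __half2 x = __floats2half2_rn(0.25f * i, 0.25f * i + 0.125f);
    __half2 one = __float2half2_rn(1.0f);
    __half2 y = __h2div(x, __hadd2(one, x));  // two tonemapped values at once
    printf("thread %d: %f %f\n", i, __low2float(y), __high2float(y));
}

int main() {
    tonemap_fp16_demo<<<1, 4>>>();
    cudaDeviceSynchronize();
    return 0;
}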
 

maddie

Diamond Member
Jul 18, 2010
That might be useful going forward, if PC games start using "reduced precision" rendering/shaders. I've read that mobile games use FP16 shaders and whatnot to reduce processing and memory requirements for phones. If that gets to be a thing (PCs getting mobile-phone game ports), then I can see the use for this.

For more "traditional" PC gaming, I don't see it gaining much traction, though. Image quality is a thing for PC games. I imagine FP16 might cause some banding or whatnot, especially in motion.

Honestly, I see the 1660ti as similar in many ways to NVidia's GTX 460 GPUs. I think that it will be a hot seller, because it hits several "sweet spots".
Larry, what has happened to you? A close-to-$300 GPU as the sweet spot?

On reflection, however, sweet spot is a relative rather than an absolute term, so buy on.
 
Reactions: coercitiv

maddie

Diamond Member
Jul 18, 2010
Some games already use FP16 (sometimes called rapid packed math). I know the Far Cry games starting with Primal (IIRC) use FP16 calculations, and the latest Wolfenstein game uses it too. Not all calculations are done in FP16, but a significant portion of them don't need more than 16-bit precision, and you can get a nice speed bump by doing those calculations at FP16 accuracy, without any quality loss, on cards that support increased FP16 rates. Lower-bit calculations for certain use cases in games are being explored as well.
Correct me if I'm wrong, but didn't FP16 always exist? It was the fact that GPUs took just as long to process it as FP32 that made it pretty useless.

Rapid packed math is a new technique to double the computation rate by doing two FP16 operations at once in the same FP32 resource window.
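A minimal sketch of that packing with CUDA's half2 intrinsics (the values are arbitrary): two FP16 numbers occupy one 32-bit register, and a single packed instruction produces both results.

Code:
#include <cstdio>
#include <cuda_fp16.h>

__global__ void packed_pair_demo() {
    // Pack two FP16 values into each 32-bit half2 register.
    __half2 a = __halves2half2(__float2half(1.5f), __float2half(2.25f));
    __half2 b = __halves2half2(__float2half(0.5f), __float2half(0.75f));
    __half2 sum = __hadd2(a, b);  // one instruction, two FP16 additions
    printf("low: %f  high: %f\n", __low2float(sum), __high2float(sum));
}

int main() {
    packed_pair_demo<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}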
 

Hitman928

Diamond Member
Apr 15, 2012
Correct me if I'm wrong, but didn't FP16 always exist? It was the fact that GPUs took just as long to process it as FP32 that made it pretty useless.

Rapid packed math is a new technique to double the computation rate by doing two FP16 operations at once in the same FP32 resource window.

Yes, FP16 has existed for a long time; it's just that it hasn't been used in modern PC gaming because, as you said, there was very little benefit to doing so. Now that GPUs are starting to be able to process FP16 at twice the rate of FP32, it becomes worthwhile to use FP16 in the code where appropriate.
 
Reactions: coercitiv

coercitiv

Diamond Member
Jan 24, 2014
Two things: first, I didn't say we didn't want it. I actually think that ray tracing will be a good upgrade when they finally get it implemented; it is just that we didn't ask for it. I think most of us can agree that we would rather have had 4K gaming. As good as ray tracing will look, it will not make nearly as much difference as an upgrade in resolution. Second, I can complain about both if we assume that Nvidia prioritized implementing ray tracing in the 2000 series instead of pursuing technologies that would have made 4K gaming possible. It has an opportunity cost.
If you would rather have better 4K gaming from Turing (as many of us openly said already), then you don't want the extra specialized RTX hardware in Turing, for now at least; hence, criticizing the 1660 Ti for not having the features would be borderline hypocritical. (Keep in mind the Turing chip inside the 1660 Ti is physically built without ray tracing cores and without tensor cores; it's not like they disabled the features via software for artificial product segmentation.)
 

SMOGZINN

Lifer
Jun 17, 2005
If you would rather have better 4K gaming from Turing (as many of us openly said already), then you don't want the extra specialized RTX hardware in Turing, for now at least; hence, criticizing the 1660 Ti for not having the features would be borderline hypocritical. (Keep in mind the Turing chip inside the 1660 Ti is physically built without ray tracing cores and without tensor cores; it's not like they disabled the features via software for artificial product segmentation.)

Except they did spend the resources developing the specialized RTX hardware instead of working on better 4K gaming, and have now released their mainstream card, which has neither. It is not hypocritical to complain that, instead of working on something that would have benefited their mainstream audience, they worked on something that they now can't even afford to put in their mainstream card.
 
Reactions: OTG

GodisanAtheist

Diamond Member
Nov 16, 2006
If you would rather have better 4K gaming from Turing (as many of us openly said already), then you don't want the extra specialized RTX hardware in Turing, for now at least; hence, criticizing the 1660 Ti for not having the features would be borderline hypocritical. (Keep in mind the Turing chip inside the 1660 Ti is physically built without ray tracing cores and without tensor cores; it's not like they disabled the features via software for artificial product segmentation.)

-I think I understand the general sentiment: NV isn't serious about RTX until you can access the tech and it's functional in the mainstream space.

Until the 1660 Ti there was a lot of speculation that even if RT cores were out (and maybe they weren't), tensor cores were in, so DLSS (for what it's worth) would be accessible to the unwashed masses.

Now here we are: the 1660 Ti has no RTX capability whatsoever, so NV was never really serious about RTX tech this generation. If they're not serious about RTX adoption this gen, then why did they put resources toward RTX that would have been better spent on the traditional performance elements in their high end?

There is an obvious and solid counterargument that NV needs to build the ecosystem on the backs of early adopters who are willing to shell out the extra money to finance this endeavour, and that the tech simply isn't viable in the mainstream segment until 7nm.
 

Dribble

Platinum Member
Aug 9, 2005
There is an obvious and solid counterargument that NV needs to build the ecosystem on the backs of early adopters who are willing to shell out the extra money to finance this endeavour, and that the tech simply isn't viable in the mainstream segment until 7nm.

Which is obviously true. The other reason to do it now is that they've probably just about done it early enough to get it into the next-gen consoles, which is key for the most widespread adoption. I can't believe Microsoft won't have it in the Xbox 2, seeing as they own the DX12 DXR extension. If MS have it, then Sony will have to have it too. If Nvidia had waited a year, there's no chance the consoles would get it till the Xbox 3, in 7 or 8 years' time.
 

maddie

Diamond Member
Jul 18, 2010
Which is obviously true. The other reason to do it now is that they've probably just about done it early enough to get it into the next-gen consoles, which is key for the most widespread adoption. I can't believe Microsoft won't have it in the Xbox 2, seeing as they own the DX12 DXR extension. If MS have it, then Sony will have to have it too. If Nvidia had waited a year, there's no chance the consoles would get it till the Xbox 3, in 7 or 8 years' time.
This post has me confused. Nvidia, consoles, RTX adoption?
 
Reactions: coercitiv

coercitiv

Diamond Member
Jan 24, 2014
Except they did spend the resources developing the specialized RTX hardware instead of working on better 4K gaming, and have now released their mainstream card, which has neither. It is not hypocritical to complain that, instead of working on something that would have benefited their mainstream audience, they worked on something that they now can't even afford to put in their mainstream card.
Now here we are: the 1660 Ti has no RTX capability whatsoever, so NV was never really serious about RTX tech this generation. If they're not serious about RTX adoption this gen, then why did they put resources toward RTX that would have been better spent on the traditional performance elements in their high end?
This is absurd. They did work on something else; ray tracing is the future of graphics rendering, and the R&D is well spent. You can blame Nvidia on timing, on the implementation in their first gen, on pricing, on having a green logo and a leather jacket with an attitude, but complaining that the mainstream card lacks specialized units that would have lowered its raster performance is ludicrous. I can't believe I have to defend Nvidia on this.

Imagine complaining that the villain is even more unscrupulous because one day he helped an old lady cross the street. Your honor, his act of kindness only reinforces the ruthlessness of his prior actions! He is the worst of candy thieves!
 
Reactions: DooKey

Dribble

Platinum Member
Aug 9, 2005
This post has me confused. Nvidia, consoles, RTX adoption?
The consoles will probably use AMD chips with hardware ray tracing support for DX12 DXR, because that's what MS wants; that means better PC support for ray tracing, which suits Nvidia.
 

SMOGZINN

Lifer
Jun 17, 2005
ray tracing is the future of graphics rendering

I agree it is the future of graphics rendering (well, it is part of the future; it is really just one more tool in a large chest, not the second coming of video graphics Jesus); unfortunately, they spent their time and money on it today. We are complaining about timing. They based their product line around a technology that they can't even put into the mainstream cards.

The other reason to do it now is that they've probably just about done it early enough to get it into the next-gen consoles

I think the odds of it getting into the next gen of consoles are just about nil. The 1660 Ti is about the highest-end graphics card we could expect in the consoles, and if they could not afford to put it in this card, they can't afford to put it in the cards for the consoles either. Microsoft is not going to make a $1000+ Xbox. The cards that have RTX cores cost more than the entire budget of either console.
 

maddie

Diamond Member
Jul 18, 2010
The consoles will probably use AMD chips with hardware ray tracing support for DX12 DXR, because that's what MS wants; that means better PC support for ray tracing, which suits Nvidia.
Not if Nvidia has proprietary code paths for the RT. Is RTX exactly equal to DXR?

MS wants RT to conform to the DXR format. That is the only way it will benefit the consoles as well.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
This is absurd. They did work on something else; ray tracing is the future of graphics rendering, and the R&D is well spent. You can blame Nvidia on timing, on the implementation in their first gen, on pricing, on having a green logo and a leather jacket with an attitude, but complaining that the mainstream card lacks specialized units that would have lowered its raster performance is ludicrous. I can't believe I have to defend Nvidia on this.

Imagine complaining that the villain is even more unscrupulous because one day he helped an old lady cross the street. Your honor, his act of kindness only reinforces the ruthlessness of his prior actions! He is the worst of candy thieves!

-Can't speak for Smog, but I think you have my position exactly backwards. I think the 1660 Ti is a very solid card (a touch pricey, but that's neither here nor there) in terms of tech/arch.

However, I do wish NV had chosen to scale this arch up to at least the 2080 (TU104) level for smaller, possibly more performant, price-conscious parts. Leave RTX for the TU102 parts, or as a special Titan-tier part. Make no mistake, I understand why NV took this route; I just don't like it as a consumer.

To use your analogy, it's like watching everyone cheer that the villain is the good guy and everything is forgiven because one day he helped an old lady cross the street.
 

railven

Diamond Member
Mar 25, 2010
Two things: first, I didn't say we didn't want it. I actually think that ray tracing will be a good upgrade when they finally get it implemented; it is just that we didn't ask for it. I think most of us can agree that we would rather have had 4K gaming. As good as ray tracing will look, it will not make nearly as much difference as an upgrade in resolution.
Second, I can complain about both if we assume that Nvidia prioritized implementing ray tracing in the 2000 series instead of pursuing technologies that would have made 4K gaming possible. It has an opportunity cost.

I'd disagree here. I'm already hungry for what DXR can do in RPGs or games that are more about world immersion than BF. I want to play Metro so bad, but screw Epic Games (I have some principles... some). DXR to me has been a huge upgrade; I wish NV would get the ball rolling with their partners. I've been 4K gaming (with compromises) since last year, and BF5 looks AMAZING with DXR on. I'd gladly drop my resolution from 4K to 21:9 1440p to get acceptable DXR performance. That's why the Port Royal benchmark is such a tease: if this is the end goal, get us there ASAP. Performance will come, but right now we have beautiful DXR; throw on DLSS and you get blur (note: I haven't tested first hand since my card was space invaded, so I'm basing this on YouTube compression videos and user images), which is a far cry from Port Royal.

I didn't ask for DXR, but I'm glad I got it, because now I want more of it, and hopefully better implementations of it.
 
Reactions: DooKey

Ranulf

Platinum Member
Jul 18, 2001
Eh, I'm a cranky curmudgeon when it comes to price/perf, but a 4K card at "reasonable rates" (40 fps?) is asking a bit much from a barely sub-$300 card.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
I'd disagree here. I'm already hungry for what DXR can do in RPGs or games that are more about world immersion than BF. I want to play Metro so bad, but screw Epic Games (I have some principles... some). DXR to me has been a huge upgrade; I wish NV would get the ball rolling with their partners. I've been 4K gaming (with compromises) since last year, and BF5 looks AMAZING with DXR on. I'd gladly drop my resolution from 4K to 21:9 1440p to get acceptable DXR performance. That's why the Port Royal benchmark is such a tease: if this is the end goal, get us there ASAP. Performance will come, but right now we have beautiful DXR; throw on DLSS and you get blur (note: I haven't tested first hand since my card was space invaded, so I'm basing this on YouTube compression videos and user images), which is a far cry from Port Royal.

I didn't ask for DXR, but I'm glad I got it, because now I want more of it, and hopefully better implementations of it.

-It needs to come down in cost then; otherwise NV will be chasing the proverbial chicken and egg.

I think everything would have been better received this generation if all the parts (102/4/6/16, etc.) had slotted into their historic price points and NV had just taken one in the margins for the team, to really get a broad base of RTX hardware out there. No one could argue with RTX 2070 performance plus RTX at 1660 Ti prices, and so on down the stack, but instead here we are.
 

jpiniero

Lifer
Oct 1, 2010
Remember that Turing was supposed to be on Samsung 10 nm (I think?). It obviously wouldn't have helped with cost, but perhaps it would have clocked higher.
 

jpiniero

Lifer
Oct 1, 2010
Not if Nvidia has proprietary code paths for the RT.

Doesn't appear that there's anything proprietary; all the RT cores do is accelerate DXR. And there is a compute-only fallback, of course, but it's tough to beat fixed function.
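To make "tough to beat fixed function" concrete, here is a hedged sketch of the kind of per-ray work a compute-only fallback performs in plain shader code: a Möller-Trumbore ray/triangle test, the sort of intersection (along with BVH traversal) that RT cores move into dedicated hardware. CUDA is used for illustration; this is not DXR's actual fallback layer.

Code:
#include <cstdio>

struct Vec3 { float x, y, z; };

__device__ Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Moller-Trumbore intersection: returns hit distance t, or -1 on miss.
__device__ float ray_tri(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 p = cross(dir, e2);
    float det = dot(e1, p);
    if (fabsf(det) < 1e-8f) return -1.0f;   // ray parallel to triangle
    float inv = 1.0f / det;
    Vec3 tvec = sub(orig, v0);
    float u = dot(tvec, p) * inv;           // first barycentric coordinate
    if (u < 0.0f || u > 1.0f) return -1.0f;
    Vec3 q = cross(tvec, e1);
    float v = dot(dir, q) * inv;            // second barycentric coordinate
    if (v < 0.0f || u + v > 1.0f) return -1.0f;
    return dot(e2, q) * inv;                // distance along the ray
}

__global__ void demo() {
    Vec3 o = {0.f, 0.f, -1.f}, d = {0.f, 0.f, 1.f};
    Vec3 v0 = {-1.f, -1.f, 0.f}, v1 = {1.f, -1.f, 0.f}, v2 = {0.f, 1.f, 0.f};
    printf("hit t = %f\n", ray_tri(o, d, v0, v1, v2));  // expect t = 1.0
}

int main() {
    demo<<<1, 1>>>();
    cudaDeviceSynchronize();
    return 0;
}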