Nvidia RTX 2080 Ti / 2080 / 2070 information thread. Reviews and prices September 14.

Status
Not open for further replies.

Rifter

Lifer
Oct 9, 1999
11,522
751
126
These prices seem insane. I'll reserve judgement until third-party reviews are out, but man, first Vega let me down and now this crap; at this rate I'll end up keeping my poor RX 480 for yet another generation. Sad days indeed.
 

dlerious

Platinum Member
Mar 4, 2004
2,224
995
136
6 months? Yeah, very little chance. But make no mistake, this will almost certainly be a very short generation. They're releasing 12nm cards when 7nm is already available, and the ray tracing stuff, while absolutely freaking awesome, looks like it's too slow to be all that useful. I do expect performance to increase a bit by the time games are released, but the 2080 Ti struggling at 1080p is not a good sign. Even with lack of competition, there are plenty of internal reasons for a 7nm refresh relatively soon. It's not like Nvidia prefers selling huge dies bloated with ray tracing hardware of questionable usability for early adopters in the consumer space. And ray tracing works as the enormous game changer and killer feature it should be only if it's performant enough to actually use. Ideally for Nvidia, they should get to that spot before AMD produces a card with its own similar ray tracing features.
How many games right now do you need better than a GTX1050 or GTX1060 to play? The GTX2050 and GTX2060 aren't getting ray tracing, so depending on how many of those cards there are, game devs may be a little reluctant to forego rasterization for ray tracing in their games. I wonder how many people would be willing to pay $600 on a GPU for a budget build.
 

frowertr

Golden Member
Apr 17, 2010
1,372
41
91
What’s with all the price angst? I bought a Monster Voodoo 1 in 1998ish for $350 which comes to ~$530 in today’s dollars with inflation of 2.15%. That is still less than the RTX 2070 MSRP of $499.

If you want the latest and greatest, crack open your wallet, buy the RTX 2080Ti, and stop whining. Else, buy a lower tier card and be happy that you are paying less than us early adopters were paying 20 years ago.
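For reference, the inflation claim above compounds out as stated; a minimal sketch, using the post's own figures ($350 in roughly 1998, 2.15% average annual inflation, ~20 years):

```python
def inflate(price, rate, years):
    """Compound a price forward by a constant annual inflation rate."""
    return price * (1 + rate) ** years

# The post's figures: $350 circa 1998, 2.15%/year, ~20 years to 2018.
today = inflate(350, 0.0215, 20)
print(round(today))  # 536, close to the ~$530 quoted
```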
 

SMOGZINN

Lifer
Jun 17, 2005
14,359
4,640
136
How many games right now do you need better than a GTX1050 or GTX1060 to play?

That depends on the resolution and quality. If you want to play at 4K, you need better than a 1060.

The GTX2050 and GTX2060 aren't getting ray tracing
We don't know that yet. There has been no official word on that at all.

depending on how many of those cards there are, game devs may be a little reluctant to forego rasterization for ray tracing in their games.
Even if those cards include it I doubt most game devs will be eager to adopt this technology anytime soon. There simply are not going to be enough cards out there in systems that can handle it, and no consoles will have them for years yet. As much as we wish otherwise it is the console market that most devs are really aiming for.

I wonder how many people would be willing to pay $600 on a GPU for a budget build.
Basically none.
 
  • Like
Reactions: dlerious

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
What’s with all the price angst? I bought a Monster Voodoo 1 in 1998ish for $350 which comes to ~$530 in today’s dollars with inflation of 2.15%. That is still less than the RTX 2070 MSRP of $499.

If you want the latest and greatest, crack open your wallet, buy the RTX 2080Ti, and stop whining. Else, buy a lower tier card and be happy that you are paying less than us early adopters were paying 20 years ago.
Are you comparing a 3rd to 4th tier card with the absolute BOSS of its time?

Crusade times are here with the crusaders swarming us heathens who refuse the gifts of Nvidia as too high a price. I shall not surrender.
 
  • Like
Reactions: Ranulf and psolord

maddie

Diamond Member
Jul 18, 2010
5,204
5,614
136
That depends on the resolution and quality. If you want to play at 4K, you need better than a 1060.


We don't know that yet. There has been no official word on that at all.


Even if those cards include it I doubt most game devs will be eager to adopt this technology anytime soon. There simply are not going to be enough cards out there in systems that can handle it, and no consoles will have them for years yet. As much as we wish otherwise it is the console market that most devs are really aiming for.


Basically none.
Good points.

We have two opposing forces at work.
Nvidia trying to raise prices & margins whilst simultaneously trying to sell more cards containing a new tech that needs a larger user base for rapid adoption.

Surely they see these working against each other?
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
Good points.

We have two opposing forces at work.
Nvidia trying to raise prices & margins whilst simultaneously trying to sell more cards containing a new tech that needs a larger user base for rapid adoption.

Surely they see these working against each other?

There will not be rapid adoption. We had multiple cores in CPUs for a long while before we really started getting games that used more than one. Even now it still lags behind.

These cards will be like PhysX: neat, but not a big factor.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,373
8,067
136
These prices seem insane. I'll reserve judgement until third-party reviews are out, but man, first Vega let me down and now this crap; at this rate I'll end up keeping my poor RX 480 for yet another generation. Sad days indeed.

Same here. I never imagined I'd still be using my GTX 970 four years later, but the GTX 1070 with the big price hike to $450 for roughly 60% more performance wasn't enough IMO. And no way I'm buying Pascal now with Turing out considering how little driver optimization we're likely to see say a year from now on it. If the 2060 isn't really impressive on price to performance I guess I'm sitting out the gpu market until 7nm Navi / Turing Refresh.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Good points.

We have two opposing forces at work.
Nvidia trying to raise prices & margins whilst simultaneously trying to sell more cards containing a new tech that needs a larger user base for rapid adoption.

Surely they see these working against each other?
I'm assuming making the FE non-blower, unlike its beefier Quadros, fits this paradigm as well, in that it restricts the quantity and use cases of these cards in more professional environments. Case thermals aren't going to handle 500 watts of heat being dumped into the case, in addition to the CPU's, very well unless you go liquid, which is a big no-no in professional environments.

Also, this allows them to crank the performance envelope as they are giving no care/concern for the waste heat coming off of these cards. This also puts a strong challenge to AIBs now that Nvidia has cloned their packaging.

All in all, Nvidia is going for that kill shot at its peak. A very dangerous position and attempt, as nothing is forever, and greed in the final stages of one's form leads to diminished returns and consequences. This plus they just completed a shiny new dedicated campus in Silicon Valley, invoking the long-standing valley curse of a 5-year retraction/collapse:
https://www.businessinsider.com/poorly-timed-headquarters-2009-11
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Same here. I never imagined I'd still be using my GTX 970 four years later, but the GTX 1070 with the big price hike to $450 for roughly 60% more performance wasn't enough IMO. And no way I'm buying Pascal now with Turing out considering how little driver optimization we're likely to see say a year from now on it. If the 2060 isn't really impressive on price to performance I guess I'm sitting out the gpu market until 7nm Navi / Turing Refresh.
I'm still on Maxwell for gaming.
I use Pascal for compute.
I bought a Vega card to evaluate it and later dropped it (sold it for double what I paid, thanks memecoin) due to overstated features that never came to fruition, poor driver support, and poor dev support.

Now I see similar traces with Nvidia... I was seconds away from ordering a 2080 tbqh but it dawned on me that with a $800 price tag and yet to be detailed performance, I might not be able to offload it if it turns out to be a dud. I'm concerned about dev access to Nv-link, ray trace cores, and tensor cores. Namely ray trace cores and Nv-link.

I thought the 2070 might be a more reasonable proposition, but I read that they cut the die down to even fewer features, which don't include NVLink on the 2070... At $600 with such cut features, it seems they nailed the price lock again.

So, pay a yuge price premium, wait months until drivers and dev tools settle in, and pray; or just wait another year until it matures, they go to 7nm, see what AMD does in terms of maturing Vega, and see what happens with Vulkan APIs, because DirectX is coming nowhere near my all-Linux stack. I think I'm settled on waiting it out. Just comforting gathering round the punchbowl as the excitement wears off.

I also think this slow leak marketing fad, pre-order, teases, leaks, etc that has become popular in tech is detestable and long in the tooth. I think it is serving to piss people off more than the intended effect.

Edit : Also... I absolutely detest the cloud, telemetry data collection, etc. Nvidia gimped hardware in Maxwell/Pascal GPUs some time ago and only allow access through Geforce Experience. Their decision to become further intrusive with their 'Cloud' model and now tether the hardware even further through DLSS/etc makes me even more upset and in need of details about what's going on behind the scenes.
 
Last edited:

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
With a neural net, don't expect to know what's going on behind the scenes.

But with that said, we should be able to get good results for basically free after it's been trained.
 

TheF34RChannel

Senior member
May 18, 2017
786
310
136
GTX 1080 - 05/27/2016
NVIDIA Titan X - 08/02/2016

GTX 1080 Ti - 03/10/2017
NVIDIA Titan Xp - 04/06/2017

FYI 2080Ti is not the fully unlocked chip. At the time of the launch, AMD had nothing to challenge them at the top. I'll dream on, thank you.

Well it kinda is the full chip on the gamers side anyway. The fat full chips are reserved for Quadro.

Well you got the top performer for below $500 not too long ago so yeah I can expect it to be at least justifiable by average Joe.

The point remains: performance/$ has been stagnating at best. So with these price hikes, anything below a huge increase of >50% is just an effing cash grab. I mean, the 2070 costs 50% more, so even with a 50% performance increase, performance/$ does not improve. It would need a >70% boost over a 1070 to make me think, OK, I get why you buy a mid-range GPU for $600. Same for the other cards.

And to be clear, 70% boost in traditional gaming with no HDR, raytracing or other trickery or cherry picking.
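The perf-per-dollar arithmetic above can be sketched quickly; the percentages are the post's own figures, and the function is just the ratio of relative gains:

```python
def perf_per_dollar_ratio(perf_gain, price_gain):
    """New perf/$ relative to old: (1 + perf_gain) / (1 + price_gain)."""
    return (1 + perf_gain) / (1 + price_gain)

# A 50% faster card at a 50% higher price leaves perf/$ exactly flat:
print(perf_per_dollar_ratio(0.50, 0.50))  # 1.0
# Even a 70% uplift at +50% price improves perf/$ by only ~13%:
print(round(perf_per_dollar_ratio(0.70, 0.50), 2))  # 1.13
```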

I look at it like this:

2080 Ti replaces Titan Xp
2080 the 1080 Ti
2070 the 1080/1070 Ti

But this time the FE models supposedly have better cooling and higher clocks than the reference cards?

The reference cards Jensen mentioned don't exist anywhere other than in his mind. No AIB will be making cards cheaper and slower than Nvidia reference cards (I refuse to call them FE).

To me it sounds like him saying "if we had cards one tier lower they would've cost that".

This guy says he was at one of the events and he saw BF V with RTX on at what seemed like 1440p, and fps was above 100.

https://www.reddit.com/r/nvidia/com...80ti_performance_i_was_at_the_event/?sort=new

If I want to go to a 4k monitor what type of refresh rates are out there right now? Just dont know if it would be worth it over 1440p yet.

I wouldn't believe him at all. Some sites played it and said it didn't even feel like 60 and I find them way more reputable than a reddit user ;)

I bet you won't see a new 3000 series for 1.5 years. IOW, on the shorter end of recent release schedules.

Though they could have 7nm models slip into the 2000 series later next year, but I wouldn't expect anything in 2019 to substantially surpass 2080Ti performance. They might just start 7nm at 2050 level. A super small die to test the 7nm waters before embarking on a serious performance uplift for early 2020.

I am betting 3000 series will be a Feb/March 2020 release. By then 7nm will have the bugs worked out, and yields up for bigger dies, to make a serious performance uplift.

That would make the 20 series much more viable (through longevity) than they currently seem. I still believe this is short stopover on the way to 7nm in 2019.

Looking at the teardown of the dual-fan Founders Edition, the fins run the wrong direction to blow air out of the case, so nearly all of it will exhaust inside the case.

I really prefer the blower concept. I wish more effort was made to make better blower cards. Or half and half cards with dual fans, where the back fan blows out of the case, while the front exhaust into the case. Exhausting heat out of the case is a very sound concept.

Yeah the reference cooler isn't good at all. They could have taken a look at any of their partners how to do it right but no. Blowers could indeed be made much better, up to now they 'look' cheap. And I've used them for years.

Speaking of coolers, am I the only one worried about how heavy these AIB coolers are getting (looking at EVGA's 2.75- and 3-slot designs) and the sag they'll cause?
 
Last edited:

ub4ty

Senior member
Jun 21, 2017
749
898
96
How many games right now do you need better than a GTX1050 or GTX1060 to play? The GTX2050 and GTX2060 aren't getting ray tracing, so depending on how many of those cards there are, game devs may be a little reluctant to forego rasterization for ray tracing in their games. I wonder how many people would be willing to pay $600 on a GPU for a budget build.
Even though I have a load of 1080s for compute, I game on an even less capable GPU than the GTX 1050 and GTX 1060. I have zero problems with any title. I've been gaming since gaming was gaming, from Duke Nukem in DOS mode to Counter-Strike to a slew of the big titles. In high action, I'm not processing details; I couldn't care less. If the frame count is too low, you simply reduce the graphics settings. The puritan gamer does this to remain competitive, especially in multiplayer first-person shooters.

Somewhere along the way, computing and gaming became mainstream. The mass of people who used to pick on those who engaged in this niche now embrace it and call it their own. In that, much of the core ideology was lost. The masses care more about comparisons and being on top than the actual experience and objective. Games have been flooded with political foolishness. Fun and challenge have been overridden by marketing and appeals to mass sentiment. So, tbqh, there hasn't been much appeal to high-range gaming.

You can game on Maxwell in just about every title out there because the nature of the games is that they are locked in a Maxwell era. There are a few titles that seem to break this trend, which look to be set for 2020+ release. Therein lies the sale for the new features. For now, this has all been one big beta-testing, mainstream profiteering circus. From 4K gaming to 140fps nonsense... these are products of an over-commercialized and overstimulated mainstream market. If anything, real hardcore gamers went to consoles when this ridiculously expensive PC gaming fad took off.

At $800/$1200, the absolute shark has been jumped. This is not for gamers anymore. This is limousine service to a 'take my wallet' mainstream segment of a market.

Shadows and higher quality are not what attracts a genuine gamer. They're luxury add-ons.
[Image: csgo-settings.jpg, pro CS:GO player settings. Most used resolution: 1024x768 (4:3)]


Also, paying $600-$800 for a cellphone is a mainstream fad.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Yeah the reference cooler isn't good at all. They could have taken a look at any of their partners how to do it right but no.

You mean the old reference single blower? I think it's ok for lower power cards used in small cases.

Or did you mean the new dual-fan FE cooler? That looks like a reasonable two-fan cooler to me. It will probably hold its own against other two-fan designs. Three-fan monsters will likely pull ahead.
 

TheF34RChannel

Senior member
May 18, 2017
786
310
136
You mean the old reference single blower? I think it's ok for lower power cards used in small cases.

Or did you mean the new dual-fan FE cooler? That looks like a reasonable two-fan cooler to me. It will probably hold its own against other two-fan designs. Three-fan monsters will likely pull ahead.

Yeah sorry, the new one. I understood the base plate doesn't touch, and thus doesn't cool, many significant components, unlike the AIB designs that have been disassembled so far.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
You mean the old reference single blower? I think it's ok for lower power cards used in small cases.

Or did you mean the new dual-fan FE cooler? That looks like a reasonable two-fan cooler to me. It will probably hold its own against other two-fan designs. Three-fan monsters will likely pull ahead.
https://www.nvidia.com/en-us/design-visualization/quadro-desktop-gpus/
[Image: Quadro RTX 6000, blower-style cooler]

Blowers are designed to be a good citizen in a scalable professional environment. Nvidia made it clear that they want people to pay a pro premium and not use GeForce cards in the data center. They have now intentionally nerfed this capability for GeForce 20 by removing the blower design from the reference card. If it's good enough for a Quadro that is 2-3x more expensive, it's good enough for GeForce.

The following:
[Image: Supermicro 4028GR-TR loaded with eight GTX 1080 Ti cards]

was becoming a trend, whereas they wanted the following:
[Image: TYAN FT77C-B7079 with eight NVIDIA Tesla P100 GPUs]



https://www.digitaltrends.com/computing/nvidia-bans-consumer-gpus-in-data-centers/
https://www.theregister.co.uk/2018/01/03/nvidia_server_gpus/

You can't stack the GeForce 20 non-blower FE cards.
Pony up $2,300 for the lowest-end Quadro.
Consumer GPUs now have prior Quadro pricing.

Nvidia also had plans to stick it to AIBs. Without having to be a good citizen in a computer case (dumping all of your waste heat into the case), they can bump the clocks/performance/power utilization to AIB levels.
 
  • Like
Reactions: moonbogg

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
You know, I just realized something. With Turing, Nvidia has gone full circle back to the days of video cards where every manufacturer had hardware features unsupported by the others yet wanted games to take advantage of their features. That time was a fragmented mess. MS hasn't released DXR. Vulkan doesn't have a ray tracing solution yet. With Turing, Nvidia has ignored the decades-long agreement that standardized APIs, not hardware vendors, would drive feature sets. This is what near-monopoly power looks like.
 
Last edited:

realibrad

Lifer
Oct 18, 2013
12,337
898
126
You know, I just realized something. With Turing, Nvidia has gone full circle back to the days of video cards where every manufacturer had hardware features unsupported by the others yet wanted games to take advantage of their features. That time was a fragmented mess. MS hasn't released DXR. Vulkan doesn't have a ray tracing solution yet. With Turing, Nvidia has ignored the decades-long agreement that standardized APIs, not hardware vendors, would drive feature sets. This is what near-monopoly power looks like.

You can do that when you dominate the market.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,373
8,067
136
Well you got the top performer for below $500 not too long ago so yeah I can expect it to be at least justifiable by average Joe.

The point remains: performance/$ has been stagnating at best. So with these price hikes, anything below a huge increase of >50% is just an effing cash grab. I mean, the 2070 costs 50% more, so even with a 50% performance increase, performance/$ does not improve. It would need a >70% boost over a 1070 to make me think, OK, I get why you buy a mid-range GPU for $600. Same for the other cards.

And to be clear, 70% boost in traditional gaming with no HDR, raytracing or other trickery or cherry picking.

These prices are so perverse. In seven years prices have now more than tripled.

RTX 2080 Ti is the 2018 equivalent to the GTX 570 at 3.6x the price
RTX 2080 is the 2018 equivalent to the GTX 560 Ti at 3.2x the price
RTX 2070 is the 2018 equivalent to the GTX 560 at 3x the price
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,373
8,067
136
A lot of people may have been waiting for the new cards to come out. Now that they're out and so expensive that only Jensen can afford one, maybe a lot of people will give up on the new stuff and buy a Pascal card. The good news is 1080Ti's are in stock and below MSRP in many cases. I saw some for $660. I'd expect a lot of X70 buyers to grab a 1070 or 1070Ti now that they realize their upgrade was priced at a hilarious $600. Like seriously, milk through the nose funny these prices are.

People holding out for the new cards now have two choices: buy a Pascal card or wait another year to see what happens. I guess they could always just buy a 2000 series, lol. I feel like such a cruel bastard even suggesting that's an option. It's like suggesting that people have the "option" to drink bleach or something. Just doesn't feel right saying it.

I wouldn't touch a 1070 or 1070 Ti when Pascal is more than two years old now. The 1070 is still above fake MSRP on newegg and 1070 Ti is above MSRP there except for one EVGA blower and a Zotac dual fan card. No thanks. If 2060 isn't amazing I'll stick with my 970 another year.
 
  • Like
Reactions: moonbogg

ub4ty

Senior member
Jun 21, 2017
749
898
96
You know I just realized something. With Turing Nvidia has gone full circle back to the days of video cards where every manufacturer had hardware features unsupported by the others yet wanted games to take advantage their features. This time was a fragmented mess. MS hasn't released DXR. Vulkan doesn't have a ray tracing solution yet. With Turing Nvidia has ignored the decades long agreement that standardized APIs would drive featuresets and not hardware vendors. This is what near monopoly power looks like.
And this is what historic falls from grace at the peak begin as.

You can do that when you dominate the market.
Until your shiny new proprietary approach fails or is rejected, and doing so with no regard for your customers results in you losing your business to new and existing competition that capitalizes on the sizable and exploitable value gap you have created.

Giant companies fail for a reason. Greed clouds otherwise sensible judgements at exactly the worst time. A company over-estimates its growth potential at its peak, gouges its customers too hard, and over-invests in a proprietary one-off approach... and then gets hit with reality.

Intel is a recent example that has yet to play out, with boatloads of older tech companies as earlier examples.
 
  • Like
Reactions: moonbogg

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
And this is what historic falls from grace at the peak begin as.

I was going to add that I wouldn't be surprised if Nvidia cut out AIBs altogether in their pursuit of more money and control. Are all of the moves Nvidia has been making going to lead them to become another 3dfx? I mean, they bought 3dfx's assets. They know how they got them. Could they possibly be that dumb or have that much hubris?
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
I wouldn't touch a 1070 or 1070 Ti when Pascal is more than two years old now. The 1070 is still above fake MSRP on newegg and 1070 Ti is above MSRP there except for one EVGA blower and a Zotac dual fan card. No thanks. If 2060 isn't amazing I'll stick with my 970 another year.
Feels good having bought Pascal around launch and loading up when they were in comfortable ranges. Bought a 1070ti most recently for $389. Something that can easily last 5 years. I'm really over paying premiums for hardware to be a beta tester.
 
  • Like
Reactions: moonbogg

moonbogg

Lifer
Jan 8, 2011
10,736
3,454
136
I'm kind of concerned that AI hardware requirements will give Nvidia control over a hell of a lot more than just the world of video games. I know it sounds a little tin foil hat-ish right now, but I don't see it being far fetched by any stretch. Seriously. It can go very badly IMO.
 