Nvidia RTX 2080 Ti / 2080 / 2070 information thread. Reviews and prices September 14.

Status
Not open for further replies.

MrTeal

Diamond Member
Dec 7, 2003
3,911
2,677
136
If Nvidia wants RT to succeed sooner rather than later, they need to go all in on it. Not half-assed like PhysX; they need a significant number of AAA titles to support it. They will have to look past short-term shareholders and take a temporary hit on their margins.

Already, based on Turing pricing, it doesn't look like any of the above is going to happen. Turing adoption will be slower and lower than Pascal was over Maxwell, less than Maxwell was over Kepler, etc. It's hard to push features for hardware that is priced out of 80% of the potential market.

Turing is likely going to end up too forward-looking, too expensive, and too underpowered for most of its newest features.
I agree with you. Nvidia has the opportunity to really push RT adoption and continue to increase their lead in the graphics and compute market by aggressively pushing a feature only they have for now, and possibly for the foreseeable future. Absent evidence to the contrary, though, it doesn't seem that the RT and tensor units really provide a tangible benefit at present. While I get that the Turing dies are going to be far more expensive than Pascal's due to all the extra compute units, I don't think expecting consumers to foot the bill for Nvidia shifting the graphics paradigm will work well unless they can show an immediate benefit.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Talking about this with the wife, and I basically think it's NV's attempt to usurp AMD's foothold in consoles. Consoles this gen were very underpowered, which led to PC ports with relatively lax PC requirements. Something that's bad for any company trying to sell you overpriced toasters.
Something that is good for consumers. I couldn't care less about what's bad for Nvidia, nor should any other consumer.
Nvidia can't usurp AMD's foothold in consoles by catering to less than 1% of the market. The grand majority of people, 90%+, are effectively playing on console-tier PC hardware. The market spoke some time ago: it has no stomach for stupidly priced gaming hardware. Have people forgotten what is happening to Intel? People fled the PC for gaming some time ago and went to consoles. You can't usurp the financial barrier people have set, and you most certainly can't with an overpriced gimmick all of the upcoming gamers are making memes about. You can waste a complete generation of cards arrogantly thinking you can. No one has the stomach for these silly prices; the hardware surveys speak for themselves. Nvidia does maintain a dominant share of PC gaming hardware. I guess that's the problem for a publicly traded company that is expected to do the impossible (grow forever): they can't. They can't grow their margins because no one wants to buy cards more expensive than the 1070. Game developers who want to eat also won't cater to people past this point, which is why the major titles are playable on a $150 APU that is essentially a console. Nvidia can't force or nudge the market into higher brackets. It can piss off its current consumers and lose market share, though, just like Intel, and that's exactly what's going to happen. You can taste the anger in the air on every forum on the internet with Nvidia.

If NV can get MSFT to work up ray tracing, and can sell it to enough devs through their dev-rels (they have a better chance of RT catching on than Rapid Packed Math or that word that shall not be mentioned (primitive shaders)), they create an opening down the road to either A) bolster PC requirements for games (even consoles), leading to being able to sell more expensive toasters, or B) shake AMD's grip on the consoles.
You don't understand how ray tracing works.
You can download DX12 and DXR and run it on Pascal or any other GPU.
It's just an algorithm. All of the algorithms are a series of hacks to mimic light, so they can generically run on anything. Apple did Jensen's box lighting demo on an iPad. You don't need a powerful chip to do ray tracing; Imagination Technologies was one of the first to build hardware-accelerated ray tracing years ago on a mobile processor. I'm interested to see how much Nvidia's architecture is a copy/pasta of it. I'll see in 2 days.

What Nvidia just released isn't true ray tracing. It's yet another series of gimmicks and hacks that try to emulate ray tracing, which is why it's called hybrid ray tracing and why they packed the SMs with tensor cores (to polish over the grainy results). You're dreaming if you think this will impact consoles or compel consumers into higher-margin consumption. No one is willing to spend the money. The popular games run at 60 fps and even 120 fps on poverty-tier hardware. A game developer would be committing suicide trying to force the performance bar up.

I mean, practically everyone and their mother here confirmed Switch was going to be an AMD-based SoC. It wasn't, and then the Switch was dubbed "the worst console ever", yet it's a runaway success (basically, most of the pro-AMD guys here are too blind to listen when even the president of Nintendo said "we're going ARM", but what does the president know, ATF posters are experts).
Some people actually work in the industry and at these companies. You can tell the difference if you do too.
Switch was a success because it was fun. People don't actually care about stupid crap like 4K/120 fps/umbra, penumbra, and antumbra. They care about a game being fun and they care to play with tons of others. Take a look at Fortnite.
The point of a game is not to reflect reality; it's to be something beyond it. No one cares about these 1%'er features companies keep trying to shove down people's throats so they can get rich. Consumers of a past age aren't the ones of today. People scrutinize the hell out of products and know a great deal about them. 2008 changed everything. People are by and large quite conscious of pricing; tons of people use deal sites. Only if you want to squander your money do you go around clamoring for the latest and greatest and buy into ridiculous marketing.

I don't mean NV is going to be in the next-gen consoles, but I do mean NV can shake things up enough that they might be considered for the gen after, or at least shake the PC/API landscape enough that AMD's limited resources are put under more strain as they now have to try to catch up on that front.

NV is playing 4D chess here, and RT can fizzle or become the new standard. Interested to see where it goes.
Nvidia already owns something like 70% of PC gaming sales.
They're not going to shake a single thing but their lead off.
When you're on top like this, there's literally nowhere to go but down. It's nature. Nothing goes up forever.
Everyone has limited resources. People are unironically broke. Trade wars are brewing because countries are broke. States are broke. All of what you see was fueled by a massive papering-over of 2008's financial crisis. NV is playing like their head is in la-la land, trying to maintain the silly conditions they got used to during the crypto boom. The earth is warming, or so everyone who claims they're educated says, but we let a parasite known as crypto mining exist and persist for years. Tons of PC hardware squandered, energy and heat exhausted into the earth. Now we have multiple hurricanes swirling the earth, inflicting tens of billions of dollars of damage. No one could find GPUs to run their computers because people wanted to play a Ponzi-scheme fantasy. No regulators stopped them. Nvidia cheered it on, and everyone in the chain savagely pillaged consumers. You think people forgot about this? You think people are truly excited to buy these $600/$800/$1200 cards after that? No. I wouldn't be surprised if retro gaming catches on even more, along with console purchases. This is exactly how the previous dead period in the PC market began: companies went off the deep end and pissed people off.

Nvidia's going to lose, not gain, market share. It's what happens throughout history, especially in tech.
Unfortunate and due completely to greed.
 
Last edited:
  • Like
Reactions: psolord

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Making the TU106 so close to the TU104 is actually a return to the old days. It has 75% of the shaders. Lately, chips have been spread out a lot more, at 66% or 50% of the next chip up.

Barts (Radeon 6870) was 73% of Cayman (Radeon 6970).

GF114 (GTX 560 Ti) was 75% of GF110 (GTX 580).

Nvidia has increasingly been cutting their x70 cards more: 670 is 87%, 970 is 81%, 1070 is 75% of full chips.

The 1070 taught them a lesson, and we saw later that they made the 1070 Ti to get more margin for the exact same manufacturing cost (both GP104 with 8GB of GDDR5 at 8 Gbps). Perhaps the final lesson is to just not cut to 75% again, as it's not worth it and it's more economical to make another full chip.
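For anyone who wants to check the math, here are the shader counts behind those percentages (a quick sketch; core counts pulled from the public spec sheets, and the 670 works out to 87.5%):

```python
# Shader/CUDA core counts: cut-down part vs. the full chip above it.
ratios = {
    "HD 6870 (Barts) / HD 6970 (Cayman)":   (1120, 1536),
    "GTX 560 Ti (GF114) / GTX 580 (GF110)": (384, 512),
    "GTX 670 / full GK104 (GTX 680)":       (1344, 1536),
    "GTX 970 / full GM204 (GTX 980)":       (1664, 2048),
    "GTX 1070 / full GP104 (GTX 1080)":     (1920, 2560),
    "RTX 2070 (TU106) / full TU104":        (2304, 3072),
}

for name, (cut, full) in ratios.items():
    print(f"{name}: {cut / full:.1%}")
```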
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
If Nvidia wants RT to succeed sooner rather than later, they need to go all in on it. Not half-assed like PhysX; they need a significant number of AAA titles to support it. They will have to look past short-term shareholders and take a temporary hit on their margins.

They are clearly "all in" on Raytracing. It just doesn't mean what you want it to mean.

They are devoting something like an extra 40% of die area to RT support. That is a massive "RT tax" on these new dies. They are working with multiple developers to get game support for RT, and they worked with Microsoft on getting RT into DirectX as a standard API.

But, given how large the die impact is, it can't reach the masses in generation 1.

The 2060 will most likely NOT have any RT capability. RT will likely only reach the x60 tier in generation 2 (RTX 3060), and it might be generation 3 (RTX 4050) before it hits the x50 level.

So it will be 2 or 3 generations before Raytracing is a realistic option for most of the gaming market.

But it has to start somewhere, and that is with expensive early adopter cards.

So make no mistake, Nvidia is "all in" on Raytracing; it just realistically can't roll out to most of the market in generation 1. It will take 2 or 3 generations before that can happen.
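To put rough numbers on that die impact, here is a quick comparison of the published die sizes (a sketch only; not all of the growth is RT and tensor hardware, since Turing's SMs also gained other changes like concurrent INT pipes and larger caches):

```python
# Published die sizes (mm^2): Pascal vs. Turing in the same stack slot.
die_mm2 = [
    ("GP102", 471, "TU102", 754),
    ("GP104", 314, "TU104", 545),
    ("GP106", 200, "TU106", 445),
]

for old, old_area, new, new_area in die_mm2:
    print(f"{old} {old_area} -> {new} {new_area} mm^2: +{new_area / old_area - 1:.0%}")
```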
 

MrTeal

Diamond Member
Dec 7, 2003
3,911
2,677
136
Making the TU106 so close to the TU104 is actually a return to the old days. It has 75% of the shaders. Lately, chips have been spread out a lot more, at 66% or 50% of the next chip up.

Barts (Radeon 6870) was 73% of Cayman (Radeon 6970).

GF114 (GTX 560 Ti) was 75% of GF110 (GTX 580).

Nvidia has increasingly been cutting their x70 cards more: 670 is 87%, 970 is 81%, 1070 is 75% of full chips.

The 1070 taught them a lesson, and we saw later that they made the 1070 Ti to get more margin for the exact same manufacturing cost (both GP104 with 8GB of GDDR5 at 8 Gbps). Perhaps the final lesson is to just not cut to 75% again, as it's not worth it and it's more economical to make another full chip.
True, but the GP106 1060 launched at $300 vs. $700 for the GP104 1080. Barts launched at 65% of the price of a Cayman.

I just don't agree with Hitman's logic in comparing pricing increases from the 1060 to the 2070 based on the internal model number. It really doesn't matter how many dies are above or below a product in the stack; it matters how they are positioned in performance and price relative to cards in their own generation and the previous one.

If you had the following chips from two different generations:
A106 - 50% perf - $200
A104 - 100% perf - $400
A102 - 150% perf - $600

B106 - 150% perf - $300
B104 - 200% perf - $400
B102 - 300% perf - $600

All the chips within a generation would have the same perf/$, and all the chips in the new generation would have twice the perf/$ that the old ones did. That doesn't mean the B106-based card had a 50% price increase; it's just a new segment. There might end up being a $200 card, or that might be a lineup hole, but that's a different issue.
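Running those hypothetical numbers through a quick check bears this out: every A chip lands at 0.25 perf/$ and every B chip at 0.50 perf/$:

```python
# Hypothetical lineup from above: chip -> (relative perf, price in $).
lineup = {
    "A106": (50, 200),  "A104": (100, 400), "A102": (150, 600),
    "B106": (150, 300), "B104": (200, 400), "B102": (300, 600),
}

for chip, (perf, price) in lineup.items():
    print(f"{chip}: {perf / price:.2f} perf per $")
```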
 
  • Like
Reactions: SMU_Pony

Head1985

Golden Member
Jul 8, 2014
1,867
699
136
Making the TU106 so close to the TU104 is actually a return to the old days. It has 75% of the shaders. Lately, chips have been spread out a lot more, at 66% or 50% of the next chip up.

Barts (Radeon 6870) was 73% of Cayman (Radeon 6970).

GF114 (GTX 560 Ti) was 75% of GF110 (GTX 580).

Nvidia has increasingly been cutting their x70 cards more: 670 is 87%, 970 is 81%, 1070 is 75% of full chips.

The 1070 taught them a lesson, and we saw later that they made the 1070 Ti to get more margin for the exact same manufacturing cost (both GP104 with 8GB of GDDR5 at 8 Gbps). Perhaps the final lesson is to just not cut to 75% again, as it's not worth it and it's more economical to make another full chip.
Well, 2070 vs. 2080 still has the same gap as 1070 vs. 1080 (the 2070 isn't a cutdown this time, but the spacing is the same). The only difference is that the 2080 is now a cut-down die, so it has only 27% more SPs instead of 33%. Remember, the full TU104 has 3072 SPs, and 3072/2304 = 33% more SPs, just like 1070 vs. 1080.

The gap will be smaller this time because the 2080 is cut down and the 2070 also uses the same memory. But TU104 now has 6 GPCs and the 2070 only 3, so that will widen the gap a little.
The 1080 was 25-30% faster than the 1070 at 1440p.
The 2080 will be 17-20% faster than the 2070 this time.
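For reference, the SP counts behind those percentages (from the announced specs; the cut 2080 works out to +27.8% over the 2070, which rounds to the 27% above):

```python
# Announced SP counts: GTX 1070 = 1920, GTX 1080 (full GP104) = 2560,
# RTX 2070 (full TU106) = 2304, RTX 2080 (cut TU104) = 2944, full TU104 = 3072.
pairs = [
    ("GTX 1080 over GTX 1070",   2560, 1920),
    ("RTX 2080 over RTX 2070",   2944, 2304),
    ("Full TU104 over RTX 2070", 3072, 2304),
]

for name, big, small in pairs:
    print(f"{name}: +{big / small - 1:.1%} SPs")
```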
 
Last edited:
  • Like
Reactions: Hitman928

Hitman928

Diamond Member
Apr 15, 2012
6,639
12,224
136
True, but the GP106 1060 launched at $300 vs. $700 for the GP104 1080. Barts launched at 65% of the price of a Cayman.

I just don't agree with Hitman's logic in comparing pricing increases from the 1060 to the 2070 based on the internal model number. It really doesn't matter how many dies are above or below a product in the stack; it matters how they are positioned in performance and price relative to cards in their own generation and the previous one.

If you had the following chips from two different generations:
A106 - 50% perf - $200
A104 - 100% perf - $400
A102 - 150% perf - $600

B106 - 150% perf - $300
B104 - 200% perf - $400
B102 - 300% perf - $600

All the chips within a generation would have the same perf/$, and all the chips in the new generation would have twice the perf/$ that the old ones did. That doesn't mean the B106-based card had a 50% price increase; it's just a new segment. There might end up being a $200 card, or that might be a lineup hole, but that's a different issue.

If the 2080 Ti were not cut more than a normal x80 Ti tier, and if the 2080 were a full die instead of a cut die, I would 100% agree with you, but from what we know, this isn't the case. If it turns out otherwise, then, as I've said all along, I'll happily change my analysis when new information comes out.

From what we know, the 2080 is actually a cut-down TU104 die, and the 2080 Ti is cut further than normal. Add to that that the boost clocks appear lower relative to the 1060/1080, and the comparison becomes a little more strained.

Even with that, it should provide better value relative to the full-sized chip than the 1060 did, which is good, but that doesn't change the product stack, which was my only point to begin with. I gave no value assessment or performance comparison, just a note that the product stack had changed, which it has. Then I got jumped on for pointing out this simple fact. All my posts, as I said from the beginning, are also written in the context of history, not a single generational transition.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Something that is good for consumers. I couldn't care less about what's bad for Nvidia, nor should any other consumer.
Nvidia can't usurp AMD's foothold in consoles by catering to less than 1% of the market. The grand majority of people, 90%+, are effectively playing on console-tier PC hardware. The market spoke some time ago: it has no stomach for stupidly priced gaming hardware. Have people forgotten what is happening to Intel? People fled the PC for gaming some time ago and went to consoles. You can't usurp the financial barrier people have set, and you most certainly can't with an overpriced gimmick all of the upcoming gamers are making memes about. You can waste a complete generation of cards arrogantly thinking you can. No one has the stomach for these silly prices; the hardware surveys speak for themselves. Nvidia does maintain a dominant share of PC gaming hardware. I guess that's the problem for a publicly traded company that is expected to do the impossible (grow forever): they can't. They can't grow their margins because no one wants to buy cards more expensive than the 1070. Game developers who want to eat also won't cater to people past this point, which is why the major titles are playable on a $150 APU that is essentially a console. Nvidia can't force or nudge the market into higher brackets. It can piss off its current consumers and lose market share, though, just like Intel, and that's exactly what's going to happen. You can taste the anger in the air on every forum on the internet with Nvidia.

Who knows. MSFT is getting in bed with NV (again). People in the industry also said Nintendo would never use an NV product.

I remember having this same discussion a few years back. PC gaming was dead. Then the next-gen consoles came out, were lackluster, and suddenly PC gaming got a resurgence.

With AMD's grip on consoles, unless either Sony or MSFT pushes for better/stronger consoles (and as you're saying here, they most likely won't, for cost and other reasons), NV has to create a need for their products elsewhere. They did it with GameWorks (more marketing than actual product, at least from my position), and I see them trying to do something similar with RT. If devs use it, and NV is tied to it, marketing!


You don't understand how ray tracing works.
You can download DX12 and DXR and run it on Pascal or any other GPU.
It's just an algorithm. All of the algorithms are a series of hacks to mimic light, so they can generically run on anything. Apple did Jensen's box lighting demo on an iPad. You don't need a powerful chip to do ray tracing; Imagination Technologies was one of the first to build hardware-accelerated ray tracing years ago on a mobile processor. I'm interested to see how much Nvidia's architecture is a copy/pasta of it. I'll see in 2 days.

What Nvidia just released isn't true ray tracing. It's yet another series of gimmicks and hacks that try to emulate ray tracing, which is why it's called hybrid ray tracing and why they packed the SMs with tensor cores (to polish over the grainy results). You're dreaming if you think this will impact consoles or compel consumers into higher-margin consumption. No one is willing to spend the money. The popular games run at 60 fps and even 120 fps on poverty-tier hardware. A game developer would be committing suicide trying to force the performance bar up.

No idea what this has to do with anything I said, but have at it!


Some people actually work in the industry and at these companies. You can tell the difference if you do too.
Switch was a success because it was fun. People don't actually care about stupid crap like 4K/120 fps/umbra, penumbra, and antumbra. They care about a game being fun and they care to play with tons of others. Take a look at Fortnite.
The point of a game is not to reflect reality; it's to be something beyond it. No one cares about these 1%'er features companies keep trying to shove down people's throats so they can get rich. Consumers of a past age aren't the ones of today. People scrutinize the hell out of products and know a great deal about them. 2008 changed everything. People are by and large quite conscious of pricing; tons of people use deal sites. Only if you want to squander your money do you go around clamoring for the latest and greatest and buy into ridiculous marketing.

Again, no idea what this has to do with anything I said. 1% of people!? You mean a new technology starts somewhere, slowly, and either gets adopted and improved elsewhere, or it dies. You appear to be in the camp that it's dead before it launched. Kudos, my crystal ball isn't as good as yours.

EDIT: Actually, I sort of want to add to this after thinking a bit more. The next Samsung Note is going to cost $1200.

These are boutique products specially made for that 1% of people. The peasants, and I say this jokingly, will get the A-series or whatever Samsung does on the lower end.

There are plenty of people to pay for these products. And perhaps down the road you will see hybrid RT (whatever flavor) in cheap products just like I can get a fingerprint sensor in a $200-300 cell phone.

My 2009 car didn't have a fancy touch screen, and it cost me more off the lot than the car I just got in 2018 with a rear camera and front detection sensors! I think they're doing it wrong.

EDIT #2: Actually, it is really interesting to see what is now standard for car packages. I remember the Cadillac I owned in 2005, ballpark, had features that my last car still didn't have as package options! I mean, I can understand why; the Cadillac was made for the 1% (I got it long after its value cratered), but the car I got in 2009 didn't offer some of the stuff the older car I drove did.

Isn't this how it's supposed to work? The 1% pay for it until it becomes standard or cheaper to make, and then we can all partake? Isn't this what being an early adopter of any new tech is?

It's not for everyone. Hell, it might not even be for the 1% either. Again, your crystal ball must be amazing.

Nvidia already owns something like 70% of PC gaming sales.
They're not going to shake a single thing but their lead off.

Possibly. I don't think so. But my tune will change with what Intel brings to the game. I've got little hope for AMD currently.

When you're on top like this, there's literally nowhere to go but down. It's nature. Nothing goes up forever.

Is everything extremes with you? Or is this just how you word it? I don't see NV "failing", but I do see their market share dropping. I wouldn't be ready to dig a grave.

Everyone has limited resources. People are unironically broke. Trade wars are brewing because countries are broke. States are broke. All of what you see was fueled by a massive papering-over of 2008's financial crisis. NV is playing like their head is in la-la land, trying to maintain the silly conditions they got used to during the crypto boom. The earth is warming, or so everyone who claims they're educated says, but we let a parasite known as crypto mining exist and persist for years. Tons of PC hardware squandered, energy and heat exhausted into the earth. Now we have multiple hurricanes swirling the earth, inflicting tens of billions of dollars of damage. No one could find GPUs to run their computers because people wanted to play a Ponzi-scheme fantasy. No regulators stopped them. Nvidia cheered it on, and everyone in the chain savagely pillaged consumers. You think people forgot about this? You think people are truly excited to buy these $600/$800/$1200 cards after that? No. I wouldn't be surprised if retro gaming catches on even more, along with console purchases. This is exactly how the previous dead period in the PC market began: companies went off the deep end and pissed people off.

What!?

Nvidia's going to lose, not gain, market share. It's what happens throughout history, especially in tech.
Unfortunate and due completely to greed.

I agree with you. I have no idea where you think I was coming from. I was adding my opinion to Athiest's post. NV is risking a lot; if it pays off, kudos. If it doesn't, well, kudos for AMD/Intel!? Kudos for us if we get more competition? Kudos for everyone!
 
Last edited:
  • Like
Reactions: ZeroRift

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Maybe you should consider what fuel you're adding to the fire and instead contribute to a more meaningful discussion.

Ditto ;)

Consoles are underpowered w.r.t. the CPU, not so much the GPU, especially the Xbox One X or PS4 Pro, whose GPUs are basically equivalent to a GTX 1060 3/6GB. The next-gen console design will in all likelihood use Zen 2 along with whatever AMD's next GPU is going to be. Nvidia got to supply the SoC for the Switch simply because AMD doesn't make mobile SoCs.

Yeah, but they're still underpowered regardless.

Well, when you say "mobile", do you mean phone/tablet, or laptop? Let me rephrase what I said about the SoC, since I was lazy and assumed people would understand I meant going with an AMD CPU+GPU, whether on one die or not. Nintendo said they wanted to focus on a single platform, marrying their Wii and DS lines, and staying with ARM. Yet a good portion of posters here laughed when I said I was pretty sure they were going with an ARM CPU; what that meant for the GPU was up in the air. But that's beside the point and me just grumbling about how that thread went. It went from "AMD has the Nintendo console under wraps" to "Switch is DOA", and the little console/handheld that could is a runaway success.

If Nvidia wants RT to succeed much sooner than later, they need to go all in on it. Not half-assed physx, but they need a significant number of AAA titles to support it. They will have to look past short term shareholders and take a temporary hit on their margins.

Agreed, and because of their partners, I sort of see them doing this. They got MSFT to tie their version into DX12. NV has to grease the wheels.

Already, based on Turing pricing, it doesn't look like any of the above is going to happen. Turing adoption will be slower and lower than Pascal was over Maxwell, less than Maxwell was over Kepler, etc. It's hard to push features for hardware that is priced out of 80% of the potential market.

I sort of predict NV is going to cater to two user fronts with this. Based on other stuff I've read, if Pascal is going to continue to co-exist, it can be the more "gamer-tuned" series, still GTX, and with RTX NV creates a new tier/by-product for the 1%'ers (or whatever you want to call them). Not everyone needs front sensors on their car or a fingerprint scanner on their phone. Options, isn't that what we all want? I mean, if the GTX 1080 Ti gets rebranded to GTX 2080 or something stupid, costs $500, and sits next to a $700 RTX 2080, consumers have options. Do I want the R (ray tracing), or just go for the lean gamer card?

But I dunno, the future is always cloudy; I just don't get why a lot of people are already writing it off. They haven't even launched, we don't have concrete reviews/scores, and people are already claiming it's X/Y/Z.

Turing is likely going to end up too forward-looking, too expensive, and too underpowered for most of its newest features.

Most likely. But if it does just well enough that some people like it, it can create a market segment or even start a trend! Who knows.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
Yeah, but they're still underpowered regardless.

Well, when you say "mobile", do you mean phone/tablet, or laptop? Let me rephrase what I said about the SoC, since I was lazy and assumed people would understand I meant going with an AMD CPU+GPU, whether on one die or not. Nintendo said they wanted to focus on a single platform, marrying their Wii and DS lines, and staying with ARM. Yet a good portion of posters here laughed when I said I was pretty sure they were going with an ARM CPU; what that meant for the GPU was up in the air. But that's beside the point and me just grumbling about how that thread went. It went from "AMD has the Nintendo console under wraps" to "Switch is DOA", and the little console/handheld that could is a runaway success.
They're not underpowered when you look at it from the point of view of which GPUs are being bought the most for PC gaming. The most popular GPU according to Steam is the GTX 1060, so console GPUs now achieve parity with the current-gen mid-tier GPU of choice. This is a pretty significant step up from the previous-gen consoles.

AMD doesn't make 5W SoCs that are worth using, and the success of the Switch has more to do with it being a Nintendo product than with it having a Tegra. Tegra has been a flop pretty much everywhere except the Switch. Mobile GPU designs keep improving all the time, so it would not be surprising if the successor to the Switch ditches Tegra in favour of something different.
 
  • Like
Reactions: psolord and ub4ty

ub4ty

Senior member
Jun 21, 2017
749
898
96
I agree with you.
I have no idea where you think I was coming from.
That about sums up your posts. GPUs are not the Cadillacs of their day. They are mature products, like CPUs.
There's no premium cost to slapping a tiny RT pipeline into a proven architecture.
I've worked on products in tech far more expensive than a $1,200 GPU, more like a starting price of $80,000, $320k fully loaded. When a product becomes mature like this but is still a cash cow, companies restructure teams to find out what new goofy features they can slap on to keep hitting those margins and beyond. This doesn't cost a lot of money to develop. People pretend like it does when they market it, but internally it doesn't.

Companies don't charge $1,000 for phones because they are majestic slabs of diamond carved from a far-off asteroid. They charge $1,000 for them because people are dumb enough to pay that. When a company reports record earnings and highlights record margins, that is why it costs a lot. My crystal ball isn't a crystal ball: I work in this industry and I know how the game is played, as does anyone with any sense.

I'm an avid gamer and technologist on a number of forums where the sentiment is in line with my "extreme" views, not the viewpoint of: let's just suck it up and pay these ridiculous prices because it's a Cadillac. When you behave like this, companies skewer you, as they should. The 1% is the 1%. Games aren't made to cater to the 1%. A Honda Civic has all of the bells and whistles a luxury car now has; luxury car manufacturers now resort to gimmicks to keep justifying their margins. This is what occurs with a mature product. If you want to go back 10-20 years, when these features were new, it would be a different story. GPUs have been around forever. Nvidia is creating gimmicks to try to keep its ridiculous margins going. People will pay regardless; it's what keeps people in tech with a job. When I hear of someone who has no savings in the bank and makes poverty wages but is running around with a shiny new $1,000 iPhone, I come to understand who the faux 1% is who keeps products moving off a shelf, and I grasp why so many products have reached ridiculous price levels... The smart consumers are gone from the equation.

Do as you please.
Just providing my two cents and facts about where the money is made [in the middle market at volume].

DX12 - Ray tracing runs on Pascal today.
Making one-line code changes here and there made the crude fallback code run twice as fast.

I guess outsiders are unaware of the kinds of antics that are pulled so that corps can sell them on shiny new features. Your wallet, not mine. But don't claim this is a costly luxury feature with justification for the price. GPUs didn't fall out of a fab last week; they've been around forever. Mature products stall on premiums because they hit the obvious walls these GPUs are hitting and have only gimmicks left, like: muh penumbra.
 
  • Like
Reactions: psolord

railven

Diamond Member
Mar 25, 2010
6,604
561
126
They're not underpowered when you look at it from the point of view of which GPUs are being bought the most for PC gaming.

At this point you're arguing semantics. I'm aware of the hardware in the consoles, but they are underpowered by your own admission - by the CPU. We can argue until the cows come home about the hardware, but if people don't think Gen 8 saw a huge shift toward including PC users, they aren't looking closely enough. The shift is so strong, I barely use my consoles. The laundry list of "console exclusives" has pretty much dried up.

AMD doesn't make 5W SoCs that are worth using, and the success of the Switch has more to do with it being a Nintendo product than with it having a Tegra. Tegra has been a flop pretty much everywhere except the Switch. Mobile GPU designs keep improving all the time, so it would not be surprising if the successor to the Switch ditches Tegra in favour of something different.

I'm aware of Nintendo's success stories (Pokemon!). My comment was more about how people who prefaced their opinions with "I'm in the industry" fell flat on their faces. I'm still waiting for that miracle sauce GLOW spoke about for almost 2 years.

TL;DR: Why are we jumping to conclusions before we even get the product? I'm not even confident RT will take off, but coming around here you get the impression people aren't even interested in new tech, more "where is my perf/$" or whatever metric rules their roost.


That about sums up your posts. GPUs are not the Cadillacs of their day.

My analogy was more about status/icon/vanity/gold_giraffe.

They charge $1,000 for them because people are dumb enough to pay that.

Exactly.

Do as you please.

Isn't that what we all do?
 

Brahmzy

Senior member
Jul 27, 2004
584
28
91
Then don't buy it if the price sucks...

> Atari
> Sega Genesis
> DinoPark Tycoon
> Doom / Quake / Duke Nukem (Dos Mode reboot)
> SC 1 / CS / CS source
> Fortnite / Dota / etc etc

Yeah, I never paid for the most high-end GPU. You don't need to. The majority don't. You don't need 60 fps.
You most definitely don't need 4K. The majority of the major game titles people play can be played on a 1060 or lower. I play at 2560x1440 on a $140 2GB GPU. It's a champ.

The minority (between 1-2%) own a 1080 Ti, and no sane game developer makes a game for such an exclusive group. They'd go bankrupt if they did.

Resolution?
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
1024 x 768 - 0.65%
1280 x 720 - 0.42%
1280 x 800 - 0.85%
1280 x 1024 - 2.19%
1360 x 768 - 1.97%
1366 x 768 - 14.18%
1440 x 900 - 3.62%
1536 x 864 - 0.31%
1600 x 900 - 3.68%
1680 x 1050 - 2.63%
1920 x 1080 - 60.66%
~91% of people are at OR below 1920x1080
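Summing the shares above bears out that figure:

```python
# Steam survey shares (percent) for resolutions at or below 1920x1080, from the list above.
shares = [0.65, 0.42, 0.85, 2.19, 1.97, 14.18, 3.62, 0.31, 3.68, 2.63, 60.66]
print(f"Total: {sum(shares):.2f}%")  # -> 91.16%
```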

No one's gaming at 4K. Again, less than 2% of people fall into such use cases.
I've been gaming for a long time, and what I long for are fun games that I enjoy, not umbra, penumbra, and antumbra, or reflections I never pay attention to when I'm trying to get the best K/D ratio. The 1-2% will never determine the broader market. Nvidia has made a huge mistake by setting a precedent whereby they establish such prices. This is why people are mad. Hardware-accelerated ray tracing was not first done by Nvidia; Imagination Technologies did it years ago within a mobile-phone-processor power envelope. I guess the 1-2% will be buying these cards, and I don't mean that in terms of wealth. Working in tech, where people make gobs of money producing this technology, I see some of the most dated cellphones and personal computer hardware. Meanwhile, I see people who can least afford it with $1,000 iPhones, likely on installment payments. It's not about that, though. Do as you please with your money. People are upset about Nvidia sneakily trying to set a precedent of far higher prices for gaming hardware. There are consequences to this. If you claim you've been gaming forever, you'll recall the dead period of the PC and PC gaming in which people flocked to consoles. Every game developer remembers it, and it's the reason the most popular games don't need such hardware to run. You can run Fortnite at 120 fps at 1920x1080 on a $150 2400G APU from AMD. Guess what is all the rage?

Everyone should be completely bashing Nvidia for this move. Thankfully, only a small minority (1-2%) are screaming: take my money.
Got ya beat.
I guess you can call me the 2%. No 1440 or 1080 for me. Once you go big 4K, nothing else compares. And yes I need 4K60.
 
  • Like
Reactions: wilds

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,729
136
At this point you're arguing semantics. I'm aware of the hardware in the consoles, but they are underpowered by your own admission - by the CPU. We can argue until the cows come home about the hardware, but if people don't think Gen 8 saw a huge shift toward including PC users, they aren't looking closely enough. The shift is so strong, I barely use my consoles. The laundry list of "console exclusives" has pretty much dried up.
Quite the opposite, in fact. More people have been buying consoles this generation than during any previous generation. The PS4 is already close to the number of units the PS3 sold in its lifetime. The PC market has been prohibitively expensive for mainstream gaming for the past 1.5 years, and consoles have been selling extremely well in the meantime. You finding little use for your own console doesn't indicate a shift towards PC gaming at large at the expense of console gaming.

As far as being underpowered is concerned, consoles are powerful enough to be comparable at the mainstream level, which matters more for increasing the audience for gaming than having an RTX card or even a 1080 Ti.
TL;DR: Why are we jumping to conclusions before we even get the product? I'm not even confident RT will take off, but coming around here you get the impression people aren't even interested in new tech, more "where is my perf/$" or whatever metric rules their roost.
People aren't interested in RTX firstly due to price, and secondly because, while full raytracing promises photorealism, here we get tons of reflections and some questionable GI. Honestly, photorealism mods for Crysis were more exciting 10 years ago than what we're getting today with the new Battlefield.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Quite the opposite, in fact. More people have been buying consoles this generation than during any previous generation. The PS4 is already close to the number of units the PS3 sold in its lifetime.

You might want to check MSFT and Nintendo... well, Nintendo not so much, since the Switch has already surpassed the Wii U. MSFT's lost market share is barely offset by Sony's growth. I doubt this generation will outsell the previous; with the "extension" consoles it'll be kind of hard to tally up in the end, but that's splitting hairs.

The PC market has been prohibitively expensive for mainstream gaming for the past 1.5 years, and consoles have been selling extremely well in the meantime. You finding little use for your own console doesn't indicate a shift towards PC gaming at large at the expense of console gaming.

Thankfully for PC gaming, you don't have to buy hardware every generation. That's only for us PC "enthusiasts" who upgrade regularly. As in conversations I've had before, I wouldn't tie PC hardware sales to PC software sales growth; I'd look at publisher activity. PC is now getting ports of games that were handheld/console exclusives. This is a new trend, and you are seeing PC+PS4 or PC+Xbox-only titles, whereas in generations before they were PS3 or 360 exclusives, with PC either never seeing them or getting a port years later.

As far as being underpowered is concerned, consoles are powerful enough to be comparable at the mainstream level, which matters more for increasing the audience for gaming than having an RTX card or even a 1080 Ti.

Agreed, but NV can't make money off "good enough", just like AMD failed to capture more growth with "good enough". That's why I think NV is attempting to rock the boat. If they can sell their tech, it can lead to a new revenue pocket (RTX > GTX, even if "identical").

People aren't interested in RTX firstly due to price, and secondly because, while full raytracing promises photorealism, here we get tons of reflections and some questionable GI. Honestly, photorealism mods for Crysis were more exciting 10 years ago than what we're getting today with the new Battlefield.

I get that, but being not interested != dismissing something. I'm not interested in a lot of things. I mean, why buy a Samsung phone when a Blu will mostly cover you at a fraction of the price? But /shrug, gold giraffes.

I find myself as interested as I was with Rapid Packed Math, tessellation (during the ATI days), and whatever else. New tech is good; it will only lead to better tech! (Hopefully.)
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,681
7,289
136
I barely use my consoles. The laundry list of "console exclusives" has pretty much dried up.

I have barely used my PC in 2017 and 2018 for gaming, basically just NieR: Automata and Dark Souls 1 in that period, mainly because the console exclusives have been amazing this gen: Persona 5, Horizon Zero Dawn, Bloodborne, the Yakuza series, God of War, Zelda BOTW, etc., and they have been single-player games. Meanwhile, our only Elder Scrolls this gen was a crap MMO, and Bethesda ruined Fallout too by making it into what looks like another bland survival game. The single-player console exclusives have been a godsend in this age of microtransactions and focus on MP in multiplatform games.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I have barely used my PC in 2017 and 2018 for gaming, basically just NieR: Automata and Dark Souls 1 in that period, mainly because the console exclusives have been amazing this gen: Persona 5, Horizon Zero Dawn, Bloodborne, the Yakuza series, God of War, Zelda BOTW, etc., and they have been single-player games. Meanwhile, our only Elder Scrolls this gen was a crap MMO, and Bethesda ruined Fallout too by making it into what looks like another bland survival game. The single-player console exclusives have been a godsend in this age of microtransactions and focus on MP in multiplatform games.

Completely understandable, but just getting Yakuza on PC is in itself a sign of the changing times. PC barely got any attention from big or small foreign developers; now it's getting once console-only franchises. With Sega owning Atlus, I wouldn't be surprised if we started to see more of their catalog getting PC ports (I'd welcome it).

I didn't say consoles don't have their exclusives; they always will (Sony/Nintendo titles, for example). But one can't deny the growth in software sales, to the point that niche developers are bringing their games to PC the same day as the console versions. It's a win/win for everyone!
 

beginner99

Diamond Member
Jun 2, 2009
5,315
1,759
136
If you think these ridiculously priced GeForce 20 series cards are being sold in any notable volume, you're asleep. $1200/$800/$600: all of these are priced at a point where most people won't purchase them. Pascal was a godsend when it was delivered and a no-brainer purchase at various levels. GeForce 20 is a joke by comparison.

No, I don't think that, but these high prices serve to justify keeping mid-range cards overpriced, and in the end we haven't had much increase in performance/$ since about 2014. An RX 580 or GTX 1060 isn't much faster than a 290(X), which was available in 2014 for $250.
And the lower power use isn't all that relevant to most users.

You must be playing a walking sim to not get eye strain from being that close to a screen of that size. Drop a few post-processing settings like depth of field, and turn down shadows and draw (object/LOD) distance from ultra to high, and you can have a locked 60 FPS on a 1080 Ti in the majority of AAA games.

This is like arguing with audiophiles and their $1000 gold-plated cables.
 

Timmah!

Golden Member
Jul 24, 2010
1,565
914
136
@PeterScott

Regarding this entire x60/x70 debate, aside from the higher price, you need to take into account that people simply don't like buying less for the same (or even more) money than before. In other words, if the x70 cards historically (well, for the past 2 generations at least) used the xx104 chip, people don't like that it's suddenly an xx106 one. You keep saying those designations don't matter because it's just how Nvidia internally names its chips... but they do have meaning: one designates the second-largest chip in the stack, the other the third-largest. Since a bigger chip generally means more performance, and you once got the second-largest chip at the x70 price point, people are not happy that it's now the third-largest. When you say they still get more performance than the 1070, so what's the problem, they might counter that they would get even more if the chip were still the 104.

Obviously, since the 106 chip is 3/4 of the 104 this time around and not 1/2, the 2070 could technically be both a cut 104 and a full 106, so it's a non-issue. Since the "2070 = TU106" rumor preceded the actual info about the TU106's specifics, I guess people assumed 1/2 as before, hence the outrage. I think it's very understandable.
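The public core counts bear this out (a quick sketch, using the 1060 6GB for GP106):

```python
# CUDA core counts: the x06 chip as a fraction of the full x04, per generation.
gp106, gp104 = 1280, 2560   # GTX 1060 6GB vs. full GP104 (GTX 1080)
tu106, tu104 = 2304, 3072   # RTX 2070 (full TU106) vs. full TU104

print(f"Pascal: GP106/GP104 = {gp106 / gp104:.2f}")   # 0.50
print(f"Turing: TU106/TU104 = {tu106 / tu104:.2f}")   # 0.75
```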
 