Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.

Status
Not open for further replies.

Cableman

Member
Dec 6, 2017
78
73
91
I was thinking of upgrading my aging 24" 1080p60 monitor to the rumored 32" 4k144 that are supposed to arrive next year. That would require an upgrade of my 1070 to a x80ti. But it won't be at those prices and the 2080ti doesn't appear to have the necessary performance. I am definitely waiting for the 7nm GPUs and hoping for better pricing. The good thing is that I have a large backlog of games from the last 10 years so the 1070 should suffice for now.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Cableman said:
I was thinking of upgrading my aging 24" 1080p60 monitor to the rumored 32" 4k144 that are supposed to arrive next year. That would require an upgrade of my 1070 to a x80ti. But it won't be at those prices and the 2080ti doesn't appear to have the necessary performance. I am definitely waiting for the 7nm GPUs and hoping for better pricing. The good thing is that I have a large backlog of games from the last 10 years so the 1070 should suffice for now.
A 4K 144 Hz monitor will be close to $2k if not more, and you'd need two 2080 Tis anyway in that case. I have no idea if one 7nm card will do that.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
ub4ty said:
And this is where historic falls from grace begin: at the peak.


Until your shiny new proprietary approach fails or is rejected, and doing so with no regard for your customers results in you losing your business to new and existing competition that capitalizes on the sizable and exploitable value gap you have created.

Giant companies fail for a reason. Greed clouds otherwise sensible judgements at exactly the worst time. A company over-estimates its growth potential at its peak, gouges its customers too hard, and over-invests in a proprietary one-off approach... and then gets hit with reality.

Intel is a recent example that has yet to fully play out, with boatloads of older tech companies as earlier examples.

It's also true that very successful large companies see when it's time to compete again, and they do so. Your presumption is that a competitor will swoop in with something that will hurt them more than the money they are making. I think that is flawed.

Nvidia can milk their situation, invest in R&D in many different things, and when things get competitive again, compete.

Look here.

https://ycharts.com/companies/NVDA/r_and_d_expense

See how Nvidia has been increasing their R&D for a long while now?

Or this.

https://spectrum.ieee.org/view-from...-big-spender-for-semiconductor-rd-thats-intel

"The fastest growing R&D budget, the research firm said, is over at Intel’s Silicon Valley neighbor, Nvidia, whose nearly $1.8 billion R&D investment in 2017 topped its 2016 numbers by 23 percent."

Nvidia is taking many of their profits and doing R&D. They are not sitting still and are actively trying to expand the market, rather than maintain their lead in the markets they are currently winning.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
I was going to add that I wouldn't be surprised if Nvidia cut out AIBs altogether in their pursuit of more money. Are all of the moves Nvidia has been making going to lead them to be another 3dfx? I mean, they bought 3dfx's assets. They know how they got them. Could they possibly be that dumb?
Everything about the GeForce 20 series reflects that their aim is to do just that.
Geforce FE is packaged like a polished super-clocked high-end EVGA card.
They are also selling FEs through prominent retailers like Best Buy/Newegg and others. It was on the product page but I can no longer find it. I see no problem with this per se, as long as they resolve the problem of AIBs and retailers adding on even more extortionist pricing.

The biggest and ugliest trends I see are an increased tethering of the GPU's hardware to their cloud services via Geforce Experience and now DLSS. Raytracing cores are a meme and aren't doing the major portions of the calculation. Tensor cores are an AA/refinement meme. DLSS being such that it relies on their cloud super computer to pre-compute the runtime data is a meme.

The only thing about GeForce 20 that matters is what they did to make the SMs more performant than Pascal's, which they are being super tight-lipped about. So marketing hoped, over a month of carefully curated leaks and sentiment steering, that they'd be able to cleverly manipulate the consumer market into believing this was the second coming. This would have been possible if the pricing didn't cause people to second-guess what was occurring.

Instead, and as is always the case at a company's peak, they decided to price gouge which has now brought every single detail about the new cards into question, including past cards & features, and their overall future direction.

And what do you see? You see a litany of proprietary 2nd coming technologies of the past that failed to be adopted. You see a systematic attempt to make everything that wasn't nailed down a proprietary lock-in. You see blatant and clear attempts to secure margins/profit even when consumers buck the trend :
https://www.theregister.co.uk/2018/01/03/nvidia_server_gpus/
https://www.digitaltrends.com/computing/nvidia-bans-consumer-gpus-in-data-centers/

You see wild things like dedicated capture hardware residing in GeForce cards since Maxwell being disabled once Nvidia figured out it could milk it for its cloud push:
https://steamcommunity.com/app/353380/discussions/0/1318836262651159028/
https://forums.geforce.com/default/...treaming-broken-since-last-nvidia-driver-gfe/

https://developer.nvidia.com/capture-sdk
For Kepler/Maxwell/Pascal generations:
all NVIDIA Quadro 2000 class or higher are supported and all NVIDIA Tesla are supported

So, it's quite clear that Nvidia will go out of its way to try to preserve its margins, which is what the sad state of being a publicly traded company forces a company to do. In the past, Nvidia went so far as to disable access to a previously accessible piece of hardware (capture) and restrict it to its cloud. Nvidia also revised user agreements to prevent people from using GeForce cards in a professional capacity.

These kinds of shenanigans continue and become more emboldened until the consumer market takes a stand and rejects them. All companies claim they are pushing the envelope and want to open computing to all. However, time and time again with hardware companies in tech, you find them intentionally gimping features that you purchased behind a spaghetti mess of software/drivers/cloud computing memes.

So now that they've made it clear that consumer cards are being segmented into prior Quadro pricing, prior Quadro pricing has moved to HPC pricing, and HPC pricing has gone completely off the rails, it's clear this is all about business/margins and an attempt to squeeze a lemon in every market segment. ASIC-like functionality belongs in an MCM paradigm, completely broken out from the main die. Tensor cores and meme learning aren't an official standard, nor are ray tracing cores. Nvidia is going for the kill shot at its peak, and it may cost them dearly as new and existing competition goes for a far better and more sensible value proposition. Intel was cut off at the kneecaps by AMD in this way. I expect new entrants and possibly AMD to do the same in the GPU segment. Intel and others are coming online in a frontal attack against the main GPU market and the yet-to-be-determined accelerator markets like meme learning, which is nothing but a network of matrix math ops.

https://www.tomshardware.com/news/tpu-v2-google-machine-learning,35370.html
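To make the "network of matrix math ops" point concrete, here's a minimal sketch of a single dense neural-net layer as plain matrix math (generic CPU code for illustration only; the tensor cores and TPUs linked above just run this kind of multiply-accumulate at enormous scale):

Code:
// Minimal sketch: a fully connected neural-net layer is just y = relu(W*x + b).
// Tiny hypothetical sizes; real networks chain many such matrix ops back to back.
#include <vector>
#include <cstdio>
#include <algorithm>

// y[i] = relu( sum_j W[i][j] * x[j] + b[i] )
std::vector<float> dense_relu(const std::vector<std::vector<float>>& W,
                              const std::vector<float>& b,
                              const std::vector<float>& x) {
    std::vector<float> y(W.size(), 0.0f);
    for (size_t i = 0; i < W.size(); ++i) {
        float acc = b[i];
        for (size_t j = 0; j < x.size(); ++j)
            acc += W[i][j] * x[j];           // the matrix math op being accelerated
        y[i] = std::max(0.0f, acc);          // ReLU nonlinearity
    }
    return y;
}

int main() {
    // A 2x3 example layer.
    std::vector<std::vector<float>> W = {{0.5f, -1.0f, 2.0f},
                                         {1.5f,  0.0f, -0.5f}};
    std::vector<float> b = {0.1f, -0.2f};
    std::vector<float> x = {1.0f, 2.0f, 3.0f};
    for (float v : dense_relu(W, b, x)) std::printf("%f\n", v);
    return 0;
}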

Competition is good for progress. When it's lacking, consumers get some of the worst value, which is why I hold off purchasing until there is feverish competition and resulting progress/innovation/value. Big cloud providers have been trying to destroy the profit margins of traditional hardware providers for some time and have been quite successful. Good things lie ahead, but Nvidia is indeed going in for the kill.
 

TheF34RChannel

Senior member
May 18, 2017
786
310
136
Cableman said:
I was thinking of upgrading my aging 24" 1080p60 monitor to the rumored 32" 4k144 that are supposed to arrive next year. That would require an upgrade of my 1070 to a x80ti. But it won't be at those prices and the 2080ti doesn't appear to have the necessary performance. I am definitely waiting for the 7nm GPUs and hoping for better pricing. The good thing is that I have a large backlog of games from the last 10 years so the 1070 should suffice for now.

The 2080 Ti replaces the Titan Xp, with the price to match. 7nm won't be offered for cheaper; prices almost always go up. Sounds like a nice monitor!
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
realibrad said:
It's also true that very successful large companies see when it's time to compete again, and they do so. Your presumption is that a competitor will swoop in with something that will hurt them more than the money they are making. I think that is flawed.
My well-formed framing is:
> Gaming is a mature market with a huge range of capable surplus hardware
> Gaming forms Nvidia's biggest revenue source
Meme learning, self-driving cars, robotics, etc. are all yet-to-be-determined markets that belong to no one.
A slew of companies are going after these markets. Nvidia maintains more-than-premium pricing in them.
Google and many other cloud providers, which buy hardware at large scale, got tired of this trend of hardware companies gouging them with enterprise premiums that amount to "you have lots more, so we charge you lots more." They began unseating a slew of companies under:
https://www.opencompute.org/
https://www.opennetworking.org/technical-communities/areas/specification/open-datapath/
https://fossbytes.com/open-compute-project-google/
https://www.networkworld.com/articl...red-to-be-entering-the-networking-market.html
https://cloud.google.com/tpu/

So, the competition for the next phase of computing has already swooped in and come out with things to hurt the money Nvidia thought it was going to get.

Beyond this, more and more is being pushed down to smartNICs, detaching CPUs from storage, as shown by the multi-core ARM chips that drive NVMe drives. In the enterprise, they already have draft specs for serverless storage.
[Image: introduction to NVMe over Fabrics]

Spot the CPU...

So, my thinking is not flawed; it's just deeply informed and forward-thinking beyond the average understanding, thus the lack of belief.

realibrad said:
Nvidia can milk their situation, invest in R&D in many different things, and when things get competitive again, compete.

Look here.
https://ycharts.com/companies/NVDA/r_and_d_expense
See how Nvidia has been increasing their R&D for a long while now?
Or this.
https://spectrum.ieee.org/view-from...-big-spender-for-semiconductor-rd-thats-intel

"The fastest growing R&D budget, the research firm said, is over at Intel’s Silicon Valley neighbor, Nvidia, whose nearly $1.8 billion R&D investment in 2017 topped its 2016 numbers by 23 percent."
Tossing money at R&D doesn't guarantee results, especially in an industry where everyone is doing so and where a now-trending ecosystem of cloud providers has made it its mission to completely destroy enterprise hardware profit margins. Nvidia, like others, is fighting an uphill battle against far more powerful and aligned groups that want to destroy the ability of Nvidia and the rest of the hardware sector to profit immensely from proprietary and overpriced wares. It's called commodification. In such a paradigm, margins and premiums are crushed. So, Nvidia is going to be in a fight for its life to maintain its main revenue source, gaming, against sound competition from AMD and now Intel. Others might join the party, but it would be difficult. Outside of this, Nvidia invested heavily in R&D in hopes of solidifying numerous new multi-billion-dollar markets. They're facing off against cloud providers with the opposite agenda: make that ecosystem open source, commodified, whitebox, and cheap. The cloud providers and open standards are going to win, and I can almost guarantee you that Nvidia will fail in its current agenda, which is tragic because they decided to blunt their main revenue source in an attempt to gain new markets. They are even boldly attempting to push an unwilling and reeling consumer segment into enterprise pricing. It's about the most lethal mistake a company can make at its peak: alienate and hurt your main revenue source in search of new markets that you end up failing to capture. You end up with horrid ROI on R&D, a scattered and confused internal structure, and a frontal attack against your bread and butter. Tech companies go through this all the time at their peak. IBM comes to mind, as do others that are no longer in existence.

realibrad said:
Nvidia is taking many of their profits and doing R&D. They are not sitting still and are actively trying to expand the market, rather than maintain their lead in the markets they are currently winning.
Their profits, resources, and R&D are clearly focused on an agenda of trying to corner a range of markets with higher margins and proprietary lock-ins. That's why it will fail in epic fashion against far more powerful groups with bigger budgets and a shared goal of destroying hardware profit margins, commoditizing the hardware, and open-sourcing the software stack. They're not going to expand, because their expansion is spearheaded by ridiculous margins and proprietary lock-in versus much more powerful forces pushing toward lower margins and open source. They're going to fail to capture those new markets from potential customers that roll their own hardware for far cheaper in open consortiums, and they're going to wake up to feverish competition in their core gaming market after having opened the door with an arrogant money grab (GeForce 20). Ray tracing is only going to be successful through Vulkan and a fairly standardized hardware interface to a completely separate, MCM-chained chip. Tensor cores are a meme compared to their final form and will be seen as largely incapable of processing the more complex and necessary variants. 7nm and an industry maturing toward MCM will allow a fuller and sounder implementation to be realized.

Screen-cap this post.
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,502
136
TheF34RChannel said:
The 2080 Ti replaces the Titan Xp, with the price to match. 7nm won't be offered for cheaper; prices almost always go up. Sounds like a nice monitor!

Much like what happened on the CPU side, the only hope for downward pressure on prices is for AMD to come out strong at 7 nm with Navi. I'm not holding out much hope at this point though. Maybe the next gen after Navi will restore some balance.
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
SteveGrabowski said:
I wouldn't touch a 1070 or 1070 Ti when Pascal is more than two years old now. The 1070 is still above fake MSRP on newegg and 1070 Ti is above MSRP there except for one EVGA blower and a Zotac dual fan card. No thanks. If 2060 isn't amazing I'll stick with my 970 another year.

You are correct, but imagine what would happen if Nvidia threw a 30% discount on their whole 10XX stack. Wouldn't you want a 1070 then? Personally, I think I would have a very hard time staying away from a 1080 Ti and calling it a day until 7nm.
 

Hitman928

Diamond Member
Apr 15, 2012
6,755
12,502
136
Found this interesting with regard to the GPU pricing talk. I think whoever said they are pushing pricing up to let the 10xx series stay in the market alongside the 20xx series, clearing 10xx inventory while bringing the new GPUs to market, was spot on.

https://www.digitimes.com/news/a20180822PD202.html

Accordingly more than 10 graphic card makers have no other choice but to swallow contracted shipments released by Nvidia to deplete its inventories, in order to secure that they can be among the first batch of customers to get sufficient supply quotas of new-generation GPUs.
 
  • Like
Reactions: ub4ty

ub4ty

Senior member
Jun 21, 2017
749
898
96
psolord said:
You are correct, but imagine what would happen if Nvidia threw a 30% discount on their whole 10XX stack. Wouldn't you want a 1070 then? Personally, I think I would have a very hard time staying away from a 1080 Ti and calling it a day until 7nm.
Already have one, as many have for years; it was a great value and performance increase.
I will continue to have it for years more. Combined with DDR prices, I think the greed in computer hardware is finally going to cause people to wholesale wait it out; there is no compelling reason to rush to buy more capable hardware. I still spend a good deal of time on a dual-core, 8GB, low-power machine with integrated graphics because that's all you need for mainstream computing. I game on a quad-core CPU with a Maxwell GPU and have no compelling reason to change that. When prices are reasonable, I honestly don't think much and upgrade often. However, as with all consumers, there's a price ceiling that causes me to step back and think about what's going on. When companies greedily hit it, there are broader consequences beyond an immediate sale. I tend to slip into protection/speculation mode about not only upcoming purchases but prior ones. My spending habits retract for some time when I feel I am being outright gouged.

For my more serious computing, I have a range of eval hardware builds consisting of 8-core/16-core CPUs and Pascal GPUs.

> 7nm
> PCIE 4.0
> DDR4 prices falling by half
> HBM2.0/3.0 equipped GPUs
> NVMe/SSD prices cut in half or more

I'm waiting it out. A good chunk of people were swept up in the frenzy of 2014/2015/2016/2017 when things were quite affordable and performance increases were substantial. Then things got crazy. So, I'd say a number of people are already rigged up, which tends to be good for 5 years, i.e., until 2019/2020/2021/2022. 2019 probably marks the start of a new big wave of upgrade considerations, which will get refined in 2020/2021/2022.
 
Last edited:
  • Like
Reactions: Grazick

Cableman

Member
Dec 6, 2017
78
73
91
sze5003 said:
A 4K 144 Hz monitor will be close to $2k if not more, and you'd need two 2080 Tis anyway in that case. I have no idea if one 7nm card will do that.
I am willing to spend a lot for a quality monitor because I expect it to last a long time, but I am not willing to spend $1k+ on a GPU. I am also not sure that the 7nm GPU (3080 Ti?) would take advantage of that monitor, but I guess we will see next year? To be honest, I am so behind with my gaming library that, with the older games I am playing, it might well be enough.
TheF34RChannel said:
The 2080 Ti replaces the Titan Xp, with the price to match. 7nm won't be offered for cheaper; prices almost always go up. Sounds like a nice monitor!
That's what I am worried about. $1200 for a GPU is too much for me, and while the 2080 Ti replaces the Titan price-wise, I am not sure that's the case with performance. But yes, once those monitors come out, they will be something to behold.
 

psolord

Platinum Member
Sep 16, 2009
2,142
1,265
136
ub4ty said: (post quoted in full above)

I understand where you are coming from. Heck, I waited till 2018 to upgrade my 2500K. However, especially for GPUs, there are many in-between states, from depriving myself of gaming joy to rushing out and buying a super expensive 2080 Ti.
 
  • Like
Reactions: PeterScott

realibrad

Lifer
Oct 18, 2013
12,337
898
126
ub4ty said: (post quoted in full above)

I will remind you that your point was that companies get complacent, and then a competitor comes in with something new/better and captures market share.

"Until your shiny new proprietary approach fails/is rejected and doing so w/ no regard for your costumers results in you losing your business to new competition and existing that capitalizes on the sizable and executable value gap you have created."

Nvidia is doing the opposite of what you are explaining here. Yes, they have captured the vast majority of the gaming market and as such command a premium for their product. They are also expanding into other markets, which gives them a better chance of receiving revenue if they get out-competed.

Nvidia is in no such danger of being unseated in the next few years. They are investing heavily so they stay competitive in the future. That is the exact opposite of what you are proposing.

If you take your example of Intel, and look at their R&D, you see it go up and down quite a bit. Intel is a far better example of what not to do, while Nvidia is an example of what to do currently.
 

TheF34RChannel

Senior member
May 18, 2017
786
310
136
Hitman928 said:
Much like what happened on the CPU side, the only hope for downward pressure on prices is for AMD to come out strong at 7 nm with Navi. I'm not holding out much hope at this point though. Maybe the next gen after Navi will restore some balance.

Nah, been seeing this kind of hope forever and it never happened. Intel GPUs are our best bet.

Cableman said:
I am willing to spend a lot for a quality monitor because I expect it to last a long time, but I am not willing to spend $1k+ on a GPU. I am also not sure that the 7nm GPU (3080 Ti?) would take advantage of that monitor, but I guess we will see next year? To be honest, I am so behind with my gaming library that, with the older games I am playing, it might well be enough. That's what I am worried about. $1200 for a GPU is too much for me, and while the 2080 Ti replaces the Titan price-wise, I am not sure that's the case with performance. But yes, once those monitors come out, they will be something to behold.

You get so much more longevity out of a display than out of a card that it's well worth the extra money. The only problem is that one needs the appropriate hardware to utilize said display ;)

I got my monitor while sporting a 970 and it was nightmarish. Chucked in a 980 Ti Classified (loved that enough to marry it) to set it loose and it ran like a dream.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
psolord said:
I understand where you are coming from. Heck, I waited till 2018 to upgrade my 2500K. However, especially for GPUs, there are many in-between states, from depriving myself of gaming joy to rushing out and buying a super expensive 2080 Ti.
Most people are already on pretty capable Maxwell or Pascal cards and quad-core CPUs.
The minority are below this threshold or above it (1080 Ti/Titan etc.).
The majority settle in around:
980/750 Ti/1050/1060/1070 and mid-tier AMD RX video cards.

There's really nothing uncomfortable about gaming on such cards. It would probably be years from now before I'd even want to game on my compute-only 1070. Most people aren't on the latest "big" titles. 4K high-FPS gaming is a literal meme. Only in a mainstream forum does a luxury segment arise that argues this is the standard. It's not.

The latest big graphics titles have been utter failures because the gaming studios decided to focus on politics instead of good game-making. A large number of people still play CS:GO. Dota/League of Legends can be played on dumpster-tier cards. PUBG and Fortnite can be played at 100+ fps on gutter-tier cards.

Game developers make money when tons of people play their games. They, unsurprisingly, aren't interested in making their games unplayable on cheap and dated video cards.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
realibrad said:
I will remind you that your point was that companies get complacent, and then a competitor comes in with something new/better and captures market share.
That wasn't my point. My general and clear point was that a mature big business is historically incapable of avoiding the trappings that result in its ultimate demise. It's a product of nature: life/death/decay. Nothing can grow or sustain itself forever. Things that do are parasitic diseases, and even they reach a point of greed that results in their retraction.

realibrad said:
Nvidia is doing the opposite of what you are explaining here. Yes, they have captured the vast majority of the gaming market and as such command a premium for their product.
They aren't. Everything about their new product is proprietary and non-standard.
Everything about it reflects premium pricing.
Everything about it reflects a lack of desire to reduce cost in manufacturing via MCM.
Nvidia has R&D on MCM but has yet to implement it. The premium you claim they can charge is causing the whole internet to question their product and direction. If you think this is a positive thing, I have no clue how to carry on our discussion.

realibrad said:
They are also expanding into other markets, which gives them a better chance of receiving revenue if they get out-competed.
They are approaching it in a pigheaded, proprietary, high-margin manner, which is why they're failing to gain traction. Their potential customers are spinning their own hardware with the completely opposite thinking: cheap/low-margin. Did you read anything I posted about this? Or are you ignoring it? The enterprise is spinning its own hardware. They started years ago.

realibrad said:
Nvidia is in no such danger of being unseated in the next few years. They are investing heavily so they stay competitive in the future. That is the exact opposite of what you are proposing.

If you take your example of Intel, and look at their R&D, you see it go up and down quite a bit. Intel is a far better example of what not to do, while Nvidia is an example of what to do currently.
Nvidia is doing everything Intel did that landed Intel in its current predicament.
They're doing what all historically big and successful tech companies do, demonstrating that no one is beyond nature: life -> decay -> death. The bigger you are, the harder you fall. David and Goliath. Small, agile, innovative, and hungry vs. big, slow, incremental, and lethargic.

A startup with $3 million of funding beat Nvidia and AMD to a focused ray tracing pipeline over 10 years ago.
[Image: Caustic ray tracing platform]

The methodology is pretty standardized and straightforward:
[Image: ray tracing pipeline, scene hierarchy generation]
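A minimal sketch of what that pipeline boils down to (deliberately simplified: a flat list of spheres instead of a real scene hierarchy; the traversal/intersection loop is the part dedicated ray tracing hardware accelerates):

Code:
// Simplified ray tracer core loop: closest-hit search over a (flat) scene.
// A real pipeline would first build a bounding-volume hierarchy and traverse it
// instead of brute-forcing every primitive.
#include <cmath>
#include <cstdio>
#include <vector>
#include <optional>

struct Vec3 { float x, y, z; };
static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Ray    { Vec3 origin, dir; };          // dir assumed normalized
struct Sphere { Vec3 center; float radius; };

// Returns the distance t along the ray to the nearest hit, if any.
static std::optional<float> intersect(const Ray& r, const Sphere& s) {
    Vec3 oc = sub(r.origin, s.center);
    float b = dot(oc, r.dir);
    float c = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0.0f) return std::nullopt;     // ray misses the sphere
    float t = -b - std::sqrt(disc);
    if (t < 0.0f) return std::nullopt;        // hit is behind the origin
    return t;
}

int main() {
    std::vector<Sphere> scene = {{{0, 0, 5}, 1.0f}, {{2, 0, 8}, 1.5f}};
    Ray ray{{0, 0, 0}, {0, 0, 1}};

    // Closest-hit search: the traversal/intersection step RT hardware accelerates.
    float best = 1e30f;
    int hitIndex = -1;
    for (size_t i = 0; i < scene.size(); ++i) {
        if (auto t = intersect(ray, scene[i]); t && *t < best) {
            best = *t;
            hitIndex = static_cast<int>(i);
        }
    }
    if (hitIndex >= 0)
        std::printf("closest hit: sphere %d at t = %.3f\n", hitIndex, best);
    else
        std::printf("ray escaped the scene\n");
    return 0;
}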


Denying this reality comes with its own eventual price and penalty.

[Image: PowerVR GR6500 ray tracing]

[Image: PowerVR ray tracing efficiency analysis]


True innovation tends to come from the small, hungry, yet-to-be-proven companies that have to do something game-changing to make their mark. Big companies either buy them up or adapt their proven technologies. It's why competition is always good and necessary. You're speaking as if Nvidia is invincible and was the first company to implement hardware-based ray tracing acceleration... without them we'd be doomed... hardly the case. They, like any other company, have to deliver new technology at reasonable prices to stay alive. Believing otherwise allows others to take your markets from you. Greedily capitalizing on your lead has diminishing returns: see Intel as a recent example. Oracle comes to mind as well. Innovation slows as profit steering becomes your main design philosophy.
 
Last edited:
  • Like
Reactions: psolord

ub4ty

Senior member
Jun 21, 2017
749
898
96
Vulkan apparently already supports RayTracing, shown on RTX at Gamescom.

I'm only interested in ray tracing through Vulkan or direct access.
Ray tracing via proprietary DirectX 12 is DOA for me.
Nvidia is supporting Vulkan-based ray tracing, as is AMD, and I think we can all look forward to a much better hardware ecosystem enabled by Vulkan.
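For the curious, here's a minimal sketch that asks the Vulkan driver which ray tracing extensions a GPU exposes (extension names vary by vendor and Vulkan version, e.g. VK_NV_ray_tracing, so it just substring-matches; illustration only, not any vendor's official sample):

Code:
// Minimal sketch: list Vulkan devices and report any ray tracing extensions.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

int main() {
    VkApplicationInfo app{};
    app.sType = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    app.apiVersion = VK_API_VERSION_1_1;

    VkInstanceCreateInfo ici{};
    ici.sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    ici.pApplicationInfo = &app;

    VkInstance instance;
    if (vkCreateInstance(&ici, nullptr, &instance) != VK_SUCCESS) {
        std::fprintf(stderr, "No Vulkan driver available\n");
        return 1;
    }

    uint32_t devCount = 0;
    vkEnumeratePhysicalDevices(instance, &devCount, nullptr);
    std::vector<VkPhysicalDevice> devices(devCount);
    vkEnumeratePhysicalDevices(instance, &devCount, devices.data());

    for (VkPhysicalDevice dev : devices) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(dev, &props);

        uint32_t extCount = 0;
        vkEnumerateDeviceExtensionProperties(dev, nullptr, &extCount, nullptr);
        std::vector<VkExtensionProperties> exts(extCount);
        vkEnumerateDeviceExtensionProperties(dev, nullptr, &extCount, exts.data());

        std::printf("%s:\n", props.deviceName);
        for (const VkExtensionProperties& e : exts) {
            // Extension naming differs across vendors/driver eras, hence substring match.
            if (std::strstr(e.extensionName, "ray_tracing") ||
                std::strstr(e.extensionName, "raytracing"))
                std::printf("  ray tracing extension found: %s\n", e.extensionName);
        }
    }

    vkDestroyInstance(instance, nullptr);
    return 0;
}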
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Cableman said:
I am willing to spend a lot for a quality monitor because I expect it to last a long time, but I am not willing to spend $1k+ on a GPU. I am also not sure that the 7nm GPU (3080 Ti?) would take advantage of that monitor, but I guess we will see next year? To be honest, I am so behind with my gaming library that, with the older games I am playing, it might well be enough. That's what I am worried about. $1200 for a GPU is too much for me, and while the 2080 Ti replaces the Titan price-wise, I am not sure that's the case with performance. But yes, once those monitors come out, they will be something to behold.
It took me forever to get my 1440p monitor; I was so indecisive about it, but when I got it I was glad I did. It's too bad there aren't any decent high-refresh-rate 4K monitors. I was expecting that if the 2080 Ti were performant enough, I'd go to 4K at this point. But there really isn't any future-proofing in monitors if I buy one now, and there is no guarantee that next year there will be a card that can smoothly drive those high-Hz frames the way my 1440p screen does now with one 1080 Ti.

I heard somewhere that Sept 14th is when the NDA should lift and reviews should be out.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
9,373
8,067
136
psolord said:
You are correct, but imagine what would happen if Nvidia threw a 30% discount on their whole 10XX stack. Wouldn't you want a 1070 then? Personally, I think I would have a very hard time staying away from a 1080 Ti and calling it a day until 7nm.

That would mean a $315 1070 Ti. I'd still say no. The best two years of Pascal have been used up, and Nvidia is probably going to stop optimizing for Pascal within a year or so. If Pascal goes on fire-sale prices like Hawaii did in late 2014, then maybe I'd jump on a 1070 Ti. Otherwise? No.
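(Quick sanity check, assuming the 1070 Ti's $449 launch MSRP: 0.70 × $449 ≈ $314, i.e. roughly the $315 figure above.)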
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
ub4ty said: (post quoted in full above)

Your position seems to be that they are doomed to fail and that this is simply the natural order, while you also argue that they are doing something wrong.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
ub4ty said:
As stated, the only thing that matters beyond the immature meme cores (tensor/ray tracing) is what Nvidia did to the core architecture that resulted in speedups:
If you want to inline images from Videocards, you need to remove the s from https in the image URL.
 
  • Like
Reactions: ub4ty

Cableman

Member
Dec 6, 2017
78
73
91
TheF34RChannel said:
You get so much more longevity out of a display than out of a card that it's well worth the extra money. The only problem is that one needs the appropriate hardware to utilize said display ;)

I got my monitor while sporting a 970 and it was nightmarish. Chucked in a 980 Ti Classified (loved that enough to marry it) to set it loose and it ran like a dream.

sze5003 said:
It took me forever to get my 1440p monitor; I was so indecisive about it, but when I got it I was glad I did. It's too bad there aren't any decent high-refresh-rate 4K monitors. I was expecting that if the 2080 Ti were performant enough, I'd go to 4K at this point. But there really isn't any future-proofing in monitors if I buy one now, and there is no guarantee that next year there will be a card that can smoothly drive those high-Hz frames the way my 1440p screen does now with one 1080 Ti.

I heard somewhere that Sept 14th is when the NDA should lift and reviews should be out.

Picking the right monitor is my priority (and a capable GPU to drive it). I work from home most of the time and I spend 12+ hours a day staring at my screen. I am willing to spend a lot on a good monitor and peripherals. I love reading about technology, but I actually don't like replacing it that often. There are plenty of excellent 1440p high Hz monitors right now, but I am certain that I will want something better (4k HDR) in a year or two so I'd rather wait another year or two and then get a better monitor.

I generally agree about future proofing, but I think that a 32" 4k HDR 144Hz mini-LED monitor is as close to future proofing as I will get. That should be available 2019/2020. I expect the 7nm GPUs around the same time. I am catching up on games from 2012-2016 so even if a 2019 GPU won't play 2019 games at 4k144, as long as I can get that for let's say Doom (2016), I will be happy. It will take me some time to catch up on the older games, by that time there will be more capable GPUs that can let me play newer titles. I always wait until the games are old enough to be on a 50%+ sale so I am not interested in the "cutting edge" stuff and frankly I don't have the free time to play everything anyway. So I hope/think that 7nm GPUs will be enough to drive those games at 4k144. The question given what I am seeing now is at what price.

Turing has some cool new technology, but I see it as a stop-gap GPU; 7nm is not far off, and that should give us a much bigger leap in performance and a much more mature RTX implementation. The current GPUs are for those who simply must have the newest toy. They are the ones that will be dropping that kind of money again in a year's time for the 7nm GPUs. I'd rather wait and enjoy what I have.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
Cableman said: (post quoted in full above)

I also work from home sometimes, so I know what you mean. I found one 27-inch 4K 144 Hz G-Sync monitor, the Acer Predator, which is on sale at Micro Center for $1799 right now. Sure, I could pay that, but do I really want to? It has some odd side-curtain things on it too. I could simply get a 4K 60 Hz monitor and just use that.

I don't think next year's GPUs will be any cheaper, seeing how Nvidia loves upcharging, mainly now to clear 10-series stock. I think they will go as high as they can with prices, but if they do that again next year, what would be their excuse? This time it's a new generation with a new architecture for ray tracing. Next year will build upon that, so we will see. Waiting is a good technique; I too recently got into some games I passed up on PC when they released, and I got them super cheap later.
 