Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.

Status
Not open for further replies.

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
It's puzzling. If it's even close to true, why couldn't they realize that a demo of it would have cards flying off the shelves?
It would have taken only a minute out of their presentation.

Actually, there was a demo of this, but of course it wasn't a blanket all-games thing. They demoed UE4 Infiltrator running at 4K at over 70 FPS, while they claimed it runs at less than 40 FPS on a 1080 Ti.

But that kind of one-off demo just makes you question why it happened in that one instance.

The true performance question won't be answered until reviewers test a variety of games. Both NVidia and AMD are always guilty of showing performance demos that are most favorable to the hardware they want to sell.

Third-party reviews are the only real answer.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
It's puzzling. If it's even close to true, why couldn't they realize that a demo of it would have cards flying off the shelves?
It would have taken only a minute out of their presentation.

The product they are launching is not meant to be a high-volume product. The loss from all that unsold glut of last-generation chips could outweigh the gains: you would end up promoting the new chip and selling it instead of the old one. Yes, you would have better margins, but it might not be enough.

Further, you are going to have a bunch of pissed-off vendors that bought way too much of the old stock and can't sell it. So it's better to position the new card above the old so they don't compete as much as they normally would. Sell out of the old, get some nice margins for investors, and kill it with 7nm.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
To no one's surprise, the TU104 chip is confirmed for the 2080. GN is doing a teardown:

https://www.youtube.com/watch?v=U5XKtobx7ro

edit: here's a quick pic if anyone wants a go at sizing it.
[Image: TU104 die photo]
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
There were a number of people who said they wanted to have a technical discussion about ray tracing and HW ray tracing, but I don't see them initiating one. So let's get into it and put a nail in this teased coffin.

Caustic Graphics was one of the first companies to implement hardware ray tracing, a DECADE ago.
https://www.crunchbase.com/organization/caustic-graphics#section-overview
Here's the detail: they operated on $3 million of funding from 2006-2010 and were acquired by Imagination Technologies for $21 million in 2010. WHERE'S THE BILLION-DOLLAR R&D BUDGET? Ten years of R&D work... nowhere.

Why? Because all ray-tracing hardware amounts to is a bunch of ALUs performing vector math in parallel, with handler logic for ray convergence and divergence, set upon a BVH (bounding volume hierarchy). It's a cheap and dumb ASIC.
https://www.anandtech.com/show/2752
[Image: CausticOne board]

The major differences with this hardware will be with the replacement of the FPGAs with ASICs (application specific integrated circuit - a silicon chip like a CPU or a GPU). This will enable an estimated additional 14x performance improvement as ASICs can run much much faster than FPGAs. We could also see more RAM on board as well. This would bring the projected performance to over 200x the speed of current CPU based raytracing performance.
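The "vector math set upon a BVH" described above bottoms out in a per-node box test during traversal. A minimal sketch of that test (illustrative Python, not any vendor's actual hardware):

```python
import math

# Minimal sketch of the ray/AABB "slab" test a BVH traversal unit runs at
# every node of the hierarchy. Illustrative only.
def ray_aabb_hit(origin, direction, lo, hi):
    tmin, tmax = 0.0, math.inf
    for i in range(3):                      # one slab per axis
        if direction[i] != 0.0:
            t1 = (lo[i] - origin[i]) / direction[i]
            t2 = (hi[i] - origin[i]) / direction[i]
            tmin = max(tmin, min(t1, t2))   # latest entry into any slab
            tmax = min(tmax, max(t1, t2))   # earliest exit from any slab
        elif not (lo[i] <= origin[i] <= hi[i]):
            return False                    # parallel ray outside the slab
    return tmin <= tmax                     # interval non-empty => box hit

print(ray_aabb_hit((0, 0, -5), (0, 0, 1), (-1, -1, -1), (1, 1, 1)))  # True
print(ray_aabb_hit((5, 5, -5), (0, 0, 1), (-1, -1, -1), (1, 1, 1)))  # False
```

A node either passes this test (descend into its children) or fails (cull the whole subtree), which is exactly the convergent/divergent control logic around cheap vector ALUs that the poster is describing.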

Imagination Technologies then integrated this tech into their PowerVR line:
https://www.imgtec.com/legacy-gpu-cores/ray-tracing/
[Image: PowerVR ray-tracing block diagram]


I can almost guarantee you that Nvidia does nothing incredibly different, because it's a standardized algorithm. I could implement this in Verilog and have it functioning on an FPGA dev board in a couple of weeks. It's parallelized convergent and divergent vector math. From the horse's mouth:

https://www.zhihu.com/question/290167656
The RT core essentially adds a dedicated pipeline (ASIC) to the SM to compute ray-triangle intersections, with access to the BVH. Because it is application-specific circuit logic, its intersection performance per mm2 can be an order of magnitude better than computing the intersections in shader code. Although I have since left NV, I was involved in the design of the Turing architecture; I was responsible for variable-rate shading. I am excited to see the release now.
Author: Yubo Zhang (translated)

This isn't anything complicated, nor are tensor cores, which is why Google was able to whip up their own ASICs overnight, as have others.


So, there's your technical discussion. A time-tested vector math algorithm is not complicated. Slap some ALUs together with control logic, registers, and memory.
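For concreteness, the ray-triangle intersection named in the engineer's quote above is commonly the Möller-Trumbore test. A minimal software sketch (illustrative Python for clarity, not Turing's actual fixed-function circuit):

```python
# Illustrative sketch of the Möller-Trumbore ray/triangle intersection,
# the per-leaf test an RT core pipelines in dedicated logic.
def sub(a, b):   return tuple(a[i] - b[i] for i in range(3))
def dot(a, b):   return sum(a[i] * b[i] for i in range(3))
def cross(a, b): return (a[1]*b[2] - a[2]*b[1],
                         a[2]*b[0] - a[0]*b[2],
                         a[0]*b[1] - a[1]*b[0])

def ray_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Distance t along the ray to the hit point, or None on a miss."""
    e1, e2 = sub(v1, v0), sub(v2, v0)      # triangle edge vectors
    pvec = cross(direction, e2)
    det = dot(e1, pvec)
    if abs(det) < eps:                     # ray parallel to triangle plane
        return None
    inv_det = 1.0 / det
    tvec = sub(origin, v0)
    u = dot(tvec, pvec) * inv_det          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    qvec = cross(tvec, e1)
    v = dot(direction, qvec) * inv_det     # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, qvec) * inv_det
    return t if t > eps else None

# Ray marching along +z from z=-1 hits a triangle in the z=0 plane at t = 1.0
print(ray_triangle((0, 0, -1), (0, 0, 1), (-1, -1, 0), (1, -1, 0), (0, 1, 0)))
```

It really is just dot products, cross products, and a few compares, which is the poster's point: the win from an RT core is doing this in parallel fixed-function silicon instead of shader code, not algorithmic novelty.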



Those who came up with the algorithm and refined it are the people who truly deserve praise. You know, like the researcher who got a mention in the presentation. Stamping this into an ASIC is child's play. So what every discussion should be about IS THE PRICE and performance... Because Nvidia didn't create the algorithm, and they weren't the first to put ray tracing into HW or ASIC form. That was done over a decade ago, and I don't hear anyone praising the small group who got together and did it on a much smaller budget, because people actually don't care about the technical accomplishment. You want to pretend like you do, but you don't.

So, just about everyone who's a consumer of this product, on just about every site on the internet, agrees: this pricing is absurd. From the /r/nvidia green-team members to prominent hardware reviewers, everyone's aghast at the price, and they have every reason to be. Nvidia can chalk it up to die size like Intel does (well, guess what... go MCM). However, that's a logistics problem they need to address and should address. No one forced them to do what they're doing. Intel is facing similar pains because of this more costly approach.

Then comes the psychology of it all. You're launching a yet-to-be-proven technology that you need to sell people on, as if you don't care whether people adopt it or not... This is how big and great tech companies begin faltering, and it's why it's always good that new ones are allowed to enter. This is sheer arrogance, IMO. They didn't create the algorithms underlying AI, but they stamp them into ASICs (tensor processing units) and try to gouge the hell out of people. They didn't create the ray-tracing algorithms, but they stamp them into ASICs (ray-trace units) and try to gouge the hell out of people. This is exactly how and why someone comes out of left field and kicks a yuge successful company in the nuts.

This is why I have every belief that we are headed toward a democratized accelerator-card architecture with a high-speed standardized bus connecting the parts. AMD is preparing for this; it is known as Heterogeneous System Architecture (HSA). The platform must be democratized at this point. Technology is currently chained to the ground because greedy, myopic companies want to preserve their legacy pricing models.

- PCIe switches need to come to the desktop and out of the enterprise.
- PCIe 4.0 needs to become a thing pronto.
- A new lower-latency open-standard bus protocol needs to become a thing.
- The CPU should become a high-level processing router.

The age of dedicated and swappable accelerator cards needs to become a reality, and this ridiculous desire to silo compute into proprietary domains needs to be broken.

So, I chalk this up to a grand miscalculation on Nvidia's part. They just alienated the whole gaming community, which is already on high alert and reeling from the virulent price gouging that occurred during the crypto mania. No one came to gamers' defense or aided them; everyone denied culpability while profiting from the cancerous plague that wasted the earth's resources on a Ponzi scheme. You think gamers forgot about this? The extortionist pricing? The denial by the whole chain of profiteers as to why it persisted? Direct-selling thousands of gaming GPUs to miners while shelves lay bare? The free-for-all on the consumer?

And you repay them with this? Slapping a decade-old proven technology into your pipeline, with no performance data, at memecoin price levels? Then letting this simmer for a month while you let virulent marketing poke and prod your consumers? Go over to /r/nvidia. Tons of people woke up today and cancelled their pre-orders when they saw the horrid performance numbers in the Tomb Raider demo, and they were pissed about it. You call this good marketing? The psychological mind-@#%! companies do around viral product launches has to stop. It's getting old and long in the tooth. People are over it. If you're going to announce and launch the product, get your grown arse on stage, detail the product like an adult with the metrics people want to see, and call it a day.

Steve Jobs coined a particularly effective launch format, and it was apt for the time, the age, and the magnitude of the product. Then, after that innovative way of launching a product, we got the imitator, and now the idiot. Enough is enough with this hot garbage.

They could have humbly announced a bold new feature they hope will change gaming and compute for ages, and launched with the lowest-end card first. Solidify a user base. Solidify buy-in. Let everyone know this is a ship that aims to usher everyone to the next level in computing. See what shapes up; see what unique ways people find to use the new cores. Announce the SDK and make all the features available at every price point. See what comes of it. Let your consumers make new markets for you.

Instead, they jawboned this at extortionist prices and walked off stage as if everyone has to buy it. Word on the street is that people clearly don't, and aren't sold on it. In fact, they're offended at being tried like this.

Meanwhile, game makers are inserting annoying and heavily charged political propaganda into every game they sell, doing everything to piss off their user base, and then telling them: if you don't like it, don't buy the game. This is peak arrogance, from a chapter in technology that is soon going to be slammed shut.

Good riddance. I want new ideas, new companies, and new products. This is why everything in this universe is cyclical: it affords a chance for a cleansing to occur when a wave has jumped the shark.
 
Last edited:

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
Why don't people give away free stuff? Look at the Chevy Volt: EV plus gas engine. Arguing that they should sell it at the same price as a gas-engine-only car because that is the most important part is nonsensical. It costs more money to have the EV part in there, and it costs more money to have the RT/tensor cores in there. This is a business run with a profit motive, not a charity giving fans what they want at the price they want.



It's not cheap; each process shrink increases the cost/mm2. The 2080 Ti has the largest, most expensive die ever in a consumer GPU. So it follows that it is the most expensive consumer GPU ever.



The problem here is huge dies and expensive GDDR6 memory. Those things increase the card's cost, and NVidia is a business, not a charity; they will try to maintain their margins, so when selling something that is more expensive to produce, they will ask more from the buyer to compensate.

Whenever anyone has attempted to add this kind of new HW that eats a lot of die space, there is going to be a hiccup in price/performance for older games that don't use the new HW.

There are two options:

a) You live with that hiccup.
OR
b) You never have this kind of advance.


I think (a) is a lot better way to proceed. Because there is no (c) where it comes for free.

OK, I am a little tired of hearing all about costs, costs, costs. Costs as the justification for upping prices. You're either defending a company for completely maximizing shareholder profit (which is fine if you are talking in financial circles), or you are explaining increasing consumer-facing prices by the costs that NVIDIA is getting hit with. The thing is, NVIDIA is a publicly traded company, and their margins will tell you all you need to know about costs versus price.

1. Why are we (consumers) not allowed to complain about a company increasing prices, on what amounts to basically a fan forum? We are the people who buy these products. They are increasing the cost of the products out of line with the performance they bring, and there you are defending the company because it's a... "company." We get it. Every 10-year-old from a Westernized country gets it. Companies exist to make money; everyone understands that.

2. Let's take a look at your cost narrative. NVIDIA GAAP gross corporate margin:
2012: 51.4%
2013: 52.0%
2014: 55.5%
2015: 55.9%
2016: 56.1%
2017: 60.0%
Q218: 63.3%

End of story: NVIDIA is making record margins!
They are keeping $63 of every $100 they sell after manufacturing costs are taken into account (gross margin sits above the R&D and other operating-expense lines). This is all because of increased prices. $600 RTX 2070s will help keep this number inflated.

If a company were increasing prices because costs were increasing, you wouldn't see such explosive margin growth. Even Intel, with zero competition, peaked at roughly 61% margin last year before AMD stepped up its game.

Nvidia, despite plunging huge sums into partner programs to spread CUDA into more and more enterprise and deep-learning/AI applications, is making record margins. Their COGS (cost of goods sold), which is where the fab and mask costs attributable to processes and foundries show up, has been relatively flat. So please, stop saying these cards cost more to make so NVIDIA has to charge more.
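The "$63 of every $100" line is just the gross-margin definition applied to the quoted Q2'18 figure. A quick sketch of the arithmetic (the dollar amounts are illustrative, not taken from NVIDIA's filings):

```python
# Gross margin = (revenue - COGS) / revenue.
# Using the 63.3% figure quoted in the post; dollar amounts illustrative.
def gross_margin(revenue, cogs):
    return (revenue - cogs) / revenue

revenue = 100.0
cogs = revenue * (1 - 0.633)   # COGS implied by a 63.3% gross margin
kept = revenue - cogs          # gross profit per $100 of sales
print(round(kept, 1))          # 63.3
```

Since gross margin sits above the R&D line, it isolates exactly the fab, memory, and board costs that the cost argument leans on, which is why the poster treats a flat COGS as the end of the story.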
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
The product they are launching is not meant to be a high-volume product. The loss from all that unsold glut of last-generation chips could outweigh the gains: you would end up promoting the new chip and selling it instead of the old one. Yes, you would have better margins, but it might not be enough.

Further, you are going to have a bunch of pissed-off vendors that bought way too much of the old stock and can't sell it. So it's better to position the new card above the old so they don't compete as much as they normally would. Sell out of the old, get some nice margins for investors, and kill it with 7nm.
When your consumers know the games you're playing, it's no longer a game anymore.
When your consumers form expectations and know exactly what you're doing, you no longer have the power to dictate the outcome of a market. So sure, from a mindless and soulless business standpoint you could do that. They also could have not catered to a transient Ponzi scheme, pissed off their core customers, and landed themselves in this situation. No one in this market has forgotten that. Piss off vendors? They'd better start caring about their customers.

This, btw, is not the way you launch such a feature. In two days, people have become convinced that ray tracing is a meme. Speculation abounds, and it's all negative. Ray tracing for gaming is effectively being rejected, because of extortionist pricing. Great way to launch a feature. Come 7nm, when your game plan is in full vogue, no one will be interested.

This is how giants fall. Intel coming to graphics in 2020? Sounds great. I'm tired of both AMD and Nvidia GPUs. Add more and newer companies if possible. While you're at it, break meme-learning cores and ray-tracing cores out into new add-on boards and add tens of new companies to the mix there. Going to release a dedicated meme-learning card at basement pricing? Sounds good. Sign me the hell up.

You sometimes have one chance to achieve buy-in on a new product/feature. They failed.
I'm not excited about ray tracing from these guys. I'm excited about someone new advancing it to what it should be and beyond.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,772
4,739
136
OK, I am a little tired of hearing all about costs, costs, costs. Costs as the justification for upping prices. You're either defending a company for completely maximizing shareholder profit (which is fine if you are talking in financial circles), or you are explaining increasing consumer-facing prices by the costs that NVIDIA is getting hit with. The thing is, NVIDIA is a publicly traded company, and their margins will tell you all you need to know about costs versus price.

1. Why are we (consumers) not allowed to complain about a company increasing prices, on what amounts to basically a fan forum? We are the people who buy these products. They are increasing the cost of the products out of line with the performance they bring, and there you are defending the company because it's a... "company." We get it. Every 10-year-old from a Westernized country gets it. Companies exist to make money; everyone understands that.

2. Let's take a look at your cost narrative. NVIDIA GAAP gross corporate margin:
2012: 51.4%
2013: 52.0%
2014: 55.5%
2015: 55.9%
2016: 56.1%
2017: 60.0%
Q218: 63.3%

End of story: NVIDIA is making record margins!
They are keeping $63 of every $100 they sell after manufacturing costs are taken into account (gross margin sits above the R&D and other operating-expense lines). This is all because of increased prices. $600 RTX 2070s will help keep this number inflated.

If a company were increasing prices because costs were increasing, you wouldn't see such explosive margin growth. Even Intel, with zero competition, peaked at roughly 61% margin last year before AMD stepped up its game.

Nvidia, despite plunging huge sums into partner programs to spread CUDA into more and more enterprise and deep-learning/AI applications, is making record margins. Their COGS (cost of goods sold), which is where the fab and mask costs attributable to processes and foundries show up, has been relatively flat. So please, stop saying these cards cost more to make so NVIDIA has to charge more.
Incoming. But, but, but.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
"Proof of concept" for this first generation is something I totally agree with. I'd much rather stick with higher-resolution gaming than go back to 1080p to see some nicer shadows and reflections at the penalty of lower frame rates. It's like they are jumping ahead but also falling down a little. Not to mention that these cards are very expensive, and lots of people have already moved away from 1080p.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Ok I am a little tired of hearing all about costs costs costs. Costs for justification for upping prices. You're either defending a company for completely maximizing shareholder profit (which is fine if you are talking in financial circles) or you are explaining increasing consumer-facing prices because of costs that NVIDIA is getting hit with. The thing is, NVIDIA is a publically traded company and their margins, which will tell you all you need to know about costs versus price.

Increased production cost is a completely reasonable reason to increase selling price, and it is pretty much a universal occurrence.

The 754mm2 die is unprecedentedly large. For some perspective, that is over 5 times as large as the Intel 8700K die.

If you know anything about how size affects yield and cost, you also know that the big die will end up costing much more than five 8700K dies.

Remind me again how much five 8700Ks cost, and bear in mind that those are pretty much nothing but die.

On top of the enormous die cost, you also have to add in 11GB of GDDR6, a huge power-delivery section, and an expensive cooler.

NVidia is not gouging here. This is a massively expensive product to build.

Now you are free to complain that it is too expensive for you and that you won't buy it.

But arguing that NVidia should just lower the price because 12nm is cheap is beyond naive.

It's too expensive for me. But if I won the lotto, I would be pre-ordering one 5 minutes later.
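The yield point can be made concrete with a toy defect-limited cost model. All the constants here are assumptions for illustration (wafer price, defect density, and the Poisson yield form are not TSMC or NVIDIA figures):

```python
import math

# Toy cost model: dies per 300 mm wafer (with an edge-loss correction)
# and a Poisson defect-limited yield. All constants are assumptions.
def dies_per_wafer(area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * area_mm2))

def cost_per_good_die(area_mm2, wafer_cost, d0=0.001):  # d0: defects per mm^2
    yield_frac = math.exp(-area_mm2 * d0)               # Poisson yield model
    return wafer_cost / (dies_per_wafer(area_mm2) * yield_frac)

big   = cost_per_good_die(754, wafer_cost=6000)   # 2080 Ti-sized die
small = cost_per_good_die(151, wafer_cost=6000)   # roughly 8700K-sized die
print(round(big / small, 1))   # roughly an 11x gap under these assumptions
```

The exact ratio moves with the assumed defect density, but the superlinear scaling (fewer candidate dies per wafer, and a smaller fraction of them good) is the mechanism behind "costs much more than five dies."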
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
When your consumers know the games you're playing, it's no longer a game anymore.
When your consumers form expectations and know exactly what you're doing, you no longer have the power to dictate the outcome of a market. So sure, from a mindless and soulless business standpoint you could do that. They also could have not catered to a transient Ponzi scheme, pissed off their core customers, and landed themselves in this situation. No one in this market has forgotten that. Piss off vendors? They'd better start caring about their customers.

This, btw, is not the way you launch such a feature. In two days, people have become convinced that ray tracing is a meme. Speculation abounds, and it's all negative. Ray tracing for gaming is effectively being rejected, because of extortionist pricing. Great way to launch a feature. Come 7nm, when your game plan is in full vogue, no one will be interested.

This is how giants fall. Intel coming to graphics in 2020? Sounds great. I'm tired of both AMD and Nvidia. Add more companies if possible. While you're at it, break meme-learning cores and ray-tracing cores out into new add-on boards and add tens of new companies to the mix there.

You sometimes have one chance to achieve buy-in on a new product/feature. They failed.
I'm not excited about ray tracing from these guys. I'm excited about someone new advancing it to what it should be and beyond.

Most people simply don't care. They buy a card and care nothing about this stuff. Nvidia will continue to do this, and nothing will change until they have to compete again. What they are doing likely won't hurt their brand, as most don't care.

Also keep in mind that gamers are not the main target anymore.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Increased production cost is a completely reasonable reason to increase selling price, and it is pretty much a universal occurrence.

The 754mm2 die is unprecedentedly large. For some perspective, that is over 5 times as large as the Intel 8700K die.

If you know anything about how size affects yield and cost, you also know that the big die will end up costing much more than five 8700K dies.

Remind me again how much five 8700Ks cost, and bear in mind that those are pretty much nothing but die.

On top of the enormous die cost, you also have to add in 11GB of GDDR6 and a huge power-delivery section.

NVidia is not gouging here. This is a massively expensive product to build.

Now you are free to complain that it is too expensive for you and that you won't buy it.

But arguing that NVidia should just lower the price because 12nm is cheap is beyond naive.

It's too expensive for me. But if I won the lotto, I would be pre-ordering one 5 minutes later.
Whose fault is that? Get your process engineering and manufacturing sorted out and feed that back into your design team...

Intel is going to have to go to MCM, as is everyone, including Nvidia.
http://research.nvidia.com/publication/2017-06_MCM-GPU:-Multi-Chip-Module-GPUs
[Image: MCM-GPU concept diagram]


Consumers have had enough of the pricing. So if you can lower it with a different process, you had better get on the ball and do so. That being said, I'm not going to be your guinea pig while you sort it out. Tensor cores and ray-tracing cores should be in a completely different module. I have no clue what Nvidia is doing, beyond purposely manufacturing an excuse for why it has to remain expensive so they can preserve their margins. Put the friggin BVH in a shared complex that feeds off to parallel MCM modules that depend on it.
 
Last edited:

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Don't forget they charge more for Quadro cards. Gaming was about smaller margins but more volume. The 980 Ti was bigger than any previous GeForce, yet still cheaper than older cards like the 780 Ti and 8800 Ultra. Increasing prices this radically is a choice, not a necessity, enabled by the lack of competition.

If consumers stand firm, they will decrease prices out of necessity. It's up to the market now to raise some fists.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
Most people simply don't care. They buy a card and care nothing about this stuff. Nvidia will continue to do this, and nothing will change until they have to compete again. What they are doing likely won't hurt their brand, as most don't care.

Also keep in mind that gamers are not the main target anymore.
Unless you're sleeping under a rock, it is quite clear people care, on just about every forum, site, and YouTube channel on the internet. I have a strong belief that there are supply constraints behind the 'sold out' cards, just like there were some time ago. That big-arse die isn't flying off the production line in volume. Their brand is being hurt, and you can see so on Nvidia loyalist sites and discussion groups.

Gamers aren't the target? Outside of gaming, people want alternatives for meme-learning compute, etc. There are a number of startups in this area. Intel snapped up Nervana. Microsoft has its own unique compute processors stamped into FPGAs. Google has its own TPUs. The same is going to occur for ray tracing; I'm sure Imagination Technologies' phone is ringing off the hook. The thinking you reflect occurs cyclically in the tech industry, and the companies that maintain it (big ones) pay a heavy price whenever they engage in it at the wrong stage.

Everything about gaming has plateaued. The gaming studios have had major flops because they can't stop inserting idiotic politics into games. VR is a meme that has yet to take off. 4K gaming is a meme. The majority of people are playing older and simpler titles on basic GPUs. The movie industry has been a dumpster fire at the box office because it isn't producing compelling films that speak to its audience. Instead of invigorating an exciting sense of the future, just about everyone is focused on exploiting people for short-term profit. Half the junk on YouTube isn't worth watching, and it is becoming just like the big networks and studios: exploitative. You can do all of the lens-flare effects and latest eye-popping ray-traced graphics you want; if the content is junk, it's junk, and people are waking up to this fact.

So, I have no clue what you mean by "not their main target audience anymore."
Alternatives abound outside of GPUs, and they're coming to market. Cloud providers assembled ages ago and furthered open-compute and open-network committees to put a nail in the coffin of premium hardware providers. You can only milk a cow for so long.
https://www.opencompute.org/
 
  • Like
Reactions: psolord

ub4ty

Senior member
Jun 21, 2017
749
898
96
Don't forget they charge more for Quadro cards. Gaming was about smaller margins but more volume. The 980 Ti was bigger than any previous GeForce, yet still cheaper than older cards like the 780 Ti and 8800 Ultra. Increasing prices this radically is a choice, not a necessity, enabled by the lack of competition.

If consumers stand firm, they will decrease prices out of necessity. It's up to the market now to raise some fists.
The new Quadro prices are absolutely insane.
Honestly, the whole disparity between pro and consumer is one giant artificial segmentation.
The major features that distinguish the two markets are artificially propped up, and in some cases are literal driver switches. I'm ready for a complete overhaul of just about everything in computing, with brand-new approaches, architectures, and companies.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
How much faster does Turing have to be, outside of RTX, for most gamers to accept the price?

50%
75%
100%

I'm guessing that less than 50% is not enough.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
How much faster does Turing have to be, outside of RTX, for most gamers to accept the price?

50%
75%
100%

I'm guessing that less than 50% is not enough.
It has nothing to do with what the performance has to be. It has everything to do with what the MAX price should be. A 2070 costs $600, and for some great reason they decided to take NVLink off it to force you into the 2080, which is $800 friggin dollars. The prior 1080 Ti wasn't sold in volume because it was too damn expensive. Now you're selling a 2080 at that price? Then you go out of your way to cut NVLink from the 2070, and charge $600 for it? My max is around $500-$600, the prior price of the 1080. So what can they deliver at that price point? A neutered GPU that won't come out for another month? They need to get real. I built my whole 8-core rig for less than that.

Release the benchmark graphs. Stop with this tired viral-marketing slow-leak BS and make the sale.

https://old.reddit.com/r/nvidia/comments/991r39/german_pc_magazin_test_tomb_raider_with_rtx_2080/

Rockageek 378 points 8 hours ago

And suddendly many preorders are canceled


Profanity isn't allowed in the tech areas.

AT Mod Usandthem
 
Last edited by a moderator:
  • Like
Reactions: psolord

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
"Review"
________________
"Although we haven’t had the chance to benchmark the card thoroughly....

Unfortunately, due to multiple non-disclosure agreements we signed, we can only tell you that Shadow of the Tomb Raider looks stunning with ray tracing turned on...

In terms of frame rate, Shadow of the Tomb Raider ran at a mostly consistent 50-57 fps, which is impressive given the game is running on a single GPU and in such an early state – on top of all the new ray tracing techniques."
_______________
$1,200 consumer GPU running at 50-57fps at 1920x1080. "Impressive".

Just 1 month to go. Driver team gotta start working OT?
 
  • Like
Reactions: Ranulf and ub4ty

ZeroRift

Member
Apr 13, 2005
195
6
81
I sense that we may be overlooking something important: where is the Titan?

IIRC, Nvidia usually releases their Titan cards first, then the non-Ti, then the Ti.

This launch hasn't gone down that way, and we have also had some interesting rebrands with RTX/GTX being moved around, so it's possible the rage is due more to recent mining woes and misunderstanding of the new branding.

Does the pricing structure make more sense if we relabel the 2080TI to "Titan" and move everything down 1 step?

2080TI -> Titan
2070TI -> 2080 TI
etc.
 

ub4ty

Senior member
Jun 21, 2017
749
898
96
"Review"
________________
"Although we haven’t had the chance to benchmark the card thoroughly....

Unfortunately, due to multiple non-disclosure agreements we signed, we can only tell you that Shadow of the Tomb Raider looks stunning with ray tracing turned on...

In terms of frame rate, Shadow of the Tomb Raider ran at a mostly consistent 50-57 fps, which is impressive given the game is running on a single GPU and in such an early state – on top of all the new ray tracing techniques."
_______________
$1,200 consumer GPU running at 50-57fps at 1920x1080. "Impressive".

Just 1 month to go. Driver team gotta start working OT?
When your slow-leak viral marketing campaign blows up in your face...

Meme removed.

I've already warned you once before on using memes as insults in the tech areas. I wouldn't recommend letting it happen a third time.

AT Mod Usandthem
 
Last edited by a moderator:
  • Like
Reactions: sze5003

ub4ty

Senior member
Jun 21, 2017
749
898
96
I sense that we may be overlooking something important: where is the Titan?

IIRC, Nvidia usually releases their Titan cards first, then the non-Ti, then the Ti.

This launch hasn't gone down that way, and we have also had some interesting rebrands with RTX/GTX being moved around, so it's possible the rage is due more to recent mining woes and misunderstanding of the new branding.

Does the pricing structure make more sense if we relabel the 2080TI to "Titan" and move everything down 1 step?

2080TI -> Titan
2070TI -> 2080 TI
etc.
The ship's already sailed, and it's a big failboat that's sinking.
Marketing and business departments craft campaigns seeking max profit, usually under an antiquated, borrowed, but proven methodology: hide the cons, overhype the pros. If something obvious is left out, it's probably because it is a con. In the late stages of a cycle or trend, that results in utter disaster, which this product launch is... especially when you give people a long period to scrutinize and rip apart your carefully crafted marketing campaign and product. The Vega launch bore the same uncanny resemblance, as did the features that never came to fruition.
 