Nvidia RTX 2080 Ti, 2080, 2070 information thread. Reviews and prices September 14.


ub4ty

Senior member
Jun 21, 2017
749
898
96
Good read, thanks.

Apologies if it's already been discussed or taken into account, but the other thing I guess to consider is that these GPUs are power limited as always, and as such, using the RT/TS cores to do this takes power budget away from the traditional CUDA cores doing rasterization. tl;dr (I think) you'd be looking at lower clock speeds when operating in this manner, which would imply you could never theoretically achieve the same FPS.

I'll do you a solid.. Find the Leather jacket man 'gigarays' :
The graphics pipeline on computers and mobile platforms is responsible for real-time rendering. Today's videos and games typically require a frame rate of 60 frames per second (FPS) or higher [Hoffmann et al. 2006], but an FPS of 30 is a minimum requirement. Meanwhile, a high-end display has a resolution of 2560x1440, or around 3.7M pixels. For each pixel, we shoot 10 rays to minimize aliasing, giving roughly 37M primary rays in total. When a ray hits a primitive, multiple reflection and/or refraction rays are generated. Such secondary rays in turn spawn further rays, and usually a limit is set on the recursion depth. For reasonable rendering quality, a conservative estimate is that a primary ray generates 10 secondary rays on average, bringing the ray count to 370M. At a frame rate of 30, that requires a processing throughput on the order of 10G rays per second. The above analysis covers only the ray traversal stage; the construction stage is also compute intensive. For instance, the construction process needs to sort the primitives according to their coordinates, while a typical scene in today's graphics applications has over 1 million primitives.
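For reference, the quoted arithmetic works out as a quick back-of-the-envelope calculation. The sketch below is just an illustration, with every input taken from the passage above rather than from any measurement:

Code:
# Back-of-the-envelope ray-throughput estimate using the figures quoted above.
width, height = 2560, 1440       # high-end display resolution
rays_per_pixel = 10              # primary rays per pixel to reduce aliasing
secondary_per_primary = 10       # conservative average secondary-ray fan-out
fps = 30                         # minimum acceptable frame rate

pixels = width * height                                 # ~3.7M pixels
primary_rays = pixels * rays_per_pixel                  # ~37M primary rays per frame
rays_per_frame = primary_rays * secondary_per_primary   # ~370M rays per frame
rays_per_second = rays_per_frame * fps                  # ~11G rays per second

print(f"{pixels/1e6:.1f}M pixels, {primary_rays/1e6:.0f}M primary rays per frame")
print(f"{rays_per_frame/1e6:.0f}M rays/frame -> {rays_per_second/1e9:.1f} Gigarays/s")

That ~11G rays/s result is the same order of magnitude as the "10 Gigarays" number quoted on stage.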



Dat frame eater technology... Those who invested in thousand dollar memesync monitors are going to be on suicide watch. High FPS or Umbra, penumbra and antumbra...
:kissingheart::kissingheart:

Based Jensen segmenting the unwashed masses from those with refined optical taste.

Break down your (FPS) to later build it back up..






Knock it off with the memes. This is not OT.


esquared
Anandtech Forum Director
 
Last edited by a moderator:
  • Like
Reactions: amenx and psolord

alcoholbob

Diamond Member
May 24, 2005
6,390
470
126
They may churn out a Turing Titan for prosumers. Or wait for 7nm; I can see them doing that as well.



Don't you just hate paper launches? I certainly do. As you said, the limbo it creates isn't helping anyone. They should have been available on launch day.

Side note: I've never been a Titan buyer. Too expensive for my taste and, in the past, too short-lived with a Ti variant right behind it. However, it doesn't look like anything will come between the 2080 and the Ti before 7nm.

I'm more of a Ti guy (without a Ti at the moment, ha ha). That would put me in the 2080 market; however, I find the gap between it and the Ti too large - or rather, the former specced too low compared to the big daddy. There may not be enough there to entice me (come reviews).

I am particularly interested in DLSS rather than RT, as I think the latter will be executed much better with the next generation or the one after that.

I'm guessing a 4608-core card might get churned out before the launch of 7nm cards to keep people happy in 6-9 months.
Many of us said the same thing. The Titan is usually an early-adopter card for the biggest chip of the run. But there will be no bigger chip, and it is already here with early-adopter pricing. So a Titan really makes little sense this generation.

There's always an uncut Titan though that matches the Quadro/Tesla card late in the round. So a 4608 CUDA core Titan isn't out of the question, but it might have to sell for less than the Titan V since it has fewer CUDA cores...
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
There's always an uncut Titan though that matches the Quadro/Tesla card late in the round. So a 4608 CUDA core Titan isn't out of the question, but it might have to sell for less than the Titan V since it has fewer CUDA cores...

It won't make enough difference to bother with. It's like the Titan Xp, which no one cared about. I bet there will be no Titan.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
There will be a 7nm purely compute based monster soon enough. In time that'll spawn the next Titan.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I'll do you a solid.. Find the Leather jacket man 'gigarays' :




Dat frame eater technology... Those who invested in thousand dollar memesync monitors are going to be on suicide watch. High FPS or Umbra, penumbra and antumbra...
:kissingheart::kissingheart:

Based Jensen segmenting the unwashed masses from those with refined optical taste.

Break down your (FPS) to later build it back up..


Oh man. It's some stupid stuff all over again. Oh yeah, let's do bloom, the only point of which is to hurt your eyes.

Now it’s like yeah we’ll take 50% of your performance away to make everything look like they smeared Vaseline on it.

I really don’t mean to be so negative but I saw that robot suit video and it’s just stupid. How much gloss do we need?

Just too many gimmicks introduced by this company. I say that as someone who’s bought every generation of NVidia card. I get that ray tracing is the future but this is being implemented like a joke of a gimmick.

Speaking of PhysX, it had some cool effects, but really they were only useful for demos. It was never implemented properly and it just cost way too much for what it added. You could tell you were hitting a PhysX scene by how much louder the GPU fan got and by the performance drop in the scene.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I'm guessing a 4608-core card might get churned out before the launch of 7nm cards to keep people happy in 6-9 months.


There's always an uncut Titan though that matches the Quadro/Tesla card late in the round. So a 4608 CUDA core Titan isn't out of the question, but it might have to sell for less than the Titan V since it has fewer CUDA cores...

I can see it already. Founder’s edition: $1799. Normal edition $1499.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
I can see it already. Founder’s edition: $1799. Normal edition $1499.
At least, probably more. Unlike Maxwell or GP102, TU102 actually has full DP support. Titan T could have DP GFLOPS close to what Titan V does, and that's a $3000 card.
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
That's exactly what they did with Vega FE and Vega 64. How soon some forget.

Vega failed to be competitive in perf/mm2, was strapped to ultra-expensive memory, and was very power hungry. It was a failed product.

Turing is much worse off in perf/mm2 in traditional rendering than Pascal, so if AMD can make gains in perf/mm2 over Vega and Polaris, and leave out the specialized functions, they will be in a position to come in at both lower prices than Nvidia and better margins than they were getting off Polaris and Vega.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Turing is much worse off in perf/mm2 in traditional rendering than Pascal, so if AMD can make gains in perf/mm2 over Vega and Polaris, and leave out the specialized functions, they will be in a position to come in at both lower prices than Nvidia and better margins than they were getting off Polaris and Vega.

Much worse?

It looks like the RTX 2080 performs similarly to the GTX 1080 Ti, and from what I estimated from die images (and some math), the RTX 2080 die and the GTX 1080 Ti die are similar in size. Exact amounts will depend on real performance in reviews and on exact die sizes when measured/reported, but overall it looks about the same, +/- a bit once full details are in.

So perf/mm2 looks essentially the same, despite Turing using significant portions of the die for Ray Tracing, which is fairly impressive.
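For what it's worth, that comparison is easy to redo once reviews land. Here's a minimal sketch of the perf/mm2 arithmetic, where the relative-performance figures are placeholder assumptions (2080 roughly equal to 1080 Ti, per the estimate above) and the die areas are the commonly reported values, not my own measurements:

Code:
# Rough perf-per-area comparison. Inputs are assumptions/placeholders:
# relative_perf reflects the "2080 ~= 1080 Ti" estimate above; die areas are
# the commonly reported figures. Swap in review benchmarks for real numbers.
def perf_per_mm2(relative_perf: float, die_area_mm2: float) -> float:
    return relative_perf / die_area_mm2

gp102 = perf_per_mm2(relative_perf=1.00, die_area_mm2=471)  # GTX 1080 Ti
tu104 = perf_per_mm2(relative_perf=1.00, die_area_mm2=545)  # RTX 2080

print(f"Turing vs Pascal perf/mm2 ratio: {tu104 / gp102:.2f}x")
# ~0.86x with these placeholder inputs -- i.e. the same ballpark, not "much
# worse", even with the RT/tensor hardware counted against Turing's area.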

Edit: fixed to quote to the right person.
 
Last edited:

sze5003

Lifer
Aug 18, 2012
14,320
683
126
I think the 2080 Ti should be a lot better than the 2080, since the 2080 is very close to the 1080 Ti based on those preview graphs from a week ago. It would be very weird if it comes in only a little bit better than the 2080, but stranger things have happened before.

It also seems that last release they let the reviews out a week before the cards shipped, so it doesn't seem that strange that we are waiting for them now. It just feels like a lot longer.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
I think the 2080 Ti should be a lot better than the 2080, since the 2080 is very close to the 1080 Ti based on those preview graphs from a week ago. It would be very weird if it comes in only a little bit better than the 2080, but stranger things have happened before.

It also seems that last release they let the reviews out a week before the cards shipped, so it doesn't seem that strange that we are waiting for them now. It just feels like a lot longer.

You waited this long....A couple more weeks won't kill you. Worst case you'll go a little crazy from all the speculation.
 

sze5003

Lifer
Aug 18, 2012
14,320
683
126
You waited this long....A couple more weeks won't kill you. Worst case you'll go a little crazy from all the speculation.
We've all been going crazy over speculation. The fun part will be having to set up NowInStock alerts and get them sent to my phone, since I won't be able to have the browser page up at work. Oh, and also waiting for AIB models to show up.

I can't remember if they said those will be available at the same time the pre-orders ship. Either way, it will be a long time before I get one if I am content with the performance. I always end up ordering a bit too late once the model I want shows up.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
GPUs always go through this same life cycle; if you've been around long enough you spot it immediately. It really comes down to a single problem: the struggle between the developers of games and the engineers who build the GPUs. The developers do not want to waste resources developing new features that most of their customers cannot use, because it costs money and they don't really see much ROI on that investment. At the same time, why spend money on new GPUs if all your games run fine? If you let that play out long term, it would lead to basically stagnation, which is against everyone's best interest.

So the only way to move forward is awkwardly: on the GPU side that means introducing new features before you really have the power to run them at speed, and on the games side it means introducing new features not many people will be able to use, or that won't run well for some time. That's the only way you keep this all moving forward at a decent pace. It's not smooth and elegant and pretty; it's awkward bumbling in some sense.

This is just one of those transition periods. It's getting too hard and too costly to fake a lot of the effects that rasterization doesn't naturally produce, and at some point it'll be better to just flip elements over to ray tracing; we're at that awkward point with the 2000 range of GPUs. I've seen this countless times in the past: you had it with tessellation more recently, and you had it multiple times over the years with new shader effects, where the first cards that could do them often couldn't run them fast enough to really use them in any real quantity.

I think what makes this round worse is two big things. Firstly, ray tracing is so expensive in terms of GPU load that people are going to have to drop back to 1080p 60fps standards to run it, and that's a big step back for some; of all the awkward leaps forward, this is by far one of the biggest. The other thing is 4K: people were really waiting for 4K-capable GPUs. Personally I think the 1080 range is good enough for 99% of my library, but if you want the latest and greatest games completely maxed out in 4K then we really need a killer card. I think the 2080 will probably do that, but not with anything ray-tracing enabled, so that's a very awkward position to be in.

I'm on the fence about getting one; gaming in 4K is great, so it could be worth it. What worries me slightly, though, is that AMD may ignore the ray tracing stuff completely and go for a 12nm chip where all the transistors are dedicated to traditional rendering. If they spit out some large chips, that would be a serious powerhouse and would probably floor the 2000 range, or at least match its performance for much cheaper if they stick to smaller GPUs. I kinda feel committed to 4K now; you can't really go back after you've used it, and we probably won't have hardware that can run 4K with ray-tracing elements for at least another 3 generations, probably 4, which is rough.
 
  • Like
Reactions: DooKey

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Vega failed to be competitive in perf/mm2, was strapped to ultra-expensive memory, and was very power hungry. It was a failed product.

Turing is much worse off in perf/mm2 in traditional rendering than Pascal, so if AMD can make gains in perf/mm2 over Vega and Polaris, and leave out the specialized functions, they will be in a position to come in at both lower prices than Nvidia and better margins than they were getting off Polaris and Vega.

It's kinda funny that Nvidia is doing exactly what AMD did with Vega. That is to say they are selling a bunch of features that really won't help traditional gaming performance. AMD's Vega chip was aimed at professional workloads because they didn't have the money to make a separate gaming chip. Vega came in big and more expensive with less performance than it needed in gaming to compete at the very highest tier. Nvidia has now made a chip that is also designed for professional workloads, but apparently they didn't want to spend the money on a pure gaming chip. So they are trying to sell gamers the professional workload chip with a couple of exclusive game partners as a carrot. Not only that, normally AMD is the company that builds chips with future tech that may or may not be adopted contributing to a card that doesn't quite have what it takes to beat Nvidia. Nvidia is traditionally the company that delivers what the market needs at the present. It's worked out very well for them. What remains to be seen is if Nvidia Turing cannibalized too much die space for the RT and tensor cores and left the traditional raster pipeline too weak to make a palatable product. It's already stupidly expensive which is strike one. My gut feeling is that this won't work out for them. Yes they have market share, but even the mighty can fail.
 
Last edited:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
It's kinda funny that Nvidia is doing exactly what AMD did with Vega. That is to say they are selling a bunch of features that really won't help traditional gaming performance. AMD's Vega chip was aimed at professional workloads because they didn't have the money to make a separate gaming chip. Vega came in big and more expensive with less performance than it needed in gaming to compete at the very highest tier. Nvidia has now made a chip that is also designed for professional workloads, but apparently they didn't want to spend the money on a pure gaming chip. So they are trying to sell gamers the professional workload chip with a couple of exclusive game partners as a carrot. Not only that, normally AMD is the company that builds chips with future tech that may or may not be adopted contributing to a card that doesn't quite have what it takes to beat Nvidia. Nvidia is traditionally the company that delivers what the market needs at the present. It's worked out very well for them. What remains to be seen is if Nvidia Turing cannibalized too much die space for the RT and tensor cores and left the traditional raster pipeline too weak to make a palatable product. It's already stupidly expensive which is strike one. My gut feeling is that this won't work out for them. Yes they have market share, but even the mighty can fail.

The likely big difference is that while devoting even more die space to forward looking features, NVidia will most likely also retain the performance crown.
 

zinfamous

No Lifer
Jul 12, 2006
111,994
31,557
146
The likely big difference is that while devoting even more die space to forward looking features, NVidia will most likely also retain the performance crown.

Which is likely and quite fine from a technical perspective, but the cost of this for the consumer is absurd. I don't really care how much it cost nVidia to design a chip with some 20% of its size devoted to forward-looking goodies, married to a chip that will be several generations behind once those goodies become useful. I only care that they are raising prices at tiers on top of this. That's all I should care about. It's the exact same thing that AMD has been doing, and they get pilloried for this, but at least they were never gouging us on price for such promises.

edit: typo
 
Last edited:

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Which is likely and quite fine from a technical perspective, but the cost of this for the consumer is absurd. I don't really care how much it cost nVidia to design a chip with some 20% of its size devoted to forward-looking goodies, married to a chip that will be several generations behind once those goodies become useful. I only care that they are raising prices at tiers on top of this. That's all I should care about. It's the exact same thing that AMD has been doing, and they get pilloried for this, but at least they were never gouging us on price for such promises.

For early-adopter potential Titan buyers, pricing is no worse than the Titan X release, even though the 2080 Ti has a ridiculously large die that probably costs twice as much to fab as the Titan X die.

But I agree, the price bump will be harder to swallow on the 2080/2070, and many will understandably opt to hold on to what they have or grab a deal on previous-generation cards that are finally coming down in price.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,250
136
You waited this long....A couple more weeks won't kill you. Worst case you'll go a little crazy from all the speculation.

I bailed out of PC gaming last year, so speculation doesn't affect me. It's fun to read and I'll still look at the reviews. No plans to ever jump back into PC gaming... I'm an old fart. My first build had a Cyrix CPU.
 