GeForce GTX 1180, 1170 and 1160 coming in August. Prices inside


SteveGrabowski

Diamond Member
Oct 20, 2014
Nvidia has to offer something for the lower-range market, as most gamers are not going to spend $350 and up on a video card. Looks like I will be keeping my GTX 970 for a few more years.

Pretty much how I feel. I've had my GTX 970 nearly four years, and it's a joke to pay $350 right now for a GTX 1060 6GB that only gives a 10-15% performance improvement.
 

Newbian

Lifer
Aug 24, 2008
Nope, I'm comparing price point to price point. I will not spend $400 and up for a GPU.
The problem is you are comparing the wrong cards to each other.

The GTX 970 is that generation's equivalent of the GTX 1070.

You can't really blame the mining price increase for this, as the MSRP of the 1070 was only $40 or so more than the 970, if I recall correctly.

The 1060 6GB was also almost $100 cheaper than the 970 if you compare their prices.
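For reference, here are the launch MSRPs being compared, going from memory (Founders Edition pricing muddied the 10-series numbers, so treat these as recollection rather than gospel):

```python
# Launch MSRPs as I remember them; recollection, not gospel.
msrp = {
    "GTX 970": 329,       # Sept 2014
    "GTX 1070": 379,      # June 2016; $449 Founders Edition
    "GTX 1060 6GB": 249,  # July 2016; $299 Founders Edition
}

print(f"1070 over 970: ${msrp['GTX 1070'] - msrp['GTX 970']}")          # $50
print(f"970 over 1060 6GB: ${msrp['GTX 970'] - msrp['GTX 1060 6GB']}")  # $80
```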
 

PeterScott

Platinum Member
Jul 7, 2017
You can't really blame the mining price increase for this, as the MSRP of the 1070 was only $40 or so more than the 970, if I recall correctly.

Who do we blame when the next generation launches at a very high MSRP, as it appears set to do? Mining really isn't an excuse anymore; this will just be old-fashioned greed.
 

Newbian

Lifer
Aug 24, 2008
Who do we blame when the next generation launches at a very high MSRP, as it appears set to do? Mining really isn't an excuse anymore; this will just be old-fashioned greed.
Last I checked it's all based on rumors, and until the cards are released we will have to wait and see what they actually compare against.
 

Guru

Senior member
May 5, 2017
A GTX 1060 is 120 watts, correct?
12nm will give a 40% power reduction, so the next card performing in the 1060 bracket should not need a 6-pin connector.

Wasn't the GTX 1060 as fast as a GTX 980? And it was a GP106 chip.

I believe the next 1160 will be a cut-down GV104 chip. The GTX 1160 will be good competition for a Vega 64 and should be a GTX 1070 Ti equal.

The next GV106 chip (1150 Ti) will be as fast as a GTX 1060 or faster.
Wut? You are mistaking the 7nm process for the 12nm process. 12nm is literally the 16nm process, just refined enough that they can call it 12nm. At most it will provide 20% less power.

7nm, on the other hand, will provide about 40% less power at 20% more performance.
 

whm1974

Diamond Member
Jul 24, 2016
Chances are I would be building a new rig anyway once a more worthy GPU upgrade is available.
 

happy medium

Lifer
Jun 8, 2003
Wut? You are mistaking the 7nm process for the 12nm process. 12nm is literally the 16nm process, just refined enough that they can call it 12nm. At most it will provide 20% less power.

7nm, on the other hand, will provide about 40% less power at 20% more performance.

Performance per watt will increase 40% with 12nm Nvidia cards.

How do you explain the increase in performance per watt from the 28nm GTX 780 to the 28nm GTX 980? 40% less power and 25% more performance.

It's not all about the size of the node.
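Taking those two figures at face value, here is a quick sketch of the perf/watt arithmetic they imply (the actual 780-to-980 deltas vary by benchmark and by whether you use TDP or measured power):

```python
# Rough perf/watt arithmetic using the figures quoted above
# (the poster's claims, not measured values).
perf_gain = 1.25    # "25% more performance"
power_ratio = 0.60  # "40% less power"

perf_per_watt = perf_gain / power_ratio
print(f"Implied perf/watt improvement: {perf_per_watt:.2f}x "
      f"(~{(perf_per_watt - 1) * 100:.0f}% better)")
# -> about 2.08x (~108%), i.e. well beyond 40%, if both figures held at once
```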
 

maddie

Diamond Member
Jul 18, 2010
Performance per watt will increase 40% with 12nm Nvidia cards.

How do you explain the increase in performance per watt from the 28nm GTX 780 to the 28nm GTX 980? 40% less power and 25% more performance.

It's not all about the size of the node.
So you're claiming that the equivalent of the radical reorganization of the SMs with Maxwell happens again? It would be nice, but do you think that's realistic?
 

happy medium

Lifer
Jun 8, 2003
So you're claiming that the equivalent of the radical reorganization of the SMs with Maxwell happens again? It would be nice, but do you think that's realistic?
Take a look at GP100 vs GV100 performance per watt.
That should give you an idea of what is coming with the GeForce line, except they will strip out the non-gaming stuff and it will be very efficient.

I stand by what I said: 40% better performance per watt over Pascal.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
The problem is you are comparing the wrong cards to each other.

The GTX 970 is that generation's equivalent of the GTX 1070.

You can't really blame the mining price increase for this, as the MSRP of the 1070 was only $40 or so more than the 970, if I recall correctly.

The 1060 6GB was also almost $100 cheaper than the 970 if you compare their prices.

The 1070 launched at $450 and could almost never be found below $400 for the first few months it was out, while the 970 was readily available for $330 within about a month and a half of launch.
 

PeterScott

Platinum Member
Jul 7, 2017
Take a look at GP100 vs GV100 performance per watt.
That should give you an idea of what is coming with the GeForce line, except they will strip out the non-gaming stuff and it will be very efficient.

I stand by what I said: 40% better performance per watt over Pascal.

I assume your performance/watt figure is simply based on published TDP? That seldom tells the whole story.

On performance and pricing:

GV100 has ~40% more execution units and delivers ~40% more performance, so it is pretty clear that the performance was simply the result of the MUCH bigger die, which won't be an economical way to boost next-generation consumer GPU performance.

IOW, that doesn't help perf/$ very much.
 

happy medium

Lifer
Jun 8, 2003
I assume your performance/watt figure is simply based on published TDP? That seldom tells the whole story.

On performance and pricing:

GV100 has ~40% more execution units and delivers ~40% more performance, so it is pretty clear that the performance was simply the result of the MUCH bigger die, which won't be an economical way to boost next-generation consumer GPU performance.

IOW, that doesn't help perf/$ very much.
But they use about the same power envelope: 235 watts for GP100 vs 250 watts for GV100. If you get 40% more performance from roughly the same TDP, what does that mean? And will it increase after the tensor cores and other non-gaming stuff are stripped out? I'm sure it will.

Edit:
On the other hand, if Nvidia decides to clock the chips high, past the point of efficiency, it could hurt the power envelope but increase performance even further.
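For what it's worth, here is roughly what the published-TDP comparison in this exchange works out to; as noted above, TDP seldom tells the whole story and real board power can differ:

```python
# Perf/watt from published TDPs, using the figures in this exchange.
gp100_tdp = 235.0  # W, as quoted above
gv100_tdp = 250.0  # W, as quoted above
perf_gain = 1.40   # "~40% more performance"

perf_per_watt_gain = perf_gain / (gv100_tdp / gp100_tdp)
print(f"Implied perf/watt gain: ~{(perf_per_watt_gain - 1) * 100:.0f}%")
# -> ~32% if you go strictly by TDP
```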
 

Newbian

Lifer
Aug 24, 2008
The 1070 launched at $450 and could almost never be found below $400 for the first few months it was out, while the 970 was readily available for $330 within about a month and a half of launch.

You can blame the Founders Edition issues for that for the first few months, sadly; it was a joke.

On the one hand, we as consumers hope Nvidia learned from that mistake, but as a manufacturer they probably made enough money from it to keep on doing it.
 

happy medium

Lifer
Jun 8, 2003
The 1070 launched at $450 and could almost never be found below $400 for the first few months it was out, while the 970 was readily available for $330 within about a month and a half of launch.
I bought a GTX 1070 extreme edition about 6 months after release for $385 AR. You could find a stock-clocked card for about $350. Shortly after that, prices went through the roof.
 

Guru

Senior member
May 5, 2017
Performance per watt will increase 40% with 12nm Nvidia cards.

How do you explain the increase in performance per watt from the 28nm GTX 780 to the 28nm GTX 980? 40% less power and 25% more performance.

It's not all about the size of the node.

Not sure the GTX 780 is 40% worse in performance per watt; that certainly wasn't the case when the GTX 980 was released. Over time, as games optimize for the newest architecture and build their engines around it, the gap can look exaggerated.

Plus, the 980 had architectural improvements purely to save energy, and it was also a smaller die than the GTX 780.

Again, I've not been able to find any 40% performance-per-watt gain for TSMC 12nm over their 16nm; we have some claims of about a 50% improvement in power consumption for 7nm compared to 16nm.
 

happy medium

Lifer
Jun 8, 2003
Plus, the 980 had architectural improvements purely to save energy, and it was also a smaller die than the GTX 780.

The 12nm process is specifically for lower power, plus I'm sure Nvidia didn't just rebadge Pascal; they made improvements too.
 

tamz_msc

Diamond Member
Jan 5, 2017
The 12nm process is specifically for lower power, plus I'm sure Nvidia didn't just rebadge Pascal; they made improvements too.
TSMC 12nm provides either up to 25% lower power or up to 10% more performance. That doesn't translate into 40% better perf/watt like you claimed earlier. Besides, these tweaked nodes are generally better when it comes to saving power than providing more performance, so overall frequencies won't be any higher than what you already get with top-binned chips.
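Plugging those quoted either/or figures into the same kind of arithmetic shows why they fall short of a 40% perf/watt gain from the process alone:

```python
# Best case for each of TSMC's quoted 12nm-vs-16nm options (either/or, not both).
power_option = 1.0 / 0.75  # same performance at "up to 25% lower power"
perf_option = 1.10         # "up to 10% more performance" at the same power

print(f"Lower-power option: ~{(power_option - 1) * 100:.0f}% better perf/watt")
print(f"Higher-clock option: ~{(perf_option - 1) * 100:.0f}% better perf/watt")
# -> ~33% and ~10%; neither reaches 40% from the process alone
```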
 

Ottonomous

Senior member
May 15, 2014
Is there a possibility that the 1180+ rumour could actually be true? As in Nvidia creating more profitable high-end segmentation through multiple models (Ti variants, VRAM amounts, etc.)?
 

Elfear

Diamond Member
May 30, 2004
Performance per watt will increase 40% with 12nm Nvidia cards.

How do you explain the increase in performance per watt from the 28nm GTX 780 to the 28nm GTX 980? 40% less power and 25% more performance.

It's not all about the size of the node.

Even if what you're saying comes true (a 40% increase in perf/watt seems doubtful at this point), that means the 1180 will be ~10% faster than the 1080 Ti, just like what we saw with the 980. Hard to get excited about that kind of gain at the top of the heap, IMO.
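As a rough sanity check of that ~10% estimate, here is the arithmetic under some loudly labeled assumptions: the 1180 keeps roughly the GTX 1080's 180 W TDP (a guess, not a spec), the claimed 40% perf/watt gain holds, and a 1080 Ti averages ~30% faster than a 1080:

```python
# Hypothetical sanity check of the "~10% faster than the 1080 Ti" estimate.
# Assumptions, not specs: an "1180" keeps the GTX 1080's ~180 W TDP, the
# claimed 40% perf/watt gain holds, and a 1080 Ti averages ~30% faster
# than a 1080.
perf_per_watt_gain = 1.40
tdp_1180 = 180.0   # W, assumed equal to the GTX 1080's TDP
tdp_1080 = 180.0   # W
ti_over_1080 = 1.30

perf_1180_over_1080 = perf_per_watt_gain * (tdp_1180 / tdp_1080)
perf_1180_over_ti = perf_1180_over_1080 / ti_over_1080
print(f"1180 vs 1080 Ti: ~{(perf_1180_over_ti - 1) * 100:.0f}% faster")
# -> ~8%, in the same ballpark as the ~10% figure above
```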
 

mohit9206

Golden Member
Jul 2, 2013
Even if what you're saying comes true (a 40% increase in perf/watt seems doubtful at this point), that means the 1180 will be ~10% faster than the 1080 Ti, just like what we saw with the 980. Hard to get excited about that kind of gain at the top of the heap, IMO.
It may be hard to get excited, but that's just enough of a gain to make a lot of spec-hungry enthusiasts upgrade from their 1080s.
Besides, the days of almost 100% gains every generation are over. Whether that's due to the lack of competition or the laws of physics, I'm not sure.
 

slashy16

Member
Mar 24, 2017
I have a feeling we will see an 1180 launch with 10-20% more performance than the 1080 Ti. I think the 1180 will launch solo and we will see the 1080 Ti, 1080, and 1070 Ti get severe price cuts. There are too many 10-series cards on the market for Nvidia to absorb from vendors. The 1070 Ti just launched a short time ago, and that card would look very attractive to gamers at $299 USD.

Nvidia is in an odd position. They could sit on their 10-series cards for another 12-18 months and not have any serious competition from AMD.
 

AntonioHG

Senior member
Mar 19, 2007
www.antoniograndephotography.com
Nope, I'm comparing price point to price point. I will not spend $400 and up for a GPU.

Do you guys not sell your current cards to buy the next one? I almost never pay full price for a new GPU. Even without a craze, I've sold my old cards and put some money towards the new stuff. Same for other PC parts like mobos and CPUs.

During the mining craze, I made money selling my GTX 1070s and 1080.
 