[WCCFTECH] NVIDIA GeForce GTX 1180 leaked info and performance


Krteq

Senior member
May 22, 2015
998
678
136
Didn't the $699 release price include the $100 bonus "Founder's Edition" tax you got to pay for the crappier cooler?
Yep

And GTX 1070 was $379... +$70 "Fanboy Edition" tax :(

I hope nV won't be so greedy this time, but... well, you know
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
What we know is that when the cards come out, and they are better and faster, all the talk about greed will go away as the cards are bought up faster than ice cream cones on sale in the desert.
 

Guru

Senior member
May 5, 2017
830
361
106
Seems to me like a garbage, somewhat-educated guess at what it might be. My own educated guess is that we won't even see a desktop GPU refresh in 2018. I think Nvidia might jump to 7nm, which would be in 2019.

If they do make a refresh, the performance increase is going to be minimal, 20% over the previous model at best. Going from 16nm to 12nm won't bring much performance improvement either; at best 10% more density or a 15% power reduction.
 

jpiniero

Lifer
Oct 1, 2010
15,453
5,942
136
Seems to me like a garbage, somewhat-educated guess at what it might be. My own educated guess is that we won't even see a desktop GPU refresh in 2018. I think Nvidia might jump to 7nm, which would be in 2019.

It's 10 nm.
 

amenx

Diamond Member
Dec 17, 2004
4,150
2,431
136
Yep

And GTX 1070 was $379... +$70 "Fanboy Edition" tax :(

I hope nV won't be so greedy this time, but... well, you know
Nvidia's pricing is entirely dependent on what the other side offers as an alternative. Yes, they will milk it when given the chance, but once AMD comes up with something that convincingly takes it on at a lower price, prices will drop dramatically. The GTX 260 was $449 when released; once the 4870 came out a few weeks later at $299, the 260 quickly dropped down to match. I think both sides will milk whatever products they have given the opportunity. Unfortunately for AMD, they rarely get that opportunity.
 

jpiniero

Lifer
Oct 1, 2010
15,453
5,942
136
Thing is, miners are still buying. Maybe not at the rate they once did, but they still are. If nVidia goes "cheap", retailers will just reap the profits themselves.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Thing is, miners are still buying. Maybe not at the rate they once did, but they still are. If nVidia goes "cheap", retailers will just reap the profits themselves.

That's the real issue IMO, and there's nothing we can really do about it. I don't think it will happen, but I'm waiting for new cards made specifically for mining at $1000+, and then gaming cards that somehow lock out mining at the usual prices for the rest of us.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Even without mining, the initial peak in demand when NV releases a new high-powered GPU tends to make cards quite hard to get for a while. Could be pretty awful if the two combine :(
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
Or...
Nvidia could have been stockpiling cards for the past few months to take even more market share with gamers; that would be smart.
I've never known Nvidia to make stupid moves when it comes to card releases.
 

Trumpstyle

Member
Jul 18, 2015
76
27
91
This "rumour" is very likely just an educated guess where the author claims to have a source. What he missed is that GDDR6 can do a 256-bit bus width with 12GB of RAM, which is more likely than 16GB.

My guess is the 2080/1180 will have 12GB of RAM on a 192- or 256-bit bus.
 

ZGR

Platinum Member
Oct 26, 2012
2,058
671
136
The real upgrade for me is how cheap a used 1080 Ti will be on the 1180's release.

I got burned not buying a used GTX 980 Ti over a new 1070, as I thought Pascal would get a Mac driver much sooner. It took nearly a year!

I have no idea if Nvidia and Apple will even support the next gen...
 

amenx

Diamond Member
Dec 17, 2004
4,150
2,431
136
My guess is the 2080/1180 will have 12GB of RAM on a 192- or 256-bit bus.
Doubt very much an 1180 or any xx80 would ever be on a 192-bit bus; it's too close to the flagship. That sort of crimping goes to xx60 cards, and that's the way it's always been.
 

Trumpstyle

Member
Jul 18, 2015
76
27
91
Doubt very much an 1180 or any xx80 would ever be on a 192-bit bus; it's too close to the flagship. That sort of crimping goes to xx60 cards, and that's the way it's always been.

With GDDR6, a 192-bit bus gives 384 GB/s of bandwidth, and with some architectural improvement that should be enough to beat the 1080 Ti by 20-30%.
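
Quick back-of-the-envelope check (a rough sketch; the 16 Gbps GDDR6 per-pin data rate is an assumption based on what memory vendors have announced, not a confirmed spec for any card):

# Peak bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps)
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # returns peak memory bandwidth in GB/s
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(192, 16))   # 384.0 GB/s -- the figure above
print(bandwidth_gb_s(256, 16))   # 512.0 GB/s
print(bandwidth_gb_s(352, 11))   # 484.0 GB/s -- GTX 1080 Ti (GDDR5X), for comparison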
 

DownTheSky

Senior member
Apr 7, 2013
785
154
106
There are limits to compression, you know? You can't just magically need 50% less bandwidth.

The cards will be the same as the Pascal series, just with roughly 40% more CUs and roughly 40% more bandwidth. They will also inevitably be bigger (the GPU), have more VRAM, and cost more.
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
This "rumour" is very likely just an educated guess where the author claims to have a source. What he missed is that GDDR6 can do a 256-bit bus width with 12GB of RAM, which is more likely than 16GB.

My guess is the 2080/1180 will have 12GB of RAM on a 192- or 256-bit bus.

Ah, very interesting. I didn't know that. 12GB on 256-bit makes sense. While VRAM has doubled on every x04 going back to Fermi->Kepler, they may do this for cost-saving reasons. It also helps keep the Titan V the exclusive provider of 16GB (for 2018 anyway). And honestly, I don't think VRAM needs to double every 2 years nowadays, especially since Nvidia sets the market and AMD will max out at 8GB for a while.

If we get a 192-bit GTX 2060 on GDDR6, could it be 8GB? Not sure how the tech works this time around.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
AFAIK there is nothing that 6GB isn't enough for, much less 8GB, and much, much less 12GB (or 11GB on the 1080 Ti).
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Whaa? It doesn't make sense at all. How would you connect 12GB to a 256-bit interface?

8/16 GB on a 256-bit bus or 6/12 GB on 192-bit make sense.

I found this (all new to me still):

http://monitorinsider.com/GDDR6.html

"Up to now, the storage capacity of all memory chips has been a nice, clean power of two, if you exclude error detection or correction bits.

GDDR6 breaks with that tradition and offers in-between options. The standard allows a capacity of 8 to 32 Gbit, but 12 Gb and 24 Gb are possible as well. This will probably make GPU makers happy since it will increase the ability to segment the market based on the amount of memory.

Today, a GPU with a 256-bit bus can only cleanly support 4GB, 8GB or 16GB. With GDDR6, they will also be able to support 12GB, while still maintaining a full balanced load with identical sized memories connected to each controller."

So if I understand that right, it should also be possible to have a 9GB GPU on a 192-bit bus? So no 8GB GTX 2060, but maybe a 9GB GTX 2060.
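
If I have that right, the possible sizes just fall out of (bus width / 32 bits per chip) x chip density. A quick sketch (the 8/12/16 Gbit densities come from the GDDR6 options quoted above; which densities vendors actually ship is an assumption):

# One 32-bit GDDR6 chip per 32-bit slice of the bus; capacity = chips * density
densities_gbit = [8, 12, 16]   # per-chip options allowed by the GDDR6 standard

for bus_width in (192, 256, 384):
    chips = bus_width // 32
    sizes_gb = [chips * d / 8 for d in densities_gbit]   # Gbit -> GB
    print(bus_width, "bit:", sizes_gb, "GB")

# 192 bit: [6.0, 9.0, 12.0] GB  -> a 9GB GTX 2060 would be possible
# 256 bit: [8.0, 12.0, 16.0] GB -> 12GB on 256-bit works
# 384 bit: [12.0, 18.0, 24.0] GB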
 

Krteq

Senior member
May 22, 2015
998
678
136
I found this (all new to me still):

http://monitorinsider.com/GDDR6.html

"Up to now, the storage capacity of all memory chips has been a nice, clean power of two, if you exclude error detection or correction bits.

GDDR6 breaks with that tradition and offers in-between options. The standard allows a capacity of 8 to 32 Gbit, but 12 Gb and 24 Gb are possible as well. This will probably make GPU makers happy since it will increase the ability to segment the market based on the amount of memory.
Ahh, that's it. I didn't know they'd added a 12Gb (12 gigabits = 1.5 GB) chip option with GDDR6 :)

Thx for the link
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
True, we must account for the fact that Bethesda is poised to re-release Skyrim for the next 5-10 years
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
Sure, but he mentioned 6GB. Look at similarly performing cards with 6GB and 8GB: 980 Ti / 1070, 1060 / 580. I haven't been as glued to the benchmarks lately, but I haven't seen those 6GB cards take a hit at higher resolutions the way 4GB cards have been known to.