
[WCCFTECH] NVIDIA GeForce GTX 1180 leaked info and performance

Didn't the $699 release price include the $100 bonus "Founder's Edition" tax you got to pay for the crappier cooler?
Yep

And GTX 1070 was $379... +$70 "Fanboy Edition" tax 🙁

I hope nV won't be so greedy this time, but... well, you know
 
What we know is that when the cards come out, and they are better and faster, all the talk about greed will go away as the cards are bought up faster than ice cream cones on sale in the desert.
 
Seems to me like a garbage, somewhat educated guess at what it might be. My own educated guess is that we won't even see a desktop GPU refresh in 2018. I think Nvidia might jump straight to 7nm, which would be in 2019.

If they did make a refresh, the performance increase would be minimal, 20% over the previous model at best. Going from 16nm to 12nm won't bring much improvement: at best 10% more density or 15% lower power.
 
Seems to me like a garbage, somewhat educated guess at what it might be. My own educated guess is that we won't even see a desktop GPU refresh in 2018. I think Nvidia might jump straight to 7nm, which would be in 2019.

It's 10 nm.
 
Yep

And GTX 1070 was $379... +$70 "Fanboy Edition" tax 🙁

I hope nV won't be so greedy this time, but... well, you know
Nvidia's pricing is entirely dependent on what the other side has as an alternative. Yes, they will milk it when given the chance, but once AMD comes up with something that convincingly takes it on at a lower price, prices will drop dramatically. The GTX 260 was $449 when released. Once the 4870 came out a few weeks later at $299, the 260 quickly dropped down to that. I think both sides will milk whatever products they have given the opportunity. Unfortunately for AMD, they rarely get that opportunity.
 
Thing is, miners are still buying. Maybe not at the rate they once did, but they still are. If nVidia goes "cheap", retailers will just reap the profits themselves.
 
Thing is, miners are still buying. Maybe not at the rate they once did, but they still are. If nVidia goes "cheap", retailers will just reap the profits themselves.

That’s the real issue IMO. Nothing we can really do about that point. I don’t think it will happen but I’m waiting for new cards made specifically for mining that are $1000+ and then gaming cards that somehow lock out mining for the usual pricing for the rest of us.
 
Even without mining, the initial peak in demand when NV releases a new high-powered GPU tends to make the cards really quite hard to get for a while. Could be pretty awful if the two combine 🙁
 
Or...
Nvidia could be stockpiling cards for the past few months to take even more market share with gamers; that would be smart.
I've never known Nvidia to make stupid moves when it comes to card releases.
 
This "rumour" is very likely just a educated guess where the author claims to have a source. What he missed is Gddr6 can do 256 bit bus width on 12GB of ram which is more likely than 16gb ram.

My guess is 2080/1180 will have 12 gb of ram on 192/256 bit bus width.
 
The real upgrade for me is how cheap a used 1080 ti is on 1180's release.

I got burned not buying a used GTX 980 ti over a new 1070 as I thought Pascal would have gotten a Mac driver much sooner. It took nearly a year!

I have no idea if Nvidia and Apple will even support the next gen...
 
My guess is the 2080/1180 will have 12 GB of RAM on a 192- or 256-bit bus.
Doubt very much an 1180 or any xx80 would ever be on a 192-bit bus; it's too close to the flagship. That sort of crimping goes to xx60 cards, and that's the way it's always been.
 
Doubt very much an 1180 or any xx80 would ever be on a 192-bit bus; it's too close to the flagship. That sort of crimping goes to xx60 cards, and that's the way it's always been.

With GDDR6, a 192-bit bus gives 384 GB/s of bandwidth, and with some architecture improvements that should be enough to beat the 1080 Ti by 20-30%.
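The 384 GB/s figure follows from the usual peak-bandwidth arithmetic. A minimal sketch, assuming a 16 Gbps per-pin GDDR6 speed grade (an assumption; shipping speed grades vary):

```python
def gddr_bandwidth_gb_s(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s: pins * per-pin rate, divided by 8 bits/byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

print(gddr_bandwidth_gb_s(192, 16))  # 384.0 GB/s on a 192-bit bus
print(gddr_bandwidth_gb_s(256, 16))  # 512.0 GB/s on a 256-bit bus
```

A slower 14 Gbps grade on the same 192-bit bus would give 336 GB/s, so the per-pin speed assumed matters as much as the bus width.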
 
There are limits to compression, you know? You can't just magically need 50% less bandwidth.

The cards will be the same as the Pascal series, just with ~40% more CUs and ~40% more bandwidth. They will also inevitably be bigger (the GPU itself), with more VRAM, and cost more.
 
This "rumour" is very likely just a educated guess where the author claims to have a source. What he missed is Gddr6 can do 256 bit bus width on 12GB of ram which is more likely than 16gb ram.

My guess is 2080/1180 will have 12 gb of ram on 192/256 bit bus width.

Ah, very interesting. I didn't know that. 12GB on 256-bit makes sense. While VRAM has doubled on every x04 going back to Fermi->Kepler, they may do this for cost-saving reasons. It also helps keep the Titan V the exclusive provider of 16GB (for 2018 anyway). And honestly, I don't think VRAM needs to double every 2 years nowadays, especially since Nvidia sets the market and AMD will be maxed at 8GB for a while.

If we get a 192-bit GTX 2060 on GDDR6, could it be 8GB? Not sure how the tech works this time around.
 
Whaa? It doesn't make sense at all. How would you connect 12 GB to a 256-bit interface?

8/16 GB on a 256-bit bus or 6/12 GB on a 192-bit bus makes sense.

I found this (all new to me still):

http://monitorinsider.com/GDDR6.html

"Up to now, the storage capacity of all memory chips has been a nice, clean power fo two, if you exclude error detection or correction bits.

GDDR6 breaks with that tradition and offers in-between options. The standard allows a capacity of 8 to 32 Gbit, but 12 Gb and 24 Gb are possible as well. This will probably make GPU makers happy since it will increase the ability to segment the market based on the amount of memory.

Today, a GPU with a 256-bit bus can only cleanly support 4GB, 8GB or 16GB. With GDDR6, they will also be able to support 12GB, while still maintaining a full balanced load with identical sized memories connected to each controller.
"

So if I understand that right, it should also be possible to have a 9GB GPU on a 192-bit bus? So no 8GB GTX 2060, but maybe a 9GB GTX 2060.
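The combinations the article describes can be enumerated. A rough sketch, assuming one 32-bit chip per channel, identical chips across channels (the "balanced load" the article mentions), and the 8/12/16 Gbit chip densities it lists:

```python
# Assumed GDDR6 chip densities in gigabits (8, 12, 16 Gb per chip).
CHIP_DENSITIES_GBIT = [8, 12, 16]

def possible_capacities_gb(bus_width_bits):
    """Total VRAM options for a bus, one 32-bit chip per channel, all chips identical."""
    channels = bus_width_bits // 32
    return sorted({channels * density / 8 for density in CHIP_DENSITIES_GBIT})

print(possible_capacities_gb(192))  # [6.0, 9.0, 12.0]
print(possible_capacities_gb(256))  # [8.0, 12.0, 16.0]
```

So yes: the 12 Gbit chips give 12 GB on 256-bit, and a hypothetical 9 GB card on 192-bit, exactly as the post speculates.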
 
I found this (all new to me still):

http://monitorinsider.com/GDDR6.html

"Up to now, the storage capacity of all memory chips has been a nice, clean power fo two, if you exclude error detection or correction bits.

GDDR6 breaks with that tradition and offers in-between options. The standard allows a capacity of 8 to 32 Gbit, but 12 Gb and 24 Gb are possible as well. This will probably make GPU makers happy since it will increase the ability to segment the market based on the amount of memory.
Ahh, that's it. I didn't know they'd added a 12 Gb (12 gigabits = 1.5 GB) chip option with GDDR6 🙂

Thx for a link
 
Sure, but he mentioned 6GB. Look at similar-performing cards with 6 and 8GB: 980 Ti / 1070, 1060 / 580. I haven't been as glued to the benchmarks lately, but I haven't seen those 6GB cards take a hit at higher resolutions like 4GB cards have been known to.
 