> Don't worry about what is being said here. A lot of FUD being spouted by the usual suspects.

What about PoP or TSS for bypassing the interposer? Isn't that something mobile APUs would benefit from?
APUs can use 1 HBM2 stack, or 2, or 3, or whatever, just like GPUs. People saying HBM2 is extremely expensive aren't thinking expansively. They're locked into the past and can't separate why something was done [4 HBM stacks on the first interposer GPU product] from what can actually be done. Notice how no numbers are EVER offered in these arguments? Just vague feelings.

Sesh Ramaswami, managing director at Applied Materials, showed a cost analysis that put 300mm interposer wafer costs at $500-$650 per wafer. His analysis showed the major cost contributors are damascene processing (22%), front pad and backside bumping (20%), and TSV creation (14%).
Since one can produce ~286 200mm2 dies on a 300mm wafer, at $575 (his midpoint cost) per wafer, this works out to roughly $2 per 200mm2 silicon interposer.
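That arithmetic is easy to check. A quick sketch, using the post's figures ($500-$650 per wafer, ~286 usable dies); the dies-per-wafer formula is a standard textbook approximation, not from the source, and it ignores edge exclusion and defects, so it lands a bit above the post's ~286:

```python
import math

# Figures from the post: $500-$650 per 300mm interposer wafer (midpoint $575),
# ~286 usable 200mm^2 dies per wafer.
wafer_cost = 575.0      # USD, midpoint of the quoted range
die_area = 200.0        # mm^2
wafer_diameter = 300.0  # mm

# Standard dies-per-wafer approximation (gross area minus an edge-loss term).
r = wafer_diameter / 2
dpw = math.pi * r**2 / die_area - math.pi * wafer_diameter / math.sqrt(2 * die_area)
print(f"estimated dies/wafer: {dpw:.0f}")  # ~306 (post's usable figure: ~286)

# Cost per interposer, using the post's ~286 usable-die figure.
print(f"cost per 200mm^2 interposer: ${wafer_cost / 286:.2f}")  # ~$2.01

# Dollar value of the major cost contributors named in the post.
for name, share in [("damascene", 0.22), ("bumping", 0.20), ("TSV", 0.14)]:
    print(f"{name}: ${wafer_cost * share:.2f}/wafer")
```

Either way you slice it, the interposer itself comes out to about two dollars per part, which is the point being made against the "extremely expensive" hand-waving.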
Why can't we have a 2-stack HBM2 top-end die with 8GB of memory? The product differentiator would be HBM2 vs GDDR5X, not the amount of RAM. Does anyone want to argue that 16GB is needed anytime soon for consumer GPUs? Those proposing a GP102 are certainly talking about a consumer, not professional, product.
The other reason may just be practicality. 12GB of 384-bit GDDR5X may provide all the bandwidth the rumored GP102 actually needs, especially if faster variants are available when it ships.
I frankly don't see why we need 12GB yet, but if NV needs 384-bit GDDR5X to get the usual 30% generational boost, then that's what we'll get. With a larger chip, though, it could be possible to pull it off with higher-clocked 256-bit G5X too. Not that I'm complaining memory-wise, far from it.
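For context on the 384-bit vs 256-bit trade-off: peak memory bandwidth is just bus width in bytes times per-pin data rate. A rough sketch (the 10 and 12 Gbps speeds are illustrative GDDR5X rates, not figures from the thread):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Launch-speed GDDR5X (10 Gbps/pin) on a 384-bit bus...
print(peak_bandwidth_gbs(384, 10))  # 480.0 GB/s
# ...versus a faster 12 Gbps variant on the narrower 256-bit bus suggested above.
print(peak_bandwidth_gbs(256, 12))  # 384.0 GB/s
```

So even with a 20% faster memory clock, the 256-bit configuration delivers noticeably less peak bandwidth than 384-bit at launch speeds; whether that matters depends on how bandwidth-hungry the chip actually is.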
Because... I posted this a while back, but JayzTwoCents proved to us that there's at least one title that uses 9GB in DirectX 12 at just 1080p.
Not to mention we've had Shadow of Mordor for quite a long time now, which, with the high-quality texture pack, requires a minimum of 6GB of video RAM no matter what resolution you're playing at.
Any resolution higher than 1080p will use even more RAM with these titles. And they aren't alone; this is a new and growing trend with newer games.
I would expect at bare minimum new GPUs from both sides to have at least 8-12GB of RAM, or they're not worth spending money on.
Most of this is why I didn't even bother trying to buy anything new from the 900 lineup... all of the 8GB cards were incredibly expensive. I'm really hoping we can get at least 8GB cards in the affordable range (GTX 1060) this time around.
> I think you're mistaken. Games don't need more than 1GB of VRAM up to 1080p. Only when you go above that or run multiple monitors MAY you need more. So 1GB is more than enough. Just trust me on this one...

You're seriously deluding yourself, and many others, if you think 1GB of VRAM is enough. In fact, 4GB is quickly becoming the norm for 1080p, though it largely depends on the game and settings used, and 2GB is the bare minimum to even launch a game.
http://www.bitsandchips.it/9-hardware/7037-rumor-gtx-titan-next-con-chip-gp102-e-gddr5x
I think this will be interesting in the context of what you were discussing, guys 😉.
And yet we are expected to believe this:

> You're seriously deluding yourself, and many others, if you think a GB of VRAM is enough. In fact, 4GB is quickly becoming the norm for 1080p, though it largely depends on the game and settings used, and 2GB is the bare minimum to even launch a game.
> I would expect at bare minimum new GPUs from both sides to have at least 8-12GB of RAM, or they're not worth spending money on.
Yeah, actually I was wrong to say 1GB is enough. Actually, 256MB is more than enough. So I stand corrected. Look at this:

> You're seriously deluding yourself, and many others, if you think a GB of VRAM is enough. In fact, 4GB is quickly becoming the norm for 1080p, though it largely depends on the game and settings used, and 2GB is the bare minimum to even launch a game.

> I think you're mistaken. Games don't need more than 1GB of VRAM up to 1080p. Only when you go above that or run multiple monitors MAY you need more. So 1GB is more than enough.

> So just trust me on this one...
> I'm assuming you're just trolling for the sake of trolling... which is not cool.

I don't think he is trolling; he might be a little off. I used a GTX 570 with 1.2GB of VRAM in BF4 and only had a little hitching; it was buttery smooth at ultra settings. Playing on my new GTX 970 4GB, the only differences were that the hitching was gone and more FPS at 1080p.
But it's not about FPS performance, everyone knows that. More video RAM (if the game needs to use it) results in less hitching/stalling/delays when loading new areas, and in general leads to smoother gaming performance.
I only have a 4GB card, but I have a rather short list here of games that use at least that much (they want more, but I don't know how much more), even at 1080p in DX11.
Project C.A.R.S., GTA V, BF4, Ark: Survival Evolved. And everyone already knows 6GB minimum for the Shadow of Mordor ultra pack; the developers actually specify that as a requirement for it.
All these titles slam my card to a flat 4GB/4GB the entire time I'm playing.
I've almost considered switching from a GTX 770 to a used 1st-gen Titan (if they come cheap used) instead of a 900 series, just for the video RAM.
That is, if people are after a constant, smooth 60 FPS with no stutters, dips, or drops at 1080p.
Mind you, all this "high VRAM usage" is with the games' settings 100% maxed out, every single possible last setting turned up to max.
So yes, everyone into "butter-smooth gaming" on PC, even at 1080p, pays attention to video RAM usage. The FPS isn't different, but the experience sure is.
https://www.youtube.com/watch?v=rmsfw9GnWUY#t=8m4s
See here: JayzTwoCents talks about video RAM, and it actually has quite a big impact on DirectX 12 performance. See his discussion, results, and afterthoughts.
> I realized the differences in a lot of newer games are subtle enough that I don't notice right off between ultra, high, and medium settings. In games where you can see the changes as you make them, I notice what changed, but just entering a game switched from ultra to medium, I still see it as looking great and don't realize right off what changed.

> I think it's in my mind, that I fear missing out on some level of immersion, when in reality I probably wouldn't notice if someone just switched it without me looking. I once used GeForce Experience and let it optimize Far Cry 4 for me. It looked fine, but I felt like a heroin addict yearning for a fix... I had to go in and change it that one extra notch higher. I even have to set control panel quality settings to high quality, thinking that clarity difference will matter despite knowing it likely won't. I could probably do fine without a 1080... but fuck it, I wants.

> Anyone else think most of the settings are subtle enough not to notice but can't help themselves?

This^^^ I also can't tell the difference from medium to ultra in the new games, but I can't help myself; I need to run the games at max.
An update from VideoCardz on custom GeForce GTX 1080/1070 models, covering what is known so far (rumours/news):
http://videocardz.com/60269/custom-geforce-gtx-1080-1070-cards-what-we-know
Sorry if this is already common knowledge, but roughly how long are we expecting to wait for these? I'm thinking of buying a 1080, but not that overpriced 'Founders Edition' garbage.
$1,400 for a Titan-next? I wonder if they are naming it after Graphics Core Next or just trying to confuse the naming. If the pricing is at all accurate, then we can say we called it. It looks like it would be one hell of a card, but aren't they always? Yes, they are always amazing within the context of what's currently available. But as the prices keep climbing, the value and intrigue of the product plummet, since it will soon be replaced. The price makes sense based on the 1080's price:
Mid Range - $700
High End 1080Ti - $1,000
Titan Core Next - $1,400
Wonder if this bothers anyone...
Jesus, people, the only card most of us can afford is a 1060 Ti... uh, I mean GTX 1070. No one can afford this crap. I wonder who will be buying all these $1000+ GPUs? LOL, damn. I didn't know the economy was doing so well. People must be just rolling in cash (MasterCard) to afford these things. Good lord.