Just try to buy the FE 3080 10 GB on launch, then re-evaluate in a few months. If there's really a need for 20 GB, then you can probably re-sell the 3080 for a small loss, or maybe even a small gain if rumors are to be believed about artificially limited supply.
Nvidia might have lost me 2 generations in a row (good thing they don't care). I've been waiting for over 3 years for an upgrade and I'm not appreciative of the BS games they are playing with the release cycle. If a 3080Ti or 20GB version is coming, it should have released first or have been announced so people know what to expect and can plan their purchase in an informed way. I'm not aware of AMD playing these stupidass games with their releases. Nvidia feels like Intel to me. I'm afraid it might be too late for me. I feel myself turning...
I think there is a lot of misunderstanding; maybe the source of this is how Sony and Microsoft detailed the new technology. Yes, consoles are designed to manage data much more efficiently, and this is done by several more or less individual components:

Wouldn't the overhead of DirectStorage just be hundred(s) of MB? You only need a buffer big enough to decompress one whole asset at minimum, right? Going back to your project: if you had DirectStorage, would you still need to load all the assets of the level? You're doing that currently because of the performance hit during gameplay if you have to load new assets, right? But if DirectStorage takes out the CPU/RAM path, loading an asset should have minimal impact on gameplay. So instead of loading all the assets at the start of the level, you could load 1/4 (for example) and have a thread stream in new assets as needed. The PS5 Ratchet & Clank demo shows the ability to load completely different levels seamlessly. To me, this indicates that VRAM usage will be much more efficient and smarter going forward.
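The streaming idea described above can be sketched in plain Python (everything here is hypothetical and simplified: real DirectStorage decompresses on the GPU and bypasses the CPU path entirely, which a toy thread model can only hint at):

```python
import queue
import threading
import zlib

# Hypothetical in-memory "disk": compressed assets keyed by name.
# (zlib on a CPU thread stands in for what DirectStorage would do
# with GPU decompression.)
DISK = {name: zlib.compress(bytes(65536)) for name in ("rock", "tree", "enemy")}

vram = {}                  # stand-in for assets resident in GPU memory
requests = queue.Queue()   # gameplay -> streamer request channel

def streamer():
    # Background thread: stream assets in as gameplay asks for them,
    # instead of loading the whole level up front. Only a staging
    # buffer big enough for one decompressed asset is needed at a time.
    while (name := requests.get()) is not None:
        if name not in vram:
            staging = zlib.decompress(DISK[name])  # decompress one asset
            vram[name] = staging                   # "upload" it to VRAM

t = threading.Thread(target=streamer)
t.start()

# Gameplay requests only what it is about to use.
for needed in ("rock", "enemy"):
    requests.put(needed)
requests.put(None)  # shut down the streamer
t.join()
print(sorted(vram))  # only the requested assets ended up resident
```

The point of the sketch is the residency pattern: "tree" never gets loaded because nothing asked for it, which is the difference between streaming on demand and preloading an entire level.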
You can wait for RDNA 3. I'm hearing it will be very good. If not, there's always RDNA 4.
When is AMD supposed to announce and release their cards anyway? I'm going to try to go for a 3090, but if I don't get one I will just have to wait and see how the competition plays out. I won't be going AMD, but at that point, if I can't get one, it would make sense to wait and see if a Ti model or a 3080 with more memory ever comes out.
Yeah... I've decided to skip trying to nab a 3080 this week:
1. 10GB is fine for now; but I don't think it will be in 5 years' time (how long I would keep it)
4GB seemed to be about right for 1080p; so 16GB would be a good fit for 4K
2. Memory prices have gone down; so why hasn't NVidia significantly increased VRAM for the 3080?
It's the same as last generation.
I suspect that NVidia is holding this card so they can respond to AMD's upcoming cards.
They will launch a 20GB 3080; but at what price? That will likely depend on AMD
3. AMD is likely going to offer a 16GB card; this seems a much more reasonable amount for a new top end card
4. The new NVidia power requirements are a problem in my mind
I read the theory that the 3080 FE cards being released this week have really good coolers that are expensive to make
However, one has to ask why all this is required?
Hot power hungry video cards are simply going to be less reliable & have been problematic in the past
I'm really interested in how AMD will compete on this front
5. This is the first time in a while that AMD may have something competitive
Likely not faster than the 3080 but may be within 20%
We need to let competition kick in and reverse the creep we've seen in graphics card prices
This is going to require patience from everyone and some time
I'm resigned that I probably won't be getting any next-generation card in 2020; availability will be low. January 2021 is more likely, and by then one will be in a position to compare products, see how stable the drivers are, and with a bit of luck see some price competition.
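The VRAM reasoning in point 1 (4GB at 1080p implies 16GB at 4K) is just pixel-count scaling, which can be checked with quick arithmetic. It is a rough heuristic only: VRAM use does not scale purely with resolution, since textures, geometry, and other pools are partly resolution-independent, so this overestimates what higher resolutions actually need.

```python
# Rough heuristic: scale a known-good VRAM amount by pixel count.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base_res, base_vram_gb = "1080p", 4.0

base_pixels = resolutions[base_res][0] * resolutions[base_res][1]
for name, (w, h) in resolutions.items():
    scaled = base_vram_gb * (w * h) / base_pixels
    print(f"{name}: {scaled:.1f} GB")
# 4K has exactly 4x the pixels of 1080p, hence the 16GB figure.
```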
Zen 3 is on the 8th. Maybe they'll drop some Big Navi info? Otherwise it's October 28th for Big Navi.
5 years? That's a very long time in GPU terms, by which point all current GPUs will be worthless potatoes.
My 980ti begs to differ. I'm not scared of 10GB; worst case scenario, I drop down to high textures from ultra in a few years.
But you had 6GB way before it was cool!
My 290X had a similar run, 2014 - 2019. In retrospect, I don't think I ever regretted getting only the 4GB version, but I did pick up a 2560×1080 monitor on purpose, as that resolution is way easier to push than 2560×1440.
It lives on in my Cycling simulation/back up PC rig. Mostly because I can't part with my LGA 2011 rig so I am keeping it all together until the end times or it is the equivalent of a first gen Phenom.
I'm still on a 2011v3 with a 5820k. Current plan is to try and snag a 3080 FE tomorrow morning if the price really is $700, and maybe stick it in a Zen3 build next month.
Enjoy the 3080 should you pull the trigger!

"You should pull the trigger", you meant to say...
Steve Burke @ Gamers Nexus comments on the 10GB VRAM of the 3080 (at the 3-minute mark)
In the past few weeks since the RTX 30-series announcement, there have been quite a few discussions about whether the 3080 has enough memory. Take a look at the previous generation with 11GB, or the RTX 3090 with 24GB, and 10GB seems like it's maybe too little. Let's clear up a few things.
There are ways to exceed 10GB of VRAM, but it's mostly via mods and questionably coded games — or running a 5K or 8K display. The problem is that a lot of gamers use utilities that measure allocated memory rather than actively used memory (e.g., MSI Afterburner), and they see all of their VRAM being sucked up and think they need more memory. Even some games (Resident Evil 3 remake) do this, informing gamers that it 'needs' 12GB or more to properly run ultra settings. (Hint: It doesn't.)
Using all of your GPU's VRAM to basically cache textures and data that might be needed isn't a bad idea. Call of Duty Modern Warfare does this, for example, and Windows does this with system RAM to a certain extent. If the memory is just sitting around doing nothing, why not put it to potential use? Data can sit in memory until either it is needed or the memory is needed for something else, but it's not really going to hurt anything. So, even if you look at a utility that shows a game using all of your VRAM, that doesn't mean you're actually swapping data to system RAM and killing performance.
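This cache-not-consumption behavior can be modeled with a toy LRU texture cache (the class, names, and sizes below are made up for illustration): data stays resident until the space is actually needed for something else, so "allocated" can look full while real demand is far lower.

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of VRAM used as a cache: entries stay resident until
    the space is needed, so allocation != active use."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.resident = OrderedDict()   # name -> size in MB, oldest first

    def request(self, name, size_mb):
        if name in self.resident:                 # cache hit: nothing to load
            self.resident.move_to_end(name)
            return "hit"
        while sum(self.resident.values()) + size_mb > self.capacity:
            self.resident.popitem(last=False)     # evict least recently used
        self.resident[name] = size_mb             # "load" the texture
        return "miss"

cache = TextureCache(capacity_mb=10)
cache.request("level1_atlas", 6)
cache.request("ui", 2)
print(cache.request("level1_atlas", 6))  # hit: still cached, costs nothing
cache.request("level2_atlas", 7)         # now space is needed: evict old data
print(list(cache.resident))
```

Before the last request, the cache looked nearly "full" at 8 of 10 MB, yet nothing broke when a new 7 MB texture arrived: the idle entries were simply evicted.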
You'll notice when data actually starts getting swapped out to system memory because it causes a substantial drop in performance. Even PCIe Gen4 x16 only has 31.5 GBps of bandwidth available. That's less than 5% of the RTX 3080's 760 GBps of bandwidth. If a game really exceeds the GPU's internal VRAM capacity, performance will tank hard.
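The "less than 5%" claim is simple arithmetic on the two bandwidth figures given above:

```python
pcie4_x16 = 31.5     # GB/s, PCIe 4.0 x16 peak bandwidth
rtx3080_vram = 760.0 # GB/s, RTX 3080 memory bandwidth

ratio = pcie4_x16 / rtx3080_vram
print(f"{ratio:.1%}")  # ~4.1% — why spilling to system RAM tanks performance
```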
If you're worried about 10GB of memory not being enough, my advice is to just stop. Ultra settings often end up being a placebo effect compared to high settings — 4K textures are mostly useful on 4K displays, and 8K textures are either used for virtual texturing (meaning, parts of the texture are used rather than the whole thing at once) or not used at all. We might see games in the next few years where a 16GB card could perform better than a 10GB card, at which point dropping texture quality a notch will cut VRAM use in half and look nearly indistinguishable.
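Rough texture math shows why one quality notch saves so much. A full resolution step quarters a single texture's footprint; a settings "notch" usually steps down only part of a game's texture pool, which is plausibly where the "cut in half" figure comes from. The sketch below assumes uncompressed RGBA8 with a full mip chain (real games use block compression, so absolute numbers would be several times smaller):

```python
def texture_mb(side_px, bytes_per_texel=4, with_mips=True):
    # Base level, plus a full mip chain adds roughly 1/3 on top.
    base = side_px * side_px * bytes_per_texel / 2**20
    return base * 4 / 3 if with_mips else base

for side in (4096, 2048, 1024):
    print(f"{side}px: {texture_mb(side):.1f} MiB")
# Each step down (4K -> 2K -> 1K textures) cuts that texture's
# memory to a quarter of the previous tier.
```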
There's no indication that games are set to start using substantially more memory, and the Xbox Series X also has 10GB of GPU memory, so an RTX 3080 should be good for many years, at least. And when it's not quite managing, maybe then it will be time to upgrade to a 16GB or even 32GB GPU.
If you're in the small group of users who actually need more than 10GB, by all means, wait for the RTX 3090 reviews and launch next week. It's over twice the cost for at best 20% more performance, which basically makes it yet another Titan card, just with a better price than the Titan RTX (but worse than the Titan Xp and 2080 Ti). And with 24GB, it should have more than enough RAM for just about anything, including scientific and content creation workloads.
HUB's results differ from TPU, GameGPU, KitGuru, and other sites. Steve Walton/HUB is the only person who reports this result. He also made an argument re: 2080 VRAM performance weeks or months ago with Doom Eternal at 4K vs the RVII and 1080 Ti as well.

HUB demonstrated that a 2080 starts to tank at 4K in Doom vs a 1080 Ti because of the lack of VRAM. That's with 8GB at 4K right now. This wasn't him just looking at some software reading. The performance was ahead at 1440p, then it tanked at 4K due to a lack of RAM. How long before 10GB gets rekt at even lower resolutions? All it takes is one good AAA game to break the 10GB buffer. I'd be pissed.
There's no way this much GPU power is a good fit for only 10GB of RAM. It's ridiculous. New models will be out after AMD drops, and everyone will look back at the 10GB card and wonder why the hell it was even released. It's the skimped-out starter card to make Nvidia look good. It only looks good right now (to some people at least) because it's the first release of a new generation for both parties. I'm not buying it.