Are you actually using more VRAM at 1080p/360Hz than at 1080p/144Hz with the same settings? 1080p isn't really the issue anyway; I run 4K, and that's where the concern is.
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060-class cards aimed at 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?
It might be, with RTX IO. Otherwise, I can see at least some XSX/PS5 ports being bottlenecked.
The PS5 shares a single 16GB pool of RAM between CPU and GPU. An even split affords the GPU just 8GB of VRAM. You can start to see how even a 3070 is reasonably future-proof.
I have yet to see anyone actually show 10GB+ of VRAM being used rather than just reserved. Unlike us lot, Nvidia has the tools to see actual VRAM usage, not the reservation that user-side tools report.
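To make that concrete: on Windows, about the best a user-side tool can do is read the OS-level video memory counters. Here's a minimal sketch using DXGI 1.4 (assumes Windows 10 and the primary adapter; error handling trimmed for brevity). Even the CurrentUsage figure below is memory the process has committed, not memory the GPU is actively touching, which is exactly the used-vs-reserved gap being described.

```cpp
// Minimal sketch: query per-process VRAM budget/usage via DXGI 1.4.
// Assumes Windows 10+ and the primary adapter; link with dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    // Adapter 0 is usually the primary GPU; a real tool would enumerate all.
    ComPtr<IDXGIAdapter1> adapter1;
    if (FAILED(factory->EnumAdapters1(0, &adapter1))) return 1;
    ComPtr<IDXGIAdapter3> adapter;
    if (FAILED(adapter1.As(&adapter))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // Budget: how much VRAM the OS will let this process commit.
    // CurrentUsage: how much it HAS committed (allocated, not actively used).
    printf("Budget:       %.2f GB\n", info.Budget / 1e9);
    printf("CurrentUsage: %.2f GB\n", info.CurrentUsage / 1e9);
    printf("Reservation:  %.2f GB\n", info.CurrentReservation / 1e9);
    return 0;
}
```

Overlay tools like Afterburner surface similar allocation counters for the whole card, which is presumably why a game can appear to "use" 9GB there and still run fine on an 8GB card.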
360Hz is just nuts, makes my 3x 1080p 120Hz displays seem slow as hell.
I would not feel comfortable with 10GB of VRAM on a flagship-level card with new 4K consoles about to be released. That's just me though. 8GB for the 3070 is just crazy IMO. I'd definitely get the 16GB version of that model.
Kinda reminds me of the 3GB vs. 6GB debate, and we all know how that panned out (spoiler alert: the 3GB version aged like milk).
They are fine at release, but when a new game comes out even a few months down the road, suddenly you have issues that can't be resolved unless you lower settings.
So for at least one game, it appears the new 8GB and 10GB cards are obsolete right out of the gate. I'm wondering if the game is just "caching" (opportunistically allocating extra VRAM) rather than actually requiring that much of a framebuffer.
Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
We're constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4K resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high-speed memory and enough memory to meet the needs of the games.

A few examples: if you look at Shadow of the Tomb Raider, Assassin's Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4K with max settings (including any applicable high-res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
You're wrong, this game hasn't even been benchmarked on a 10GB 30 series, so you don't know the texture requirements for this game on this hardware.
Tensor Memory Compression changes the game here literally.
Can we clear up compression once and for all? I seem to remember the 980 (or possibly the 1080) had a compression scheme that suddenly improved bandwidth, but that's not what we're talking about here, right? I see tons of people making outrageous claims about this new memory acting like twice the amount. Even I understand that's bunk, but has there been any real reduction whatsoever in the actual space assets take up while in GPU RAM, and is that even possible?
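For what it's worth, that was Maxwell: the 980 was marketed with delta color compression, and it's purely a bandwidth optimization. The framebuffer is still allocated at full size in VRAM; the hardware just moves fewer bytes across the bus when neighboring pixels are similar. Here's a toy sketch of the general idea (a hypothetical 2:1 delta mode, NOT Nvidia's actual scheme):

```cpp
// Toy illustration of delta color compression (not Nvidia's real format).
// The tile still occupies its full size in VRAM; compression only reduces
// the bytes that cross the memory bus when the tile is read or written.
#include <cstdint>
#include <cstdio>

// How many bytes a hypothetical delta mode would transfer for one
// 8-pixel tile row, falling back to full size if the deltas don't fit.
size_t bytes_transferred(const uint32_t (&tile)[8]) {
    for (int i = 1; i < 8; ++i) {
        int64_t delta = int64_t(tile[i]) - int64_t(tile[i - 1]);
        if (delta < -128 || delta > 127)
            return sizeof(tile);      // incompressible: move all 32 bytes
    }
    return sizeof(uint32_t) + 7;      // anchor pixel + 7 one-byte deltas = 11
}

int main() {
    uint32_t sky[8]   = {1000, 1001, 1002, 1002, 1003, 1003, 1004, 1005}; // smooth gradient
    uint32_t noise[8] = {1000, 500000, 3, 999999, 42, 777777, 0, 123456}; // random-ish

    printf("smooth tile: %zu of %zu bytes over the bus\n", bytes_transferred(sky), sizeof(sky));
    printf("noisy tile:  %zu of %zu bytes over the bus\n", bytes_transferred(noise), sizeof(noise));
    // Either way, both tiles still reserve the full 32 bytes in memory,
    // because every pixel must stay randomly addressable at a fixed offset.
    return 0;
}
```

So no: nothing about this shrinks the space assets occupy in VRAM. A lossless scheme always has an incompressible worst case, so the full uncompressed size has to stay reserved, which is why this improves effective bandwidth but not effective capacity.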
I don't believe you. I think the truth is that the 3080 is simply a 2080 replacement and it's not even all that much faster than a 2080Ti outside of the bandwidth-sensitive 4K scenarios shown by Nvidia. Nvidia hasn't announced an upgrade for 1080Ti people yet. It's going on what, 4 years now? I'm waiting...
just buy the 3090, problem solved
Very true. I've seen some people on other forums who feel stuck like me between downgrading to a 3080 or overspending on a 3090. There isn't an appropriate offering yet, and I think people have caught on to that. Not everyone though. Most people already dumped their 2080 Ti on eBay for peanuts; a decision they may come to regret. I got a feeling a 2080 Ti will prove to be a faster card than a 3070 and have more VRAM to boot, and at $400-ish? Dang. Question is, how will all those 2080 Ti people feel if that 3080 isn't quite as good as they thought? I'm really hoping the only downside is that missing gig of RAM, but I'm starting to get some bad vibes.
i keep confusing moonbogg with moonbeam
and then i get surprised when his posts aren't off the wall philosophical lectures
though perhaps an off the wall philosophical lecture is what i need to help me decide between the 3090 and the 3080 v2
Being able to fork over the cash for a product is totally separate from an offensively priced product making your stomach turn. Paying an offensive price is a behavioral act independent of all other factors, financial or otherwise. It's a symptom of one's character that manifests itself as a badge of shame during the contemplation of such a purchase. However, once the act is committed, it is forever sealed in both time and our collective memory. The badge then transitions into an iron-branded symbol of TOMFOOLERY upon the center of the forehead of the self-victimized one, remaining forever recognizable as such within the social context of their experience.
I think all of us 1080 Ti owners are in this situation now. On one hand, I don't want to wait another six months or a year IF a 3080 Ti ever comes.