deustroop
Golden Member
OK, got it now. Thanks. It saves for me when I hit F5; when I hit that button, the little tape-recorder wheel pops up at the top left of the screen and saves the game.
I had the F functions swapped
Yeah, reactivating async compute is doing some magic for GCN cards, but something is still borked in the worst-case scenario for them. Nothing serious, though.
Computerbase also says AC still needs a new driver for Nvidia
The claims about Nvidia's AC solution being half-baked go back to the pre-Pascal GPUs, though. Maxwell couldn't even enable AC in Computerbase's first test.
I personally couldn't care less how Nvidia's solution works as long as it nets a gain in performance. Still using Kepler myself, though; it's a bit annoying to watch AMD's equivalents usually aging better.
Even so, quite a few people claimed that Pascal's solution was half-baked as well, specifically that it was rooted in software, which isn't true. This rumor was spread mostly by Mahigan, if you recall.
The notion that AMD GPUs age better is kind of true, and there are reasons for this. The biggest reason, of course, is that AMD GPUs share a similar architecture with the GPUs in the consoles. That's a big advantage, but it has taken a long time to manifest. Also, not every vendor is willing to up the ante when it comes to enabling console-style optimizations for AMD GPUs. Remarkably, Nvidia is still easily capable of competing thanks to its amazing software scheduler.
The second reason is that AMD GPUs always take a long time over their life cycle to reach optimal performance through driver updates, which creates the illusion that they are aging better when, in reality, their architecture is just taking longer to peak. Nvidia is much faster than AMD at pushing driver updates that exploit new architectures.
I'd wager that Vega at the end of its life cycle should be solidly outperforming the GTX 1080 despite being mostly slower today. GTX 1080 is already topped out, but Vega still has room to grow. It won't reach GTX 1080 Ti levels of performance though.
While I agree for the most part, I would swap the importance of drivers versus console influence. The AMD "fine wine" thing really came to the fore as GCN 1.0 products aged; you can even notice it in the first 6-9 months of Tahiti, and that is much too small a time frame for the consoles' influence to come through. So I think AMD's driver team has a lot to answer for.
That being said, and what does tie into the console thing, I believe AMD's GCN GPUs are fundamentally more flexible and "forward looking" in the sense of pure hardware capabilities. Anyone is welcome to disagree, but as far as I'm concerned it's not a surprise game devs and driver teams take some time to catch up.
And Vega 64 is now averaging over 25% faster than a 1080 with patch 2 in this game. So it's a surprise to see the easy claim that it "won't reach GTX 1080 Ti levels of performance". It's not that I'm a believer in upcoming magic drivers or anything; I'm just saying this is a pretty bold claim, especially given that Vega is already around "GTX 1080 Ti levels of performance" in this game.
> That's only in the best case scenario. If you look at the worst case scenario, then the GTX 1080 is still ahead of Vega 64, although its lead has diminished somewhat due to having asynchronous compute disabled in the game.

It's not "best case scenario", it's average fps.

> I'd wager that Vega at the end of its life cycle should be solidly outperforming the GTX 1080 despite being mostly slower today. GTX 1080 is already topped out, but Vega still has room to grow. It won't reach GTX 1080 Ti levels of performance though.

Vega 64 is already averaging in the same ballpark as a 1080 Ti in this game at the moment. So I think your claims of certainty are a bit rich. Personally I don't see Vega 64 matching a 1080 Ti across the board any time soon, if ever, but I think it's a bit presumptuous to discount the possibility, let alone claim, as you have, that they will never (in any situation?) have equal performance.
> Evidently you didn't read the review. Computerbase.de tested several scenarios, one in which the Radeon has a large lead (best case scenario), and another in which the Geforce cards have a large lead (worst case scenario). So the game is not universally faster on Radeons.

Yes, it is. Check some side-by-side performance review videos on YouTube; there are tons of them, and a variety of scenes are tested in them. Radeons are still faster across the board. And there is nothing like a "best case scenario" in CB.de's tests.

> Vega 64 is not averaging in the same performance as the GTX 1080 Ti in this game. Read the entire review, and not just look at a single graph.

OK, I admit they didn't take an fps reading over the course of the entire game. Shame on them. What they did was take a common "demanding" scene and graph the averages. Now I think we arrive at the problem. From the review (Google translated, of course):

> The first test scene corresponds to a demanding scenario, as it occurs in the game again and again. It is a sequence inside and outside a building in combination with various particle effects. The second scene, on the other hand, is a worst-case scenario. In the first hour and a half, there has only been one sequence of this kind.

> Vega 64 is not averaging in the same performance as the GTX 1080 Ti in this game.

Yes. Yes it is. Vega 64's average performance is about the same as a 1080 Ti's in this game at this time. It just is. Roughly, at least: I can't see any direct comparisons, but we can deduce that Vega 64 is easily averaging within 10% with the current patch. Heck, since nV products have lost a little performance with the most recent patch, I would bet Vega 64 is within a couple of percent.

> The GTX 1080 Ti FE is about a good 30% faster than Vega 64 on average. No amount of driver optimizations is going to make up such a large gap.

I don't want to accuse you of grasping at straws, or even of building straw men out of them, but nobody thought drivers alone would make up ~30% performance. Other things can make up ~30% performance, like this game engine. But it's not about driver "optimizations".

> Vega's power can only be explored at DX12.1/Vulkan.

No.

And if a game engine can make Vega 64 over 25% faster than a 1080, then how can you deny the possibility that a different game engine and/or drivers could make it 30% faster? It doesn't need much (if any) more performance to match a 1080 Ti (in this situation), and we've already agreed in theory about fine wine. So it seems to me you're sticking your head in the sand.

Obviously this is the best case we have for Vega at the moment, but why do I have to bother correcting people who insist Vega will never match a 1080 Ti in gaming performance?
> Vega is using shader intrinsics, 16 bit floats and probably some other architectural optimizations

And it's still not doing TBDR or fast-path culling.
snip
Also, NVidia optimizations will eventually be implemented in the game as well. It will just take longer to manifest.
Do you honestly think this game engine will be representative of the large majority of other 3D engines on the market? If so, I have an igloo to sell you somewhere in the Sahara desert.
AMD PAID to have these optimizations implemented. So unless AMD has enough wealth to convince all the other major 3D engine developers to implement these changes, I don't see it happening across the board.
> Xbox One X supports FP16

That one doesn't.
I wouldn't count the chickens just yet for this title. There are many more patches and drivers to come.
Primitive Shaders are not enabled or working in either idTech 6 or AMD's drivers.
Although RPM is enabled in idTech 6, it's not enabled in AMD's drivers yet.
All we have is async compute from the beta patch, and only with AMD driver 17.10.3; it's disabled on Nvidia's 388.13.
No offense, but I think that's a silly justification for the poor performance. I think that because the game is released and I've already spent my hard-earned money on it, as have many others. I shouldn't have to wait for patches, which may or may not ever come, to make the game playable or give it decent performance.
I guess what I'm saying is that if those graphical features and performance patches that you mentioned weren't ready by the scheduled release date, maybe they should've delayed the game until they were implemented.
Test with the second patch - CB.de - "Wolfenstein 2: Neuester Patch beschleunigt RX Vega um bis zu 22 Prozent" ("Wolfenstein 2: latest patch speeds up RX Vega by up to 22 percent")
A test that was uploaded a week ago which compares Vega 64 to GTX 1080 Ti and GTX 1080 in Wolfenstein 2:
Vega manages to outclass the GTX 1080 for the most part, but it still doesn't catch up to the GTX 1080 Ti across most of those scenes, even with asynchronous compute disabled on the latter. Also, this game is still brand new and has a bunch of optimizations for Vega off the bat. Geforce optimizations will be coming down the pipeline assuredly. It will be similar to Doom, where the Geforce GPUs will get faster over a longer period of time.
Well German isn't my native language, but it seemed to me like that was their intent.
I sure hope not; then we'd get another Wolfenstein: The New Order, which crashed often on AMD cards and needed workarounds that solved the crashes but also slowed performance.
idTech 5 was riddled with Nvidia-specific optimizations that caused many problems.
> It does actually, along with the other Polaris GPUs.

The XBX APU does not support packed math.