
AMD Polaris 10 engineering sample 'pictured' (via videocardz)

Page 5 - AnandTech community forums
Everywhere? I haven't authored a thread on that game anywhere.

There you go again... making things up.

I've authored Asynchronous Compute and DX12/Vulkan related threads.
What advantage does Asynchronous Compute provide this game? Did it sell well? Did it perform well? Did it provide better graphics?
 
Only read the last two pages...not sure what Polaris has to do with Intel, but ok!

I've been AWOL a few days, why are we talking about Polaris 10? Did I miss something? Wasn't Polaris 10 demo'd a few months back as the power-efficient design? Competing with a GTX 750 Ti or something?
 
Only read the last two pages...not sure what Polaris has to do with Intel, but ok!

I've been AWOL a few days, why are we talking about Polaris 10? Did I miss something? Wasn't Polaris 10 demo'd a few months back as the power-efficient design? Competing with a GTX 750 Ti or something?

Polaris 10 is the bigger chip, Polaris 11 is the smaller one.
 
At the AMD event at GDC.

I just read through that thread. So someone screwed up the names along the way?

Oh well, now a little more caught up, seems Polaris 10 is going to bring some heat!

(Personal question: I wonder if the new cards will have an RTG style badge or will it stay AMD? AMD RTG <Name of GPU> or just RTG <name of GPU>.)
 
I just read through that thread. So someone screwed up the names along the way?

Oh well, now a little more caught up, seems Polaris 10 is going to bring some heat!

(Personal question: I wonder if the new cards will have an RTG style badge or will it stay AMD? AMD RTG <Name of GPU> or just RTG <name of GPU>.)
Nah, it seems like they do name their bigger chips with smaller numbers (for some reason).
 
Those are the code names for the chips, not the retail video card. I would assume that the naming isn't going to change for the cards.

Well, NVIDIA does the same thing:
GM204 and GM200. Intel also.
But still, I really don't understand why they do it (unless it has to do with the fabrication process).
 
Even at that price, given the performance and power consumption, they would be a deal.

Well, I'd be interested in a Polaris 11 / GTX 1070 at ~110 watts, with 2x the performance of my overclocked GTX 960 (overclocked GTX 980 performance), at ~$300.

I think Polaris 10 / GTX 1080 might be out of the price range I'm comfortable paying for a card.

Think it's possible?

Otherwise I might as well grab a G1 Gaming GTX 970, overclock it (182 watts) to 980 speeds for ~$240, and eat the extra 70 watts. 😀

A 182 watt card is at toaster-oven levels, not quite a space heater. :thumbsup: j/k
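The trade-off being weighed above is just performance per watt (and per dollar). A quick sketch using the poster's speculative numbers — the wattages, prices, and "2x a GTX 960" figures are guesses from the post, not specs:

```python
# Rough perf-per-watt comparison using the speculative numbers from the
# post above. Performance is in arbitrary units relative to an overclocked
# GTX 960 = 1.0; wattages and prices are the poster's guesses, not specs.

def perf_per_watt(perf, watts):
    """Performance units delivered per watt of board power."""
    return perf / watts

# Hypothetical Polaris 11: ~2x the overclocked GTX 960, ~110 W, ~$300.
polaris11 = perf_per_watt(2.0, 110)

# Overclocked GTX 970 (G1 Gaming) at roughly GTX 980 speeds: also ~2x
# the overclocked GTX 960 in this scenario, but at ~182 W and ~$240.
gtx970_oc = perf_per_watt(2.0, 182)

print(f"Polaris 11 (speculative): {polaris11:.4f} perf/W")
print(f"GTX 970 OC:               {gtx970_oc:.4f} perf/W")
print(f"Extra power for the 970:  {182 - 110} W")
```

At the same performance target, the hypothetical Polaris part comes out well ahead on efficiency; whether that's worth the extra ~$60 is the question the post is asking.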
 
Just to clarify for the people who've been away: Polaris 10 is the big one (perhaps not that big, though); Polaris 11 is the one that was matched against the GTX 950 at CES.
 
And if you didn't understand the point: I'm wary of paper spec claims; we've seen what happened with NV's DX12 claims already. -_-

NVIDIA has been doing well in the DirectX 12 games that have been released. Gears of War was a disaster for AMD, and in Tomb Raider the 970 was beating Fury. AMD does well on rumors and hype, not so much in reality.
 
NVIDIA has been doing well in the DirectX 12 games that have been released. Gears of War was a disaster for AMD, and in Tomb Raider the 970 was beating Fury. AMD does well on rumors and hype, not so much in reality.

Actually, after the game was patched and AMD updated their drivers, GoW is running great on AMD cards:
http://forums.anandtech.com/showthread.php?t=2467194

Fury X beating the 980 Ti, the 390 beating the 970, and the 380 beating the 960. Looks like once the game was fixed, AMD is outperforming again.
 
NVIDIA has been doing well in the DirectX 12 games that have been released. Gears of War was a disaster for AMD

This guy...

GOW PC is like consuming a Tootsie Pop with the wrapper on, it tastes a little bit like root beer but a lot like wax paper. It's nowhere close to being a native DirectX 12 game.
 
On second thought this made me kinda sad.

One thing I preferred about the 7970 I had over the 970 was how big it was. It looks awesome having a big card in a big case.

But now I have a 7850 in there as a stopgap and it just looks so small with so much wasted case space. If all the GPUs I can afford get small that is great for my Mini ATX rig, but my main rig's epeen will never be the same.

Maybe I will just buy a three fan 390x or Fury when the new cards force them to go on clearance.
Or you could simply attach a USB-charged sex toy to your rig. I'd bet no one could brag about that much e-peen in the world 🙂

Psst, I just bought a full-tower cabinet. I hear you. When I saw the Nano, I was like, "that's it? Where's the rest of it?" You remember Creative sound cards which had two cards?
 
NVIDIA has been doing well in the DirectX 12 games that have been released. Gears of War was a disaster for AMD, and in Tomb Raider the 970 was beating Fury. AMD does well on rumors and hype, not so much in reality.
Facts... Brought to you by both NVIDIA and AMD during a joint talk at GDC:
[GDC slide image]
Architecture-specific paths are required for both NV and AMD. This was added to GoW UE with the latest patch, and they're present in both Hitman and AotS.

[GDC slide image]
Compute-wise, GCN is far ahead of Maxwell, as you can see here.

[GDC slide image]
Devs are being told to always use Async Compute and maintain a non-Async Compute path for DX12 titles.
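That "async path plus non-async fallback" advice boils down to a timing question: overlapping compute with graphics only helps if the hardware can actually overlap them cheaply. A toy model (all millisecond figures are made up for illustration, not measurements of any GPU):

```python
# Toy timing model of why engines want both an async-compute path and a
# serial fallback. Times are made-up milliseconds, not measurements.

def serial_frame(gfx_ms, compute_ms):
    """Fallback path: compute work runs after graphics on one queue."""
    return gfx_ms + compute_ms

def async_frame(gfx_ms, compute_ms, overlap=1.0):
    """Async path: compute overlaps graphics on a second queue.
    overlap=1.0 means perfect overlap; 0.0 degenerates to serial."""
    overlapped = min(gfx_ms, compute_ms) * overlap
    return gfx_ms + compute_ms - overlapped

gfx, comp = 12.0, 4.0  # hypothetical per-frame costs

print(f"serial:       {serial_frame(gfx, comp):.1f} ms")   # 16.0 ms
print(f"async (good): {async_frame(gfx, comp):.1f} ms")    # 12.0 ms
# On hardware where queue switching is expensive, the achievable overlap
# may be small enough that async barely helps -- hence keeping both paths.
print(f"async (poor): {async_frame(gfx, comp, overlap=0.1):.1f} ms")
```

With good overlap the compute work effectively disappears into the graphics time; with poor overlap you pay nearly the serial cost anyway, which is why the fallback path stays in the engine.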

[GDC slide image]
Framebuffer over-commitment needs to be tackled by developers. This is one of the big issues with Rise of the Tomb Raider.
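To sketch what over-commitment means: the app keeps more resource bytes "resident" than the VRAM budget, so the excess spills to system memory and gets paged over PCIe. This is a deliberately simplified model — real drivers and OS memory managers are far smarter, and the asset names and sizes are invented:

```python
# Simplified model of framebuffer over-commitment: if an app keeps more
# resource bytes resident than the VRAM budget, the remainder spills to
# system memory over PCIe, which is the slow path the slide warns about.

def classify_residency(resources, vram_budget):
    """Greedily fit (name, size) resources into VRAM, largest first;
    return (in_vram, spilled) name lists. Real drivers are far smarter."""
    in_vram, spilled, used = [], [], 0
    for name, size in sorted(resources, key=lambda r: -r[1]):
        if used + size <= vram_budget:
            in_vram.append(name)
            used += size
        else:
            spilled.append(name)
    return in_vram, spilled

GB = 1024 ** 3
assets = [("shadow_atlas", 1 * GB), ("gbuffer", 2 * GB),
          ("streamed_textures", 3 * GB), ("meshes", 1 * GB)]

fits, spills = classify_residency(assets, vram_budget=4 * GB)
print("resident:", fits)
print("spilled: ", spills)  # anything here will hurt frame times
```

The fix the slide is pointing at is on the app side: track the budget and shrink or stream out resources before the OS starts demoting them behind your back.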


Once AMD works with the developer (Square Enix) of Rise of the Tomb Raider on DX12, expect GCN to surpass Maxwell (just as is the case for GoW UE, AotS and Hitman).

DX12, under Rise of the Tomb Raider, is a work in progress (the developers have stated as much).

Factually speaking, your opinion is incorrect. Therefore, based on what NVIDIA and AMD have stated, I am more inclined to side with the notion that if NVIDIA is to make headway, it won't be with Maxwell but rather with Pascal and Volta.


GCN's superior compute shader and multi-engine capabilities, due to the flexibility brought on by being more CPU-like (HSA), are also listed here:
[GDC slide image]

Maxwell's shortcomings in terms of Asynchronous Compute, slow context switching (up to a 1,000-cycle delay) and preemption were also listed by NVIDIA here:
[GDC slide image]


Whereas GCN's ACEs can pause, stop, and start various tasks, allowing for finer-grained preemption. GCN can also execute a priority compute shader in parallel with a current draw call (preemption is not limited by a draw-call boundary). GCN is also capable of fast context switching as a result of the ACEs (a 1-cycle performance cost):
[GDC slide image]

[GDC slide image]
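The practical difference between the two preemption models above can be sketched as a latency calculation. The cycle counts are only the rough figures quoted on the slides (1,000-cycle vs 1-cycle switch), and the draw-call length is an invented example, not a measurement:

```python
# Toy model of preemption cost: a high-priority compute task arrives while
# a long draw call is in flight. Cycle counts are illustrative, taken only
# from the rough figures quoted on the slides above (not measured).

def time_to_start_priority_task(draw_remaining, switch_cost,
                                can_preempt_mid_draw):
    """Cycles until the high-priority task begins executing."""
    if can_preempt_mid_draw:
        return switch_cost                    # pause the draw, switch, go
    return draw_remaining + switch_cost       # wait for the draw boundary

DRAW_REMAINING = 50_000  # cycles left in the in-flight draw call

# GCN-like: fine-grained preemption, ~1-cycle context switch via ACEs.
gcn = time_to_start_priority_task(DRAW_REMAINING, 1, True)
# Maxwell-like: draw-boundary preemption, ~1,000-cycle context switch.
maxwell = time_to_start_priority_task(DRAW_REMAINING, 1_000, False)

print(f"GCN-like latency:     {gcn:,} cycles")
print(f"Maxwell-like latency: {maxwell:,} cycles")
```

In this model the switch cost itself matters far less than whether you can preempt mid-draw: waiting out a long draw call dominates the latency, which is why fine-grained preemption is the headline feature.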
 