[Official] Maxwell Discussion Thread

SinOfLiberty

Senior member
Apr 27, 2011
277
3
81
I thought I'd create a thread for the next-gen graphics cards from Nvidia.

Here it is, begin the discussion!

Can't wait for the 880!!
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
Not on 28nm.
After seeing how both companies held the big chips back on us, I expect a GTX 880 to be roughly equal to a GTX 780 Ti, and later a GTX 980 to be as fast as an R9 295X2.
Either way, I don't expect the next gen of cards to be able to handle 4K @ 60FPS. Maybe next-next-gen, and only on 20nm or below.
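A back-of-the-envelope way to see why 4K @ 60FPS is such a jump (a rough Python sketch; real game performance depends on far more than pixel count, so this is only a first-order comparison):

```python
# Rough pixel-throughput comparison: 4K vs 1080p.
# Assumes rendering cost scales roughly with pixels per second,
# which is only a first-order approximation for real games.

def pixels_per_second(width, height, fps):
    return width * height * fps

p1080 = pixels_per_second(1920, 1080, 60)  # ~124 Mpix/s
p4k = pixels_per_second(3840, 2160, 60)    # ~498 Mpix/s

print(f"4K@60 pushes {p4k / p1080:.0f}x the pixels of 1080p@60")  # 4x
```

Four times the pixels of 1080p is why a single next-gen card handling 4K @ 60FPS looks optimistic.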
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Hoping for proper improvements in programmability across most areas of the architecture (ROPs, ALUs, TEX units, etc.).
We really haven't seen new features since Fermi (e.g. texture/framebuffer formats and compression, programmable blending, tessellation improvements).
 

Kippa

Senior member
Dec 12, 2011
392
1
81
At the moment I have a Titan and am waiting for a single gfx card that can handle 4K at circa 60fps. I'll probably have to skip and wait at least two, maybe 3+, generations for a single card to handle it. I don't want a dual-card setup and am willing to wait, though it will probably be a long wait unless there is suddenly a huge change in technology.

What might tempt me to get a new gfx card is if an NVidia one comes with 12GB+ of video RAM, which would be great for 3D rendering in Lightwave + Octane, as you'd easily be able to do very complex scenes with that much video RAM (Octane uses CUDA, btw).
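For a feel of what 12GB buys a GPU renderer, here's a rough budget sketch in Python. The numbers are hypothetical for illustration; a real Octane scene also stores geometry, acceleration structures, and framebuffers alongside textures:

```python
# Rough VRAM budget: how many uncompressed 4096x4096 RGBA8 textures
# fit in 12 GiB? Illustrative only -- real scenes hold much more than
# textures, and renderers often compress or tile texture data.

BYTES_PER_TEXEL = 4                         # uncompressed RGBA8
tex_bytes = 4096 * 4096 * BYTES_PER_TEXEL   # 64 MiB per texture
vram = 12 * 1024**3                         # 12 GiB

print(vram // tex_bytes)  # -> 192 uncompressed 4K textures
```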
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
At the moment I have a Titan and am waiting for a single gfx card that can handle 4K at circa 60fps. I'll probably have to skip and wait at least two, maybe 3+, generations for a single card to handle it. I don't want a dual-card setup and am willing to wait, though it will probably be a long wait unless there is suddenly a huge change in technology.

What might tempt me to get a new gfx card is if an NVidia one comes with 12GB+ of video RAM, which would be great for 3D rendering in Lightwave + Octane, as you'd easily be able to do very complex scenes with that much video RAM (Octane uses CUDA, btw).

they do = Quadro K6000 ;)
 

Kippa

Senior member
Dec 12, 2011
392
1
81
I know the Quadro K6000 does have 12GB, but that costs something like £4,300. If they put 12GB on a mainstream gaming card for somewhere between £600 and £1,000, then I might be interested. :)
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Hoping for proper improvements in programmability across most areas of the architecture (ROPs, ALUs, TEX units, etc.).
We really haven't seen new features since Fermi (e.g. texture/framebuffer formats and compression, programmable blending, tessellation improvements).

That's because we are still on DX11. The API needs to support new features before the IHVs can add them.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
That's because we are still on DX11. The API needs to support new features before the IHVs can add them.
No, just no.
It would be silly to create an API without any actual hardware support and then wait a few years for chip designs.

IHVs can create hardware that goes well beyond current DX API limitations.
Features can then be exposed first through OpenGL and DX extensions (like Intel did with PixelSync).
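Exposing a hardware feature as an extension means the app probes for it at runtime. A minimal sketch of that check (in Python for brevity; the extension string here is made up for illustration, where a real app would get it from glGetString(GL_EXTENSIONS) or query DX caps):

```python
# Sketch of runtime extension detection, the mechanism by which
# features like Intel's PixelSync were exposed before any DX version
# standardized them. The reported string below is illustrative only.

reported = "GL_ARB_tessellation_shader GL_INTEL_fragment_shader_ordering"

def has_extension(ext_string, name):
    # Extension strings are space-separated tokens.
    return name in ext_string.split()

if has_extension(reported, "GL_INTEL_fragment_shader_ordering"):
    print("PixelSync-style ordered fragment shading available")
```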
 

dangerman1337

Senior member
Sep 16, 2010
346
9
81
The only person I fully trust on Maxwell rumors is Eyrines at Beyond3D. He was the first to say that second-generation Maxwell was going to be on 28nm (then everyone else started hearing it), and he is pretty adamant, per his sources, that even Big Maxwell is on 28nm.

While his reports on Fermi were nearly spot on (he said Q1 2010, though it technically slipped into Q2 2010 :p), I'd still be a bit skeptical.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
For me to consider upgrading, the following needs to happen:

1) True next generation games come out with next gen graphics that justify the extra GPU power - not poorly coded console ports with avg. graphics such as Watch Dogs;

2) $1k priced non-TN 32-37 inch 4K monitors with HDMI 2.0, DisplayPort 1.3 launch;

3) GPU that has at least 50% more power than 780Ti and has 6-8GB of VRAM.

Until these things start to happen, I don't see myself upgrading, as my setup breezes through almost any game at 1080p, and I have no interest in going 1440p/1600p as a stopgap.
 

dangerman1337

Senior member
Sep 16, 2010
346
9
81
For me to consider upgrading, the following needs to happen:

1) True next generation games come out with next gen graphics that justify the extra GPU power - not poorly coded console ports with avg. graphics such as Watch Dogs;

2) $1k priced non-TN 32-37 inch 4K monitors with HDMI 2.0, DisplayPort 1.3 launch;

3) GPU that has at least 50% more power than 780Ti and has 6-8GB of VRAM.

Until these things start to happen, I don't see myself upgrading, as my setup breezes through almost any game at 1080p, and I have no interest in going 1440p/1600p as a stopgap.
Next-gen games with nearly maxed-out settings at 4K at, say, 60FPS? I think you'd have to wait for Pascal, TBH.

I think I am going to grab a GM204 and then do a new 4K/VR build in 2016/2017 (4K monitor and a 4K Oculus).
 

QuantumPion

Diamond Member
Jun 27, 2005
6,010
1
76
I have SLI 680s and am eagerly awaiting an upgrade to a single card. Too many games lack SLI support, and apparently it doesn't play well with the Oculus Rift due to increased stutter/lag. However, I don't want to spend hundreds to sidegrade to a single 780 Ti. Hurry up already, Nvidia, grr!
 

Larnz

Senior member
Dec 15, 2010
247
1
76
I have SLI 680s and am eagerly awaiting an upgrade to a single card. Too many games lack SLI support, and apparently it doesn't play well with the Oculus Rift due to increased stutter/lag. However, I don't want to spend hundreds to sidegrade to a single 780 Ti. Hurry up already, Nvidia, grr!

I am in exactly the same boat as you, really hoping the 880 will surpass 680 SLI so I can drop down to one card and ditch SLI.

I'm also conflicted between getting a ROG Swift (when available) for 120Hz 1440p with G-Sync, or getting a 60Hz 4K monitor without G-Sync. I'm thinking G-Sync will be too hard to pass up, and 4K will require extra grunt that may hinder me with only one 880 or my current setup.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I don't care what process node it's on if performance, TDP, and prices are significantly improved over current cards.

No way will prices be lower. I fully expect them to go up or stay the same.
 

tollingalong

Member
Jun 26, 2014
101
0
0
For me to consider upgrading, the following needs to happen:

1) True next generation games come out with next gen graphics that justify the extra GPU power - not poorly coded console ports with avg. graphics such as Watch Dogs;

2) $1k priced non-TN 32-37 inch 4K monitors with HDMI 2.0, DisplayPort 1.3 launch;

3) GPU that has at least 50% more power than 780Ti and has 6-8GB of VRAM.

Until these things start to happen, I don't see myself upgrading as my setup breezes through almost any game at 1080p; and I have no interest in going 1440/1600p as a stop gap.

Pretty much. I'd like to upgrade and have the cash around but I don't see anything worth buying.
 

alcoholbob

Diamond Member
May 24, 2005
6,380
448
126
Quad Titan Blacks only hit about 49 fps at 4K in Crysis 3, and that isn't even with all the details turned fully up. It's going to require a flagship 14nm GPU before we can even think about hitting 60fps at 4K maxed in 2012/2013-era games, unless the majority of our future progress comes via drivers/software rather than hardware. Realistically, we are probably looking at 30-35fps at 4K in new games on a single card for the coming decade.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Next gen games with nearly maxed out settings at 4K at say 60FPS? I think you'd have to wait for Pascal TBH.

You are probably right. But if enough next-gen games come out and hammer dual 7970s at 1080p, I'll consider upgrading. I don't think I would consider the 880, assuming it is the rumored 2560-core/256-bit card, only to see NV release a 4000-core/512-bit card 15 months later. Without bitcoin mining sponsoring GPU upgrades, I won't be paying $500 for a midrange GM204. If NV again pulls the midrange ---> high-end stunt, I'll keep waiting for the 550mm2 GM210 cards.

I have to see what the next wave of games brings. So far we haven't seen anything even approach Crysis 3/Metro LL since the PS4/XB1 came out. The Witcher 3 is about the only shining light, unless Dragon Age: Inquisition surprises.
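The 256-bit vs 512-bit point matters mostly for memory bandwidth. A quick Python sketch of the standard theoretical-bandwidth formula (7 Gbps is a typical 2014-era GDDR5 per-pin rate used here for illustration; actual products vary):

```python
# Theoretical memory bandwidth in GB/s:
#   bus width (bits) / 8 bits-per-byte * per-pin data rate (Gbps)

def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 7))  # 224.0 GB/s (rumored GM204-class bus)
print(bandwidth_gb_s(512, 7))  # 448.0 GB/s (a hypothetical 512-bit part)
```

Double the bus width at the same memory clock doubles the theoretical bandwidth, which is why a later 512-bit card would make an early 256-bit buy sting.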
 

dangerman1337

Senior member
Sep 16, 2010
346
9
81
You are probably right. But if enough next-gen games come out and hammer dual 7970s at 1080p, I'll consider upgrading. I don't think I would consider the 880, assuming it is the rumored 2560-core/256-bit card, only to see NV release a 4000-core/512-bit card 15 months later. Without bitcoin mining sponsoring GPU upgrades, I won't be paying $500 for a midrange GM204. If NV again pulls the midrange ---> high-end stunt, I'll keep waiting for the 550mm2 GM210 cards.

I have to see what the next wave of games brings. So far we haven't seen anything even approach Crysis 3/Metro LL since the PS4/XB1 came out. The Witcher 3 is about the only shining light, unless Dragon Age: Inquisition surprises.
I think there won't be a 512-bit monster until Pascal; I severely doubt Nvidia is going to bother with 20nm (a lot of people seem adamant there will be a Kepler-style release refreshed with a 20nm shrink, but there's no evidence of this). Eyrines at Beyond3D (he got GM204 on 28nm first) has said GM200 will be on 28nm, but has said nothing of a GM210 or a 20nm shrink at all. I'll be grabbing a GM204 GPU, as I play at 1920x1080 120Hz, and I'm very much looking forward to The Witcher 3 and some 2015/2016 releases on PC.

I think Pascal will be a good jump for 4K performance due to 3D memory (and Volta as well, due to memory stacked on top of the GPU).