Mantle's Role In The Lifespan Of A PC GPU And 4K Gaming


seitur

Senior member
Jul 12, 2013
383
1
81
28nm GPUs will not be powerful enough for 4K-era gaming, Mantle or not.

I doubt even 20nm will.

You're probably going to wait at least until the 14/16nm GPU nodes before the whole ecosystem (4K screens cheap enough for the average Joe, common connection standards that allow 4K at 60 or 120 Hz, etc.) switches to 4K.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I guess we'll see in December with the Mantle update for BF4.
Personally it's a key event for me since I don't have a GPU; if the Radeons trample the GTXs with it, then I'm going Radeon easily.

I assume it's going to be a deciding factor for many people as well, since it's such a high-profile title.

For one developer only... if anyone else were using it, surely AMD would have announced that as well.
 

CakeMonster

Golden Member
Nov 22, 2012
1,621
798
136
Same with the 512-bit bus and the insane amount of memory bandwidth & memory these new cards have.

The 300 GB/s number they put out there has been achievable with slightly overclocked 79xx cards for almost two years. It cannot be labeled an "insane amount of memory bandwidth" anymore.
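
For reference, a quick back-of-the-envelope sketch of where those numbers come from. GDDR5 bandwidth is just bus width divided by eight, times the effective data rate; the memory clocks used here are illustrative, not official specs.

```python
# Back-of-the-envelope GDDR5 bandwidth: bus width (bits) / 8 * effective data rate (GT/s) = GB/s.
# Clock figures below are illustrative examples, not official specifications.
def bandwidth_gbps(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

print(bandwidth_gbps(384, 6.0))   # 384-bit bus at 6.0 GT/s (7970 GHz Edition class): 288 GB/s
print(bandwidth_gbps(384, 6.4))   # same bus with a mild memory overclock: ~307 GB/s
print(bandwidth_gbps(512, 5.0))   # a 512-bit bus at 5.0 GT/s: 320 GB/s
```

So a slightly overclocked 384-bit card really does sit in the same ballpark as the quoted 300 GB/s figure.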
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't think Mantle has any useful bearing on 4K at all. All it does is reduce the overhead of draw calls, and that is completely resolution independent. If you were CPU limited, and the limitation was the draw calls (very rare today, possibly no games at all), then Mantle will improve the frame rate, but it doesn't matter what the resolution is. It might help CPUs with weaker single-threaded performance quite a bit, or as an API finally support genuinely multithreaded submission, but practically I want to see the impact it makes before I pass any judgement on it. It's a real problem to be adding a proprietary API that only works on one vendor's cards at this point, so unless it really excels it's going to disappear as quickly as it arrived.
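
To put that in concrete terms, here is a toy model; every per-call and per-pixel cost below is invented purely for illustration, nothing measured. CPU cost scales with draw calls, GPU cost scales with pixel count, and the frame takes as long as the slower side.

```python
# Toy model: cutting draw-call overhead (what Mantle targets) only matters while
# the CPU side is the bottleneck. All constants are made-up illustrative values.
def frame_time_ms(draw_calls, us_per_call, pixels, gpu_ns_per_pixel):
    cpu_ms = draw_calls * us_per_call / 1000.0   # CPU: draw-call submission cost
    gpu_ms = pixels * gpu_ns_per_pixel / 1e6     # GPU: per-pixel shading cost
    return max(cpu_ms, gpu_ms)                   # frame is gated by the slower side

for res, pixels in [("1080p", 1920 * 1080), ("4K", 3840 * 2160)]:
    heavy_api = frame_time_ms(5000, 4.0, pixels, 6.0)   # high per-call overhead
    thin_api = frame_time_ms(5000, 1.0, pixels, 6.0)    # reduced per-call overhead
    print(f"{res}: {heavy_api:.1f} ms -> {thin_api:.1f} ms")
```

With these made-up numbers the thin API helps a lot at 1080p (CPU bound) and does nothing at 4K (GPU bound), which is the point: the draw-call saving is resolution independent, the GPU load is not.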
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,107
9,360
136
If they can pull this off (big if), this will be the boon PC gaming has been waiting for. IGPs will be viable low-end solutions, while even mid-range cards will be able to push out 30 fps at 4K. Process node shrinks are yielding diminishing returns and the time between refreshes is getting longer and longer. Mantle might really shake things up. One can hope.
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
My bet: the 290X will play console ports from 2011 or older at 4K with max details/no AA at more than 30 FPS.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
28nm GPUs will not be powerful enough for 4K-era gaming, Mantle or not.

I doubt even 20nm will.

You're probably going to wait at least until the 14/16nm GPU nodes before the whole ecosystem (4K screens cheap enough for the average Joe, common connection standards that allow 4K at 60 or 120 Hz, etc.) switches to 4K.

Wow... despite every review and write-up on the Asus 4K display showing the exact opposite, this still gets stated in every 4K gaming thread by various people.
 

uclaLabrat

Diamond Member
Aug 2, 2007
5,632
3,045
136
28nm GPUs will not be powerful enough for 4K-era gaming, Mantle or not.

I doubt even 20nm will.

You're probably going to wait at least until the 14/16nm GPU nodes before the whole ecosystem (4K screens cheap enough for the average Joe, common connection standards that allow 4K at 60 or 120 Hz, etc.) switches to 4K.
What? I'm running 1600p on a single 6970... 7970 CrossFire or 780 SLI should be plenty for 4K.
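
Rough pixel math behind that; the assumption that required GPU power scales roughly with pixel count is a simplification.

```python
# Pixel counts for the two resolutions in question; 4K UHD is roughly double 2560x1600.
resolutions = {"2560x1600": 2560 * 1600, "3840x2160 (4K UHD)": 3840 * 2160}
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels")

ratio = resolutions["3840x2160 (4K UHD)"] / resolutions["2560x1600"]
print(f"4K / 1600p pixel ratio: {ratio:.2f}x")  # ~2x, hence "two cards should be plenty"
```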
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
TrueAudio chip.


lulz


There is a reason nobody is talking about the new hardware: it's unlikely AMD will beat Titan in DX11. Plus, the new revised GCN chips slot in right above what you can currently get from AMD in price/performance terms.

Also, it should be quite obvious from the terminology they used what they tried to focus on: audio chips, the fastest single GPU they've made, and a 512-bit bus for 4K (which I think we all know these cards, even Titan, lack the power to drive without sacrificing IQ/fluidity).
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Sometimes the pace of human progress in some areas impresses and astounds; other times it gets really annoying. Surely they could make next-gen consoles a little modular (like PCs) so they could soup up the CPU/GPU bits the way they can add hard drives, etc. Strikes me that they're deliberately slowing the pace to milk emerging markets forever.

I don't know what the point of this post is; thanks for defeating the point of a console?

GPU performance has gone through the roof since Crysis in 2007, yet I don't see any game that looks remotely better in those six years. So much for the vaunted PC master race and their overwhelming brute-force power. Huh, all that crying about the PS3/360 dragging us down, or was it the lack of investment in pushing graphics for a niche market? It's economics at work; deal with it.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
The advent of 4K is kind of a nightmare situation for me; I say screw more pixels when we're still woefully behind on speed/motion clarity.

I need to see officially supported 1440p at 120 Hz (with LightBoost) first before I even consider the next progression to 4K, and I fear the glitz and glamour of 4K will wrongly overshadow motion clarity and could do quite a bit of harm to the slow progress we've been seeing with the development of 120+ Hz.

The problem is that LCD technology is just shit, period. Better motion or not, the contrast still absolutely blows on even the best LCD sets. The main problems they addressed to begin with were size and energy consumption compared to CRTs. They were never a better display technology from a picture-quality standpoint, especially when they first hit the market. It's been over a decade since they first started becoming affordable, and they still suck.

We've seen the best we're going to see out of the technology. 4K is an easy improvement. Fixing the shortcomings of LCD technology would have happened by now if it were going to happen at all. Everyone is focusing their efforts on OLED at this point.
 

NIGELG

Senior member
Nov 4, 2009
852
31
91
The problem is that LCD technology is just shit, period. Better motion or not, the contrast still absolutely blows on even the best LCD sets. The main problems they addressed to begin with were size and energy consumption compared to CRTs. They were never a better display technology from a picture-quality standpoint, especially when they first hit the market. It's been over a decade since they first started becoming affordable, and they still suck.

We've seen the best we're going to see out of the technology. 4K is an easy improvement. Fixing the shortcomings of LCD technology would have happened by now if it were going to happen at all. Everyone is focusing their efforts on OLED at this point.
CRT was nice while it lasted.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I like plasma. It has its shortcomings too. It isn't as bright and can have image retention (not good for PC desktop use, perhaps), but I think the quality of the image is superior to LCD.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
CRT was nice while it lasted.

I'm glad it's dead for most mainstream computing. :)

I'd need to run vents to the exterior of my house with two 27-inch CRTs on my desk. Oh, and I'd need a different desk... I don't think this one could support 150 lbs of monitors.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I like plasma. It has its shortcomings too. It isn't as bright and can have image retention (not good for PC desktop use, perhaps), but I think the quality of the image is superior to LCD.

Yeah, image retention is the main reason. The newer ones get plenty bright, though. Honestly, IR isn't even an issue with most of them, but it can happen, which is why they aren't used as computer monitors. I have a Panasonic ST30 and an ST60 that I've abused with an HTPC and hours and hours of gaming without so much as a hint of IR.

There's honestly no reason anyone should own an LCD TV with the plasmas that are currently available. The ST60 is absolutely amazing. It doesn't show a hint of IR, it gets nice and bright, and the blacks are superb.

I do have to admit that owning a couple of plasma TVs is what has caused me to hate LCDs so much. The black levels, especially on IPS screens, are just absolute garbage.
 