[theverge] AMD won the next-gen console war, and PC gamers could reap the reward


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Obviously if a game is made with OpenGL, it will render in OpenGL. The problem is AMD and Nvidia do not have proper OpenGL support on the desktop. That would have to change if all the games start using OpenGL.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Consoles should have packed in 16GB of RAM. Take away the OS overheads and you'd be left with 5GB or 6GB. If these are going to stagnate for another 7 years, more RAM would have been a better idea.
 
Aug 11, 2008
10,451
642
126
Kinda bummed about being frozen out of sports this generation as well. I understand the benefits of a unified memory pool, but does anyone truly believe a decent gaming PC couldn't handle the sports titles?

Yeah, this statement pegged my BS detector. If this is true, why have there not been any decent EA sports games on the PC for years?

I also agree that any decent gaming PC could certainly handle the sports games.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
The problem with PC sports games is that they don't require us to buy a new game each year for an updated roster; we have free mods for that instead of wasting $60 a year.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Consoles should have packed in 16GB of RAM. Take away the OS overheads and you'd be left with 5GB or 6GB. If these are going to stagnate for another 7 years, more RAM would have been a better idea.

PS4 uses about 1GB for OS, with 7GB left over. XB1 uses about 3GB for OS with 5GB left over. How are you arriving at 5-6GB left-over after OS overhead on a hypothetical console with 16GB of RAM?
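To put rough numbers on that point, here's a minimal sketch (the OS reservations are the approximate figures quoted in this thread, not official specs); a 16GB console would leave 13-15GB, not 5-6GB:

```python
# Back-of-the-envelope: RAM left for games after the OS reservation.
# The OS figures are the rough ones cited in this thread, not official specs.
configs = {
    "PS4 (8GB total, ~1GB OS)": (8, 1),
    "XB1 (8GB total, ~3GB OS)": (8, 3),
    "hypothetical 16GB console, ~1GB OS": (16, 1),
    "hypothetical 16GB console, ~3GB OS": (16, 3),
}

for name, (total_gb, os_gb) in configs.items():
    print(f"{name}: {total_gb - os_gb}GB left for games")
```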
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The problem with PC sports games is that they don't require us to buy a new game each year for an updated roster; we have free mods for that instead of wasting $60 a year.

Sports games on the PC? Please, it's a waste of time.

"But while that engine will power EA's sports games on the PlayStation 4 and Xbox One, the company says the PC edition of FIFA 14 will not take advantage of it...."

"EA ran into a similar problem with FIFA games on the current generation of consoles; the console versions used a more advanced engine than their PC counterparts from 2005 through 2009."
http://arstechnica.com/gaming/2013/06/no-pc-port-for-eas-high-end-sports-engine/
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
PS4 uses about 1GB for OS, with 7GB left over. XB1 uses about 3GB for OS with 5GB left over. How are you arriving at 5-6GB left-over after OS overhead on a hypothetical console with 16GB of RAM?

Yeah, I'll also add that consoles (you'd think this is obvious, but still) require far less in terms of resources (e.g. memory) than a PC. PCs are all-encompassing devices serving many purposes, with a lot of tasks loaded into memory; consoles are not - they are only for entertainment. When I do a minimal boot into Windows, there are a gazillion processes loaded into memory that I probably don't need for gaming, but because of PC functionality - and Windows being a general-purpose OS - they are loaded anyway. Not so much the case with a console. I don't think 1GB will be the case, either - probably much less.

There's also the fact that memory management on both upcoming consoles is handled much differently than on the PC - the RAM is unified, with no distinction between system memory and VRAM. That's definitely a good thing, since even the worst GDDR3/5 memory is significantly faster in terms of latency/access than DDR3.
 

psoomah

Senior member
May 13, 2010
416
0
0
More particularly, Kaveri and 8xxx-series GPU purchasers will reap the rewards.

Expect all major next-gen engines and AAA games to be highly optimized for future HSA-capable APUs and GPUs.

Expect much poorer, hit-and-miss optimization for Intel and Nvidia.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That's definitely a good thing, since even the worst GDDR3/5 memory is significantly faster in terms of latency/access than DDR3.

Do you have any sources for this? I had a spirited debate with a few guys about this in another thread a few weeks ago, and I told them that DDR3 has lower latency than GDDR5. I told them this not because I had read any solid engineering sources, but because that was the putative view held by many on computer-oriented forums across the net. Looking at the latency timings for GDDR5 on a video card, they're MUCH higher than DDR3's. But I don't know if that's a function of the memory itself or of the memory access pattern of the GPU, which tends to be more focused on bandwidth. Since GPUs are inherently parallel, they aren't affected by latency as much as a CPU.

I searched for a good source, but I was never able to find any.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
You know what is really going to happen? Developers will spend millions to build a console game.. then because it will be so easy.. they will simply port it to the pc. So we will be stuck with console games on PC. They won't bother.. I mean why should they.. look.. you get the same experience as the console guy!
The ultimate goal will be to push people to consoles.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You know what is really going to happen? Developers will spend millions to build a console game.. then because it will be so easy.. they will simply port it to the pc. So we will be stuck with console games on PC. They won't bother.. I mean why should they.. look.. you get the same experience as the console guy!
The ultimate goal will be to push people to consoles.

period period period dot dot dot dot?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Do you have any sources for this? I had a spirited debate with a few guys about this in another thread a few weeks ago, and I told them that DDR3 has lower latency than GDDR5. I told them this not because I had read any solid engineering sources, but because that was the putative view held by many on computer-oriented forums across the net. Looking at the latency timings for GDDR5 on a video card, they're MUCH higher than DDR3's. But I don't know if that's a function of the memory itself or of the memory access pattern of the GPU, which tends to be more focused on bandwidth. Since GPUs are inherently parallel, they aren't affected by latency as much as a CPU.

I searched for a good source, but I was never able to find any.

Latency was the wrong terminology; I meant peak output. If you look up the JEDEC specifications, the output (Gbit/s) of GDDR5 vs DDR3 is heavily in favor of GDDR5, which is why GPUs don't use DDR3 - DDR3's bandwidth is far lower. GDDR5 has more than twice the bandwidth of DDR3, depending on clock speed. That 2-3x higher bandwidth does come at a cost, though: it is absurdly expensive. By contrast, DDR3 is dirt cheap.
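For anyone who wants to sanity-check that gap, the standard peak-bandwidth arithmetic is simple; the per-pin rates and bus widths below are typical illustrative configurations, not any specific product's figures:

```python
# Peak memory bandwidth = per-pin data rate (Gbit/s) * bus width (bits) / 8.
# Example configurations only; real parts vary by speed grade and bus width.

def peak_bandwidth_gb_s(gbit_per_pin: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s."""
    return gbit_per_pin * bus_width_bits / 8

# DDR3-1600 in dual channel (2 x 64-bit = 128-bit effective bus)
print(peak_bandwidth_gb_s(1.6, 128))   # 25.6 GB/s

# GDDR5 at 5.5 Gbit/s per pin on a 256-bit bus
print(peak_bandwidth_gb_s(5.5, 256))   # 176.0 GB/s
```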
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I don't want to be that guy, but uh, isn't all of this a moot point, since there is no way the AMD CPU is going to be able to take advantage of all the bandwidth available to it?

Like probably a 10th of its bandwidth?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Latency was the wrong terminology; I meant peak output. If you look up the JEDEC specifications, the output (Gbit/s) of GDDR5 vs DDR3 is heavily in favor of GDDR5, which is why GPUs don't use DDR3 - DDR3's bandwidth is far lower. GDDR5 has more than twice the bandwidth of DDR3, depending on clock speed. That 2-3x higher bandwidth does come at a cost, though: it is absurdly expensive. By contrast, DDR3 is dirt cheap.

Yeah, but what would be the point of using GDDR5 on a desktop CPU when it can't even utilize that bandwidth? Also, CPUs tend to be more restricted by latency, and if I'm right, DDR3 has much lower latency than GDDR5.

I'm still in favor of hybrid memory types for gaming rigs rather than unified. Unified may work best for consoles or low-end computers, but for the high end, it's better to have lower-latency RAM for the CPU and higher-bandwidth RAM for the GPU.

The Xbone is using eSRAM, which should give it the advantage in latency compared to the PS4's GDDR5.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,193
2
76
Yeah, but what would be the point of using GDDR5 on a desktop CPU when it can't even utilize that bandwidth? Also, CPUs tend to be more restricted by latency, and if I'm right, DDR3 has much lower latency than GDDR5.

I'm still in favor of hybrid memory types for gaming rigs rather than unified. Unified may work best for consoles or low-end computers, but for the high end, it's better to have lower-latency RAM for the CPU and higher-bandwidth RAM for the GPU.

The Xbone is using eSRAM, which should give it the advantage in latency compared to the PS4's GDDR5.

The point of unified GDDR5 on a desktop would be to speed up the iGPU immensely.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The point of unified GDDR5 on a desktop would be to speed up the iGPU immensely.

Hence why I said unified GDDR5 would be best for consoles and low-end computers that use IGPs or APUs.

For high-end desktops, though, it wouldn't be good.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
I don't know if this is a serious post.

Let me guess: you think there is only one open graphics language? :biggrin:

Sony uses their own GL. That and the different software stack make it much harder to port than an Xbox game.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,801
3,249
136
I don't want to be that guy, but uh, isn't all of this a moot point, since there is no way the AMD CPU is going to be able to take advantage of all the bandwidth available to it?

Like probably a 10th of its bandwidth?

It's more like 8.8%, but that's not the part that really matters. What matters is the interconnects between the CPU and GPU, and the way they can be used to let the most effective execution hardware work on the data it makes sense for it to handle.
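To illustrate the kind of ratio being thrown around here (the CPU-side figure is an assumed practical ceiling for the sake of example, not a measured number):

```python
# Illustrative only: what share of a unified GDDR5 pool a CPU could consume.
total_bandwidth_gb_s = 176.0   # e.g. 256-bit GDDR5 at 5.5 Gbit/s per pin
cpu_usable_gb_s = 15.5         # assumed practical CPU-side ceiling (not measured)

print(f"CPU share: {cpu_usable_gb_s / total_bandwidth_gb_s:.1%}")  # ~8.8%
```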
 

NTMBK

Lifer
Nov 14, 2011
10,247
5,043
136
Obviously if a game is made with OpenGL, it will render in OpenGL. The problem is AMD and Nvidia do not have proper OpenGL support on the desktop. That would have to change if all the games start using OpenGL.

Um wut. One of the most popular desktop games, Minecraft, is OpenGL.
 

NTMBK

Lifer
Nov 14, 2011
10,247
5,043
136
Do you have any sources for this? I had a spirited debate with a few guys about this in another thread a few weeks ago, and I told them that DDR3 has lower latency than GDDR5. I told them this not because I had read any solid engineering sources, but because that was the putative view held by many on computer-oriented forums across the net. Looking at the latency timings for GDDR5 on a video card, they're MUCH higher than DDR3's. But I don't know if that's a function of the memory itself or of the memory access pattern of the GPU, which tends to be more focused on bandwidth. Since GPUs are inherently parallel, they aren't affected by latency as much as a CPU.

I searched for a good source, but I was never able to find any.

I thought that the latencies were defined in numbers of clock cycles? So GDDR5 looks a lot higher, but because it has such a high clock speed, the latencies (measured in nanoseconds) are actually roughly the same.
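A back-of-the-envelope conversion shows the idea; the clocks and CAS values below are illustrative round numbers, not exact datasheet timings:

```python
# CAS latency in wall-clock time = cycles / clock frequency.
# Illustrative round numbers, not exact datasheet timings.

def cas_latency_ns(cas_cycles: int, clock_mhz: float) -> float:
    """Convert a CAS latency in clock cycles to nanoseconds."""
    return cas_cycles / clock_mhz * 1000

# DDR3-1600: 800 MHz clock, CL11
print(cas_latency_ns(11, 800))     # 13.75 ns

# GDDR5 at 6 Gbit/s per pin: ~1500 MHz command clock, CL ~20 (assumed)
print(cas_latency_ns(20, 1500))    # ~13.3 ns -> roughly the same wall-clock time
```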
 

NTMBK

Lifer
Nov 14, 2011
10,247
5,043
136
I don't want to be that guy, but uh, isn't all of this a moot point, since there is no way the AMD CPU is going to be able to take advantage of all the bandwidth available to it?

Like probably a 10th of its bandwidth?

The GPU will certainly be able to use that bandwidth!
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Um wut. One of the most popular desktop games, Minecraft, is OpenGL.

So... that doesn't mean they support OpenGL well. The current strategy used by AMD and Nvidia is that their desktop cards get great DirectX support, while OpenGL gets very little attention. This may be in part to ensure their professional cards stay superior at professional applications.

When a game is released that needs OpenGL, they update their drivers to make sure that game works reasonably well.

You may also notice that a lot of people get very poor performance in Minecraft, even on very powerful systems. This may very well be why.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
Isn't it the graphics chip that generates the data, so the CPU's share of the bandwidth remains only partially used, processing the OS data stream?