Nvidia and AMD should stand up against these stupid console ports


exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I'd rather have Crysis all over again than this. At least Crysis pushed the envelope. High/Ultra is just a meaningless label for settings. If they made a game that only ran on medium on anything but dual Titan X, but still looked better than anything out today, I'd be happy playing it on medium.

It just seems that any 'successful' PC title, other than RTSs (from a dev like Paradox), does an immediate, or close to immediate, cash-grab on the console side. We saw that with Crytek and likely with CD Projekt on TW3. The potential money is just so high that cross-platform is too tempting, and most devs don't properly code the game to truly scale well.
 

NTMBK

Lifer
Nov 14, 2011
10,239
5,024
136
Game makers can't win. If they go all-out, balls to the wall, demanding top end cards, people moan about how they did a lazy console port, and no game could possibly really need 8GB of VRAM! If they limit things, keep it sensible, and lower the system requirements, people complain that it's artificially limited by consoles.

Stop trying to tell developers how to do their jobs. If you think you can do a better job, go work for a game developer.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
If we look at the 360/PS3 generation, the "high-end" PC versions were typically indeed console ports.

The PC exclusives were usually DX9-only games and had poor multicore support.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I have no problem with multi-platform development; developers have to spend resources where most of the revenue for their titles is. As long as they decide the PC platform is viable enough to offer their titles on, I'm pleased overall.

My idealism: I would like to see more developers focus fidelity on the PC platform, based on its hardware advantages, at least at times. That's not really realistic right now, since that focus costs precious resources, but there is hope, as PC software revenue is gaining traction.

Not ideal, but it does provide some fidelity for their respective customers: AMD and nVidia are both instrumental in providing fidelity and performance, going beyond what the developer may have intended for their PC game. It creates awareness for their brands and for the PC platform overall. Without AMD's and nVidia's talents and assistance there would be less focus on PC fidelity overall.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Not trying to stir the pot here, but do MS and Sony really pay these devs to 'tone down graphics' on the PC? It seems this is taken as the truth, but I guess I have not seen an actual case of this, unless maybe I missed it?

I get that the devs are encouraged, and maybe even provided incentives to push the consoles as far as possible, but are they actively handicapping the PC or is that just an opportunity cost in pushing the PC version?

I want to know this too. I don't see why MS or Sony suddenly care that a PC port looks better after YEARS of 360/PS3 ports looking better. The other side is that people rage about the game looking the same while not mentioning the PC version runs at 60fps (instead of 30) or runs at full 1080p/4K instead of 900p or something.


I see this differently than the OP; I see it as kind of a golden age of PC gaming.

I mean, sure, these are mostly console ports, but so many games now get ported because the consoles are so similar to the PC. Uncharted/Halo don't, but so many console "exclusives" like Titanfall or SF5 end up on the PC too eventually. That, plus a real standard for controllers (the 360 controller) that will be improved (when the Xbox One wireless adaptor is released), makes PCs more living-room friendly than they have ever been.

Heck, you can pretty much MAKE a better Xbox One nowadays and get most of the same game library (often cheaper thanks to Steam sales) with the same controller! Thanks to Steam you get a console-like GUI with games that have full controller support. I personally don't have a new console yet because PC gaming is good enough.

Is graphics everything?
 
Mar 10, 2006
11,715
2,012
126
Game makers can't win. If they go all-out, balls to the wall, demanding top end cards, people moan about how they did a lazy console port, and no game could possibly really need 8GB of VRAM! If they limit things, keep it sensible, and lower the system requirements, people complain that it's artificially limited by consoles.

Stop trying to tell developers how to do their jobs. If you think you can do a better job, go work for a game developer.

:thumbsup:
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Game makers can't win. If they go all-out, balls to the wall, demanding top end cards, people moan about how they did a lazy console port, and no game could possibly really need 8GB of VRAM! If they limit things, keep it sensible, and lower the system requirements, people complain that it's artificially limited by consoles.

Stop trying to tell developers how to do their jobs. If you think you can do a better job, go work for a game developer.

Pretty much. People sit on forums and complain; a good portion of the complainers have no intention of buying the product, yet their opinions need to be validated.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I want to know this too. I don't see why MS or Sony suddenly care that a PC port looks better after YEARS of 360/PS3 ports looking better. The other side is that people rage about the game looking the same while not mentioning the PC version runs at 60fps (instead of 30) or runs at full 1080p/4K instead of 900p or something.

Perhaps due to the starting point of consoles.

[chart: console GPU performance at launch relative to contemporary PC GPUs]


The current consoles started out way below the bar, even by their own expectations. The Xbox 360 and PS3 were pretty good graphics-wise at launch; the Xbox 360 even set the standard for graphics.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Perhaps due to the starting point of consoles.

[chart: console GPU performance at launch relative to contemporary PC GPUs]


The current consoles started out way below the bar, even by their own expectations. The Xbox 360 and PS3 were pretty good graphics-wise at launch; the Xbox 360 even set the standard for graphics.

Trying to explain this to the console crowd always ends up with "graphics don't matter!" until the "X-game downgrade mega thread" shows up at popular NeoGAF.

Once the crew at a smaller forum I belong to learned of the leaked console specs, we berated it until they launched. But the console half said we didn't know jack because "consoles can utilize hardware better than PC." Sure, but this hardware is already obsolete - don't expect miracles from it.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
I want to know this too. I don't see why MS or Sony suddenly care that a PC port looks better after YEARS of 360/PS3 ports looking better. The other side is that people rage about the game looking the same while not mentioning the PC version runs at 60fps (instead of 30) or runs at full 1080p/4K instead of 900p or something.


I see this differently than the OP; I see it as kind of a golden age of PC gaming.

I mean, sure, these are mostly console ports, but so many games now get ported because the consoles are so similar to the PC. Uncharted/Halo don't, but so many console "exclusives" like Titanfall or SF5 end up on the PC too eventually. That, plus a real standard for controllers (the 360 controller) that will be improved (when the Xbox One wireless adaptor is released), makes PCs more living-room friendly than they have ever been.

Heck, you can pretty much MAKE a better Xbox One nowadays and get most of the same game library (often cheaper thanks to Steam sales) with the same controller! Thanks to Steam you get a console-like GUI with games that have full controller support. I personally don't have a new console yet because PC gaming is good enough.

Is graphics everything?

This, this, and this again. Graphics are pretty, but they are not the primary driver of game sales.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I could have sworn that The Witcher 3 would have Hairworks for PC and a few other settings, not on the console. Many other games, like Dragon Age Inquisition also had a few higher settings on the PC, not on the console. While they were designed 1:1, they are really more like 1:1+.

I really am not bothered by the graphics being so similar, with only a few added features for the PC. What bothers me is the 1:1 UI and game play. Consoles are limited by their controls, and game decisions have to be altered to work with them, often dumbing down games/genres so that consoles can play them, leaving PC users with a poorer game than it should have been.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I could have sworn that The Witcher 3 would have Hairworks for PC and a few other settings, not on the console. Many other games, like Dragon Age Inquisition also had a few higher settings on the PC, not on the console. While they were designed 1:1, they are really more like 1:1+.

I really am not bothered by the graphics being so similar, with only a few added features for the PC. What bothers me is the 1:1 UI and game play. Consoles are limited by their controls, and game decisions have to be altered to work with them, often dumbing down games/genres so that consoles can play them, leaving PC users with a poorer game than it should have been.

Exactly.

Consoleitis runs way deeper than mere graphics. Notice the many 3rd-person-over-the-shoulder games. Notice the slow panning and unresponsive camera controls. Notice poorly mapped controls, or worse, controls that can't be rebound. Notice your PC game with prompt boxes that say "Press A", where the A is a green Xbox controller 'A'. Notice the scope of games being reduced because consoles can't handle long draw distances.

Then there are the things you don't notice. Ambitious games that never escape the drawing board because consoles don't have the grunt. Interesting AI or pathfinding that is too costly to implement on console CPUs.

Anyone who thinks it's "just graphics" is very poorly informed.
 

5150Joker

Diamond Member
Feb 6, 2002
5,559
0
71
www.techinferno.com
It's just about money, nothing new.

If PC were the major revenue source, developers would give it more attention.

I respect devs that still focus on the PC and, even though it's a cross-platform game, design it for PC first and then tone it down for consoles.

There is more than enough money to be made on the PC if the developer spends time on it. There's nothing to preclude them from building a game on the PC first and then optimizing it later for consoles so it works seamlessly across all platforms. Even GTA V, which was built for consoles, had optimizations done pretty well for the PC release and as a result, they made about $100 million in sales the first 24 hours.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
I really am not bothered by the graphics being so similar, with only a few added features for the PC. What bothers me is the 1:1 UI and game play. Consoles are limited by their controls, and game decisions have to be altered to work with them, often dumbing down games/genres so that consoles can play them, leaving PC users with a poorer game than it should have been.

Anyone who thinks it's "just graphics" is very poorly informed.
^ This has been my primary gripe with "consolitis" too. Not the GFX, but the very obvious, sometimes in-your-face way that previously solid K+M input (full rebindability, use of the F1-F12 and 1-0 keys) and a UI that looks decent at 2 ft have ended up as: designed for a controller with K+M not even play-tested, a mouse that doesn't work in menus (or a "weapon wheel" with variable acceleration due to a simulated analogue stick), ugly size-36 "designed for TV" font UIs, half the controls "contextualized", or even long-term features of a franchise completely cut out, changing its nature, because they ran out of controller buttons (but rather than admit that limitation, we get a trashy excuse catch-phrase like "evolved gameplay"...). The Witcher 3's texture downgrades are the least of my worries in modern cross-platform games.

---

As for this thread and half a dozen "Witcher 3 = Epic fail!" style threads, here's the best thing to do to avoid future "issues":

When you hear about a game you like, make a note of its estimated launch date, bookmark it, then completely "switch off" from the hype machine. Wait until it gets launched, then read reviews (not the 10/10 or 0/10 Metacritic shills and trolls, but the sensible, well-written ones going into detail about exactly what's right and wrong). You might want to wait a few weeks anyway until it's patched and actually playable. Then, and only then, make a decision to buy it or not. Either way, your decision will be informed and based on reviews, not uninformed and hanging onto someone else's hype machine for months or years on end. The same goes for "gaming community sites": they are best used to compile a summary list of games that look interesting, to be researched close to launch time, not as a "fix" for an endless craving to be constantly bombarded with all-over-the-place speculation month after month, leading to completely skewed expectations and "inverse education" (the more you read, the less accurate your understanding of the final release will be versus someone who isn't "following" it at all). ;)
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I'm hoping DX12 ignites PC gaming in a big way. I'm optimistic at least.

If it delivers like it's supposed to, it will bring acceptable gaming performance to a much higher percentage of PCs and laptops.
 

Annisman*

Golden Member
Aug 20, 2010
1,918
89
91
I think the real disappointment is seeing another (good) PC-centered developer like CDPR joining the masses of other developers in the way they release their games. Like someone else said, the releases are not 1:1, they are more like 1:1+; we do get extra benefits, but those of us who spend a lot of money on our PCs hope for more than that. Our only hope is that the few PC-only developers like Tripwire Interactive continue to push the bar with PC-only releases, and as a whole we PC lovers should applaud them when they do and bring them excellent sales to justify going the extra mile for us.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
Not trying to stir the pot here, but do MS and Sony really pay these devs to 'tone down graphics' on the PC? It seems this is taken as the truth, but I guess I have not seen an actual case of this, unless maybe I missed it?

I get that the devs are encouraged, and maybe even provided incentives to push the consoles as far as possible, but are they actively handicapping the PC or is that just an opportunity cost in pushing the PC version?

I guess where I am coming from is that if the devs want to build efficiently, and the consoles ARE a limiting factor, how do we expect them to build a well-performing, scalable arch that works (and looks great) on both the consoles and PC, in all cases?

Maybe it is just the case that consoles are slow, and devs struggle to make a game that shines on both with the budget at hand. I feel it is more likely that the publisher pushes them toward a lower budget, and they still need to release for all platforms.

Just playing devil's advocate here.

GTA 5 has sold more than an order of magnitude more on the consoles than on the PC. My system has 4.5x the shaders of a PS4 running at 45% higher clocks, but I really don't expect a console port to look much better on my system than on a console. I might get higher resolution and higher framerates, but that's about the end of what I expect.

I'd agree with you that it's not a conspiracy to keep down PC gaming. Developers just have a limited amount of time, capital and human resources, and if a choice needs to be made between really pushing the bounds of what is possible with top of the line hardware or optimizing for a low-performing fixed platform that accounts for 95% of their sales, you don't need to rely on hidden payola to figure out what they're going to do.
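As a rough illustration of the raw-compute gap being described here, the sketch below simply multiplies out the ratios from the post against the commonly cited PS4 GPU figures (1152 shaders at 800 MHz, about 1.84 TFLOPS). The resulting "PC" number reflects the poster's ratios only, not a specific card.

```python
# Back-of-the-envelope compute comparison using the ratios quoted above.
# PS4 figures are the commonly cited ones; the "4.5x shaders, 45% higher
# clocks" system comes from the post, not a specific named card.

PS4_SHADERS = 1152               # 18 CUs x 64 stream processors
PS4_CLOCK_GHZ = 0.8              # 800 MHz
FLOPS_PER_SHADER_PER_CLOCK = 2   # one fused multiply-add per clock

ps4_tflops = PS4_SHADERS * PS4_CLOCK_GHZ * FLOPS_PER_SHADER_PER_CLOCK / 1000
pc_tflops = (PS4_SHADERS * 4.5) * (PS4_CLOCK_GHZ * 1.45) * FLOPS_PER_SHADER_PER_CLOCK / 1000

print(f"PS4: ~{ps4_tflops:.2f} TFLOPS; quoted PC setup: ~{pc_tflops:.2f} TFLOPS "
      f"({pc_tflops / ps4_tflops:.1f}x)")
# Roughly 1.84 vs 12 TFLOPS, i.e. about 6.5x the raw shader throughput, yet the
# realistic expectation for a console port is only higher resolution and framerate.
```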
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'd rather have Crysis all over again than this. At least Crysis pushed the envelope. High/Ultra is just a meaningless label for settings. If they made a game that only ran on medium on anything but dual Titan X, but still looked better than anything out today, I'd be happy playing it on medium.

Oh man, I can't rep this post enough. SOOOOOOOOOOO much win in this post from all angles. Today's AAA PC games are an utter failure from a technical point of view compared to what Crysis 1, Crysis 3 or Far Cry 1 did. Ryse: Son of Rome and The Order: 1886 are the only 2 games that impressed me in the last 12 months from a graphics standpoint (there are other artistically amazing-looking games like Ori and the Blind Forest or Trine 3).

It's actually becoming very difficult to notice the differences between High and Ultra settings in modern games without pixel peeping. I would even say it's pretty hard in many cases to notice a difference between a combination of Medium+High vs. Ultra. The textures, amount of foliage and lighting are about the biggest differences. The FPS, though, takes a hit of 50%, sometimes 2x, for what amounts to a minimal increase in IQ.

AMD and Nvidia need to work together to sort out a solution to this and stop console ports.

Maybe, but it's unrealistic. Making poorly optimized PC console ports with sprinkles of GW or GE does nothing to entice me to buy $1000 flagship cards, especially not in pairs. I literally can't think of a single game since Crysis 3 or the Metro series that wowed me graphically (not discussing gameplay).

If AMD and NV work together with the developer on pushing the envelope on the PC and introducing brand-agnostic effects that are, most importantly, well optimized and look amazing (not TW3, not GTA V, not AC Unity, not Watch Dogs), then I am going to actually care and upgrade. It's going to make the PC version so much superior. I am not going to upgrade for a game that needs a Titan X for 1080P when that same game looks barely better on Ultra than on High/Medium settings. Some PC games only look slightly better than a PS4 game (i.e., OK, so they get slightly higher-resolution textures and 1-2 extra effects, but when coupled with MSAA the performance drop is like 100-200%). I know a lot of PC gamers are very impressed by TW3 or especially GTA V. Graphically/technically, I am not impressed by either of those games. They don't look true next gen, sorry. The 2013 TW3 did, but not the 2015 version. GTA V looks like a console game unless it's running on a 5K monitor.

I remember my lowly 8800GTS could barely hit 30 fps at 1280x720 in Crysis 1 but even at those settings, it looked better than any game I had played up to that point at 1600x1200 with 8xAA!

I have absolutely no problem with a PC game needing $2000 Titan X SLI to max out at 1080P but it better be THE best looking game and not by 5%, by a country mile. No such game has come out in 2015 for the PC.

*** Obviously if taking into account limited budgets, time, human resources, I would much rather take a 100 hour single-player campaign game that's brilliant and fun to play than the most beautiful shallow game.

GTA 5 has sold more than an order of magnitude more on the consoles than on the PC. My system has 4.5x the shaders of a PS4 running at 45% higher clocks, but I really don't expect a console port to look much better on my system than on a console. I might get higher resolution and higher framerates, but that's about the end of what I expect.

I'd agree with you that it's not a conspiracy to keep down PC gaming. Developers just have a limited amount of time, capital and human resources, and if a choice needs to be made between really pushing the bounds of what is possible with top of the line hardware or optimizing for a low-performing fixed platform that accounts for 95% of their sales, you don't need to rely on hidden payola to figure out what they're going to do.

I agree with your post. It's embarrassing for me to admit that I am more impressed by the Ryse: Son of Rome PC version (an XB1 game originally) and The Order: 1886 (PS4), as well as the uncompressed Uncharted 4 gameplay footage, than by any PC game out in 2015. This should not happen under any circumstances considering how weak the console hardware is and that we have 6-core HT i7s and Titan Xs. Maybe DX12 will help solve some of the bottlenecks PCs currently face, as I feel the graphical output we get for the level of optimization and DX11 GPU hardware needed to run these games is not to my liking.

The current consoles started out way below the bar, even by their own expectations. The Xbox 360 and PS3 were pretty good graphics-wise at launch; the Xbox 360 even set the standard for graphics.

But anyone who follows the business knows that the Xbox 360 was a money pit, and the same goes for the PS3. It would be better for Sony and MS to just sell off or close their console divisions if they repeated the mistakes of the Xbox 360 and PS3.

Report: Microsoft's Xbox division has lost nearly $3 billion in 10 years



Trying to put the blame 100% on consoles is a cop-out. We have Star Citizen, we had Crysis 1 and 3, we had Far Cry 1, Metro 2033/Last Light were amazing, etc. Battlefield 4 had a great lighting system when it launched. Some developers have shown that they can make great-looking PC games. TW2 was a very good-looking RPG for its time; TW3 has none of the same impact.

Look at what Rockstar did with as old a game as GTA V. It looks miles better than on consoles, and the game uses an outdated graphics engine. It's pretty embarrassing when GTA V, made 2 years ago and targeted primarily at the PS3/360 generation, looks as good as or better than games made specifically for the newer consoles, such as DAI, TW3, AC Unity, etc.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
The problem is the way that games are designed for the PS4/X1: from what I understand, they almost don't use any normal DDR RAM or the CPU, but feed everything through a GPU that has 7GB+ of super-fast RAM. So when they port it to the PC, it just sucks at using the resources it does have there, namely the CPU and system RAM.

Here is some more info on it in case my view on it is wrong:

Again, what Sony has done with the PS4 is something that PC builders simply cannot do yet. PCs come with two separate chips connected over a PCI-E bus.

The PS4, on the other hand, houses an integrated CPU/GPU custom AMD chip—the “Jaguar” CPU is not available for purchase yet and the GPU side of the equation is said to be similar to AMD cards running in the $200 price-range. The secret weapon here isn’t either the 8-core CPU or the GPU, but rather how the two are paired.

Both the processor and the graphics card are built into the same chip and both tap into that 8GB of GDDR5 memory at once—it's a "unified memory" setup as opposed to the system your PC uses, with the CPU utilizing your DDR3 system memory and your GPU harnessing the more robust GDDR5.

What does this mean? Basically it means that the CPU and GPU will be able to communicate with one another much faster and more efficiently than in a traditional PC set-up. Combine this with the high-bandwidth GDDR5 memory and the fact that many of the traditional CPU tasks will be offloaded to the GPU, and you have a machine that you simply cannot compare to a modern PC.


So to sum it up: until we have 8GB of VRAM on the PC to throw all the bad console coding at, we just have to wait.
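To put rough numbers on the unified-memory point in the quoted article, here is a minimal sketch comparing the commonly cited PS4 memory bandwidth with a typical PC path of that era (dual-channel DDR3-1600 system RAM plus a PCIe 3.0 x16 link to a discrete card). The figures are ballpark theoretical peaks assumed for illustration; the discrete card's own GDDR5 is as fast as or faster than the PS4's pool, so the gap shown is the hop between the PC's two separate memory spaces, which is what a console-first engine has to be restructured around.

```python
# Ballpark theoretical peak bandwidths (GB/s) behind the "unified memory" argument.
# All numbers are commonly cited peaks, assumed here purely for illustration.

ps4_unified_gddr5 = 176.0   # single 8 GB GDDR5 pool shared by CPU and GPU
pc_ddr3_1600_dual = 25.6    # dual-channel DDR3-1600 system RAM
pc_pcie3_x16 = 15.75        # CPU <-> discrete GPU transfer path, per direction

print(f"PS4: CPU and GPU both see ~{ps4_unified_gddr5:.0f} GB/s with zero-copy sharing")
print(f"PC : CPU sees ~{pc_ddr3_1600_dual:.1f} GB/s system RAM; data handed to the "
      f"GPU crosses PCIe at ~{pc_pcie3_x16:.2f} GB/s")
print(f"Unified pool vs PCIe hop: ~{ps4_unified_gddr5 / pc_pcie3_x16:.0f}x")
```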
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
Perhaps due to the starting point of consoles.

[chart: console GPU performance at launch relative to contemporary PC GPUs]


The current consoles started out way below the bar, even by their own expectations. The Xbox 360 and PS3 were pretty good graphics-wise at launch; the Xbox 360 even set the standard for graphics.

That chart is misleading. It lists the XBox as twice as powerful as a GeForce 3, but the XBox GPU really wasn't much more powerful than the Ti500 that was already out when the XBox was released in mid-Nov 2001, other than having the second vertex shader. The chart also seems to imply that it took a year for PC GPUs to match the XBox, when it was actually less than 3 months before the Ti4600 was released, which was 20% faster than the XBox.

Consoles will probably never get close to high-end discrete GPUs again though, unless the GPU market dies entirely. When the XBox was made, a Ti4200 was pulling 25-35W, which is entirely possible to fit in a relatively quiet console. For the XBox One, even if MS had wanted to ditch the APU for a discrete full Hawaii die, there's no way they could pack that 250W of power into a console that would be even close to acceptably quiet in someone's living room. It really has more to do with what's considered an acceptable high-end GPU these days than anything to do with the consoles being underpowered.
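A minimal sketch of the thermal-budget argument above, using the wattages cited in the post and an assumed whole-console power budget of roughly 120 W (about what an Xbox One draws while gaming; that budget figure is an assumption for illustration, not something stated in the thread).

```python
# Rough thermal-budget comparison behind the "consoles can't house high-end GPUs
# anymore" point. The console budget is an assumed ~120 W; the GPU wattages are
# the ones cited in the post above.

CONSOLE_BUDGET_W = 120   # assumed whole-console budget for quiet living-room cooling

gpus = [
    ("GeForce4 Ti4200, original Xbox era", 35),   # upper end of the quoted 25-35 W
    ("Full Hawaii die, XBox One era", 250),       # figure quoted in the post
]

for name, gpu_w in gpus:
    share = gpu_w / CONSOLE_BUDGET_W
    print(f"{name}: {gpu_w} W -> {share:.0%} of an assumed {CONSOLE_BUDGET_W} W console budget")
# The 2002-class part fits comfortably inside a quiet console; the 2013 high-end
# die alone would exceed the entire budget before counting the CPU or PSU losses.
```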
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
If the Witcher 3 final build was the same as the 2013 one, then I guarantee that PC sales would have been more than GTA V's in the first hour of the game's launch.

I notice there are a lot of bold claims in each of your posts with nothing to back them up, desprado. Have you seen Crysis's sales numbers? They were terrible. The majority of PC gamers cannot run a game with the graphics shown in 2013, period.

You tell me I am wrong and state that even at 4K there would be no difference and the Titan X is a waste for these ports, but you simply ignore the actual information and metrics we already have. Go look for yourself and see what FPS a Titan X is getting with the current feature set enabled at 4K. Hint: it is not a playable framerate.

If you are going to straight up tell me I am wrong, please post more than just some hyperbole. My statement in the very first reply stands as fact: the gamernexus numbers show a Titan X can BARELY play at 1440p Ultra settings with NO AA and NO Gameworks features, and the wccftech numbers show the Titan X averaging slightly above 60fps at 1080p Ultra settings with AA.

You are asking for features that you cannot even use and telling me "No" without even putting a second of thought into your response.

Sorry you are upset that they did not keep the graphics level of the game where the demos showed it to be. That doesn't change the fact that most of the rigs in this forum couldn't have handled it, let alone the average gamer's.
 

digitaldurandal

Golden Member
Dec 3, 2009
1,828
0
76
There is more than enough money to be made on the PC if the developer spends time on it. There's nothing to preclude them from building a game on the PC first and then optimizing it later for consoles so it works seamlessly across all platforms. Even GTA V, which was built for consoles, had optimizations done pretty well for the PC release and as a result, they made about $100 million in sales the first 24 hours.

Your statement is true, but the majority of developers are going to focus on what will make them the most money the fastest.

GTA V did $800 million in sales in the first 24 hours of its console release.

I personally do not mind if things are released on console first and then time is given to improve things for a PC release. However, I know a number of gamers who, if the wait is longer than a few weeks, will just buy the game on the console and be done with it.
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Not trying to stir the pot here, but do MS and Sony really pay these devs to 'tone down graphics' on the PC? It seems this is taken as the truth, but I guess I have not seen an actual case of this, unless maybe I missed it?

I think it's more that devs are forced to scale back because of console performance concerns. They're trying to be realistic. The problem is that this scaling back screws over the PC side.
 

NTMBK

Lifer
Nov 14, 2011
10,239
5,024
136
Consoles will probably never get close to high-end discrete GPUs again though, unless the GPU market dies entirely. When the XBox was made, a Ti4200 was pulling 25-35W, which is entirely possible to fit in a relatively quiet console. For the XBox One, even if MS had wanted to ditch the APU for a discrete full Hawaii die, there's no way they could pack that 250W of power into a console that would be even close to acceptably quiet in someone's living room. It really has more to do with what's considered an acceptable high-end GPU these days than anything to do with the consoles being underpowered.

This, a thousand times. The amount of heat generation, noise and power draw that "high end" gamers are willing to put up with is ludicrous. I had a GF110 graphics card under my desk at work for a while doing some CUDA development, and when that thing spun up it was loud. Not to mention hot! It was noticeably (and significantly) hotter underneath my desk than directly above it.

Even the 7770 in my home machine is irritatingly noisy compared to my XBox 360 (though that may have more to do with the cooler on it).
 

phoenix2150

Junior Member
May 13, 2015
4
0
0
I'd rather have Crysis all over again than this. At least Crysis pushed the envelope. High/Ultra is just a meaningless label for settings. If they made a game that only ran on medium on anything but dual Titan X, but still looked better than anything out today, I'd be happy playing it on medium.

Star Citizen is a game designed for PCs from the ground up; there will be no console port!