The future of console gaming...

1. Outrage from early adopters whose older console models either couldn't run newer games, or ran them poorly.
Just go to YouTube and search for "stutter ps4"; early adopters are outraged as it is.
Like was said, you're asking for a PC market disguised as a console one, and it's not happening.
Except that it already has: the new consoles are nothing more than PCs, the only difference being that programmers have to make the interface work with gamepads.
Why do you think no game comes out that supports unified memory or anything special the consoles might have? Companies want to be able to use the same code on regular PCs, so they write it to be compatible in the first place; no more "polished" games.
 
I don't think there's really an issue with providing "upgrades", and it's not like we haven't seen that in the past. The Sega Genesis had the 32X and Sega CD, and the N64 had the Expansion Pak. Now, there is one important thing to point out... those were segmenting devices, and the methodology suggested by other people above is not. Although, to be fair, the console makers would have to mandate that developers not segment their products across specific hardware revisions.

That said, I'll also be clear that I don't think straight upgrades would ever really work. The current consoles are built more like laptops, and consequently, most parts cannot be swapped. They could switch to using things like MXM cards for the video card, but that also means they couldn't use a heterogeneous model like they do now. So, they could develop a card of their own, which includes a CPU+GPU and a cooler, but that sounds a little excessive.

If anything, they'd probably just do it as a refresh setup like they currently do, where they produce chips on smaller nodes. Do I think it would be "against the idea of console gaming"? No. I think that's just a bunch of silly hogwash. Console gaming is normally presented as the more straightforward option, and if the graphics were still preset depending on the model, what difference would it make to the end user? All it means is that people with newer hardware get more AA, a higher base resolution, or maybe slightly better textures.

Development for X1 + X2 and PS4 + PS5 will be cheaper and easier since most of the same code will work on both. It might be as simple as the build step toggling some engine settings and selecting lower or higher resolution art assets, but in some cases the X2 / PS5 build might also tweak other settings, like granting more CPU cycles to physics and AI.
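
To make that build-step idea concrete, here's a minimal sketch, assuming a hypothetical tier define and made-up preset values (no real engine's settings are being described), of how the per-model differences could reduce to one compile-time switch:

```cpp
// Hypothetical per-tier quality preset; names and values are invented.
#include <cstdint>
#include <cstdio>

struct QualityPreset {
    uint32_t renderWidth;
    uint32_t renderHeight;
    uint32_t msaaSamples;     // extra AA on the newer model
    uint32_t physicsThreads;  // more CPU cycles for physics and AI
    bool     highResAssets;   // select the higher-resolution art pack
};

// The build step toggles one define per target model; the rest of the
// codebase stays identical across X1/X2 or PS4/PS5 builds.
#if defined(TARGET_TIER_NEXT)   // hypothetical X2 / PS5 tier
constexpr QualityPreset kPreset{3840, 2160, 4, 4, true};
#else                           // base X1 / PS4 tier
constexpr QualityPreset kPreset{1920, 1080, 2, 2, false};
#endif

int main() {
    std::printf("%ux%u, %ux MSAA, %u physics threads\n",
                kPreset.renderWidth, kPreset.renderHeight,
                kPreset.msaaSamples, kPreset.physicsThreads);
    return 0;
}
```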

I've seen a lot of people claim this because of the transition to x86, but they always ignore the GPU. PCs may have been using x86 for a long time (much to the chagrin of companies pushing alternate ISAs), but the graphics architectures change about every year or two. This is largely a moot point for PCs because of the use of graphics drivers and WDDM.
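
For a sense of what that abstraction looks like on the PC side, here's a small, self-contained example against the real DXGI API (Windows): the game discovers whatever adapter the driver exposes at runtime, instead of being compiled against one fixed GPU architecture the way a console title is.

```cpp
// Enumerate GPU adapters through DXGI (link dxgi.lib).
// Whatever Kepler/Maxwell/GCN part is installed, the driver and WDDM
// present it through this same interface.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1),
                                  reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        wprintf(L"Adapter %u: %s\n", i, desc.Description);
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```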

EDIT:

Why do you think no game comes out that supports unified memory or anything special the consoles might have?

I would suspect that console games most definitely do take advantage of HSA.
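
Since console SDKs are under NDA, the following is only a conceptual model, with an entirely made-up hsa_sketch namespace, of what "taking advantage of HSA" means: on a unified-memory APU, the CPU and GPU can work on one allocation with no staging buffer and no DMA copy in between.

```cpp
// Conceptual model of HSA-style unified memory (C++17); hsa_sketch is
// invented for illustration and stands in for NDA'd console APIs.
#include <cstddef>
#include <cstdio>
#include <vector>

namespace hsa_sketch {
    // On a unified-memory APU, "CPU memory" and "GPU memory" are the
    // same physical pool, so one allocation serves both processors.
    inline std::vector<float> shared_pool;

    float* alloc_shared(std::size_t count) {
        shared_pool.assign(count, 0.0f);
        return shared_pool.data();
    }

    // Stand-in for a GPU kernel that reads the very pointer the CPU
    // just wrote; on a discrete PC card this step would first need a
    // staging buffer and a copy across the PCIe bus.
    void gpu_integrate(float* data, std::size_t count) {
        for (std::size_t i = 0; i < count; ++i) data[i] += 1.0f;
    }
}

int main() {
    float* particles = hsa_sketch::alloc_shared(1024);
    particles[0] = 41.0f;                        // CPU writes in place...
    hsa_sketch::gpu_integrate(particles, 1024);  // ...GPU reads in place.
    std::printf("%f\n", particles[0]);           // prints 42.000000
    return 0;
}
```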
 
and then when there are 4 versions of the ps4 out there, and the commercial for uncharted 6 looks incredible, so a user goes and buys it, comes home with the game, puts it in their first gen PS4, and it looks like tomb raider on PSX... i'm sure they will be ecstatic!
 
and then when there are 4 versions of the ps4 out there, and the commercial for uncharted 6 looks incredible, so a user goes and buys it, comes home with the game, puts it in their first gen PS4, and it looks like tomb raider on PSX... i'm sure they will be ecstatic!

Resorting to nonsensical exaggeration doesn't exactly strengthen your position.

The thing is... if you're just adjusting some of the more superfluous things like extra AA, slightly higher resolution, or slightly better framerate, most users may not even notice. That's the one thing about console gaming... it's usually played from a distance, so it's harder to notice some of the graphical artifacts that PC gamers (who typically sit far closer) usually spot. Although, I did mention one benefit that might be huge to some people: better framerate. One comment that I've heard about Bloodborne is that some parts of the game have a poor framerate. I don't know if they were exaggerating, but they compared it to Blighttown in Dark Souls.
 
Resorting to nonsensical exaggeration doesn't exactly strengthen your position.

The thing is... if you're just adjusting some of the more superfluous things like extra AA, slightly higher resolution, or slightly better framerate, most users may not even notice. That's the one thing about console gaming... it's usually played from a distance, so it's harder to notice some of the graphical artifacts that PC gamers (who typically sit far closer) usually spot. Although, I did mention one benefit that might be huge to some people: better framerate. One comment that I've heard about Bloodborne is that some parts of the game have a poor framerate. I don't know if they were exaggerating, but they compared it to Blighttown in Dark Souls.

how is that a nonsensical exaggeration?

uncharted 4 is coming out in 2016. changing the ps4 model every 2 years (if we started putting out a new ps4 every 2 years from today) would put the 4th gen of ps4 coming out in 2021. so that is giving ND 5 years between u4 and u6, which is realistic.

so yeah, uncharted 6 coming out 8 years after the ps4 came out, which would be the 4th gen (in this 2-year model the article talks about), puts it right on track. or are you saying that the difference in technology over 8 years won't change much?

not really sure why you are calling it nonsensical. they already do bullshots in commercials for many games. do you really think they would make commercials showing off all 4 versions of the game running on ps4?

EDIT:

also not "trying to strengthen a position" or anything. i'm already onboard with the market that has been working for 30 years, not jumping onto this ludicrous one mentioned in 1 article.

as you mentioned, they already tried this crap with 32x, segaCD, turbo duo, etc. and they all flopped.

and if it were to literally run the same game on all iterations of the same console, then you would hear the same old tired argument you already hear from pc gamers, about pc games being dumbed down so they can run on consoles. that would be the exact same thing happening here if this dumb idea ever took off.
 
I've seen a lot of people claim this because of the transition to x86, but they always ignore the GPU. PCs may have been using x86 for a long time (much to the chagrin of companies pushing alternate ISAs), but the graphics architectures change about every year or two. This is largely a moot point for PCs because of the use of graphics drivers and WDDM.

The people who claim this don't understand development, especially development dependent on hardware (GPU) drivers and access. Think about this: games that ran DX7 were x86, and they can't be run on modern systems. And as consoles, operating systems, and drivers get more and more complex, the ability to emulate them shrinks toward zero. Just because development will be against x86 doesn't mean it will be anywhere near similar, or that it will be cheaper or easier.

With that said, expansion or upgrades simply won't work in this environment of consoles. It was fine for the N64 and Genesis days because those were single-unit systems. An Xbox One and an Xbox One+ or whatever would have to interact with the same backend servers seamlessly and run the same games; otherwise, you lose massive sales.

People who think PCs are going to vastly outpace consoles this generation (more so than any previous one) don't understand the diminishing returns of increased polygon counts.
 
how is that a nonsensical exaggeration?

Honestly, I figured that it would be clear that I was referring to the Tomb Raider comparison as that's clearly a rather over-the-top exaggeration. 😛

so yeah, uncharted 6 coming out 8 years after the ps4 came out, which would be the 4th gen (in this 2-year model the article talks about), puts it right on track. or are you saying that the difference in technology over 8 years won't change much?

I think I might be looking at it in a slightly different way. I don't necessarily think that the upgrades have to be purely upgrades. Yeah, that sounds kind of weird, but I'm talking about "side-grades". As a real-world example, I have a GeForce GTX 780 Ti in my computer, and it cost me around $600. My friend recently built a computer with a GeForce GTX 970, which is just about as powerful as my GPU, and he paid half as much. The 970 also draws a good deal less power for that similar performance.

In other words, Sony can use newer technology to reduce cost and energy consumption, which ultimately reduces cooling requirements. This is sort of what they do with slim models when they move to a smaller lithography process (e.g., 45nm to 32nm). Except in my case, I'm also suggesting that the newer technology provide a performance benefit as well.

Although, if I take off my rose-colored glasses, it's also easy to see that this could be more than just a simple change. As I mentioned above, GPU architecture changes quite often, and in my example above, my friend is using the Maxwell architecture and I'm using Kepler. If Sony/Microsoft performed the same swap in a console, it would not be a drop-and-go scenario. They'd have to rework their compiler to use the new architecture and provide replacement binaries for everything that doesn't use abstraction. Honestly, that would be a ton of work, and it would be interesting to see if the logistics of it would pay off.
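
As a rough sketch of what those "replacement binaries" might involve (the enum values, paths, and detection function below are all invented for illustration, not any real SDK), a shipped game could carry one GPU binary per supported silicon revision and let the loader pick at startup:

```cpp
// Hypothetical loader-side selection of a per-GPU-architecture shader
// binary; every patch would have to ship and test both blobs, which is
// part of the "ton of work" mentioned above.
#include <cstdio>
#include <string>

enum class GpuArch { Launch, Refresh };  // e.g., Kepler-era vs. Maxwell-era part

// Stand-in for asking the OS which silicon revision we're running on.
GpuArch detect_gpu_arch() { return GpuArch::Refresh; }

std::string shader_blob_path(GpuArch arch) {
    switch (arch) {
        case GpuArch::Launch:  return "shaders/launch/main.bin";
        case GpuArch::Refresh: return "shaders/refresh/main.bin";
    }
    return {};
}

int main() {
    std::printf("loading %s\n", shader_blob_path(detect_gpu_arch()).c_str());
    return 0;
}
```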

However, that is just looking at the GPU. A more powerful CPU in the current consoles could allow the company to perform more background activities without permanently segmenting resources. In other words, the PS4 could reserve one core for the OS instead of two.

Frankly, I'll take whatever I can get to make the PS4 not take 30 minutes to transfer 1GB of video to a USB 3 thumb drive. :|

also not "trying to strengthen a position" or anything. i'm already onboard with the market that has been working for 30 years, not jumping onto this ludicrous one mentioned in 1 article.

as you mentioned, they already tried this crap with 32x, segaCD, turbo duo, etc. and they all flopped.

One interesting reworking was released just recently: the "New" Nintendo 3DS XL, and it has segmented the player base.
 
we already have that. it's called a pc.

that's just being silly

there's far more difference between the console gaming experience and the pc gaming experience than the frequency of hardware updates.

it would make sense to release upgraded hardware every couple of years because they could hit multiple price points. Want top-of-the-line graphics on 4K TVs? For $1000 the xbox 2019 can rock your world. Don't need quite as much performance? The $500 xbox 2017 might be for you. Still want to play your favorite games with all your buds but don't have a job? The $150 xbox 2015 can get the job done (albeit with severely limited graphics).
 
No, it wouldn't make sense. It would make MORE sense to give up home consoles altogether, and turn the Xbox into a Steam Machine competitor. They wouldn't need any custom hardware that way, and developers wouldn't need to create separate games for the Xbox and PC. However, either option is highly unlikely. Both routes would alienate a good chunk of casual users who would hate seeing constant hardware refreshes.

Honestly, how would what you suggest be any different from a PC-like experience, aside from the game's disc format and the OS?
 
Honestly, how would what you suggest be any different from a PC-like experience, aside from the game's disc format and the OS?

by keeping the number of 'refreshes' limited to, say, once every 2 years, you can guarantee compatibility and avoid various driver issues

that is a world of difference from PCs, where there are millions of different possible hardware combinations that you have to support

Both routes would alienate a good chunk of casual users who would hate seeing constant hardware refreshes.

That's the beauty of limited refreshes: guaranteed backwards compatibility. If you buy the xbox 2017, you don't have to upgrade to the xbox 2019 unless you just want a better graphical experience. It will still play all the same games your friends do, just at lower quality.

By continually refreshing the platform, you may indeed keep your console purchase relevant LONGER than otherwise, because the platform as a whole remains relevant longer.
 
Horrible ideas... glad you guys don't work on the PlayStation or Xbox team.

When they sell a game, there is already a downgrade from the trailers. Let's see how people take to seeing a trailer get downgraded four times over because they didn't buy the new console model. That's a surefire way to kill off your console, not prop it up. The entire reason people play on a console and not a PC, provided the price is similar, is that they don't have to worry about how a game will run. Segmenting the console market with multiple models will only confuse consumers and piss off your most loyal fanbase. You want to hand the entire gaming market over to Valve and SteamOS? I sure as hell don't.
 
Let's see how people take to seeing a trailer get downgraded four times over because they didn't buy the new console model.

everyone understands that buying the older/cheaper model results in lower graphics quality

The entire reason people play on a console and not a PC, provided the price is similar, is that they don't have to worry about how a game will run.

They still won't have to worry about how the game will run; it will run perfectly fine. Just with less eye candy than if they had the newest/most expensive model.

Segmenting the console market with multiple models will only confuse consumers and piss off your most loyal fanbase

price segmentation exists in every market and consumers deal with it just fine

most consumers would actually appreciate it, as it gives them more options/flexibility. Are graphics that important to you? Then you can spend more and get the better graphics you really want. Are you more price-sensitive? Well, now you can get a console for even cheaper than you ever could before.
 
everyone understands that buying the older/cheaper model results in lower graphics quality

They still won't have to worry about how the game will run; it will run perfectly fine. Just with less eye candy than if they had the newest/most expensive model.

price segmentation exists in every market and consumers deal with it just fine

most consumers would actually appreciate it, as it gives them more options/flexibility. Are graphics that important to you? Then you can spend more and get the better graphics you really want. Are you more price-sensitive? Well, now you can get a console for even cheaper than you ever could before.

No, you just don't get it. A console will never, ever work this way and be successful. It won't happen. It'd be like selling a Blu-ray player that can only process half the color detail. People will notice, and people will be pissed. Most consumers expect to buy a game and have it play the way it should, not worry about it looking worse than their friend's copy and likely running worse (lower-performing hardware produces lower frame rates, after all).

Again, I'm glad you aren't working for the teams in charge of console hardware.

There is a hardware model set up exactly the way you seem to want it. It's called a PC. I love my PC and I play lots of games on it, but there are certain reasons a console is advantageous and a better option for most gamers. If the console model becomes a PC, the console will die, and quite frankly, I would never buy it. You cannot compete with the PC and win if you try to push hardware specs.
 
No, you just don't get it. A console will never, ever work this way and be successful. It won't happen. It'd be like selling a Blu-ray player that can only process half the color detail. People will notice, and people will be pissed. Most consumers expect to buy a game and have it play the way it should, not worry about it looking worse than their friend's copy and likely running worse (lower-performing hardware produces lower frame rates, after all).

Again, I'm glad you aren't working for the teams in charge of console hardware.

There is a hardware model set up exactly the way you seem to want it. It's called a PC. I love my PC and I play lots of games on it, but there are certain reasons a console is advantageous and a better option for most gamers. If the console model becomes a PC, the console will die, and quite frankly, I would never buy it. You cannot compete with the PC and win if you try to push hardware specs.


I completely agree. I used to be an avid PC gamer, but as I got older, the simplicity and longevity of the console won me over. I play weekly with a group of old friends spread out over the whole US. It is our way of keeping in touch. None of us have the time to keep up with hardware. We buy one console, and it lasts us 5-7 years. We're perfectly happy with this. And given how long it takes programmers to really push a console's limits, this type of cycle seems more than adequate.
 
No, you just don't get it. A console will never, ever work this way and be successful. It won't happen. It'd be like selling a Blu-ray player that can only process half the color detail. People will notice, and people will be pissed. Most consumers expect to buy a game and have it play the way it should, not worry about it looking worse than their friend's copy and likely running worse (lower-performing hardware produces lower frame rates, after all).

Again, I'm glad you aren't working for the teams in charge of console hardware.

There is a hardware model set up exactly the way you seem to want it. It's called a PC. I love my PC and I play lots of games on it, but there are certain reasons a console is advantageous and a better option for most gamers. If the console model becomes a PC, the console will die, and quite frankly, I would never buy it. You cannot compete with the PC and win if you try to push hardware specs.

But in a way, it does work that way today, just not at the graphics level. The XB1 was sold with and without the Kinect; it can either give you enhanced controls and additional functionality, or you can skip it. All the commercials at launch had Aaron Paul talking to his machine and doing various functions. Somehow the consumers understood that the Kinect was required to have that experience, and yet they purchased more of the cheaper Kinectless units.

Maybe not the best example, but with a little education of the consumer, they will understand that an enthusiast model carries a premium and offers superior graphics.
 
But in a way, it does work that way today, just not at the graphics level. The XB1 was sold with and without the Kinect; it can either give you enhanced controls and additional functionality, or you can skip it. All the commercials at launch had Aaron Paul talking to his machine and doing various functions. Somehow the consumers understood that the Kinect was required to have that experience, and yet they purchased more of the cheaper Kinectless units.

Maybe not the best example, but with a little education of the consumer, they will understand that an enthusiast model carries a premium and offers superior graphics.

that isn't the same at all. the kinect is just a peripheral that came with the console. peripherals have been bundled with consoles forever, like the light gun coming with the NES back in the day, or a bundle coming with the super scope six for the SNES. some games take advantage of them, some don't.

the underlying console isn't different at all.
 
I completely agree. I used to be an avid PC gamer, but as I got older, the simplicity and longevity of the console won me over. I play weekly with a group of old friends spread out over the whole US. It is our way of keeping in touch. None of us have the time to keep up with hardware. We buy one console, and it lasts us 5-7 years. We're perfectly happy with this. And given how long it takes programmers to really push a console's limits, this type of cycle seems more than adequate.

There is practically zero complexity to PC gaming these days, and almost nil in the area of keeping up with hardware. You can thank consoles for that. The way I see it, if a person uses a PC at all, they should be able to run a game on one. Will you hit bugs? Yes, but even consoles have issues, so they aren't without faults either, and that will be much more apparent this gen; really, they've already shown their true colors.
 
that isn't the same at all. the kinect is just a peripheral that came with the console. peripherals have been bundled with consoles forever, like the light gun coming with the NES back in the day, or a bundle coming with the super scope six for the SNES. some games take advantage of them, some don't.

the underlying console isn't different at all.

What previous peripheral was integrated into the core of the console like the XB1 Kinect?
 
What previous peripheral was integrated into the core of the console like the XB1 Kinect?

The thing is, it was unnecessarily in the core of the console. The console was coded to not function without the Kinect. The requirement was software-based, not hardware-based the way varied graphics performance is (that's actually affected by both, but you get the point). You used the Aaron Paul commercial as your example, and that works easily to refute your claim. In that commercial, he does things like use Game DVR, Snap, and Skype. All of these can be done without the Kinect. The hardware of the console is not changed. It's all optional, secondary functionality. The performance and visuals of a game are not.
 
But in a way, it does work that way today, just not at the graphics level. The XB1 was sold with and without the Kinect; it can either give you enhanced controls and additional functionality, or you can skip it. All the commercials at launch had Aaron Paul talking to his machine and doing various functions. Somehow the consumers understood that the Kinect was required to have that experience, and yet they purchased more of the cheaper Kinectless units.

Maybe not the best example, but with a little education of the consumer, they will understand that an enthusiast model carries a premium and offers superior graphics.

Not really. See, the Kinect was supposed to be required to use the console. They did a 180 on that, but decided to keep it bundled. Nobody knows exactly why, but I suspect they wanted to get the Kinect out there for people to see how it is to use voice commands and the IR blaster functionality to turn their A/V gear on or off; and since every console came with the Kinect, everyone with an XB1 had it, so developers would use it. Sales were not looking too good, so they had to figure out something to boost numbers. Dropping the Kinect and lowering the price of the system helped tremendously. People didn't buy it just because it no longer had the Kinect; they bought it because it was cheaper and, for a time, came with 2 free games from many retailers. That was a tremendous value.
 
The folks I used to play XBL with on a regular basis held off because of the forced inclusion of the Kinect and the high price tag that resulted from it. Then, by the time they came out with the Kinect-free, lower-priced unit, COD had gone overboard with the arcade mentality of its games, and now we don't play anymore.

That's saying something when you go from playing two nights a week for more than a decade to not playing at all. Before COD we played all of Ubi's GR/RB6/SC, and when Ubi had a stroke and killed those, we moved to COD. Now no COD, with nothing to replace it, has done us in.

BTW, BF never got any love. We tried BF3 when it came out, but how teams were grouped seemed asinine, and there was no intuitive understanding of how to play the game against another party. Add in the glorious bugs, and no more BF, too.

Will there ever be another FPS like the GR/RB6/SC/COD of the past? What good is a cheap console if you can't find any games you like to play on it?
 
We're getting Rainbow Six: Siege this year. Rumors have CoD going back to WW2, and it's Treyarch's turn to make a game. They've made the only good CoD games in, like, 5 years, but there's no telling what we are going to get.
 
I thought the future of console gaming was VR? Won't that take dedicated hardware, if only the visor and haptic gloves?

I'm calling it now and I'm ready to eat my hat if I'm wrong. VR is going to be another flash in the pan, just like 3DTV. I give it three years tops, and even that feels far too generous.

The problem with VR is the goggles add an extra level of inconvenience. Interest right now in the Oculus Rift is mainly due to its scarcity and novelty factor. I don't see it catching on with the mainstream, who just want to sit in front of the TV and play. Expensive peripherals also rarely sell very well. Especially when they're not bundled with every system.

I also get the feeling that Sony's going to pull a Move/Vita with the Morpheus. The nanosecond consumer interest starts lagging, they cut all first party support and screw the early adopters.
 