My point is that the 360 being awesome helped PC graphics out, and without it, we wouldn't be where we are today (unless PC gaming gained popularity in that time instead). When the 720 is released with average specs, it won't help push cutting-edge graphics; it will stifle them for some time.
This is the article I read yesterday: http://arstechnica.com/gaming/news/...he-console-industry.ars?clicked=related_right
That article is focusing on yesterday's issues. They even hint at it in the article, but faster processors and more powerful graphics cards just aren't going to make much of a difference in future games. Not because the graphics won't get any better, but because they already are plenty fast and powerful for the foreseeable future. The remaining hurdles to be overcome are raw bandwidth issues.
For example, the original Rage was 1TB of mostly textures that would cripple any desktop in existence. I'm talking eye-popping, jaw-dropping, ultra-high-resolution textures. As it is, even the stripped-down version they sell is half movie that you just stream off your hard drive, complete with shadows pre-baked into the textures. Likewise, Intel is developing Knights Bridge to use for cloud-based ray-cast video games that would cripple any desktop. It's not the processors and graphics cards holding back video games anymore, but sheer bandwidth issues.
For the graphics to get significantly better, the raw amount of data required is simply outrageous. The Xbox 720 supposedly doesn't even have a disk drive, and my guess is disks simply can't hold the amount of data required for next-generation games, which are already rumored to have 4K resolutions. LCD manufacturers have found a cheap way to produce 4K monitors and TVs and are already adapting their production lines. The burning issue no one has an answer for is how the next generation of consoles will deal with the volume of data required to produce those kinds of resolutions.
There is no way the next console generation will run 4K monitors. Current desktop GPUs barely have the horsepower to run Eyefinity resolutions with any kind of image quality. All the console hardware rumors say the new consoles will use low-end Southern Islands or 6670-level GPUs. There is no way those will be able to power a 4K resolution.
They will most likely run at 1080p, which will actually be a large step up from the 640p that most console games run at now. I remember when I bought my PS3 and thought the thing was broken because it wouldn't run in 1080p. I made threads on the PS forums asking why my console wasn't running at HD resolutions, and to my dismay I learned that the horrid resolutions were by design.
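To put rough numbers on that jump (a floor, not an estimate: this assumes a single 32-bit color write per pixel at 60 fps and ignores depth, G-buffers, and all texture reads):

```python
# Back-of-the-envelope pixel counts and minimum color-buffer traffic.
# 4 bytes per pixel, one full-screen write per frame, 60 fps. A deliberate
# lower bound, since real frames touch far more data than just the color buffer.
RESOLUTIONS = {
    "sub-HD (e.g. 1152x640)": (1152, 640),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
}
BYTES_PER_PIXEL = 4
FPS = 60

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    gb_per_s = pixels * BYTES_PER_PIXEL * FPS / 1e9
    print(f"{name:24s} {pixels / 1e6:5.2f} Mpixels, >= {gb_per_s:4.2f} GB/s for color writes alone")
```

4K is four times the pixels of 1080p and more than ten times the sub-HD targets a lot of this generation actually shipped at, and every one of those pixels has to be shaded, not just written.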
The best way is not to. Throw more GPU power at it, and scorn devs that try to add detail only by textures, which simply does not scale. Encourage them to be creative with that DX11 GPU (DirectCompute & HLSL) that will inevitably be as crippled as consoles of the past, in bandwidth terms. It's not just consoles, either. While we have far more to work with, we are limited by space and bandwidth on the PC, as well, and would also benefit from less reliance on textures for detail, as they necessarily eat up crazy amounts of memory bandwidth. We have the option to throw money at the problem, of course, but even that will have its limits, if current trends keep going on as they have been (affordable >384-bit RAM width video cards? Affordable alternative RAM tech? Not likely).
I think Rage uses a great technology, but the implementation is on the wrong side of things (high detail by textures, using compression to add even more!). That kind of high detail should be combined with some combination(s) of asset re-use and better real-time light/shadow. FI, something like Just Cause 2 with added detail from streaming megatexture would probably look gorgeous. It's a good salve, don't get me wrong, but while it can make 20GB of HDD space act like 500 (arbitrary numbers), it only helps so much toward making 1GB of VRAM act like more (it does manage even that, though, by aggressively loading and unloading the textures, which is pretty cool).
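The load/unload part is conceptually simple, for what it's worth. A toy sketch of the idea follows; the tile size and budget are made up, and a real virtual-texture system picks tiles from renderer feedback rather than plain LRU:

```python
from collections import OrderedDict

TILE_BYTES = 128 * 128 * 4          # made-up tile size: 128x128 RGBA8
BUDGET_BYTES = 256 * 1024 * 1024    # made-up pool: 256 MB of VRAM for tiles

def load_tile_from_disk(tile_id):
    # Placeholder for the real streaming read + transcode step.
    return b"\x00" * TILE_BYTES

class TileCache:
    """Keep only recently used texture tiles resident, inside a fixed budget."""
    def __init__(self, budget=BUDGET_BYTES):
        self.budget = budget
        self.resident = OrderedDict()           # tile_id -> tile data

    def request(self, tile_id):
        if tile_id in self.resident:
            self.resident.move_to_end(tile_id)  # mark as recently used
            return self.resident[tile_id]
        while (len(self.resident) + 1) * TILE_BYTES > self.budget:
            self.resident.popitem(last=False)   # evict the least recently used tile
        self.resident[tile_id] = load_tile_from_disk(tile_id)
        return self.resident[tile_id]
```

That juggling act is how a small pool of VRAM fronts for a much larger texture set; the price is constant streaming traffic, which is the same bandwidth problem in a different spot.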
We have increasing ALU power, generation over generation, relative to even cache bandwidth. More detailed textures, with more layers, are not going to help. FI, have lighting handle different surfaces differently, so that skin won't need 3+ textures (RGBA, normal, UV, ...). Or create models from the start using DX11's tessellation, instead of pushing buttons to bring multi-million-poly models down to size. Oh, and quit trying to make things look realistic. It doesn't work well enough, and it eats up resources that are scarce.
It's not any one 4K texture, it's that the total scene is going to be limited to some maximum MBs' worth of memory, and some GB/s worth of total transfers, so if you use simple models and detailed textures, they get to be low-detail textures by the time it's released. Relying on ever more space and bandwidth to add detail through more and higher-resolution textures is just unrealistic: space and bandwidth requirements grow by at least 4x for every linear doubling of detail.
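The 4x figure is just area math; compression and the roughly one-third a full mip chain adds shift the constants, not the scaling:

```python
# Uncompressed size of a single RGBA8 texture as its linear resolution doubles.
# Twice the texels per edge means four times the texels, so storage (and the
# bandwidth spent sampling it) grows ~4x per step, whatever the compression ratio.
BYTES_PER_TEXEL = 4
for edge in (1024, 2048, 4096, 8192):
    print(f"{edge}x{edge}: {edge * edge * BYTES_PER_TEXEL / 2**20:6.0f} MB")
```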
It doesn't change the fact that they will not support 4K resolutions. 1080p is going to be around for the next 5 years at least. The only 4K TV I can find for sale right now costs $500,000. That's five hundred thousand dollars, and not a typo. If you think they are going to be priced for the likes of mere mortals in the next 3-5 years, you are far more optimistic than any person in history.
1080p is going to be the resolution for this console generation, no ifs, ands, or buts about it. Honestly, broadcast TV isn't even at 1080p yet, so I have no idea why you think the new consoles will support 4K resolutions.
There is no benefit to adding more GPU power at this point because we've already taken rasterization to an extreme. Lighting effects, tessellation, DoF, etc. are all there, and the only remaining way to increase the graphics dramatically is through resolution. To go beyond that would require ray casting, which, again, has outrageous bandwidth requirements. Either way you have to address the bandwidth issues, or there just isn't much more that can be done.
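Even crude arithmetic on ray counts makes the point. Assume only a handful of rays per pixel, and remember each ray is an incoherent walk through the scene's acceleration structure, which is far less cache-friendly than rasterization:

```python
# Rays per second needed for real-time ray casting, assuming 60 fps and a
# modest 4 rays per pixel (one primary plus a few shadow/secondary rays).
RAYS_PER_PIXEL = 4
FPS = 60
for name, (w, h) in {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}.items():
    rays = w * h * RAYS_PER_PIXEL * FPS
    print(f"{name}: {rays / 1e9:.2f} billion rays per second")
```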
Evidently you didn't read what I wrote. All the major LCD manufacturers are right now retrofitting their production lines to produce cheap ultra-high-resolution monitors and TVs. At most the screens will cost a third more than current 1080p ones, and they will have faster response times and use less energy to boot. Consoles are developed with ten-year life cycles in mind, and they simply can't be planned without accounting for ultra-high-resolution TVs taking over the market.
Yes, you have to address bandwidth issues. But they cannot be dealt with by any magic. The technology to provide high bandwidth, low latency, and large storage simply does not yet exist at anything close to consumer prices, and potentially disruptive technologies will need to prove themselves for a while first.
There may be great technologies around the corner, but until they are in affordable mainstream hardware, so what? For the near term, a sure thing is needed, and less total texture memory used at any given time is pretty much it. There have always been great technologies around the corner, but very few pan out perfectly in a very short time span. Those memristors, if finally ready for prime time, FI, will need a few years in production before being considered for the likes of game consoles.
You can make all the specialized HW you want, but it has been proven time and time again that making that HW too hard to program is a problem, and the only solution we would have now would be very difficult to program for (SMP mini-GPU+CPU+RAM nodes with high-speed interconnects). At a console cost, the best we can provide now will only just handle 1080p well, just as the XB360 and PS3 have consistently struggled at 720p.
4K monitors might make it in at the tail end of the next console generation, but they will be about as useful to console gaming as 1080p is for the current gen, and 1080p was around at the start of it. If 4K is supported, it will be purely a marketing bullet point. There is no way it will be playable at 30fps with anything other than 2D side-scrollers. The move to higher-than-1080p resolution in consumer panels now is not really about using that full resolution; it's about extra pixel density for a sharper, scaled UI, like on phones, laptops, tablets, etc. Now, if you were talking about 2K, I could see them supporting that, but high-end GPUs in PCs will have a hard time with that, let alone a stripped-down console version.
TV manufacturers have a hard enough time convincing consumers that 1080p is worth upgrading for. 4K makes no sense unless you are in film, medical, science, or something else that requires the resolution. I don't think gaming will drive that, and it seems silly for console makers to strive for it when it's all about reducing the cost of the hardware.
You still aren't listening.
The technology to provide all this has been in development for over a decade and is already planned for release within the next two years. This is not a drill. These are not demos. HP intends to sell its first memristors next year.
LCD manufacturers will start producing ultra-high-resolution screens before the end of this year.
Intel and AMD intend to have unified CPU and GPU memory and hardware-accelerated transactional memory within two years. Hardware-accelerated transactional memory will allow processors to thread themselves and be easier to program than today's separate CPUs and graphics cards.
You're wrong. Before you start making sweeping statements about technology, the least you can do is read up on it. AMD's current 7970 is already capable of handling textures as large as 32TB. This website even wrote a rather good article on the subject explaining exactly how the video card can handle such large textures and still update the display fast enough for video games.
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6
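The short version of how that works, as I read the article: the 32TB is virtual address space carved into small tiles, and only the tiles a frame actually samples are kept in video memory, through what amounts to a page table. A toy model of the idea (tile size and format here are my own round numbers, not the exact hardware parameters):

```python
# Toy model of partially resident textures: a huge virtual texture is split into
# fixed-size tiles, and VRAM only holds the tiles that recent frames sampled.
# All sizes are illustrative assumptions, not the real hardware numbers.
TILE_EDGE = 128                  # texels per tile edge (assumed)
BYTES_PER_TEXEL = 4              # RGBA8

resident = {}                    # (tile_x, tile_y) -> tile data actually in VRAM

def sample(u, v):
    """Find the texel's tile; a miss returns None so the shader can use a fallback."""
    return resident.get((u // TILE_EDGE, v // TILE_EDGE))

# A 2^21 x 2^21 texel RGBA8 texture is 16 TiB of address space...
virtual_edge = 2 ** 21
print(f"virtual texture: {virtual_edge ** 2 * BYTES_PER_TEXEL / 2**40:.0f} TiB of address space")
# ...but the card only commits 128*128*4 = 64 KiB per resident tile.
print(f"resident right now: {len(resident) * TILE_EDGE**2 * BYTES_PER_TEXEL} bytes")
```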
Graphics can actually improve gameplay
Ambience != gameplay
No, you are assuming all this technology will, combined in one box plus a monitor, cost <$700 almost as soon as it is released, and that each piece will somehow be a panacea. We realize that such a situation is unrealistic.
At what cost? And even then, they will want to wait to use those 4k displays, because faster memory still won't allow enough GPU power to utilize a 4k display within any reasonable TDP. If memristors come out, are affordable, and are fast enough to be GB-sized caches, then power will just rear its ugly hot head, as it has been doing for years, now.
Will they be affordable by 2017? Do you have an actual MSRP on, say, a 27"? What kind of video interface do they use (2x DL-DVI, or some other kind of connector practically no one has)?
Yes, but that won't solve any graphics quality or performance problems. It will solve the problem that threaded C and C++ is too damned difficult to get right, for historical reasons. Today, game development gets money thrown at it to solve that problem.
In addition, it will take much more than 2 years for the programming of CPU/GPU to get easier. That, even today, is very much an API and programming language issue. Llano's rather basic integration shaves off just enough latency to start worrying about how to best program them for cooperation.
That's like saying Intel's 386 can handle 4GB of memory (it was not possible until the P3, IIRC, and was not reasonable until the Core 2 had AMD competition). I would be surprised if you could actually play anything with a 32TB texture. Its ISA is capable of addressing that much. So, they won't have to worry about reaching their limits for awhile. If they're going to make an MMU spec (what it amounts to, though texture-specific), why not make it to last a decade or two?
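To make that analogy concrete with rough arithmetic (the bus figure is an assumption, roughly PCIe 3.0 x16 class): being able to address 32TB and being able to actually feed it are very different things.

```python
# Time to move a fully populated 32 TB texture across the bus just once,
# at an assumed ~16 GB/s of host-to-GPU bandwidth.
TEXTURE_BYTES = 32 * 1024**4      # 32 TiB
BUS_BYTES_PER_SEC = 16e9          # assumption

minutes = TEXTURE_BYTES / BUS_BYTES_PER_SEC / 60
print(f"~{minutes:.0f} minutes of nothing but transfer")   # about 37 minutes
```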
I am perfectly comfortable with my sweeping tech statements. I have actually seen unreleased higher-than-1080p displays, and I know what they are going to be used for. You seem to forget that we are talking about a console which is due in maybe 2-3 years, and a display that is nowhere near ready for consumer deployment.
The last time I went to NAB, at least 5 years ago, companies were talking about 4K, and even 8K, monitors for business and consumer deployment. NHK even had what they were calling Ultra HD in their booth. Guess what? It didn't happen. There's a reason: cost and specialization. Next-gen consoles will be lucky to achieve current high-end PC performance. Targeting phantom displays at more than twice the resolution of the highest available consumer display is not practical from a console perspective; it's fantasy and would be purely marketing.
Again, games like Rage are already half movie and just stream the textures straight off the hard drive with a lot of graphics effects already baked right in. id Software had to run the original game through a rendering farm to do that, but the whole point was to make the game run on something as wimpy as even an iPhone and still look good.
Also, like I already pointed out to Cerb, the technology is just cheap and easy to add. The console manufacturers can even leave out the extra memory required for 4K resolutions and charge customers extra to upgrade.
I think you misspelled 7970 in crossfire, there. 4k with a 6670? Not on this world. On this one, a 6670 is just adequate for 1080, and you'd really want better than that. If consoles are going to have something of that performance level, and will be expected to keep at least as much detail in games as the last generation, then they won't be rendering 4k lines. I don't doubt that a next-gen console designed today might have a 6670 equivalent. I absolutely disbelieve, however, that such a GPU could do much more than browse the web at such high res.

You can believe what you want, but I have links to reputable websites that say different. 4K screens are already coming, and within a few years the prices should drop to roughly 1/3 more than 1080p. A Radeon 6670 is already dirt cheap, and adding a few custom modifications won't make it any more expensive.
The only link you've provided has been a single AT article explaining that new Radeons can divide textures into subtiles, and arbitrarily fill memory with a map of those subtiles. Yes, it's good technology, but it will, at best, increase the effective space/bandwidth a single one-time linear factor. I have been reading what you've written, but it's mostly been wishful thinking, that we have some hardware and software fairies ready to make technology that doesn't even exist for commodity use yet become cheap very fast. It takes time, it takes money, and costs don't radically go down without very high confidence, or early adopters helping to pay back some of the costs.

If you aren't going to listen to what I've written and refuse to read up on the technology using the links I provide, then it's pointless to argue with you about it. I've already countered these objections and have nothing more to say on the issue.
So, no affordable TVs for a good 5+ years, then?

The first ones will no doubt be small screens for phones and tablets, but they are retrofitting their entire production lines and TVs should be coming out next year. Exactly what kind of interfaces they'll support or what color the box comes in, you'll just have to wait and see.
You mean we go and do...what we've been doing since 3D cards came into existence? Physics and AI have always run on my CPU. I don't see why they should have stopped, at any point. With AVX2, I expect the minor performance improvement GPUs have the ability to offer to finally go away, as well.

In this case it means games can be better optimized from the get go and, eventually, you can run things like physics and AI on the processor so the discrete graphics card can work on everything else.
:biggrin: Um, no. The software will decide whether to use the CPU or GPU, just like today. The OS will decide what threads run where, when, and for how long, just like today. HTM gets rid of severe overhead for STM, and has the potential to remove locks entirely, eventually.

You evidently don't understand transactional memory. The whole point is to have the chip thread itself to a significant extent to simplify programming the chip. The thing decides for itself whether to use the CPU or GPU.
...and all of this will hit the market, and be cheap, by 2014, before single GPUs are even capable of playing any current game at 4k lines? Right. The manufacturing technology is not there, and it would take many coinciding disruptive breakthroughs to make it happen, which we don't have evidence of. If 4k monitors of reasonable size come out at reasonable cost, we will then still need video processing capable of rendering to them, and unless some large fab company has a miracle up its sleeve for their next node(s), even the mighty Intel wouldn't be able to pull that off.

That's true to some extent, but the point is the technology does exist and is easy to add. It may be that, like some current consoles, it will have the option of adding more memory for higher resolutions.
You could do that with Doom-level graphics, on computers from ages ago, or you could make it shiny, with glowing signs and reflections, today. That's not done because it takes a ton of time to develop that city, and all the people in it, not because of lack of hardware.

Yes it can.
Imagine a city in a role-playing game that is actually a city, rather than a town with half a dozen houses. Now imagine trying to hunt down one individual in that city, which consists of thousands of NPCs. Imagine trying to find them, and then imagine trying to do a stealth kill on that guy.
Just one of many examples where powerful hardware increases the gameplay aspect.
id's tech might not even see the light of day outside of Bethesda. The last I heard, they weren't looking to license it. You're also adding time and money to development by prerendering all the effects, and just looking at the results from Rage, it doesn't seem worth it. The jury is still out on what id Tech 5 means to the industry.
The problem with making consoles upgradeable is that devs will target the lowest common denominator, like what we saw on the 360 with memory units vs. the hard drive. Especially when you don't have the monitors in consumers' hands. I know you believe consumer 4K is coming right around the corner, but I've heard that before. I can see the next-gen consoles possibly recognizing a 4K display and having the UI to match, which could be added via firmware update after launch, but it doesn't seem practical to target that as a gaming resolution. I think 4K will kick in toward the middle to the end of the next-gen console lifecycle, which is too late.
Even if a 4K, 40" $2000 TV showed up tomorrow, I don't have a feel for what the consumer response would be. People have converted, or are in the process of converting, their TVs to 1080p sets. It just feels too soon to push a new standard on them. Then there's the lack of 4K content. It'll be used by consumers with PCs before it hits living rooms, and most people have been resistant to 1440p and 1600p monitors due to cost. A lot has to happen for the market to be right for these displays.
Don't get me wrong, if you turn out to be a prophet I would be pleasantly surprised. It just doesn't seem likely.
An optimist is someone who believes this is the best of all possible worlds, while a pessimist is someone who's afraid he is right.
We'll just have to wait and see how it turns out, but like 3D it's a cheap feature that can be added to at least high-end TVs and consoles.