Next gen console's effect on PC gaming graphics


lozina

Lifer
Sep 10, 2001
11,711
8
81
I'm still impatiently waiting for them to make a leap in AI; we've been waiting since the 1970s...
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
My point is that the 360 being awesome helped PC graphics out, and without it, we wouldn't be where we are today (unless PC gaming gained popularity in that time instead). When the 720 is released with average specs, it won't help push cutting-edge graphics, it will stifle it for some time.

This is the article I read yesterday: http://arstechnica.com/gaming/news/...he-console-industry.ars?clicked=related_right

That article is focusing on yesterday's issues. They even hint at it in the article, but faster processors and more powerful graphics cards just aren't going to make much of a difference in future games. Not because the graphics won't get any better, but because they already are plenty fast and powerful for the foreseeable future. The remaining hurdles to be overcome are raw bandwidth issues.

For example, the original Rage was 1TB of mostly textures, which would cripple any desktop in existence. I'm talking eye-popping, jaw-dropping, ultra-high-resolution textures. As it is, even the stripped-down version they sell is half movie that you just stream off your hard drive, complete with shadows pre-baked into the textures. Likewise, Intel is developing its Knights-series coprocessors for cloud-based ray-cast video games that would cripple any desktop. It's not the processors and graphics cards holding back video games anymore, but sheer bandwidth issues.

For the graphics to get significantly better, the raw amount of data required is simply outrageous. The Xbox 720 supposedly doesn't even have a disc drive, and my guess is discs simply can't hold the amount of data required for next-generation games, which are already rumored to have 4K resolutions. LCD manufacturers have found a cheap way to produce 4K monitors and TVs and are already adapting their production lines. The burning issue no one has an answer for is how the next-generation consoles will deal with the volume of data required to produce those kinds of resolutions.
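Just to put rough numbers on the bandwidth side (a back-of-envelope sketch in C++, assuming uncompressed 32-bit color and counting only the bare framebuffer, not texture reads, overdraw, or geometry):

```cpp
#include <cstdio>

int main() {
    // Bare framebuffer traffic at 60 fps, uncompressed 32-bit color.
    // Real games read many texels per pixel, so this is a floor, not an
    // estimate of total memory bandwidth.
    struct Mode { const char* name; int w, h; };
    const Mode modes[] = { {"720p", 1280, 720}, {"1080p", 1920, 1080}, {"4K", 3840, 2160} };
    const double bytes_per_pixel = 4.0;   // RGBA8
    const double fps = 60.0;

    for (const Mode& m : modes) {
        double mib_per_frame = m.w * m.h * bytes_per_pixel / (1024.0 * 1024.0);
        double gib_per_sec   = mib_per_frame * fps / 1024.0;
        std::printf("%-6s %4dx%-4d  %6.1f MiB/frame  %5.2f GiB/s at 60 fps\n",
                    m.name, m.w, m.h, mib_per_frame, gib_per_sec);
    }
    return 0;
}
```

Even that floor roughly quadruples going from 1080p to 4K, and real memory traffic sits well above it.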
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
For the graphics to get significantly better, the raw amount of data required is simply outrageous. The Xbox 720 supposedly doesn't even have a disc drive, and my guess is discs simply can't hold the amount of data required for next-generation games, which are already rumored to have 4K resolutions. LCD manufacturers have found a cheap way to produce 4K monitors and TVs and are already adapting their production lines. The burning issue no one has an answer for is how the next-generation consoles will deal with the volume of data required to produce those kinds of resolutions.
The best way is not to. Throw more GPU power at it, and scorn devs that try to add detail only by textures, which simply does not scale. Encourage them to be creative with that DX11 GPU (DirectCompute & HLSL) that will inevitably be as crippled as consoles of the past, in bandwidth terms. It's not just consoles, either. While we have far more to work with, we are limited by space and bandwidth on the PC, as well, and would also benefit from less reliance on textures for detail, as they necessarily eat up crazy amounts of memory bandwidth. We have the option to throw money at the problem, of course, but even that will have its limits, if current trends keep going on as they have been (affordable >384-bit RAM width video cards? Affordable alternative RAM tech? Not likely).

I think Rage uses a great technology, but the implementation is on the wrong side of things (high detail by textures, using compression to add even more!). That kind of high detail should be combined with some combination(s) of asset re-use and better real-time light/shadow. FI, something like Just Cause 2 with added detail from a streaming megatexture would probably look gorgeous. It's a good salve, don't get me wrong, but while it can make 20GB of HDD space act like 500 (arbitrary numbers), it only helps so much toward making 1GB of VRAM act like more (it does manage even that, though, by aggressively loading and unloading the textures, which is pretty cool).
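To illustrate what that aggressive loading and unloading amounts to, here's a toy LRU tile cache in C++. It's only a sketch of the general virtual-texturing idea under a fixed budget of resident tiles; it is not id Tech 5's actual scheme, and every name and number in it is made up:

```cpp
#include <cstdint>
#include <cstdio>
#include <initializer_list>
#include <list>
#include <unordered_map>

// Toy model of a virtual-texture tile cache: a fixed budget of resident
// tile slots with least-recently-used eviction. Illustrates how a small
// pool of VRAM can "act like more" by keeping only the tiles the current
// frame needs.
using TileId = std::uint64_t;   // e.g. (texture, mip, x, y) packed into one key

class TileCache {
public:
    explicit TileCache(std::size_t slots) : capacity_(slots) {}

    // Called for every tile the current frame wants resident.
    // Returns true on a miss, i.e. the tile had to be streamed in.
    bool request(TileId id) {
        auto it = lookup_.find(id);
        if (it != lookup_.end()) {                 // hit: mark most recently used
            lru_.splice(lru_.begin(), lru_, it->second);
            return false;
        }
        if (lru_.size() == capacity_) {            // full: evict the coldest tile
            lookup_.erase(lru_.back());
            lru_.pop_back();
        }
        lru_.push_front(id);                       // miss: stream in and insert
        lookup_[id] = lru_.begin();
        return true;
    }

private:
    std::size_t capacity_;
    std::list<TileId> lru_;                        // front = most recently used
    std::unordered_map<TileId, std::list<TileId>::iterator> lookup_;
};

int main() {
    TileCache cache(2);                            // absurdly small budget
    for (TileId id : {1ULL, 2ULL, 1ULL, 3ULL, 2ULL})
        std::printf("tile %llu: %s\n", (unsigned long long)id,
                    cache.request(id) ? "streamed" : "already resident");
    return 0;
}
```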

We have increasing ALU power, generation over generation, relative to even cache bandwidth. More detailed textures, with more layers, are not going to help. FI, have lighting handle different surfaces differently, so that skin won't need 3+ textures (RGBA, normal, UV, ...). Or create models from the start using DX11's tessellation, instead of pushing buttons to bring multi-million-poly models down to size. Oh, and quit trying to make things look realistic. It doesn't work well enough, and it eats up resources that are scarce.

It's not any one 4K texture, it's that the total scene is going to be limited to some maximum MBs-worth, and some GB/s worth of total transfers, so if you use simple models and detailed textures, they get to be low-detail textures by the time it's released. Increasing space and bandwidth needed to give more detail by adding more textures with higher resolutions is just unrealistic. Space and bandwidth requirements grow by at least 4x for every linear doubling of detail.
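The scaling is easy to check with a quick calculation (assuming plain uncompressed RGBA8; block compression changes the constant, not the growth rate):

```cpp
#include <cstdio>

int main() {
    // Uncompressed RGBA8 texture sizes: doubling the edge length quadruples
    // the memory. Mip chains add roughly another third on top.
    for (int edge = 1024; edge <= 8192; edge *= 2) {
        double mib = double(edge) * edge * 4 / (1024.0 * 1024.0);
        std::printf("%5d x %-5d  %7.0f MiB  (~%.0f MiB with mips)\n",
                    edge, edge, mib, mib * 4.0 / 3.0);
    }
    return 0;
}
```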
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
That article is focusing on yesterday's issues. They even hint at it in the article, but faster processors and more powerful graphics cards just aren't going to make much of a difference in future games. Not because the graphics won't get any better, but because they already are plenty fast and powerful for the foreseeable future. The remaining hurdles to be overcome are raw bandwidth issues.

For example, the original Rage was 1TB of mostly textures, which would cripple any desktop in existence. I'm talking eye-popping, jaw-dropping, ultra-high-resolution textures. As it is, even the stripped-down version they sell is half movie that you just stream off your hard drive, complete with shadows pre-baked into the textures. Likewise, Intel is developing its Knights-series coprocessors for cloud-based ray-cast video games that would cripple any desktop. It's not the processors and graphics cards holding back video games anymore, but sheer bandwidth issues.

For the graphics to get significantly better, the raw amount of data required is simply outrageous. The Xbox 720 supposedly doesn't even have a disc drive, and my guess is discs simply can't hold the amount of data required for next-generation games, which are already rumored to have 4K resolutions. LCD manufacturers have found a cheap way to produce 4K monitors and TVs and are already adapting their production lines. The burning issue no one has an answer for is how the next-generation consoles will deal with the volume of data required to produce those kinds of resolutions.

There is no way the next console generation will run 4K monitors. Current desktop GPUs barely have the horsepower to run Eyefinity resolutions with any kind of image quality. All the console hardware rumors say the new consoles will use low-end Southern Islands or 6670-level GPUs. There is no way they will be able to power a 4K resolution.

They will most likely run at 1080p, which will actually be a large step up from the 640p that most console games run at now. I remember when I bought my PS3, I thought the thing was broken because it wouldn't run in 1080p. I made threads on the PlayStation forums asking why my console wasn't running at HD resolutions, and to my dismay I learned that the horrid resolutions were by design.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
There is no way the next console generation will run 4K monitors. Current desktop GPUs barely have the horsepower to run Eyefinity resolutions with any kind of image quality. All the console hardware rumors say the new consoles will use low-end Southern Islands or 6670-level GPUs. There is no way they will be able to power a 4K resolution.

They will most likely run at 1080p, which will actually be a large step up from the 640p that most console games run at now. I remember when I bought my PS3, I thought the thing was broken because it wouldn't run in 1080p. I made threads on the PlayStation forums asking why my console wasn't running at HD resolutions, and to my dismay I learned that the horrid resolutions were by design.

You are thinking in terms of standard desktops and GPUs, but this is a console with custom hardware and software. For example, the 6670 in this case will undoubtedly have custom modifications, which could include AMD's new hardware acceleration for partially resident textures. On the Radeon 7970 it can handle textures up to 32TB.
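For a sense of scale, here's some rough arithmetic on what a 32TB virtual texture works out to, assuming 4 uncompressed bytes per texel (texture compression would stretch it even further):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // How big is a 32TB virtual texture? Assume uncompressed RGBA8
    // (4 bytes per texel).
    const double bytes  = 32.0 * 1024.0 * 1024.0 * 1024.0 * 1024.0; // 32 TiB
    const double texels = bytes / 4.0;
    const double edge   = std::sqrt(texels);   // if laid out as one square texture
    std::printf("%.0f texels total, roughly a %.0f x %.0f square\n",
                texels, edge, edge);           // ~8.8e12 texels, ~3M x 3M
    return 0;
}
```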
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
The best way is not to. Throw more GPU power at it, and scorn devs that try to add detail only by textures, which simply does not scale. Encourage them to be creative with that DX11 GPU (DirectCompute & HLSL) that will inevitably be as crippled as consoles of the past, in bandwidth terms. It's not just consoles, either. While we have far more to work with, we are limited by space and bandwidth on the PC, as well, and would also benefit from less reliance on textures for detail, as they necessarily eat up crazy amounts of memory bandwidth. We have the option to throw money at the problem, of course, but even that will have its limits, if current trends keep going on as they have been (affordable >384-bit RAM width video cards? Affordable alternative RAM tech? Not likely).

There is no benefit to adding more GPU power at this point because we've already taken rasterization to an extreme. Lighting effects, tessellation, DoF, etc. are all there, and the only remaining way to increase the graphics dramatically is through resolution. To go beyond that would require ray casting, which, again, has outrageous bandwidth requirements. Either way, you have to address the bandwidth issues or there just isn't much more that can be done.

I think Rage uses a great technology, but the implementation is on the wrong side of things (high detail by textures, using compression to add even more!). That kind of high detail should be combined with some combination(s) of asset re-use and better real-time light/shadow. FI, something like Just Cause 2 with added detail from a streaming megatexture would probably look gorgeous. It's a good salve, don't get me wrong, but while it can make 20GB of HDD space act like 500 (arbitrary numbers), it only helps so much toward making 1GB of VRAM act like more (it does manage even that, though, by aggressively loading and unloading the textures, which is pretty cool).

We have increasing ALU power, generation over generation, relative to even cache bandwidth. More detailed textures, with more layers, are not going to help. FI, have lighting handle different surfaces differently, so that skin won't need 3+ textures (RGBA, normal, UV, ...). Or create models from the start using DX11's tessellation, instead of pushing buttons to bring multi-million-poly models down to size. Oh, and quit trying to make things look realistic. It doesn't work well enough, and it eats up resources that are scarce.

It's not any one 4K texture, it's that the total scene is going to be limited to some maximum MBs-worth, and some GB/s worth of total transfers, so if you use simple models and detailed textures, they get to be low-detail textures by the time it's released. Increasing space and bandwidth needed to give more detail by adding more textures with higher resolutions is just unrealistic. Space and bandwidth requirements grow by at least 4x for every linear doubling of detail.

Rage uses id Tech 5, which, like all id Tech engines, is designed for speed rather than extreme graphics. However, to get really high resolutions there is no alternative but to use something like partially resident textures, or video games will start to look pixelated and repetitious. AMD's solution uses a variation invented by Disney for making movies, and developers don't even have to use it for the entire game like Rage does; they can use it just for things like panoramic scenes if they prefer.

As for how high-resolution the textures will be, you need to think in terms of the ten-year life cycle of consoles. HP has already announced they will begin selling their molecular-scale (5nm) memristors next year and has offered to put 2GB right on top of Intel's Haswell chip. There are at least a dozen molecular-scale memory technologies that major manufacturers have been working on for over a decade, and within five years your computer won't resemble anything you are familiar with today. I'm serious when I say that. Not only will cheap graphics cards become history, but conventional CPUs will become history too, and you just need to wrap your head around the idea that computers will be significantly different, with capacities that seem outrageous by today's standards.
 
Last edited:

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It doesn't change the fact that they will not support 4K resolutions. 1080p is going to be around for the next 5 years at least. The only 4K TV I can find for sale right now costs $500,000. That's five hundred thousand, and not a typo. If you think they are going to be priced for the likes of mere mortals in the next 3-5 years, you are far more optimistic than any person in history.

1080p is going to be the resolution for this console generation, no ifs, ands, or buts about it. Honestly, broadcast TV isn't even at 1080p yet, so I have no idea why you think the new consoles will support 4K resolutions.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
It doesn't change the fact that they will not support 4K resolutions. 1080p is going to be around for the next 5 years at least. The only 4K TV I can find for sale right now costs $500,000. That's five hundred thousand, and not a typo. If you think they are going to be priced for the likes of mere mortals in the next 3-5 years, you are far more optimistic than any person in history.

1080p is going to be the resolution for this console generation, no ifs, ands, or buts about it. Honestly, broadcast TV isn't even at 1080p yet, so I have no idea why you think the new consoles will support 4K resolutions.

Evidently you didn't read what I wrote. All the major LCD manufacturers are right now retrofitting their production lines to produce cheap ultra-high-resolution monitors and TVs. At most the screens will cost a third again more than current 1080p ones, with faster response times and lower energy use to boot. Consoles are developed with ten-year life cycles in mind, and they simply will not be designed without accounting for ultra-high-resolution TVs taking over the market.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
There is no benefit to adding more GPU power at this point because we've already taken rasterization to an extreme. Lighting effects, tessellation, DoF, etc. are all there, and the only remaining way to increase the graphics dramatically is through resolution. To go beyond that would require ray casting, which, again, has outrageous bandwidth requirements. Either way, you have to address the bandwidth issues or there just isn't much more that can be done.
Yes, you have to address bandwidth issues. But they cannot be dealt with by any magic. The technology to provide high bandwidth, low latency, and large storage simply does not yet exist at anything close to consumer prices, and potentially disruptive technologies will need to prove themselves for a while first.

There may be great technologies around the corner, but until they are in affordable mainstream hardware, so what? For the near term, a sure thing is needed, and less total texture memory used at any given time is pretty much it. There have always been great technologies around the corner, but very few pan out perfectly in a very short time span. Those memristors, if finally ready for prime time, FI, will need a few years in production before being considered for the likes of game consoles.

You can make all the specialized HW you want, but it has been proven time and time again that making that HW too hard to program is a problem, and the only solution we would have now would be very difficult to program for (SMP mini-GPU+CPU+ram nodes with high-speed interconnects). At a console cost, the best we can provide now will only just handle 1080P well, just as the XB360 and PS3 have consistently struggled at 720P.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Evidently you didn't read what I wrote. All the major LCD manufacturers are right now retrofitting their production lines to produce cheap ultra-high-resolution monitors and TVs. At most the screens will cost a third again more than current 1080p ones, with faster response times and lower energy use to boot. Consoles are developed with ten-year life cycles in mind, and they simply will not be designed without accounting for ultra-high-resolution TVs taking over the market.

A 4K monitor might make it at the tail end of the next console generation, but it will be about as useful to console gaming as 1080p is for the current console gen, and 1080p was around at the start of it. If 4K is supported, it will be purely a marketing bullet point. There is no way it will be playable at 30fps with anything other than 2D side-scrollers. The move to higher-than-1080p resolution in consumer panels now is not really about using that full resolution; it's about displaying at a fraction of it for increased clarity when scaling the UI, like on phones, laptops, tablets, etc. Now, if you were talking about 2K, I could see them supporting that, but high-end GPUs in PCs will have a hard time with that, let alone a stripped-down console version.

TV manufacturers have a hard enough time convincing consumers that 1080p is worth upgrading for. 4K makes no sense unless you are in film, medical, science, or something that requires the resolution. I don't think gaming will drive that, and it seems silly for console makers to strive for it when it's all about reducing the cost of the hardware.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
Yes, you have to address bandwidth issues. But they cannot be dealt with by any magic. The technology to provide high bandwidth, low latency, and large storage simply does not yet exist at anything close to consumer prices, and potentially disruptive technologies will need to prove themselves for a while first.

There may be great technologies around the corner, but until they are in affordable mainstream hardware, so what? For the near term, a sure thing is needed, and less total texture memory used at any given time is pretty much it. There have always been great technologies around the corner, but very few pan out perfectly in a very short time span. Those memristors, if finally ready for prime time, FI, will need a few years in production before being considered for the likes of game consoles.

You can make all the specialized HW you want, but it has been proven time and time again that making that HW too hard to program is a problem, and the only solution we would have now would be very difficult to program for (SMP mini-GPU+CPU+ram nodes with high-speed interconnects). At a console cost, the best we can provide now will only just handle 1080P well, just as the XB360 and PS3 have consistently struggled at 720P.

You still aren't listening. The technology to provide all this has been in development for over a decade and is already planned for release within the next two years. This is not a drill. These are not demos. HP intends to sell their first memristors next year. LCD manufacturers will start producing ultra-high-resolution screens before the end of this year. Intel and AMD intend to have unified CPU and GPU memory and hardware-accelerated transactional memory within two years. Hardware-accelerated transactional memory will allow processors to thread themselves and be easier to program than existing separate CPUs and graphics cards. If necessary, console manufacturers will wait until the technology is available rather than release bum products that become outdated within a year.
 
Last edited:

Nvidiaguy07

Platinum Member
Feb 22, 2008
2,846
4
81
From what I heard, the PS3 will have 4K support, but obviously no games will play at it. I guess supporting a 4K res will enable these consoles to do 1080p 3D? (I'm guessing current 3D movies get scaled down to 720p?) Not sure, and I also don't really care about 3D anyway.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
A 4K monitor might make it at the tail end of the next console generation, but it will be about as useful to console gaming as 1080p is for the current console gen, and 1080p was around at the start of it. If 4K is supported, it will be purely a marketing bullet point. There is no way it will be playable at 30fps with anything other than 2D side-scrollers. The move to higher-than-1080p resolution in consumer panels now is not really about using that full resolution; it's about displaying at a fraction of it for increased clarity when scaling the UI, like on phones, laptops, tablets, etc. Now, if you were talking about 2K, I could see them supporting that, but high-end GPUs in PCs will have a hard time with that, let alone a stripped-down console version.

TV manufacturers have a hard enough time convincing consumers that 1080p is worth upgrading for. 4K makes no sense unless you are in film, medical, science, or something that requires the resolution. I don't think gaming will drive that, and it seems silly for console makers to strive for it when it's all about reducing the cost of the hardware.

You're wrong. Before you start making sweeping statements about technology, the least you can do is read up on it. AMD's current 7970 is already capable of handling textures as large as 32TB. This website even wrote a rather great article on the subject explaining exactly how the video card can handle such large textures and still update the display fast enough for video games.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

The technology already exists and is on the market, and the latest iPad with its high-resolution Retina display has received complaints from customers because low-resolution websites look like crap on its super-high-resolution screen. It's not necessary for our current TVs and monitors to even have 1080p resolutions, but people can see the difference and they want the better-looking picture.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You still aren't listening.
No, you are assuming all this technology will, combined in one box plus a monitor, cost <$700 almost as soon as it is released, and that each piece will somehow be a panacea. We realize that such a situation is unrealistic.

The technology to provide all this has been in development for over a decade and is already planned for release within the next two years. This is not a drill. These are not demos. HP intends to sell their first memristors next year.
At what cost? And even then, they will want to wait to use those 4K displays, because faster memory still won't allow enough GPU power to utilize a 4K display within any reasonable TDP. If memristors come out, are affordable, and are fast enough to be GB-sized caches, then power will just rear its ugly hot head, as it has been doing for years now.

LCD manufacturers will start producing ultra high resolution screens before the end of this year.
Will they be affordable by 2017? Do you have an actual MSRP on say, a 27"? What kind of video interface do they use (2xDL-DVI, or some other kind of thing practically no one has?)?

Intel and AMD intend to have unified CPU and GPU memory and hardware-accelerated transactional memory within two years. Hardware-accelerated transactional memory will allow processors to thread themselves and be easier to program than existing separate CPUs and graphics cards.
Yes, but that won't solve any graphics quality or performance problems. It will solve the problem that threaded C and C++ is too damned difficult to get right, for historical reasons. Today, game development gets money thrown at it to solve that problem.

In addition, it will take much more than 2 years for the programming of CPU/GPU to get easier. That, even today, is very much an API and programming-language issue. Llano's rather basic integration shaves off enough latency to start worrying about how best to program them for cooperation.

You're wrong. Before you start making sweeping statements about technology, the least you can do is read up on it. AMD's current 7970 is already capable of handling textures as large as 32TB.
That's like saying Intel's 386 can handle 4GB of memory (it was not possible until the P3, IIRC, and was not reasonable until the Core 2 had AMD competition). I would be surprised if you could actually play anything with a 32TB texture. Its ISA is capable of addressing that much. So, they won't have to worry about reaching their limits for a while. If they're going to make an MMU spec (which is what it amounts to, though texture-specific), why not make it to last a decade or two?
 
Last edited:

Childs

Lifer
Jul 9, 2000
11,313
7
81
You're wrong. Before you start making sweeping statements about technology, the least you can do is read up on it. AMD's current 7970 is already capable of handling textures as large as 32TB. This website even wrote a rather great article on the subject explaining exactly how the video card can handle such large textures and still update the display fast enough for video games.

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

I am perfectly comfortable with my sweeping tech statements. I have actually seen unreleased higher-than-1080p displays, and I know what they are going to be used for. You seem to forget that we are talking about a console which is due in maybe 2-3 years, and a display that is nowhere near ready for consumer deployment.

The last time I went to NAB, at least 5 years ago, companies were talking about 4K, and even 8K, monitors for business and consumer deployment. NHK even had what they were calling Ultra HD in their booth. Guess what? It didn't happen. There's a reason: cost and specialization. Next-gen consoles will be lucky to achieve current high-end PC performance. Targeting phantom displays at more than twice the resolution of the highest available consumer display is not practical from a console perspective; it's fantasy and would be purely marketing.
 
Last edited:

StinkyPinky

Diamond Member
Jul 6, 2002
6,973
1,276
126
I think you guys that say "Who cares about the graphics" miss a very obvious thing.

Graphics can actually improve gameplay. More powerful GPUs and CPUs can mean more NPCs on screen, more bustling cities, improved visuals to enhance the atmosphere, more variety in NPCs, improved AI, larger maps, etc.

So weak consoles do hinder game development and it's just not all about wanting eye candy.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,973
1,276
126
Ambience != gameplay

Yes it can.

Imagine a city in a role-playing game that is actually a city, rather than a town with half a dozen houses. Now imagine trying to hunt down one individual in that city, which consists of thousands of NPCs. Imagine trying to find them, and then imagine trying to do a stealth kill on that guy.

Just one of many examples where powerful hardware increases the gameplay aspect.
 
Last edited:

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
No, you are assuming all this technology will, combined in one box plus a monitor, cost <$700 almost as soon as it is released, and that each piece will somehow be a panacea. We realize that such a situation is unrealistic.

You can believe what you want, but I have links to reputable websites that say different. 4K screens are already coming, and within a few years the prices should drop to roughly a third more than 1080p. A Radeon 6670 is already dirt cheap, and adding a few custom modifications won't make it any more expensive.

At what cost? And even then, they will want to wait to use those 4K displays, because faster memory still won't allow enough GPU power to utilize a 4K display within any reasonable TDP. If memristors come out, are affordable, and are fast enough to be GB-sized caches, then power will just rear its ugly hot head, as it has been doing for years now.

If you aren't going to listen to what I've written and refuse to read up on the technology using the links I provide, then it's pointless to argue with you about it. I've already countered these objections and have nothing more to say on the issue.

Will they be affordable by 2017? Do you have an actual MSRP on say, a 27"? What kind of video interface do they use (2xDL-DVI, or some other kind of thing practically no one has?)?

The first ones will no doubt be small screens for phones and tablets, but they are retrofitting their entire production lines, and TVs should be coming out next year. Exactly what kind of interfaces they'll support, or what color the box comes in, you'll just have to wait and see.

Yes, but that won't solve any graphics quality or performance problems. It will solve the problem that threaded C and C++ is too damned difficult to get right, for historical reasons. Today, game development gets money thrown at it to solve that problem.

In this case it means games can be better optimized from the get-go and, eventually, you can run things like physics and AI on the processor so the discrete graphics card can work on everything else.

In addition, it will take much more than 2 years for the programming of CPU/GPU to get easier. That, even today, is very much an API and programming-language issue. Llano's rather basic integration shaves off enough latency to start worrying about how best to program them for cooperation.

You evidently don't understand transactional memory. The whole point is to have the chip thread itself to a significant extent to simplify programming the chip. The thing decides for itself whether to use the CPU or GPU.

That's like saying Intel's 386 can handle 4GB of memory (it was not possible until the P3, IIRC, and was not reasonable until the Core 2 had AMD competition). I would be surprised if you could actually play anything with a 32TB texture. Its ISA is capable of addressing that much. So, they won't have to worry about reaching their limits for a while. If they're going to make an MMU spec (which is what it amounts to, though texture-specific), why not make it to last a decade or two?

That's true to some extent, but the point is the technology does exist and is easy to add. It may be that, like some current consoles, it will have the option of adding more memory for higher resolutions.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I am perfectly comfortable with my sweeping tech statements. I have actually seen unreleased higher-than-1080p displays, and I know what they are going to be used for. You seem to forget that we are talking about a console which is due in maybe 2-3 years, and a display that is nowhere near ready for consumer deployment.

The last time I went to NAB, at least 5 years ago, companies were talking about 4K, and even 8K, monitors for business and consumer deployment. NHK even had what they were calling Ultra HD in their booth. Guess what? It didn't happen. There's a reason: cost and specialization. Next-gen consoles will be lucky to achieve current high-end PC performance. Targeting phantom displays at more than twice the resolution of the highest available consumer display is not practical from a console perspective; it's fantasy and would be purely marketing.

Again, games like Rage are already half movie and just stream the textures straight off the hard drive with a lot of graphics effects already baked right in. id Software had to run the original game through a rendering farm to do that, but the whole point was to make the game run on something as wimpy as even an iPhone and still look good.

Also, like I already pointed out to Cerb, the technology is just cheap and easy to add. The console manufacturers can even leave out the extra memory required for 4K resolutions and charge customers extra to upgrade.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
Again, games like Rage are already half movie and just stream the textures straight off the hard drive with a lot of graphics effects already baked right in. id Software had to run the original game through a rendering farm to do that, but the whole point was to make the game run on something as wimpy as even an iPhone and still look good.

id's tech might not even see the light of day outside of Bethesda. The last I heard, they weren't looking to license it. You're also adding time and money to development by pre-rendering all the effects, and just looking at the results from Rage, it doesn't seem worth it. The jury is still out on what id Tech 5 means to the industry.

Also, like I already pointed out to Cerb, the technology is just cheap and easy to add. The console manufacturers can even leave out the extra memory required for 4K resolutions and charge customers extra to upgrade.

The problem with having consoles upgradeable is that devs will target the lowest common denominator, like what we saw on the 360 with memory units vs. the hard drive. Especially when you don't have the monitors in consumers' hands. I know you believe consumer 4K is coming right around the corner, but I've heard that before. I can see the next-gen consoles possibly recognizing a 4K display and having the UI to match, which could be added via firmware update after launch, but it doesn't seem practical to target that as a gaming resolution. I think 4K will kick in towards the middle to the end of the next-gen console lifecycle, which is too late.

Even if a 4K, 40" $2000 TV showed up tomorrow, I don't have a feel for what the consumer response would be. People have converted, or are in the process of converting, their TVs to 1080p sets. It just feels too soon to push a new standard on them. Then there's the lack of 4K content. It'll be used by consumers with PCs before it hits the living rooms, and most people have been resistant to 1440p and 1600p monitors due to cost. A lot has to happen for the market to be right for these displays.

Don't get me wrong, if you turn out to be a prophet I would be pleasantly surprised. It just doesn't seem likely.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You can believe what you want, but I have links to reputable websites that say different. 4K screens are already coming, and within a few years the prices should drop to roughly a third more than 1080p. A Radeon 6670 is already dirt cheap, and adding a few custom modifications won't make it any more expensive.
I think you misspelled 7970s in Crossfire, there. 4K with a 6670? Not on this world. On this one, a 6670 is just adequate for 1080p, and you'd really want better than that. If consoles are going to have something of that performance level, and will be expected to keep at least as much detail in games as the last generation, then they won't be rendering 4k lines. I don't doubt that a next-gen console designed today might have a 6670 equivalent. I absolutely disbelieve, however, that such a GPU could do much more than browse the web at such a high res.
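Just comparing raw pixel counts makes the point (a trivial sketch; real rendering cost scales at least linearly with pixel count, usually worse):

```cpp
#include <cstdio>

int main() {
    // Raw pixel counts relative to 1080p. A GPU that only just manages
    // 1080p has no headroom for 4K (4x the pixels) at the same detail level.
    const long long p1080 = 1920LL * 1080;
    struct Mode { const char* name; long long px; };
    const Mode modes[] = {
        {"720p",  1280LL * 720},
        {"1080p", p1080},
        {"1440p", 2560LL * 1440},
        {"4K",    3840LL * 2160},
    };
    for (const Mode& m : modes)
        std::printf("%-6s %9lld pixels  %.2fx 1080p\n",
                    m.name, m.px, double(m.px) / p1080);
    return 0;
}
```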

If you aren't going to listen to what I've written and refuse to read up on the technology using the links I provide, then it's pointless to argue with you about it. I've already countered these objections and have nothing more to say on the issue.
The only link you've provided has been a single AT article explaining that new Radeons can divide textures into subtiles, and arbitrarily fill memory with a map of those subtiles. Yes, it's good technology, but it will, at best, increase the effective space/bandwidth a single one-time linear factor. I have been reading what you've written, but it's mostly been wishful thinking, that we have some hardware and software fairies ready to make technology that doesn't even exist for commodity use yet become cheap very fast. It takes time, it takes money, and costs don't radically go down without very high confidence, or early adopters helping to pay back some of the costs.

The first ones will no doubt be small screens for phones and tablets, but they are retrofitting their entire production lines, and TVs should be coming out next year. Exactly what kind of interfaces they'll support, or what color the box comes in, you'll just have to wait and see.
So, no affordable TVs for a good 5+ years, then?

In this case it means games can be better optimized from the get-go and, eventually, you can run things like physics and AI on the processor so the discrete graphics card can work on everything else.
You mean we go and do...what we've been doing since 3D cards came into existence? Physics and AI have always run on my CPU. I don't see why they should have stopped, at any point. With AVX2, I expect the minor performance improvement GPUs have the ability to offer to finally go away, as well.

You evidently don't understand transactional memory. The whole point is to have the chip thread itself to a significant extent to simplify programming the chip. The thing decides for itself whether to use the CPU or GPU.
:biggrin: Um, no. The software will decide whether to use the CPU or GPU, just like today. The OS will decide what threads run where, when, and for how long, just like today. HTM gets rid of the severe overhead of STM, and has the potential to remove locks entirely, eventually.
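For anyone wondering what HTM looks like from the software side, here's a minimal lock-elision sketch using Intel's planned TSX/RTM intrinsics (_xbegin/_xend from immintrin.h; needs an RTM-capable CPU and the right compiler flag). It's purely illustrative, and the shared counter and retry policy are made up:

```cpp
#include <immintrin.h>   // _xbegin, _xend, _xabort (RTM intrinsics)
#include <atomic>

static std::atomic<bool> fallback_locked{false};   // simple spinlock flag
static long shared_counter = 0;

// Update shared state inside a hardware transaction; on repeated aborts,
// fall back to a spinlock. The transactional path reads the lock flag so
// that it conflicts with (and aborts against) any thread holding the
// fallback lock, which keeps the two paths mutually exclusive.
void increment_counter() {
    for (int attempt = 0; attempt < 3; ++attempt) {
        if (_xbegin() == _XBEGIN_STARTED) {
            if (fallback_locked.load(std::memory_order_relaxed))
                _xabort(0xff);                     // someone holds the lock: bail out
            ++shared_counter;                      // commits atomically at _xend()
            _xend();
            return;
        }
        // transaction aborted (conflict, capacity, interrupt...): retry
    }
    while (fallback_locked.exchange(true, std::memory_order_acquire))
        ;                                          // spin until we own the lock
    ++shared_counter;
    fallback_locked.store(false, std::memory_order_release);
}
```

Note that the software still picks the code paths and the OS still schedules the threads; the hardware only detects conflicts and rolls back speculative work.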

That's true to some extent, but the point is the technology does exist and is easy to add. It may be that, like some current consoles, it will have the option of adding more memory for higher resolutions.
...and all of this will hit the market, and be cheap, by 2014, before single GPUs are even capable of playing any current game at 4k lines? Right. The manufacturing technology is not there, and it would take many coinciding disruptive breakthroughs to make it happen, which we don't have evidence of. If 4k monitors of reasonable size come out at reasonable cost, we will then still need video processing capable of rendering to them, and unless some large fab company has a miracle up its sleeve for their next node(s), even the mighty Intel wouldn't be able to pull that off.

--

Yes it can.

Imagine a city in a role-playing game that is actually a city, rather than a town with half a dozen houses. Now imagine trying to hunt down one individual in that city, which consists of thousands of NPCs. Imagine trying to find them, and then imagine trying to do a stealth kill on that guy.

Just one of many examples where powerful hardware increases the gameplay aspect.
You could do that with Doom-level graphics, on computers from ages ago, or you could make it shiny, with glowing signs and reflections, today. That's not done because it takes a ton of time to develop that city, and all the people in it, not because of lack of hardware.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
id's tech might not even see the light of day outside of Bethesda. The last I heard, they weren't looking to license it. You're also adding time and money to development by pre-rendering all the effects, and just looking at the results from Rage, it doesn't seem worth it. The jury is still out on what id Tech 5 means to the industry.

id Software doesn't own the technology for partially resident textures. It's been widely used for making animated films for some time now.

The problem with having consoles upgradeable is that devs will target the lowest common denominator, like what we saw on the 360 with memory units vs. the hard drive. Especially when you don't have the monitors in consumers' hands. I know you believe consumer 4K is coming right around the corner, but I've heard that before. I can see the next-gen consoles possibly recognizing a 4K display and having the UI to match, which could be added via firmware update after launch, but it doesn't seem practical to target that as a gaming resolution. I think 4K will kick in towards the middle to the end of the next-gen console lifecycle, which is too late.

Even if a 4K, 40" $2000 TV showed up tomorrow, I don't have a feel for what the consumer response would be. People have converted, or are in the process of converting, their TVs to 1080p sets. It just feels too soon to push a new standard on them. Then there's the lack of 4K content. It'll be used by consumers with PCs before it hits the living rooms, and most people have been resistant to 1440p and 1600p monitors due to cost. A lot has to happen for the market to be right for these displays.

Don't get me wrong, if you turn out to be a prophet I would be pleasantly surprised. It just doesn't seem likely.

An optimist is someone who believes this is the best of all possible worlds, while a pessimist is someone who's afraid he is right.

We'll just have to wait and see how it turns out, but like 3D it's a cheap feature that can be added to at least high-end TVs and consoles.
 

Childs

Lifer
Jul 9, 2000
11,313
7
81
An optimist is someone who believes this is the best of all possible worlds, while a pessimist is someone who's afraid he is right.

I just don't think it's likely. I took on all comers in defense of the Steam box, so I can see possibilities, lol. Just looking at where the market is and looking back at the previous generation, it just doesn't seem likely. If we're still around in 5 years, one of us can ping the other about this thread. I promise not to rub it in... much!

We'll just have to wait and see how it turns out, but like 3D it's a cheap feature that can be added to at least high-end TVs and consoles.

lol, I don't agree with this either, but whatever, I don't necessarily want to get into it. Arguing about the future is kinda pointless.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I think you misspelled 7970s in Crossfire, there. 4K with a 6670? Not on this world. On this one, a 6670 is just adequate for 1080p, and you'd really want better than that. If consoles are going to have something of that performance level, and will be expected to keep at least as much detail in games as the last generation, then they won't be rendering 4k lines. I don't doubt that a next-gen console designed today might have a 6670 equivalent. I absolutely disbelieve, however, that such a GPU could do much more than browse the web at such a high res.

No, I didn't make a mistake. The rumors are it will use a 6670, and I'm guessing they'll add on the hardware acceleration for partially resident textures first released on the 7970. The only other thing it will require then is extra memory. When you have hardware acceleration, the power of the GPU is irrelevant. The thing is hardwired to produce those kinds of resolutions.

The only link you've provided has been a single AT article explaining that new Radeons can divide textures into subtiles, and arbitrarily fill memory with a map of those subtiles. Yes, it's good technology, but it will, at best, increase the effective space/bandwidth a single one-time linear factor. I have been reading what you've written, but it's mostly been wishful thinking, that we have some hardware and software fairies ready to make technology that doesn't even exist for commodity use yet become cheap very fast. It takes time, it takes money, and costs don't radically go down without very high confidence, or early adopters helping to pay back some of the costs.

So, no affordable TVs for a good 5+ years, then?

http://www.technologyreview.com/communications/39952/

Costs go down overnight all the time, especially in electronics. What keeps Moore's Law pushing forward is that the demand never ceases and the costs go down steadily. Even the original Commodore 64 went down overnight to a quarter of its original asking price when they found a cheap way to make RAM. We'll just have to wait and see how fast prices come down for ultra-resolution TVs.

You mean we go and do...what we've been doing since 3D cards came into existence? Physics and AI have always run on my CPU. I don't see why they should have stopped, at any point. With AVX2, I expect the minor performance improvement GPUs have the ability to offer to finally go away, as well.

No, I'm talking about running physics and AI that require 80-300 simplified processors right on the CPU. Things that have traditionally been done on a discrete graphics card will eventually be done on the APU. About 16 CPU cores is the max most home computers can use before it just becomes faster and less energy-demanding to use simplified cores. It's the divide-and-conquer strategy, and AMD's newest Southern Islands architecture does the same thing on a discrete GPU as well, allowing 4 different tasks to run simultaneously.

:biggrin: Um, no. The software will decide whether to use the CPU or GPU, just like today. The OS will decide what threads run where, when, and for how long, just like today. HTM gets rid of the severe overhead of STM, and has the potential to remove locks entirely, eventually.

Nope. The hardware is hardwired to make the decision, and the operating system has nothing to do with it. Fish gotta swim, birds gotta fly, and hardware-accelerated transactional memory has to decide for itself what to do with a program. You either help it do its thing, tell it to ignore something, or stand back and watch it do whatever it can without your help.

...and all of this will hit the market, and be cheap, by 2014, before single GPUs are even capable of playing any current game at 4k lines? Right. The manufacturing technology is not there, and it would take many coinciding disruptive breakthroughs to make it happen, which we don't have evidence of. If 4k monitors of reasonable size come out at reasonable cost, we will then still need video processing capable of rendering to them, and unless some large fab company has a miracle up its sleeve for their next node(s), even the mighty Intel wouldn't be able to pull that off.

Like I said, HP's new 5nm memristors will hit the market next year.

http://www.theregister.co.uk/2011/10/10/memristor_in_18_months/

It's not just the mighty Intel; AMD is coming out with its own version of hardware-accelerated transactional memory as well.

It's a convergence of technologies all coming onto the market at once, but the research to produce these things goes way back, and the need to bring them to market has grown ever since the first dual-core processor was produced. For some of this technology, the only reason it hasn't come on the market before is that it was simply too expensive and nobody could afford to buy it.
 
Last edited: