nVidia Graphics 1994-present

nitromullet

Diamond Member
Jan 7, 2004
9,031
36
91
Weird that they would show GRID... 1) It runs better on ATI IIRC. 2) It's not exactly new. The ray tracing demo was neat, although you could see a few hiccups every now and then. I guess that speaks to the level of hardware you need to do ray tracing.
 

brblx

Diamond Member
Mar 23, 2009
5,499
2
0
I fail to see how a bit of babbling and a GRID demo is 'the history of graphics.' I guess I was expecting a lot more.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: brblx
I fail to see how a bit of babbling and a GRID demo is 'the history of graphics.' I guess I was expecting a lot more.

This.
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
It wasn't meant to be an actual history of GPUs. It was just showing what graphics looked like in 1994 compared to today. TweakTown's title is a bit misleading.
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
After we get photo realism we start taking 2D 2560x1600 images and turning them into complete holographic 2560x1600x2560 scenes. That would theoretically keep chip makers busy for another 22 years.
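
A quick back-of-the-envelope check of that 22-year figure, assuming the rendering work scales linearly with the number of samples and that available compute doubles roughly every two years (both are simplifying assumptions, not figures from the post):

Code:
import math

# Rough check of the "22 years" figure: how many Moore's-law doublings
# (assumed here as one doubling every ~2 years) does it take to afford
# 2560x1600x2560 samples instead of 2560x1600?
pixels_2d = 2560 * 1600            # flat 2560x1600 frame
voxels_3d = 2560 * 1600 * 2560     # hypothetical volumetric scene

ratio = voxels_3d / pixels_2d      # 2560x more work per frame
doublings = math.log2(ratio)       # ~11.3 doublings needed
years = doublings * 2              # ~23 years at one doubling per 2 years

print(f"extra work: {ratio:.0f}x, doublings: {doublings:.1f}, years: {years:.0f}")

At one doubling every two years the naive estimate lands right around the low twenties, so the 22-year figure is at least self-consistent.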
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: firewolfsm
After we get photo realism we start taking 2D 2560x1600 images and turning them into complete holographic 2560x1600x2560 scenes. That would theoretically keep chip makers busy for another 22 years.

Thing is, we will be reaching the limit of matter within 5 years. Nanotechnology, working at the atomic/molecular level. We have 32nm coming next year, a year after that 22nm is coming (fudzilla.com/index.php?option=...ent&task=view&id=13385). Then all we have left is 16nm, and then 11nm which is nanoelectronics.

Once we reach 11nm, you can't go any smaller... Because there's nothing stable smaller than an atom lol. So from there we merely have to create more efficient chips and use new technologies such as photonics, superconductors, carbon nanotubes, graphene and quantum computing. Chips will just be getting bigger and more optimized after we get to that point. Possibly 3D chips. People having a big cube processor lol.

It's amazing how close we are though, just around 5 years away.
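
For a sense of how many conventional shrinks that actually leaves, here is a rough sketch assuming the classic ~0.7x linear scaling per full node and a ~0.24 nm silicon atomic spacing; both numbers are approximations introduced for illustration, not figures from the thread:

Code:
# Naive node progression: keep applying a ~0.7x linear shrink from 32 nm
# until the nominal node size approaches silicon's ~0.24 nm atomic spacing.
# Real nodes don't follow this exactly; it's only a geometric sketch.
SHRINK = 0.7          # classic full-node linear scaling factor
SI_SPACING_NM = 0.24  # approximate Si-Si nearest-neighbour distance

node = 32.0
count = 0
while node > SI_SPACING_NM:
    print(f"node {count}: {node:.2f} nm")
    node *= SHRINK
    count += 1

print(f"{count} nominal node labels before dropping below ~{SI_SPACING_NM} nm")

Even as a pure geometric series the nominal labels run well past 16nm and 11nm before the numbers get anywhere near atomic spacing.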
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
After we get photo realism

Decades away, at least.

Once we reach 11nm, you can't go any smaller...

We can go a lot smaller than 11nm:

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12-0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length.

Around 5nm we run into material issues, but 11nm is, relatively speaking, a long way off from the atomic level :)
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: BenSkywalker
After we get photo realism

Decades away, at least.

Once we reach 11nm, you can't go any smaller...

We can go a lot smaller than 11nm:

One nanometer (nm) is one billionth, or 10⁻⁹, of a meter. By comparison, typical carbon-carbon bond lengths, or the spacing between these atoms in a molecule, are in the range 0.12-0.15 nm, and a DNA double-helix has a diameter around 2 nm. On the other hand, the smallest cellular life-forms, the bacteria of the genus Mycoplasma, are around 200 nm in length.

Around 5nm we run into material issues, but 11nm is, relatively speaking, a long way off from the atomic level :)

11nm is just the lithography node; the actual components on the chip are smaller, with the gates only being a few nm. And obviously you can't make a gate with a single atom lol. After we hit the 11nm process, it is still probably going to take a few years to really master creating atomic-level technology. But it's still amazing how close we are to that point, and most people don't know that chips won't keep shrinking for much longer.

I kind of worry what will happen to the PC industry a bit after that point, not being able to create faster chips by downsizing anymore.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: Kakkoii
Originally posted by: firewolfsm
After we get photo realism we start taking 2D 2560x1600 images and turning them into complete holographic 2560x1600x2560 scenes. That would theoretically keep chip makers busy for another 22 years.

Thing is, we will be reaching the limit of matter within 5 years. Nanotechnology, working at the atomic/molecular level. We have 32nm coming next year, a year after that 22nm is coming (fudzilla.com/index.php?option=...ent&task=view&id=13385). Then all we have left is 16nm, and then 11nm which is nanoelectronics.

Once we reach 11nm, you can't go any smaller... Because there's nothing stable smaller than an atom lol. So from there we merely have to create more efficient chips and use new technologies such as photonics, superconductors, carbon nanotubes, graphene and quantum computing. Chips will just be getting bigger and more optimized after we get to that point. Possibly 3D chips. People having a big cube processor lol.

It's amazing how close we are though, just around 5 years away.

Wiki is a poor substitute for reality in this case. Sometimes it is pretty good, other times it is just simply misleading.

We've been calling it nanotechnology and nanoelectronics ever since we crossed below the 110nm node.

I take it you aren't "in the industry" so to speak, I don't fault you for seeking out info on the web and taking it at face value as you have little else to scrutinize it with respect to.

But trust me we aren't 5yrs from the limits of anything, and we don't "run out" of nodes just because 11nm makes for a sexy wiki page title.

Sometime around 2020 (maybe as late as 2025) the use of silicon-based CMOS for leading edge IC's will run out of steam, there is a whole periodic table full of elements just waiting to be leveraged in the pursuit of making computations faster and faster.

From what I have seen, if we don't go below 11nm it won't be for any materials science, chemistry, or physics based limitations or reasons, it will be solely due to a lack of compelling economics of the markets for which the products would be targeted.

Flying cars and moonbases, all technically feasible, economically not so much.
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: Idontcare
Originally posted by: Kakkoii
Originally posted by: firewolfsm
After we get photo realism we start taking 2D 2560x1600 images and turning them into complete holographic 2560x1600x2560 scenes. That would theoretically keep chip makers busy for another 22 years.

Thing is, we will be reaching the limit of matter within 5 years. Nanotechnology, working at the atomic/molecular level. We have 32nm coming next year, a year after that 22nm is coming (fudzilla.com/index.php?option=...ent&task=view&id=13385). Then all we have left is 16nm, and then 11nm which is nanoelectronics.

Once we reach 11nm, you can't go any smaller... Because there's nothing stable smaller than an atom lol. So from there we merely have to create more efficient chips and use new technologies such as photonics, superconductors, carbon nanotubes, graphene and quantum computing. Chips will just be getting bigger and more optimized after we get to that point. Possibly 3D chips. People having a big cube processor lol.

It's amazing how close we are though, just around 5 years away.

Wiki is a poor substitute for reality in this case. Sometimes it is pretty good, other times it is just simply misleading.

We've been calling it nanotechnology and nanoelectronics ever since we crossed below the 110nm node.

I take it you aren't "in the industry" so to speak, I don't fault you for seeking out info on the web and taking it at face value as you have little else to scrutinize it with respect to.

But trust me we aren't 5yrs from the limits of anything, and we don't "run out" of nodes just because 11nm makes for a sexy wiki page title.

Sometime around 2020 (maybe as late as 2025) the use of silicon-based CMOS for leading edge IC's will run out of steam, there is a whole periodic table full of elements just waiting to be leveraged in the pursuit of making computations faster and faster.

From what I have seen, if we don't go below 11nm it won't be for any materials science, chemistry, or physics based limitations or reasons, it will be solely due to a lack of compelling economics of the markets for which the products would be targeted.

Flying cars and moonbases, all technically feasible, economically not so much.

I know it has been called nanotechnology for a long time. But true nanotech is working at the smallest scale possible. The 11nm page isn't just there because it's sexy, it's part of the International Technology Roadmap for Semiconductors.

Also, I didn't say we run out of nodes; I'm merely saying that around that time we can't really shrink the chip components any more. Using a smaller node would merely allow for better yields and detail/accuracy. 2020 is too far away; you have to look at Intel's roadmap, because they have been steadily beating the ITRS's predictions for quite some time. And going by Intel's roadmap, we are only around 5-6 years away from reaching the limit on how small the gates on a chip can be. We will have to transition to more advanced tech, probably starting with the move to carbon nanotube lanes and graphene gates. Then the next step may be using photonics to transmit data around our chips/motherboards for a much higher clock. And eventually quantum computers, if that ever becomes feasible.
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
BTW I agree that photorealism is far away when it comes to drawing humans.
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: lopri
BTW I agree that photorealism is far away when it comes to drawing humans.

I disagree! :p

http://www.youtube.com/watch?v=SwAV2fXoy6E&fmt=18

http://www.youtube.com/watch?v=lvI1l0nAd1c&fmt=18

http://www.youtube.com/watch?v=-wtv4bsLWvw&fmt=18

http://www.youtube.com/watch?v=bLiX5d3rC6o&fmt=18

http://www.youtube.com/watch?v=e3vfxK-pMzE&fmt=18


I think we're pretty close. Making skin textures look like the real thing under different lighting conditions still needs some work though.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: Kakkoii
The 11nm page isn't just there because it's sexy, it's part of the International Technology Roadmap for Semiconductors.

:laugh: I spent time developing the ITRS roadmaps; I'm well acquainted with their limitations as well as their purpose, some of which I attempted to communicate in my post. At any rate, you have convinced me of the steadfastness of your convictions, and I'm none too keen to impart contusions to an otherwise deceased equine (that is, to beat a dead horse) this evening, so I'll busy myself elsewhere for a while.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: lopri
[quoted image]

The other examples were OK, but I thought this one that I quoted was really superb :thumbsup:

It needs some junk floating in the water (in it and on the surface), as otherwise it still looks too perfect to be real, but other than that it seems pretty photorealistic to me.
 

Henrah

Member
Jun 8, 2009
49
0
0
Originally posted by: lopri
[quoted image]

I always thought Crysis looked amazing, but now I've noticed that the grass is really poor when not in shadow. I understand that grass is very resource intensive, but surely it should be in the same league as the plants and other vegetation, which have much higher fidelity. Or maybe each blade was so small that high-quality shaders were not worth it? In this particular image, I think it should be.

Any idea what spec machine those renders came from, and whether they were from playable frame rates?
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: Henrah
Originally posted by: lopri
[quoted image]

I always thought Crysis looked amazing, but now I've noticed that the grass is really poor when not in shadow. I understand that grass is very resource intensive, but surely it should be in the same league as the plants and other vegetation, which have much higher fidelity. Or maybe each blade was so small that high-quality shaders were not worth it? In this particular image, I think it should be.

Any idea what spec machine those renders came from, and whether they were from playable frame rates?

It's from a mod that's being created for Crysis. Those are in-game shots with the quality settings at their highest. Don't know about the AA and AF levels though. But judging by the image, AA and AF are probably at or near their highest.

http://www.crymod.com/thread.php?threadid=26547


Originally posted by: Idontcare
Originally posted by: Kakkoii
The 11nm page isn't just there because it's sexy, it's part of the International Technology Roadmap for Semiconductors.

:laugh: I spent time developing the ITRS roadmaps, I'm well acquainted with its limitations as well as its purpose, some of which I attempted to communicate in my post. At any rate you have convinced me of the steadfastness of your convictions and I'm none too keen to impart contusions to an otherwise deceased equine (that is, to beat a dead horse) this evening so I'll busy myself elsewhere for a while.

I'm not here to argue, just discuss. If there are fallacies in my last post, please point them out because I live to learn. So far you have just made broad statements about it towards me, which doesn't really help me to learn, now does it?
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: Kakkoii
I'm not here to argue, just discuss. If there are fallacies in my last post, please point them out because I live to learn. So far you have just made broad statements about it towards me, which doesn't really help me to learn, now does it?

Fair enough. FWIW I had thought I was being rather specific in addressing the fallacy of perceiving 11nm as some barrier to scaling, as well as the timeline of 5yrs being needlessly conservative. If my comments came across as being broad, generic, or vague, it's perhaps because the truth of it is that the sky is very much the limit still in scaling silicon-based CMOS (provided someone feels like paying the bills), and as such any attempt to talk about nebulous engineering labels such as "11nm node" is really not addressable with specifics that can be composed in a post of 30 lines of text or less. (not that I won't try every now and then ;))

To me, trying to argue with someone nowadays that their perception of scaling limitations is in error is kinda like trying to argue with someone 10yrs ago that there was no such thing as a "1GHz barrier"...there were some heated "wtf are you smoking" threads over why non-engineers kept referring to an arbitrary clockspeed like 1GHz as being a barrier somehow different and special in comparison to clocking an IC at 999MHz or 1.001 GHz. Fast forward to today and comments about 11nm being the end of scaling, etc, are just more of the same to my semi-glazed over eyes.

I've spent the last 15 yrs listening to folks tell me that scaling (i.e. my job) was going to die in two nodes or less; I guess I get a little needlessly terse about it sometimes after listening to what seems to be a broken record year after year (not your fault, you're not the one repeating it; it's never the same guy twice), so please give an old(ish) guy a moment to regain his composure. I apologize if I said anything that seemed argumentative or antagonistic above.

Let's move past it and on to something with more meat in it for your digestion and contemplation. This thread contains quite a few posts discussing the industry's ability to move to 5nm with existing proven technologies (basically addressing a critical barrier, the litho) and includes a link to Samsung's own R&D executive positing that scaling to 1.2nm is viewed as having known pathways.

(known pathways mean just add money and it is within scientific expectation that the engineering challenge is solvable, versus an unknown situation like say cold-fusion where there is still doubt as to whether it is theoretically feasible regardless of the quantity of R&D money one might spend on it)
 

Kakkoii

Senior member
Jun 5, 2009
379
0
0
Originally posted by: Idontcare
Originally posted by: Kakkoii
I'm not here to argue, just discuss. If there are fallacies in my last post, please point them out because I live to learn. So far you have just made broad statements about it towards me, which doesn't really help me to learn, now does it?

Fair enough. FWIW I had thought I was being rather specific in addressing the fallacy of perceiving 11nm as some barrier to scaling as well as the timeline of 5yrs being needlessly conservative. If my comments came across as being broad, generic, or vague its perhaps because the truth of it is that the sky is very much the limit still in scaling silicon-based CMOS (provided someone feels like paying the bills) and as such any attempts to talk about nebulous engineering labels such as "11nm node" is really not addressable with specifics that can be composed in a post while being 30 lines of text or less. (not that I won't try every now and then ;))

To me, trying to argue with someone nowadays that their perception of scaling limitations is in error is kinda like trying to argue with someone 10yrs ago that there was no such thing as a "1GHz barrier"...there were some heated "wtf are you smoking" threads over why non-engineers kept referring to an arbitrary clockspeed like 1GHz as being a barrier somehow different and special in comparison to clocking an IC at 999MHz or 1.001 GHz. Fast forward to today and comments about 11nm being the end of scaling, etc, are just more of the same to my semi-glazed over eyes.

I've spent the last 15 yrs listening to folks tell me that scaling (i.e. my job) was going to die in two nodes or less; I guess I get a little needlessly terse about it sometimes after listening to what seems to be a broken record year after year (not your fault, your not the one repeating it, the same guy never does twice) so please give an old(ish) guy a moment to regain his composure. I apologize if I said anything that seemed argumentative or antagonistic above.

Let's move past it and on to something with more meat in it for your digestion and contemplation. This thread contains quite a few posts discussing the industry's ability to move to 5nm with existing proven technologies (basically the addressment of a critical barrier, the litho) and includes a link to Samsungs own R&D executive positing that scaling to 1.2nm is viewed as having known pathways.

(known pathways mean just add money and it is within scientific expectation that the engineering challenge is solvable, versus an unknown situation like say cold-fusion where there is still doubt as to whether it is theoretically feasible regardless of the quantity of R&D money one might spend on it)

Yeah I understand what you're saying. I probably should have worded my initial posts a little better, because I by no means meant that lithography couldn't go past 11nm; I was merely implying that by around that scale, the components on the chip will be coming close to the atomic limit. If the gate size at 11nm is 6nm, then that means it's only roughly 60 atoms wide, no? So that's getting pretty darn close, you only have so many atoms :p. Still a few nodes to go after that, I know, but close nonetheless. But yeah, I retract most of my past statements; it is going to take more than 5 years to reach the point of construction only a few atoms thick. But we are getting pretty damn close! :p It will be an awesome day when I get to have my own carbon nanotube and graphene gate CPU! haha.

I understand the point about trying to argue with someone about something like this. When a person has been making a claim they believe to be true enough times, and then someone with the knowledge finally confronts them about it, it's hard for the brain to just let go of what it thought was right for so long. It's like arguing with a heavily religious person about evolution.

And it's always awesome to meet someone who has such a job. And thanks for the link.
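
As a sanity check on that "roughly 60 atoms wide" figure for a hypothetical 6 nm gate, the answer depends heavily on which spacing is assumed; both values below are approximate reference numbers, not anything from the thread:

Code:
# Quick check of the "roughly 60 atoms wide" estimate for a hypothetical
# 6 nm gate length. The count depends on which atomic spacing you assume.
gate_nm = 6.0

si_si_bond_nm = 0.235     # approx. Si-Si nearest-neighbour distance
atom_size_nm = 0.1        # the ~1 angstrom "atom size" often quoted

print(f"~{gate_nm / si_si_bond_nm:.0f} silicon bond lengths across the gate")  # ~26
print(f"~{gate_nm / atom_size_nm:.0f} atoms at a 0.1 nm spacing")              # ~60

So "60" only holds for the smallest reasonable spacing; counted in silicon bond lengths it is closer to 25-30 atoms across.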
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Me thinks we are pretty close

BTW I agree that photorealism is far away when it comes to drawing humans.

There honestly is no difference in the technical hurdles for getting environments or humans to be photorealistic. With humans it is more striking to most people, as we spend a lot of our lives focusing on other humans, not so much on grass.

We have three rather large issues right now: the 'plastic' look, movement, and repetitiveness. The plastic look is partly a shortcoming of shader limitations, but the larger factor at the moment is the ability to accurately reproduce light properties. The only way to solve this is to use radiosity or something comparable. Any of the methods currently in use dwarfs the computational intensity of trivial things like ray tracing.
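
To put a rough shape on that cost argument, here is a minimal, purely illustrative radiosity-style solve (a classic gathering iteration B = E + rho * F * B, with random numbers standing in for real form factors); the n-by-n form-factor matrix alone makes memory and per-iteration work grow as O(n^2) in the patch count:

Code:
import numpy as np

# Illustrative radiosity solve: B = E + rho * (F @ B), iterated until it
# settles. F is the n x n form-factor matrix, so cost grows as O(n^2) in
# the number of patches -- the point about global illumination expense.
# Random form factors stand in for real scene geometry here.
n = 512                                # patch count (tiny for a real scene)
rng = np.random.default_rng(0)

F = rng.random((n, n))
np.fill_diagonal(F, 0.0)               # a patch doesn't see itself
F /= F.sum(axis=1, keepdims=True)      # rows sum to 1 (closed environment)

rho = np.full(n, 0.5)                  # diffuse reflectance per patch
E = np.zeros(n)
E[:4] = 1.0                            # a few emitting patches (the lights)

B = E.copy()
for _ in range(50):                    # each step is an O(n^2) mat-vec
    B = E + rho * (F @ B)

print(f"total radiosity after solve: {B.sum():.2f}")

Even this toy version needs n^2 form factors; a scene with millions of patches would need trillions of them, which is why it dwarfs per-pixel ray casting.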

Movement issues at the moment have a whole bunch of problems; a very large factor is the extremely limited geometry we have to deal with. Right now we have bunches of grass that move in tandem; on a realistic basis each blade needs to be able to move in hundreds of different ways. For gaming that will obviously be way down the list of things to do, but it is a simple example of a far more complex technical issue. Stare at yourself in the mirror with a somber expression and then smile; ignore the area around your mouth and focus on how many different things move from this simple action. All of the differences in the contours of your face, eyes, even your scalp and hair (hair is another area we are terrible at, but that ties in with movement). We need many orders of magnitude more geometry than what we have now for everything. Our animation system is also very primitive. Right now, the player's arm moves and the player's arm moves. The subtleties are far more complex: your skin isn't simply a single layer; you have many different layers that move in different ways, not all in sync. Clothing is another area, obviously. It is easy to point out tons of issues just with humans, which is why I mentioned a blade of grass first. We are a staggering way off from photorealism in movement (animation techniques right now aren't much better than they were in GLQuake, and then we have the physics details required to accomplish this).

Repetitiveness is another staggering hurdle. I can blow up trees in Crysis (the plastic-looking ones that don't move correctly ;) ), and they all fall into a small handful of identical trees everywhere. Even if you modify the shape/size, the bark is still the same. Trees don't grow in perfect tandem in the wild, and it is very obvious. Why this presents major technical hurdles is that we need a way to create differences in a procedural manner, or asset creation budgets would approach billions of dollars (actually, it would easily push into the billions for a single larger-scale game). To realistically see convincing levels of variety we need to have staggering levels of power at our disposal to generate these things on the fly; they can't all be hand designed.
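
One common way around that asset-budget problem is to derive per-instance variation procedurally and deterministically from something like world position, so no two trees need to be hand-authored and nothing extra has to be stored. A minimal sketch; the parameter names and ranges here are made up for illustration:

Code:
import hashlib
import struct

# Sketch of procedural per-instance variation: a tree's parameters are
# derived deterministically from its world position, so the same position
# always regenerates the same tree and neighbours all differ slightly.
def tree_params(x: float, y: float) -> dict:
    digest = hashlib.sha256(struct.pack("<ff", x, y)).digest()
    u = [b / 256.0 for b in digest[:4]]      # first hash bytes -> [0, 1)
    return {
        "height_m":     4.0 + 8.0 * u[0],    # 4-12 m tall
        "trunk_bend":   0.3 * (u[1] - 0.5),  # slight lean either way
        "branch_seed":  digest[4],           # seeds a branch generator
        "bark_variant": digest[5] % 8,       # picks one of 8 bark textures
    }

print(tree_params(10.0, 20.0))   # same input -> same tree every time
print(tree_params(10.5, 20.0))   # a neighbour gets different parameters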

There are several other smaller problems, like physics and actual air conditions. Overwhelmingly, every game right now has the look of air when it is well below zero; even the ones that don't still tend to at short distances, at least. We have nothing in place outside of a couple of simple haze/blur/DOF effects to deal with this, and it makes a very noticeable difference. Particle systems need to expand a staggering amount also, another area that requires a ton of computational power on both the rendering and non-rendering side of things. If I see a breeze, I should see dust and debris moving in a realistic sense (which means all objects need to have physics tied to them, not just things we can hit with a gravity gun ;) ).
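
For what the most basic version of that "a breeze should move the debris" idea looks like, here is a tiny particle sketch (plain Euler integration toward a made-up constant wind, with a drag term; nothing engine-specific):

Code:
import random

# Toy wind-driven debris: each particle's velocity relaxes toward a constant
# wind (drag), then its position is integrated with simple Euler steps.
# All constants are made up for illustration.
WIND = (2.0, 0.0, 0.5)   # m/s breeze, mostly along +x
DRAG = 1.5               # how quickly particles match the wind
DT = 1.0 / 60.0          # 60 Hz simulation step

particles = [{"pos": [random.uniform(0.0, 10.0), random.uniform(0.0, 2.0), 0.0],
              "vel": [0.0, 0.0, 0.0]}
             for _ in range(1000)]

for _ in range(600):                 # ten simulated seconds
    for p in particles:
        for i in range(3):
            p["vel"][i] += DRAG * (WIND[i] - p["vel"][i]) * DT
            p["pos"][i] += p["vel"][i] * DT

print("first particle drifted to", [round(v, 2) for v in particles[0]["pos"]])

A real engine would of course add collision, turbulence, and per-object physics, which is where the computational cost described above comes from.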

Each of the major elements requires an increase in computational power over what we have now greater than the move from GLQuake to Crysis. That isn't collectively; that is each by itself. Even after you have all of that taken care of, the minor effects (which really start to stand out when you have the big stuff covered) would by themselves require considerably more computational power than what we have now, even if they were the only thing current tech was doing.

Movie CGI cheats a ton because they only have to do exactly what you see; games have to plan on you seeing everything. The level of computational power needed to get photorealistic in games is considerably more than what went into rendering out something like, say, Episode 3.

To say again, decades away, at least.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Originally posted by: Kakkoii
Yeah I understand what your saying. I probably should have worded my initial posts a little better, because I by no means intended that lithography couldn't go past 11nm, I was merely implying that the components on the chip when we get around that scale will be coming close to the atomic limit. If the gate size at 11nm is 6nm, then that means it's only roughly 60 atoms wide, no? So that's getting pretty darn close, you only have so many atoms :p, still a few nodes to go after that I know, but close none the less. But yeah I retract most of my past statements, that it is going to take more than 5 years to reach the point of only a few atom thick construction. But we are getting pretty damn close! :p. It will be an awesome day when I get to have my own carbon nanotube and graphene gate CPU! haha.

I know it's OT for this thread, but I just saw this and figured you'd get a pretty good read out of it, along with a few others (otherwise I would have just sent the link by PM... thilan29, please forgive the OT intrusion
):

IMEC Tips 10 nm Options at Tech Forum

At the IMEC Technology Forum in Brussels, Belgium, IMEC Fellow Marc Heyns presented various CMOS transistor possibilities for 15 nm and beyond. "We are at the brink of a new era of innovation," Heyns said, adding that he sees no fundamental barriers to scaling to the 10 nm node.

http://www.semiconductor.net/a...ions_at_Tech_Forum.php