So when is the next 'game-changer' in PC hardware/software going to come along?

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76
It feels like graphics capability has stagnated for a while now. Game AI has barely improved in any game in the last 15 years either. It's like they pour every penny into voice acting and polygon count without a whole lot of benefit.

Some new technology, whether software or hardware, needs to come along sometime, but when? There has been mention of future computers using light instead of electrical pulses; has anything come of that? Is there a way to move game engines away from polygons in the future and replace them with something else? There was a tech demo about six years ago demonstrating particles instead of polygons, but nothing ever came of it.

With the possibility that Valve could be doing something to steer away from Windows 8, I got to thinking about what else could be changed.

Also, consider how shadows work in games: they have always been awful, both in visual quality and in the performance hit when enabled. It's always the same thing, a glitchy shadow on a character's face here, a jagged shadow off the wall there... Could game engines be designed differently to allow for more flexible shadows without the performance hit? How about a "Shadow Processing Unit" (SPU?), a piece of hardware to plug into a PCIe slot, with an API for devs to use? Thinking along those lines, could we break the graphics card down into smaller modular units: here's a shader card, here's a GPU RAM card, here's a texture processing card, etc.?
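For context on why shadows cost so much and still glitch: most engines today use shadow maps, i.e. render depth from the light's point of view and then compare every pixel against it. Here's a minimal CPU-side sketch of just that comparison (the resolution, bias value, and names are made up for illustration; real engines do this per pixel in a shader):

```cpp
// Minimal sketch of the shadow-map depth test most engines use.
// Made-up sizes and bias; real engines run this per pixel on the GPU.
#include <array>
#include <cstdio>

constexpr int SHADOW_RES = 1024;                       // finite resolution => jagged edges
std::array<float, SHADOW_RES * SHADOW_RES> shadowMap;  // depth as seen from the light

// Is a point (already projected into the light's view) in shadow?
bool inShadow(float lightU, float lightV, float depthFromLight) {
    int x = static_cast<int>(lightU * SHADOW_RES);
    int y = static_cast<int>(lightV * SHADOW_RES);
    float stored = shadowMap[y * SHADOW_RES + x];

    // The bias hides self-shadowing "acne", but too much of it detaches
    // shadows from objects -- hence the constant visual glitches.
    constexpr float bias = 0.005f;
    return depthFromLight - bias > stored;
}

int main() {
    shadowMap.fill(1.0f);                         // pretend nothing blocks the light
    shadowMap[512 * SHADOW_RES + 512] = 0.3f;     // one occluder in the middle
    std::printf("%d\n", inShadow(0.5f, 0.5f, 0.6f));  // behind the occluder -> 1
    std::printf("%d\n", inShadow(0.1f, 0.1f, 0.6f));  // lit -> 0
    return 0;
}
```

The cost comes from rendering that depth map every frame for every light, which is part of why dedicated hardware for it sounds tempting.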

I hardly know what I'm talking about here. I guess I'm bored with today's technology and want something different.
 

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
It will happen after the upcoming video game crash, when people get tired of playing CoD/Battlefield every year.
 
Feb 4, 2009
35,862
17,406
136
When everyone has 4K monitors, even cheap PCs will be able to play stuff really well. There seems to be a crapload of wasted performance in most games.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
All the pieces are rapidly falling into place for the development of a new generation of computers that only vaguely resemble the ones we have today. The single biggest innovation to look forward to in next-generation computer hardware is what I like to call Frankenstein chips. Just as shrinking components on a chip reduces latencies, you can stack one chip right on top of another and/or place them side by side on a silicon "interposer" to create one massive chip sewn together from parts, like Frankenstein's monster. This way the parts are closer together, for lower power requirements and reduced latencies. With each chip being millimeters thick and the size of a fingernail, the sky is the limit if you can deal with heat and the other technical issues economically. The most frustrating thing for me about this technology is that it is completely unpredictable, and the expensive Frankenstein chip you buy today could become obsolete overnight.

Stacking may sound like pie in the sky to some, but servers and smartphones already use stacked chips, HP has offered to put 2 GB of its memristors on top of any existing chip, and stacks of up to eight conventional RAM chips can be made. The only remaining issue is cost, and last year Intel demonstrated its new Hybrid Memory Cube with 1 Tb/s transfer speeds, while pictures of its upcoming Haswell chip indicate it was designed to use an interposer. A consortium of all the major manufacturers has already formed to establish a new standard for Hybrid Memory Cubes so they can replace traditional RAM sticks as soon as possible. Using 70% less space and a seventh of the power, they are ideal for portable applications. Because each cube contains its own controller chip for input and output, the memory chips themselves can eventually be replaced with anything, including nonvolatile phase-change memory.

All the evidence indicates we're about to get slammed with a variety of Frankenstein chips and, as if that were not confusing enough, some of the individual chips will have heterogeneous architectures containing multiple different types of processors. As best I can determine, eight CPU cores are an ideal minimum for processing full-blown matrices, and roughly 80-300 simplified GPU cores are ideal for anything from transcoding to physics to AI. For a desktop PC that means more of the load normally placed on the video card today could be done on the APU and transferred over the currently underutilized PCIe 3.0 bus. Exactly how these heterogeneous architectures will evolve is anyone's guess, but thankfully they will incorporate hardware-accelerated transactional memory, making them easier to program.
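To give a sense of what hardware transactional memory buys programmers, here is a rough sketch using Intel's RTM intrinsics (this is just one existing implementation of the idea, not whatever future chips will ship; it needs a TSX-capable CPU and -mrtm, and a production version must also check the fallback lock inside the transaction):

```cpp
// Sketch of hardware transactional memory via Intel RTM intrinsics.
// Simplified: a real version also reads the fallback lock inside the
// transaction so it aborts if another thread already holds the lock.
#include <immintrin.h>
#include <mutex>

std::mutex fallbackLock;
long sharedCounter = 0;

void incrementCounter() {
    unsigned status = _xbegin();
    if (status == _XBEGIN_STARTED) {
        sharedCounter += 1;   // runs atomically in hardware, no lock taken
        _xend();
    } else {
        // Transaction aborted or unsupported: fall back to a plain lock.
        std::lock_guard<std::mutex> guard(fallbackLock);
        sharedCounter += 1;
    }
}
```

The point is that ordinary-looking code gets atomicity from the hardware in the common case, which is why it makes many-core chips easier to program.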

The only way I know to evaluate the power of such monstrosities is by measuring raw bandwidth capacity, and apparently a lot of that bandwidth will soon be taken up by ultra-high-definition screens. OLEDs continue to come onto the market slowly, but LCD manufacturers now have a way to produce ultra-high-definition screens almost as cheaply as the current high-definition LCDs and are retooling their assembly lines as quickly as possible. To leverage the available bandwidth even better, the first video cards capable of using system RAM as well as VRAM for displaying graphics are already on the market, perhaps indicating the shape of things to come: computers where the distinctions between individual components become increasingly blurred as the emphasis shifts to maximizing overall bandwidth by designing every component to be flexible enough to assist the others in almost any task.

The remaining question is what kind of cheap storage will replace DVD drives. All those extra textures and whatnot will require cheap storage, and as yet there is no obvious successor. Blu-ray discs might do the trick, but the higher the capacity the better, and Blu-ray is marginal at best for the kind of storage ultra-high-definition content requires. Then there is Intel's new Larrabee-derived Knights hardware, which Intel is investigating for possible use in cloud-computed video games. Not terribly exciting for shooters, but promising for sims and slower-paced games capable of ray-traced graphics. All of this should be available in the next three years or so, when the next-generation consoles come out, and that's probably when we can expect the first video games that blow everyone's doors off the way Crysis did.
 

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76

This sounds neat. A box full of 'generic' components all capable of collectively doing the same thing. It seems like Nvidia and ATI would be obsolete if this were to happen. At that point it's all up to the software to decide what to do with everything. I'm guessing there would still have to be some specialized components, though. I really would like a modular graphics card/bay/whatever. Instead of having to buy a whole new card because you don't have enough RAM on your card, you could just grab a memory upgrade card.
 

HeXen

Diamond Member
Dec 13, 2009
7,840
40
91
When was the last crash?

Right before Nintendo made the NES.
You see, Atari didn't require licenses. Anyone could make a game for it. They did, the games sucked, and a huge crash happened. No one wanted to play games anymore.
Nintendo said, OK, if you want to make games for our hardware, you have to sign here and here and transfer money to here. They called this "quality assurance," which did weed out the super-bad games for the most part.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
This sounds neat. A box full of 'generic' components all capable of collectively doing the same thing. It seems like Nvidia and ATI would be obsolete if this were to happen. At that point it's all up to the software to decide what to do with everything. I'm guessing there would still have to be some specialized components, though. I really would like a modular graphics card/bay/whatever. Instead of having to buy a whole new card because you don't have enough RAM on your card, you could just grab a memory upgrade card.

They won't be perfectly identical components any more than they are today; they'll just be more capable of sharing the load among themselves. The push for lower-power components also means current desktops with all their fancy cooling systems and 1,000-watt power supplies will eventually become history. Instead of spending all that money and space on those, you'll be spending more of it on chips you simply plug into your mobo or GPU. People are already beginning to speculate that the days of the oversized CPU cooler are numbered.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
It will happen after the upcoming video game crash, when people get tired of playing CoD/Battlefield every year.

This.

There is no money in innovating anything or spending too much time producing and polishing a good product. Today it's all about reusing last year's game, throwing some spit on it, and selling it again and again to the retard ADHD mainstream masses.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
I want to see improvements in data storage more than anything. CPUs and GPUs are overpowered right now, and game companies aren't in a hurry to utilize them if they are making millions on 10-year-old games re-released every three months with a new label.

I want to see non-volatile universal main memory: 256 GB of main memory that is both your RAM and your "disk space" at the same time, with no storage device needed except an external interface to transfer data in and out of the PC's main memory.

It doesn't seem like it would be enough, but you have to shift your brain around the new paradigm. Most of your RAM just redundantly duplicates things that are already on disk: levels, textures, file caches, etc.

In a sense, you wouldn't even need a whole lot of RAM in the conventional sense anymore, because you wouldn't have to "load" things ever again. If everything were just already there, the CPU could access it in place instantly with a JMP or MOV. No more loading. No more I/O bottleneck. Merge the file system and the page table into a single universal main memory paradigm: a permanent RAM drive as fast as SRAM, with no battery required.
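The closest thing we have today is memory-mapping a file and letting the page tables do the "loading" behind the scenes. A minimal POSIX sketch (the file name and struct layout are hypothetical; with true non-volatile main memory even the page-in step would disappear, but the programming model looks similar):

```cpp
// Minimal sketch: treat a file as directly addressable memory via mmap.
// Hypothetical asset file and struct, for illustration only.
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>
#include <cstdio>

struct LevelHeader {           // assumed layout, not a real format
    unsigned int num_textures;
    unsigned int num_meshes;
};

int main() {
    int fd = open("level01.dat", O_RDONLY);   // hypothetical asset file
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    fstat(fd, &st);

    // Map the whole file; no explicit read() loop, no "loading" phase.
    void* base = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (base == MAP_FAILED) { perror("mmap"); return 1; }

    // The CPU dereferences the data in place; pages fault in on demand.
    auto* hdr = static_cast<const LevelHeader*>(base);
    std::printf("textures: %u, meshes: %u\n", hdr->num_textures, hdr->num_meshes);

    munmap(base, st.st_size);
    close(fd);
    return 0;
}
```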

256 GB+ of STT-MRAM running at 50 GB/s with nanosecond access times, please... with no floating-gate degradation limiting its lifespan and no power required to retain data indefinitely until changed.

Yes please. The HDD access light and the cylinder icon need to go the way of the parallel port and the FDD. Say no to I/O! Say yes to zero-wait-state, instant-access computing. A human should never have to stare at "please wait, loading" or "gathering information" for five minutes on a machine that is supposedly capable of over one trillion operations per second...

Not even NAND flash is going to solve the I/O bottleneck for long.
 

micrometers

Diamond Member
Nov 14, 2010
3,473
0
0
The biggest improvement will come from generated worlds, where the artist doesn't have to paint EVERY SINGLE little rock and leaf but can instead have an algorithm generate them for him.

And also complete physics modeling.

Then you'll see downloadable games that have the same "richness" as today's AAA titles.
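As a rough illustration of the idea, here is a toy sketch of hash-seeded procedural scattering: the same seed always produces the same "rocks" without an artist placing any of them. All names and numbers are made up, and real engines layer noise, biomes, and physics on top of this.

```cpp
// Toy procedural scatter: deterministically place "rocks" in a world tile
// from nothing but a seed, instead of hand-authoring each one.
#include <cstdint>
#include <cstdio>
#include <random>

struct Rock { float x, y, scale; };

// Derive a per-tile seed so every tile is reproducible independently.
uint64_t tileSeed(uint64_t worldSeed, int tileX, int tileY) {
    uint64_t h = worldSeed;
    h ^= static_cast<uint64_t>(tileX) * 0x9E3779B97F4A7C15ULL;
    h ^= static_cast<uint64_t>(tileY) * 0xC2B2AE3D27D4EB4FULL;
    return h;
}

void generateRocks(uint64_t worldSeed, int tileX, int tileY) {
    std::mt19937_64 rng(tileSeed(worldSeed, tileX, tileY));
    std::uniform_real_distribution<float> pos(0.0f, 64.0f);   // 64 m tile
    std::uniform_real_distribution<float> size(0.2f, 1.5f);

    for (int i = 0; i < 10; ++i) {                            // 10 rocks per tile
        Rock r{pos(rng), pos(rng), size(rng)};
        std::printf("rock %d: (%.1f, %.1f) scale %.2f\n", i, r.x, r.y, r.scale);
    }
}

int main() {
    generateRocks(12345, 3, 7);   // same inputs always give the same layout
    return 0;
}
```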
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
I want this!

I wonder if it's technologically feasible?

There are numerous technologies in the works, all competing and promising to be the holy grail of universal computer memory, but they are all sitting on the back burner because flash memory is so easy to make and so profitable.

Laying out floating-gate transistors in series on silicon is about the cheapest, easiest thing in the world to manufacture, and at what NAND retails for, it's like printing money.

Naturally, everyone wants to run their plants at 250% capacity printing money instead of taking a risk on a potentially industry-changing, superior technology. So we are stuck with slow data storage, even with SSDs.

New STT-MRAM techniques should scale well on current 2x nm processes and compete with flash memory on density, but with the speed of modern DRAM/SRAM and no erase/write cycle limitations, if anyone actually attempted to produce them.

If you figure 8-16 chips make a 256 GB SSD, figure that as one DIMM instead; now pack four DIMMs into your system and get rid of the SATA ports. Basically a 1,024 GB non-volatile RAM drive...
 

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76
The biggest improvement will come from generated worlds, where the artist doesn't have to paint EVERY SINGLE little rock and leaf but can instead have an algorithm generate them for him.

And also complete physics modeling.

Then you'll see downloadable games that have the same "richness" as today's AAA titles.


It seems to me like third parties would be all over stuff like this: creating physics systems, world-generation systems, and game engines and selling them to developers to build their games in. It would reduce costs all around, since developers wouldn't have to put so many resources into the engine and could focus entirely on game mechanics. I think we'd get better games, and at a quicker pace. The problem is that games might lack uniqueness from sharing the same engine and/or running into its restrictions. Find a way to make them modular and we'd probably be set. I'm sure there are people brilliant enough out there to make something like this happen, although I'm not sure it would actually be a good thing.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
It would reduce costs all around, since developers wouldn't have to put so many resources into the engine and could focus entirely on game mechanics.

No it wouldn't. You'd still be paying $59.99 for Call of Duty 9 even three years after its release, plus hundreds of dollars for DLC that only took five minutes to make.

It pisses me off so much when I go to game stores. For a while I kept seeing Gears of War 3 still at $59.99 years after launch. It has no business still being priced that high. Just imagine when everything goes to digital distribution that purposely makes used-game sales and trading impossible.
 

Kalmah

Diamond Member
Oct 2, 2003
3,692
1
76
No it wouldn't. You'd still be paying $59.99 for Call of Duty 9 even three years after its release, plus hundreds of dollars for DLC that only took five minutes to make.

Well, that's a slap of reality to the face, lol. I forgot to take corporate greed into consideration.
 

micrometers

Diamond Member
Nov 14, 2010
3,473
0
0
No it wouldn't. You'd still be paying $59.99 for Call of Duty 9 even three years after its release, plus hundreds of dollars for DLC that only took five minutes to make.

It pisses me off so much when I go to game stores. For a while I kept seeing Gears of War 3 still at $59.99 years after launch. It has no business still being priced that high. Just imagine when everything goes to digital distribution that purposely makes used-game sales and trading impossible.

I have never bought a game for $59.99. The most I've paid for a game this generation was $20.
 
Feb 4, 2009
35,862
17,406
136
Agreed, I can't figure out why everyone starts at $60. I'm a sales manager and I respect the 'start big' idea, but why not $30-$40 and then DLC add-ons? And why does every MMO start at $60 plus $15 per month, or go completely free with pay-to-win? Why not $20 or $30, then $8 per month or $60 per year, supplemented with DLC?
 

CuriousMike

Diamond Member
Feb 22, 2001
3,044
544
136
When economics no longer make sense.

New consoles are going to require even more people to make more content; "more" meaning both more of it and more complicated.

Then people will bitch that the $60 game they bought doesn't have the ending they wanted and demand their money back.

Then they'll go back to paying $0.99 for their mobile games and bitch and whine about lack of innovation.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
When economics no longer make sense.

New consoles are going to require even more people to make more content; "more" meaning both more of it and more complicated.

Then people will bitch that the $60 game they bought doesn't have the ending they wanted and demand their money back.

Then they'll go back to paying $0.99 for their mobile games and bitch and whine about lack of innovation.

I had no problem paying $79 for games on the SNES because they were actually good and didn't exist for the sole purpose of nickel-and-diming you with the same tired shit month after month.

Even now I find myself paying 3 digits for SNES games in complete box sets.

But I refuse to buy a mainstream game like Call of Duty or Gears of War 3 for full retail price a year after launch.

It's not about not being willing to support developers for great games, it's about not supporting the locusts and leeches that are sapping the life out of gaming.

I'm picking up brand-new DS games for $20 that I guarantee took more development man-hours than another Call of Duty/Gears of War copy-paste job. And those man-hours went into plot and game design, not modeling barrel explosions.
 

gladiatorua

Member
Nov 21, 2011
145
0
0
Agreed, I can't figure out why everyone starts at $60.
Because people are willing to pay, mostly on consoles. PCs currently have a much more flexible payment system.
And why does every MMO start at $60 plus $15 per month, or go completely free with pay-to-win?
That was true a couple of years ago. Now more and more big companies are embracing F2P with reasonable microtransactions. Sony with DCUO and the upcoming PlanetSide 2 is a good example; ArenaNet went buy-to-play with Guild Wars, which did very well, and now NCsoft is doing the same with Guild Wars 2; MechWarrior is on the way. The whole thing might be a result of the huge success of WoW, when people started to think that other MMOs were not good enough to justify a monthly sub... even if they were.
F2P might be destructive, though. How can you seriously compete with a F2P TF2, for example? I don't think Valve intended any harm, but it's hard to do reasonably well when you're not EA, Activision, or Valve. TF2 is hugely successful, has really good developer support, and is F2P. How can you beat that? And Valve might go even further and release more of their games as F2P to attract more people to Steam, especially now that Gabe feels threatened by what Windows 8 might bring.
I want this! I wonder if it's technologically feasible?
Very. Current motherboards support up to 32 GB of RAM (excluding high-end boards). You can easily build a hybrid system today by preloading 20 GB of stuff into RAM.
In five years it might really be possible.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
I think there is a change coming; I don't know about a crash, though.

Blizzard bombing out and releasing a terrible game with Diablo 3 is definitely a sign that the new modus operandi of designing for profit, not enjoyment, is starting to get blowback.

Valve has managed to stay out of the mess, but they only release a game once every 50 years.
 

happysmiles

Senior member
May 1, 2012
340
0
0
Next-gen consoles...
Assuming the next Xbox takes a lot of its efficiency from Win8 and shares a lot of DirectX 11.1 APIs, there would be more room for developers to improve on than just raw processing power.
In my mind it's more about shader effects than true 1080p console gaming.
 

HeXen

Diamond Member
Dec 13, 2009
7,840
40
91
I found all the CoDs as entertaining as I found the Stalker series. No need to hate it just because it's popular or because it isn't some super-complex game with drag-and-drop inventories for every body part that crashes all the time.
Gaming is about having fun. Sequels do not need some new formula to be as fun as the original.