Are we close to the next paradigm of processing?

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
In the late 1800s, electromechanical computation devices that used punch cards were invented. In the 1930s, relay machines appeared and were later replaced by vacuum tubes. The transistor was the successor, followed by the integrated circuits we are still using today. Integrated circuits have been around for a long time, and shrinking them is yielding diminishing returns; at the very least, engineers will eventually run out of silicon atoms to work with. Are there any promising successors to the integrated circuit?
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
The silicon roadmap currently extends to 2020 at 6 nm. The transition to optical and/or graphene microcircuits will be well underway outside the consumer space by that time.

There are no genuine paradigm shifts; real change always happens gradually. The number of post-silicon processors will increase steadily and exponentially from their place in labs today until they cover the planet.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Quantum computing is also heading towards the commercial space, with the first of the affordable machines now being made ($200k). There are still large breakthroughs needed to make it more generally useful, but it has the potential to bring a completely different type of computation into our world.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
There are several promising technologies, but the one making the most advances right now is reconfigurable computing, where the circuits themselves can be rearranged on the fly to form whatever circuits you want. If a program requires more memory or a specific logic circuit, the chip just adapts its own circuitry.
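To give a rough feel for how circuits can be "rewired" on the fly (a purely illustrative sketch in Python, not IBM's or any vendor's actual design): the basic cell of today's reconfigurable chips, such as FPGAs, is a small lookup table whose behavior is just a few bits of configuration memory, so the same cell can be turned into a different logic gate at any time.

```python
# Toy model of a 2-input lookup table (LUT), the building block of FPGA-style
# reconfigurable logic. The "circuit" is nothing but 4 stored bits, so
# reprogramming the cell turns it into a different gate on the fly.
class LUT2:
    def __init__(self, truth_table):
        # truth_table[i] is the output for inputs (a, b) where i = (a << 1) | b
        self.truth_table = list(truth_table)

    def reconfigure(self, truth_table):
        # "Rewiring" the circuit just means rewriting its configuration bits.
        self.truth_table = list(truth_table)

    def __call__(self, a, b):
        return self.truth_table[(a << 1) | b]

cell = LUT2([0, 0, 0, 1])       # configured as an AND gate
print(cell(1, 1))               # -> 1
cell.reconfigure([0, 1, 1, 0])  # the same cell now behaves as XOR
print(cell(1, 1))               # -> 0
```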

IBM's experimental neuromorphic memristor chip is perhaps the best example right now. Instead of shrinking the components on the chip, you shrink the number of components you need to do the same job. Within ten years they hope to have somewhere between the equivalent of a cat's and a human brain's processing power on a single chip.

The biggest drawback to this approach is that the mathematics for designing and programming them is horrendous. IBM's chip, however, imitates the brain, so instead of programming it you teach it what it needs to know. Eventually the chips could be designing and teaching their own successors.
 
Last edited:

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
The silicon roadmap currently extends to 2020 at 6 nm. The transition to optical and/or graphene microcircuits will be well underway outside the consumer space by that time.

There are no genuine paradigm shifts; real change always happens gradually. The number of post-silicon processors will increase steadily and exponentially from their place in labs today until they cover the planet.

Yeah, what has happened so far is that a processing technology is slowly phased out as a newer one becomes cheaper and more developed, the way vacuum tubes were replaced before they hit a physical wall on how far they could be shrunk. Hopefully silicon circuits can be replaced before their scaling slows down too much. We've seen exponential growth for over a century; hopefully it will continue.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
There are several promising technologies, but the one making the most advances right now is reconfigurable computing, where the circuits themselves can be rearranged on the fly to form whatever circuits you want. If a program requires more memory or a specific logic circuit, the chip just adapts its own circuitry.

IBM's experimental neuromorphic memristor chip is perhaps the best example right now. Instead of shrinking the components on the chip, you shrink the number of components you need to do the same job. Within ten years they hope to have somewhere between the equivalent of a cat's and a human brain's processing power on a single chip.

The biggest drawback to this approach is that the mathematics for designing and programming them is horrendous. IBM's chip, however, imitates the brain, so instead of programming it you teach it what it needs to know. Eventually the chips could be designing and teaching their own successors.

This makes me think that artificial intelligence isn't out of reach.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Memristors will likely shake up the current status quo in a big way. By giving us very fast persistent storage, the need to load and save may disappear; instead, all programs would be running all the time in persistent RAM.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Quantum computing is also heading towards the commercial space, with the first of the affordable machines now being made ($200k). There are still large breakthroughs needed to make it more generally useful, but it has the potential to bring a completely different type of computation into our world.

I'm very surprised that quantum computing is coming close to fruition. I found this company that sells one: http://www.dwavesys.com/en/products-services.html . I wonder how much it costs.
 

wirednuts

Diamond Member
Jan 26, 2007
7,121
4
0
Memristors will replace transistors. Instead of just 1 and 0, a single cell could hold anything up to 100 or 1,000 distinct levels. It will make binary coding seem primitive...
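For a sense of scale (a back-of-envelope sketch only; the 100 and 1,000 figures are just the numbers from the post, not levels any real memristor part has demonstrated), the information in one multi-level cell is log2 of the number of levels it can reliably hold:

```python
import math

# Bits of information per cell for an assumed number of reliably
# distinguishable levels.
for levels in (2, 100, 1000):
    print(f"{levels:>4} levels per cell = {math.log2(levels):4.1f} bits per cell")
#    2 levels ->  1.0 bit (an ordinary binary cell)
#  100 levels -> ~6.6 bits
# 1000 levels -> ~10.0 bits
```

So even a 1,000-level cell is worth about ten binary cells, not a thousand.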
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
This makes me think that artificial intelligence isn't out of reach.

Yeah, it probably isn't out of reach, but it's still a long way off. These experiments are still just trying to reproduce the basic functionality of neurons and synapses, much like the first attempts to create a transistor or basic electronic circuits. At best the results will resemble idiot savants with very limited awareness but remarkable capabilities. Until we actually understand the overall, global organization of the brain, there isn't much hope of doing better.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Yeah, it probably isn't out of reach, but it's still a long way off. These experiments are still just trying to reproduce the basic functionality of neurons and synapses, much like the first attempts to create a transistor or basic electronic circuits. At best the results will resemble idiot savants with very limited awareness but remarkable capabilities. Until we actually understand the overall, global organization of the brain, there isn't much hope of doing better.

Neuroscience just isn't developed enough to provide the blueprint for emulating the brain. Hopefully more of the brain will be understood; otherwise, yes, we'll have savants at best.
 

exdeath

Lifer
Jan 29, 2004
13,679
10
81
And despite all this, most people will still be limited to the roughly 200-kilobyte-per-second random-access performance provided by old-fashioned mechanical, magnetic-track recording for data storage...

The human brain can store and retrieve random-access data at many gigabytes per second. Just take an instant to remember a vivid image of something, and BAM, you just recalled several megabytes of uncompressed pixel data instantly; you didn't have to "seek" for it. Every time you look around and recognize objects in your environment, how many gigabytes per second are you processing, with no long-term-memory access delays (e.g. a HDD)? All the bitmap data from the retinas is compared against everything you've ever seen and learned since childhood, letting you recognize with no delay "that is a computer monitor, that is an image of a car on the computer monitor, that is a phone," and so on. In 3D, no less. And recalled instantly from every angle, every sound, every smell, make, model, brand, color, size, shape, etc. Try that on Google!
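Rough arithmetic behind "several megabytes of uncompressed pixel data" (illustrative only; the frame size, color depth, and 60 fps rate are assumptions picked just to make the point):

```python
# One 1920x1080 frame at 3 bytes per pixel (RGB, uncompressed).
width, height, bytes_per_pixel = 1920, 1080, 3
frame_bytes = width * height * bytes_per_pixel
print(f"{frame_bytes / 1e6:.1f} MB per uncompressed frame")      # ~6.2 MB
print(f"{frame_bytes * 60 / 1e9:.2f} GB/s for a 60 fps stream")  # ~0.37 GB/s
```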

Our data storage technology has about five decades of catching up to do before we start worrying about faster processors; we are handicapped by storage as it is. Let me know when we have 50 GB/s non-volatile main memory that completely eliminates the need for an HDD/SSD to retain data on power loss, and that can do parallel searches across the entire memory array at once, e.g. finding a specific file (no cheating with indexing) on a 4 TB drive instantly, every time, with no access delays, progress bars, or flashlight / dog-sniffing animations...
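The "parallel searches across the entire memory array" idea is essentially what a content-addressable (associative) memory does. Here is a minimal software sketch of that lookup model, with made-up class and data names; a hardware CAM gives every row its own comparator so all comparisons happen at once, whereas this Python loop only illustrates the behavior, not the speed:

```python
# Toy content-addressable memory: look data up by content (a tag), not by address.
class ContentAddressableMemory:
    def __init__(self):
        self.rows = []                      # each row holds (tag, payload)

    def store(self, tag, payload):
        self.rows.append((tag, payload))

    def match(self, tag):
        # Real CAM hardware compares the key against every row in one cycle;
        # here the comparison is just simulated sequentially.
        return [payload for t, payload in self.rows if t == tag]

cam = ContentAddressableMemory()
cam.store("report.doc", "block 1042")
cam.store("photo.jpg", "block 77")
print(cam.match("photo.jpg"))               # -> ['block 77']
```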

Processing power is only as good as the ability to access the data to be processed. I don't care if you have 16 cores and 100 GB/s of main memory bandwidth; you're still loading your data file from a spinning disk at kilobytes or megabytes per second. Humans, cats, and artificial intelligences don't walk around freezing and stuttering for 5-10 minutes at a time whenever the disk access light goes solid...
 
Last edited:

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
The human brain can store and retrieve random-access data at many gigabytes per second. Just take an instant to remember a vivid image of something, and BAM, you just recalled several megabytes of uncompressed pixel data instantly; you didn't have to "seek" for it.


The brain doesn't retrieve gigabytes of data, far from it. The brain makes heavy use of shortcuts to process information. When you view something like a chair in a room, your brain takes the basic shape of the chair (a low-resolution image is the closest comparison) and tries to match it against other images that may be close, starting with the most recent candidates and working further back in time. It has been shown that if a person is shown a set of flashcards picturing a wall with writing on it, and is then shown a single flashcard whose writing matches none of the originals, the person will at first claim the writing said what was on the original cards, because the brain jumps to conclusions based on the most recent information.
Police have a serious problem with this because it leads to wrong information from crime-scene witnesses. People report what they see filtered through their own memories and what they have witnessed recently, not what is really present.
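A toy sketch of the "most recent match first" behavior described above (purely illustrative logic, not a neuroscience model; the function and data names are invented):

```python
# Scan memories from newest to oldest and return the first "close enough"
# match -- i.e. jump to a conclusion based on the most recent information.
def recall(query, memories, is_close_enough):
    for memory in reversed(memories):        # newest first
        if is_close_enough(query, memory):
            return memory
    return None

seen = ["wall with 'EXIT'", "wall with 'OPEN'", "wall with 'SALE'"]
print(recall("wall with unfamiliar writing", seen,
             lambda q, m: q.startswith("wall") and m.startswith("wall")))
# -> "wall with 'SALE'": the most recent roughly-similar memory wins,
#    even though the writing doesn't actually match.
```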

When you look at a room, the brain uses shadows to work out the layout rather than processing how the room is actually designed. That is how optical illusions work: the brain uses lots of shortcuts for memory, vision, and hearing because it really isn't a fast processor of information. The brain runs at a frequency of around 40 Hz at most, and because it cannot handle more than one thought-processing activity at once, we can't manage more than about 40 "thoughts" per second. That 40 Hz rate is why calculators and computers can do number crunching so much faster than we can; there is no way we can keep up.

Instead of chasing raw speed, computer designers should be looking at adapting the shortcuts the brain uses. If those shortcuts could be adapted to a CPU, imagine what even a 1 MHz clock could do for something like visual processing. Current machine-vision systems use lasers and sound to work out something like a path down a road; our brains could never process that much data. Instead, the brain looks for clues about the path and uses very minimal information to find it.
 
Last edited:

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
I also wonder what storage medium we will use next. Hard drives have scaled exponentially for quite some time, but presumably the fundamental technology has to change. DNA would offer extremely dense and reliable information storage, but I question whether the speed of what are fundamentally chemical reactions would be an impediment to using it as a storage technology for our computers.
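A hedged back-of-envelope on the density side (assuming single-stranded DNA, 2 bits per nucleotide, and an average nucleotide mass of roughly 330 daltons; real encoding schemes add redundancy and store far less):

```python
# Theoretical DNA storage density under the assumptions above.
dalton_g = 1.66054e-24                      # grams per dalton
bits_per_nt = 2                             # A/C/G/T -> 2 bits per nucleotide
nt_mass_g = 330 * dalton_g                  # rough average nucleotide mass
bytes_per_gram = (bits_per_nt / nt_mass_g) / 8
print(f"{bytes_per_gram:.1e} bytes per gram")   # ~4.6e+20, i.e. hundreds of exabytes
```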
 

epidemis

Senior member
Jun 6, 2007
796
0
0
Memristors will likely shake up the current status quo in a big way. By giving us very fast persistent storage, the need to load and save may disappear; instead, all programs would be running all the time in persistent RAM.

Memristors seem promising, but I have yet to see an actual prototype; there seems to be some insurmountable obstacle in the way.
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I also wonder what storage medium we will use next. Hard drives have scaled exponentially for quite some time, but presumably the fundamental technology has to change. DNA would offer extremely dense and reliable information storage, but I question whether the speed of what are fundamentally chemical reactions would be an impediment to using it as a storage technology for our computers.

The latest trend in the industry is called "packaging," where you place several chips onto a common silicon substrate (an "interposer") or stack them. At just a few millimeters thick and the size of a fingernail, a stack of chips could reduce latencies and cram a small supercomputer into something the size of a sugar cube. Heat becomes an issue, but the entire industry is now dedicated to producing more efficient processors that give off less heat. One issue for long-term storage, then, is how cost effective it is to integrate it directly into existing silicon chips.

HP has already offered to put 2 GB of their memristors right on top of any existing chip, and the potential exists to stack them in multiple layers. Using interposers, you could connect multiple such cubes with low latencies. Since memristors can be programmed to perform different functions, such a computer could possibly trade processing power for memory and vice versa: the more you fill it up with data, the less capable the computer becomes.

It could be, then, that future home computers and portables will rely on memory integrated right into the processor and motherboard, and "long-term storage" could become synonymous with external archiving solutions.
 

magomago

Lifer
Sep 28, 2002
10,973
14
76
I also wonder what storage medium we will use next. Hard drives have scaled exponentially for quite some time, but presumably the fundamental technology has to change. DNA would offer extremely dense and reliable information storage, but I question whether the speed of what are fundamentally chemical reactions would be an impediment to using it as a storage technology for our computers.

lol dna as reliable?!?!
 

magomago

Lifer
Sep 28, 2002
10,973
14
76
DNA replication during mitosis is orders of magnitude more reliable than hard drive storage. It's only because replication happens so many times that mutations occur.


Sure, the MECHANISM to replicate DNA is reliable... but DNA as a storage medium is not very reliable. It constantly needs repair because it's prone to mutation by many things in the environment.
 

Murloc

Diamond Member
Jun 24, 2008
5,382
65
91
Sure, the MECHANISM to replicate DNA is reliable... but DNA as a storage medium is not very reliable. It constantly needs repair because it's prone to mutation by many things in the environment.
[image: use-windows-get-virus-use-mac-get-cancer-thumb.jpg]

shit just got real.
Computers will get cancer.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Neuroscience just isn't developed enough to provide the blueprint for emulating the brain. Hopefully more of the brain will be understood; otherwise, yes, we'll have savants at best.
Gaining the kind of operational understanding that we simply can't get by observational methods is basically the point of IBM's project. We are at a level of knowledge and technology where trying to build a brain emulator may be more efficient than any further observation of, or intervention in, existing nervous systems as a way of understanding how we work.
 

reallyscrued

Platinum Member
Jul 28, 2004
2,617
5
81
There are several promising technologies, but the one making the most advances right now is reconfigurable computing, where the circuits themselves can be rearranged on the fly to form whatever circuits you want. If a program requires more memory or a specific logic circuit, the chip just adapts its own circuitry.

IBM's experimental neuromorphic memristor chip is perhaps the best example right now. Instead of shrinking the components on the chip, you shrink the number of components you need to do the same job. Within ten years they hope to have somewhere between the equivalent of a cat's and a human brain's processing power on a single chip.

The biggest drawback to this approach is that the mathematics for designing and programming them is horrendous. IBM's chip, however, imitates the brain, so instead of programming it you teach it what it needs to know. Eventually the chips could be designing and teaching their own successors.

My avatar suggests this is a bad idea.
 

Smartazz

Diamond Member
Dec 29, 2005
6,128
0
76
Funny that this thread came back up. I recently had a two-hour discussion with Murray Campbell from IBM about his work on Deep Blue. He shed a lot of light on why they picked chess as the game to master with artificial intelligence, and on how they did it. Really a brilliant guy.