
How fast do you think our brains process information?

Originally posted by: Bigsm00th
This question probably has no answer, because I don't think you can relate it to a computer.
Actually, most predictions I have read suggest that processing power will reach the capacity of the average human brain sometime around 2020-2030. Use Moore's Law and compute forward for an approximation.
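The "compute forward" step can be sketched as a quick back-of-the-envelope calculation. Everything here is an illustrative assumption, not a figure from the thread: a mid-2000s machine at roughly 10^11 ops/s and a brain-scale target of roughly 10^16 ops/s (published estimates span about 10^14-10^16).

```python
import math

def years_until(target_ops, current_ops, doubling_months=18):
    """Years of Moore's-law doubling needed to go from current_ops to target_ops."""
    doublings = math.log2(target_ops / current_ops)
    return doublings * doubling_months / 12

# Assumed figures: ~1e11 ops/s for a mid-2000s machine,
# ~1e16 ops/s for a brain-scale estimate (both rough guesses).
print(round(years_until(1e16, 1e11)))  # prints 25
```

With those guesses, doubling every 18 months takes about 25 years, which is roughly how the 2020-2030 predictions fall out of Moore's Law.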
 
Originally posted by: ActuaryTm
Originally posted by: Bigsm00th
This question probably has no answer, because I don't think you can relate it to a computer.
Actually, most predictions I have read suggest that processing power will reach the capacity of the average human brain sometime around 2020-2030. Use Moore's Law and compute forward for an approximation.

true :thumbsup:
 
Originally posted by: Bigsm00th
Originally posted by: Philippine Mango
I know but what I'm thinking of is the bandwidth..

wtf do you mean, "brain bandwidth"? That makes no sense really, because I don't think we measure our brain patterns in bits. This question probably has no answer, because I don't think you can relate it to a computer.


Edit: after thinking about it some more, I might be wrong. I'm not sure.

Well, think about it... you could measure each neuron firing per second.
 
Originally posted by: spidey07
Originally posted by: Bigsm00th
Originally posted by: Philippine Mango
I know but what I'm thinking of is the bandwidth..

wtf do you mean, "brain bandwidth"? That makes no sense really, because I don't think we measure our brain patterns in bits. This question probably has no answer, because I don't think you can relate it to a computer.


Edit: after thinking about it some more, I might be wrong. I'm not sure.

Well, think about it... you could measure each neuron firing per second.

Yeah, but does that directly translate into what we call a bit? I just don't see how you can get a measure of this in relation to a computer. I'll admit I don't know much about it, but it's still not making sense to me.
 
Originally posted by: Bigsm00th
Originally posted by: spidey07
Originally posted by: Bigsm00th
Originally posted by: Philippine Mango
I know but what I'm thinking of is the bandwidth..

wtf do you mean, "brain bandwidth"? That makes no sense really, because I don't think we measure our brain patterns in bits. This question probably has no answer, because I don't think you can relate it to a computer.


Edit: after thinking about it some more, I might be wrong. I'm not sure.

Well, think about it... you could measure each neuron firing per second.

Yeah, but does that directly translate into what we call a bit? I just don't see how you can get a measure of this in relation to a computer. I'll admit I don't know much about it, but it's still not making sense to me.

Maybe not a bit, but definitely an "instruction."

The article posted above related it to "operations per second"
 
Originally posted by: spidey07

Maybe not a bit, but definitely an "instruction."

The article posted above related it to "operations per second"

Makes more sense :thumbsup:

Didn't see that part.
 
Originally posted by: Bigsm00th
Originally posted by: Mo0o
Originally posted by: Philippine Mango
Originally posted by: Mo0o
Yet more bitchin by Mo0o

Look at me! Im a big flaming retard!

Still doesn't negate the fact that every thread you start is asinine

Mo0o is right. You are an idiot, Philippine Mango.

How am I an idiot? It's not like I asked "how many MHz does poo run at?"... You're a douche with nothing better to do than nef and raise your post count. You guys bitch all the time claiming that WE'RE (people who make threads that may seem odd) neffing, but instead you nef in the "alleged neffing thread" to bitch about neffing. For every thread where "inane posts are made," you guys will make like 20 posts bitching about how the OP is stupid and sh!t. You guys are just as hypocritical.
 
Originally posted by: Vaerilis
It doesn't work the same way our computers do. Some calculations are done instantly (quick understanding of 3D environments, complex movement processes), but some rather simple ones (like dividing numbers) take a lot of time to complete.
It can store petabytes of information, but the compression method is often very lossy (try recalling a movie).

Actually, I don't think the compression is lossy at all. I just think the method of retrieving the data is the bottleneck for memory recall.
 
I don't know why you guys are coming down so hard on the OP. Sure he has posted nonsense in the past but this is a pretty good question, and the thread has been interesting.

edit: ok I take all that back after seeing his own parody thread. retard.
 
Our brains are analog, so their speed wouldn't be measured in hertz or bits per second. Our brains are special-purpose devices, evolved to work with the senses we have.
 
Originally posted by: ActuaryTm
Actually, most predictions I have read suggest that processing power will reach the capacity of the average human brain sometime around 2020-2030. Use Moore's Law and compute forward for an approximation.

We've been dreaming that computers will have human-like capabilities for the last 50 years, and it always seems to be "just 20 years away." It isn't even known whether such a thing is possible with a digital circuit.

Moore's law is often misused when talking about the capabilities of computers. It doesn't tell you anything about the speed or capabilities a computer may have, such as artificial intelligence or human-like thought. Moore's law only describes transistor count in integrated circuits, stating that complexity will double roughly every 18 months.

Computers have increased greatly in speed, but they haven't gained capabilities such as thought. What a computer could do 5 years ago it can now do much faster, but the capabilities have remained the same: it's still a glorified calculator. No computer has ever shown emotion or independent thought, and making one of today's computers 1000x faster won't enable it to become self-aware.

 
Originally posted by: MAME

Actually, I don't think the compression is lossy at all. I just think the method of retrieving the data is the bottleneck for memory recall.

Our memories have been shown to be lossy; that's just how our brains work. We don't store thought digitally. It's all analog and inherently lossy.

 
Originally posted by: 91TTZ
No computer has ever showed emotion or independent thought, and making one of today's computers 1000x faster won't enable a computer to become self-aware.
I believe you may have missed several key elements of this thread. The query revolved around processing, not the brain in its entirety.

As far as my own comments, you may want to reexamine "capacity" and "approximation". I think most here are intelligent enough to realize anything more would be rather trivial.
 
Originally posted by: ActuaryTm
I believe you may have missed several key elements of this thread. The query revolved around processing, not the brain in its entirety.

As far as my own comments, you may want to reexamine "capacity" and "approximation". I think most here are intelligent enough to realize anything more would be rather trivial.

I was trying to say that the original poster's question is difficult to answer, since a computer handles only a limited set of tasks while our brains process wildly differing types of data. The processing rate for each of those tasks is proportionate to its function and necessity. When we talk about "data rate" today, we mean data processed by a digital computer. Our brains are analog computers, so a direct comparison is tough.

If I had to make some kind of comparison, I'd imagine that the portion of our brain we use to do math is extremely slow compared to even the slowest computers. However, the portion of our brain used for visual processing, touch, balance, and situational awareness is highly developed, since through evolution we had to navigate our surroundings. Even the most advanced computers aren't able to accurately pick familiar people out of a crowd if they're wearing different makeup, etc., but we can spot that kind of thing very quickly, since our brains process that type of data efficiently.

It's quite amazing how complex and application-specific life is. Even a dust mite has capabilities that modern robotics researchers only dream of. Being able to analyze its environment, make a decision on how to act, and act on that decision is pretty impressive.
 
I'm guessing it's in TB or PB/sec... I remember reading an article about how scientists simulated touch over fiber-optic lines; it required massive data transmission.
 
Originally posted by: SKORPI0
Computer To Be As Fast As Human Brain - Nov 2002

A human brain's probable processing power is around 100 teraflops, roughly 100 trillion calculations per second, according to Hans Moravec, principal research scientist at the Robotics Institute of Carnegie Mellon University.

That's pretty fast. Wasn't there a computer at Stanford that got up to 61 or 71 teraflops recently? I remember reading about it in CPU in December.
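Estimates like the 100-teraflop figure generally come from counting synaptic events, along the lines suggested earlier in the thread (neurons firing per second). A minimal sketch follows; every constant is an order-of-magnitude textbook assumption rather than a measurement, which is exactly why published figures range from roughly 10^14 to 10^16 ops/s.

```python
# Back-of-the-envelope "brain ops/sec" from synaptic event counts.
# All constants are order-of-magnitude assumptions, not measurements.
NEURONS = 1e11             # ~100 billion neurons
SYNAPSES_PER_NEURON = 1e3  # low-end guess; often quoted as 1e3-1e4
FIRING_RATE_HZ = 100       # rough peak firing rate

events_per_sec = NEURONS * SYNAPSES_PER_NEURON * FIRING_RATE_HZ
teraflop_equiv = events_per_sec / 1e12  # if one event ~ one "calculation"
print(f"{events_per_sec:.0e} events/s ~ {teraflop_equiv:.0f} teraflops")
```

With these particular guesses the result lands around 10^16 events/s, two orders of magnitude above the 100-teraflop figure quoted above; shrinking the synapse count or firing rate brings it down, which is why the quoted numbers disagree so widely.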
 
Originally posted by: MAME
Originally posted by: Vaerilis
It doesn't work the same way our computers do. Some calculations are done instantly (quick understanding of 3D environments, complex movement processes), but some rather simple ones (like dividing numbers) take a lot of time to complete.
It can store petabytes of information, but the compression method is often very lossy (try recalling a movie).

Actually, I don't think the compression is lossy at all. I just think the method of retrieving the data is the bottleneck for memory recall.

Good point - after all, with regression and hypnotherapy, people are supposedly able to recall previously forgotten childhood memories.
 
Originally posted by: aplefka

That's pretty fast. Wasn't there a computer at Stanford that got up to 61 or 71 teraflops recently? I remember reading about it in CPU in December.

Yeah, it used to be one of the fastest, but somebody brought it along to a fraternity party, where it smoked some pot and began mingling with the wrong crowd. Now it just processes 2 teraflops working as a cash register in a surf shop in San Diego.
 