Brains vs. Computers

Differences in how computers and the brains of living creatures operate can be traced to the purposes for which each was shaped. Computers are designed to carry out symbolic operations with absolute precision, while brains have no need for that level of accuracy: a good approximation of the answer suffices for almost all of their tasks. Millennia of evolution have produced a computational system responsible for the actions, perception, cognition, emotions, and interactions of all animals. How the brain functions is still poorly understood, but a great deal is known about its physiology, and in what follows we compare the biological brain with its silicon counterpart.
Neurons, or nerve cells, are the most basic building blocks of the brain and can be loosely thought of as its logic gates. However, the convergence/divergence values of neurons (a concept similar to fan-in/fan-out in computer engineering) are nothing like those of their digital counterparts and can reach thousands of inputs and outputs. Neurons communicate with other cells through synapses, highly noise-tolerant biological pathways that carry information in the form of stereotyped electrochemical impulses (~100 mV) referred to as spikes, generated by ion channels whose properties are genetically encoded. This deceptively simple communication channel poses the biggest challenge to any brain-modeling project, since no electrical wiring scheme can reproduce the connectivity that synapses create in neural tissue, estimated at up to \(10^{15}\) connections in the human brain.
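To make the spiking behavior concrete, below is a minimal sketch of a leaky integrate-and-fire neuron, a standard simplified model rather than anything specific to this text; the function name and all constants are illustrative assumptions, not measured biological values. Inputs accumulate on a leaky membrane, and a stereotyped spike is emitted only when a threshold is crossed.

    # Minimal leaky integrate-and-fire (LIF) neuron sketch. A spike is an
    # all-or-nothing event: inputs are integrated on the membrane, and a
    # stereotyped spike is emitted only when a threshold is crossed.
    # All parameter values below are illustrative, not biological data.

    def simulate_lif(input_current, dt=1e-3, tau_m=20e-3,
                     v_rest=-70e-3, v_thresh=-50e-3, v_reset=-70e-3):
        """Euler integration of dV/dt = (v_rest - V + I) / tau_m
        (membrane resistance folded into the input term I)."""
        v = v_rest
        spike_times = []
        for step, i_in in enumerate(input_current):
            v += dt / tau_m * (v_rest - v + i_in)   # leaky integration
            if v >= v_thresh:                       # threshold crossing: spike
                spike_times.append(step * dt)       # only the *time* is kept
                v = v_reset                         # membrane resets after firing
        return spike_times

    # Constant drive strong enough to make the neuron fire periodically.
    spikes = simulate_lif([25e-3] * 1000)  # 1 s of input at 1 ms resolution
    print(f"{len(spikes)} spikes, first at t = {spikes[0]:.3f} s")

Note that the simulation records only spike times; the spike itself carries no payload, a point taken up in the next paragraph.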
Another important aspect of this form of communication is that the spikes themselves carry no particular information; the data is encoded in their timing. Moreover, the use of spikes makes fault tolerance more straightforward: a faulty component is simply one that has gone silent, whereas on a digital board a failed component may be stuck at an arbitrary point in the range of possible voltage levels.

Biological neural networks learn through synaptic plasticity, a process in which specific patterns of local synaptic activity alter synaptic connections, pruning ineffective synapses while forming new ones. Neuromodulation is another mechanism that contributes to learning in the brain: neurons can release chemicals, referred to as neuromodulators, that affect nearby synapses. Dopamine is one such neuromodulator, believed to act as a reward signal for plasticity changes. Other mechanisms are at play as well, and much remains to be worked out before this style of learning can be applied in digital hardware.
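As a concrete illustration of timing-based learning, the following is a minimal sketch of pair-based spike-timing-dependent plasticity (STDP), one widely studied form of the synaptic plasticity described above; the function name, learning rates, and time constant are illustrative assumptions, not measured values.

    # Minimal pair-based STDP sketch: a synapse strengthens when the
    # presynaptic spike precedes the postsynaptic one (causal pairing)
    # and weakens otherwise. Parameter values are illustrative only.

    import math

    def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20e-3):
        """Return the new weight for one pre/post spike pairing (times in s)."""
        dt = t_post - t_pre
        if dt > 0:    # pre before post: potentiation
            w += a_plus * math.exp(-dt / tau)
        else:         # post before pre: depression
            w -= a_minus * math.exp(dt / tau)
        return max(0.0, min(1.0, w))  # keep the weight in a bounded range

    w = 0.5
    w = stdp_update(w, t_pre=0.010, t_post=0.015)  # +5 ms pairing: strengthen
    print(f"weight after causal pairing: {w:.4f}")

A reward-modulated variant would scale each update by a global neuromodulatory signal, which is one common way the dopamine mechanism described above has been mapped onto learning rules.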
Having said that, there are still lessons to be learned from these biological computation systems, such as parallelism, energy efficiency, scalability, and fault tolerance.