Many scientists and scholars of brain function have offered opinions on how much information the brain can process. Some argue that the number is unknowable, and that we should therefore not expect to quantify it. Others claim that cerebral information can be quantified from the number of neural connections, on the assumption that a memory can repeat within one second. Still others hold that the upper bound on the information the brain processes depends on how the CNS uses incoming data and compares it with older neural codes. As noted, it is no easy task to discuss brain function, especially how much information it can capture and process each second. Quantum physicists around the world are studying the possibility of an underlying order in which neural states operate under the same laws as quantum mechanics, influencing the way the individual perceives, analyzes, compares, and absorbs information.

Beyond the current problem of brain function, there is also the investigative path of the advanced neurosciences into how the brain produces consciousness and how memories are captured and stored in the form of neural codes. Some neurobiologists argue that there should in fact be a law under which intrinsic cerebral phenomena are evident in quantum terms yet do not appear in ordinary laboratory experiments. Others take a traditional view, arguing that explaining these phenomena is a matter for neuroscience alone. But some now argue that modern science should not limit its explanation of the brain to the nervous system alone, but should consider the behavior of the entire universe. This line of thought offers a global way of describing quantum or mysterious phenomena, such that every manifestation of reality carries its own obscurity within the context of the hard problem of describing the informational behavior of the brain.

There are those who say the brain processes 10,800 informational threads per second, but there are no reliable data to support this statement. Scientists such as the Harvard psychiatrist Jeffrey Satinover and Joe Dispenza, described as a neurophysiologist, geneticist, and physicist, agree that the brain must process about 400 billion bits of information per second, though we may not be aware of all of that data. There is no consensus on how much information the brain handles per second on average, in view of unconscious mental states, homeostasis, and processes related to the neurophysiological mechanisms of inhibition in the CNS. But respected scientists seem to have converged on the amount of information the brain processes unconsciously. The scientist John Hagelin, who worked on the unification of quantum fields at Stanford University and has several publications in the field (including work on supersymmetry), took part in a documentary in which several scientists, in addition to those already mentioned, settle on a consensus figure for the number of bits of information the brain uses unconsciously every second. The figure of 400 billion bits per second has been affirmed and reaffirmed by several professors and PhDs in medicine, physics, and quantum mechanics at major universities around the world.

According to these estimates, enormous amounts of information reach virtually every individual. On reaching the observer, they are processed by the sensory organs, and a large-scale exchange takes place between reality and the mental environment. There is an informational transition mechanism that withholds the data of lesser importance from the brain, in accordance with the limitations of the sensory systems, every second.

Neuroimaging: The brain at work

Consciousness takes in only the most important part, the amount that reflects the maximum possible absorption of objective reality. Of the 400 billion bits of information per second that reach the brain, only 2,000 bits are used for a person to be conscious of the world around them. That covers information about the environment, about one's own body, and about what kind of decision will be taken at the moment. Our perception of reality, so to speak, is extremely limited. Thus the eyes are not the true editors of reality; rather, it is the visual cortex in the occipital lobe that relates the "meaningless images" presented to the viewer through the retina to mnemonic records. But all this information in the brain is still the subject of investigation by many scientists. Some have been able to show reasons why such figures should correspond to the actual brain. Others have taken the approach of research with human subjects toward possible confirmation of this theory.
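Taken at face value, the ratio between these two figures is straightforward to compute. A minimal sketch, using the article's claimed numbers (they are assertions from the text, not established measurements):

```python
# Figures as claimed in the text, not established measurements.
total_bits_per_second = 400_000_000_000   # bits/s said to reach the brain
conscious_bits_per_second = 2_000         # bits/s said to reach consciousness

# Fraction of incoming information that would become conscious
fraction_conscious = conscious_bits_per_second / total_bits_per_second
ratio = total_bits_per_second // conscious_bits_per_second

print(f"Conscious fraction: {fraction_conscious:.1e}")  # 5.0e-09
print(f"Roughly 1 conscious bit per {ratio:,} incoming bits")
```

On these numbers, consciousness would sample only about five parts per billion of the incoming stream, which is the sense in which the text calls perception "extremely limited."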

The scientist who has described the mathematics of neural coding in bits of information most objectively is the physics genius Maicon Santiago. To relate these data to the number of neurons in the human brain, Santiago took into account the fact that the brain works through random states, like any other probabilistic quantity in nature that can be calculated according to the predictions of quantum theory. He considered that, besides the probability of action of packets of information, there is also the probabilistic representation of random data arriving at neurons through the "dendritic spines" (first described by his ancestor Santiago Ramón y Cajal, who shared the Nobel Prize in Medicine with Camillo Golgi in the early twentieth century). This overview addresses the phenomena called "systemic chaoticity," present throughout nature. Just as there are more illuminating examples, such as Brownian motion being random in Albert Einstein's description of collisions, the fields of the brain are probabilistic states of random information coding in Maicon Santiago's description, although the two phenomena are of different natures.

Since Santiago's experiments are largely thought experiments, the peculiar way in which he describes brain states has earned special attention from many physicists and neuroscientists, mainly because it is a unique way of interpreting the functioning of the brain in relation to the functions of the mind. Maicon's ingenious hypotheses are of enormous complexity and challenge our current technology to find new ways of putting his experiments into practice in the future. Contrary to what many think, Santiago's scientific theories derive from a description of the universe he published in 2010, a scientific proposal that became known as the Autotropic Interpretation. This interpretation describes the universe at the microscopic level, treating atomic combinations and configurations in the context of natural phenomena in terms of just two probabilities of occurrence: "either a combination of elements exists or it does not." This aspect of nature the scientist called the "self-education" of the universe, because at any given time only two natural possibilities coexist, opposed to each other. Santiago's quantum theory is centered on the idea that it is the likelihood of combination between atoms in a "chaotic" medium that makes the universe look the way it does. Under this interpretation, Santiago explains why the universe is a mutable, probabilistic macrostructure 13.7 billion years old.

The description of energy as information in the universe at the microscopic scale led Maicon to formulate a more comprehensive theory concerned with the emergence of intelligence in the universe, starting from an analysis of the intellectual capacity of man. Under this proposal, called "probabilistic informational fields," intelligent life was built slowly and became more active and practical when brain tissue reached a standard of chaoticity at which integrated action became compatible with the "analytical use of information" and the manipulation of neural data, as a consequence of the chaos of the universe, where the growth of information is an inexorable trend. The brain being a product of the universe, information also tends to increase inside it, in this case producing memory mechanisms mediated by random interactions. This interpretation suggests that the universe must also have some kind of "informational memory" responsible for governing its phenomena, just as the brain governs the mechanisms of thought. It is suggested that the "memory of the universe" acts to sustain the nuclear forces, gravity, and electromagnetism over time, while in the brain it generates the coding of information and its recall in time.

The autotropic brain, according to Santiago's law

Maicon calls this type of memory of the universe the "self-configuration" of matter, which is subject to a natural property known as "thermodynamic disorder" (the increase of the total entropy of the universe). So too, in the brain, is the configuration of information: these information codes are likewise produced under the same disorder over time. This would explain why memories are destroyed, making a person forget something learned in the past. As the total entropy of the universe increases, a corresponding informational entropy occurs in the brain.

The study of the entropy of the physical universe is a rather complex undertaking, especially in large-scale descriptions of the universe. Yet almost no one talks about entropy when referring to the overall functioning of the brain. It is known, however, that neural coding processes are governed by random oscillatory events, occasionally observed in some electrophysiological approaches. Some physicists have also theorized that the brain's total information must be compatible with how much information the brain uses daily. The problem was that some calculations showed a very large discrepancy with the 2,000 bits per second accepted as conscious use. Others point out certain similarities when considering a numerical value for the information used daily at the peak of conscious information processing. The exact amount of information the brain uses every second is not known. The 400 billion bits per second is a figure much debated by theoretical physicists and neuroscientists, mainly because it is controversial when one considers the steady pattern of associative response involving both cerebral hemispheres.
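The informational side of this entropy analogy can at least be made concrete. As an illustration only, not a model of any neural code, Shannon entropy measures the average information content, in bits per symbol, of a source with given outcome probabilities:

```python
import math

def shannon_entropy(probs):
    """Average information content in bits per symbol of a discrete source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair binary source (akin to the "combination exists or not"
# dichotomy mentioned earlier) carries exactly 1 bit per symbol.
print(shannon_entropy([0.5, 0.5]))  # 1.0

# A biased source carries less: entropy falls as the outcome
# becomes more predictable.
print(round(shannon_entropy([0.9, 0.1]), 3))  # 0.469
```

This is only the standard information-theoretic notion of entropy; whether and how it maps onto the thermodynamic entropy the text invokes is precisely the open question being debated.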

The two cerebral hemispheres (Telencephalon)

Brilliant minds offer ever deeper, controversial interpretations to explain how the brain uses information. One is the proposal of Probabilistic Brain Fields, based on the concept of "binary indices of consciousness," which operate according to the coincidence of events across the entire brain at once.

Leaving aside the mathematical complexity for now, we might say, in a very simplistic way, that it works like this:

A stimulus reaches the cerebral cortex. This produces an excitability response in CNS neurons, and soon other neurons are activated by association with the activated networks. As a result, the production of sensations, memories, and other thoughts activates neurons in different brain regions, according to the kind of interpretation made of the new stimulus. In succession, all the other neurons in the brain should be activated, because every sensation and thought has its cortical representation in each area of the brain. This is what is called the informational and holographic approach to the brain. This approach suggests that all brain areas act together, but only a few prevail in the overall integrated operation. Those that prevail are the ones most directly involved in the sensation and the memories evoked. The less functional ones are those that remain active for a shorter period, until one or some of them come to predominate, as can be observed in the laboratory.

The 400 billion figure is the more accepted one because it refers to an associative entropy pattern that produces a delay in the response across both cerebral hemispheres, making each of the 100 billion neurons activate in the process. This function is characterized by the arrival of information (input) and its resulting analysis (association), which determine an output pattern: the neural response. So the 100 billion neurons receive stimuli, each with its bits to be analyzed. Then, as these bits are analyzed, the association is made. Given the interpretation, the response is prepared by the brain. To complete the entire circuit, each neuron must handle an average of four bits per second, that is, one bit at each stage of the circuit. Arrival, associative analysis, interpretation of reality, and the response to it, each with its own bit, multiplied by the number of neurons in the brain, equals 400 billion bits of information.

This means that we are evoking memories, thinking, and having feelings at all times without realizing it. We only notice what we feel and what we think when the information is important to us.
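The arithmetic behind the figure, as the text presents it, can be sketched directly. The neuron count and the one-bit-per-stage assumption are the article's claims, not measured quantities:

```python
# Figures as claimed in the text, not measured quantities.
neurons = 100_000_000_000  # ~100 billion neurons in the brain

# The four stages of the circuit described above, at 1 bit
# per neuron per stage per second.
stages = ["arrival", "association", "interpretation", "response"]
bits_per_stage = 1

bits_per_second = neurons * bits_per_stage * len(stages)
print(f"{bits_per_second:,} bits/s")  # 400,000,000,000 bits/s
```

The 400 billion figure thus follows mechanically from the two assumed inputs; the debate in the surrounding text is over whether those inputs are justified.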

The most intriguing thing is that a large part of this content cannot be known to consciousness! Only a maximum of 2,000 bits, as mentioned in this article, can be used every second to sustain our perception of the world around us. There is one important reason behind this fact, which is also a subject of debate, but it can be computed in a coherent description that makes it broadly acceptable. These promising modern speculations and theories, however, involve topics too complex for our current technology. The geniuses who formulate theories like these challenge the very limits of science, a very large step ahead, especially from the technological point of view, in seeking better resources to put their peculiar hypotheses to the proof.


