I can’t remember how I first came across Robert Pepperell’s Consciousness as a Physical Process Caused by the Organization of Energy in the Brain. At the time, the paper struck me as interesting and as making some good points, but it didn’t seem to deliver much that was new. It was like looking at a large three-dimensional artwork from a different angle. The object of vision wasn’t substantially changed; it just looked a little different.
One idea that did stick with me was that of the brain as a difference engine:
In light of these mechanisms, the energy-hungry brain might be understood as a kind of ‘difference engine’ that works by actuating complex patterns of motion (action potential propagation) and tension (antagonistic pushes and pulls between forces) at various spatiotemporal scales. Firing rates and electrical potentials vary within neurons, between neurons, between networks of neurons, and between brain regions, so maximizing the differential states the brain undergoes. A decrease in activation, or a reduction in firing rate, can create a differential state just as much as an increase. And, as is indicated by the work of Schölvinck et al. (2008), deactivation may be an energy efficient way for the brain to increase its repertoire of differential states. Maintaining a global E-I balance across spatiotemporal scales, meanwhile, is thought to promote ‘efficient coding’ in sensory and cognitive processing (Zhou and Yu, 2018). All this lends support to the idea, proposed above, that one of the roles of energetic activity in the brain is to efficiently actuate differences of motion and tension that advance the interests of the brain-bearing organism. It is the actualized difference that makes the difference.
The notion of differences in connection with the brain came back to me again recently when I read about the Weber–Fechner law. The law is not quite a law, more a rule of thumb. Basically, it states that perceived intensity grows logarithmically with stimulus magnitude. The classic example is weight perception. If a weight of 105 g can just be distinguished from one of 100 g, then when the mass is doubled the differential threshold also doubles, to 10 g, so that 210 g can just be distinguished from 200 g. The “law” seems to apply to all of the senses, with some minor discrepancies and exceptions.
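To make the rule of thumb concrete, here is a minimal sketch in Python. The 5% Weber fraction and the other constants are illustrative, taken from the 100 g example above rather than from any measured data; the point is only that the just-noticeable difference scales with the stimulus, while doubling the stimulus adds a constant amount to the perceived intensity.

```python
import math

def just_noticeable_difference(stimulus, weber_fraction=0.05):
    """Weber's law: the smallest detectable change is a roughly constant
    fraction of the stimulus magnitude (5% here, as in the 100 g example)."""
    return weber_fraction * stimulus

def perceived_intensity(stimulus, threshold=1.0, k=1.0):
    """Fechner's law: perceived intensity grows with the logarithm of the
    stimulus relative to a detection threshold."""
    return k * math.log(stimulus / threshold)

if __name__ == "__main__":
    for grams in (100, 200, 400):
        print(f"{grams} g -> just-noticeable difference ≈ "
              f"{just_noticeable_difference(grams):.0f} g")
    # Doubling the stimulus adds a constant amount to perceived intensity:
    print(perceived_intensity(200) - perceived_intensity(100))  # ~0.693
    print(perceived_intensity(400) - perceived_intensity(200))  # ~0.693
```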
The brain’s oscillatory networks seem to treat input in a similar way. Much of the effect of inhibitory neurons is to dampen inputs so that neurons do not fire wildly with every small change in the environment. The result is a brain that mostly runs its own emulation of the world, based on its own self-sustaining rhythms, and only occasionally incorporates outside input.
Let’s suppose we are walking along a clear, flat, well-paved road with no traffic. Picking up each foot and pushing off into a stride would require almost no input from the outside world or from the brain. A coalescing view of the evolutionary origin of the spinal cord and brain is that they were necessary for locomotion. Central pattern generators are “biological neural circuits that produce rhythmic outputs in the absence of rhythmic input”. The experiments of Thomas Graham Brown in the early twentieth century showed that the basic pattern of stepping can be produced by the spinal cord without descending commands from the cortex. Encountering a stray rock that had been thrown onto the road might require an adjustment to the stride. At that point the brain would get involved, even though the perception of the rock might not rise to the level of awareness.
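A central pattern generator can be illustrated, very roughly, as a “half-center” oscillator: two units that inhibit each other and slowly adapt, driven by nothing but a constant input. The sketch below follows the style of Matsuoka’s neural oscillator model; the parameter values are illustrative, not fitted to any biological circuit, and the flexor/extensor labels are only a convenience. With these settings the two outputs should take turns being active, a rhythm produced without any rhythmic input.

```python
def half_center_cpg(steps=4000, dt=0.005):
    u = 1.0             # constant, non-rhythmic drive (a tonic "go" signal)
    tau, T = 0.25, 0.5  # membrane and adaptation time constants (seconds)
    beta, w = 2.5, 2.5  # adaptation strength, mutual inhibition strength
    x = [0.1, 0.0]      # membrane states (slight asymmetry breaks the tie)
    v = [0.0, 0.0]      # slow adaptation (fatigue) states
    outputs = []
    for _ in range(steps):
        y = [max(0.0, xi) for xi in x]  # rectified firing rates
        for i, j in ((0, 1), (1, 0)):
            dx = (-x[i] + u - w * y[j] - beta * v[i]) / tau
            dv = (-v[i] + y[i]) / T
            x[i] += dt * dx
            v[i] += dt * dv
        outputs.append(tuple(y))
    return outputs

if __name__ == "__main__":
    trace = half_center_cpg()
    # Sample every half second of simulated time; the two outputs should
    # alternate even though the only input is the constant drive u.
    for k in range(0, len(trace), 100):
        y1, y2 = trace[k]
        print(f"t={k * 0.005:5.1f} s   flexor={y1:.2f}   extensor={y2:.2f}")
```

A rock on the road would amount to a brief perturbation layered on top of this rhythm, a transient adjustment from higher circuits rather than a takeover of the pattern itself.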
Back to energy again. It would make sense that evolution would select for lower-energy solutions in the brain, all else being equal. It takes far more energy to take in and assimilate information from the environment than to operate with as little information as possible, so long as the model of the world the brain is using is good enough for survival. Here we may have the core of what lies behind several different observations. For one, the idea of Hoffman and others that our model of the world is likely not veridical, because it would be too energetically costly to create a truer one. For another, as Baars and others have pointed out, learning a new skill requires a lot of energy in the brain because it requires assimilating a lot of information, while exercising a learned skill requires less energy and occupies smaller regions of the brain.
The picture that emerges for me is that of a brain that is primarily generating its own model of the world. It lets into that model only as much information as it needs to function. The brain prefers to operate in a low-energy homeostatic state as much as possible; in effect, it might prefer to be a zombie. A small rock while we are walking may trigger a small modification of the model. The sudden appearance of a large truck coming towards us, however, might require a completely different sort of activity. Attention and resources must quickly be brought to bear on the truck, and large amounts of information must suddenly be brought into the model. An fMRI of the brain at that moment might detect a large spike in nutrient utilization across various regions as many neurons begin to fire rapidly.
Could we be misunderstanding the activity spikes in fMRI scans that are so often relied on in neuroscience and consciousness studies? Could the spikes simply be evidence of the attentive process rather than of the lower-energy processes that form the foundation of consciousness?
Some would argue that the attentive process is consciousness, but then what creates and maintains our model of the world, the visual, auditory, tactile expanse that exists even without attention?