Vladislav Pedder – The Experience of the Tragic (page 6)
Error: If there is a discrepancy between the prediction and the incoming signal, a prediction error is generated.
Model Update: The error signal is propagated back up the hierarchy, where the model is adjusted to improve future predictions.
When actual sensory information matches predictions, the brain minimizes prediction error, which conserves resources. However, when information does not align with expectations, a prediction error arises, signaling the need to update the model of the world.
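The predict–compare–update loop described above can be sketched in a few lines of code. This is an illustrative toy, not a neural model: a single scalar "model" predicts the next sensory value, the mismatch becomes the prediction error, and the error drives the update. The function name and the learning rate are assumptions chosen for illustration.

```python
# Toy sketch of the predictive-coding loop: predict, compare, update.
# (Illustrative only; real cortical hierarchies are far more complex.)

def predictive_coding(sensory_stream, learning_rate=0.2):
    prediction = 0.0  # the initial "model of the world"
    history = []
    for observation in sensory_stream:
        error = observation - prediction      # prediction error
        prediction += learning_rate * error   # model update driven by the error
        history.append((observation, prediction, error))
    return history

# In a stable environment, errors shrink as the model converges on the input:
trace = predictive_coding([10.0] * 20)
```

After enough matching observations, the prediction error approaches zero and little further updating is needed, which is precisely the resource saving the text describes.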
Within the neural layers of the brain, there is a division of labor between “prediction neurons,” which form expectations, and “error neurons,” which signal when predictions have failed. For example, in the supragranular layers (the upper layers of the cortex), error neurons activate when something unexpected occurs, whereas the neurons that emit prediction signals reside in the deeper layers.
However, the effectiveness of predictive coding is influenced by various factors, including hormones, neurotransmitters, the gut microbiota, and trauma. Hormones such as cortisol, produced in response to stress, can alter neuronal sensitivity, affecting the brain’s capacity for adaptation and learning. Neurotransmitters such as dopamine play a key role in motivation and reward, strengthening or weakening particular predictions and responses. The gut microbiota, interacting with the central nervous system, can influence mood and cognitive function, and thereby the predictive process itself. Trauma, particularly traumatic brain injury, can disrupt the normal functioning of the neural networks responsible for predictive coding, leading to cognitive and emotional disorders.
Errors in the predictive coding process may arise from various causes. They can be linked to insufficient accuracy of sensory data, incorrect interpretation of information, or failure to update world models. Such errors may result in distorted perception and impaired adaptive behavior. For example, during chronic stress, elevated cortisol levels may reduce the brain’s ability to correct predictions, leading to persistent perceptual errors and increased anxiety.
Thus, predictive coding constitutes the foundation of adaptive behavior and human cognitive functions. Understanding the mechanisms of this process and the factors affecting its efficacy opens new horizons for developing treatments for various psychiatric and neurological disorders associated with disruptions in predictive coding.
Conclusion
The emergence of the mind is the result of a complex evolutionary process that led to the development of diverse forms of intelligence across different species. Predictive coding and Bayesian approaches demonstrate how the brain constructs models of the world and adapts to new conditions by minimizing prediction errors. These mechanisms form the foundation of our perception, learning, and thinking, rendering the mind a powerful tool for understanding and transforming reality.
4. The Existential Limit of Prediction
Mental models are internal cognitive structures through which we interpret and predict the world. These models help us navigate life by constructing more or less accurate representations of reality. However, like any other tool, they are limited. Mental models, akin to filters of the mind through which we perceive the world, are inevitably simplifications based on experience and expectations, allowing us to interact with the environment more efficiently. Yet, as with any tool, these models cannot always reflect reality accurately, as the world does not always conform to the frameworks we impose upon it.
These ideas found continuation in Plato’s philosophy. In his famous allegory of the cave, Plato portrays people sitting in a dark cave, seeing only shadows cast by objects in front of a fire. These shadows represent a distorted perception of reality, taken to be truth because the cave’s inhabitants have never seen the light. Only the one who escapes the cave can behold the true reality hidden behind the shadows (Plato, 1990). Plato’s allegory symbolizes the limits of our perception, which reflects only a fragment of the full picture of the world.
Later, Immanuel Kant argued that we do not perceive the world as it is in itself (the Thing-in-itself, Ding an sich), but only as it appears to us, shaped by the forms of our sensibility and the categories of our understanding. On this view, our knowledge is always knowledge of appearances rather than of reality as such.
The idea that our perception of the world is always limited also finds expression in the work of Thomas Bayes, whom we discussed earlier. Bayes used the example of the rising and setting sun to illustrate how our models of the world can be updated based on observations. For instance, a person emerging from a cave for the first time observes a sunrise and wonders: does this happen every day? With each new observation, the person updates their belief using Bayesian reasoning. Each additional sunrise strengthens the hypothesis that the sun indeed rises daily. However, should this prediction one day fail – if the sun does not rise or does not appear in the expected location – then the model of the world must be revised in light of the new data.
In this way, the Bayesian approach reveals a process of continuous updating of our mental models based on new observations, which also echoes Plato’s idea of seeking the true reality beyond distorted perception. Bayes emphasizes that the perception and prediction of the world are dynamic processes, always subject to revision, and that the reality we attempt to grasp may always run deeper than what our current model of perception allows.
These ideas were expanded upon by Nate Silver, who examined the principles of prediction under uncertainty. Silver argues that successful forecasting depends on the ability to distinguish between “signal” (important information) and “noise” (random or irrelevant data), a process closely related to Bayesian model updating. (Silver, 2019)
However, Silver goes further by emphasizing that not all models can be corrected by simply including new data. In a world full of uncertainty and randomness, many predictions fail even when the correct methodology is followed. He stresses how people often overestimate their ability to interpret data, relying on forecasts that seem plausible but are in fact driven by perceptual errors and cognitive biases. According to Silver, it is not enough to account for new data – we must also understand the context in which it arises. In this sense, as in the Bayesian framework, correcting mental models requires not only observation, but an awareness of the epistemic constraints we face in interpreting the world. Silver also highlights that the importance of “noise” in data is often underestimated, and that without the ability to separate noise from signal, we cannot build adequate predictive models, even with the most advanced analytical techniques.
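Silver's distinction between signal and noise can be illustrated with a toy simulation: the same underlying quantity (the "signal") is observed through random noise, and an estimate that trusts a single noisy reading is generally less reliable than one that pools many readings so the noise cancels out. The signal value, noise level, and sample size are all assumptions chosen for illustration.

```python
# Toy illustration of signal vs. noise: a fixed signal observed through
# Gaussian noise. Averaging suppresses the noise while preserving the signal.
import random

random.seed(0)
signal = 5.0
observations = [signal + random.gauss(0, 2.0) for _ in range(1000)]

# Naive estimate: treat the latest noisy reading as the truth.
naive_estimate = observations[-1]
# Pooled estimate: average many readings so random noise cancels out.
averaged_estimate = sum(observations) / len(observations)

averaged_error = abs(averaged_estimate - signal)
```

The pooled estimate lands close to the true signal, while any individual reading may miss it badly; without some way of separating the stable component from the random one, no amount of raw data produces a good forecast.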
Thus, like Bayes, Silver emphasizes the necessity of constantly reassessing our assumptions and revising our models of the world. However, unlike classical Bayesian theory, Silver underscores the difficulty of prediction in the real world, where signal is often hard to distinguish from noise, and our ability to make accurate forecasts remains inherently limited.
And yet, even though our mental models can be updated in response to new observations – despite the complexity of prediction – the process of adaptation is not infinite. When the world becomes too complex, or when our expectations encounter fundamentally novel and unpredictable phenomena, our models confront limits that cannot be overcome by ordinary means of correction. This presents the mind with an unbridgeable rupture – a moment when we are no longer able to adapt our predictions to reality.
In such moments, when even the most flexible models fail, the mind undergoes a crisis – a rupture caused by the impossibility of forecasting or understanding what is happening. This confrontation with uncertainty generates an existential tension that calls into question the very capacity of reason to make sense of the world. Despite all efforts at updating and revising our models, it becomes evident that human cognition inevitably encounters boundaries that cannot be transcended by any familiar predictive mechanism.
The Existential Limit of Prediction refers to the boundary at which the human brain encounters fundamentally unpredictable phenomena – phenomena that cannot be integrated into predictive models due to a lack of data, experience, or the capacity for error correction. When the brain reaches the limits of its cognitive capabilities, this results in an irresolvable cognitive conflict, giving rise to profound existential experiences.