An attempt to regulate the mind introduces two processes:
an operational process that promotes the intended change by looking for mental contents consistent with the intended state; and
a process for detecting mental contents that are inconsistent with the intended state.
Together, these processes yield whatever degree of mental control is achieved.
When capacity is diminished for any reason (distraction, cognitive load, stress, time pressure, etc.), the intended control does not simply revert to an uncontrolled baseline or zero level. Rather, mental control exercised under mental load frequently produces thought states that go beyond "no change" to become the opposite of what is wanted: the desired happiness becomes sadness, the desired relaxation becomes tension, the desired interest becomes boredom, and so on.
For example, when a person is attempting to be happy, the operational process searches for mental contents relevant to happiness, while the monitoring process searches for mental contents indicating that happiness has not been achieved. Unlike the operating process, which is effortful and consciously directed, the monitoring process is typically unconscious and autonomous, and requires less mental effort.
Mental control results from the interaction of these two processes. The operating process promotes the intended change by filling the mind with thoughts and sensations relevant to the intended state. The monitoring process searches covertly for mental contents that signal a need for control; in this way it determines whether the operating process should be engaged at any given moment. When the monitor detects signs of control failure, it reinitiates the operating process.
Under mental load, then, the attempt to control the mind activates a monitoring process that not only watches for failures of mental control but can itself produce them. Merely searching for thoughts or sensations that signal failed control is sometimes enough to bring those contents into consciousness, defeating the intended control through precisely the kind of ironic error the system was meant to prevent.
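The load-dependent asymmetry between the two processes can be sketched as a toy simulation. Nothing here comes from the source: the function name, the capacity and monitor parameters, and the sampling scheme are all hypothetical illustrations of the qualitative claim that an effortful operator degrades under load while a cheap monitor keeps running.

```python
import random

def mental_state(intention_strength, load, monitor_strength=0.2,
                 trials=10000, seed=0):
    """Toy sketch of ironic process theory (all parameters hypothetical).

    The effortful operating process surfaces intention-consistent thoughts,
    but its capacity is reduced by cognitive load. The low-effort monitoring
    process keeps surfacing intention-inconsistent thoughts regardless of
    load. Returns the mean valence of sampled thoughts: positive values
    approximate the intended state, negative values its opposite.
    """
    rng = random.Random(seed)
    # Operating process is effortful and load-sensitive.
    operator = max(intention_strength - load, 0.0)
    total = 0.0
    for _ in range(trials):
        r = rng.random()
        if r < operator:
            total += 1.0   # operator: intention-consistent thought
        elif r < operator + monitor_strength:
            total -= 1.0   # monitor: intention-inconsistent thought
        else:
            # Uncontrolled baseline: weak, unbiased drift.
            total += rng.choice([-1.0, 1.0]) * 0.1
    return total / trials

# With full capacity, control succeeds; under heavy load, the monitor's
# steady search for failure drives the net state past "no change" into
# the opposite of what was intended.
relaxed = mental_state(intention_strength=0.6, load=0.0)  # positive
loaded = mental_state(intention_strength=0.6, load=0.6)   # negative
```

The sketch captures only the theory's central asymmetry: load subtracts from the operator but not from the monitor, so control under load is worse than no control at all.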
Complex mental states, such as relaxation, joy, sleep, anger, or belief, may include not only momentary cognitive orientations but also more stable cognitive structures and key body states.
The body's responses include patterns of autonomic arousal, motor movements, facial expressions, postures, and even patterns of cerebral activation and inhibition. To the extent that attending or not attending to certain thoughts can influence these responses (an influence that is often imperfect), the operating process extends beyond the control of attention to the control of complex mental states.
In general, the desire for a given mental state prompts an operating process that searches for contents consistent with that state, whereas the desire to avoid a state prompts an operating process that searches for contents inconsistent with that state. That is, an operating process can only bring contents into consciousness; it cannot exclude or remove them.
For example, a person wishing to concentrate on the action at the other end of the field might search for the sensations the game provides as stimuli, and might also search memory for contents related to the game, such as plays or assignments the coach has given the team. In the case of intended thought suppression, the operating process likewise searches for contents consistent with the desired state, but here those contents are anything other than the unwanted thought: to gain mental control, the search turns to distracters.
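The claim that suppression is a search for distracters, while the monitor's scan for the unwanted item can itself surface it, can be rendered as a minimal sketch. All names, the memory contents, and the intrusion and capacity probabilities are hypothetical, chosen only to illustrate the asymmetry described above.

```python
import random

def next_thought(memory, unwanted, capacity, rng):
    """Hypothetical sketch of thought suppression as a distracter search.

    The operating process cannot remove the unwanted item; it can only pull
    *other* items into consciousness. The monitoring process meanwhile scans
    for the unwanted thought, and that very scan can bring it to mind when
    operating capacity is low.
    """
    # Monitoring process: cheap and always running; its scan for the
    # unwanted thought occasionally primes it into awareness.
    monitor_intrusion = rng.random() < 0.15

    # Operating process: effortful search for any distracter, succeeding
    # with probability proportional to available capacity.
    if rng.random() < capacity:
        distracters = [t for t in memory if t != unwanted]
        return rng.choice(distracters)   # suppression succeeds
    if monitor_intrusion:
        return unwanted                  # ironic intrusion
    return rng.choice(memory)            # uncontrolled drift

# Usage: count ironic intrusions over a stream of thoughts under low capacity.
rng = random.Random(1)
memory = ["white bear", "homework", "dinner", "music", "weather"]
intrusions = sum(
    next_thought(memory, "white bear", capacity=0.2, rng=rng) == "white bear"
    for _ in range(1000)
)
```

Note the design choice: the operator's success branch never returns the unwanted item, mirroring the text's point that an operating process only brings distracters into consciousness rather than ejecting the target.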
Mental control is, after all, a curiously reflexive enterprise. Because control processes may themselves appear in consciousness, processes that aim to change consciousness must be compatible with the states of consciousness they are meant to create. This reflexivity constraint holds that any control processes represented in consciousness during an act of control must be consistent with the state of mind that is the goal of the control attempt.
The operating process is present in consciousness because it is the "dominant action system" (Shallice, 1978), "current concern" (Klinger, 1978), or "prepotent act identity" (Vallacher & Wegner, 1987) at the time it is activated. People may also be aware of some of the operating process's ongoing work and be able to report on it as it happens. If asked, people would say that the operation is what they are "doing" to control their minds.
Controlled (non-automatic) or resource-dependent cognitive processes are also effortful (see Bargh, 1984, 1989; Hasher & Zacks, 1979; Logan, 1979, 1988; Navon & Gopher, 1979; Posner & Snyder, 1975; Shiffrin & Schneider, 1977). The operating process can thus be interrupted by other activities that demand attention, and it does not always resume. Automaticity and control have been distinguished along several logically and empirically distinct dimensions: a process's demand for cognitive capacity does not determine how much it must be intentionally guided (vs. self-guided), nor how readily it can be started or stopped by decision (Bargh, 1989; Jonides, Naveh-Benjamin, & Palmer, 1985; Kahneman & Treisman, 1984). Characterizing the operating process as effortful therefore distinguishes it from processes less susceptible to interference from concurrent tasks.
One implication of the theory, then, is that avoiding mental load and exercising care in the use of mental control intentions should make ironic errors less frequent, and less severe when they do occur. But one further form of resistance is worth considering: automating the operating process.
Ultimately, the intentional operating process serving any mental control intention should be trainable. Like other conscious and deliberate activities that become automatic with practice, it should become less conscious, less effortful, and perhaps less susceptible to interruption and inhibition as well. In turn, these changes might render the ironic monitor unnecessary, or at least less important. Although some monitoring system might still work to alert us when automatic processes break down, such monitoring would not seem to have the same access to consciousness as the monitoring of conscious mental control. (If every failure of an automatic system were announced to us immediately, our minds would be constantly abuzz with these reports.)
Consider consciousness of automatic or skilled action. Many people report that becoming conscious of their actions invites error (Sudnow, 1978). Being conscious of one's fingers while typing, for example, is a prelude to mistakes, as is being conscious of one's movement while playing an instrument or executing a tennis backhand (cf. Vallacher & Wegner, 1987). An intentional operating process and an ironic monitoring process may be imposed on the automatic action, generating ironic, and relatively automatic, error. This suggests that some cases in which people "choke under pressure," producing precisely the error most harmful to their current purpose (Baumeister, 1984), may occur because they have been induced to make their otherwise automatic actions intentional, losing the resistance to irony that practice and automaticity normally confer.
The idea that mental control could become automatic, and so escape irony, has implications for the effectiveness of mental control attempts in general. People who practice thought suppression often enough may develop operating processes so skilled and automatic that they become very good at suppression and suffer few intrusions from the ironic monitoring process. In the same way, practice at relaxation or mood control could be the key to self-regulation that escapes the ironic process described in this article. Some people do seem to be savants of self-control, displaying apparently magical powers of mental control through acts of repression, self-denial, or what looks like self-deception. Their skill, however, may derive from having turned mental control activities into well-learned habits through sheer repetition.
Conclusion
The theory of ironic processes of mental control can explain a wide range of psychological effects that arise when people try to control their thoughts and actions. The theory describes a two-process mental control system that helps explain how people can consciously change the workings of their own minds. The mechanism comprises a conscious operating process that searches for mental content consistent with the intended change, and an unconscious monitoring process that searches for mental content inconsistent with the intended change in order to test how well control is proceeding. In this sense, the theory concerns how people move from wanting to seek or avoid thoughts, emotions, and motives to actually having them or keeping them at bay. The theory also accounts for another, more perplexing group of effects that seem to beg for explanation. It holds that the ironic monitor is to blame when we find ourselves doing, saying, thinking, or feeling exactly what we did not intend.