There are strong, pervasive memes and ideas in society that discourage people from changing their minds, especially in public.

I think this happens in all fields and circles of society without exception, even science. It is especially obvious in politics and public affairs.

There is no shortage of cases of politicians being attacked because it comes to light that they once championed policies that contradict the views they champion at present. The damage to their prospects is worst if they tried to hide those past views from public attention, but it is considerable even if they made no special effort to hide them.

This happens even in scientific and academic circles, although more subtly. If a scientist was a strong proponent of one theory and at some point abandons it in favour of another (sometimes a rival theory), that often costs them credibility with their colleagues and in the field at large. It also hurts their chances of publishing further work. This is especially true if the change of mind happens more than once and/or before they achieve tenure.

First, we need to agree that this is bad. It is bad because the aim of everything we do is, or certainly ought to be, to improve life. Improving life has many metrics and operationalizations that we don’t need to get into, or even agree on, right now. The point is that, by definition, you cannot improve without change. For something to get better, its state needs to change from the previous, less optimal one to the current best guess. This should be an obvious premise. A less obvious premise is that the current state or idea, while being our current best guess, will never be the perfect one, and we should not want or expect it to be the best guess forever. There will always be room for improvement. Let’s call that improvement error-correction.

It should now be clear why the reactions that politicians and scientists (and everyone else) almost always get when they change their minds publicly are bad ones. We want people to change their minds, that is, to error-correct, as the evidence requires. Science, for example, creates knowledge by proposing theories that explain data, designing experiments to test those theories, and discarding the theories that the results falsify.

The way academia is currently set up often incentivizes researchers to choose a topic (or group of topics), support a theory in each of those topics, and try to produce evidence in their labs that fails to falsify those theories while falsifying rival ones. That’s fine. What’s not fine is what happens when their theories are falsified. The way things are currently set up discourages you from turning around, saying you were wrong about the theory you were arguing for last year, and that you now think a new theory (either a modification of the old one or a more radically different one) is true. Doing so may cause difficulties in your academic department, at conferences, or when submitting research for publication.

The discouragement is not only institutional. Socially, people often react with suspicion if you’ve changed your mind about something significant in the time they have known you. They think you’re flaky and unreliable, and sometimes they do not even believe that your mind changed at all; instead, they think you lied about your initial preferences for malicious or deceptive reasons.

The mistaken common standard is to be right and consistent: pick the right story from the beginning and stick to it. It’s even worse than that. Sometimes people would prefer you stay consistent even when you think you’re wrong, rather than change your behaviour as your ideas about what is right change.

Overcoming your own internal barriers to changing your beliefs and principles is difficult enough without others giving you a hard time about it. Instead of the horrible tradition described above, people should be curious when someone they know changes their mind, ask why, and either criticize the reasons given or learn from them.