Any change that actually improves our way of living – makes it more sustainable and less destructive, or enhances its connectivity with the rest of the biosphere – involves learning. And just as our civilization requires inputs of energy from natural sources, learning requires information from reliable sources.
Information is useful to the extent that it serves some purpose; propaganda, for instance, is useful to whoever benefits from the belief it aims to propagate. But information is genuine to the extent that it connects form, experience and reality. Our purposes guide our habitual practices, but they can only work within the limits of our imagination and our experience – and we habitually forget how limited these are. Reality is much broader than anyone’s experience and much deeper than we imagine. But our only direct contact with reality is through experience, and our only way of comprehending experience is by recognizing the forms embodied in it. This means that our only way of learning anything is through signs connecting form, experience and reality to create genuine, habit-changing information. But when learning opportunities occur, our attentional habits – the habits which determine what we pay attention to and what we ignore – often make the difference in how we read the signs, and thus whether we learn anything from them or not.
One of our deepest instinctive habits is to maintain the integrity of our belief system. If a new message or other sign conflicts with strong beliefs, you are unlikely to accept it unless your own experience (or something even stronger) forces you to. But since most human belief systems are fairly complex (even though they simplify the real world for us), it’s not unusual for new information to conflict with some part of the system while confirming other parts. If it turns out to be a genuine discovery, then your whole belief system will reorganize itself, incorporating some new beliefs and eliminating some old ones. In other words, you will learn something that changes your habits. This is how common sense evolves. But habit systems would rather not change if they can avoid it: they are naturally conservative, because their survival depends on stability and consistency.
Belief systems tend to conserve their own simplicity in the face of life’s complexities. One way of doing this is to adopt certain beliefs as fundamental, which means clinging tenaciously to them and rejecting or ignoring whatever would challenge them. But faith in a fundamental belief is at best a substitute for genuine faith in the integrity of the belief system as a living whole capable of growth. If we have genuine faith in our ability to recognize the truth about some subject – which does not depend on what any person or group thinks about it – then we have to accept that any one belief is fallible, open to question and improvement. That doesn’t mean we can question everything at once – questioning itself relies on the integrity of the belief system, which means that in practice we have to take some things for granted in order to question others. Some beliefs may be taken for granted so consistently and productively that nothing ever happens to call them into question. But if and when the question does arise, the healthy belief system is open to it, whereas the fundamentalist system will fight to keep itself closed.
When the process of inquiry goes public, new complications arise because we are dealing with a collective belief system that can only learn by testing new ideas against many observations. In science (meaning an organized system of public inquiry), a hypothesis that would modify the established belief system is not accepted unless it is supported by the experiments or observations of many independent investigators, and even then its acceptance is provisional. In order to count as part of this testing process, an investigation must be open to the scrutiny of other investigators qualified to assess its methods: the individual can only contribute to the inquiry as part of a network. The author of a new idea may arrive at it through intuition or inspired guesswork, but she can’t rely on her authority to get it accepted into the scientific belief system. In science, as C.S. Peirce pointed out, ‘experience is our only teacher’ and authority counts for nothing. But since this kind of inquiry requires a large investment of time and attention (not to mention money), often for inconclusive results, only a minority of people can fully engage in the kind of specialized inquiry we call scientific. In the broader community, since we can’t afford that kind of investment, we have only indirect access to that kind of experience. So we have to rely on simpler ways of deciding what information to accept or reject.
One short cut we commonly use is to rely on the authority of specific trusted sources. But how do we know what sources to trust? The simplest way is to trust those who tell us what we already believe – but this amounts to indulging in self-deception. Another way is to trust those sources which are most persuasive. But as we all know, persuasion has become a highly sophisticated, powerful and lucrative industry, often used – by those who can afford it – to manipulate public opinion. So this leaves us open to another kind of deception, unless we apply some critical thinking to the means of persuasion. Hence the importance of ‘sales resistance’ for the information economy.
Accordingly, many of us are in the habit of ‘questioning authority’ and resisting whatever we hear from the mainstream media, the scientific or professional establishment, or the government. Indeed this is just common sense, given the level of corporate ownership and influence over all of the above. But we sometimes overcompensate for our distrust of the establishment by placing uncritical trust in ‘alternative’ sources of information. We seem to think that since the ‘authorities’ are constantly lying to us, anyone who makes a point of opposing them must be telling the whole truth. This tends to make us partial to unconventional beliefs, ‘revolutionary’ (but untested) theories, ‘alternative’ medical treatments and so on.
When we are partial to a belief, we tend to overlook the lack of evidence for it. This makes us partial to conspiracy theories, which give us a convenient excuse for ignoring the lack of evidence: it’s not there because “They” have conspired to cover it up! Only a believer in conspiracy theories would think that the lack of supporting evidence for a theory is a good reason to believe it. Conspiracy theories are tempting, though, because we all know that conspiracies do happen; besides, the corporate media do suppress information, when they can get away with it, without even having to conspire. But the only way to stop them getting away with it is to find and document the facts and make them public. Endless debates about who killed JFK, or who was responsible for 9/11, or what’s hidden at Roswell are the stuff of tabloid information, not genuine public inquiry.
If we believe anything simply because it’s contrary to what established science or authority says, or accept anything as fact without looking at the evidence for and against it, we’re indulging in self-deception. That’s a step backward in the evolution of common sense, which calls for critical thinking to be applied impartially to every idea that claims a place in our belief system – and it’s the beliefs we are partial to that really need our critical attention. We can’t honestly claim to know something unless we can tell how we know it.
But this brings us back to the clash between the flood of information and the limitations of our attention. Genuine critical thinking is not easy, and we don’t have time to apply it to everything. So the third and last part of this series will set forth a few guidelines for judging whether an information source is worth paying attention to or not. There’s nothing terribly original in these guidelines – they’re just common sense, really. But like any other set of tools, they’re more likely to be used wisely if we can see why they work.