Individual self-deception is exactly what it sounds like, but it happens at a lower level than you might think. Human information collection, memory, recall, and reasoning can all be unduly influenced by high-level motivations. Sometimes this is actually good, such as when it helps us turn our aspirations into reality. Sometimes it's very bad, such as when it locks us into abusive situations.
The body of philosophical and psychological literature on individual self-deception is impressive and illuminating. The same can't yet be said of collective self-deception. The page I linked to has some great material, though, much of it very applicable to the church.
From the link:
This said, relative to solitary self-deception, the collective variety does present greater external obstacles to avoiding or escaping self-deception, and is for this reason more entrenched.
For example, Robert Trivers (2000) suggests that ‘organizational self-deception’ led to NASA’s failure to represent accurately the risks posed by the space shuttle’s O-ring design, a failure that eventually led to the Challenger disaster. The organization as a whole, he argues, had strong incentives to represent such risks as small. As a consequence, NASA’s Safety Unit mishandled and misrepresented data it possessed that suggested that under certain temperature conditions the shuttle’s O-rings were not safe. NASA, as an organization, then, self-deceptively believed the risks posed by O-ring damage were minimal. Within the institution, however, there were a number of individuals who did not share this belief, but both they and the evidence supporting their belief were treated in a bias[ed] manner by the decision-makers within the organization. As Trivers (2000) puts it, this information was relegated “to portions of … the organization that [were] inaccessible to consciousness (we can think of the people running NASA as the conscious part of the organization).” In this case, collectively held values created a climate within NASA that clouded its vision of the data and led to its endorsement of a fatally false belief.
An analogous fault in the organization of the church is that upper management is insulated, by policy, from rank-and-file feedback by the layer of bishops and stake presidents. This structural fault is compounded by the fact that those lower managers are restricted by policy to men who fit the Mormon mold, are chosen for their loyalty, and function as both judge and counselor. This is part of what makes the church's "consciousness" (to use Trivers's analogy) selectively deaf to members when they give critical feedback that the church is wrong about something.
Well, that and just plain arrogance, and authoritarian tradition and doctrine.