Maxwell Institute Mental Gymnastics at its Worst

_I have a question
_Emeritus
Posts: 9749
Joined: Fri Feb 13, 2015 8:01 am

Re: Maxwell Institute Mental Gymnastics at its Worst

Post by _I have a question »

Meadowchik wrote:People will find ways to accommodate their beliefs, especially when they don't have an alternative. So things like this make me wonder what link is missing for these individuals?


There are probably as many answers to this question as there are people who have ever believed falsehoods. Nonetheless, psychologists have shown that a relatively small set of cognitive biases or mental shortcuts can explain a lot about how false notions take root. One of the most agreed-upon ideas in the field of psychology is that people routinely use mental shortcuts to understand what happens around them. All kinds of things occur in the world around us, and we don't always have the time or energy to sit down and carefully examine all of them. So, we tend to use quick and largely unconscious rules-of-thumb to determine what we should believe—and these shortcuts sometimes steer us in the wrong direction.

https://www.psychologytoday.com/gb/blog ... ren-t-true
Emotional Reasoning

Whether we like it or not, all of us can be powerfully swayed by emotions. We'd like to think that our feelings are driven by logic and reason, particularly when it comes to our political beliefs. Unfortunately, this relationship is often reversed. Sometimes we end up using our reasoning ability to justify or defend a conclusion that we’ve already drawn based on our emotions. This phenomenon, called emotional reasoning, can lead us astray without our ever knowing. Psychiatrist Aaron T. Beck first noticed this in depressed patients. He observed that many patients drew obviously untrue conclusions about themselves based on how they felt, rather than the actual facts. "If I feel depressed,” one of his patients might say, "then there must be something objectively wrong with my job, my marriage, my children, or other parts of my life." But feelings are just feelings, even when they're powerful, and they can sometimes lie to us. Even in those of us who aren’t depressed, this tendency can affect our beliefs about virtually any emotionally charged topic, whether we’re talking about sexuality, religion, money, crime, or war. When we feel scared, angry, anxious, or even just uneasy about a topic, we can easily jump to the conclusion that the topic is somehow objectively bad or dangerous. Next time a topic makes you feel uncomfortable, that’s probably reason to keep an open mind, not to draw a conclusion.


Confirmation Bias

Once we have a belief, we tend to cling to it, even when it’s untrue. The confirmation bias is the tendency to seek out information that supports what we already believe. We do this in two important ways. First, we tend to surround ourselves with messages that confirm our pre-existing opinions. This is why, in the United States, conservatives tend to get their news from sources like Fox, whereas liberals tune into MSNBC. Second, we tend to ignore or discount messages that disprove our beliefs. If we’re sure that climate change is a hoax and someone shows us a research study disputing this belief, we might dismiss the study’s findings by saying that the researcher is obviously biased or corrupt. This protects us from having to change our beliefs. When our ideas are true, this probably isn’t such a bad thing. Unfortunately, it can also keep us firmly believing things that are false.
“When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.” (Matthew Syed, 'Black Box Thinking')
_Meadowchik
_Emeritus
Posts: 1900
Joined: Tue Apr 18, 2017 1:00 am

Re: Maxwell Institute Mental Gymnastics at its Worst

Post by _Meadowchik »

I have a question wrote:
Meadowchik wrote:People will find ways to accommodate their beliefs, especially when they don't have an alternative. So things like this make me wonder what link is missing for these individuals?


There are probably as many answers to this question as there are people who have ever believed falsehoods. Nonetheless, psychologists have shown that a relatively small set of cognitive biases or mental shortcuts can explain a lot about how false notions take root. One of the most agreed-upon ideas in the field of psychology is that people routinely use mental shortcuts to understand what happens around them. All kinds of things occur in the world around us, and we don't always have the time or energy to sit down and carefully examine all of them. So, we tend to use quick and largely unconscious rules-of-thumb to determine what we should believe—and these shortcuts sometimes steer us in the wrong direction.

https://www.psychologytoday.com/gb/blog ... ren-t-true
Emotional Reasoning

Whether we like it or not, all of us can be powerfully swayed by emotions. We'd like to think that our feelings are driven by logic and reason, particularly when it comes to our political beliefs. Unfortunately, this relationship is often reversed. Sometimes we end up using our reasoning ability to justify or defend a conclusion that we’ve already drawn based on our emotions. This phenomenon, called emotional reasoning, can lead us astray without our ever knowing. Psychiatrist Aaron T. Beck first noticed this in depressed patients. He observed that many patients drew obviously untrue conclusions about themselves based on how they felt, rather than the actual facts. "If I feel depressed,” one of his patients might say, "then there must be something objectively wrong with my job, my marriage, my children, or other parts of my life." But feelings are just feelings, even when they're powerful, and they can sometimes lie to us. Even in those of us who aren’t depressed, this tendency can affect our beliefs about virtually any emotionally charged topic, whether we’re talking about sexuality, religion, money, crime, or war. When we feel scared, angry, anxious, or even just uneasy about a topic, we can easily jump to the conclusion that the topic is somehow objectively bad or dangerous. Next time a topic makes you feel uncomfortable, that’s probably reason to keep an open mind, not to draw a conclusion.


Confirmation Bias

Once we have a belief, we tend to cling to it, even when it’s untrue. The confirmation bias is the tendency to seek out information that supports what we already believe. We do this in two important ways. First, we tend to surround ourselves with messages that confirm our pre-existing opinions. This is why, in the United States, conservatives tend to get their news from sources like Fox, whereas liberals tune into MSNBC. Second, we tend to ignore or discount messages that disprove our beliefs. If we’re sure that climate change is a hoax and someone shows us a research study disputing this belief, we might dismiss the study’s findings by saying that the researcher is obviously biased or corrupt. This protects us from having to change our beliefs. When our ideas are true, this probably isn’t such a bad thing. Unfortunately, it can also keep us firmly believing things that are false.


Thanks IHAQ. These are things that "go wrong," or that help us maintain unfounded beliefs and prevent us from shedding them. So what *needs* to go right to make those tendencies unnecessary? Or, what can we do to minimize our dependence on unfounded belief? I think I have answered some of that for myself (I actually posted a pared-down belief system in the Crying Day thread), but I wonder what a general answer might look like.
_I have a question
_Emeritus
Posts: 9749
Joined: Fri Feb 13, 2015 8:01 am

Re: Maxwell Institute Mental Gymnastics at its Worst

Post by _I have a question »

Meadowchik wrote:Thanks IHAQ. These are things that "go wrong," or that help us maintain unfounded beliefs and prevent us from shedding them. So what *needs* to go right to make those tendencies unnecessary? Or, what can we do to minimize our dependence on unfounded belief? I think I have answered some of that for myself (I actually posted a pared-down belief system in the Crying Day thread), but I wonder what a general answer might look like.


In my opinion... people need a psychological shift to a place where they don’t doubt their doubts when examining an existing belief system. For believing Mormons, that only happens once they’ve made a mental jump to a position where they want to know if the Church is not true. Only at that point will they honestly engage with contrary evidence and facts. The same is also true in reverse: if somebody believes the Church isn’t true, are they really considering any evidence presented in favour of the Church’s truth claims?
“When we are confronted with evidence that challenges our deeply held beliefs we are more likely to reframe the evidence than we are to alter our beliefs. We simply invent new reasons, new justifications, new explanations. Sometimes we ignore the evidence altogether.” (Matthew Syed, 'Black Box Thinking')
_Craig Paxton
_Emeritus
Posts: 2389
Joined: Fri Jul 06, 2012 8:28 pm

Re: Maxwell Institute Mental Gymnastics at its Worst

Post by _Craig Paxton »

I have a question wrote:
Meadowchik wrote:Thanks IHAQ. These are things that "go wrong," or that help us maintain unfounded beliefs and prevent us from shedding them. So what *needs* to go right to make those tendencies unnecessary? Or, what can we do to minimize our dependence on unfounded belief? I think I have answered some of that for myself (I actually posted a pared-down belief system in the Crying Day thread), but I wonder what a general answer might look like.


In my opinion... people need a psychological shift to a place where they don’t doubt their doubts when examining an existing belief system. For believing Mormons, that only happens once they’ve made a mental jump to a position where they want to know if the Church is not true. Only at that point will they honestly engage with contrary evidence and facts. The same is also true in reverse: if somebody believes the Church isn’t true, are they really considering any evidence presented in favour of the Church’s truth claims?


While I'm sure I'm guilty of reverse confirmation bias on some level, I take comfort in the very long list of things that would have to be explained, fixed, altered, or reimagined to make Mormonism what it claims to be. In other words, I'm comfortable in my conclusion that Mormonism is a religion built on fiction, distortion, misrepresentation, and outright fraudulent truth claims. I'm still willing, however, for some believer to prove me wrong and to show me that my biases are misplaced.


The entire world, as I believe it to be, would have to be completely turned upside down, turned inside out, and make no sense at all for Mormonism to be what it claims to be. This is why I no longer believe it is built on a foundation of truth, or that it can actually fulfill its promise of the reward of eternal life.
"...The official doctrine of the LDS Church is a Global Flood" - BCSpace

"...What many people call sin is not sin." - Joseph Smith

"Reality is that which, when you stop believing in it, doesn't go away" - Phillip K. Dick

“The meaning of life is that it ends" - Franz Kafka
_Meadowchik
_Emeritus
Posts: 1900
Joined: Tue Apr 18, 2017 1:00 am

Re: Maxwell Institute Mental Gymnastics at its Worst

Post by _Meadowchik »

I have a question wrote:
In my opinion... people need a psychological shift to a place where they don’t doubt their doubts when examining an existing belief system. For believing Mormons, that only happens once they’ve made a mental jump to a position where they want to know if the Church is not true. Only at that point will they honestly engage with contrary evidence and facts. The same is also true in reverse: if somebody believes the Church isn’t true, are they really considering any evidence presented in favour of the Church’s truth claims?


I think the reference to mental states and shifts is very helpful! That psychological place is one where a person feels safer than anywhere else. So they will not shift from belief to unbelief, or from unbelief to belief, until they perceive the destination as safer.

So, for example, if a man develops his Mormon testimony over the years by connecting its meaning to verifiable fact, logic, and morals, and eventually starts navigating the sometimes contradictory nature of the church with this joint basis, he may inadvertently build a place that serves altogether as an alternative. So the more he trusts that basis, or the less he trusts the church alone, or both, the safer the alternative becomes on its own, until eventually it is safer than the church.

Another example might be a woman who perceives danger in the world at large and relative safety in the church, preferring its benevolent sexism over the hostile sexism she perceives from the world outside. There are pockets of the church that are more hostile, and there are certainly places in the world at large that are more benevolent, but her personal experience will shape her perception of the church as safe for her as a woman.

Of course, these examples can be switched around to include other criteria that establish safe alternatives, and other identities that feel safer or less safe in the church.

In that light, it's hard to imagine the Maxwell Institute doing anything other than its utmost to legitimize Mormonism. That's its safe space.

(As for those who evaluate the church and decide to join, I think similar processes occur, where they perceive the church as safer than where they are.)