Sensibly Speaking Podcast #163: Beating Tribal Thinking



Show notes:

Facebook post: Something to never ever forget is how strongly people’s biases, prejudices and tribalism actually alter their perceptions. Memories are formed based on how someone thinks they saw or heard something (as well as by actual physical limitations of their senses), not based on what really happened. When there is general agreement about an event from a number of people, one can feel safer that the collective recollection is probably accurate, but there are MANY examples in history of mass hysteria, and never forget how easily mobs are riled up around complete lies.

This is more important and pervasive than I think is generally understood. People’s actual lives are usually very different from what they think they lived. This is something I keep in mind often when recalling my own experiences and when interviewing others. But it’s more important when it comes to how and why we all form judgments.

Social media is the epitome of this. Free rein is given to complete delusion, and people run with it, lining up dutifully on opposing sides and all too willing to “destroy” each other when someone doesn’t agree with them completely. I worry about our future given how common this is becoming. I hope my worries are unfounded.

Tribalism, first of all, is kind of interesting all by itself.

From Wikipedia: Tribalism is the state of being organized by, or advocating for, tribes or tribal lifestyles. Human evolution has primarily occurred in small groups, as opposed to mass societies, and humans naturally maintain a social network. In popular culture, tribalism may also refer to a way of thinking or behaving in which people are loyal to their social group above all else, or, derogatorily, a type of discrimination or animosity based upon group differences.

Tribalism, or at least the desire to find similarities based on common characteristics, is littered throughout our language. There are many phrases which show this:

Like two peas in a pod

Thick as thieves

Birds of a feather flock together

Cast in the same mold

On the same wavelength

One of the same kind

A mirror image

A meeting of minds

Be of like mind

Two halves of a whole

From the same school (or factory)

Tarred with the same brush

Cut from the same cloth

The brother I never had

My brother from another mother

Like two drops of water

Some tribes can be located in geographically proximate areas, like villages or bands, though telecommunications enables groups of people to form digital tribes using tools like social networking websites.

Loyalty to the tribe is the highest moral principle. This is why when you rat out the tribe or anyone in it, no matter for what reason, you are the bad guy. If you bring disgrace to the tribe, you have committed the ultimate sin because the tribe represents all that is good and right and ultimately true. This is what justifies mass slaughter of people who are not in the tribe, whether that slaughter is real or just figurative.

There’s a group called the Cultural Cognition Project which is actually studying this at Yale. Cultural cognition refers to the tendency of individuals to conform their beliefs about disputed matters of fact (e.g., whether humans are causing global warming; whether the death penalty deters murder; whether gun control makes society more safe or less) to values that define their cultural identities.

The fact that this phenomenon even needs to be studied shows beyond any doubt that people do not form ideas or conclusions based on facts so much as on their affiliations and perceived identity. If they identify as a blue smurf, then of course they will live in mushroom houses, because there is no other way to live. And here’s the catch: they will then tailor all of their thinking, even if just subconsciously, to fit that conclusion. The conclusion comes first, and any facts they gather or listen to are only facts if they support that conclusion. Otherwise, they can be dismissed as biased, “fake news,” “alternative facts,” or whatever other toss-off nonsense you want to use to dismiss objective reality.

Cults, of course, are just the most intense version of tribalism. A person buys into the premise of the cult or tribe and usually experiences some kind of event which they interpret as a spiritual or profound personal revelation, and that seals the deal. They are then convinced that the cult or tribe is the bearer of all truth, or at least the most important truths there are to know. They don’t necessarily have to think that the cult has ALL the answers, but certainly the important answers that make life make sense.

Now here’s the real kicker: there’s no getting out of tribalism. We are social creatures. We are evolutionarily wired to live in tribes. We feel isolated, alone, and outcast, and our self-esteem or even our sanity can go to hell if we are not accepted in some group, even if it’s just a book club or other small social circle. Let me quote from a 2014 Frontline article which cited research done on this subject.

“In one notorious study from the 1950s, University of Wisconsin psychologist Harry Harlow placed rhesus monkeys inside a custom-designed solitary chamber nicknamed ‘the pit of despair.’ Shaped like an inverted pyramid, the chamber had slippery sides that made climbing out all but impossible. After a day or two, Harlow wrote, ‘most subjects typically assume a hunched position in a corner of the bottom of the apparatus. One might presume at this point that they find their situation to be hopeless.’ Harlow also found that monkeys kept in isolation wound up ‘profoundly disturbed, given to staring blankly and rocking in place for long periods, circling their cages repetitively, and mutilating themselves.’ Most readjusted eventually, but not those that had been caged the longest. ‘Twelve months of isolation almost obliterated the animals socially,’ Harlow found.

“Similar studies on human subjects are rare — in part because most modern universities would never consent to them — but in 1951 researchers at McGill University paid a group of male graduate students to stay in small chambers equipped with only a bed for an experiment on sensory deprivation. They could leave to use the bathroom, but that’s all. They wore goggles and earphones to limit their sense of sight and hearing, and gloves to limit their sense of touch. The plan was to observe students for six weeks, but not one lasted more than seven days. Nearly every student lost the ability ‘to think clearly about anything for any length of time,’ while several others began to suffer hallucinations. ‘One man could see nothing but dogs,’ wrote one of the study’s collaborators, ‘another nothing but eyeglasses of various types, and so on.'”

“Stuart Grassian, a board-certified psychiatrist and a former faculty member at Harvard Medical School, has interviewed hundreds of prisoners in solitary confinement. In one study, he found that roughly a third of solitary inmates were “actively psychotic and/or acutely suicidal.” Grassian has since concluded that solitary can cause a specific psychiatric syndrome, characterized by hallucinations; panic attacks; overt paranoia; diminished impulse control; hypersensitivity to external stimuli; and difficulties with thinking, concentration and memory. Some inmates lose the ability to maintain a state of alertness, while others develop crippling obsessions.”

Obviously very few of us live lives of such isolation, but I think the point is clear. There are parts of our brain or mind that demand we interact with others and if we don’t, our mind starts to break down. So the solution to extreme tribalism, if there is one, is NOT to go off and be an island and think you’re going to make your own way without the advice or help of other people. That simply isn’t an option.

I’ve been diving into what are called cognitive mechanisms lately because these are very revealing in terms of how we actually think, versus how we think we think or how we would like to think. Processing information, forming opinions or judgments and coming to conclusions are all regulated by how we think. And if our thinking is screwy, then of course our conclusions are going to be screwy and that means we are going to get into all kinds of trouble with other people. So it is definitely in our best interests to understand what is going on 2-3 inches behind our foreheads.

The first thing to overcome is the natural instinct, when faced with conflicting information, to put up our defenses and fight. It’s actually kind of funny because so much of our social discourse is something we now treat like a battle. But we are often our own worst enemy in these fights because while we like to think we’re armed with facts and opinions and authorities that are always right, the truth is that we are not always right and sometimes we are incredibly wrong.

If winning arguments is your goal, that’s easy. You can engage with anyone at any time and whenever you feel like it, you can declare yourself a winner and walk away. People do this all the time and they haven’t really accomplished much of anything except riling themselves up and pissing off the person they are arguing with. It doesn’t feel like much of a victory because it isn’t one.

Maybe you’re the kind of person who can do that and just walk away and not give it another thought, but for me, I hate it when I leave a conversation or disagreement unsettled. I feel anxious, upset, angry, and sometimes the adrenaline and testosterone have flooded my system so much I want to go find a punching bag. My blood pressure can feel like it’s through the roof and sometimes my heart even starts going because I get so emotionally invested in the conversation.

Yet on the flip side, when things go well and I’ve managed to convince the person I’m arguing with that we actually can agree on something and we find common ground, all of that goes away. Sometimes that happens when I show them something they didn’t know or they show me something I didn’t know or we realize we maybe both had something to learn and when we’re done talking, we’re both better people for the experience. Wouldn’t that be a more ideal way to deal with this? I think so. In order to truly win, it would seem to me that you’d want to walk away with something approximating the truth. So what gets in our way? Why are we our own worst enemy sometimes? Let’s talk about two of these cognitive mechanisms I was referring to earlier.

Cognitive dissonance – this term describes the mental discomfort experienced by a person who simultaneously holds two or more contradictory beliefs, ideas or values. This discomfort is triggered by a situation in which the belief of a person clashes with new evidence perceived by that person.

One way we deal with cognitive dissonance is motivated reasoning. I don’t particularly think this is a healthy way of handling things, but people do it all day, every day. Motivated reasoning is basically emotion-based decision-making, where the thing that makes a person feel good or right is the thing the person believes must be true, regardless of any objective evidence to the contrary. In other words, “rather than search rationally for information that either confirms or disconfirms a particular belief, people actually seek out information that confirms what they already believe.” The bottom line is that the thing making their decisions for them is not the rational validity of the information but how they feel about it.

This of course is where a great deal of denial comes from.

Now here’s a funny and interesting way of sidestepping this and tricking our minds into doing our thinking a bit better. A lot of the initial research on motivated reasoning was done at Princeton University by Ziva Kunda and she wrote that when accuracy is the goal instead of feeling good or feeling right, then we automatically will be a lot more careful in our thinking. If we have to justify our conclusions to other people or if we are told before forming our conclusions that the information is going to be used in some important way that is going to matter, then we will dig deeper when we are doing our thinking and evaluating. In other words, if we raise the stakes or consequences, then we will take a little more time to make sure that our conclusions don’t just make us feel good, but that we actually can back them up with real facts that will stand up to scrutiny.

Now obviously there are lots of things we do in life where accuracy doesn’t really matter that much, or we can fake it until we make it, or we can fudge some things and still come out okay. Cooking, for example, is not an exact science. We can throw a few things into the cake mix that we think might work and odds are no one is going to lose their minds over it. But what if you were making a cake for Gordon Ramsay or Guy Fieri? Would you be so loosey-goosey? You see how that works? When I said that, I’m pretty sure most of you listening thought “Oh no, I’d definitely be more careful about following that recipe if I was cooking for them.” That’s the difference I’m talking about between being emotional in your thinking versus accurate.

So when does it matter? Well, priorities are relative. What’s important to me here in Denver doing my YouTube channel is probably different from what would be important to a corn farmer in Iowa or a concrete worker in Dubai. Yet we all use the same cognitive mechanisms in our thinking. And we should all probably strive to be more accurate in our thinking when we are dealing with matters of life and death, or matters that are going to have long-term effects on our personal lives.

Ultimately, it’s up to us what we think matters and where we invest our time and mental energy. I happen to think that subjects like politics, news and religion should be given deep thought and we should take care to be accurate and goal-oriented when we dive into those topics. I think the laziest thing anyone can do is say “Well, I was raised that way so that’s how it is.” or “God said it, so that’s the end of the conversation.” I mean, talk about someone who isn’t using their brain. God said it? Where? I’d like to hear God speak. Where did you hear him speak? Anyway, not to get into religion-bashing here, I’m just pointing out that if the reason you think God is real is because someone told you that when you were three years old and you’ve never questioned it since, it would definitely be in your best interest to think a bit more about that.

I wish we had easy answers to all of this trouble we have with our thinking, but we don’t. The research on all of this is still very new and very much in development. These mechanisms and phrases like “cognitive dissonance” and “motivated reasoning” may well develop in whole new directions over the coming years and I hope they do. I hope we learn a lot more about all of this by coordinating what we have learned in psychology, sociology and neurology. Maybe we’ll eventually figure out what the ego and consciousness really are all about and wouldn’t that be something? In the meantime, we struggle along and do the best we can. I hope that the information I’ve given here helps you to make better and more informed and rational decisions in your life.

Thanks for watching.
