2 MINUTE READ
Even though you probably don’t give it much thought, we are naturally more drawn to, and more likely to befriend, people who are similar to us - and, conversely, less likely to extend much interest to people who aren’t.
This has serious real-world implications for people in organizational environments, and there is a stickiness to this orientation that transcends even the most ambitious diversity and inclusion efforts.
The origin of this exclusivity is anthropological: Back before weapons facilitated human dominance over animal predators, our survival depended heavily on familiarity. Safety and comfort were associated with whoever comprised our tribe, and we subsequently developed an attraction to the people within our in-group, many of whom were related to us.
This sense of kinship extends beyond just our appearance. We tend to “click” with people with whom we have identified commonalities, for a number of reasons, including the following:
In addition to the exclusivity our preferences foster, surrounding ourselves with people who are similar to us serves as a continual reinforcement of who we are and what we think, which creates comfort and camaraderie. Unfortunately, however, it also limits our exposure to different ideas and beliefs, which keeps us in a self-serving feedback loop of sorts and can – at the team and organizational level – stymie innovation and creative problem-solving.
Journal of Personality and Social Psychology | Similarity in Relationships as Niche Construction: Choice, Stability, and Influence Within Dyads in a Free Choice Environment (2016)
Journal of Social and Personal Relationships | Is Actual Similarity Necessary for Attraction? A Meta-Analysis of Actual and Perceived Similarity (2008)
3 MINUTE READ
A blocker to both personal and professional effectiveness is a blend of two biases we all have.
Biases are not inherently bad; we have them because our brains must - very quickly - assess, prioritize, and organize the non-stop barrage of information they receive. While our brains can process approximately 11 million bits of information every second, our conscious minds can only handle about 60 bits. Because so much of this filtering happens at the unconscious level, there is a lot of processing going on that we’re not aware of.
While there are likely innumerable cognitive biases, 188 have been mapped (see graphic below). The two we’ll focus on today are, especially in combination, particularly troublesome to navigate.
The first bias to consider is called Superiority Illusion, and we all have it (although your bias may lead you to believe that you don't). We generally think we’re smarter, more attractive, have more engaging personalities, and are nicer than other people perceive us to be. This gap is the result of dopamine, an important neurotransmitter that helps us stay motivated, focus, experience pleasure, and find things interesting.
Superiority Illusion bias likely factors into the sometimes stunningly tragic auditions on TV talent shows, frequently causes people to believe they are better equipped to handle a gun than they are, and is probably what motivates your co-worker of middling intelligence to pontificate ad nauseam in team meetings.
It keeps us happy and feeling good about ourselves, and is usually only a minor irritant to others - until it meets and marries the second bias we’re focusing on today: Confirmation bias.
Confirmation bias, simply put, is our tendency to affirm information that aligns with our existing beliefs, opinions, and theories and to dismiss information that refutes them.
The results of Confirmation bias, combined with Superiority Illusion bias, are perhaps best exemplified online. Particularly on social media, where seemingly everyone is an expert, we are quick to state our opinions as fact, and can often be easily provoked into belittling, disparaging, and sometimes insulting anyone or anything refuting our thinking.
Now that we’re aware of this bias combo, what can we do about it?
The best thing we can do is to approach people, places, experiences, and our beliefs from a place of curiosity rather than certainty. As Microsoft CEO Satya Nadella says, “the learn-it-all does better than the know-it-all.” When he assumed leadership of Microsoft in 2014, he observed a culture in which everyone in every meeting felt they had to prove they were the smartest one in the room. This type of thinking stymies learning, collaboration, and creative thinking/problem-solving. It’s not good for businesses and it’s not good for us as humans - we are designed to grow, learn, and evolve.
You can also fight against the tendency we all have to be driven by our egos. Be open. Admit when you’re wrong. Pull yourself out of your perspective and hover at 30,000 feet for a while. Does the view look different?
And lastly, pause before you react. This one is particularly challenging for me, as I am well-wired (not hard-wired – I am making progress here) toward reactivity and impulsivity. The work is in the pause. It’s giving time and space for the information to travel from the limbic part of our brains (where our fight or flight response system lives) to our pre-frontal cortex, which enables us to be thoughtful and responsive.
Changing our largely automated systems and processes is hard. I know this phrase is on pretty much every other Instagram post, but it's true that what works is going for progress, not perfection. Be nice to yourself as you start moving toward more objectivity. And you'll know you're making progress when you start noticing that you're being nicer to others. Or at least thinking about it.
Scientific Reports | The Capacity of Cognitive Control Estimated From a Perceptual Decision Making Task (2016)
Scientific American | The Superiority Illusion: Where Everyone is Above Average
NPR | Understanding Unconscious Bias
WebMD | What is Dopamine
Fast Company | Satya Nadella: The C In CEO Stands For Culture (2017)
LinkedIn | Satya Nadella on Growth Mindsets (Direct quote)
Graphic: Visual Capitalist
2 MINUTE READ
As mentioned previously, we humans are constantly managing our brain’s tendency to perceive incoming stimuli as more threatening than warranted.
Because our brains haven’t evolved sufficiently from our early ancestors’ days of co-existing with apex predators and ongoing tribal warfare, they are ill-equipped to discern between threats we experience directly and those we merely witness.
This reality, combined with the advent of an endless stream of competing 24/7 news sources over the past 30 years, can have chronic negative impacts on our mental, emotional, and physical health. And because the consumption of news is so prevalent and accepted as a societal norm, we’re likely not really giving any of this much thought.
Here’s the thing, though: Every time your brain perceives something it sees or hears as a threat (this can be a social media post you find offensive, any form of “news” or what passes for news, or even just a headline at a quick glance), it automatically activates a threat response (typically fight or flight) and dumps a surge of hormones, including adrenaline and cortisol, into your body.
Adrenaline increases your heart rate, your blood pressure, and energy levels. Cortisol releases sugars (glucose) into your blood stream. These natural defense mechanisms better equip us to either fight or flee, but we generally don’t need these physiological changes to respond to the stimuli.
Over time, repeated threat response activation takes a toll on our body and leads to a host of issues related to chronic stress including obesity, chronic headaches, muscle strains/pain, panic disorders, depression, immune disorders…this is just a small sample of possibilities, but you get the idea.
With this in mind, I invite you to challenge yourself this week to be more aware of your news and social media scrolling/consumption. Try making a commitment to only check each for 5-10 minutes per day instead of regularly hopping on various sites for distraction or out of boredom. Be kind to yourself, however; our brains have been conditioned over the past 10-20 years to seek the newness/novelty of mindless scrolling. You're likely to feel uncomfortable and deprived while you detox from this ubiquitous habit.
To get started, think of 2-3 other things you can substitute for scrolling: Have a book nearby and read a page or two, stretch, meditate, make a quick journal update, go for a walk outside/get some sun…anything really to break the tendency to give our brains the novelty “fix” they crave from news sites and our social media feeds.
And, if it’s killing you to go cold turkey, commit to a week of getting your news from sites like AP and Reuters, which consistently rank high in reliability and low in bias. These sites are much less likely to trigger a threat response while still allowing you to stay up to date on what’s happening around the world.
Most important, however: Make it a point to check in with yourself and how you’re responding to incoming stimuli. Are you setting yourself and your life up for peace or conflict? For calm or tension? This extends beyond the screen to your family and friendships, how you interact on the road and on planes/public transport, and your customer service interactions online, on the phone, and in person.
Ultimately your wellbeing reflects who you are and how you show up in your life, so choose. Choose judiciously. Choose when to engage and when to side-step. And critically, choose what you allow into your day, into your awareness, and into your precious, always-protective brain.
Towards Data Science | Who is the Least Biased News Source (Pryor, 2020)
Harvard Health Publishing | Understanding the Stress Response (2020)
American Psychological Association | Stress Effects on the Body (2018)
Mayo Clinic | Chronic Stress Puts Your Health at Risk
2 MINUTE READ
While scientists have a significant amount of evidence in support of human evolution, the leap to self-consciousness has yet to be explained. Obviously, this missing link is intriguing, and my interest was piqued last week while watching the Netflix documentary, Fantastic Fungi, which presents the hypothesis that psilocybin may have played a role in our brain development.
Psilocybin is found in more than 180 species of mushrooms, which first evolved on the planet approximately 800 million years ago. When ingested, the body converts psilocybin into psilocin, which has psychoactive properties that increase activity in areas of the brain associated with dreaming and our ancient emotional system. It is also known to expand conscious awareness, i.e., the “breadth of associations made by the brain and the ease by which they are visited is enhanced.” All of which makes for a compelling argument that sustained ingestion could have resulted in permanent changes to our ancestral brains over time.
Please note that I am not advocating for this hypothesis, which would be difficult to validate anyway. I also don’t think it’s sound reasoning to suggest that something as complex as the evolutionary path to higher thinking can be reduced to one primordial moment or movement.
I’m only stating here that it’s an interesting idea to contemplate; one that potentially aligns with notable gains in our cognitive ability approximately 200,000 years ago.
The thinking behind this idea is that early humans, migrating around the African plains, may have, over a significant period of time, consumed mushrooms containing psilocybin. And that this dietary predilection could have contributed to the expansion of their cognitive capacity to the point of conscious awareness.
Another perspective -
A very different theory (and one given much more credence by the scientific community) proposes that our higher thinking evolved as the brain sought to process incoming signals and was forced to prioritize certain signals over others (this is a very simplistic explanation, obviously; you can read about Attention Schema Theory in detail here). This theory suggests that consciousness evolved gradually (over, say, half a billion years) and is present in an array of non-human species. It is of particular interest to scientists because, if proven true, advancements could lead to infusing future AI technology with self-awareness.
Journal of Archaeological Method and Theory | The Origins of Inebriation: Archaeological Evidence of the Consumption of Fermented Beverages and Drugs in Prehistoric Eurasia
Scientific American | Does Self-Awareness Require a Complex Brain?
Progress in Neurobiology | Control and the Attention Schema Theory of Consciousness
Drug Science | Psilocybin
phys.org | First Mushroom Appeared Earlier Than Previously Thought
The Conversation | Magic Mushrooms Expand Your Mind and Amplify Your Brain's Dreaming Areas (Direct quote attribution)
2 MINUTE READ
To get started, let’s take a quick, extremely high-level tour of our extraordinary and wildly complex brains.
The largest part of our brain, the cerebrum, is composed of four lobes: Frontal, parietal, temporal, and occipital. The outer layer of the cerebrum is made up of gray matter and is called the cerebral cortex. It houses between 14 and 16 billion neurons and is involved in things such as our intelligence, personality, ability to plan and organize, and other sensory and motor functions.
Going through all four lobes in detail is likely to drive you to leave this site and never return, so for now let’s focus on the frontal and temporal lobes, as those are the most relevant parts of the cerebrum for our purposes.
The frontal lobe, the largest of the four, houses an area called the prefrontal cortex. The prefrontal cortex is significant in that it contributes (though not exclusively) to our executive functioning: our ability to reason, focus, plan for the future, anticipate events, predict consequences, coordinate and adjust complex behaviors, and - importantly - manage our emotional reactions, a.k.a. our impulse control.
Within the temporal lobe, specifically the medial temporal lobe, is a region of the brain called the amygdala. It is a component of the limbic system and is primarily associated with emotional processing.
Unlike the prefrontal cortex, where emotional regulation is seated, the amygdala is on high alert for danger. It will automatically activate a stress response (fight, flight, freeze, or fawn) when it perceives danger, whether real, exaggerated, or non-existent.
This automated stress response was critical to our survival when the threat of physical harm was real and prevalent. The issue for us today, however, is that we don’t spend most of our time in actually dangerous situations - but our brain doesn’t realize that, so it freaks out all the time and causes us to react in all sorts of ways we’re often not proud of once we calm the heck down.
More, much more, on that to come as increasing our self-regulation is critical to our success as trusted, well-liked, happy, and healthy humans.
2 MINUTE READ
To begin, what is neurolifeworks? Why are you doing it? And who are you, again?
All pertinent questions. Let’s dive right in.
What is neurolifeworks?
The purpose of neurolifeworks is to connect you with insights and information to help you:
Why are you doing it?
To help you better understand the brain science informing who we are, why we think the way we do, and why we do the things we do. So much of neuroscience is currently presented in dense, complex packaging that is a barrier for people who are busy, constantly juggling competing priorities, and simply not inclined to seek out and read lengthy treatises on anything - especially not on brain science. That said, it’s fascinating and drives very nearly everything we do, so the intent here is to present practical elements of it in easy-to-digest, engaging, bite-sized chunks of awareness-building.
And who are you, again?
While I more formally introduce myself here, I work with clients (from start-ups to Fortune 500 multinationals) all over the world on creating and sustaining meaningful change. Through my work, I’ve learned a lot about why most personal and organizational change doesn’t work. Conversely, I’ve also learned what works, and really, much of it comes down to - you got it - our brains, and working with vs. against them.