Intentional Insights blog, brought to you by History News Network. Mon, 19 Aug 2019 17:50:02 +0000

Is There Anything We Can Do to Stop Politicians from Lying?

Dr. Gleb Tsipursky is the author of the forthcoming The Truth-Seeker’s Handbook: A Science-Based Guide and is currently writing The Alternative to Alternative Facts: Fighting Post-Truth Politics with Behavioral Science. He is a professor of history at Ohio State University and President of the nonprofit Intentional Insights. This article is part of the author’s broader work on promoting rational thinking and wise decision-making. To learn more about The Truth-Seeker’s Handbook and be notified of its publication, click on this link. He blogs here at Intentional Insights on HNN.

We are in unprecedented historical territory when a Senator calls the President of his own political party “an utterly untruthful President,” as Bob Corker did in regard to Donald Trump, and when another Senator from the same party, Jeff Flake, describes the President as having a “flagrant disregard of truth.” Consider the recent example of Trump making false statements about his phone conversation with a Gold Star widow, and then doubling and tripling down on them. For a more policy-oriented example, recall how Donald Trump’s rally speech in Phoenix on August 22 was full of falsehoods. He gave a revisionist and false history of his reaction to the Charlottesville violence to make himself look better, made false statements about media reporting and misled the audience over his economic achievements. Trump’s actions point to the normalization of post-truth politics, when appeals to personal beliefs and emotion win out over objective facts. To avoid this normalization, we need to borrow the successful tactics of the environmental movement.

Trump’s behavior – the speech and the attacks on the Gold Star widow – represents part of a broader pattern: Of Trump’s statements fact-checked by Politifact, an astounding 49 percent are false. By comparison, his Democratic opponent in the U.S. presidential election, Hillary Clinton, has 12 percent of her fact checked statements rated false; 14 percent of Republican Speaker of the House Paul Ryan’s are.

Despite Trump’s extremely high rate of deception, many still believe him. As an example, 44 percent of those polled believed his falsehoods about Obama wiretapping Trump Tower during the 2016 election campaign. Unfortunately, only 29 percent of the public, and a mere 12 percent of Trump supporters, trust fact-checkers.

Moreover, research on debunking falsehoods shows that such debunking sometimes backfires. In a number of studies of this so-called backfire effect, scientists have shown that people believe falsehoods even more strongly after being presented with contradictory evidence. This situation enables Trump to pollute our politics with deception, destroying trust in our democratic political system.

Political and social science research summarized in the 2003 Trust and Governance, edited by Valerie Braithwaite and Margaret Levi, shows trust is vital for healthy democracies. Citizens in a democracy have a basic expectation of their public officials being trustworthy, in their words and actions. In return, citizens comply with laws, pay taxes and cooperate with other government initiatives. By comparison to a democracy, an autocratic state bears a much higher resource burden of policing to make its citizens comply with its laws. In his 2002 work, Trust and Trustworthiness, political scientist Russell Hardin also shows the vital role of trust in creating and cultivating civil society in a democracy. When political leaders act in ways that destroy trust—as Trump is doing through misleading statements and outright lies—people will increasingly stop complying with laws, paying taxes and engaging in civil society. Trump’s actions are fatally undermining the health of our democracy.

His behavior falls within the sphere of what behavioral scientists term the “tragedy of the commons,” following a famous 1968 article in Science by Garrett Hardin. Hardin demonstrated that in areas where a group of people share a common resource—the commons—without any controls on the use of this resource, individual self-interest may often lead to disaster for all involved. Because each individual may well have a strong interest in using more of the common resource than is their fair share, all suffer the consequences of the depletion of that resource. Environmental pollution is a clear example, where the common resource of clean air and water is abused by polluters who destroy this shared resource.

Trump is abusing the commons of trust in our political environment, and he is setting a clear example for other politicians to follow through his successful tactics. West Virginia Attorney General Patrick Morrisey and Kentucky Gov. Matt Bevin are adopting the post-truth tactics of condemning media as “fake news” whenever the media report stories unfavorable to them. As an example, Bevin personally attacked a journalist who reported on Bevin’s purchase of a mansion for about a million dollars under market value from a hedge fund manager, which some suggested might be a bribe in return for under-the-table political favors. Such trickle-down of post-truth politics points to its normalization within our political system, thus enabling corruption and undermining our democracy.

How do we stop this pollution of truth? The modern environmental movement has been dealing successfully with a tragedy of the commons: industrial pollution. The historical consensus is that the launch of the modern environmental movement came with the publication of Rachel Carson’s Silent Spring in 1962. This and other similar publications brought about an awakening of the public to the dangers posed by environmental pollution to individual and community health, and led to the coordinated movement of activists—Republican and Democrat—fighting for the environment.

As a result, environmental problems drew much wider public attention. Consider the 1969 fire on the Cuyahoga River in Cleveland. The river had a long history of pollution, and in June 1969 oil-covered debris caught fire, causing $100,000 worth of damage to two railroad bridges. This event drew national attention and became a major story in Time. Cleveland’s mayor testified before Congress to urge greater attention to pollution by the federal government. Notably, the Cuyahoga River had experienced many other fires due to industrial pollution, such as one in 1952 that resulted in over $1.3 million in damage—more than 10 times the damage done in 1969. That much bigger and more destructive fire, however, inspired little national attention, and few efforts to change the situation, compared with the conflagration of 1969.

The marked difference in the reaction to the two fires stemmed from the launch of the modern environmental movement, which combined the coordinated actions of activists to seek out and highlight these problems with heightened public awareness of the dangers of environmental pollution. We can do the same for the pollution of truth by launching a nonpartisan pro-truth movement. Such a movement would require a coordinated group of activists holding public figures accountable for deception as well as publicly highlighting the danger that post-truth politics poses to the health of our democracy.

Whereas the 1960s required the publication of books to raise awareness and launch a movement, our contemporary digital environment gives us easier tools. One example is the Pro-Truth Pledge project, which allows private citizens and public figures to take a pledge committing them to 12 behaviors that research suggests are most likely to lead to a truth-oriented society. This site both offers a coordination venue for those determined to roll back the tide of lies and protect our democracy, and raises awareness of the dangers of political deception. Hundreds of private citizens across the U.S. and many dozens of public figures have already taken the pledge, including household names such as Peter Singer, Jonathan Haidt, and Steven Pinker, as well as over 50 Democratic and Republican politicians.

By launching a pro-truth movement uniting people across the political divide, we can avoid the normalization of post-truth politics. Doing so will help ensure that the kind of falsehoods uttered by Trump get a response equivalent to the 1969 fire on the Cuyahoga river, rather than the 1952 one. Whether the pro-truth movement takes off depends on how many people choose to take the pledge and join the effort to protect the health of our democracy from the pollution of truth.

The Behavioral Science of Political Deception in the 2016 Election

Dr. Gleb Tsipursky is the author of The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge.

Caption: Head with brain and puzzle pieces (Geralt/Pixabay)

How did Donald Trump win, when he used so many misleading statements and outright deceptions? Couldn’t people see through them? As an expert in brain science, I want to share why his followers fell for his lies and what can be done to address this situation in the future.

First, let’s get the facts straight. Politifact, a well-known non-partisan website, rates only about 4 percent of statements by Trump as fully “True” and over 50 percent as either completely “False” or what it calls ridiculously false – “Pants on Fire” – with the rest in the middle. By comparison, Hillary Clinton had 25 percent of her statements rated fully “True” and only 12 percent rated either “False” or “Pants on Fire.”

The Washington Post, one of the most reputable newspapers in the country, wrote that “There’s never been a presidential candidate like Donald Trump — someone so cavalier about the facts and so unwilling to ever admit error, even in the face of overwhelming evidence.” In their rulings on statements made by Trump, this paper’s editors evaluated 64 percent of them as Four Pinocchios, their worst rating. By contrast, statements by other politicians tend to get the worst rating 10 to 20 percent of the time.

These sentiments are representative of other prominent news media and fact-checking outlets. Yet according to an ABC News/Washington Post poll, most voters on the eve of the election perceived Donald Trump as more trustworthy than Hillary Clinton. This false perception arose because the Trump campaign built on previous Republican criticism of Clinton – much of it misleading, some of it accurate – to successfully manipulate many voters into believing that Clinton was the less honest of the two, despite the evidence that she was much more honest than Trump. The campaign did so by exploiting the illusory truth effect, a thinking error that occurs when false statements are repeated many times and we begin to see them as true. In other words, just because something is stated several times, we perceive it as more accurate.

You may have noticed the last two sentences in the previous paragraph had the same meaning. The second sentence didn’t provide any new information, but it did cause you to believe my claim more than you did when you read the first sentence.

The Biology of Truth Vs. Comfort

Why should the human brain be structured so that mere repetition, without any more evidence, causes us to believe a claim more strongly? The more often we are exposed to a statement, the more comfortable it seems. The fundamental error most people make is mistaking statements that make them feel comfortable for true statements.

Our brains cause us to believe something is true because we feel it is true, regardless of the evidence – a phenomenon known as emotional reasoning. This strange phenomenon can be easily explained by understanding some basic biology behind how our brain works.

When we hear a statement, the first thing that fires in our brain in a few milliseconds is our autopilot system of thinking, composed of our emotions and intuitions. Also known as System 1, the autopilot system is one of the two systems of thinking that the Nobel Prize-winning scientist Daniel Kahneman described in his 2011 Thinking, Fast and Slow, and it represents the more ancient part of our brain. It protected us in the ancestral environment against dangerous threats such as saber-toothed tigers by making us feel bad about them, and drew us toward what we needed to survive, such as food and shelter, by making us feel good about those things. The humans who survived learned to heed the autopilot system’s guidance, and we are their descendants.

Unfortunately, the autopilot system is not well calibrated for the modern environment. When we hear statements that go against our current beliefs, our autopilot system perceives them as threats and causes us to feel bad about them. By contrast, statements that align with our existing beliefs cause us to feel good and we want to believe them. So if we just go with our gut reactions – our lizard brain – we will always choose statements that align with our current beliefs.

Meme saying “Lizard brain thinking is killing democracy – Please think rationally”  (Ed Coolidge, made for Intentional Insights)

Where Do We Get Our News?

Until recently, people got all their news from mainstream media, which meant they were often exposed to information that they didn’t like because it did not fit their beliefs. The budget cuts and consolidation of media ownership in the last decade resulted in mainstream media becoming increasingly less diverse, a trend well described in the 2009 Media Ownership and Concentration in America by Eli Noam. Moreover, according to a 2016 survey by the Pew Research Center, many people are increasingly getting their news mainly or only from within their own personalized social media filter bubble, which tends to exclude information that differs from their beliefs. So their beliefs are reinforced, and it seems to them that everyone shares those beliefs.

This trend is based on a traditionally strong trust in friends as sources of reliable recommendations, according to the 2015 Nielsen Global Trust in Advertising Report. Our brains tend to spread the trust that we associate with friends to other sources of information that we see on social media. This thinking error is known as the halo effect: our positive assessment of one element of a larger whole transfers to its other elements. We can see this in research showing that people’s trust in social media influencers has grown over time, nearly to the level of trust in their friends, as shown by a 2016 joint study by Twitter and the analytics firm Annalect.

Even more concerning, a 2016 study from Stanford University demonstrated that over 80 percent of students, who are generally experienced social media users, could not distinguish a news story shared by a friend from a sponsored advertisement. In a particularly scary finding, many of the study’s participants judged whether a news story was true based on irrelevant factors such as the size of the photo, as opposed to rational factors such as the credibility of the news outlet.

The Trump team knows that many people have difficulty distinguishing sponsored stories from real news stories and that’s why they were at the forefront of targeting voters with sponsored advertorials on social media. In some cases they used this tactic to motivate their own supporters, and in others they used it as a voter suppression tactic against Clinton supporters. The Trump campaign’s Republican allies created fake news stories that got millions of shares on social media. The Russian propaganda machine has also used social media to manufacture fake news stories favorable to Trump and critical of Clinton.

Additionally, Trump’s attacks on mainstream media and fact-checkers before the election, and even after the election, undercut the credibility of news outlets. As a result, trust in the media among Republicans dropped to an all-time low of 14 percent in a September 2016 Gallup poll, down from 32 percent in 2015. Fact-checking is even less credible among Republicans, with 88 percent expressing distrust in a September 2016 Rasmussen Reports poll.

All of this combined in an unprecedented reliance on and sharing of fake news by Trump’s supporters on social media. A study by the Center for Media and Public Affairs (CMPA) at George Mason University, using Politifact data, found that since the rise of the Tea Party, Republicans have tended to make many more false statements than Democrats. Lacking trust in the mainstream media and relying on social media instead, a large segment of Trump’s base indiscriminately shared whatever made them feel good, regardless of whether it was true. Indeed, one fake news writer, in an interview with The Washington Post, said of Trump supporters: “His followers don’t fact-check anything — they’ll post everything, believe anything.” No wonder that Trump’s supporters mostly believe his statements, according to polling. By contrast, another creator of fake news, in an interview with NPR, described how he “tried to write fake news for liberals — but they just never take the bait,” because they practice fact-checking and debunking.

Meme saying “People are most comfortable dealing with reality in terms of black or white, but reality tends to like shades of grey”  (Wayne Straight, made for Intentional Insights)

This fact-checking and debunking illustrates that the situation, while dismal, is not hopeless. Such truth-oriented behaviors rely on our other thinking system, the intentional system or System 2, as shown by Chip and Dan Heath in their 2013 book Decisive: How to Make Better Choices in Life and Work. The intentional system is deliberate and reflective. It takes effort to use, but it can catch and override the thinking errors committed by System 1, so that we do not adopt a belief simply because we feel it is true, regardless of the evidence.

Many liberals associate positive emotions with empirical facts and reason, which is why their intentional system is triggered into doing fact-checking on news stories. Trump voters mostly do not have such positive emotions around the truth, and believe in Trump’s authenticity on a gut level regardless of the facts. This difference is not well recognized by the mainstream media, who treat their audience as rational thinkers and present information in a language that communicates well to liberals, but not to Trump voters.

To get more conservatives to turn on the intentional system when evaluating political discourse we need to speak to emotions and intuitions – the autopilot system, in other words. We have to get folks to associate positive emotions with the truth first and foremost, before anything else.

To do so, we should understand where these people are coming from and what they care about, validate their emotions and concerns, and only then show, using emotional language, the harm people suffer when they believe in lies. For instance, for those who care about safety and security, we can highlight how it’s important for them to defend themselves against being swindled into taking actions that make the world more dangerous. Those concerned with liberty and independence would be moved by emotional language targeted toward keeping themselves free from being used and manipulated. For those focused on family values, we may speak about trust being abused.

These are strong terms that have deep emotional resonance. Many may be uncomfortable with using such tactics of emotional appeals. We have to remember the end goal of helping people orient toward the truth. This is a case where ends do justify the means. We need to be emotional to help people grow more rational – to make sure that while truth lost the battle, it will win the war.

P.S. To learn more about truth-seeking strategies in politics and other life areas, check out the article author’s book, The Truth-Seeker’s Handbook: A Science-Based Guide.

Roy Moore's Systemic Danger to Our Democracy

(Image: Wikimedia Commons)

Dr. Gleb Tsipursky is the author of the forthcoming The Truth-Seeker’s Handbook: A Science-Based Guide. One of the lead creators of the Pro-Truth Pledge, he is a professor at Ohio State and President of the nonprofit Intentional Insights. Connect with Dr. Gleb Tsipursky on Twitter, on Facebook, and on LinkedIn, and follow his RSS feed and newsletter.  

The front-runner candidate for Alabama Senate, Republican Roy Moore, called The Washington Post “fake news” after the newspaper published a thorough investigation reporting on sexual encounters between Moore and multiple teenage girls, one as young as 14. Moore’s attacks on this highly-reputable newspaper are part of a recent broader pattern of prominent public figures using the label of “fake news” to denounce quality investigative journalism that reveals corruption and abuse of power. Such attacks pose an urgent and systemic danger to our democracy, as they encourage corruption and abuse of power by undermining credible media reporting on such behavior.

As a high-quality, well-respected venue, The Washington Post would not publish such a controversial story without a thorough investigation. The article was based on multiple interviews with over 30 people who knew Moore at the time the sexual encounters happened, between 1977 and 1982. The journalists were careful to paint a balanced story, including some negative facts about the women who accused Moore, such as divorces and bankruptcies.

Perhaps most telling of the high quality of reporting and credibility of the newspaper is the fact that a number of prominent Republican leaders are calling on Moore to withdraw from the race. Immediately after The Post published its story, Republican Senator John McCain called for Moore to step aside immediately, and Montana Senator Steve Daines withdrew his endorsement, as did Utah Senator Mike Lee. After a fifth woman stepped forward to accuse Moore independently of The Post’s story, Senate Majority Leader Mitch McConnell stated that Moore “should step aside,” and so did Speaker of the House Paul Ryan.

On the other hand, Republicans well known for falsely branding mainstream media outlets as “fake news” defended Moore and supported his attack on The Post. For example, former Donald Trump adviser and Breitbart head Stephen Bannon accused The Post of being “purely part of the apparatus of the Democratic Party” for conducting its thorough investigation. Prominent Virginia Republican Corey Stewart also refused to criticize Moore and instead attacked the newspaper. A number of Fox News commentators, such as Gregg Jarrett, also attacked The Post.

Unfortunately, these attacks on quality investigative reporting represent part of a broader trend of conservative politicians across the country adopting the tactic of condemning media as “fake news” whenever there are stories unfavorable to them. As an example, Republican Kentucky Governor Matt Bevin tweeted that the reporter Tom Loftus of the largest newspaper in Kentucky, The Courier-Journal, is “a truly sick man” for “sneaking around” Bevin’s manor. Loftus at the time was working on a story about how Bevin faced an ethics complaint over an accusation of bribery for purchasing this manor for about a million dollars below market price from a local investor, Neil Ramsey. Apparently, shortly before getting a million-dollar discount on this manor, Bevin appointed Ramsey to the Kentucky Retirement Board, which oversees $16 billion in investments.

Republican Governor of New Jersey Chris Christie used a similar approach when caught abusing his power. He ordered a number of state-run beaches in New Jersey closed on June 30, yet he used a closed state beach in Island Beach State Park for himself and his family on July 2. Reporters for New Jersey’s largest newspaper, The Star-Ledger, secretly photographed him and his family using the beach. When asked about whether he was on the beach that day, Christie denied it. When confronted with photographic proof, Christie did not acknowledge or apologize for his lies and his abuse of power in using a closed public beach for the benefit of himself and his family. Instead, he attacked The Star-Ledger for its reporting.

Without the attacks on the media, the investigations of Christie and Bevin would have simply revealed the sordid affairs of corruption and abuse of power. Our democracy would have worked correctly with voters appropriately getting the important information from credible sources, the largest newspapers in Kentucky and New Jersey. With these accusations, Bevin and Christie distract attention from the corruption and abuse of power, and instead present themselves as fighters against supposed media bias.

In doing so, Moore, Bevin, Christie and many others are tapping the anti-media bias of the Republican base inflamed by Trump’s attacks on the media. He has expressed pride over his branding of high-quality venues like “CBS, and NBC, and ABC, and CNN” as “fake news.” We are now reaping the whirlwind of politicians caught engaged in immoral, abusive, and corrupt behavior using Trump’s anti-media rhetoric to protect themselves and continue engaging in such activities.

Now, this doesn’t mean that Democrats will not try similar tactics. For example, the prominent film producer Harvey Weinstein, a well-known and high-profile fundraiser for and influencer in the Democratic Party, accused The New York Times of publishing fake news when it revealed his sexual harassment. However, neither the Democratic base nor prominent Democrats bought this accusation, and Weinstein was quickly ousted from his leading roles.

By contrast, Bevin’s popularity in the polls was climbing in Kentucky, a conservative state, at the same time that he was making his accusations. Moore has continued to be staunchly supported by the Alabama Republican Party and base, despite the accusations and the withdrawal of support from many mainstream Republicans. Only in New Jersey, a liberal-leaning state, did voters express discontent over Christie’s behavior.

However, all of us – regardless of our party affiliation – will be greatly harmed if politicians are able to get away with corruption, immorality, and abuse of power by labeling credible media sources as fake news. This tactic poses an existential and systemic threat to our democracy, and we must do everything possible to protect quality journalism and to promote truthful behavior overall.

P.S. Want to promote truth and fight lies? Take the Pro-Truth Pledge, get your friends to take it, and call on your elected representatives to do so.

How to Address Truth Denialism Effectively Over the Holidays

(Image: Thomas Guest/Flickr)

Dr. Gleb Tsipursky is the author of The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge.

It’s the holiday season, which means plenty of opportunities for uncomfortable interactions with friends and family who are truth deniers. For example, my close friend invited me to her holiday party recently, where I sat across the table from her brother Mike. We got to talking about Donald Trump’s recently-successful efforts to ban people from many majority-Muslim countries from entering the US and his retweeting of anti-Muslim videos.

Mike strongly supported Trump’s ban and rhetoric, and other anti-Muslim policies. Yet by the end of that meal, he had grown much more tolerant and inclusive of Muslims. To get him to update his beliefs – something I do regularly during interviews with conservative talk show hosts – I relied on my research on how to get people to accept the facts, specifically a strategy that can be summarized under the acronym EGRIP (Emotions, Goals, Rapport, Information, Positive Reinforcement).

The typical response to truth deniers of presenting facts and arguing is generally not effective in changing people’s minds on charged issues. Research on the confirmation bias shows that people tend to look for and interpret information in ways that conform to their beliefs. Moreover, studies on the backfire effect reveal that when people are presented with facts that challenge their identity, they sometimes develop a stronger attachment to their incorrect beliefs as a defense mechanism.

If someone denies clear facts, you can safely assume that it’s their emotions that are leading them away from reality. You need to deploy the skill of empathy, meaning understanding other people’s emotions, to determine what emotional blocks might cause them to deny reality. In Mike’s case, it was relatively easy to figure out the emotions at play by making a guess based on what research shows about what conservatives value: security. I confirmed my suspicion through active listening and using curiosity to question Mike about his concerns about Muslims, and he shared extensively his fears about all Muslims being potential terrorists.

Next, establish shared goals for both of you, which is crucial for effective knowledge sharing. With Mike, I talked about how we both want security for our society. I also pointed out how sometimes our emotions lead us astray. We might want to eat all the Yule log on the table, but doing so would harm our health, so we should prioritize our goals over our gut intuitions. We should also commit to the facts, as we want to avoid deceiving ourselves and thus undermining our safety and security. I told him that I – along with thousands of others – had committed to the Pro-Truth Pledge and asked him to hold me accountable. He appreciated my sharing this commitment, and it raised my credibility in his eyes.

Third, build rapport. Using the empathetic listening you did previously, a vital skill in promoting trusting relationships, echo their emotions and show you understand how they feel. In the case of Mike, I echoed his fear and validated his emotions, telling him it’s natural to feel afraid when we see Muslims committing terrorism, and it’s where my gut goes as well.

Fourth, move on to sharing information. Here is where you can give the facts that you held back in the beginning. There were eight terrorist acts in the US motivated in part by Islamic beliefs in 2016, carried out by nine terrorists in total. Given that there are about 1.8 million Muslim adults in the US, there is a one-in-200,000 chance that any given Muslim you see would commit a terrorist act in a given year. That is like picking out a terrorist at random from a crowd big enough to fill several football stadiums. Focusing our efforts on surveilling Muslims will make us less secure by causing us to miss the actual terrorists.
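The one-in-200,000 figure follows directly from the two numbers cited above; here is a quick back-of-the-envelope sketch (the attacker and population counts are the article’s figures, not independent data):

```python
# Rough odds that a randomly chosen US Muslim adult was among the
# 2016 attackers, using only the figures quoted in the text above.
attackers = 9                # nine terrorists behind the eight 2016 attacks cited
muslim_adults = 1_800_000    # approximate US Muslim adult population cited

odds = attackers / muslim_adults
print(f"Chance per person per year: 1 in {round(1 / odds):,}")  # 1 in 200,000
```

This is, of course, only a crude base-rate estimate, but it is exactly the kind of concrete number that makes the "several football stadiums" comparison vivid in conversation.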

Moreover, the FBI praises Muslims for reporting threats, and anti-Muslim policies will make Muslims less likely to report threats. Besides, we already see Trump’s anti-Muslim rhetoric used to recruit terrorists in the US, and more anti-Muslim policies will only result in more materials to recruit terrorists. The key here is to show your conversation partner, without arousing a defensive or aggressive response, how their current truth denialism will lead to them undermining the shared goals we established earlier.

Mike was surprised and moved by this information, presented in an emotionally-sensitive manner. He agreed that anti-Muslim policies seem unwise, and we should be more tolerant and inclusive for the sake of increasing our security, even if that’s not how we intuitively feel. I offered positive reinforcement for his orientation toward the facts, a research-based tactic of encouraging people to change their identity.

Think of how much better your holiday dinner could go if you use EGRIP instead of arguing!

The GOP's Latest Scam Was to Convince the Base the Tax Law Is a Middle Class Tax Cut

Image of hand with Christmas gifts (Max Pixel)

Dr. Gleb Tsipursky is the author of The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge.

President Donald Trump called the recently-passed tax bill “an incredible Christmas gift” for middle-class Americans. In reality, the tax bill takes money from the pockets of middle-class Americans and gives it to corporations. Anyone who claims the tax bill primarily benefits the middle class is spreading falsehoods.

With the new bill, the tax rate for corporations is reduced from 35 percent to 21 percent. That makes a total reduction of 40 percent from what they were paying earlier. Other benefits for corporations include doing away with the alternative minimum tax, along with many provisions that will reduce the taxes they do pay.
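The 40 percent figure is the reduction relative to the old rate, not the difference in percentage points; a quick check:

```python
# The corporate rate drops 14 percentage points (35% -> 21%),
# which is a 40% cut relative to what corporations paid before.
old_rate, new_rate = 0.35, 0.21
relative_cut = (old_rate - new_rate) / old_rate
print(f"{relative_cut:.0%}")  # → 40%
```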

What about tax cuts for individuals? Consider a household making $50,000 to $75,000: its average tax cut is $870, or 1.6 percent. The wealthiest get the biggest breaks, as a household earning over a million would see an average cut of $69,660, or 3.3 percent.

Unfortunately for individuals, the tax cuts they receive are limited to eight years, expiring after 2025. Without any changes, the same household making $50,000 to $75,000 would actually pay $30 more in taxes after 2025. The wealthy would fare much better: the average household making over a million would still get a cut of more than $23,000 after 2025, along with a host of other benefits. Overall, after that date, households making over a million – approximately 0.6 percent of all taxpayers – would receive 81.8 percent of the total benefit of this bill. By contrast, the corporate tax rate cuts are permanent and will not expire.

This extremely disproportionate tax cut comes with a hefty price tag. The nonpartisan and authoritative congressional scorekeeper Joint Committee on Taxation found that the tax bill would cost approximately $1.4 trillion, which would be added to the existing $20 trillion national debt.

Who will now be responsible for paying the taxes to address this debt? Due to the extreme tax cut for corporations, individual American taxpayers will have a much bigger proportional tax burden in paying for the debt. Since the most wealthy had especially large tax breaks, and they tend to be the large shareholders in corporations that benefit from this law, middle-class Americans will be increasingly stuck with the tab for the debt. This is especially the case after 2025, when the tax breaks for individuals expire.

The Republican politicians who support the tax bill say it will pay for itself by creating jobs and improving the business climate, and thus in the end benefit the middle class. However, they are not experts at economics. The Joint Committee on Taxation, which is acknowledged as nonpartisan and expert by Democrats and Republicans alike, found that over 10 years the tax bill would generate only about $400 billion in additional revenue, leaving roughly $1 trillion of its cost unpaid. Likewise, a survey of top economists indicated that the vast majority believed the tax bill would not substantially improve the US economy, would substantially increase the debt burden, and would redistribute wealth from the middle class to corporations and the wealthy.

Deferring to expert analysis is a critical component of truthfulness. Any time we see someone  –  especially a politician  –  reject expert analysis, we should be very suspicious, and see whether they have hidden motivations to mislead us. After all, while politicians are not experts at economics, they are experts at getting elected. They have strong incentives to do what would get them elected and mislead the public if needed.

In the case of this tax bill, the hidden motives are quite obvious. For example, Representative Chris Collins, a New York Republican, told a reporter that “my donors are basically saying, ‘Get it done or don’t ever call me again’” regarding the tax bill. According to Senator Lindsey Graham, a Republican from South Carolina, if the tax bill is not passed, the “financial contributions will stop."

To keep getting elected, Republicans had to pass the tax bill in order to keep the donations flowing from the wealthy and from corporations, who pay close attention to the details of what is going on. Now, President Donald Trump is calling on his Republican colleagues to sell the tax bill to everyday voters, who pay much less attention to the fine print of tax policy.

Republicans have been misrepresenting the essence of the tax bill all along. They presented it as all about tax cuts to the middle class, even though the biggest cut has been for corporations. Repeating this falsehood invokes the illusory truth effect, a psychological phenomenon where a false statement repeated often enough becomes seen as true. Indeed, most of the Republican base bought these falsehoods, with around 60 percent thinking the bill primarily favors the middle class.

In reality, the tax bill falls into the classic category of trickle-down economics. This policy approach involves taking money from the middle class and giving it to corporations via such tax cuts. Republicans justify trickle-down economics by saying that corporations will use such money better than middle-class Americans, despite experts disagreeing with them about the growth resulting from the tax bill.

Historically, trickle-down economics has been most strongly associated with Ronald Reagan. Unfortunately, Reagan’s economic policies had bad economic consequences. More recently, thorough analyses of trickle-down economics by such reputable organizations as the International Monetary Fund suggest that this approach does not stimulate economic growth. Instead, giving money to the lowest income earners stimulates growth much more. However, that’s not what the tax bill does.

We may debate about the effectiveness of trickle-down economics. However, the more salient point is that the large majority of Republicans have not been courageous enough to say openly that this tax bill is an example of trickle-down economics. While we may disagree on whether trickle-down economics works, we should all agree that spreading falsehoods about the reality of the tax bill erodes our democracy.

Will the misrepresentations of the tax bill succeed, or will the American people recognize the truth: that this bill takes money from the pockets of middle-class Americans and gives it to corporations? You can make a difference by calling out any politicians and journalists who misrepresent the tax bill and urging them to commit publicly to truthful behavior, as well as by committing to truthful behavior yourself by taking the Pro-Truth Pledge.

We Need to Address the Danger from Trump's Fake News Awards

Dr. Gleb Tsipursky is the author of the #1 Amazon bestseller The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge, which aims to unite all who care about facts and truth.

Donald Trump’s “Fake News Awards” for what he calls “the most corrupt & biased of the Mainstream Media” have drawn mockery. However, behavioral science research suggests they are deadly serious. These awards create an institution for Trump’s relentless attacks on mainstream media and position Trump as the only voice who gets to determine truthful media. Unfortunately, the typical style of news coverage will perpetuate Trump’s agenda. However, a different style informed by behavioral science strategies would convey more accurate information and address the damage from the Fake News Awards.

The purpose of any award is to create an institutionalized way of promoting a certain cause through drawing public attention. As an example, consider perhaps the most well-known prize in the world, the Nobel Prize, awarded for the most important scientific and cultural advances. Every year, the media is filled with headlines describing the awards and their recipients, resulting in significant public attention that uplifts the importance of science and culture.

This attention taps into the “availability heuristic,” our tendency to assign excessive importance to whatever happens to be at the forefront of our minds, and the “priming effect,” where we perceive exaggerated connections between past and future stimuli. Thus, the Nobel Prize causes the public to focus on scientific and cultural achievements, and interpret future advances in light of the winners of last year’s Nobel Prize.

More subtly, an award positions the grantor of the award as the sole legitimate voice in determining who deserves the award. Several Swedish and Norwegian institutions decide who gets the various Nobel Prize awards. Perhaps the most prestigious one, the Nobel Peace Prize, is determined by a committee elected by politicians in the Norwegian Parliament. Thus, the internal domestic politics of Norway powerfully influence this prize.

In parallel, the Fake News Awards promote Trump’s attacks on mainstream media. In a January 2, 2018 tweet, he described the award as highlighting “Dishonesty & Bad Reporting in various categories.” We get a clearer sense of what he means by “various categories” from Trump’s first tweet on the subject, on November 27, 2017, about handing out a fake news trophy for “the most dishonest, corrupt and/or distorted in its political coverage of your favorite President (me).”

Trump, in other words, aims to use the award to perpetuate the narrative of himself as the victim of unfair and dishonest mainstream media coverage: after all, he is well-known for using the label “fake news” to attack accurate news stories that he doesn’t like. The President will use these awards to draw massive public attention to supposed “fake news” coverage by mainstream news sources. In fact, he even delayed granting the awards because of the extensive public attention they had already drawn.

Official Fake News Trophy Featured in GOP Email 1/18/18

The availability heuristic will cause the public to focus on “fake news” in mainstream media’s coverage of the President, regardless of whether this coverage is accurate or not. The priming effect will move news consumers to be more likely to perceive negative coverage as fake.

Since such awards will likely be given annually, they will institutionalize Trump’s agenda of attacking the mainstream media, while also legitimating Trump as the grantor of these awards. He will get to determine which media venues get labeled as providing “the most dishonest, corrupt and/or distorted” coverage. You can bet that it will not be the media venues that actually are the most dishonest, but the ones that depict Trump in a negative light, regardless of how factual (or not) such depictions may be.

Some believe that Trump will lose credibility from granting these awards because he will draw attention to unflattering stories about himself. Unfortunately, behavioral science research suggests that the style of coverage by news media will facilitate Trump’s agenda.

Typical headlines about any award focus on who received it. Unfortunately, research shows that only 41% of readers go beyond the headlines, with most getting their news from the headline alone. Many of the rest do not read beyond the first paragraph, which in most stories would summarize who received the awards and in what category. Even the ones who do read further will experience “anchoring,” a thinking error in which the first information we get about a topic drastically colors our overall perspective. Yes, first impressions really do matter.

Studies reveal that the standard journalistic method of correcting people’s misconceptions with accurate facts backfires in the long term. If you first state the false information and then provide evidence of why it is wrong, people tend to forget the evidence over time and start to misremember the original falsehood as true. Thus, even though many articles covering the Fake News Awards will eventually explain that these awards are meant to perpetuate Trump’s attacks on mainstream media and were granted at Trump’s sole discretion, the damage will already be done.

To prevent media consumers from getting the wrong impression about the Fake News Awards, mainstream media need to go against their typical style of reporting and instead align their coverage with behavioral science research. Instead of headlines about who received the awards, headlines should say something like “In Yet Another Attack on the Media, Trump Issues Fake News Awards,” so that the majority of readers who only glance at the headline get the right impression. The first paragraph should focus on how the award attempts to perpetuate and institutionalize Trump’s attack on the media and position Trump as the sole voice of truth, before discussing who received the awards.

Articles on the awards should devote some space to the “Press Oppressors awards” issued by the Committee to Protect Journalists. These awards - issued in response to Trump’s announcement of the Fake News Awards - focus on world leaders “who have gone out of their way to attack the press and undermine the norms that support freedom of the media.” Can you guess who received the “Overall Achievement in Undermining Global Press Freedom” award?

You as a media consumer can encourage media venues to cover the Fake News Awards appropriately by writing letters to the editor suggesting more appropriate coverage, or more simply by tweeting and emailing them a link to this article. You can also encourage them to take the Pro-Truth Pledge to commit to truthfulness. Consider taking the pledge yourself; it aims to unite all private citizens and public figures who care about truth and facts in our society.

You can also make sure to share only articles that cover the awards appropriately. When others post articles on social media with problematic coverage, you can make comments that give a more accurate impression and draw attention to the Committee to Protect Journalists.

You have the power to address the damage from these awards.

Winning At Life…By Not Losing

Caption: photo of woman playing tennis (Skeeze/Pixabay)

Guest post by Peter Livingstone

After hearing several references to a 1973 book called Extraordinary Tennis for the Ordinary Player by Simon Ramo, I decided to give it a read. I was wowed! It’s not because I used it to improve my game. In fact, you might be surprised to learn that I don’t play tennis, and I don’t plan to start because I read the book. What compelled and excited me was the bigger lesson conveyed by the book.

Ramo describes how in amateur tennis, about 80 percent of points are lost, not won. Lost points, as defined by Ramo, are those resulting from a player making an unforced error, such as hitting an easy return out-of-bounds, rather than hitting a brilliant shot that is impossible for an opponent to return. The lesson is that the vast majority of amateur tennis players will have much more success by working on “not losing,” rather than by trying to “win.”

I was struck by the fact that this simple idea is transferable to nearly every aspect of one’s life. Here’s how I think it can be applied to not be a loser in the game of life.

Caption: Photo of family playing "The Game of Life” (Kathryn/Flickr)

A Bit More About Dr. Ramo… and Tennis

Simon Ramo was a prominent American physicist, engineer, and businessman. Later in his life when he wanted to improve his tennis game, he applied the same rigorous, evidence-based approach that led to his successful career.

As a scientist and statistician, he gathered data by simply counting points won versus points lost. What he discovered is that in amateur tennis, the game’s outcome is determined by the player who makes the most mistakes. Thus, the best strategy to win in amateur tennis is to keep the ball in play, allowing the other player to make more errors. Occasionally, your opponent will hit a shot you can’t return. More frequently, however, he or she will hit it into the net or out-of-bounds, or fail to return it at all.

Keep in mind that Ramo discovered that outcomes in professional tennis work the opposite way - about 80 percent of points are won. That is, the professionals who win hit extraordinary shots that are essentially impossible to return. So, unless you are one of those professionals, the best way to win is to avoid losing!

Transferring Tennis Lessons to Life Lessons

Domain independence is the idea that certain knowledge may be applicable across other fields. I think Ramo’s insights into tennis can be considered domain independent for many other endeavors.

"In order to succeed it is necessary to know how to avoid the most likely ways to fail." When I first read this statement, referred to as Minsky’s Admonition in The Systems Bible, it struck me that it could have been lifted directly from Dr. Ramo’s book on tennis. Hyman Minsky was an American economist whose research attempted to provide an understanding and explanation of the characteristics of financial crises.

I doubt there was ever any collaboration between Ramo and Minsky, so I take this as evidence of domain independence. Two different people, researching two completely different subjects, have come to the same conclusion on achieving success!

Here are a few other areas where I think this concept may apply.

Investing

In the classic investment book Winning the Loser’s Game, Charles Ellis makes the case that investing works much the same way as tennis. Ellis proposes that most investors, like most tennis players, end up defeating themselves by making avoidable mistakes. Like Ramo, Ellis uses compelling mathematical evidence to make his arguments.

Consider, for example, some statistics: The average annual compounded return of the broad US stock market, as measured by the S&P 500 Index, for the past 30 years was just over 10%. For that same period, the average individual investor in stock market funds achieved a return of slightly less than 4%. For an individual investing $300 a month in a retirement account over 30 years, this is a difference of having about $650,000 versus $200,000.
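Those dollar figures are consistent with a standard future-value-of-annuity calculation. Here is a minimal sketch, assuming monthly compounding (which is why the results come out slightly above the article's rounded numbers):

```python
# Future value of a stream of equal monthly contributions, compounded monthly.
def future_value(monthly_deposit, annual_rate, years):
    r = annual_rate / 12    # monthly rate
    n = years * 12          # total number of deposits
    return monthly_deposit * ((1 + r) ** n - 1) / r

# $300/month for 30 years at the two average returns cited above.
market = future_value(300, 0.10, 30)    # ~10% market return
investor = future_value(300, 0.04, 30)  # ~4% average investor return
print(f"Market: ${market:,.0f}  Investor: ${investor:,.0f}")
```

With these assumptions the gap comes out to roughly $680,000 versus $210,000 - the same order-of-magnitude difference the article describes.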

Why do most investors underperform the market by so much? While a fraction of the underperformance can be attributed to trading costs and other fees, Ellis explains that most investors are like amateurs playing tennis. They think they can outperform the market with brilliant, “winning” moves, but in attempting them they make unforced errors. One such error is trying to “time” the market through a pattern of buying and selling. Another is buying into “hot” funds or individual stocks - those with recent superior performance - while selling losers. These actions, more often than not, lead to buying high and selling low, the exact opposite of a winning strategy. Paying high fees to funds you expect to outperform usually leads to underperformance as well: funds that charge high fees, on average, underperform funds with low fees. One might get lucky once in a while, but over time these actions produce the huge discrepancy between market performance and average individual performance.

What’s the best way to avoid these investment mistakes and achieve results close to the market average? It’s the same as in tennis: just work on “not losing." For individual investors with a long-term time horizon, the best option is to systematically invest in a low cost fund that tracks a broad market index, such as the S&P 500, or a global index such as the MSCI ACWI, which includes the US and most other world markets. Put your money into these investments incrementally over time and leave it there, at least until you are close to retirement.

Health and Fitness

Ramo lists a group of “don’ts” for tennis - those behaviors, characteristic of many amateur players, which should be eliminated. Simply focusing on reducing, and preferably eliminating, these actions can significantly improve one’s play. As in tennis, improvements in health and fitness can come from the elimination of harmful actions.

According to the ongoing Global Burden of Disease study, tobacco use is the leading cause of preventable death in the US and the world. Diet is the second highest risk factor after smoking; many diet-related diseases result from overconsumption of calories in the form of simple sugars. Finally, rising death rates from the abuse of alcohol and opioids in the US have been widely acknowledged as a public health crisis.

It may take you a bit of reframing to view behaviors such as smoking, poor eating, and drug abuse as errors. Additionally, identifying these errors is usually pretty simple, but eliminating them can be extremely difficult. Behavior change is hard. Here is an article I found useful on how to avoid impulsive temptations, and one on building willpower.

Many of us think improving our fitness is only possible by adding activities to our routines, such as going to the gym or taking up running. Consider, however, some things we can eliminate in order to get more exercise.

How many times have you stood on an escalator, or ridden an elevator, only to find that someone who took the stairs arrived at the same destination as quickly as you? Do you have the opportunity to walk or ride a bike to some destinations, rather than take a car or bus?

By framing some of these modern conveniences as errors to eliminate, you may be able to improve your fitness without sacrificing time or money. There may be tremendous opportunity for improving your health and fitness by just working on eliminating some things you are doing, rather than doing more.

Diminishing Returns

Learning to avoid mistakes may be one of the fastest and easiest methods of improving. Have you ever noticed how quickly you can improve when you start learning something new, especially if you’ve had the opportunity to learn from a good coach or teacher? I’ve really enjoyed watching children learn a new activity from a good coach, whether it be playing a sport, or even a board game requiring some skill.

It seems to me that most of their improvement, at least initially, comes from learning how to avoid mistakes. Of course, those improvements usually tend to slow down over time, a phenomenon known as diminishing returns. As we improve, it naturally gets harder to keep up that rate of improvement, or learning curve. Perhaps most of that rapid improvement comes from simply learning to not make mistakes. Keep this in mind to avoid frustration. The more you improve, the harder it may be to become even better.

…But Will This Approach Lead To Mediocrity?

By now you may be thinking “If all I do is focus on not losing, won’t I just be mediocre in everything?” No doubt many of us can and do achieve greatness in some domains, but consider these two points:

1) Since no one starts off anything at a high level of expertise, why not begin by “not losing” and learn first to avoid errors?

2) Even people talented, dedicated, and lucky enough to achieve greatness in one or two fields will undoubtedly be closer to ordinary in many other areas. It is important to recognize at what, if anything, you are truly great or desire to be great, and what falls in your “ordinary” range.

Let’s consider an example of how someone at the top of their field could have used this approach for a better outcome in another endeavor. History is filled with many top performers in one field having disasters in other areas. In 2009, Sports Illustrated estimated that 78% of NFL players have gone bankrupt or have been in financial stress within two years of retirement, and that 60% of former NBA players are broke within five years of retirement. Considering that the average annual salary of these professionals in 2012 was about $2 million and $5 million, respectively, this seems unbelievable.

Sticking to our tennis theme, consider the case of Bjorn Borg. Undoubtedly the greatest tennis player of his time, and considered by many one of the greatest ever, Borg won 64 titles, including 11 Grand Slams, over a 10-year career. His tournament earnings alone, in today’s dollars, were about $15 million. Borg retired from professional tennis in 1983 and pursued business opportunities. By 1990 his companies had collapsed and were declared bankrupt. In 2006, Borg was forced to sell off many of his trophies to achieve “financial security.”

According to tennis writer Richard Evans, Borg “was much too trusting. He made bad choices which led to bad luck.”

Perhaps Borg, and many other top athletes, fall into the trap of approaching personal finance in the same manner as winning in their professional field. Maybe Borg would have fared better by approaching his businesses and personal finances from an “ordinary” perspective, at least until he developed into an extraordinary businessman.

The Relativity of Ordinary

Being a scientist, Ramo paid homage to Albert Einstein by invoking the term relativity. What if, relative to your opponent, you are the equivalent of a pro? In this case, Ramo’s advice for tennis should not be taken as absolute, but should be adapted for the situation.

Perhaps, given a weaker opponent, you can benefit from a more aggressive court position, much like a professional. You can use this adaptation in other areas of life too. As you improve and get closer to a professional level, consider some actions that challenge your abilities. Just don’t try these during a critical “match” point. For example, if you’re just learning how to drive, you might want to practice in an empty parking lot, maneuvering around rubber cones, before you cruise through busy city streets. It’s OK to make some errors, provided you learn from them and are willing to accept their consequences.

What if, on the other hand, you really are a pro, but your play has become a little erratic and you are temporarily making more errors? It can be difficult to admit that part of your game is ordinary. If so, Ramo suggests considering that you may be only a bit ordinary. You can still benefit from working on winning by not losing, and on eliminating errors.

As Ramo puts it, you can improve, going from “ordinary” to “ex-ordinary,” whether or not you ever become extraordinary.

Questions to Ask Yourself

  • What are some areas in which you can benefit by taking the ordinary approach, as in working on not losing, and eliminating errors?

  • What are some actionable steps you can take to go from “ordinary” to “ex-ordinary”?

  • How can you apply a simple measurement of your performance, as Ramo did in tennis using points lost versus points won?

Dr. Gleb Tsipursky is the author of the #1 Amazon bestseller The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge, which aims to unite all who care about facts and truth.
(Dis)Trust in Science: What Can We Do About the Scourge of Misinformation?

Caption: Woman looking at homeopathic medicine (Wikimedia Commons)

Dr. Gleb Tsipursky serves as the volunteer President of the nonprofit Intentional Insights and is a co-founder of the Pro-Truth Pledge. He authored a number of books, most notably the #1 Amazon bestseller The Truth-Seeker’s Handbook: A Science-Based Guide, and is regularly featured in venues like CBS News, Time, Scientific American, Psychology Today, Newsweek, The Conversation, CNBC, and elsewhere.

At least 10 US children died and over 400 were sickened after taking homeopathic teething medicine laced with a poisonous herb called “deadly nightshade.” Carried by CVS, Walgreens, and other major American pharmacies, the pills contained this poison based on the homeopathic principle of treating medical conditions with tiny doses of substances that, in larger amounts, produce the symptoms of the disease being treated.

These children did not have to die. Numerous research studies show that homeopathy does not work. Despite this research, homeopathy is a quickly-growing multi-billion dollar business, taking advantage of people’s distrust in science and the lack of government regulation of “alternative medicine.”

These deaths are among many terrible consequences of the crisis of trust suffered by our institutions in recent years. While headlines focus on declining trust in the media and the government, science and academia are not immune to this crisis of confidence, and the results can be deadly.

Consider that in 2006, 41% of respondents in a nationwide poll expressed “a lot of confidence” in higher education. Less than 10 years later, in 2014, only 14% of those surveyed showed “a great deal of confidence” in academia.

What about science as distinct from academia? Polling shows that the number of people who believe that science has “made life more difficult” increased by 50% from 2009 to 2015. According to a 2017 survey, only 35% of respondents have “a lot” of trust in scientists; the number of people who do “not at all” trust scientists increased by over 50% from a similar poll conducted in December 2013.

This crumbling of trust in science and academia forms part of a broader pattern, what Tom Nichols called The Death of Expertise in his 2017 book. Growing numbers of people claim their personal opinions hold equal weight to the opinions of experts.

Children dying from deadly nightshade in homeopathic medicine is only one consequence of this crisis of trust. For another example, consider the false claim that vaccines cause autism. This belief has spread widely across the US, and leads to a host of problems. For instance, measles was practically eliminated in the US by 2000. However, in recent years outbreaks of measles have been on the rise, driven by parents in a number of communities failing to vaccinate their children.

Should We Actually Trust Scientific Experts?

While we can all agree that we do not want children to suffer, what is the underlying basis for why the opinions of experts - including scientists - deserve more trust than those of the average person in evaluating the truth of reality?

The term “expert” refers to someone who has extensive familiarity with a specific area, as shown by commonly-recognized credentials such as a certification, an academic degree, publication of a book, years of experience in a field, or another way that a reasonable person may recognize an “expert.” Experts are able to draw on their substantial body of knowledge and experience to provide an opinion, often expressed as “expert analysis.”

That doesn’t mean an expert opinion will always be right: it’s simply much more likely to be right than the opinion of a non-expert. The underlying principle here is probabilistic thinking, our ability to predict the truth of current and future reality based on limited information. Thus, a scientist studying autism would be much more likely to predict accurately the consequences of vaccinations than someone who has spent 10 hours Googling “vaccines and autism” online.

This greater likelihood of experts being correct does not at all mean we should always defer to experts. First, research shows that experts do best in evaluating reality in environments that are relatively stable over time and thus predictable, and when the experts have a chance to learn about the predictable aspects of this environment. Second, other research suggests that ideological biases can have a strongly negative impact on the ability of experts to make accurate evaluations. Third, material motivations can sway experts to conduct an analysis favorable to their financial sponsor.

However, while individual scientists may make mistakes, it is incredibly rare for the scientific consensus as a whole to be wrong. Scientists get rewarded in money and reputation for finding fault with statements about reality made by other scientists. Thus, for the large majority of them to agree on something – for there to be a scientific consensus – is a clear indicator that whatever they agree on reflects reality accurately.

    The Internet Is for… Misinformation

    The rise of the Internet, and more recently social media, is key to explaining the declining public confidence in expert opinion.

    Before the Internet, the information accessible to the general public about any given topic usually came from experts. For instance, scientific experts on autism were invited to talk on this topic on mainstream media, large publishers published books by the same experts, and they wrote encyclopedia articles on this topic.

    The Internet has enabled anyone to be a publisher of content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia a great example of a highly-curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog piece making false claims about links between vaccines and autism or the effectiveness of homeopathic medicine. If they are skilled at search engine optimization, or have money to invest in advertising, they can get their message spread widely.

    Unfortunately, research shows that people lack the skills to differentiate misinformation from true information. This lack of skills has clear real-world effects: just consider that US adults believed 75% of fake news stories about the 2016 US Presidential election. The more often someone sees a piece of misinformation, the more likely they are to believe it.

    Blogs with falsehoods are bad enough, but the rise of social media made the situation even worse. Most people re-share news stories without reading the actual articles, judging the quality of the story by the headline and image alone. No wonder that research indicates that misinformation spreads as much as 10 times faster and further on social media than true information. After all, the creator of a fake news item is free to devise the most appealing headline and image, while credible sources of information have to stick to factual headlines and images.

    These problems result from the train wreck of human thought processes meeting the Internet. We all suffer from a series of thinking errors such as confirmation bias, our tendency to look for and interpret information in ways that conform to our beliefs.

    Before the Internet, we got our information from sources such as mainstream media and encyclopedias, which curated the information for us to ensure it came from experts, minimizing the problem of confirmation bias. Now, the lack of curation means thinking errors are causing us to choose information that fits our intuitions and preferences, as opposed to the facts. Moreover, some unscrupulous foreign actors - such as the Russian government - and domestic politicians use misinformation as a tool to influence public discourse and public policy.

    The large gaps between what scientists and the public believe about issues such as climate change, evolution, GMOs, and vaccination exemplify the problems caused by misinformation and lack of trust in science. Such mistrust results in great harm to our society, from children dying to damaging public policies.

    What Can We Do?

    Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia.

    For example, we can uplift the role of science in our society. The March for Science movement is a great example of this effort. First held on Earth Day in 2017 and repeated in 2018, this effort involves people rallying in the streets to celebrate science and push for evidence-based policies. Another example is the Scholars Strategy Network, an effort to support scholars in popularizing their research for a broad audience and connecting scholars to policy-makers.

    We can also fight the scourge of misinformation. Many world governments are taking steps to combat falsehoods. While the US federal government has dropped the ball on this problem, a number of states passed bipartisan efforts promoting media literacy. Likewise, many non-governmental groups are pursuing a variety of efforts to fight misinformation.

    The Pro-Truth Pledge combines the struggle against misinformation with science advocacy. Founded by a group of behavioral science experts (including myself) and concerned citizens, the pledge calls on public figures, organizations, and private citizens to commit to 12 behaviors listed on the pledge website that research in behavioral science shows correlate with truthfulness. Signers are held accountable through a crowdsourced reporting and evaluation mechanism, while getting reputational rewards for their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to recognize the opinions of experts as more likely to be true when the facts are disputed. Over 500 politicians have taken the pledge, including state legislators Eric Nelson (PA) and Ogden Driskill (WY), and members of US Congress Beto O’Rourke (TX) and Marcia Fudge (OH). Two research studies at Ohio State University demonstrated, with strong statistical significance, that taking the pledge changes the behavior of pledge-takers to be more truthful. Thus, taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is an easy way to both fight misinformation and promote science.


    I have a dream that one day, children will not be dying from taking poisonous homeopathic medication or getting sick with measles because their parents put their trust in a random blogger instead of extensive scientific studies. I have a dream that schools will be teaching media literacy and people will know how to evaluate the firehose of information coming their way. I have a dream that we will all know that we suffer from thinking errors, and watch out for confirmation bias and other problems. I have a dream that the quickly-growing distrust of experts and science will seem like a bad dream. I have a dream that our grandchildren will find it hard to believe our present reality when we tell them stories about the bad old days.

    To live these dreams requires all of us who care about truth and science to act now, before we fall further down the slippery slope. Our information ecosystem and credibility mechanisms are broken. Only a third of Americans trust scientists and most people can’t tell the difference between truth and falsehood online. The lack of trust in science - and the excessive trust in persuasive purveyors of misinformation - is perhaps the biggest threat to our society right now. If we don’t turn back from the brink, our future will not be a dream: it will be a nightmare.

    Mon, 19 Aug 2019 17:50:02 +0000 0
    When Truth Isn’t Truth

    Dr. Gleb Tsipursky co-founded the Pro-Truth Pledge, a project joined by anyone who cares about creating a united constituency of all who care about truth and facts. He authored a number of books, most notably the national bestseller The Truth Seeker’s Handbook: A Science-Based Guide, and is regularly featured in venues like CBS News, Time, Scientific American, Psychology Today, Newsweek, The Conversation, CNBC, and elsewhere. Connect with Dr. Gleb Tsipursky on Twitter, on Facebook, and on LinkedIn, and learn more about him on his website.

    “Truth isn’t truth” according to Rudy Giuliani, a statement he made on August 19th on NBC’s “Meet the Press.” The phrase was immediately derided as a verbal blunder embodying the Trump administration’s complete disregard for the facts. Yet a closer look at Giuliani’s message shows an underlying strategic approach to undermining the truth similar to that used by “scientists” producing industry-sponsored studies rejecting human-caused climate change and links between tobacco and cancer.

    The transcript of the exchange reveals how Giuliani made his statement while defending Donald Trump’s unwillingness to testify for Robert Mueller's Russia investigation. According to Giuliani, “I am not going to be rushed into having him testify so that he gets trapped into perjury. And when you tell me that, you know, he should testify because he’s going to tell the truth and he shouldn’t worry, well that’s so silly because it’s somebody’s version of the truth. Not the truth.”

    The moderator, Chuck Todd, responded: "Truth is truth." Then, Giuliani said: "No, it isn’t truth. Truth isn’t truth." Giuliani went on: “Donald Trump says I didn’t talk about Flynn with Comey. Comey says you did talk about it, so tell me what the truth is” and then added “we have a credibility gap between the two of them. You’ve got to select one or the other. Now, who do you think Mueller’s going to select? One of his best friends, Comey, or the president.”

    Let’s unpack that exchange. Giuliani’s first statement conveyed that there are many versions of the truth, and denied the existence of any underlying factual reality.

    Todd pushes back, saying - “truth is truth” - referring to truth as what physically happened in reality, independent of anyone’s interpretation or spin. Giuliani disagrees, stating “truth isn’t truth”: he denies the existence of anything that really happened, implying that it’s all about different interpretations and the one who determines the interpretation wins.

    He uses this denial of factual reality to defend his reluctance for Trump to testify. After all, once Trump’s testimony is on paper, the president can be charged with perjury if his version of the truth does not win out. Giuliani then suggests that Mueller is biased and will side with his friend Comey over Trump, leading to Comey’s version winning out.

    It’s telling that this exchange occurred just as the Environmental Protection Agency under Trump is looking to reverse the EPA’s long-standing position that there is no safe level of fine particle pollution. This reversal is occurring despite the lack of science behind the new position and the extensive research showing that exposure to fine particles contributes to asthma and heart attacks. Likewise, the Trump administration is planning to repeal the Obama administration’s Clean Power Plan, which aims to cut carbon dioxide emissions, with no credible science behind this repeal.

    What are the parallels behind these seemingly different events? The strategy widely used by climate change deniers - and now adopted by the Trump administration - of casting doubt on truth as a way of promoting their political agenda.

    A widespread consensus among climate scientists exists on the reality of substantial human-caused climate change. Unfortunately, less than 20 percent of Americans are aware of this consensus, despite extensive communication about this consensus by scientists.

    Why? Research shows this low level of awareness comes from economically and politically motivated challenges to the reality of climate change from groups with substantial access to resources that influence public opinions. Most notably, the fossil fuel industry has funded the research of a tiny minority of scientists in order to cast doubt on human-caused global climate change.

    Why do people believe this tiny minority of scientists? Because the fossil fuel industry then used its enormous financial and political resources to spread this paid-for “research” widely.

    People who are not experts in climate change are thus exposed extensively to false information due to the huge megaphone of the fossil fuel industry. Such exposure triggers the “illusory truth effect,” a psychological phenomenon where the more we are exposed to a lie, the more likely we are to believe it. Indeed, research on climate denialist messaging demonstrates that exposure to such information substantially reduces both people’s belief in human-caused climate change and their trust in the truthfulness of climate science.

    These tactics used in climate change denialism are part of a broader pattern of science denialism perpetrated by groups with economic and political interests in casting doubt on credible research as well as undermining belief in scientific truth more broadly. Thus, many of the same “scientists” who are now at the forefront of climate change denialism produced research denying the links between smoking and lung cancer, between coal smoke and acid rain, and between CFCs and the hole in the ozone layer. As a tobacco executive wrote, “doubt is our product” - no doubt the same kind of product peddled by fossil fuel executives funding “research” denying climate change.

    Giuliani is in the same boat of peddling doubt as a strategy. His denial of an underlying truth of reality uses the same strategy used by deniers of climate change and links between smoking and cancer. Just as they use industry-funded “alternative science” to cast doubt on ever finding the truth of reality, he claims that we can’t speak about what really happened - “truth isn’t truth” - because alternative narratives exist.

    Whether in the courtroom or in the lab, peddlers of doubt like Giuliani decimate our ability to make the kind of sound decisions on which democracy relies. To preserve our democracy from destruction by such tactics requires an organized effort to unite all who care about truth across the political spectrum. Regardless of what Giuliani states - or what the industry-funded “scientists” claim - truth is truth, and it must be protected for the sake of our shared future.

    Mon, 19 Aug 2019 17:50:02 +0000 0
    The Pro-Truth Pledge prompts truthful behavior, according to psychology studies

    Traditionally, identifying truth in politics has relied on mainstream media and its fact checking. A recent Gallup poll, however, showed that only 29 percent of Americans trust fact checking.

    Research in behavioral science suggests that we can address the spread of misinformation through a number of other effective strategies, which are brought together in the Pro-Truth Pledge (PTP) project. Several months ago, I wrote a post explaining the Pro-Truth Pledge and its mission. Since that time, two peer-reviewed studies have provided evidence of its effectiveness in changing the behavior of pledge-takers - both private citizens and public figures - to be more truthful, for more than a month after they have taken the pledge. Both studies were published in prestigious psychology journals, Behavior and Social Issues and the Journal of Social and Political Psychology.

    Quantitative evidence shows the Pledge is effective

    The study in the peer-reviewed Journal of Social and Political Psychology suggests that taking the pledge results in a statistically significant increase in alignment with the behaviors of the pledge. The survey involved 24 participants filling out Likert-scale (1–5) surveys self-reporting their Facebook engagement with news-relevant content on their own profiles, as well as with other people’s posts and in groups, before and after they took the pledge, with 1 being the lowest level of alignment with the pledge behaviors and 5 being full alignment. To avoid the Hawthorne effect of study participants being impacted by observation, the study did not evaluate current behavior, but past behavior.

    We only recruited participants who had taken the pledge four or more weeks earlier to fill out the survey, and asked them about their behavior after taking the pledge. This delay also gave participants time for the immediate impact of taking the pledge to fade from their minds, thus enabling an evaluation of the medium-term impact of the PTP on sharing news-relevant content.

    This study method was informed by the approaches used by studies of whether honor codes address cheating, which is the most comparable form of intervention to the PTP. Such studies similarly rely on self-reporting by students on whether they have cheated or not cheated.

    The study found that on one’s own Facebook profile, the median PTP alignment score before taking the PTP is 4 (SD=1.14), and the median alignment score after taking the PTP is 4.5 (SD=0.51). For engaging with newsworthy content on other people’s profiles and in groups, the median PTP alignment score before taking the pledge is 3.5 (SD=1.06).

    The median PTP alignment score after taking the Truth Pledge is 4.5 (SD=0.65). For sharing content, 70.83% of participants (17 of 24 respondents) reported an increase of their PTP alignment after taking the PTP. The figure below provides a visual summary of the preliminary survey data.

    Figure 1: Visual summary of preliminary survey data, showing PTP alignment in Facebook engagement
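    The raw survey responses are not reproduced here, but the kind of summary reported above (medians, standard deviations, and the share of participants whose alignment increased) takes only a few lines of Python's standard library. The scores below are illustrative placeholders, not the study's actual data:

```python
from statistics import median, stdev

# Illustrative 1-5 Likert alignment scores for 24 participants,
# before and after taking the pledge (NOT the study's actual responses).
before = [2, 2, 3, 3, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5]
after  = [4, 4, 4, 4, 4, 4, 5, 5, 4, 4, 4, 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 4]

# Median and sample standard deviation, as reported in the study's tables
print(f"median before: {median(before)}, SD: {stdev(before):.2f}")
print(f"median after:  {median(after)}, SD: {stdev(after):.2f}")

# Share of participants whose alignment increased after taking the pledge
improved = sum(a > b for a, b in zip(after, before))
print(f"improved: {improved} of {len(before)} ({improved / len(before):.2%})")
```

    Note that `statistics.stdev` computes the sample (n-1) standard deviation, which matches the convention for survey data like this.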


    We conducted a second study, the one published in Behavior and Social Issues, to address the weakness of the first study’s reliance on self-reporting. The second study sampled 21 people, and involved researchers observing and evaluating the quality of Facebook engagement by study participants on their own Facebook profile.

    Similarly to the first study, the second study avoided the Hawthorne effect of study participants being impacted by observation by evaluating past behavior. Researchers looked at the first ten Facebook posts with news-relevant content made four weeks after the pledge. Then, the researchers compared these ten posts to the first ten posts from the same period the year before the study participant took the pledge. Each post was coded according to quality, from 1 (lowest alignment with the PTP) to 5 (highest alignment).
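    A pre/post comparison of coded scores like this is typically evaluated with a paired t-test on the per-participant differences. Here is a minimal standard-library sketch of that calculation; the coded scores are made up for illustration and are not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

# Illustrative per-participant average post-quality scores (1-5 scale),
# before and after taking the pledge - NOT the study's actual data.
before = [2.3, 2.8, 2.1, 3.0, 2.5, 2.4, 2.9, 2.2, 2.6, 2.7]
after  = [3.4, 3.9, 3.1, 4.0, 3.6, 3.3, 3.8, 3.2, 3.5, 3.7]

# Paired t-test: test whether the mean of the per-participant
# differences is significantly different from zero.
diffs = [a - b for a, b in zip(after, before)]
n = len(diffs)
t_stat = mean(diffs) / (stdev(diffs) / sqrt(n))  # df = n - 1

print(f"mean before: {mean(before):.2f}, mean after: {mean(after):.2f}")
print(f"t = {t_stat:.2f} with {n - 1} degrees of freedom")
```

    In practice one would use a library routine such as SciPy's paired-sample t-test to also obtain the p-value, but the statistic itself is just the mean difference divided by its standard error, as above.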

    The second study showed that the average PTP alignment before taking the pledge was 2.49, and after taking the pledge was 3.65. We conducted a paired t-test to examine whether Pro-Truth Pledge alignment is significantly different after taking the PTP. The null hypothesis H0 for the paired t-test states that there is no significant alignment difference before and after taking the pledge, and the alternative hypothesis H1 proposes a significant difference. There was a significant difference in the scores for pledge alignment before and after taking the pledge.

    Mon, 19 Aug 2019 17:50:02 +0000 0
    3 Steps to an Intentional Life


    Are you getting all you want? Are you achieving all of your goals and succeeding in life? Are you living a fully intentional life?

    If you are, I salute you. I can’t make the same claim. To live a more intentional life, I constantly strive to gain greater agency, the quality of living intentionally.

    In doing that, it helps to take the following three steps: evaluate reality clearly, make effective decisions, and achieve your goals.

    Step 1: Evaluate Reality Clearly

    What does it mean to evaluate your reality clearly? That means gaining a deep understanding of your external environment – your immediate surroundings, your social circle, your career, and anything else of relevance. That also means your own internal environment – your patterns of feeling, thinking, and behaving.

    Four factors obstruct our ability to evaluate reality clearly:

    Learning about and watching out for these challenges in a systematic manner improves our decision-making.

    Step 2: Make Effective Decisions

    Next, you want to make effective decisions about how to reach your goals. Consider your options, based on your knowledge of your outer and inner environment. Be aware that you can change both your external surroundings, and your own thoughts, feelings, and behaviors, to help you to get what you want in life.

    Evaluate the various paths available to you, and assess the probability that each path will get you to your goals. Then make a plan for how to proceed, and take the path that seems best suited to go where you want.

    Step 3: Achieve Your Goals

    Finally, implement the decisions you made and travel along the path. Remember, you will usually encounter some unknown obstacles on your road to what you want. Be excited about getting feedback from your environment and learning about better paths forward.

    Take the opportunity to change your path if a new one opens up that seems better suited to help you meet your goals. Be open to changing your very goals themselves based on what you learn.

    As you can imagine, these things are easy to say, but hard to do. It’s very helpful to get support along the way, through learning about strategies oriented toward this purpose. However, above all, it takes your own commitment to the goal of gaining greater agency over your life and living intentionally to succeed in life.


    Key Takeaway

    To live a truly intentional life, make sure to take these three steps: 1) Evaluate Reality Clearly; 2) Make Effective Decisions; 3) Achieve Your Goals



    Bio: Known as the Disaster Avoidance Expert, Dr. Gleb Tsipursky authored the national bestseller on avoiding professional and personal disasters, The Truth Seeker’s Handbook: A Science-Based Guide, and you can pre-order his new book, Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. He has over 20 years of experience dramatically empowering leaders to avoid business disasters as the CEO of the boutique consulting and training firm Disaster Avoidance Experts. Tsipursky also has a strong research and teaching background in behavioral economics and neuroscience with over 15 years in academia, including 7 years as a professor at the Ohio State University, with dozens of peer-reviewed academic publications. Visit his website, subscribe to his monthly Disaster Avoidance Tips, email him at gleb[at]disasteravoidanceexperts[dot]com, follow him on Twitter @gleb_tsipursky, on Instagram @dr_gleb_tsipursky, on Facebook DrGlebTsipursky, and on LinkedIn Dr. Gleb Tsipursky.

    Mon, 19 Aug 2019 17:50:02 +0000 0
    8 Key Steps to Prevent Project Management Failure


    When was the last time you saw a major project management or process management failure? Such disasters can have devastating consequences for high-flying careers and successful companies. Yet they happen all too often, with little effort made to prevent them.


    For example, many leaders stake their reputations on key projects such as successful product launches. However, research shows that most product launches fail. Nike’s FuelBand, launched with much fanfare in 2012, flopped on arrival. By 2014, Nike fired most of the team behind FuelBand, discontinuing this product.


    One of the most important types of projects for a business is a merger or acquisition. Yet 70 to 90 percent of mergers and acquisitions fail to create value, and CEOs who lead failed M&As are frequently replaced. For instance, Microsoft’s CEO Steve Ballmer left in large part due to the tensions around his push to acquire Nokia, which eventually led to Microsoft writing off $8.4 billion.


    Process failures can be just as bad. Safety failures led to the recall of over 20 million pounds of food across the US in 2018; from 1996 to 2017, more than 390 million cars and other motor vehicles had a recall, along with 154 million motor vehicle parts.


    Japanese airbag maker Takata Corporation, with revenue of $6.6 billion and over 50,000 employees in 2016, declared bankruptcy in 2017 due to the costs of a recall and lawsuits over faulty airbags. Boeing’s engineers knew that the 737 Max aircraft display alert system software failed to meet requirements, but failed to do anything about it before the deadly October 2018 Lion Air crash. The grounding of Boeing’s 737 Max aircraft after that crash and the March 2019 crash of Ethiopian Airlines Flight 302, caused in large part by the display alert system software, cost the company over a billion dollars.


    Of course, while examples from big companies make the headlines, mid-size and small businesses have their share of catastrophic project management and process management failures. Such mistakes largely stem from the many dangerous judgment errors that result from how our brains are wired, what scholars in cognitive neuroscience and behavioral economics call cognitive biases. Over 100 cognitive biases exist, and scholars identify more all the time.


    These errors lead to dangerous mistakes in the workplace, in everything from mergers and acquisitions to assessing company performance. They also hurt us in our personal lives. For example, a survey shows that we tend to go with our gut reactions and thus fall for cognitive biases in our shopping decisions.


    Fortunately, recent research in these fields shows how you can use pragmatic strategies to notice and address these dangerous judgment errors. You can do so using structured decision-making techniques for making quick everyday decisions, for more complex and significant ones, and for critically important and highly complex choices.


    But what do you do after you make your decision? You also need to avoid failures and maximize success in implementing decisions, as well as in managing projects and processes that result from these decisions.


    The most relevant scholarship in implementing decisions deals with prospective hindsight, meaning looking back in advance. Prospective hindsight helps you anticipate and avoid threats as well as notice and seize opportunities. Thus, you can defend yourself against failures and maximize the likelihood of success in major projects and processes, and in implementing decisions.


    8 Key Steps to Preventing Project Management Failure


    “Failure-Proofing” is a pragmatic and easy-to-use strategy for obtaining the benefits of prospective hindsight. Having developed this technique based on behavioral economics and cognitive neuroscience studies, I then tested it on the front lines of my over 20 years of consulting and coaching experience, helping leaders in large and mid-size companies and nonprofits avoid project or process failures. I wrote it up so that anyone – not only the people who hire me – can avoid such failures and maximize success.


    Use Failure-Proofing after you have decided to start any significant project, and to check in regularly on existing processes. Don’t use Failure-Proofing on smaller, day-to-day decisions, since doing so would take too much time. For those decisions, use the “5 Key Questions” technique instead. The failure-proofing technique is best done in teams, and should involve representatives of all relevant stakeholders; you can also do this technique by yourself, but consider showing your results to a trusted adviser for an external perspective.


    Step 1: Gather


    Gather in one room all the people relevant to making the decision, or representatives of the stakeholders if there are too many to have in a group. A good number is six; avoid more than 10 people to ensure a manageable discussion.


    Make sure the people in the room have the most expertise in the decision to be made, rather than simply gathering higher-up personnel. The goal is to address what might go wrong and how to fix it, as well as what might go right and how to ensure it. Expertise here is as important as authority. At the same time, have some people with the power to decide how to address problems and seize opportunities that might be uncovered.


    It’s very helpful to recruit an independent facilitator who is not part of the team to help guide the exercise. You can get someone from your Advisory Board, someone from another part of the organization, your mentor, or a coach or consultant. If you are going through this technique by yourself, write out various stakeholders that are relevant to the project or process, even different aspects of yourself that have competing goals.


    Step 2: Explain


    Explain the exercise to everyone by describing all the steps, so that all participants are on the same page about the exercise.


    Step 3: Next Best Alternative


    Then, develop two Next Best Alternatives (NBAs) to the project or process you are evaluating. Have each participant on the team come up with and write down one NBA anonymously. Anonymity is critical to ensure that unpopular or politically problematic opinions can be voiced (“perhaps we should wait for a better opportunity rather than acquiring this company”).


    The facilitator gathers what people wrote – thus ensuring anonymity if the facilitator is not part of the team and doesn’t know people’s handwriting – and voices the alternatives. Then, have team members vote on the choices that seem most viable, and choose two to discuss. Make sure to give them a fair hearing by having two team members – including at least one with authority – defend each NBA.


    After discussing the NBA, take an anonymous vote on whether the NBA seems preferable to the original project or process under discussion. If the original project or process still seems best (which is what happens in the large majority of cases), consider if the project or process can be strengthened by integrating any components of the two NBAs into your plan. If you are going through the technique by yourself, get outside input at this stage if you have difficulty generating an NBA.


    Step 4: Reason for Failure


    Next, ask all the stakeholders to imagine that they are in a future where the project or process definitely failed (an approach informed by the Premortem technique). Doing so gives permission to everyone, even the biggest supporters of the project or process, to use their creativity in coming up with possible reasons for failure.


    Otherwise, their emotions – which determine 80-90% of our thoughts, behaviors, and decisions – will likely inhibit their ability to accept the possibility of project or process failure. That’s why simply asking everyone to imagine potential problems works much less well. Supporters of the project experience a defensive emotional response that leaves their minds much less capable of creatively envisioning possible problems.


    After giving such permission, have each participant anonymously write out plausible reasons for this disaster. Anonymity is especially important here, due to the potential for political danger in describing potential problems (“the product launch will fail because the marketing department overhyped it, leading to unhappy consumers”). Ask everyone to come up with at least the three most plausible failures, while highlighting that the reason for coming up with these failures is to address them effectively.


    These failures should include internal decisions under the control of the project team, such as cost and staffing, as well as potential external events, such as an innovation introduced by a competitor. Encourage participants to focus particularly on reasons they would not typically bring up because it would be seen as rude or impolitic, such as criticizing someone’s competency, or even dangerous to one’s career, such as criticizing the organization’s strategy. Emphasize that everyone’s statements will remain anonymous.


    The facilitator gathers everyone’s statements, and then highlights the key themes brought out as reasons for project failure, focusing especially on reasons that would not be typically brought up, and ensuring anonymity in the process. If you are going through this technique by yourself, write out separate reasons for project or process failure from the perspective of each relevant aspect of yourself.


    Step 5: Most Likely Problems


    Discuss all the reasons brought up, paying particular attention to ones that are rude, impolitic, and dangerous to careers. Check for potential cognitive biases that might be influencing the assessments. The most significant ones to watch out for are loss aversion, status quo bias, confirmation bias, attentional bias, overconfidence, optimism bias, pessimism bias, and halo and horns effect.


    Then, assess anonymously the probability of each reason for failure, ideally assigning percentage probabilities. If doing so is difficult, use terms like “highly likely”, “somewhat likely”, “unlikely”, and “very unlikely.” Also consider how harmful each reason for failure might be, and pay more attention to the ones that are most harmful. Here, the expertise of individual team members will be especially useful.
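    The prioritization logic here – weighing each failure reason by both its probability and its potential harm – can be sketched in a few lines. The reasons, probabilities, and harm scores below are purely hypothetical illustrations, not part of the original technique's materials; a minimal sketch might look like this:

```python
# A minimal sketch (hypothetical data) of ranking failure reasons by
# expected harm: probability of occurring times estimated harm if it does.

# Each reason pairs an anonymous statement with the team's estimates
# (probability from 0 to 1, harm on a 1-10 scale).
reasons = [
    {"reason": "competitor launches similar product", "probability": 0.30, "harm": 9},
    {"reason": "key engineer leaves mid-project",     "probability": 0.15, "harm": 7},
    {"reason": "marketing overhypes the launch",      "probability": 0.50, "harm": 5},
]

# Expected harm combines likelihood and severity into one number.
for r in reasons:
    r["expected_harm"] = r["probability"] * r["harm"]

# Pay the most attention to the highest expected-harm reasons.
for r in sorted(reasons, key=lambda r: r["expected_harm"], reverse=True):
    print(f'{r["reason"]}: expected harm {r["expected_harm"]:.2f}')
```

    Note how a highly likely but mild problem can still rank below a rarer but severe one – which is why the technique asks for both assessments rather than probability alone.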


    The leader or person assigned as note-taker writes down all the problems brought up, as well as assessments of the probabilities. If you are going through the technique by yourself, get outside input at this stage.


    Step 6: Fixing Problems


    Decide on several failures that are most relevant to focus on, and brainstorm ways of solving these, including how to address potential mental blindspots. Also, discuss any evidence you might use that would serve as a red flag that the failure you are discussing is occurring or about to occur. For this step, it is especially important to have people with authority in the room.


    The leader or note-taker writes down the possible solutions. If you are going through the technique by yourself, get outside input at this stage.


    Step 7: Maximizing Success


    We’ve addressed failure; now let’s make sure you don’t simply avoid failure, but maximize success! Next, imagine that you are in a future where the project or process succeeded far beyond what you expected. Have each participant anonymously write out plausible reasons for this success. Next, have the facilitator highlight the key themes.


    Discuss all the reasons, and check for the same cognitive biases as above. Evaluate anonymously the probability of each reason for success, and decide which deserve the most attention. Then, brainstorm ways of maximizing each of these reasons for success.


    The leader or note-taker writes down the ideas to maximize success. If you are going through the technique by yourself, get outside input at this stage.


    Step 8: Revising Project


    The leader revises the project or process based on the feedback, and, if needed, repeats the exercise.




    Make sure to use the “Failure-Proofing” technique prior to any large project and to evaluate existing processes and systems to prevent failures. To see case studies with in-depth guidelines of how you can apply this strategy as an individual or a team, see the Manual on Failure-Proofing.


    Key Takeaway


    To prevent a project management or process management disaster, imagine that it completely failed. Then, brainstorm all plausible reasons for failure, and generate solutions to these potential problems. Integrate these solutions into your project or process.

    To maximize project management or process management success, envision that it succeeded spectacularly. Brainstorm likely reasons for such success, and generate strategies that would lead to such success. Integrate these strategies into your project or process.


    Questions to Consider (please share your answers below)


    • What questions do you have about applying this technique?


    • Where do you think Failure-Proofing might best fit into your organization’s processes?


    • What will be your next steps in most effectively bringing it to your team and integrating it into your organization’s processes?



    Image credit: Flickr/freeimage4life



    Bio: Dr. Gleb Tsipursky empowers you to avoid business disasters as CEO of the boutique consulting, coaching, and training firm Disaster Avoidance Experts. He is a best-selling author of several well-known books, including Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters and The Truth Seeker’s Handbook: A Science-Based Guide. Tsipursky’s cutting-edge thought leadership was featured in over 400 articles and 350 interviews in Fast Company, CBS News, Time, Scientific American, Psychology Today, Inc. Magazine, and elsewhere. His expertise stems from his background of over 20 years of consulting, coaching, speaking, and training experience across North America, Europe, and Australia. It also comes from his strong research and teaching background in behavioral economics and cognitive neuroscience with over 15 years in academia, including 7 years as a professor at the Ohio State University, with dozens of peer-reviewed academic publications. Contact him at Gleb[at]DisasterAvoidanceExperts[dot]com, follow him on Twitter @gleb_tsipursky, Instagram @dr_gleb_tsipursky, Facebook, YouTube, and LinkedIn. Most importantly, help yourself avoid disasters by getting a free copy of the Assessment on Dangerous Judgment Errors in the Workplace by signing up for his Disaster Avoidance Tips.


    Posted in Goal Achievement, Intentional Decision-Making, Leadership & Organizational Development
    How to Evaluate Unconscious Bias Caused by Cognitive Biases at Work


    To evaluate unconscious bias caused by cognitive biases, first think about these three questions:


    • What percentage of projects in your workplace miss the deadline or go over budget?
    • How often do you see hiring decisions and employee assessments influenced by factors not relevant to job competency?
    • How frequently are your team’s members overconfident about their decisions?

    If you didn’t answer “rare to none” for each of these, you have a problem. In fact, these questions get at only 3 of the over 100 dangerous judgment errors that scholars in behavioral economics and cognitive neuroscience call cognitive biases.


    Do you regularly – over 10% of the time – see projects in your workplace go past deadline or over budget? It’s a sign that the cognitive bias known as the planning fallacy is undercutting performance. The planning fallacy refers to our intuitive belief that everything will go according to plan, resulting in us failing to plan for the many potential problems that cause projects to go over budget or past deadline. Cost overruns and delays result in serious damage to the bottom lines of our businesses.


    How about assessments for hiring, performance, and promotion impacted by non-relevant factors? Well, two dangerous judgment errors play a major role in causing such problematic evaluations, the halo effect and the horns effect. The halo effect refers to the fact that if we feel a significant positive emotion toward one characteristic of someone, then we will have an overly positive evaluation of that person as a whole. That’s why taller men get promoted at higher rates into positions of authority, and both men and women perceived as physically attractive are more likely to be hired. The horns effect is the opposite: if we don’t like a characteristic that is significant to us, we will tend to have a worse evaluation of that person as a whole. For instance, overweight people are less likely to be hired.


    Finally, excessive confidence in making decisions – and other work areas – is a symptom of the mental blindspot known as the overconfidence effect. Overconfidence has been associated with many problems in the workplace. For example, overconfidence leads people into financial shenanigans, such as overstating earnings. Overconfident leaders tend to resist constructive criticism and dismiss wise advice, letting their intuition drive their decision-making as opposed to making thoughtful plans.


    So now that you know about the planning fallacy, the halo and horns effects, and the overconfidence effect, you’re safe from these 4 cognitive biases, right? Unfortunately, just learning about these mental blindspots will not work to assess where they occur in your workplace or to defeat them, as research shows. In fact, some techniques that would seem intuitively to help address unconscious bias caused by cognitive biases make them worse.


    Fortunately, recent research has revealed strategies that you can use to notice when you’re about to fall for these mental blindspots, as well as when you’ve been suffering from them for a while without knowing it. Moreover, it shows how you can use pragmatic strategies to overcome these dangerous judgment errors to avoid unconscious bias and make the best decisions, in your work and career, in your professional and personal relationships, and in other life areas as well.


    The first step to solving cognitive biases does involve learning about them. However, simply having knowledge doesn’t help. For instance, students who learned about mental blindspots showed the same vulnerability to these errors as students who didn’t.


    What is much more helpful is making sure that people are strongly emotionally motivated to address cognitive biases. Our emotions determine 80-90 percent of our decisions, thoughts, and behaviors, and tapping our feelings is clearly effective in helping notice and address dangerous judgment errors. On a related note, it really helps for people to feel that the effort to address mental blindspots is important to them, getting them truly involved and bought into the outcome of debiasing cognitive biases.


    To do so, you need to evaluate thoroughly the impact of each cognitive bias on your own professional activities, as well as more broadly in your team and organization. Then, you have to make and implement a plan to address the problems caused by such unconscious bias, again, not only for yourself but also for your team and your business.


    Fortunately, you don’t have to address all the cognitive biases. Just going through the 30 most dangerous judgment errors in the workplace will get you the large majority of the benefit from such an analysis to help you avoid unconscious bias. All of these mental blindspots, along with clear next steps on what to do after the evaluation, can be found in the Assessment on Dangerous Judgment Errors in the Workplace. It’s available for sale in print or digital form and you can get the digital version for free when you register for the Wise Decision Maker Course.


    Assessment on Cognitive Biases in the Workplace to Address Unconscious Bias


    The assessment starts with an evaluation of how frequently each of the 30 cognitive biases occurred in your workplace in the last year, in the form of percentages. Don’t feel obliged to be absolutely precise; approximate numbers are fine.


    If you don’t remember something occurring, give it a low percentage score, including 0 if you think it doesn’t occur. For instance, if all of your projects came in under budget and within the deadline, then the planning fallacy is not a problem for you.


    Each of the 30 questions should take 10-15 seconds. Just put down the first number that makes the most sense to you. You can go back and tweak it later if needed. However, for the first run-through, do it fast. Remember, if you tend to be an optimistic person in general, temper your optimism and give a somewhat higher percentage than you intuitively feel is appropriate. The same goes for pessimism: give a somewhat lower percentage if you tend to be pessimistic.


    Following this evaluation, you will score the assessment to see the current state of dangerous judgment errors in your workplace. Next, you’ll evaluate the impact of these problems on the bottom line of your personal work, your organizational unit, or the company as a whole, to the extent that you can estimate it. After all, knowing the bottom-line impact will enable you to decide how much to invest in addressing the problem. You’ll then evaluate the performance of your workplace on the four broad competencies of addressing cognitive biases: how the people in your organization do on evaluating themselves, evaluating others, strategic evaluations of risks and rewards, and tactical evaluations in project implementation.


    Finally, you’ll get to the next steps. There, each dangerous judgment error is explained, focusing on its business impact. You’ll also get to decide which of the mental blindspots you’ll focus on addressing in the short term future.


    The assessment will prove invaluable as you take the next steps to solve the problems you identified. You should have yourself and others in your organization do the assessment after you introduce the concept of cognitive biases but before you launch any interventions. Then, you can use your assessment results as a baseline to assess the impact of any interventions.
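    As a rough sketch of how such a baseline comparison might work – assuming a simple per-bias percentage score, and using hypothetical bias names and numbers rather than the actual 30 assessment items:

```python
# A rough sketch (illustrative data only) of comparing a baseline
# assessment round against a post-intervention round. Each score is the
# reported frequency, in percent, of a given cognitive bias at work.

baseline = {"planning fallacy": 40, "halo effect": 25, "overconfidence": 60}
followup = {"planning fallacy": 25, "halo effect": 20, "overconfidence": 45}

def average_score(scores):
    """Mean reported frequency (in percent) across the assessed biases."""
    return sum(scores.values()) / len(scores)

# A drop in the average suggests the intervention is having an effect.
improvement = average_score(baseline) - average_score(followup)
print(f"Average bias frequency dropped by {improvement:.1f} points")
```

    In practice you would compare scores bias by bias as well, since an intervention targeting, say, the planning fallacy may leave other judgment errors unchanged.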


    To develop your interventions, see the book that’s based around this assessment and provides both techniques and business case studies for how to address cognitive biases: Never Go With Your Gut: How Pioneering Leaders Make the Best Decisions and Avoid Business Disasters. You can also learn and use research-based strategies to make the best decisions in quick everyday choices, in moderately important decisions, and in critically important ones.


    Additionally, you would benefit from a method to avoid failure and achieve success when implementing your decisions, as well as another technique to address threats and seize opportunities in your long-term strategic plans. Finally, it would be really valuable for you to develop the mental habits and skills necessary to address the unconscious bias caused by cognitive biases. These techniques and skills, along with the knowledge in the book, will help you effectively address the dangerous judgment errors we tend to make.


    While enacting the interventions, have yourself and the others in your workplace take the assessment regularly – once a week if the intervention is intense, once a month if it’s less intense – to evaluate the effectiveness of the intervention. Revise the intervention as needed to account for your results.


    After the intervention is complete and you are satisfied, keep taking the Assessment on Dangerous Judgment Errors in the Workplace every quarter. Doing so will help keep up vigilance and ensure that you keep protecting yourself from the disastrous consequences of falling into dangerous judgment errors.


    Key Takeaway


    To address unconscious bias caused by cognitive biases in your workplace, you need to evaluate their impact on your own professional activities and on your team and organization. Then, make and implement a plan to address these biases.


    Questions to Consider


    • Which of the following biases most negatively impacts your workplace: the planning fallacy, the halo and horns effects, or the overconfidence effect? What does that negative impact look like?
    • What would be the benefit to you, your team, and your organization of addressing the 30 most dangerous judgment errors in the workplace?
    • How did you score on dangerous judgment errors in your workplace when you took the assessment? How do you feel about your score?


    Image credit: Flickr/Geoffrey Fairchild






    Posted in Leadership, Wise Decision Making