When we listened to the You Are Not So Smart podcast this semester, it was eye-opening and introduced a topic I had never thought about; it challenged my very way of thinking. The episode we listened to discussed the “backfire effect,” which is essentially how people react when information they initially believed to be true is later shown to be false. I learned that people will accept information that aligns with their beliefs much more quickly than they will believe information that goes against them. I began thinking of one of my close friends, who does exactly this, particularly with politics, and it made me wonder: why does he cling to his beliefs even when one of them is proven wrong? I learned during the podcast that around 15% of negative information causes people to fear that they are wrong about their candidate, and leads them to hold onto their opinions as if they were their own flesh and blood. This explanation of why people cling to their beliefs resonated with me. I began to reflect on how I hold onto some of my own beliefs because I am afraid of being wrong, and nobody wants to be wrong.
I can see this problem relating to the “white-savior industrial complex” we talked about during our many discussions of medical voluntourism. We discussed how groups will go into these third-world communities with the mindset that they are going to fix all of their problems. In my opinion, one can very easily see that they most certainly will not be able to magically solve these problems. However, people who go into these countries to provide assistance can experience the backfire effect after realizing that they cannot save everyone. Instead of letting go of that belief, they cling to it and try even harder to accomplish their unreachable goals, driven by the fear of failure and of being wrong.