I watched the January 6 storming of the US Capitol building with my heart in my throat and my iPhone in my hand. In the moments I could tear my eyes away from the TV, I was messaging my group chat about what we were collectively witnessing. But when one of my friends shared a tweet accusing the Capitol Police of letting the insurrectionists into the building, I paused. And I encouraged her to wait.
This reaction is a far cry from what I would have done if presented with the same situation five years — or even one year — ago. But in the past year I, like many of us, have done a lot of thinking about online misinformation. Where it starts. How it spreads. And what it does to us, both as individuals and as a society.
The story of the Capitol Police on January 6 is a great example of why we shouldn’t automatically click that “retweet” or “share” button when something enraging crosses our screens. Like so many other important stories, the truth was more complicated than the image of some police officers opening the barriers. Or the video of Officer Eugene Goodman distracting insurrectionists and leading them away from the lawmakers. Or the shooting of Ashli Babbitt as she attempted to breach the Speaker’s Lobby.
It is more complicated because it is all of those things — and more. Because stories and truth are complicated and complex and very, very rarely can be captured in 280 characters or a headline written for maximum clicks. (Or, for that matter, one 500-word article.) And when emotions are high — like when, for example, your Capitol building is being taken by force and you may or may not be witnessing a military coup on live TV — we can’t trust our brains to be rational.
Think about it this way: When you’re really angry, how well do arguments with your significant other go? Probably not too well, right? That’s because when our emotions are heightened, a whole bunch of things happen inside our bodies that make it difficult to be rational: Adrenaline and cortisol surge, and our “fight or flight” instinct kicks in. Our bodies are primed to react, not analyze.
While one retweet or share or reactive comment might not feel like much in the grand scheme, social media is designed to amplify the things that are being shared. To borrow a metaphor from Jon Ronson’s book So You’ve Been Publicly Shamed, that means your post might be one snowflake, but it’s going to join up with a couple more snowflakes. Then a couple more, then a couple more, then a couple more, until they cause an avalanche. And, in the case of January 6, the snowflakes were misinformation about the election — and the avalanche was a deadly attack on the US Capitol.
My experience at home on the day of the attack highlights another important fact about misinformation: It’s spread by people on both the left and the right. No matter how much we might want to demonize the other side, we’re all a part of a system (social media) that rewards the reactive share. And we’re all human beings, which means we all have endocrine systems that get fired up, which means we’re all going to be that snowflake at least occasionally.
But, luckily, being human means we’re also able to retrain ourselves into better behavior. We don’t have to be reactive. Instead, we can be thoughtful and analytical and caring. We can take the time to read and think before we share. And if everyone on every part of the political spectrum did this even just occasionally? The internet and the world would undeniably be better places.
On the Media has a great resource called the Breaking News Consumer’s Handbook that helps people analyze not only specific current events, but also news in general. So to help everyone on their path toward being compassionate, thoughtful digital citizens, here are some of their tips on how to make sure you’re not spreading misinformation.
Do a gut check
Misinformation is often formatted to get the maximum reaction. That’s how it spreads: It activates your endocrine system and gets you to push that share button. So, On the Media says, “If a story makes you angry, it’s probably designed that way.” And that’s a good sign that it’s either untrue or not the full story.
Read more than the headline
Headlines are designed to get you to click — not to give you a full story. As a result, they’ll often focus on the most attention-grabbing or salacious part of the story. (Trust me: I’ve been writing them for more than a decade.) So while it’s tempting to read a headline and move along, know that it’s not giving you all of the information you need to really know what’s going on.
Search for other sources for verification
When you see something that gets your back up, search for other forms of verification. Fact check it. See if a reputable news source has covered it. Basically, don’t just take the tweet/Facebook post/random article at face value. Use your critical thinking skills and do five minutes of internet sleuthing to draw your own conclusions.
If you’re unsure, don’t share it
Just don’t! There’s no need. You’re good.
*** This is a Security Bloggers Network syndicated blog from Blog | Avast EN authored by Avast Blog. Read the original post at: http://blog.avast.com/why-we-spread-misinformation-avast