Study: Facebook can actually make us more narrow-minded

If that sounds bleak, that’s because it kind of is.

“Our findings show that users mostly tend to select and share content related to a specific narrative and to ignore the rest. In particular, we show that social homogeneity is the primary driver of content diffusion, and one frequent result is the formation of homogeneous, polarized clusters,” the paper concludes.

In other words, you and all of your friends are sharing the same stuff, even if it’s bunk, because you think alike and your tightly defined exchange of ideas doesn’t allow for anything new or challenging to flow in.

What this means for “fake news”

Alessandro Bessi, a postdoctoral researcher at the Information Sciences Institute at the University of Southern California, co-authored the paper. He says the point of the study was to investigate how and why misinformation spreads online.

He says the team got interested in the phenomenon after the World Economic Forum listed massive digital misinformation among the main threats to modern society.

“Our analysis showed that two well-shaped, highly segregated, and mostly non-interacting communities exist around scientific and conspiracy-like topics,” Bessi said. “Users show a tendency to search for, interpret, and recall information that confirms their pre-existing beliefs.” That tendency is called “confirmation bias,” and Bessi says it’s actually one of the main motivations for sharing content.

So instead of sharing to challenge or inform, social media users are more likely to share an idea already commonly accepted in their social groups for the purpose of reinforcement or agreement. That means misinformation — which is a much more appropriate term than “fake news” — can rattle around unchecked.

“Indeed, we found that conspiracy-like claims spread only inside the echo chambers of users that usually support alternative sources of information and distrust official and mainstream news,” Bessi says.

What can we do about it?

Even if you pride yourself on avoiding misinformation and think you’re having open, accepting conversations online, Bessi cautions that we’re all subject to confirmation bias on some level.

“If we see something that confirms our ideas, we are prone to like and share it. Moreover, we have limited cognitive resources, limited attention, and a limited amount of time.”

That can lead to reckless sharing — we sometimes share something without really examining what it is.

“For example, I may share content just because it has been published by a friend that I trust and whose opinions are close to mine,” Bessi says.

In the future, Bessi says, there may be programs or algorithms that can help clean up misinformation. For now, he recommends a more analog approach: Do your own fact-checking — and soul-searching — before you share.
