Live-streamed murders. Terrorists recruiting new members. Hate groups organizing. Liberals and conservatives sealing themselves off in echo chambers.
With nearly 2 billion people around the world checking in monthly, it makes sense that Facebook is dealing with some very sticky issues. The social network is facing increasing pressure to address them head on.
On Thursday, CEO Mark Zuckerberg announced a new vision for the company. He’s shifting its focus from connecting individuals to building communities, namely by getting people to join more Facebook groups.
The change is summed up in the company’s new mission statement: “Give people the power to build community and bring the world closer together.”
But how many of the company’s problems can really be fixed by the new direction?
Facebook has been accused of contributing to filter bubbles — where people only see news and opinions that reinforce their existing beliefs and biases. It’s not just Facebook’s algorithms. The bubbles are created by the friends we choose to have, the people we decide to mute, and the stories we click.
Striving to get people more involved in groups could exacerbate the problem. Facebook users could end up spending more time in groups organized around a shared political view or belief.
Zuckerberg has denied that filter bubbles are widespread. He also believes membership in groups will expose people to more opinions, not fewer, by helping “people meet new people and get new perspectives and broaden their horizons.”
Related: Facebook’s global fight against fake news
On Facebook, groups can be set to Secret, meaning users don’t see them in search results. There are good reasons for secrecy — namely safety and privacy — but dangerous organizations can also use the groups as bases for recruiting new members.
Facebook recently outlined its plans to combat terrorism on the social network. It’s using artificial intelligence to scan images, posts and profiles to identify and remove bad actors. The company also employs 150 people focused on counter-terrorism.
“Terrorist recruiting. That is something that we want zero of. We try to make it as difficult as possible,” Zuckerberg said. “Even if no one reports it, we have systems that go out and try to flag that content for our community [monitors] … we’ll do more and more of that over time, as AI gets better.”
Related: How Facebook decides what violent content will be allowed
The now overused phrase fake news was originally about made-up news stories that floated around Facebook. Facebook has taken multiple steps to crack down on questionable news stories. It’s working with fact-checking organizations, hiding spammy links, and using AI to identify fake accounts spreading propaganda.
The move to a more groups-based experience for Facebook users could mean people get fewer articles from their news feed, where many publishers post directly. They might see less news overall, including fake news, or a more curated selection of stories from their groups. Facebook has not said how or if its tools for fighting fake news carry over to stories posted in groups.
The focus on groups as a positive tool with the power to change the world overlooks how people use them for negative causes. Hate groups like white power organizations use Facebook groups openly, and will continue to exist in the future. Zuckerberg has said he values free speech on the platform and Facebook only interferes if something goes “way over the line,” like bullying or the threat of real-world violence. Facebook often relies on regular people flagging objectionable content, but that’s less likely to happen in closed Facebook groups.
Related: Facebook adding 3,000 reviewers to combat violent videos
Murder, violence and self-harm
In April, a Cleveland man used Facebook to share a video of himself shooting a 74-year-old man. The video was viewable for two hours before it was taken down.
People have used Facebook Live, the company’s live video streaming tool, and regular uploads to share videos of murders, beatings, police violence and suicide. It’s a tricky issue for the company, especially when the site is used to document potential civil rights violations.
Facebook is planning on using artificial intelligence to identify violent videos early. It is also deploying 7,500 human content moderators to monitor videos as they’re flagged. As with hate speech, videos shared in private groups might evade Facebook’s moderators for longer than if they were in a news feed.
(San Francisco) First published June 22, 2017: 7:07 PM ET