Will Facebook start letting users know when they read or share stories that turn out to be fake news?
Facebook vice president of News Feed Adam Mosseri, speaking on a panel at the University of California, Berkeley, Thursday night, mentioned doing just that as a possible solution, Megan Rose Dickey of TechCrunch reported.
Mosseri said during the panel, as reported by Dickey:
You want to make sure as little comes in the system as possible, and when it happens, you need to react as quickly as you can. And if you didn’t find it until later, then you need to consider letting people know. The question is who and how. I don’t know if we’ll do that, but it’s certainly something we’re considering.
He also addressed the impact of fake news on the 2016 U.S. presidential election, according to Dickey, saying:
In terms of how much we’ve seen, we actually haven’t seen a ton of increase around the election. The amount of fake news on the platform, actually, and I’m not trying to diminish the importance of the issue, is relatively small. It’s a very small percentage of what people see. It should be smaller. It should get as close to zero as possible.
And on publishers that distribute fake news, Mosseri said, according to Dickey:
We need to do what we can to reduce the distribution that fake news publishers get as close as we can to zero. That’s kind of what we started to do in December, and we have more work to do.
Readers: What are your thoughts on the idea of Facebook letting users know when they have read or shared fake news?
Image courtesy of Shutterstock.