
Sunday, November 20, 2016

Facebook news you can't use: The TechCrunch Sunday Snapshot


THE DAILY CRUNCH
SUNDAY, NOVEMBER 20, 2016 By Darrell Etherington

Sunday Snapshot 11/20/16

All the news that isn't fit to print: This week we saw criticism of Facebook's fake news problem come to a head, with CEO and founder Mark Zuckerberg addressing the social network's role in propagating false or misleading headlines in a lengthy post Friday evening. Zuckerberg had already made not one, but two previous personal posts about the issue, but the first dismissed the notion of FB's influence on the election as a "crazy idea," and the second barely took the issue more seriously.

The third post, however, is far less dismissive of the idea that Facebook has real influence and power when it comes to distributing media in a way that can skew opinion one way or another. The post comes after reports like this one from BuzzFeed that use hard data to show that fake news on Facebook can have an outsized influence, at least in terms of audience engagement, compared to factual stories, especially during sensitive times like the lead-up to the election.

This time, Zuckerberg went so far as to acknowledge that the spread of false stories is a difficult problem to solve, both from a technical perspective and from the perspective of having to define factuality. But the founder no longer comes off as dismissive of the issue; either he or his PR team has realized that the earlier responses, which were clearly designed to soothe critics temporarily in the hope the whole thing would blow over, won't work.

Instead, Zuckerberg now outlines a plan of action, which includes better technical detection, easier reporting tools for users, third-party verification via unaffiliated fact-checking agencies and breaking the monetization models that were helping make fake news a reliable source of income.

Facebook got rid of the editorial staff it had assigned to provide some kind of human judgement about what topics appeared in the Trending Topics news section of its website, in a story that made headlines earlier this year. The company, being led by engineers, seems to have wanted to excise any messy human element from the chain and turn things over to pure math: the algorithm, which, in theory, cannot act with moral or political bias.

The problem with that approach, and with any solution that claims to eschew human responsibility by making key choices purely technical, is that all it does is offload the human element somewhere further down the chain. Consider Uber, where the company's early claim that it was merely a technology platform provider was used to help it distance itself from responsibilities to and for individual drivers. Likewise Facebook and its news sources.

In both cases, the idea was to minimize a business risk by leaning on technology to take human decision-making out of the equation. And it works, which is why these businesses work. But the risk doesn't vanish; it's simply offset, and lands in the hands of someone else instead. For Facebook and fake news, responsibility resides with publishers and readers (where it always has, to some extent, but the means of distribution previously shared some of that responsibility).

Facebook looks to be willing to reclaim more of that responsibility, provided it continues to work on this problem in the ways outlined by Zuckerberg, as explored in detail by Kate in this TechCrunch article. But the other parts of the equation could also do with some improvement, including the least-discussed side as far as I've seen: the audience. Media literacy education could really help address the issues presented by fake news by informing potential audiences and arming them with the tools they need to spot misinformation early and easily.

Education reform is a much more tangled problem, and one that's harder to get our collective hands around, whereas Facebook has centralized control and a directly responsible individual you can ask for change. Ultimately both need addressing, but it's still imperative to hold Zuckerberg and Facebook accountable for the changes they now say they want to make.


Get more stories at techcrunch.com 

