
Google says it’s fighting misinformation, but how hard?


By Mathew Ingram, CJR    Friday, February 22, 2019, 12:18    International


Google recently presented a white paper at a digital-security conference in Germany, in which the search giant detailed the steps it is taking across its various divisions—YouTube, Google News, and Google Search—to fight misinformation and disinformation. The company said it is working hard in a number of areas, including using quality signals to help surface better content, which involves relying on human search curators to determine whether something is a high-quality result for a specific query. Google also noted that it has been adding more “context” to searches, including links to related information, as well as different ways of notifying users that certain results have been fact-checked by reliable organizations. And the company said it is trying to crack down on trolls and hackers who hijack accounts or pretend to be someone they are not.

These kinds of efforts are clearly worthwhile, given the kind of influence and reach that Google products have. Facebook continues to get the bulk of the press (mostly bad) for its role in helping to weaponize misinformation networks during the 2016 election and elsewhere, but Google’s search and recommendation algorithms arguably have more impact—it’s just not as visible or as obvious as Facebook’s. The search company has taken full advantage of its somewhat lower profile whenever the topic comes up: During multiple congressional hearings into misinformation and the election, Google has pushed the idea that since it isn’t a social network, it doesn’t suffer from the same kinds of problems as Facebook, where social sharing makes misinformation go viral and algorithms exacerbate the problem.

But that’s only partly true. While Google Search may not be subject to those kinds of effects, YouTube is very much a sharing-based social network, and as a result misinformation on that platform exhibits very similar social dynamics—and very similar problems arise from the recommendation algorithms that determine what content users see once they have finished a video. In fact, a number of research and news reports have detailed just how easy it is to get sucked into a rabbit hole of conspiracy theories on YouTube, even after watching something innocuous.

According to one recent study, a significant number of users wound up being skeptical that the earth is round after watching YouTube videos, even though most of them said they were not interested in such theories before they watched. YouTube said in January that it had made changes to its recommendation algorithm to stop conspiracy theories from being promoted. Part of the challenge with getting rid of this kind of content is hinted at in Google’s security report. As the company put it, “it can be extremely difficult (or even impossible) for humans or technology to determine the veracity of, or intent behind, a given piece of content, especially when it relates to current events.” The report goes on to say: “Reasonable people can have different perspectives on the right balance between risks of harm to good faith, free expression, and the imperative to tackle disinformation.”

The argument, then, is this: Not only is it hard for even the smartest algorithms (and richest companies) to figure out what is capital-T true and what is false; there’s also the question of how much speech Google and other platforms should effectively be censoring. If a bunch of freaks on the Internet want to think that the world is flat or that 9/11 was a hoax, so what? Should their juvenile videos be removed from the Internet completely?

That’s not the only thing that makes cracking down on misinformation difficult for a platform like YouTube, however. As a recent New York Times story noted, one implication of a crackdown is that the company might actually have to start censoring—or at least down-ranking—videos from YouTube stars like Shane Dawson, who has gained tens of millions of followers in part by posting videos questioning the moon landing and other accepted historical facts. Not only would that make YouTube distinctly unpopular with heavy users who rely on it for income; it could in turn threaten the company’s own revenue stream. How strongly will Google continue to push back against misinformation when its own livelihood is at stake? For that answer, check the next white paper…
