
Finally, YouTube Will Stop Recommending “Borderline Content”

In a much-awaited move, YouTube, the popular user-generated video platform, will no longer recommend “borderline content” or conspiracy theories, such as claims that the moon landing was faked or that the earth is flat.

On Friday, the Google-owned video platform announced the significant content-policy change, responding to long-standing user criticism of “clickbaity videos with misleading titles and descriptions (‘You won’t believe what happens next!’),” the blog post noted.

Ongoing Improvement of Recommendations

According to the official blog, YouTube’s recommendation system now prioritizes “viewer satisfaction” over raw views, “including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often.”

However, these recommendations have also irked users. The current changes follow numerous complaints from people who received streams of similar recommendations after a single view.

“We now pull in recommendations from a wider set of topics—on any given day, more than 200 million videos are recommended on the homepage alone. In fact, in the last year alone, we’ve made hundreds of changes to improve the quality of recommendations for users on YouTube.”

Watching Out for Violators of Community Guidelines

Further, the platform will sharpen its focus on “how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines.”

In fact, it will cease to recommend non-factual, “borderline content” such as videos touting phony miracle cures for cancer and similar uploads.

Noting that only about 1 percent of YouTube’s audience would actually view “borderline content,” the company vowed not to impose a blanket ban on such posts.

Holding up its Community Guidelines as the ultimate judge of content quality, YouTube reiterated that “borderline content” will remain available to those who want to watch it; henceforth, however, its recommendation algorithm will not surface such videos. Subscribers to such channels will also continue to see those videos in search results.

YouTube thus squarely leaves the responsibility for viewing content with the user community. Its only restraint is that it will not surface “borderline content” to users who have shown no interest in it, unless their viewing patterns change.

A full-fledged program is already in place in the United States, where human evaluators trained on public guidelines provide input used to train the machine-learning systems that generate recommendations.
