US Renew News: Where Facts Make a Difference

How YouTube Stokes Political Division

Technology Policy Brief #67 | By: Erik Pillar | December 29, 2021

Header photo taken from: NBC News
Photo taken from: Mozilla Foundation

Policy Summary

Social media algorithms, political echo chambers, and related forces are feeding an ever-widening gap between right-wing and left-wing political perspectives.

According to a 2020 Pew Research Center survey, the gap between the parties' opinions on topics such as the handling of Covid-19, voting rights, and even how the future would look under a Trump or a Biden presidency was wider in the U.S. than in any other nation surveyed. More recent Pew studies show that trend continuing, with widening percentage gaps on issues such as abortion, the economy, and racial disparity.

Part of the problem may lie in how we interact with social media, and how social media interacts with us. Mozilla, the foundation behind the Firefox web browser, conducted a 10-month survey, released in 2021, of YouTube users on their video-watching habits and the type of content that shows up in their recommended feeds.

The study, called YouTube Regrets (named for instances when users regret watching content YouTube recommended to them), found that 71% of all regretted content reported came from recommended video links embedded in the content users were watching.

“YouTube needs to admit their algorithm is designed in a way that harms and misinforms people,” says Brandi Geurkink, a Mozilla Senior Manager of Advocacy, of the study's findings. “Research confirms that YouTube not only hosts, but actively recommends videos that violate its own policies.”

YouTube’s community guidelines and policies on harmful content can be found here: https://support.google.com/youtube/answer/2801964?hl=en

Several examples from the study highlight the issue of unwanted and potentially harmful recommended content. One user reported watching a video about the U.S. military and then being recommended a video titled “Man humiliates feminist in viral video.” Another watched several Art Garfunkel music videos and was shown a recommended video called “Trump Debate Moderator EXPOSED as having Deep Democrat Ties, Media Bias Reaches BREAKING point.”

According to an article from The Atlantic, in a cited Pew survey, 64% of all channel and video recommendations were to videos with more than a million views. Only about 5% of recommendations were to videos with fewer than 50,000 views. YouTube also heavily favors fresh content over older videos, and is likely to favor channels or videos from creators a user has previously watched that fall within the above parameters.

For example, were a Democratic-leaning user to search for information on traditionally left-leaning topics, they would most likely be directed to the channels or videos on the subject with the highest view counts and the most frequent uploads.

One such channel might be the left-leaning YouTube news network The Young Turks. It is one of the most-watched channels covering left-wing topics and stances, and it follows the daily-release format YouTube favors. Once a user watches one of its videos, YouTube seeks to keep the user engaged with the site: the next recommendation is likely to be another Young Turks video or, failing that, a similar network with similar views.

Thus, while a user may start out with a deliberate search for information and initially encounter a range of perspectives, they soon become inundated with recommendations that skew toward their political bias and lose easy access to opposing points of view.

The same can be said for users, like those in YouTube Regrets, who did not seek out or search for the content in the first place. Because YouTube promotes highly watched content regardless of its relation to what a user is looking for, a user can end up with content they never sought.
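The bias described above can be sketched as a toy scoring function. To be clear, this is an illustration only: the weights, exponents, and function names below are invented for this brief and do not represent YouTube's actual, proprietary algorithm. The sketch simply shows how compounding boosts for popularity, freshness, and familiarity make already-large, frequently updating, previously watched channels dominate recommendations.

```python
def recommendation_score(views, days_old, channel_watched_before):
    """Toy heuristic: favor popular, fresh videos from familiar channels.

    All weights and exponents here are invented for illustration;
    this is not YouTube's actual algorithm.
    """
    popularity = views ** 0.5             # diminishing returns, but big channels still dominate
    freshness = 1.0 / (1.0 + days_old)    # newer uploads score much higher
    familiarity = 2.0 if channel_watched_before else 1.0  # boost channels the user already watches
    return popularity * freshness * familiarity

# Two hypothetical candidates for the recommendation slot.
videos = [
    {"title": "daily viral upload", "views": 2_000_000, "days_old": 1, "seen": True},
    {"title": "small creator essay", "views": 40_000, "days_old": 30, "seen": False},
]

ranked = sorted(
    videos,
    key=lambda v: recommendation_score(v["views"], v["days_old"], v["seen"]),
    reverse=True,
)
print(ranked[0]["title"])  # the popular, fresh, familiar video wins the slot
```

Because the three boosts multiply rather than add, the gap between a large daily channel and a small infrequent one is enormous, which mirrors the Pew finding above that recommendations cluster overwhelmingly on million-view videos.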

Policy Analysis

While the reasoning behind YouTube's recommendation algorithm is understandable (it is designed to keep a user engaged for the maximum amount of time, increasing ad revenue), it makes for problematic discourse on public policy issues.

Very rarely is one side of an argument all right or all wrong. When the public gets its news from a single source, however, you quickly end up with a population that sees and identifies with only that source. The Pew surveys show a country more at odds with itself than nearly any other short of outright war.

Adding to the issue is the nature of the sources that sites like YouTube funnel users toward. Creators on YouTube are businesses, especially when they provide a service such as news or political commentary, and they seek to maximize revenue just as YouTube does with its recommendations.

A quick YouTube search for left- and right-leaning commentators pulls up many varied creators, but some stand out. One is the previously discussed Young Turks; another is The Daily Wire. The opinions voiced by creators in these networks could not be more different, but their methods for engaging users are very similar.

Both channels adhere to YouTube's system of daily releases, both draw large view counts, and both have a multitude of smaller creators and channels under them to direct user traffic to once an initial video is watched, which YouTube is more than happy to do, as it keeps the user engaged.

Photo taken from: audiencegain.net


In addition, both channels monetize their services through subscriptions, creating a direct monetary link between channel and user, and go further, selling merchandise and goods attacking opponents or supporting those they agree with.

While part of the issue is the way these networks seek to retain and monetize users, the deeper problem is that the platform itself, YouTube, directs users to such content and then keeps them watching it.

A user may become hooked by an unwanted recommendation, then, having watched it, be shown more of the same, and within a very short time see mostly that content. This is a problem, but not one easily solved.

Free-speech protections mean that most content, short of incitement or hate speech, is legal and cannot easily be restricted. Moreover, where does one draw the line on what should or should not be allowed? The goal of a business is to make money. When a business earns money through legal methods, even at a cost to public discourse, should it be prevented from doing so?

The answer is not yet known, as politicians and policymakers continue to battle over what it means to be a publisher versus a platform, and over what is or is not allowed.

One fact is clear: the political divisions in the US are deepening, exacerbated by the business models of social media providers such as YouTube.
