Social media platforms are under immense scrutiny from both Donald Trump and Joe Biden supporters. While Trump supporters say their dissent is being stifled by the algorithms of Facebook, Twitter and YouTube, Biden supporters are unhappy that the social media giants have let misleading claims about voter fraud spread across the internet.
The 2020 U.S. presidential race has not gone smoothly. While Twitter and Facebook have, to some extent, tried to limit the reach of posts pushing unfounded claims of voter fraud, YouTube has refused to remove videos making such claims.
Multiple videos from at least nine YouTube channels promoted Trump's voter fraud claims with purported evidence that had already been debunked by the Associated Press, Reuters and other reputable news outlets. The channels included one with over 600,000 subscribers.
However, despite YouTube's rules prohibiting creators from promoting "claims that are demonstrably false and could significantly undermine participation or trust in an electoral or democratic process," the videos remained on the platform.
The content mostly concerned alleged voter fraud in the presidential election in Michigan, Pennsylvania and North Carolina. One of the channels, John Talks, which appeared to produce pro-Trump content, had five videos claiming voter fraud. In two of them, the creator alleged that Biden's camp committed voter fraud in Detroit, Michigan. The latest video contained a similar allegation about Philadelphia, Pennsylvania, where a person said he had voted twice despite not being a registered voter in the county.
Another allegation held that suitcases, wagons and coolers were used to carry ballots into a counting center in Detroit. However, three separate investigations by media outlets later found that the containers held food for election workers and camera equipment for a local TV station, Reuters reported.
YouTube told Reuters that it was reviewing content from those nine channels. While it refused to take down the videos, it said that if it found any violation, it could suspend ads and membership sales, effectively demonetizing the channels.
Another video, from the pro-Trump channel One America News Network, claimed there was widespread voter fraud and that Democrats were "boldly cheating" in the election. Although the claims in the video were debunked by multiple news outlets, it was not taken down and went on to garner over 420,000 views. YouTube said that while the claims were "demonstrably false," it would not remove the video; instead, the video was demonetized.
There were also videos on verified YouTube channels that live-streamed Election Day coverage presenting projected vote counts as final results. All of these videos were monetized.
Unlike Facebook and Twitter, which have become more vigilant in curbing misinformation during this election by taking down posts or limiting their reach, YouTube has no robust plan to do so. It will demonetize a reported video or channel over false claims, but it does not remove the viral content itself.
Before the coronavirus pandemic, YouTube had no policy banning medical misinformation. Only on May 20, when there were already millions of reported cases worldwide, did YouTube impose a ban on such content. "We're so focused on the other platforms that we don't demand the same accountability and transparency from YouTube, and nobody kicks up a fuss. That has created a blind spot in our public discourse about misinformation and disinformation and all the same content moderation issues," Harvard Law School professor Evelyn Douek told the Washington Post.
YouTube does face real constraints, though. Vast amounts of video are uploaded to its platform every minute, and unlike text, video is difficult to fact-check at scale. Hence, instead of removing content, YouTube now places a banner on such videos with links to trusted news sources.
Last month, YouTube removed content related to QAnon conspiracy theories, which claimed Trump was the savior of the American people and would bring down Satan-worshipping pedophile rings. It remains to be seen whether YouTube will take similar action to remove election-related misinformation.