YouTube puts mail-in voting information next to videos on the topic


YouTube is adding an information panel to videos about mail-in voting.

Angela Lang / CNET

YouTube said Thursday it would give people information about mail-in voting when they watch videos discussing the issue. The voting method has been fraught with misinformation, as President Donald Trump has tried to discredit the process without providing evidence of security flaws in the well-established system.

To give people more context, YouTube will add a text box to videos discussing voting by mail, pointing to information from the Bipartisan Policy Center, a Washington, DC-based think tank.

“Mail-in ballots that meet the eligibility and validity requirements are counted in every election,” says the page YouTube users see when they click the link. “The law requires that all valid votes be counted in every election, regardless of how they are cast.”

YouTube, owned by Google, isn’t the only tech giant trying to curb misinformation related to mail-in voting. Facebook and Twitter have both flagged Trump’s posts on the topic. Earlier this month, the two social networks added labels to a post by the president that potentially encouraged people to vote twice if they thought their mail-in ballot hadn’t been counted.

Thursday’s announcement comes as Silicon Valley companies try to prove they can avoid the pitfalls they encountered in 2016. That election was marred by interference from Russia, which used platforms from Google, Facebook and Twitter to influence the outcome of the contest.

Google announced earlier this month that it would block autocomplete suggestions on its search engine related to questions about voting procedures or donations to candidates. Last month, YouTube said it was banning videos containing information obtained through hacking that could interfere with elections or the census.

YouTube first introduced information panels two years ago: short blurbs that appear under false or misleading videos and are intended to counter misinformation by linking to authoritative sources. Since then, the company has added the panels to videos about COVID-19, the moon landing and other topics rife with conspiracy theories.

The panels have not always worked as planned. When Notre Dame Cathedral in Paris went up in flames last April, YouTube’s algorithm inadvertently displayed an information panel about the September 11 terrorist attacks because the software made a mistake while analyzing the images in the video. Following the fire, YouTube said its systems made the “wrong call.”
