- YouTube says it will try to surface more credible health information and add labels that viewers will have to click on for more details.
- The measures come as the company has faced repeated criticism for hosting misinformation about Covid-19 and vaccines during the pandemic.
- Last week, the U.S. Surgeon General blamed social media companies like YouTube for rampant misinformation that he said led to more deaths.
YouTube said Monday it will promote more credible health resources and label some videos to direct viewers away from misinformation, which has been prevalent across the service for more than a year.
The delayed action comes as the company faces questions about hosting misinformation surrounding Covid-19 and vaccines as cases and deaths begin to rise again. The U.S. Surgeon General last week called out tech companies for their role in hosting misinformation and urged companies to take a number of actions, including sharing data with researchers.
The Google-owned company said it will "highlight" more videos from authoritative sources when a user searches for specific health topics. It will also add "information panels" that display a link to credible sources recommended by the National Academy of Medicine. The panels' effectiveness will depend on viewers' willingness to click on them, however, and experts have repeatedly questioned similar tools the company added to election videos last year.
"This is our first step towards identifying and designating authoritative health sources on YouTube," stated Dr. Garth Graham, director and global head of healthcare and public health partnerships at YouTube, in Monday's blog post.
The Surgeon General's report cited a study published in the Journal of Medical Internet Research, which stated that "Social media platforms such as YouTube are hotbeds for the spread of misinformation about vaccines."
The study found that viewers "are more likely to encounter antivaccine videos through direct navigation starting from an antivaccine video than through goal-oriented browsing. In the two seed networks, provaccine videos, antivaccine videos, and videos containing health misinformation were all found to be more likely to lead to more antivaccine videos."