New York (CNN) —
YouTube is cracking down on content related to gambling as sports betting and other online prediction markets have taken off in the United States.
The platform announced Tuesday it will no longer allow content that directs users to “unapproved” gambling websites through links, images, text, logos or verbal references. YouTube defines unapproved gambling sites as those that don’t meet local legal requirements and haven’t been reviewed by YouTube or parent company Google.
The update builds on YouTube’s existing policy prohibiting linking to external sites that violate its rules, including unapproved gambling sites.
“We’ve strengthened our policies that prohibit content directing viewers to unapproved gambling websites or applications,” YouTube spokesperson Boot Bullwinkle told CNN exclusively ahead of the Tuesday announcement. “We will also begin age-restricting content that promotes online casinos.”
With the update, users under 18 and users who are not logged in will not be able to view content that depicts or promotes online betting sites.
Online sports betting has become a massive business since the Supreme Court cleared the way for states to legalize sports gambling in 2018. With its rise, other forms of online betting, such as wagering on the outcome of elections, have also grown popular.
YouTube videos that promise to teach viewers how to make money on online sports and prediction markets have been viewed hundreds of thousands of times.
But regulations around online betting can vary by location, and experts have raised alarms that millions of Americans may be at risk of a “severe gambling problem.”
YouTube says it has long prohibited content with “sensational language” promising guaranteed winnings or loss recovery from online betting sites. But it has updated its language to note that content promising guaranteed returns — even from approved sites — will now be removed.
Tuesday’s announcement marks the latest adjustment to YouTube’s content moderation policies in recent years. The platform has taken a targeted approach to restricting certain kinds of content, for example by removing videos with false claims about vaccines and abortion, or those featuring eating disorder-related behaviors such as extreme calorie counting.
In 2023, YouTube also began requiring users to disclose certain artificial intelligence-generated content that could mislead viewers.
For many social media platforms, however, the challenge historically has not been introducing content moderation policies but enforcing them, with tech giants repeatedly coming under fire for failing to adequately uphold their own rules. YouTube said it will update its policy Tuesday and begin enforcing the new guidelines on March 19.