Social media, and particularly video-sharing platforms, have soared in popularity in the last few years. In fact, globally, research shows that video traffic now makes up 81% of consumer web traffic, compared with 72% in 2016.
However, this has meant an increase in users being exposed to harmful or inappropriate content, which is something tech companies have so far failed to deal with effectively.
Because of this, the UK regulator Ofcom has introduced new measures that will require all video-sharing platforms to take action and protect their users better.
The new rules apply to all video-sharing platforms, including TikTok, Snapchat, Twitch, and Vimeo. They say that these companies must take “appropriate measures” to ensure users are protected from any content related to child sexual abuse, terrorism, racism, or xenophobia.
Companies are also required to block any content that might impair the physical, moral, or mental development of a child under 18, as well as any other illegal content.
According to the Ofcom report, a third of users have been exposed to hateful content. Now, the regulator says it will fine any company that fails to meet these guidelines. In more serious cases, it says it could suspend the service entirely.
Under the regulations, video-sharing platforms will need to provide clear rules to users for uploading content and effectively enforce these rules.
They must also make sure reporting and complaints processes are simple and easy to follow, and that age verification will be in place to restrict children from viewing adult content.
To make sure the companies involved are taking the appropriate steps to meet these requirements, Ofcom says it will produce a new report each year covering companies that fall within UK jurisdiction, which includes most major platforms.
Following this, platforms like YouTube will be covered by the Online Safety Bill, which aims to regulate other technology companies such as Google, Facebook, and Twitter.