Research has shown that social media sites like Facebook, Instagram, and YouTube can, if used excessively, cause mental health issues in children and teenagers. And with more young people using these platforms than ever before, is it time the companies behind them did more to protect users from the dangers?
According to MPs in the UK, social media giants like Facebook, Instagram, and YouTube should have a “legal duty of care” when it comes to protecting their younger members from the possibility of mental health issues.
This follows a report by the Commons Science and Technology Committee, in which it was found that the safety of young users wasn’t guaranteed. The committee referred to the regulations as a “patchwork” and “lottery” of standards, which could be leading to problems with body image and sleep, as well as leaving young users vulnerable to bullying.
Because of these findings, the committee has advised the government to revisit the current legislation. Its recommendations include requiring companies to share data to protect those at higher risk, and setting a goal of halving the number of cases of abuse within two years.
When commenting on the paper, the chair of the committee, Norman Lamb, said: “Worryingly, social media companies – who have a clear responsibility towards particularly young users – seem to be in no rush to share vital data with academics that could help tackle the very real harms our young people face in the virtual world.”
A spokesman for the Department for Digital, Culture, Media and Sport, added: “We have heard calls for an Internet Regulator and to place a statutory ‘duty of care’ on platforms, and are seriously considering all options. Social media companies clearly need to do more to ensure they are not promoting harmful content to vulnerable people. Our forthcoming white paper will set out their responsibilities, how they should be met and what should happen if they are not.”