This week the Government’s Online Safety Bill came back before the House of Commons. This was originally meant to happen before Christmas, but it was decided that the Bill needed further work to improve it.
It's a large and complex Bill which seeks to address the growing challenges posed by the internet. Like many innovations, the internet has transformed the way we live our lives. People have access to huge amounts of information; they increasingly shop online; and social media platforms have made it easier for them to connect with old friends and make new ones. But like all innovations, it has darker consequences that eventually have to be tackled through Government regulation. Social media can be exploited by extremist groups; we have seen a huge growth in so-called "fake news", lies, and misinformation; and some social media platforms are actually quite anti-social, as the anonymity they provide can bring out the worst in human behaviour. People are rude in a way that they would never be in person. Finally, social media can be an easy platform for bullying, which is a particular problem for teenagers, who spend so much of their lives on these platforms. Social media is increasingly a factor in some of the mental health challenges we see in young people, and that is why the issue needs to be addressed.
The Bill seeks to balance the importance of allowing free speech with the need to protect children and young people from harmful content. One important change that the Bill introduces is that companies like Facebook, Twitter, Snapchat, and TikTok will finally be held legally responsible for the content on their sites or face billion-pound fines. It has become far too common to read stories in the media of social media sites promoting horrific self-harming trends or other potentially dangerous activities. Under the protections introduced by the Online Safety Bill, tech companies must use age-checking measures to ensure that all users are over the age of 13, and must protect younger users from harmful content, from cyberbullying and pornography to content that encourages eating disorders or violence.
Adults will be covered by their own separate ‘triple shield’ of defence. You will be protected from posts that are illegal; from content that is prohibited by the social media companies in their own terms and conditions; and you will be given more control over the content you see on your own social media feeds. This is not about restricting the internet, but about protecting the most vulnerable in our society from genuinely harmful content.
During the pandemic, people became more dependent on social media than ever. Children were separated from friends. We humans are social creatures: friendships and the company of others are important. Forming those bonds and friendships is a vital part of growing up, whether in the formative early years as children start out as infants at primary school, or in those tricky teenage years as young people wrestle with all the insecurities and concerns that accompany that stage of life. Given that social media now plays a role in that process, it is essential that we try to make it more social and less anti-social, and this legislation is an important step.