Young people who use social media heavily are more likely to report low life satisfaction and dissatisfaction with their appearance, school and family life than peers who spend less time online.
A report by the Children’s Society last year found that 7% of children aged 10 to 15 in the UK – or an estimated 306,000 young people – feel dissatisfied with their lives, and social media is considered a contributing factor. “Social media gives people the opportunity to be mean,” Yara, 16, told the charity. “They can be so cruel.”
But despite the link between social media use and poor mental health, more children than ever are signing up for their own online accounts. Communications regulator Ofcom’s 2020/21 report “Children and Parents: Media Use and Attitudes” found that 87% of children aged 12 to 15 use social media sites or apps, as do 42% of children aged 5 to 12.
What are the age restrictions?
The fact that nearly half of children aged 5 to 12 use social media sites or apps is significant, as the minimum age limit for most social media platforms is 13.
Instagram, Twitter, TikTok, Snapchat and Facebook all require users to be at least 13 years old to create an account. WhatsApp users in the European Economic Area must be at least 16 to register. YouTube likewise states that users must be at least 13 to use its service, although “Children of all ages may use the service and YouTube Kids (if available) if permitted by a parent or legal guardian.”
When Ofcom researchers asked parents of children aged 5 to 15 whether they were aware of these minimum age requirements, nearly 90% said they were. But fewer than four in ten could say exactly what those age requirements were.
Three out of 10 parents of children under 13 surveyed by Ofcom “said they would allow their child to use social media despite age restrictions.”
Are these age limits effective?
Despite TikTok’s minimum age requirement, another Ofcom report released earlier this year found that about 16% of three- and four-year-olds have viewed content on the video-sharing platform, reports The Guardian. The regulator described the trend as “the uprising of TikTots” – in other words, “kids ignoring age restrictions to use social platforms.”
One 12-year-old respondent told Ofcom that “for TikTok and Snapchat, I think I put in a fake birthday because I was allowed to.”
In response to Ofcom’s 2022 report, a TikTok spokesperson said: “TikTok is strictly a platform for persons over the age of 13, and we have processes in place to enforce our minimum age requirements, both at the time of registration and through the ongoing proactive removal of suspected underage accounts from the platform.”
The report also highlighted the issue of “Finstas” – fake Instagram accounts that young people use to hide parts of their online lives from their parents. Ofcom researchers estimate that two-thirds of children between the ages of eight and 11 have multiple Instagram accounts, and nearly half keep an account intended only for their family to see.
What are child protection platforms doing?
Last year, the NSPCC commended TikTok for restricting direct messaging for accounts owned by 16- and 17-year-old users, and for actively prompting users under 16 to decide whether their videos should be visible to their followers, their friends, or only themselves.
The platform also announced that it will no longer send push notifications after 21:00 to users aged 13 to 15 and after 22:00 to users aged 16 to 17. “We want to help our younger teens, in particular, develop positive digital habits early on,” a TikTok spokesperson said.
Also in 2021, Meta (then known as Facebook) announced what The Guardian called “drastic changes” to Instagram, which included making accounts private by default for users under 16 – ensuring that children would only share content publicly if they actively went into the settings and changed their privacy options accordingly.
YouTube then announced a “surprisingly similar” set of changes that updated the default privacy settings for users under 18.