Instagram boosts safety features for teenage users

Instagram now prevents adults from sending messages to people under 18 who don’t follow them, and is urging teens to be more cautious about interactions in DMs

  • Adults won’t be able to message teens who don’t already follow them on the app
  • Instagram is also making it more difficult for adults to find and follow teenagers
  • It’s working to make sure teens don’t lie about their age to set up an account 
  • Its terms of service require all users to be at least 13 years old to have an account

Instagram is to restrict the ability of adults to contact teenagers who do not follow them on the platform as part of new safety tools aimed at protecting younger users.

Under the new rules, adults will be blocked from sending a direct message (DM) to any Instagram user under 18 if that account does not already follow them.

It will also begin sending safety alerts to users aged under 18, encouraging them to be cautious in conversations with adults they are already connected to who have exhibited potentially suspicious behaviour.

This could include, for example, sending a large number of friend or message requests to teenage users.


In addition, Instagram is making it more difficult for adults to find and follow teenagers on the app.

It’s restricting teen accounts from appearing in the Suggested Users section of the app and hiding content from teen users in both Reels and Explore.

Young users are also being encouraged to make their accounts private.

Instagram also revealed that it is developing new artificial intelligence and machine learning technology to help it better identify the real age of younger users.

The Facebook-owned photo sharing app acknowledged that some young people were lying about how old they were in order to access the platform.

Its terms of service require all users to be at least 13 years old to have an account.

‘Protecting young people on Instagram is important to us,’ the social media giant said in a blog post on Tuesday. 

‘Today, we’re sharing updates on new features and resources as part of our ongoing efforts to keep our youngest community members safe.

‘We believe that everyone should have a safe and supportive experience on Instagram. 

‘These updates are a part of our ongoing efforts to protect young people, and our specialist teams will continue to invest in new interventions that further limit inappropriate interactions between adults and teens.’

The online safety of teenagers using social media has been a key issue for technology firms for some time.

Companies are under continued scrutiny in the wake of repeated warnings from industry experts and campaigners over the dangers for young people online.

The government is set to introduce an Online Safety Bill later this year, which is expected to bring stricter regulation around protecting young people online and harsh punishments, overseen by the regulator Ofcom, for platforms found to be failing in a duty of care.

Tech firms could be fined millions or blocked if they fail to protect users under a new bill that has sparked fears curbs may be used to limit free speech 

Tech firms face fines of up to 10 per cent of their turnover if they fail to protect online users from harm.

Under a new online harms bill announced in December 2020, businesses will have a new ‘duty of care’ to protect children from cyberbullying, grooming and pornography.

Larger web companies such as Facebook, TikTok, Instagram and Twitter that fail to remove harmful content such as child sexual abuse, terrorist material and suicide content could face huge fines – or even have their sites blocked in the UK.

They could also be punished if they fail to prove they are doing all they can to tackle dangerous disinformation about coronavirus vaccines.

Ministers say that, as a last resort, senior managers could be held criminally liable for serious failings – although that law would only be brought in if other measures are shown not to work.

Oliver Dowden, the Culture Secretary, said at the time: ‘Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation. 

‘We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.’
