New rules on the hugely popular TikTok app mean under-16s will no longer be allowed to send or receive direct messages.
It is the first time a major social-media platform has blocked private messaging by teenagers on a global scale.
A survey by UK regulator Ofcom suggested TikTok was used by 13% of 12- to 15-year-olds last year.
Critics say the new rules will not stop children lying about their age online.
Until now, all users have been able to send direct messages to other users when both accounts follow each other.
The change means those under the age of 16 will no longer be able to communicate privately on the platform under any circumstances.
They will still be able to post publicly in the comments sections of videos.
TikTok says those affected will receive an in-app notification soon and will lose access to direct messages on 30 April.
The limit is based on the date of birth added to the account when it is created – but no verification takes place, so the system relies on trust.
In 2018, Facebook introduced rules making WhatsApp available to over-16s only across the EU, to comply with the bloc's General Data Protection Regulation.
“The interesting thing here is that TikTok’s biggest group of users are teenagers,” said social-media consultant Matt Navarra.
“This restriction will impact a large number of their core demographic.
“Also, blocking use of a core feature such as messaging between its biggest sub-set of users is a bold move.”
NSPCC child safety online policy head Andy Burrows said: “This is a bold move by TikTok as we know that groomers use direct messaging to cast the net widely and contact large numbers of children.
“Offenders are taking advantage of the current climate to target children spending more time online.
“But this shows proactive steps can be taken to make sites safer and frustrate groomers from being able to exploit unsafe design choices.
“It’s time tech firms did more to identify which of their users are children and make sure they are given the safest accounts by default.”