Kuala Lumpur: Malaysia has called on TikTok to implement stricter age verification measures following concerns over the impact of social media on children’s mental health. The request comes after Malaysian authorities summoned TikTok’s top management to accelerate efforts in curbing harmful content on the platform.
Communications Minister Fahmi Fadzil said he is dissatisfied with TikTok’s current measures but has given the company an opportunity to work with authorities to resolve the issue. He added that an age verification mechanism needs to be studied in collaboration with the Malaysian Communications and Multimedia Commission and law enforcement.
Since January, Malaysia has required social media platforms and messaging services with over 8 million users to obtain a licence. Platforms failing to comply with regulations could face penalties, particularly if they allow content that promotes online gambling, scams, child exploitation, cyberbullying, or sensitive material related to race, religion, or royalty.

TikTok, owned by China’s ByteDance, did not immediately respond to requests for comment. Malaysian authorities have also indicated plans to summon representatives from X and from Meta Platforms, which owns Facebook, WhatsApp, and Instagram, to discuss similar concerns.
The move aligns with global efforts to protect children online. Australia banned children under 16 from using social media last year, and the UK requires pornography sites and other platforms hosting harmful content to verify users’ ages. Meanwhile, France, Spain, Italy, Denmark, and Greece are jointly testing a template for an age verification app intended to shield minors from unsafe content.
Malaysia’s push for age verification reflects growing scrutiny of online content that could negatively affect children. The authorities aim to balance platform accessibility with the safety and mental well-being of young users, sending a clear message to social media companies about their responsibility to minors.