
Act 866 to require platforms to block under-16 users; non-compliance may incur RM10mil penalty.

PETALING JAYA: The federal government is moving toward operationalising age-verification requirements for social media platforms under the Online Safety Act 2025, with fines of up to RM10 million for non-compliance once subsidiary rules are enforced.

Deputy Communications Minister Teo Nie Ching told the Dewan Negara that the requirement for social media platforms to prevent users aged 16 and below from owning accounts will be introduced through subsidiary legislation under the Online Safety Act 2025 (Act 866).

“The obligation to ensure users aged 16 and below do not use or own social media accounts will be imposed on social media platform providers based on the relevant subsidiary instrument under Act 866.

“Failure to comply would constitute an offence and may be subject to financial penalties of up to RM10 million,” Teo said.

Teo was responding to Senator Norhasmimi Abdul Ghani, who sought to know whether the government plans to impose stricter controls on social media access for children under 16 and strengthen enforcement against harmful online content.

Teo further clarified that the government, through the Malaysian Communications and Multimedia Commission (MCMC), is currently evaluating age and identity verification mechanisms as part of Act 866, which came into force on Jan 1.

“At present, among MCMC’s main focuses is to evaluate the implementation of age and identity verification mechanisms as part of the online safety framework under Act 866.

“The method of age verification will be finalised through the relevant statutory instrument under Act 866 in the second quarter of 2026.”

Teo said the government is running a pilot testing programme (regulatory sandbox) with social media providers to assess suitable technological approaches, including the use of artificial intelligence (AI).

“MCMC is collaborating with social media platform providers to assess appropriate technological approaches for age verification, entity authentication and the use of AI to detect high-risk content, as well as faster and more effective complaint-handling mechanisms.”

When pressed further by Senator Dr Wan Martina Wan Yusoff, who asked about action against platforms that fail to filter harmful content, Teo said the government is developing mandatory codes to strengthen “safety by design” obligations.

“We are now in the process of developing various codes to ensure it becomes an obligation for platform providers to make their content filtering more age-appropriate… so that platforms carry responsibility to ensure their users are safe on their platforms.”

Teo added that public consultation on the proposed risk mitigation and child protection codes began on Feb 12 and will run until March 13, stressing that a whole-of-society approach remains critical to strengthening online safety for minors.

The Sun Malaysia


About the Author

Danny H

Seasoned sales executive and real estate agent specializing in both condominiums and landed properties.
