
The government aims to finalise mandatory age checks for social media users by Q2 2025, with platforms facing RM10 million fines for non-compliance.

KUALA LUMPUR: The government expects to finalise an age verification mechanism for social media users through subsidiary legislation under the Online Safety Act 2025 in the second quarter of this year.

Deputy Communications Minister Teo Nie Ching said the mechanism is part of ongoing efforts to protect children and teenagers from harmful online content.

She stated that the Malaysian Communications and Multimedia Commission is currently assessing the implementation of age and identity verification mechanisms.

This assessment considers account security, personal data protection, privacy and compatibility with the existing legal framework.

“MCMC is also conducting a Regulatory Sandbox process with social media platform providers to evaluate suitable technological approaches,” she told the Dewan Negara.

She was replying to a question from Senator Norhasmimi Abdul Ghani on proposals to impose stricter restrictions on social media access for children under 16.

Teo said the testing includes age verification and identity validation mechanisms, as well as the use of artificial intelligence to detect high-risk content.

“Following the assessment and the Regulatory Sandbox process, the obligation to ensure users aged 16 and below do not operate social media accounts will rest fully with platform providers under subsidiary legislation,” she said.

“Non-compliance could result in financial penalties of up to RM10 million.”

She added that the move aims to ensure social media algorithms are age-appropriate and to prevent exposure to negative content such as cyberbullying and sexual harassment.

In a supplementary reply, Teo revealed that between January 1, 2022, and February 15 this year, 1,578 requests were submitted to service providers for the removal of extremely offensive content involving children.

She said 96% of that content was successfully taken down.

To strengthen digital safety, the MCMC has also launched a public consultation from February 12 to develop a Risk Reduction Code and a Child Protection Code.

These codes are expected to be finalised after March 13.

“Our intention is to make compliance with these codes mandatory for platform providers so their algorithms are safer, and to establish faster and more effective complaint mechanisms,” she said.

The Sun Malaysia

About the Author

Danny H

Seasoned sales executive and real estate agent specializing in both condominiums and landed properties.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}