ST Picks: How do countries deal with the scourge of youth online addiction?
- July 7, 2024
SINGAPORE – Countries across the world are creating laws to protect their young from social media and gaming addiction. In Singapore, the Government is set to announce measures to protect children from this type of online harm, following other jurisdictions that have moved to regulate online platforms. The Straits Times highlights efforts by authorities around the world to crack down on online addiction among the young.

In 2011, South Korea became the first nation to implement a “shutdown law”, which prohibited those under 16 from playing online games after midnight. But the law was abolished in 2021 to respect children’s rights and encourage at-home education, said South Korea’s Ministry of Culture, Sports and Tourism and Ministry of Gender Equality and Family. Excessive online gaming is instead managed by the country’s “choice permit” system, which lets parents and guardians arrange approved play times.

The state of New York passed a similar Bill in June to protect internet users under 18 from addictive social media feeds. The Bill will soon prohibit social media services from sending notifications to young users between midnight and 6am. It remains to be seen how this will be enforced.

China has rolled out some of the most stringent screen-time rules for children. In 2021, it limited the time minors can play online games to an hour a day, and only on Fridays, weekends and public holidays, by requiring companies to register players under their real identities and enforce time limits.

China’s online regulators in 2022 introduced rules requiring tech firms in online gaming and media to set up a “youth mode”. These settings require platforms to provide young users with a “clean” online environment, including settings that limit use, control payments and allow only age-appropriate content.

Douyin was among the platforms that launched a “youth mode” for minors. Users under 14 years old are barred from the app at night and can use the video app for only 40 minutes a day, with a five-second delay between each video. Tencent, known for its WeChat instant messaging app, cut the time children can play its popular games from 90 minutes to one hour a day, and installed measures, including facial recognition, to confirm that other users are adults.

The Chinese authorities in 2023 prohibited video game companies from incentivising excessive gaming, such as by giving players rewards for logging in daily or spending on a game.

The Belgian authorities went a step further in 2018, banning all loot boxes – randomised virtual items that players can buy with real currency – a gaming mechanism that has been likened to gambling.

Beyond gaming, under New York’s Stop Addictive Feeds Exploitation for Kids Act (the Safe Act, for short), platforms like TikTok and Instagram will soon be forced to spare young users from targeted content and advertising, in an early effort to rein in social media algorithms.

New York state has also passed laws that bar online sites from collecting and using the personal data of users under 18 without informed consent, so that children can use the internet freely without being watched. Companies cannot sell the data, or process it without the users’ permission – or their parents’ permission if the users are under 13.
“Right now, there’s nothing stopping websites or other digital services from monitoring every minute detail of our children’s online lives,” the New York State Senate wrote in its Bill, adding that such information can be exploited to manipulate users as they move into adulthood.

Britain’s Online Safety Act, which comes into force in 2025, will pin the responsibility on social media providers to protect children from harmful material.

In Australia, online safety laws that came into force in 2022 grant its eSafety Commissioner the power to order social media firms to remove flagged harmful content or face penalties. Most recently, it gave tech companies six months, until October, to pitch an enforceable code on how they will stop children from viewing pornography and other inappropriate content. The eSafety Commissioner suggested measures such as age verification and default parental controls.

Singapore’s legal measures to protect young people are similar to Britain’s. The Republic’s media regulator – the Infocomm Media Development Authority – is empowered to order tech platforms to proactively detect and take down harmful content, such as content that promotes suicide and terrorism, or be fined. Social media services must also provide tools for parents and guardians to manage what their child can encounter online and limit unwanted interactions.

On July 5, Minister for Communications and Information Josephine Teo announced upcoming measures to protect children from downloading apps that are inappropriate for their age. Health Minister Ong Ye Kung and Minister for Social and Family Development Masagos Zulkifli also raised concerns over children’s screen use in separate Facebook posts in June, on the back of research showing the negative effects of such usage and its links to worsening mental health.