Key Child Protection Laws to Watch - Compliance Guidance for Developers

The internet offers numerous learning opportunities, but much of its content is geared towards adults, and children can easily access inappropriate material. With an estimated one in three internet users under 18, many children are routinely exposed to online environments designed for adults, and there is a growing consensus that more must be done to ensure age-restricted access to goods and services.

To address this, legislation is being introduced around the world to improve online safety for minors, including the UK's Online Safety Bill, the EU's Digital Services Act, and the California Age-Appropriate Design Code Act. Some companies, such as SuperAwesome, are already exploring age assurance options to obtain parental consent and protect children. Many organizations, however, have yet to grasp the scale of the issue or begin addressing the four categories of online risk to children: content, contact, contract, and conduct.

The Age Appropriate Design Code, issued by the UK Information Commissioner's Office and fully in force since 2021, requires online services to treat the best interests of the child as a primary consideration. It focuses on data processing, requiring high privacy settings by default and minimal data collection, and it also considers how products and features can harm children, such as private chat functionality and livestreaming. Jurisdictions including Australia, Singapore, India, and Canada are pursuing similar child-safety measures.

The California Age-Appropriate Design Code Act, modelled on the UK's Code, places legal obligations on companies offering online services likely to be accessed by children under 18, requiring them to undertake a Data Protection Impact Assessment. The assessment must address questions such as whether the design of a feature harms children or exposes them to exploitation. Non-compliance carries serious penalties: fines of up to $2,500 per affected child for negligent violations and up to $7,500 per affected child for intentional ones.
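The Code's "high privacy by default" principle translates directly into how developers initialize account settings. The sketch below illustrates one way to do this; it is a minimal, hypothetical example (none of these names come from a real SDK or from the Code itself), assuming a service that already has a signal about whether a user is, or may be, under 18.

```python
from dataclasses import dataclass

# Hypothetical sketch of "high privacy settings by default" under the
# Age Appropriate Design Code. All names are illustrative, not a real API.

@dataclass
class PrivacySettings:
    profile_visible: bool
    private_chat_enabled: bool      # flagged by the Code as potentially harmful
    livestreaming_enabled: bool     # likewise flagged by the Code
    geolocation_shared: bool
    personalised_ads: bool

def default_settings(likely_minor: bool) -> PrivacySettings:
    """Return the highest-privacy defaults when a user is, or may be, a child."""
    if likely_minor:
        # Risky features and data sharing stay off unless explicitly enabled
        # later (e.g. with verified parental consent).
        return PrivacySettings(
            profile_visible=False,
            private_chat_enabled=False,
            livestreaming_enabled=False,
            geolocation_shared=False,
            personalised_ads=False,
        )
    # Adults may still get privacy-protective defaults for some settings.
    return PrivacySettings(
        profile_visible=True,
        private_chat_enabled=True,
        livestreaming_enabled=True,
        geolocation_shared=False,
        personalised_ads=True,
    )

settings = default_settings(likely_minor=True)
assert not settings.private_chat_enabled and not settings.livestreaming_enabled
```

The key design choice is that the safe state is the default and requires no action from the child; enabling a risky feature becomes a deliberate, auditable step rather than an opt-out buried in settings.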
The UK's Online Safety Bill will impose extensive duties on regulated companies to protect users, especially children, from harmful content. It will usher in a new era of online safety, regulation, and accountability, with potential fines of up to £18 million or 10% of annual global revenue, whichever is greater, for non-compliance.

In Europe, the Audiovisual Media Services Directive, the Digital Services Act, and the Digital Markets Act all reference age assurance and age appropriateness, and the EU's Better Internet for Kids (BIK+) strategy, adopted alongside the Digital Services Act, sets out plans for a code of conduct on age-appropriate design.

To comply with these regulations, game developers must know the age or age range of their users and adapt their content to deliver age-appropriate experiences. This may mean age-appropriate prompts and explanations, high privacy settings by default, and voice chat turned off by default.

Age assurance techniques, such as facial age estimation and digital ID apps, can establish the likelihood of a user's age without verifying their full identity; companies such as SuperAwesome use these methods to verify parental consent and protect children. Other options include credit card checks, one-time ID document checks, and checks against mobile phone operators or eIDs. Offering users a choice of how they prove their age keeps age assurance inclusive and accessible to all. Online age verification is no longer optional: it is a necessity in gaming to deliver age-appropriate experiences and comply with regulation.
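The recommendation to offer users a choice of age assurance methods, and to gate features such as voice chat on the result, can be sketched as a simple dispatcher. This is a hypothetical illustration under stated assumptions: the method names, the age threshold, and `check_age` are all placeholders, and in production each method would call a real provider rather than trust the input.

```python
# Hypothetical sketch: letting users choose how to prove their age and
# configuring age-appropriate defaults from the result. Provider calls
# are stubbed; nothing here is a real age-assurance API.

MIN_AGE_FOR_VOICE_CHAT = 13  # illustrative threshold, not from any statute

SUPPORTED_METHODS = {
    "facial_estimation",  # facial age estimation
    "digital_id",         # digital ID app
    "credit_card",        # credit card check
    "id_document",        # one-time ID document check
    "mobile_operator",    # check against mobile phone operator or eID
}

def check_age(method: str, claimed_age: int) -> int:
    """Stub: in production, dispatch to the chosen provider and return
    an estimated age or age range instead of trusting user input."""
    if method not in SUPPORTED_METHODS:
        raise ValueError(f"Unsupported age assurance method: {method}")
    return claimed_age  # stub for illustration only

def configure_session(method: str, claimed_age: int) -> dict:
    """Apply age-appropriate defaults based on the assured age."""
    age = check_age(method, claimed_age)
    return {
        # Voice chat stays off by default for younger users.
        "voice_chat_enabled": age >= MIN_AGE_FOR_VOICE_CHAT,
        # Below 13, verified parental consent is typically required.
        "needs_parental_consent": age < 13,
    }

session = configure_session("digital_id", 11)
assert session == {"voice_chat_enabled": False, "needs_parental_consent": True}
```

Because every method funnels through the same `configure_session` step, adding a new verification option (for users without a credit card or ID document, say) does not change the downstream safety logic, which is what keeps the scheme inclusive.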