Key Child Protection Regulations: Compliance and Best Practices
The online landscape offers valuable educational opportunities, but it also poses significant risks to children: much of the available content is designed for adults and is easily accessed by minors. An estimated one-third of internet users are under the age of 18, which means vast numbers of children are exposed to online environments built for adults. The problem cuts both ways: children can lie about their age to access platforms and content intended for older users, while adults with malicious intent can target platforms designed for younger audiences. As a result, there is a growing consensus among governments and the public that more must be done to ensure age-appropriate access to goods and services.

To address these concerns, legislation is being introduced worldwide to improve children's online safety, including the UK's Online Safety Bill, the EU's Digital Services Act, and the California Age-Appropriate Design Code Act. Companies such as SuperAwesome are already exploring age assurance options to obtain parental consent and protect children and young people. Many organizations, however, have yet to grasp the scale of the issue or begin tackling the "four Cs" of online risk: content, contact, contract, and conduct between underage and overage users. There is no single solution for child safety, but a range of options exists for creating age-appropriate experiences, and there is no excuse for platforms not to have started making progress.

The UK Information Commissioner's Office introduced the Age Appropriate Design Code, which came into force in 2021 and requires online services to prioritize the best interests of the child. The Code focuses on how children's data is processed, recommending high privacy settings by default and minimal data collection. It goes further than other data protection laws by also considering how products and features themselves can harm children.
The California Age-Appropriate Design Code Act, modeled on the UK's Code, was signed into law in 2022 and takes effect in 2024. It places legal obligations on companies offering online services likely to be accessed by children under 18 to carry out Data Protection Impact Assessments (DPIAs). A DPIA must address questions such as whether the design of a feature harms children or exposes them to exploitation by harmful contacts.

The UK's Online Safety Bill will impose extensive duties on regulated companies to protect users, especially children, from harmful content. The Bill ushers in a new era of online safety, regulation, and accountability, with fines for non-compliance of up to £18 million or 10% of annual global revenue. Game developers should assess whether their platforms fall within the scope of these regulations and act accordingly. In Europe, other legislation, such as the Audiovisual Media Services Directive, the Digital Services Act, and the Digital Markets Act, also refers to age assurance and age appropriateness.

To keep age assurance inclusive and accessible, people should be offered a choice in how they prove their age. Options include facial age estimation, which estimates a user's age from a selfie, and Digital ID apps, which let individuals verify their identity against a government-issued ID document and share only specific details, such as a date of birth or an age attribute. Other options include credit card checks, one-time ID document checks, and checks against mobile phone operators or eIDs.

Online age verification is no longer optional; it is a necessity in gaming to ensure players receive age-appropriate experiences. Regulation will hold platforms accountable for keeping children safe online, and with innovative age assurance solutions, platforms can deliver an engaging yet age-appropriate experience.
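To make the age-attribute idea concrete, here is a minimal sketch of how a platform might map a verified date of birth (for example, one shared from a Digital ID app) to an age-appropriate experience tier. The function names and age thresholds are illustrative assumptions, not part of any specific regulation or vendor API; real thresholds vary by jurisdiction and by service.

```python
from datetime import date

# Hypothetical thresholds for illustration only; actual minimum ages
# depend on the jurisdiction and the service (e.g. 13 under COPPA,
# 18 for adult-only content).
MINIMUM_AGE_TEEN = 13
MINIMUM_AGE_ADULT = 18

def age_from_dob(dob: date, today: date) -> int:
    """Compute age in whole years from a verified date of birth."""
    years = today.year - dob.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def experience_tier(dob: date, today: date) -> str:
    """Map a verified date-of-birth attribute to an experience tier."""
    age = age_from_dob(dob, today)
    if age >= MINIMUM_AGE_ADULT:
        return "adult"
    if age >= MINIMUM_AGE_TEEN:
        return "teen"
    return "child"
```

Note that the platform never needs to store the full identity document: a Digital ID check can return just the date of birth (or even a simple over/under attribute), which supports the data-minimization principle the UK Code recommends.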