The ICO’s Age Appropriate Design Code: Best Interests of a Child and Age-appropriate Design
The draft Age Appropriate Design: A Code of Practice for Online Services (Code) published by the UK’s Information Commissioner’s Office (ICO) comprises 16 “standards,” two of which will be addressed in this post.
Standard 1 is the foundation of the Code. It would require providers to make the “best interests” of the child the “primary consideration” in the design and development of online services likely to be accessed by children. (Please see our post regarding the breadth of the “likely to be accessed by” standard and the definition of “children.”) The Code adopts the United Nations Convention on the Rights of the Child (UNCRC), which recognizes not only children’s right to privacy, but also their right to be free from economic exploitation, to access information, to associate with others, to play, and to have a voice in matters that affect them. The ICO acknowledges that these interests will need to be balanced and, at times, may even be competing, for example, when the best interests of different children do not align. However, the ICO makes clear that “[i]t is unlikely . . . the commercial interests of an organization will outweigh a child’s right to privacy.”
To implement the “best interests” standard, providers would be required to consider the rights and needs of child users of all different ages and design their online services in a way that best supports those sometimes-varying needs. When the best interests of a child conflict with the commercial interests of a provider, the provider would be required to place the child’s interests first.
Closely related, Standard 2 would require providers to design their online services to account for the age range of their audience and the different needs of children at different ages and stages of development. In other words, all children must be protected, but not necessarily using the same methods. To aid with this process, the ICO provides guidance on the “relevant capacities, needs, skills and behaviors” of children in different age ranges and developmental stages.
As a default, the ICO recommends a “child-appropriate service” be provided to all users, “with the option of age-verification mechanisms to allow adults to opt out of the protections . . . and activate more privacy-intrusive options if they wish.” As a practical matter, for online services likely to be accessed by children, this would be more a requirement than a recommendation unless a provider were able to implement a reliable mechanism to distinguish between adults and children. “Asking users to self-declare their age or age range,” however, would not suffice. Age-verification mechanisms would need to be “robust and effective,” so that they are difficult for children to circumvent.
The ICO recognizes that age-verification tools are still developing. Accordingly, the Commissioner has pledged to support work to establish clear industry standards and certification schemes to help identify compliant services.