Opinion: As lawmakers consider how to keep children safe online, they should look to Britain and California
Elizabeth Denham, CBE, was UK Information Commissioner (2016-21) and British Columbia Information and Privacy Commissioner (2010-16). She is a trustee of 5Rights, an international charity dedicated to putting children’s rights at the heart of digital design. She also works as an international consultant for Baker McKenzie’s data and technology practice.
Personal data drives our digital economies. It can open doors and create connections, but it can also be exploited or abused.
Privacy laws have been a bulwark against careless or malicious actors for more than 20 years, but only recently have laws been introduced that address the particular vulnerability of young people. The governments of British Columbia and Canada now appear poised to introduce legal protections for children online, a decade after we gave the world a devastating example of why such laws are needed.
In the decade since Amanda Todd died in 2012 at the age of 15, a victim of cyberbullying and sexual blackmail, the risks young people face online have only grown. Children are spending more and more time logged in, and from an ever younger age. The technologies that draw people into the digital experience, whether liking and sharing, creating content or consuming ads, have evolved at a dizzying pace. Recommendations, nudges, notifications, endless scrolls and popularity metrics permeate the design of the digital systems our children use. Social media, mobile games, streaming sites and online stores all collect massive amounts of user data.
The digital threats facing Canada’s young people include invasions of privacy and contact with strangers, as well as games, products and services that are inherently addictive.
Two years ago, as UK Information Commissioner, I worked with government agencies, stakeholders and both houses of Parliament to develop and bring into law the Age Appropriate Design Code. It consists of 15 enforceable standards that, taken together, hold tech companies responsible for the experiences children have on their platforms. Under the Code, companies must not use children’s information in ways that are known to be harmful, for example by recommending harmful content or sharing their profiles with strangers. At its core, the Age Appropriate Design Code requires businesses to put the best interests of children first, by design and by default.
Within months of the Code coming into force in the UK, the big names in tech made positive changes. Instagram barred adults from messaging children, disabled location tracking and introduced prompts encouraging kids to take a break from scrolling. Google made SafeSearch the default for children and disabled YouTube’s autoplay feature. TikTok recently made accounts private by default for users under 16. Sensible, even obvious, changes like these are long overdue, and they have yet to be adopted across the board. Nonetheless, with recent legislative developments in the UK and US, we are seeing an encouraging shift in legal requirements, in penalties for non-compliance and in consumer expectations.
Britain’s first-of-its-kind Age Appropriate Design Code has already inspired legislation in other jurisdictions. In September, the California Age Appropriate Design Code was enacted, and the home state of Silicon Valley will now benefit from comprehensive online protections for children. Similar developments seem inevitable, with promising signals from other US states as well as the Netherlands and Ireland. We are witnessing the emergence of a global standard for regulating technology to protect children’s privacy, safety and autonomy in the digital realm.
As BC and Canada modernize their private-sector privacy laws, I hope Canadian lawmakers will pick up the baton, and I hope regions and nations will coordinate their efforts. Divergent approaches could slow, or even undermine, the whole endeavour, and divergent standards could create regulatory loopholes for companies to exploit. Moreover, technology companies that adopt privacy-protective practices deserve clear expectations, no matter where they or their users are located.
To harmonize standards, Canadian legislators should draw on the existing regulations in the UK and California. Coherent global regulation will close loopholes and ease compliance for an industry that has never been constrained by national borders. Any future Made-in-Canada legislation should codify children’s existing rights. Then we must let tech companies do what they do best: innovate to deliver the outcomes we want.
We must act urgently and thoughtfully. Lawmakers from around the world should work together to protect children in the digital world they have inherited. The internet needs to become a safe place for them to learn, play and connect with their peers.