Summary: In recent years, the pace of Internet regulation around the world has quickened, with states increasingly confident that they can and should hold major platform companies to account. New laws have been developed to address the risks of digital technologies, and law-makers have drawn on familiar regulatory principles and legacy frameworks in doing so. But the nature of the technologies, and the business models supporting them, brings new challenges which make it less clear that old approaches will work. To succeed, legislative frameworks must evolve and adapt. Against this backdrop we assess the UK's Online Safety Act 2023 (OSA), which was expected to provide an innovative and broad-reaching ‘systems-based’ approach to reducing user risks and harms, particularly in relation to child safety. We argue that although the Act does incorporate measures to regulate platform design, it fails to fully embrace this approach and faces challenges in ensuring proportionality and accountability. We conclude that the development of the OSA is hampered by a legacy focus on content controls, which may limit its ability to effectively improve online safety, particularly as services evolve.