| Act of Parliament | |
|---|---|
| Long title | An Act to make provision for and in connection with the regulation by OFCOM of certain internet services; for and in connection with communications offences; and for connected purposes. |
| Citation | 2023 c. 50 |
| Introduced by | Michelle Donelan, Secretary of State for Science, Innovation and Technology (Commons); The Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State for Arts and Heritage (Lords) |
| Territorial extent | |
| Royal assent | 26 October 2023 |
| Commencement | On royal assent and by regulations |
| Status | Current legislation |
The Online Safety Act 2023[1][2][3] (c. 50) is an act of the Parliament of the United Kingdom regulating online speech and media. It received royal assent on 26 October 2023 and gives the relevant Secretary of State the power, subject to parliamentary approval, to designate and suppress or record a wide range of speech and media deemed "harmful".[4][5]
The act requires platforms, including end-to-end encrypted messaging services, to scan for child sexual abuse material, despite warnings from experts that no such scanning mechanism can be implemented without undermining users' privacy.[6]
The act creates a new duty of care for online platforms, requiring them to take action against illegal, or legal but "harmful", content from their users. Platforms that fail this duty are liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. The act also empowers Ofcom to block access to particular websites, and it obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content such as user comments on political parties and issues.
The bill that became the act was criticised for its proposals to restrain the publication of "lawful but harmful" speech, effectively creating a new form of censorship of otherwise legal speech.[7][8][9] As a result, in November 2022, measures that were intended to force big technology platforms to take down "legal but harmful" materials were removed from the bill. Instead, tech platforms are obliged to introduce systems that will allow users to better filter out the "harmful" content they do not want to see.[10][11]
The act grants the secretary of state significant powers to direct Ofcom, the media regulator, in the exercise of its functions, including the power to direct Ofcom on the content of its codes of practice. This has raised concerns that the government could intrude on the regulation of speech through broad, loosely constrained powers that undermine Ofcom's authority and independence.