The UK Online Safety Act 2023 will have significant implications for services like X (formerly Twitter), Instagram, Facebook, and Bluesky, particularly concerning freedom of expression, the use of algorithms, transparency, and democratic protections.
For example, as a large platform with a substantial UK user base, X is expected to be designated a Category 1 service, subjecting it to the Act's most stringent duties.
Freedom of Expression:
The Act strives to balance online safety with the protection of free speech. While requiring platforms to address harmful content, it emphasises upholding freedom of expression and ensuring that legitimate content and diverse viewpoints are not unduly restricted. However, critics have expressed concerns about the potential for overzealous content removal and a chilling effect on free speech, especially given the Act’s broad definition of “content that is harmful to children”.
There are concerns that the robust safety duties might outweigh the “balancing measures” intended to protect freedom of expression.
The Act’s impact on freedom of expression for services like X will depend on how Ofcom interprets and enforces its provisions. Striking a balance between user safety and free speech remains a complex challenge.
Use of Algorithms:
While the Act doesn’t explicitly mandate transparency about how algorithms are used to manage the risk of misinformation, its broader emphasis on transparency suggests that algorithms used for content moderation will likely face scrutiny. Ofcom has also highlighted the potential for algorithms to repeatedly expose users, particularly children, to harmful content, emphasising the need for providers to mitigate these risks.
The Act mandates that platforms consider the risks their algorithms pose in relation to illegal content and content that is harmful to children, which may require them to adjust their algorithms or platform design to minimise harm.
X will need to provide information about its algorithms in transparency reports, risk assessments, and terms of service, disclosing how they identify and mitigate harmful content like hate speech and misinformation. X will also need to ensure its algorithms comply with the Act’s requirements for protecting children.
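To make the idea of "adjusting algorithms to minimise harm" concrete, here is a minimal sketch of how a recommendation pipeline might gate and downrank content by a harm-risk score, with a stricter threshold for child users. All names, scores, and thresholds are illustrative assumptions, not a description of X's actual systems or anything the Act prescribes.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    harm_score: float        # 0.0 (benign) to 1.0 (high risk), from a hypothetical classifier
    engagement_score: float  # baseline ranking signal

def rank_feed(posts, viewer_is_child, child_threshold=0.2, adult_threshold=0.7):
    """Exclude posts above a risk threshold, then penalise borderline ones in ranking."""
    threshold = child_threshold if viewer_is_child else adult_threshold
    eligible = [p for p in posts if p.harm_score < threshold]
    # Rank by engagement discounted by residual risk, so risky content is
    # never amplified even when it clears the eligibility threshold.
    return sorted(eligible,
                  key=lambda p: p.engagement_score * (1 - p.harm_score),
                  reverse=True)
```

The design choice this illustrates is the one regulators have flagged: safety is applied at the ranking stage itself, rather than relying solely on after-the-fact removal.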
Transparency:
Transparency is a key theme in the Act, especially for Category 1 services like X. The Act requires X to be open about its content moderation practices, particularly those related to content of democratic importance. This includes:
- Publishing annual transparency reports detailing its content moderation practices, the volume of harmful content removed, the use of algorithms, and their impact on users.
- Providing clear information in its terms of service explaining its policies on content moderation, user safety, and reporting mechanisms.
- Disclosing the use of “proactive technology,” such as automated tools or algorithms, used to detect and remove harmful content.
These transparency requirements aim to hold platforms accountable and empower users by providing clarity about how their data is used and content is moderated.
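The reporting duties above can be sketched as a simple aggregation over moderation decisions, of the kind a transparency report might summarise. The record format, category names, and action labels here are assumptions for illustration only; the Act and Ofcom's guidance do not prescribe a data schema.

```python
from collections import Counter

def summarise_moderation(decisions):
    """Summarise a list of (category, action) moderation records, e.g.
    ("hate_speech", "removed") or ("misinformation", "labelled")."""
    removed = Counter(cat for cat, action in decisions if action == "removed")
    by_action = Counter(action for _, action in decisions)
    return {
        "total_actions": len(decisions),
        "removed_by_category": dict(removed),  # volume of harmful content removed
        "actions_breakdown": dict(by_action),  # removals vs labels vs other actions
    }
```

A summary like this would feed the "volume of harmful content removed" figures an annual report must disclose.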
Democratic Protections:
The Act includes provisions to protect content of democratic importance, such as news publisher content, journalistic content, and user-generated content that contributes to political debate. Category 1 services like X must implement systems to ensure that decisions regarding content moderation consider the importance of free expression and provide equal treatment to diverse political opinions.
However, the Act does not specifically address whether algorithmic requirements apply to content of democratic importance. It remains to be seen how Ofcom will address this in future guidance.
Conclusion:
The UK Online Safety Act 2023 will have a far-reaching impact on platforms like X. The Act’s focus on user safety, transparency, and accountability will require X to make substantial changes to its content moderation practices, algorithmic transparency, and approach to democratic content. X’s compliance will be closely monitored by Ofcom, with potential penalties for breaches.
It is crucial for platforms like X to proactively engage with the Act’s requirements and Ofcom’s guidance to ensure compliance and navigate the challenges of balancing online safety with freedom of expression. The Act’s effectiveness will ultimately depend on Ofcom’s ability to enforce its provisions and adapt to the evolving online landscape.