Written by Catherine Allen
Navigating cultural and legal changes
The last couple of decades have seen more and more cultural organisations launch digital projects, exploring new creative tools whilst growing online audiences and user communities. The intersection between the tech world and the arts world has never been stronger, especially in the wake of Covid-19, when for many, digital content was the only way of engaging with the arts. This intersection has led to some incredibly powerful and critically acclaimed audience experiences. It also means, however, that cultural and legal shifts in the tech world will inevitably affect arts and heritage.
The new Online Safety Act (OSA) will begin to come into force in March 2025. The cultural sector is in a good place to adapt to, and even embrace, this change, since there is such a strong base of safeguarding practice and ethical consideration to build from – but there is still work to be done.
This article gives an introductory overview of the OSA for cultural organisations, and draws on the insights of a webinar The Space hosted in 2024 with Catherine Allen (this article’s author), Andrew Bravin from the law firm Sheridans and Richard Collard from the NSPCC. It is worth noting that it can be a good idea to draw on the processes outlined in this guide, e.g. ethics risk assessments, even when your project is not in scope of the Act.
Let’s start with an example of the type of arts project that would be covered, and the law’s potential impact. A livestream experience from the US in 2017 called “He Will Not Divide Us” is a good illustration. Actor and artist Shia LaBeouf ran a livestreamed video experience outside the Museum of the Moving Image in New York. The project launched in the aftermath of Trump’s presidential victory and aimed to unite people through chanting protest slogans and displaying messages of resistance to a camera that was constantly livestreaming.
Unfortunately, this user-to-user service inadvertently attracted the attention of online trolls, who singled out and targeted individual participants, revealing their real-life identities and subjecting them to online abuse, including racial harassment and hatred. What was intended as an effort to bring people together in a time of national turmoil ended in some harrowing real-world consequences, raising questions about how such risks might be mitigated. If a user-to-user project such as this were launched in the UK in the coming years, things might indeed be different – perhaps most notably because the OSA requires a user safety risk assessment and proactive mitigation measures to be put in place.
What is the Online Safety Act?
The roots of the Online Safety Act stretch back to 2018, when the then Secretary of State, Matt Hancock, committed to its development. Over the years, the Act has garnered support from both major political parties, reflecting a cross-party effort to address online safety.
The Act was made law in October 2023 and is a regulatory framework designed to hold online service providers accountable for the content circulating on their platforms. At its core, the Act aims to ensure a safer online environment by requiring service providers to proactively address illegal and harmful content. Being proactive, through a safety-by-design approach, is a core theme.
To clarify what is meant by harmful and illegal content in the Act: harmful content encompasses various issues, including online abuse, cyberbullying, harassment, content promoting self-harm, eating disorders, drug use, violence, and more. The Online Safety Act defines 15 types of illegal harms, ranging from terrorism offences to child sexual exploitation and hate crimes. While these may not directly affect your organisation, service providers must protect users from such content. The harms are listed more thoroughly in the Ofcom guidance and are well worth familiarising yourself with.
While the Act primarily targets large technology companies – social media platforms, messaging apps and search engines – its effects ripple through the entire digital ecosystem: in fact, it covers user-to-user service providers of any size and in any sector. This means that if your cultural organisation provides a digital service to end users, you could be in scope of the Act. Arguably, the biggest mass change the OSA brings is that all organisations providing a digital platform, regardless of their size, must check whether they are in scope.

The OSA is part of a broader global trend towards comprehensive online safety regulation – similar laws already exist in the EU (the Digital Services Act) and Australia (the Online Safety Act 2021). So, even if, as a cultural organisation, you are not operating a platform that is in scope, understanding these rules can help you avoid potential reputational issues, for instance by helping you and your team assess which technology platforms to engage with and which tech companies to partner with.
What do you need to know now?
The Act is still evolving: some aspects came into force at the end of 2024, but there is still time to get acquainted with your obligations. Ofcom, the UK’s communications regulator, will oversee the Act. As well as being the regulator, it is tasked by the Government with publishing initial guidance, which is readily available on its website.
The big question is: who is in scope?
The Act applies to three types of service providers:
1. Those allowing users to interact with each other’s content.
Some examples of these include:
- Image sharing platform
- Forum website
- Instant messaging app
- Chatroom
- Virtual reality multi-user space
- User generated live video streaming service
- Virtual event where users can interact with each other
- Audio journey sharing platform
2. Providers of search engines enabling users to search multiple websites and databases.
3. Providers of internet services that publish or display pornographic content.
Now, to determine whether the rules apply to you, consider three factors:
1. Does your service have links with the UK, making it accessible to individuals there?
2. Do you provide one of the relevant services outlined earlier?
3. Are there any exemptions that might apply to your service?
Exemptions include internal business use, public authorities, and education and childcare providers.
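To make that three-part test more concrete, here is a minimal self-check sketch. It is illustrative only, not legal advice: the service categories and exemption labels are paraphrased from this article, not an authoritative reading of the Act.

```python
# Illustrative sketch only: a simplified self-check mirroring the three
# scoping questions above. The categories and exemption labels are
# paraphrased from this article, not an authoritative reading of the Act.

RELEVANT_SERVICES = {"user-to-user", "search", "pornographic-content"}
EXEMPTIONS = {"internal-business-use", "public-authority", "education-and-childcare"}

def likely_in_scope(has_uk_links: bool, service_type: str, claimed_exemptions: set[str]) -> bool:
    """Rough first pass at the three scoping questions."""
    if not has_uk_links:
        return False  # Q1: no links with the UK
    if service_type not in RELEVANT_SERVICES:
        return False  # Q2: not one of the relevant service types
    if claimed_exemptions & EXEMPTIONS:
        return False  # Q3: a listed exemption applies
    return True       # Likely in scope - seek proper advice

# Example: a UK-accessible community forum with no exemptions
print(likely_in_scope(True, "user-to-user", set()))  # True
```

A “True” result here should simply prompt a careful reading of the Ofcom guidance and proper advice; it is not a legal determination.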
Non-compliance with the Act can result in substantial fines – up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, so the severity scales with the organisation’s size. In addition to financial penalties, senior executives may face criminal sanctions, reinforcing the urgency of adhering to regulatory requirements.

What will you need to do once the law is in force?
In essence, if your organisation’s digital project is in scope, you must take demonstrable, proactive measures to prevent the circulation of illegal and harmful content across your digital platforms. This entails developing strategies to identify and mitigate risks effectively, and the process must be something you can evidence. The initial method stipulated by Ofcom is a thorough risk assessment.
Ofcom also recommends several key steps for all service providers:
1. Evidenced compliance with illegal content safety duties.
2. Reporting and complaints procedures.
3. Content moderation systems or processes to take down illegal content.
4. Indicative timeframes for considering complaints.
5. Updating terms of service to reflect how individuals are protected from harmful or illegal content.
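As a concrete illustration of steps 2 and 4, the sketch below shows one way a small platform might log complaints against an indicative response timeframe. The field names and the seven-day target are illustrative assumptions made for this article, not figures taken from the Act or from Ofcom guidance.

```python
# Minimal sketch of a complaints log supporting steps 2 and 4 above.
# The seven-day indicative timeframe and the field names are assumptions
# made for illustration, not requirements from the Act or Ofcom.
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

INDICATIVE_RESPONSE_DAYS = 7  # hypothetical service-level target

@dataclass
class Complaint:
    reporter_id: str
    content_url: str
    reason: str
    received_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "open"  # open -> under-review -> resolved

    @property
    def respond_by(self) -> datetime:
        """Deadline implied by the indicative timeframe."""
        return self.received_at + timedelta(days=INDICATIVE_RESPONSE_DAYS)

    def is_overdue(self) -> bool:
        return self.status != "resolved" and datetime.now(timezone.utc) > self.respond_by

# Example usage: record a complaint and check its deadline
complaint = Complaint("user-42", "https://example.org/post/123", "harassment")
print(complaint.respond_by, complaint.is_overdue())
```

Keeping records like this, however simple, is also what makes your compliance process something you can evidence.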
The Act also states that it is important to strike a balance between freedom of expression and user safety.
More information on your duties, if you are in scope, can be found on Ofcom’s website.
How to prepare?
1. Review your digital projects: Assess which of your current and upcoming digital projects fall under the scope of the Act, particularly those involving user-to-user communication, forums, or content sharing. Consider this for any new digital project idea and grant application.
2. Update policies: Revise your organisation’s policies and guidelines to align with the Act’s upcoming requirements, especially those related to content moderation and user safety.
3. Training: Consider, as many organisations have done with GDPR, appointing a ‘champion’ for OSA compliance before the Act comes into force. Train staff members to understand the Act’s implications (perhaps led by the champion) and their responsibilities in maintaining user safety.
4. Monitoring and reporting: If you haven’t already, implement mechanisms for monitoring and reporting harmful content or behaviour in your existing digital projects – one simple starting point is sketched after this list. Don’t wait until the Act comes into force to do this.
5. User engagement: Actively engage with your existing audience to gather feedback and address any concerns, proactively demonstrating your commitment to their online safety and wellbeing. This might be in the form of focus groups or surveys. It might be worth sharing the findings of this process back with your funders.
6. Stay informed: Keep up to date with any developments or changes related to the Act’s regulation through Ofcom’s consultation phases.
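For step 4 above, here is one very simple starting point: holding user submissions that match a deny-list for human review before they are published. The placeholder terms and the hold-for-review flow are assumptions made for illustration; Ofcom does not mandate any particular moderation technique.

```python
# Illustrative sketch for step 4: hold submissions matching a deny-list
# for human review before publishing. The placeholder terms and the
# hold-for-review flow are assumptions, not an Ofcom-mandated approach.

FLAGGED_TERMS = {"example-slur", "example-threat"}  # placeholder deny-list

def triage_submission(text: str) -> str:
    """Return 'publish' or 'hold-for-review' for a user submission."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & FLAGGED_TERMS:
        return "hold-for-review"  # route to a human moderator
    return "publish"

print(triage_submission("A perfectly ordinary comment"))  # publish
```

Real moderation pipelines are far more sophisticated, but even a simple triage step like this reflects the proactive, safety-by-design posture the Act asks for.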
There is, of course, immense creative potential in the evolving intersection of digital technologies and arts and culture. The possibilities offered by technologies like AI, user-generated content and virtual reality can enrich audience experiences in ways previously unimagined.
However, it is crucial to remember the lessons of history: just as with most other forms of technology, be it cars, planes, or even just preparing food, a solid foundation of safety and end-user trust is essential to make the most of the benefits of these innovations. In my own experience with my company Limina Immersive’s touring VR theatre, the more emphasis we placed on audience members feeling comfortable and secure, the more tickets we sold and the better our audience feedback was.

The advent of the Online Safety Act represents a significant cultural and legal shift. Even if your project is not directly in scope of the Act, the core process of conducting a user safety risk assessment, even at an early stage, can offer your project longevity and strengthen its appeal to audiences.
About the author
Catherine is a BAFTA-winning, immersive media specialist and the founder of Limina Immersive. She has been responsible for a range of high profile digital entertainment products and has worked with major brands including Disney, Siemens and the BBC.
Catherine’s expertise spans the fields of:
– Virtual reality
– Augmented reality
– Industry inclusivity & diversity