Author: John Libonatti-Roche
Date: 9th November 2023
Executive Summary
The Online Safety Act 2023 (OSA) is a new piece of UK legislation. It applies to companies providing online services that include, as part of the service:
- A user-to-user interaction element
- User-generated content
- A search engine
- Messaging
- Age-sensitive content
This is the case even where a business is based outside the UK if:
- the service has a significant number of UK users; or
- UK users form a target market for the service; or
- the service is capable of being used in the UK by individuals and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the UK.
The businesses most affected by the OSA are social media, digital messaging, search, and online advertising services. However, all companies that, as part of their business, offer a service with the above features must comply.
What is required?
The Act focuses on obliging companies “to identify, mitigate and manage the risks of harm” to users.
The Act seeks to ensure that online services:
- Are safe by design
- Offer a higher standard of protection for children than for adults
- Protect users’ rights to freedom of expression and privacy
- Achieve transparency and accountability
- Are compliant
Duty of care
The OSA imposes the duty to:
- Assess your user base.
- Assess risks to users.
- Carry out suitable and sufficient risk assessments of “harm to users”.
- Implement mechanisms for user protection, including mechanisms to:
  - remove illegal content quickly;
  - prevent children from accessing harmful content.
- Provide strong consent mechanisms, e.g., consent that is as easy to withdraw as it is to give.
- Implement transparent, fair, and accessible content-reporting and complaints procedures for users.
- Implement policies and mechanisms to ensure the company responds quickly to content reports and complaints.
- Balance users’ freedom of expression and privacy against the OSA’s obligation to keep users safe.
- Protect users’ privacy.
- Audit and review compliance with the OSA regularly.
- Keep sufficient records.
Additional requirements:
Terrorism, criminal, and child sexual exploitation and abuse (CSEA) content will need to be capable of being identified and removed using accredited technology, and policies and procedures must be in place to prevent this kind of content from appearing on services.
Systems and processes must be in place to ensure that CSEA content is, once identified, reported to the National Crime Agency.
Companies must put in place proportionate systems and processes to remove fraudulent advertising, together with policies and procedures to prevent it.
User empowerment policies and procedures must be put in place to allow users to exercise greater control over the content they see.
Content of democratic importance must be treated carefully: content-moderation decisions should be taken with care and not be used to prevent a wide diversity of political opinions from being freely expressed.
Companies must appoint a senior manager responsible for ensuring compliance with the OSA.
Procedures and processes that allow identification and prosecution of the new communications offences introduced by the OSA must be carefully implemented.
Oversight and Penalties
- Ofcom will act as the regulator for the Act.
- Ofcom will undertake an assessment of “how effective [a company’s] processes are at protecting internet users from harm.”
- A failure to meet the OSA’s requirements may lead Ofcom to take legal action against companies.
- Ofcom can issue fines of up to £18 million or 10 per cent of a company’s annual global turnover, whichever is greater (for example, a company with an annual global turnover of £500 million could face a fine of up to £50 million).
- Furthermore, where senior managers do not respond to information requests from Ofcom, criminal action may be taken against them.
- Senior managers and their companies will be criminally liable if they fail to comply with enforcement notices regarding child safety.
- Where fines, enforcement notices, and possibly criminal action are ignored, and a company still fails to take on board the need to implement the requirements set by the Online Safety Act, “Ofcom, with approval from the courts, will be able to require payment providers, advertisers and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.”
The overall effect of Ofcom’s oversight is that compliance with the Act will be intensely monitored. Non-compliance consequently poses a significant threat to companies, and proactivity in achieving compliance is advisable.
What should you be doing now?
- Consider to what extent the Act applies to you
- Conduct a risk assessment of your operations
- Review privacy, data protection and complaints policies
- Review terms of service and update where necessary
- Consider how to monitor content in a way that balances freedom of expression and your duty to protect users from harm
- Survey options for age verification, content scanning, etc.
- Put in place additional protections for children
- Keep up to date with Ofcom’s publication of guidance
AI
The Act refers to situations where companies should use “proactive technology” to ensure that they meet its requirements, e.g., age verification and scanning for illegal content or fraudulent advertising. Consequently, the Act may represent an opportunity for AI companies to develop widely usable technology of this kind for the mass market and to gain Ofcom’s approval for its use in this manner.
AIPRIVSEC
For more information on how AIPRIVSEC can help you achieve a sufficient level of privacy protection in light of new legal developments, or for access to our whitepapers, get in touch by clicking here.
DISCLAIMER
This document is intended to be read for reference only.
It is not intended as legal advice and should not be relied on as such.