The OSA introduces a new regulatory regime for social media companies. Section 10 of the OSA places specific duties on them to take or use proportionate measures relating to the design or operation of their service, both to prevent users from encountering ‘priority illegal content’ and to effectively mitigate and manage the risk of the service being used for the commission or facilitation of a ‘priority offence’. It also imposes a duty on companies to use proportionate systems and processes designed to minimise the length of time for which any priority illegal content is present on the platform. In discharging those duties, social media companies are expected, where proportionate, to take measures across a non-exhaustive list of specified areas, including:
- Regulatory compliance and risk management arrangements
- Design of algorithms
- Content moderation
- Functionalities that allow users to control the content they encounter. [1]
The duties extend to the kind of conduct about which financial regulators have become exercised. The schedule of ‘priority offences’ includes breaching the restriction on financial promotions [2], offences under the Fraud Act, and the market abuse offences of making misleading statements and creating misleading impressions [3].
The regulatory framework under the OSA cannot be considered in a vacuum. On 30 November 2023, the Government announced that some of the largest tech companies had agreed to develop and commit to an Online Fraud Charter, the terms of which will include measures to prevent the posting of, and to remove, fraudulent content.
Beyond how the wider landscape could affect the discharge of the duties under the OSA, the features of each species of offending content will determine whether measures taken in response are proportionate. For example, financial promotions are unlawful unless they have been communicated or approved by an authorised person, or are otherwise exempt. Accordingly, in discharging their duties, social media platforms will need to consider how they can identify potentially offending content, before either removing it or flagging it to users as problematic. Such measures have already been adopted by some platforms. In its Perimeter Report for 2021/2022 [4], the FCA reported that some social media companies had introduced verification policies to ensure that they allow only financial promotions that are made by, or with the approval of, authorised persons.
It should be emphasised that the duties under the OSA concern unlawful financial promotions. Separately, the FCA has consulted on how the requirements that relate to financial promotions within its perimeter should apply to social media [5]. This focus has been triggered by the increasing use of platforms for such promotions. As such, the regulator has a keen eye on, and an institutional interest in, this area.
Regulators have also made public announcements about the risks of market abuse posed by social media. [6] In our view, these risks are likely to become more significant for two key reasons:
- The use of social media will continue to grow and become more embedded in our information infrastructure. In turn, algorithmic trading models may increasingly rely upon less traditional data feeds, including social media.
- The production and dissemination of false information is becoming more sophisticated and, with this, increasingly believable. Deep-fake videos are the most obvious and vivid example. Such content is compelling and, therefore, spreads virally, risking rapid and extreme market movements.
"The production and dissemination of false information is becoming more sophisticated, and with this increasingly believable. Deep-fake videos are the most obvious and vivid example".
In discharging their duty to mitigate and manage the risk of being used for the commission or facilitation of market abuse, social media companies will need to design and implement tailored surveillance and content moderation tools. However, misleading statements and impressions are materially different in nature from other priority offence content. For the most part, the priority offences represent risks to individuals. Market abuse represents a systemic risk, which could crystallise quickly. As a result, where the technology permits, compliance with the duty under the OSA may require the effective containment of false statements, which, if they risk influencing markets, will naturally be of public significance and interest. To that extent, effective systems may be in tension with free speech considerations [7].
OFCOM, the UK communications regulator, is the authority responsible for overseeing and enforcing the regulatory framework in the OSA. However, given the growing intersection between social media and financial services, we think it would be surprising if the FCA were merely a passive stakeholder in the implementation and oversight of the duties contained within the OSA.
"It would be surprising if the FCA was merely a passive stakeholder in the implementation and oversight of the duties contained within the OSA."
So, how might the FCA work with OFCOM and engage with the terms and powers under the OSA to advance its statutory objectives? There will have to be significant dialogue between the FCA and OFCOM, if not with the social media companies themselves, to maximise the effectiveness of the protections offered by the duties under the OSA. The FCA, responding to intelligence sources and consumer concerns, will want to update the social media companies on the changing nature and risks of problematic content posted on their platforms, in respect of matters within the FCA’s remit. Given the duties set out in the OSA, social media companies will be expected to respond to that information, adjusting their systems, particularly content moderation, accordingly. Whilst there has been dialogue and collaboration with the FCA previously, social media companies now have a statutory standard against which their response will be measured.
Subject to sufficient cooperation and resource, the FCA’s influence on how the OSA’s framework is applied, and therefore shaped, could extend far beyond contributing its industry knowledge of those species of offending conduct that sit within its sphere of interest. The duties under Section 10 of the OSA require in-scope companies to assess the risk that their platform will be used for the commission of the specified offences, before designing and implementing proportionate systems and processes to combat that risk. The efficacy of such systems, which will be complex and are likely to involve algorithmic technology, will depend in part on sound governance, including senior management oversight and auditing. These issues and concepts are familiar territory for the FCA when supervising and enforcing regulated financial services, as are the related processes: for example, the OSA allows for a skilled person’s report to be required, a common supervisory tool employed by the FCA.