Parliament asked Mr Rathi whether, given its regulatory interest, the FCA's powers are adequate to tackle the problem. In his response he noted that "it's our job now, with OFCOM and others to implement the Online Safety Act". Whilst the FCA will have a role to play in ensuring that the Online Safety Act ("OSA") is effective in achieving its legislative purpose, it will not be on the front line of overseeing and enforcing the requirements under the Act. That responsibility sits with OFCOM. The FCA is left relying upon the cooperation of social media sites and, failing that, the appetite and resource of OFCOM to advance its objectives by proxy through the OSA. To that end, there are levers the financial regulator can pull, both through the framework of the OSA and in its regulatory relationship with OFCOM.
Section 10 of the OSA places duties on in-scope social media companies as providers of user-to-user services, i.e. services by which content generated or uploaded by one user may be encountered by another. One of those duties requires service providers to take proportionate measures, relating to the design or operation of the service, to:
- prevent users encountering ‘priority illegal content’;
- effectively mitigate and manage risks of the service being used for the commission or facilitation of ‘priority offences’; and
- effectively mitigate and manage risks of individual users being harmed.
Service providers are also under a duty to use proportionate systems and processes designed to minimise the time for which 'priority illegal content' is present, and to remove any illegal content swiftly upon becoming aware of it. The Act stipulates the areas in which, where it is proportionate to do so, service providers are required to take or use such measures.
'Priority illegal content' is content that amounts to a 'priority offence', a list that includes conduct within the FCA's remit and interest: offences under FSMA (e.g. unauthorised financial promotions and breaches of the general prohibition), market abuse offences and fraud.
Under section 38, in respect of large user-to-user services ('Category 1'), a duty similar to the second listed above applies to fraudulent advertising, defined as paid-for advertising (as opposed to user-generated content) that amounts to one of the offences in which the FCA has an interest. The duty to protect against fraudulent advertising is, however, more onerous, in that it also requires the provider to operate the service using proportionate systems and processes designed to prevent individuals from encountering such advertisements.
OFCOM has the power to enforce these duties. Whilst the FCA can, jurisdictional reach permitting, prosecute persons who use social media to disseminate unauthorised financial promotions (such as 'finfluencers'), or who otherwise commit an offence within its remit, it has no powers in respect of the service providers themselves.
To date, it has worked to identify and report illegal content and fraudulent advertising to the social media companies, thereby triggering those providers' duties to take the content down "swiftly". During his evidence before Parliament, Mr Rathi noted that last year the FCA asked for the amendment or withdrawal of 20,000 financial promotions, compared with only 570 a few years earlier. The extent to which this uptick is a consequence of FCA activity, as opposed to the proliferation of such promotions, is unclear. In any event, the limits of this approach were revealed during his evidence. Mr Rathi acknowledged that the FCA "cannot force the tech firms to take down promotions that we see as problematic [but] rely on co-operation from them". The FCA had previously expressed its frustration that certain service providers were reactive, not proactive, and had been overly slow in removing content. Mr Rathi expanded on those prior comments, noting that, once certain content has been removed, almost identical content is created soon thereafter.
These concerns raise questions about the ambit of the duties under the OSA, as summarised above, at least in the context of financial services consumer protection. What will OFCOM demand from service providers when assessing whether content has been removed "swiftly" under the Act? How will the proportionate use of content moderation flex where noticeably similar content has just been flagged and removed? Any assessment of these issues will inevitably involve consideration of the speed with which offending content can cause harm and the severity of that harm. The FCA will be well placed to inform, if not perform, that assessment. As such, through communication with both OFCOM and providers, it will have a significant voice in shaping what is expected under the duties and how compliance with them is assessed. Separately, given that the FCA is arguably the more mature enforcer, and has routinely investigated the adequacy of complex systems and processes, it may have something to offer OFCOM where the latter finds itself probing the adequacy and effectiveness of a system's design, operation and related governance structures.
However, the FCA will ultimately rely upon OFCOM to oversee and enforce the duties. The extent to which the current regulatory landscape proves effective in meeting the FCA's agenda will therefore depend in large part on the relationship and coordination between the two regulators. The position will become clearer next year, when OFCOM expands its attention to fraud and financial services offences (to date, it has largely focused on child safety). The quality and integrity of that relationship will, in large part, determine the extent to which fraudulent and unlawful investment schemes can be tackled in the coming years.