Ofcom is investigating whether the provider of a suicide discussion forum has failed to comply with its duties under the Online Safety Act.
It is the first investigation into an individual service provider under the new laws.
The act was passed in 2023 and requires companies to reduce illegal and harmful content, but its protections are only just taking effect through Ofcom codes of practice.
The watchdog said it was examining whether the forum failed to put appropriate measures in place to protect users from illegal content, and how the site could be used to commit or facilitate "priority" offences, including encouraging or assisting suicide.
The provider and forum have not been named.
Ofcom said it had tried to engage with the provider and had issued a legally binding request for it to submit the record of its illegal harms risk assessment.
However, it said it had received a "limited response" and "unsatisfactory information about the steps being taken to protect UK users from illegal content".
On 17 March, duties came into force meaning providers must protect UK users from illegal content and activity, including taking proportionate measures to:
• Mitigate the risk of their service being used to commit or facilitate a priority offence
• Prevent individuals from encountering priority illegal content
• Swiftly take down illegal content once they become aware of it
Encouraging or assisting a suicide in England and Wales can lead to up to 14 years in prison.
If online providers refuse to engage with Ofcom over systemic concerns, it can issue fines of up to 10% of a company's global revenue and carry out "business disruption measures".
That raises the prospect of huge penalties for the big players in social media, such as Instagram and Facebook owner Meta.