Last week the Online Safety Bill received Royal Assent, requiring technology companies to work to make everyone, and in particular children, much safer on the internet.
As a result, Ofcom has outlined its first draft of new rules aimed at better protecting children online. It is hoped that these codes will bring about change more quickly, both in protecting children and in shielding many users from online fraud.
Ofcom believes that as the use of AI grows and the technology develops further, it will become easier for people to create fraudulent material and target it at the most vulnerable.
The first codes to be drafted will focus on illegal material online, which covers child sexual abuse material, grooming content and fraud.
This code will, by default, require the largest platforms to ensure that children on their sites:
- are not presented with lists of suggested friends;
- do not appear in other users’ lists;
- do not have their location information visible to other users; and
- do not receive direct messages from people outside their agreed connections.
Ofcom is set to publish further online safety rules over the next few months, with a focus on the promotion of material relating to suicide and self-harm.
Tech firms and larger platforms covered by the codes must also nominate an accountable person to report to senior management on compliance.
It is important to note, however, that each new code will require parliamentary approval before it is put in place.
With that in mind, Ofcom hopes that the codes announced this morning will be enforceable by the end of next year.
Secretary of State for Science, Innovation and Technology Michelle Donelan described the publication of the first codes as a “crucial” step towards making the Online Safety Act a reality, “[by] cleaning up the Wild West of social media and making the UK the safest place in the world to be online”. She said:
Before the bill became law, we worked with Ofcom to make sure they could act swiftly to tackle the most harmful illegal content first.
By working with companies to set out how they can comply with these duties, the first of their kind anywhere in the world, the process of implementation starts today.”
Susie Hargreaves, Chief Executive of the Internet Watch Foundation, commented:
“We stand ready to work with Ofcom, and with companies looking to do the right thing to comply with the new laws.
It’s right that protecting children and ensuring the spread of child sexual abuse imagery is stopped is top of the agenda.
It’s vital companies are proactive in assessing and understanding the potential risks on their platforms, and taking steps to make sure safety is designed in.
Making the internet safer does not end with this bill becoming an act. The scale of child sexual abuse, and the harms children are exposed to online, have escalated in the years this legislation has been going through parliament.
Companies in scope of the regulations now have a huge opportunity to be part of a real step forward in terms of child safety.”