New age verification rules in the UK aimed at preventing children from accessing harmful online content came into force on Friday, with advocates calling them a “milestone” in a long-running campaign for stricter regulation.
Under the new rules, which will be enforced by the UK’s media regulator, websites and apps that host potentially harmful material must verify users’ ages using methods such as facial recognition and credit card checks.
Melanie Dawes, chief executive of the regulator, Ofcom, said that approximately 6,000 adult content sites have agreed to adopt these measures.
Other platforms, including X, which is currently facing challenges over similar regulations in Ireland, must also protect minors from illegal pornographic, hateful, and violent material, she added.
Ofcom said around 500,000 children aged eight to 14 encountered pornography online last month.
The long-anticipated rules, which seek to prevent minors from encountering content related to suicide, self-harm, eating disorders, and pornography, stem from the 2023 Online Safety Act.

The act places legal obligations on technology companies to better protect children and adults online and imposes penalties on those that fail to comply. Fines for violations can reach £18 million ($23 million) or 10% of a company’s global revenue, whichever is greater, according to the government.
Senior executives who fail to ensure their companies comply with Ofcom’s information requests may also face criminal charges.
Enforcement is beginning now, after time was allowed for the industry and the regulator to prepare.
The government, led by Prime Minister Keir Starmer, is also considering a two-hour daily limit for children on social media apps.