Internet regulation has been very much in the public eye lately, particularly following the Cambridge Analytica scandal, and the government recently published its Online Harms White Paper which seeks to address some of the concerns surrounding the ‘Wild West Web’. One of the key issues regularly raised is the protection of children from exposure to online pornography.
Back in 2013, former PM David Cameron announced that, following extensive discussions between the government and internet service providers (ISPs), the largest ISPs had agreed to switch on ‘family friendly filters’ by default. These filters were designed to block pornographic and other offensive material (subject to parental controls), but they were criticised as ineffective. Their cumbersome, broad-brush approach blocked content that was never intended to be targeted (such as educational material about biology) while failing to block much of the offensive material, and tech-savvy youngsters were often able to circumvent the filters with ease.
The following year, in the face of the abject failure of default ISP filters to tackle the problem effectively, the now defunct video-on-demand regulator Atvod suggested that age checks be made mandatory for all adult sites, and the government put forward proposals soon after. Two years later, it looked as though mandatory age checks had finally been realised by Part 3 (sections 14-30) of the Digital Economy Act 2017 (DEA 2017), which set out a regulatory framework. Individual websites are free to implement their own age verification methods, but it was envisaged that access to pornographic material (whether free or paid) would generally be granted only after credit card details had been submitted. The British Board of Film Classification (BBFC) was chosen to act as the age verification regulator.
In July 2017, then digital minister Matt Hancock announced that mandatory age checks would come into force from April 2018. But just weeks before that date, the Department for Digital, Culture, Media and Sport scrapped the introduction, apparently due to a lack of preparation (a spokesman said at the time: “We need to take the time to make sure we get it right”). A new date of 15 July 2019 was set but, once again, it was delayed at the last minute, apparently because the government had failed to notify the European Commission. Culture secretary Jeremy Wright has said this new delay will last “in the region of six months”, and he emphasised that it does not indicate a change in policy.
Separately, the Law Commission is examining the legislative position surrounding so-called ‘revenge porn’ (where sexual images or videos are distributed by ex-partners), and a public consultation is being launched to consider laws tackling the phenomena of ‘cyber flashing’ (where someone receives an unsolicited sexual image on their smartphone) and ‘deepfake pornography’ (where someone’s face is superimposed on sexual imagery).