In the UK, websites can be blocked for the following reasons: they contain illegal images of child abuse; they infringe copyright law; they contain material that is unsuitable for children; they contain material that is considered hate speech or that promotes violence.
Open Rights Group draws a strong distinction between blocking material that is illegal and that which is not. Child abuse images are illegal in almost every country in the world, and there is general agreement that they should be blocked and removed from the Internet. Unfortunately, discussions around the blocking of legal pornographic material involving consenting adults are sometimes conflated with the blocking of illegal images of child abuse.
There are many technical reasons why blocking is ineffective. While it can prevent accidental discovery by the majority, blocking gives a false sense of security: determined, motivated, and technically adept site owners and users can bypass it easily. Web filters may, for example, stop a young child stumbling on a porn site, but they are unlikely to restrict a tech-savvy teenager. Blocking is also a crude instrument, carrying significant risks of over-blocking, of insufficient redress, of damage to innovation, and of driving the widespread adoption of avoidance measures such as encryption and anonymising technologies. Banning or breaking these would have serious repercussions for the privacy and security of consumers online.
ORG campaigns for more transparency about web blocking. If used at all, blocking should be necessary, proportionate, and based on independent evidence assessed by a court. It should respect fundamental human rights such as freedom of expression. It should be transparent. And it should be implemented through a fair and clear legal process.
ORG’s main campaigns around web blocking are:
ISP ‘family filters’
Following a series of private meetings, the UK Government persuaded the four main Internet Service Providers (ISPs) to develop network-level filters that would block material deemed unsuitable for under-18s. BT, Sky, TalkTalk and Virgin Media rolled out these filters to new and existing customers by the end of 2015. Sky and TalkTalk switch the filters on by default, while BT and Virgin Media offer their customers a choice. The main public Wi-Fi providers have also committed to applying "family-friendly" filters wherever children are likely to be present.
The filters vary from ISP to ISP, blocking websites in a range of categories (from pornography to mentions of alcohol), and most offer different levels of strictness. All have been shown to incorrectly block websites that pose no harm to children. To address this, ORG and a group of volunteers developed Blocked, a tool that lets anyone check whether a website is blocked by a provider. When we tested the top 100,000 websites (according to Alexa), we found that one in five was blocked by one filter or another. The tool has also helped us to show that many porn sites are not blocked by filters, and it now lets users browse by category to help us identify patterns of blocking.
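The kind of check a tool like Blocked automates can be sketched as a simple heuristic: fetch a URL over a given ISP connection and decide whether the response is the real site or a filter's block page. The sketch below is our illustration only; the marker strings and status-code rules are assumptions, not the actual signatures the tool uses.

```python
# Illustrative sketch of a filter-detection heuristic (not Blocked's real logic).
# The marker phrases below are hypothetical examples of text an ISP block page
# might contain.
BLOCK_PAGE_MARKERS = (
    "this site has been blocked",
    "parental controls",
    "content restricted",
)


def looks_blocked(status_code, body):
    """Classify a fetched response as 'ok', 'blocked', or 'error' (heuristic)."""
    text = body.lower()
    # Some filters serve a branded block page with a 200 status,
    # so check the body text first.
    if any(marker in text for marker in BLOCK_PAGE_MARKERS):
        return "blocked"
    # 403 Forbidden and 451 Unavailable For Legal Reasons are common
    # status codes for filtered or legally blocked content.
    if status_code in (403, 451):
        return "blocked"
    if 200 <= status_code < 300:
        return "ok"
    return "error"
```

In practice a checker would fetch each URL through several ISPs' networks and compare the classifications; a site that is "ok" on one network and "blocked" on another is a candidate for the overblocking lists described above.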
Blocked provides greater transparency about filters and helps web owners find out if their sites are being censored. Parents have the right to use network-level filters but we believe that there should be more transparency about overblocking and the limitations of filters so that they can make an informed choice. Given the volume of overblocking, we do not think that ISPs should switch on filters by default.
Digital Economy Act 2017
The Digital Economy Act (DEAct) obliges porn websites to verify the age of their users. The British Board of Film Classification (BBFC) has been given the power to instruct ISPs to block porn sites that fail to do so. The BBFC has claimed that only a few hundred sites will be blocked, but as drafted, the law could allow hundreds of thousands of sites containing legal pornography to be blocked. Once this framework is in place, we believe it could be used to block other types of legal content, for example websites relating to suicide, anorexia or extreme political views. This poses a major threat to the free speech of UK citizens.
ORG challenged the provisions in the DEAct while it was being debated by parliament. We will monitor how the BBFC uses its blocking powers and campaign against any extension of its powers.
Counter-terrorism blocking
The Counter Terrorism Internet Referral Unit (CTIRU), run by the Metropolitan Police, removes unlawful terrorist content from the internet. Content that incites or glorifies terrorist acts can be removed under Section 3 of the Terrorism Act 2006. The CTIRU compiles a list of URLs for material hosted outside the UK, which is blocked on networks of the public estate. In November 2014, it was announced that BT, Sky, TalkTalk and Virgin Media would incorporate the CTIRU blocklist into their filters.
In March 2017, following a terrorist attack on Westminster Bridge, the UK’s Home Secretary Amber Rudd MP met with representatives from Twitter, Google, Facebook and Microsoft to discuss ways to improve how extremist material could be blocked or removed from the Internet. In June 2017, the Government announced further proposals for a Digital Charter and a review of its extremism strategy in order to challenge extremist content online.
Copyright blocking orders
Court orders are used to block sites accused of facilitating copyright infringement, such as Newzbin2 and The Pirate Bay. ORG campaigns for more transparency about these orders. We developed Error 451, a new HTTP error code for ISPs to show when people visit websites blocked for legal reasons. Showing the Error 451 message makes it clear when a website has been blocked following a court order. See also our broader campaign about copyright infringement.
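As a rough illustration of how Error 451 works in practice: HTTP status 451 ("Unavailable For Legal Reasons", standardised in RFC 7725) lets a blocking ISP tell clients explicitly that a page is withheld on legal grounds, rather than failing silently. The sketch below is our example, not ORG's or any ISP's code: a tiny local server answers with 451 and the "blocked-by" Link header RFC 7725 recommends, and a client detects the status.

```python
# Minimal sketch of serving and detecting HTTP 451 (RFC 7725).
# The ISP name and notice URL are hypothetical.
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer


class LegalBlockHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Blocked following a court order. See the linked notice."
        self.send_response(451)
        # RFC 7725 recommends a Link header identifying the blocking entity.
        self.send_header(
            "Link", '<https://example-isp.net/blocked>; rel="blocked-by"'
        )
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example's output quiet


def fetch_status(url):
    """Return the HTTP status code for url, including error statuses."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code


server = HTTPServer(("127.0.0.1", 0), LegalBlockHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
status = fetch_status(f"http://127.0.0.1:{server.server_port}/")
server.shutdown()
print(status)  # 451
```

Because the status code is machine-readable, browsers, crawlers, and transparency tools can distinguish a legal block from an outage, which is exactly the visibility the Error 451 campaign argues for.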
What you can do