As advertised yesterday, a meeting to consider some of the pitfalls of online blocking and filtering took place last night in the Commons.
This took place under the Chatham House Rule, which states:
When a meeting, or part thereof, is held under the Chatham House Rule, participants are free to use the information received, but neither the identity nor the affiliation of the speaker(s), nor that of any other participant, may be revealed. (unless any particular individual specifically waives their privilege!)
Apart from the initial background briefing note, it should be understood that what follows is an amalgamation of a discussion and no individual or individual organisation present can be presumed to support all – or even a significant part – of what was said. There was some consensus. Also some dissent on key issues.
It was attended by representatives from many of the UK’s leading LGBT organisations, from sex education bodies and from organisations working on abuse (against children and adults), as well as groups who have been campaigning against online censorship in wider terms (both from the law and from the adult film industry). There were also representatives from the industries most involved in filtering/blocking; members of both Houses of Parliament were also in attendance.
In advance of the meeting, I circulated the following briefing paper.
The key points made in this were:
– that since the launch of default-on blocking/filtering of the internet in the autumn of 2013, there have been teething troubles, with a perception (which cannot be proven one way or the other) that LGBT and other minority groups are being disproportionately filtered;
– this has put pressure on mobile operators, who have been filtering for many years, with little public concern over that activity;
– filtering itself is carried out by non-UK commercial operators;
– there is a lack of transparency, as well as a lack of forthrightness when it comes to technical explanation, on the part of ISPs and mobile operators, possibly betraying the fact that public-facing staff are insufficiently aware of the technicalities of what is being done;
– the entirety of the filtering/blocking enterprise is, at some remove, within the remit of the UK Council on Child Internet Safety (UKCCIS), which was asked by David Cameron in July 2013 to ensure over-blocking of services for young people did not happen;
– UKCCIS appears to be top-heavy with industry representation, but lacks technical and minority input: a sub-committee tasked with looking at over-blocking did not meet until December 2013;
– key questions for the meeting were whether UKCCIS could provide adequate governance on this issue and whether online filtering needed to be subject to regulation and license.
The session looked at the experience of young people from minority groups. Research was cited on how young people manage the coming-out process by finding communities on the net. This reduces fear. Young people join Mermaids (an organisation for trans children) at 13: however, research suggests the average age of awareness of transness is 8.
Many people use the internet to discover they are not freaks. For the most part, they only find inappropriate material if they enter derogatory or porn-associated terms.
Similar experiences from LGB children. Magazines such as Diva are a lifeline for many people – not just the young. It is essential for women to find sites letting them know they are like everyone else. Some women only realise late in life that they are gay.
This is about people accessing information that will help them lead healthy lives. Blocking content around gender identity, sexual orientation and sex education will create and exacerbate issues around mental health, physical health and sexual health.
Several stories from health and sex education sites of being blocked precisely BECAUSE they provide information that young people can use to protect themselves in intimate relationships.
Blocking of drug information sites also highlighted, including sites that warned of dangers from bad batches of drugs.
Other experiences cited include the take-down of a highly popular LGBT Christian blog on Yahoo as a result of objections to it by US evangelist groups, and the blocking of a business site, apparently on the basis that the proprietor is transgender.
Against this, it is not just porn getting through: the example was cited of an antisemitic site, easily available to school children, that appears, on the surface, to be about Martin Luther King.
It is hard to cater to all needs and keep it simple. All mobile operators offer a filter product, set by default, within a framework provided by the BBFC. Initially another body provided this, but the BBFC has provided the regulatory framework since September 2013.
If a site appears to be blocked, individuals can appeal. The current system does not use silent blocks: individuals will know if they attempt to access a blocked site. However, silent blocking by some services, such as Google, means individuals may never know that they have been blocked.
The basic mobile blocking package is an attempt to stop under-18s from viewing adult material. In addition, EE and O2 offer a strict level – a parental control option – targeted at younger children: like a rubber mat. O2 uses a whitelist approach, which is why they appear to be blocking a significant amount of material.
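The difference between the two approaches explains the disparity. A minimal sketch (not any operator's actual implementation; all domain names here are hypothetical examples) of why a whitelist filter inevitably blocks far more than a blacklist one:

```python
# Illustrative only: neither list reflects any real operator's data.
BLACKLIST = {"adult-example.com"}          # block only what is listed
WHITELIST = {"bbc.co.uk", "cbeebies.com"}  # allow only what is listed

def blacklist_allows(domain: str) -> bool:
    # Default-allow: everything passes unless explicitly listed.
    return domain not in BLACKLIST

def whitelist_allows(domain: str) -> bool:
    # Default-deny: everything is blocked unless explicitly listed.
    return domain in WHITELIST

# A new, unclassified site (e.g. a small LGBT support forum):
site = "support-forum.example.org"
print(blacklist_allows(site))  # True  - passes the blacklist filter
print(whitelist_allows(site))  # False - blocked by the whitelist filter
```

Any site not yet classified, which in practice means most of the web, is invisible behind a whitelist, however innocuous it may be.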
ISPs don’t really wish to be doing this sort of thing: they would much prefer to be mere conduits. Claire Perry MP and the Daily Mail want ISPs to police the internet. This argument has been going on for some while and, in the last couple of years, ISPs have been strong-armed into taking on this task. In fact, many are temperamentally opposed to filtering.
BT and others have been providing filtering to parents as a client-side option for several years. Mostly these were not taken up. An Ofcom report was cited suggesting that the majority of parents trust their children and are satisfied that they adequately supervise their online activity.
ISPA don’t see filtering as a silver bullet.
The introduction of US filters, originally designed as legal protection for employers who might otherwise be exposed to discrimination lawsuits, follows from the fact that the home broadband model is “stack ‘em high: sell ‘em cheap”.
The applications are unfit for purpose/not particularly good.
Further filtering is applied in public spaces by a consortium of five wi-fi organisations: these were not represented at the meeting.
The Open Rights Group is starting to build serious tools to understand what is being blocked.
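One building block such tools need is a way to tell a genuine response from a filter's block page. The sketch below is purely illustrative (the signature phrases are hypothetical, and real probes, including those ORG were building, are considerably more involved); the idea is that fetching the same URL over a filtered and an unfiltered connection, then comparing the bodies, reveals even silent blocks:

```python
# Phrases that might appear on an ISP/mobile block page (illustrative only).
BLOCK_PAGE_SIGNATURES = [
    "this site is blocked",
    "parental controls",
    "content lock",
]

def looks_like_block_page(body: str) -> bool:
    """Return True if the response body resembles a filter's block page."""
    lower = body.lower()
    return any(sig in lower for sig in BLOCK_PAGE_SIGNATURES)

print(looks_like_block_page(
    "Access denied: this site is blocked by Content Lock"))   # True
print(looks_like_block_page(
    "<html>Welcome to our charity's homepage</html>"))        # False
```

Same request, different body: that mismatch is what exposes a silent block that the end user would otherwise never see.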
Various points made and set out here in no particular order.
Many warned of the issues discussed above, including over- and under-blocking. In general, there is no case for blocking lawful content. Blocks on lawful material that helps individuals are highly problematic.
Concerns were expressed about the role of the BBFC in this matter and the way in which responsibility for blocking is passed around a triangle from the BBFC to the CPS to the police, with each, in turn, citing the need to comply with the Obscene Publications Act; questions were asked as to how accurate their joint interpretation of the OPA is.
The BBFC claims its framework would block hate/discrimination, but it would not block political parties. “Mischievous observation” from some present as to what would happen if the BNP were referred to the BBFC (currently they will not block them) or some tabloid output.
Strong point made that the debate should not be about “children”: children, as a category, range in age from 1 to 18. There need to be evidence-based guidelines on what support parents need at different age stages.
From sex ed perspective: the problem is exaggerated. Filtering alone doesn’t work: parents need to be actively involved in what kids are looking at. Need to talk to kids to keep them safe online. Need to repackage the debate around health and safety as opposed to nebulous evil porn.
Pressure from Ofsted to safeguard children online: however, experience suggests managed systems are better for young people than locked-down ones.
The sexualisation report (qv.) examines evidence behind concerns re. impact of certain media on children.
Several speakers voiced concerns about transparency. Not just in terms of what is being blocked, but also, as per Jane Fae’s initial submission, greater opportunity for techies to look at and deconstruct the technology underlying existing filters.
Some debate about possible legal peril for ISPs and mobile operators. What is their liability if they wrongly block a business? Where their filters are used by public bodies, is there a duty for public sector equalities assessments to be carried out? Have these been?
Most accepted there was no malicious intent by ISPs to discriminate against minorities. Concern, however, that the methodology used to create blocking systems might inadvertently have led to systematic discrimination. This would likely be indirect discrimination in UK law. What are ISP/mobile operator liabilities? Is a judicial review/test case possible?
Some divide in meeting between those who:
– supported filters
– accepted filters will happen and therefore need some regulation
– fundamentally opposed filtering
In the latter respect, the debate should not just be about “good” vs. “bad” material: often the question is about the appropriateness of the audience or space where material is available.
Also in this context, mixed reception to the idea of licensing filter providers. On the one hand, seen as tacit acceptance that such filtering is inevitable; on the other, perhaps the only way to rein in filtering imposed on the UK from a non-UK cultural base. As a minimum, filtering system providers should be demonstrated to be compliant with the Equality Act.
Slightly quixotic sentiment expressed to effect that Claire Perry, MP had done a service because a debate on this is needed and is now only really beginning, thanks to recent implementation. Terms of debate have hitherto been set by anti-porn brigade/moral majority. Sponsoring of this meeting by Julian Huppert, MP, is good because it helps to create debate and discussion amongst the parties involved.
The following include some of the principal next steps/actions arising from the discussion:
– ISPs and mobile operators should provide a single central point for checking whether something is blocked and by whom
– ISPs and mobile operators should provide a single central point for objecting to particular blocks
– ISPs/mobile operators should be prepared to open themselves to more technical review of their systems by some of those present and qualified to do so: request to be put back to relevant bodies
Also important is to raise awareness of issues highlighted by meeting:
– within ISPA, MBG and other industry bodies
– at European parliamentary level
– within Council of Europe
Parliamentary and other representatives need to consider:
– whether UKCCIS/UKCCIS sub-committee is adequate to provide governance in this area
– how minority groups can be protected from discriminatory or irrational filtering: is transparency enough, or is there need for formal (state) regulation of filtering?
16 January 2014
* * * * * * * * * * * * * *
P.S. If anyone should discover that this write-up has been blocked by anyone for ill-advised use of the word porn, please let me know!