Technology Law Column

Published in the Chicago Daily Law Bulletin, March 12, 1998 at page 5.

Filtering software poses legal pitfalls

Copyright 1998 by David Loundy


A decision is expected shortly in a case brought in the United States District Court for the Eastern District of Virginia, Mainstream Loudoun v. Board of Trustees of the Loudoun County Public Library, No. CA-97-2049-A, which is being watched carefully by libraries, legislatures, civil rights activists and the anti-pornography crowd. The suit concerns the use of "filtering software" (often referred to as "blocking software" or simply as "censorware").

Filtering software is designed to screen Internet material for "inappropriate" content. Such software packages have been widely adopted, especially in light of their endorsement by President Clinton at a White House summit after the U.S. Supreme Court suggested that such software might be a preferable alternative to legislation such as the ill-fated Communications Decency Act. Various states have also jumped on the bandwagon by proposing legislation that would require the installation of such software or other means of content restriction in schools and public libraries.

Sen. John McCain, R-Ariz., has also introduced legislation in the U.S. Senate (S1619 IS, available on the Internet at ftp://ftp.loc.gov/pub/thomas/c150/s1619.is.txt), which would deny certain funds to schools and libraries that fail to implement a filtering or blocking system for Internet-connected computers.

There are two basic problems with these legislative attempts and other voluntary efforts to install such software: the software packages do not work as well as most people think they do, and they erroneously block Constitutionally protected material. To understand the legal pitfalls associated with filtering software, it is necessary to look at the technology and how it operates.

Filtering software employs a variety of schemes. The two most common are screening material based on the presence of key words and blocking material based on its address. Some filtering software packages search Internet material for words matching a list of prohibited terms; if a prohibited term is present, the material is blocked from viewers. Other filtering software blocks material based on its URL (Uniform Resource Locator-- a standardized way of describing an Internet address, be it a web page, a Usenet news post, an e-mail address, or an FTP file archive). Blocked URLs are usually included on a list that comes with the software, compiled after the manufacturer examines the material and classifies it as objectionable to a particular audience. Users are thus offered options to filter particular types of material they wish to avoid, such as material containing sexual content, violence, profanity, etc. Users must obtain updated lists to account for new sites or changed addresses discovered after the software was purchased.
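
In rough outline, the two schemes can be sketched in a few lines of Python (a modern illustration for concreteness, not the code of any actual filtering product; the term list, URL list and function names here are hypothetical):

    # Sketch of the two blocking schemes described above. The lists are
    # hypothetical examples, not drawn from any real filtering product.
    BLOCKED_TERMS = {"breast", "sex"}
    BLOCKED_URLS = {"http://www.example.com/banned-page.html"}

    def blocked_by_keyword(page_text: str) -> bool:
        # Key-word scheme: block if any prohibited term appears in the text.
        text = page_text.lower()
        return any(term in text for term in BLOCKED_TERMS)

    def blocked_by_url(url: str) -> bool:
        # Address scheme: block if the URL is on the manufacturer's list.
        return url.lower() in BLOCKED_URLS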

Unfortunately, both of these filtering schemes are flawed. First of all, key word blocking will not block images. Second, if a key word filter blocks key words appearing in an address, such as in a domain name, all of the content appearing at that domain will be blocked, regardless of what material is actually housed at that domain. Third, key words can be circumvented. For instance, if a filter blocks the word "breast" it might not block "bre_ast." And fourth, if the list of blocked key words is expanded too greatly, then inoffensive content may also be blocked, as occurred in the famous incident where part of the White House web site was blocked by a filtering package because the software blocked occurrences of the word "couple"-- which was used to describe Bill and Hillary Rodham Clinton.
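
These flaws are easy to reproduce with the sketch above (again, the terms and sample text are hypothetical):

    # Circumvention: a trivially altered spelling slips past the
    # key-word filter unnoticed.
    BLOCKED_TERMS = {"breast", "couple"}

    def blocked_by_keyword(page_text: str) -> bool:
        text = page_text.lower()
        return any(term in text for term in BLOCKED_TERMS)

    print(blocked_by_keyword("bre_ast cancer screening"))      # False: not caught

    # Over-inclusion: an expanded term list blocks innocuous text, as in
    # the White House incident described above.
    print(blocked_by_keyword("a couple of budget proposals"))  # True: wrongly blocked

    # And images are untouched either way: pixels contain no key words.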

Filtering software that blocks based on the material's address may allow for more precision in theory, but it suffers drawbacks in practice. Blocking based on a URL requires that every URL be checked and classified-- a largely subjective endeavor, which invites inaccuracies in classification and, thus, in filtering.

Comprehensive blocking by URL, however, is fundamentally impossible. The Internet is growing too quickly for a small software company to keep up with the volume of new material. It is not economically feasible for a software company to hire enough people to rate every web site and Usenet news group, much less stay abreast of changing content. As a result, some filtering software may block an entire domain, or a portion of one, as a short-cut. If the domain belongs to an Internet service provider, access to all of the service provider's clients' web sites may be blocked because of the rating assigned to one or two of the provider's users. In addition, some content is available through databases that generate web pages on the fly, and therefore has no stable address to block.
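
The domain short-cut is easy to illustrate (the provider's domain name and the client pages below are hypothetical):

    # Sketch of domain-level blocking: listing one shared domain blocks
    # every client hosted there, offensive or not. Names are hypothetical.
    from urllib.parse import urlparse

    BLOCKED_DOMAINS = {"members.example-isp.com"}

    def blocked_by_domain(url: str) -> bool:
        # Block any URL whose host is on the list, regardless of which
        # client's pages actually live at that path.
        return urlparse(url).hostname in BLOCKED_DOMAINS

    print(blocked_by_domain("http://members.example-isp.com/~adult-site/"))  # True
    print(blocked_by_domain("http://members.example-isp.com/~knitting/"))    # True: blocked too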

Legislation requiring that all inappropriate material be blocked cannot be complied with using existing technology. All of the existing filtering technology may also be considerably over-inclusive in its restrictions, a state of affairs not likely to survive last year's U.S. Supreme Court decision in Reno v. ACLU, 117 S.Ct. 2329 (1997). Additionally, the Constitutional tests for obscenity and indecency both include a "community standards" element, so any statute blocking access to "obscene," "indecent," or "illegal" material demands an evaluation based on local community standards. Some filtering package promoters claim that their software blocks only illegal material. The claim is nonsense: either the software must employ the software company's judgment as to what material is inappropriate, or each individual community must rate the entire Internet (as the McCain bill would require of each school board or library).

These issues are being squarely debated in the Mainstream Loudoun case. In this case, U.S. District Judge Leonie Brinkema (who, at the end of February, struck down as unconstitutional a Virginia statute which sought to restrict State employees' access to sexually explicit material using state-owned computers) is faced with the issue of whether the Loudoun public library is violating the First Amendment by requiring the use of filtering software on library computers.

A citizens' group and a few assorted plaintiffs are suing the Loudoun Library Board, claiming that the "X-Stop" filtering software installed on library computers infringes their Constitutional rights. Specifically, the plaintiffs argue that the library policy "is a harsh and censorial solution in search of a problem." It restricts all patrons to content suited to the most sensitive users and threatens criminal penalties against anyone who tries to circumvent the block. None of the libraries in the county system had complained of a problem with inappropriate material, and the library board was presented with data "that less than two-tenths of one percent of the information available on the Internet is even arguably 'pornographic'" before it imposed what some consider to be the nation's most restrictive access policy.

In addition, the plaintiffs have argued that the policy requires the software to perform, in essence, a legal test to determine what material is inappropriate. Furthermore, enforcement of the library policy requires that Internet terminals be placed in full view, thus increasing, rather than reducing, the chance that library patrons will be exposed to material they find offensive. This public placement of terminals may also have a chilling effect, dissuading patrons from viewing even unfiltered content that they would rather not share with whoever happens to be in the area.

The plaintiffs also argued that the policy is overbroad, and that the filtering software removes a parent's ability to determine what his or her children (or the parents themselves) should be allowed to see.

Perhaps the plaintiffs' best argument against the policy is that the filtering software blocks material on the Internet that library patrons could obtain simply by picking up the same material from the library's shelves. (An argument not likely to be as effective is that the policy requiring blocking software violates the library's own "Freedom for Ideas-- Freedom From Censorship" policy, as well as the American Library Association's principles of freedom and its explicit resolution condemning the imposition of filtering software.)

The defendants' arguments are also interesting, but unpersuasive. The defendants characterize the filtering policy as a restriction on the library's acquisition of objectionable material at a patron's request: the library board argues that calling up material from a remotely located machine on an Internet-connected computer is analogous to using the library's facilities to request an interlibrary loan of the material. The defendants have stated that, as far as they know, "no court has ever held that libraries are required by the First Amendment to fulfill a patron's request to obtain a pornographic film-- or any other information-- through an interlibrary loan." Furthermore, they argue that there is Supreme Court precedent in a sharply divided case (Board of Education v. Pico, 457 U.S. 853 (1982)) intimating that school boards should have the freedom to decide what materials to house in their libraries.

The interlibrary loan argument is unpersuasive because the Internet connection and its benefits are already present in the library, and no library staff is needed to arrange the transfer of any content available to an Internet-connected library computer. The blocking software, on the other hand, is brought into the library by its staff precisely to remove access to material that would otherwise be freely available to patrons. A better analogy would be librarians telling patrons that they may read any book in the library-- except the ones the librarians snatch out of patrons' hands when they try to take the restricted books off the shelf.

I predict that some of the legislation requiring blocking of Internet content will pass. I also predict that the library patrons will win (as, one hopes, will the plaintiffs who challenge any filtering legislation that does pass). The stakes in this debate are high, even though the fights in individual schools and libraries are only small battles.

However, there are two issues more important than whether the Loudoun County libraries allow uncensored Internet access. First, whole countries already use "proxy servers" that function as national filtering software. Some proposed filtering-enabling schemes, such as PICS (Platform for Internet Content Selection), constitute what some believe to be the ultimate tool for government censorship, because they build a rating and blocking mechanism into the Internet's very infrastructure. While countries are entitled to their own Internet content laws, the mainstreaming of such tools should proceed only with care and consideration of their potential effects.
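
To see why PICS worries civil libertarians, consider how label-based selection works in outline. The sketch below is a simplification (the label, rating categories and limits are hypothetical; real PICS labels carry more structure), but it shows the key point: whoever sets the thresholds controls what gets through.

    # Simplified sketch of PICS-style selection. The label format,
    # categories and limits are hypothetical illustrations.
    import re

    def parse_ratings(pics_label: str) -> dict:
        # Pull the "r (category value ...)" portion out of a label.
        match = re.search(r"r \(([^)]*)\)", pics_label)
        if not match:
            return {}
        tokens = match.group(1).split()
        return {tokens[i]: int(tokens[i + 1]) for i in range(0, len(tokens), 2)}

    # Whoever controls these limits -- a parent, a library, or a
    # national proxy server -- controls what gets through.
    LIMITS = {"s": 0, "v": 1}   # e.g., no sexual content, mild violence

    label = '(PICS-1.1 "http://example.org/ratings" l r (s 2 v 0))'
    ratings = parse_ratings(label)
    print(any(ratings.get(cat, 0) > limit for cat, limit in LIMITS.items()))  # True: blocked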

The second issue, to return to the beginning, is that these filtering tools do not work the way most people believe they work. People need to understand what they may be missing, and what they may still be subjected to. Filtering software is not the Holy Grail; at best, it is the Holy Colander.

