Filter Schmilter: Libraries and Internet Filtering Software
Introduction

Remember the early 1990s, when so many of us thought the Internet was a fad, a flash-in-the-pan amusement for people with more time than brains? Many of us were reluctant to embrace the new technology, partly because we did not want to spend money on something that could turn out to be ephemeral and partly because we did not yet understand the Internet's ultimate usefulness. How little we knew. Anyone who works on a library reference desk can tell you the Internet has become so essential that it is now difficult to imagine working without it. It has become as important to librarians as books are, and for some reference needs, such as accessing organizations' contact information or looking up current legislation, the Internet far surpasses print sources for authoritativeness, currency, and ease of use. In fact, many organizations have already begun eliminating the offline alternative: some post-secondary schools, for example, disseminate almost all information about their programs online, minimizing print publications and in some cases eliminating them altogether. The Internet is used so much in our schools nowadays that even the most curmudgeonly of Luddites cannot deny that it is an established part of our education systems: at least 93% of Canadian schools use the Internet as a teaching tool (Statistics Canada, 2003/2004). In the United States, the figure is almost 99% (Simmons, 2005).

Educational purposes aside, however, the Internet is also a dark and seamy place. There has never been a centralized authority to control what goes online, and today it would be impossible to impose such authority (Benson, 2003). Uncensored information and images can be posted with relative ease by virtually anyone. Internet pornography has grown into a 2.5-billion-dollar business, with every indication that this growth will continue. How many porn sites are there on the Web? Depending on whom you ask, the number is anywhere between 200,000 and 4.2 million (Musgrove, 2006). Porn site operators, just like any other online entrepreneurs, design their sites to encourage and facilitate access. Nor is pornography the only contentious area: the Internet is also a prime communications medium for racist groups, political propaganda, the promotion of illegal drug use, libel, violent images, and other objectionable materials. One vendor of blocking software claims there are 400 million objectionable websites on his company's list of blocked sites (Guelph Mercury, 2005).

Internet filtering software has often been proposed as the solution to the problem of protecting children from inappropriate materials. According to its vendors, you install the software and worry no more. Unfortunately, nothing is ever as easy as that. This modest capriccioso of a paper will examine the pros and cons of Internet filtering software, assess its effectiveness for its intended purpose, and discuss whether Canadian libraries should use it. The focus here will be on issues rather than on technological specifics.

Blocking versus Filtering Software

Libraries are conflicted on the issue of Internet filtering software. Librarians by nature want to provide patrons with access to as much information as possible; in this context, there is no such thing as too much information. This ideal is espoused by both the American Library Association (ALA) and the Canadian Library Association (CLA); both organizations unreservedly advocate the principles of intellectual freedom and universal access to information.
Even so, libraries have traditionally maintained control over the materials their patrons could access. Librarians examined materials, selected the right ones for their collections, and took a pass on the unsuitable ones. This laborious but simple process kept inappropriate materials out of the library, but it no longer works this way: in the space of a few years, we have seen how the Internet "totally changes the paradigms for the distribution of information" (Auld, 2005). Libraries provide Internet access and are consequently drawn, willingly or not, into censorship controversies. Many libraries have opted to use Internet filtering or blocking software, while others have adamantly refused to do so.

Development of this software, also known as "censorware," began in the early 1990s. It was generally unknown before a Time magazine cover story drew national attention to the presence of pornography on the Internet and how easily it could be viewed by minors (Benson, 2003). The story, entitled "Cyberporn," was one of the first mainstream news stories to cover the subject, and the resultant publicity probably created a boom period for online porn (Elmer-Dewitt, 1995). A sidebar to the story somewhat simplistically described filtering software as the solution to the problem; the Time authors did not test the software and could not comment on whether or not it actually worked (Quittner, 1995).

While some people, including legislators and vendors who should know better, use the terms "filtering" and "blocking" more or less interchangeably, there are important differences between the two. Filtering software denies access to a website based on its content, while blocking software denies access based on the offending site's URL. Blocking software contains lists of hundreds of thousands or even millions of URLs deemed to contain offensive content and simply does not permit access to these sites (Boss, 2006). A drawback to blocking software is that a contentious URL has to be added to the list, which cannot be done until the vendor has viewed the site. Most vendors update their lists of prohibited URLs regularly, but there is still a delay in getting new URLs included. Filtering denies access to sites based on keywords, such as common profanities and slang terms for genitalia or sex acts as well as proper clinical terms, but it is far from perfect for its intended purpose. If the software denies access to sites containing the word "breast," for example, it will prevent users from viewing some porn sites, but it will also deny access to sites containing information about breast cancer, double-breasted men's suits, or supermarket chicken breast prices. Try to imagine a librarian refusing requests for information about any of these subjects. Depending on how the particular software is configured, if "sex" is one of the prohibited keywords, computer users may be unable to view websites that mention the Earl of Sussex or the musician Ron Sexsmith (Wikipedia, Censorware). A single mention of the word "sex" caused X-STOP, a popular brand of filtering software, to deny access to the website of the Religious Society of Friends, which offers Quaker-related information and links (Meeder, 2005). If the paper you are now reading were posted online, access to it, too, could be denied by filtering software, even though these pages contain nothing pornographic or offensive.
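To make the distinction concrete, here is a minimal, hypothetical sketch in Python of the two approaches. The keyword list, the blocklist, and the sample pages are invented for illustration; commercial products use vastly larger lists and more elaborate matching rules, but the failure modes are the same.

```python
# A minimal, purely illustrative sketch of the two approaches described above.
# The keyword list, blocklist, and sample pages are invented for this example.

BLOCKED_KEYWORDS = ["sex", "breast"]                      # content-based filter
BLOCKED_URLS = {"http://some-known-porn-site.example/"}   # vendor-maintained blocklist

def keyword_filter(page_text: str) -> bool:
    """Deny the page if it contains any blocked keyword anywhere in its text."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

def url_blocker(url: str) -> bool:
    """Deny the page if its URL appears on the vendor's list."""
    return url in BLOCKED_URLS

# Over-blocking: legitimate pages trip the keyword filter.
print(keyword_filter("Early detection of breast cancer saves lives."))  # True (blocked)
print(keyword_filter("A biography of the Earl of Sussex."))             # True (blocked)

# Staleness: a brand-new site is not yet on the vendor's list.
print(url_blocker("http://brand-new-porn-site.example/"))               # False (allowed)
```

Even this toy version shows the trade-off described above: the keyword filter over-blocks legitimate pages, while the URL blocklist under-blocks anything the vendor has not yet reviewed.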
Evaluating the Effectiveness of Filtering Software

Right from the beginning, a common complaint about filtering software has been that it is inconsistent for its intended purpose. The filters prevent access to some, but not all, websites with objectionable content while, at the same time, preventing access to sites with educational, non-offensive content that happens to include keywords pre-designated as offensive.

Consumer Reports tested filtering and blocking products in 1997, when the software was still in its infancy, and found it, for the most part, feeble. Of the four products tested, one allowed access to 19 of 22 pornographic websites, while another, NetNanny Version 3.1, allowed access to all 22 (Consumer Reports, 1997). The tests revealed other problems, too: not all of the products worked with all online services; some were extremely rigid in what they did and did not block; and some carried warnings from the manufacturer that tampering with the software in any way, such as trying to alter what it does or does not permit access to, could disable your computer. In 1997, filtering software was probably more trouble than it was worth.

Consumer Reports tested filtering and blocking software again in 2001. Once more, the product performances were spotty: the software generally denied access to some but not all objectionable sites while at the same time prohibiting access to educational sites (Consumer Reports, 2001). Some of the tested software disallowed access to the National Institute on Drug Abuse, the Southern Poverty Law Center, and Rutgers University's Sex, Etc., all of which are legitimate educational websites. Once again, NetNanny performed particularly poorly: in one test, it allowed access to a number of adult sites, expunging unacceptable words but failing to filter sexually graphic images. Consumer Reports found that the filters which most effectively blocked objectionable sites were also the most likely to block legitimate ones.

In 2005, Consumer Reports again tested filtering and blocking software and concluded that although it had improved overall in its ability to deny access to pornographic websites, it was virtually useless for denying access to sites that promote race hatred, violence, or recreational drug use (Consumer Reports, 2005). Most of the tested products stopped access to almost all pornographic sites, although one of them blocked only 88% of them. The math on this last example is discomforting: if, as noted above, there are at least 200,000 porn sites on the Web, then the 12% this product misses amounts to 24,000 sites, and 200,000 is only the low end of the estimates. Once again, the products that filtered out porn most effectively also tended to filter out sites containing references to sex education, health, civil rights, politics, and gender issues. Seven of the eleven products denied access to an entire Google or Yahoo results page if one of the links on the page contained an objectionable word.

As it had done in previous years, Consumer Reports charted its 2005 ratings. In this chart, Consumer Reports rates each product in a number of categories but seems to underrate the importance of protection: to earn a "good" rating in that category, a filter needed to deny access to only 76% of offensive websites. This is hardly adequate; with at least 200,000 pornographic sites in existence, a filter that lets 24% through still permits access to some 48,000 of them.
The danger with software like this is that it gives parents a false sense of security.

While the filtering of keywords is fraught with complications, the filtering of images is even more problematic. Pornography is more visual than cerebral: erotic or titillating prose has existed since ancient times, but when we discuss pornography, especially on the Internet, we are almost always talking about visual depictions rather than mere descriptions. Herein lies a huge technological obstacle for proponents of filtering: today's filters can recognize words but cannot censor images, so images are filtered only if they appear on sites that contain objectionable words (Bick, 2006). As long as websites avoid using the words that are likely to get filtered, and these words are relatively predictable, filters will permit access. Deliberate misspellings may also foil filters, for example writing sex as "sx" or ass as "azz," while still letting users know that the site contains sexual content, although filter vendors eventually catch on to these misspellings and add them to their lists of targeted keywords. Some filters can also be bypassed simply by searching in another language (Wikipedia, Censorware). And keep in mind that there will always be computer-savvy kids who find ways to thwart filtering software (Simmons, 2005); in fact, there are websites that give tutorials and tips on how to do so effectively (StupidCensorship.com; HOWTO Bypass Internet Censorship; Seth Finkelstein's Anticensorware Investigations).
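To illustrate how little effort such evasion takes, here is another minimal, hypothetical sketch using the same toy keyword filter as in the earlier example; the word list and sample page texts are invented for illustration only.

```python
# A purely illustrative sketch of keyword evasion; the word list and the
# sample page texts are invented and far smaller than any real vendor list.

BLOCKED_KEYWORDS = ["sex", "ass"]

def keyword_filter(page_text: str) -> bool:
    """Deny the page if it contains any blocked keyword anywhere in its text."""
    text = page_text.lower()
    return any(keyword in text for keyword in BLOCKED_KEYWORDS)

print(keyword_filter("Explicit sex pics inside"))  # True  (blocked)
print(keyword_filter("Explicit sx pics inside"))   # False (deliberate misspelling slips through)
print(keyword_filter("Explicit azz pics inside"))  # False (slang respelling slips through)
```

A vendor can, of course, add "sx," "azz," and similar variants to its list as they are discovered, but site operators can coin new ones just as quickly, so the keyword list is perpetually a step behind.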
Filtering Software within the Library Setting

Many of the arguments against Internet filtering in libraries are based primarily on civil liberties and the unimpeded right of access to information, particularly in the U.S., where constitutional guarantees have such a longstanding legal tradition. It is a sad fact that a significant percentage of low-income people do not own computers and depend on libraries for some or all of their Internet access (Kranich, 2005). If library Internet computers are filtered, then there is in effect a two-tiered access system: high-income families who own computers get access to all Internet information, while low-income families without computers get only the filtered, inferior Internet at the library. In other words, filtering prevents the underprivileged from "gaining equal access to lawful, useful materials" (Kranich).

The Consumer Reports tests indicate that filtering software has improved in the last decade but is still highly flawed, and there are numerous other reports of instances where filtering caused frustratingly ludicrous search results for students. In one high school, a student using a library computer to research an essay on smoking was denied access to any website that mentioned smoking, even in a medical context (Barack, 2005). Another student researching medical marijuana was denied access to ninety percent of the websites on the subject and was forced to finish the project at home, an option that some students do not have (Barack). Teachers have also found that filtering software causes more problems than it solves. In one survey, teachers acknowledged the necessity of regulating or at least monitoring student Internet use, but they also reported that their schools' filtering software hampered their efforts to integrate the Internet into their lesson plans, as well as students' efforts to do research (Simmons, 2005).

Intellectual freedom is also a hot-button issue in the filtering debate. Libraries strive to permit and even encourage the consideration of diverse viewpoints and the free exchange of ideas, without interference from governments. In the U.S., libraries that do not use filters are denied some federal funding, which many librarians perceive as a daunting precedent of government-imposed censorship (Kranich, 2005).

It can be argued that filtering is little different from the selection procedures libraries have always practiced. For practical, moral, and legal reasons, the principles of universal access and intellectual freedom have never been absolute in practice. If graphically pornographic websites were pre-Internet print documents, they would not likely have been selected for library collections; it has never been the goal of libraries to make it easier for minors or anyone else to access pornography. Librarians still use their own judgment when deciding which materials to include in their collections, as they have always done, with little obligation to select material deemed unsuitable or irrelevant to their libraries' mandates (Auld, 2005). Unfortunately, the use of filters takes selection responsibility away from librarians, who are the most qualified to consider the suitability of sources on their merits, and gives that responsibility to filter vendors, who make blanket, better-safe-than-sorry decisions based on the keywords a website may contain, regardless of context. This is throwing the baby out with the bathwater and is starkly contrary to traditional library selection practices.

Governments in Canada and the U.S. adopt censorship laws in response to their constituents' desire to keep offensive or inappropriate materials out of the hands of minors, so it can be argued that Internet filtering is merely an extension of censorship laws already in place (Auld, 2005). While it is true that libraries are obliged to conform to local and federal censorship laws, censorship is not a library function. Legal authorities and parents are the ones who should decide which materials children can or cannot see. While librarians do have a role in policing the content viewed on library computers, parents have a greater one. Parents, not librarians, decide if and when their children are given library memberships, and if they are truly concerned about what their kids may see on library computers, they always have the option of accompanying them to the library and supervising their computer use. Too often, library workers are treated as babysitters and given excessive responsibility for unattended children. In this context, filtering software may be viewed by its proponents as a stand-in for parents who are for some reason unwilling to supervise their own children properly.

Freedom of Access to Information and Federal Mandates Affecting Libraries

The issue of whether or not libraries should use Internet filtering software is complex. In both Canada and the U.S., access to information is a fundamental right, and this right cannot be unreasonably denied. The Canadian Charter of Rights and Freedoms ranks "freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication" as one of the four primary fundamental freedoms (Canadian Charter of Rights and Freedoms).
These freedoms cannot be summarily abrogated or minimized: they are "subject only to such reasonable limits prescribed by law as can be demonstrably justified in a free and democratic society" (Canadian Charter of Rights and Freedoms, Part I, s. 1). While "reasonable limits prescribed by law" can be variously interpreted, there is nonetheless a foundational legal structure that protects intellectual freedom and access to information in Canada.

U.S. guarantees of intellectual freedom have been in existence much longer than Canada's, resulting in a much deeper legal entrenchment of fundamental civil rights. The 1st Amendment to the U.S. Constitution, which became law in 1791, guarantees the principles of free speech and intellectual freedom: "Congress shall make no law…abridging the freedom of speech, or of the press" (Legal Information Institute, Amendment 1). Even though this lofty notion is phrased generically enough to invite multiple interpretations, the 1st Amendment carries great weight in American courts. The Communications Decency Act (CDA), an early attempt to control Internet content, was passed by the U.S. Congress in early 1996, but by June 1997 the U.S. Supreme Court had declared much of the CDA to be an unconstitutional encroachment on 1st Amendment rights (Center for Democracy & Technology website). The successor to the CDA was the Child Online Protection Act (COPA) of 1998, but it too succumbed to a Supreme Court challenge on 1st Amendment grounds (Wikipedia, Child Online Protection Act).

In 2000, Congress passed the Children's Internet Protection Act (CIPA) (FCC website). Unlike the CDA, the CIPA does not directly target Internet content; instead, it puts the onus of access responsibility on schools and libraries. In short, schools and libraries will not receive portions of their federal funding unless they use filtering or blocking software on all Internet computers (including those used only by staff), adopt and enforce stringent Internet use policies, and monitor Internet use by minors. The CIPA is not obligatory, and any American school or library can choose to ignore it (Jaeger, 2005), which makes it more likely than its predecessors to survive 1st Amendment challenges (it has already lasted longer than the CDA and the COPA combined). Interestingly, the CIPA does not affect the estimated one-third to one-half of American public libraries that receive no federal funding subject to CIPA provisions, although its requirements may influence whether these libraries apply for such funding in the future (Jaeger, 2005). In 2005, 65% of American public libraries used some level of Internet filtering (Oder, 2006). Technically, this has all been done voluntarily, but some library managers consider the CIPA a form of blackmail: withholding badly needed funds in order to impose what amounts to censorship.

Conclusion

In general, librarians believe strongly in intellectual freedom and open access to information but at the same time recognize that allowing minors to view inappropriate websites serves no one's best interests. Filtering software is a profoundly imperfect solution to this dilemma. Since its creation in the early 1990s, filtering software has over-filtered by denying access to educational websites while at the same time under-filtering by permitting access to a significant percentage of unacceptable websites. With all its flaws, filtering software provides a false sense of security for people who count on it to protect their children.
The main problem is that software lacks the human capacity to make value judgments and cannot distinguish between unacceptable content and content that is merely sensitive. Blocking software is also highly imperfect. Vendors can update it by adding new URLs, but only after the sites have been viewed and added to the verboten list, a labour-intensive process that cannot be done without delay. This leaves a window during which users can reach unacceptable websites, and because new websites spring up every day, it is virtually impossible for blocking software to keep tabs on all inappropriate websites all of the time. It also creates a scenario in which library content is judged by software-company staff who are unlikely to have the skills of the professional librarians traditionally charged with selection.

Libraries should therefore avoid using filtering or blocking software except as an absolute last resort. The software may improve in the future: according to Consumer Reports, it is already much better than it was ten years ago, even though it is still deeply flawed (Consumer Reports, 2005). In the meantime, libraries should consider other options, such as clear, unambiguous Internet use policies, zero-tolerance policies against misuse of library computers, or simply putting computers in places where library staff can see what users are viewing (Kranich, 2005). This last suggestion is not without complications; no librarian can watch all of the computer screens all of the time. Yet until filtering software can distinguish between inappropriate and merely sensitive content, between mature subject matter and obscene subject matter, it cannot be expected to make decisions best made by librarians.

References

Is your kid caught up in the Web? (1997). Consumer Reports, 62(5), 27-31.
Digital chaperones for kids. (2001). Consumer Reports, 66(3), 20-23.
Filtering software: Better, but still fallible. (2005). Consumer Reports, 70(6), 36-38.
Sweeping the Internet clean. (2005, February 26). Guelph Mercury, p. A4.
American Library Association website. Retrieved Feb. 28, 2006 from http://www.ala.org/.
Auld, H. (2005). Filtering materials on the Internet does not contradict the value of open access to material. Public Libraries, 44(4), 196-198.
Barack, L. (2005). Filters impede learning. School Library Journal, 51(12), 24.
Benson, A. C. (2003). Connecting kids & the Web: A handbook for teaching Internet use and safety. New York: Neal-Schuman Publishers.
Bick, J. (2006, January 30). Surfing at the library could get less restrictive. New Jersey Law Journal. Retrieved October 4, 2006 from Expanded Academic ASAP via Thomson Gale.
Boss, R. W. (2006). Meeting CIPA requirements with technology. American Library Association website, Tech Notes page. Retrieved Feb. 28, 2006 from http://www.ala.org/ala/pla/plapubs/technotes/internetfiltering.htm.
Canadian Charter of Rights and Freedoms, Schedule B, Constitution Act (1982). Retrieved Feb. 28, 2006 from http://laws.justice.gc.ca/en/charter/.
Canadian Library Association website: http://www.cla.ca/.
Center for Democracy & Technology website, Communications Decency Act (CDA) page. Retrieved Feb. 28, 2006 from http://www.cdt.org/speech/cda/.
Elmer-Dewitt, P. (1995). Cyberporn. Time, 146(1), 32-39.
Federal Communications Commission (U.S.) website, FCC Consumer Facts section, Children's Internet Protection Act page. Retrieved Mar. 2, 2006 from http://www.fcc.gov/cgb/consumerfacts/cipa.html.
Finkelstein, S. Seth Finkelstein's Anticensorware Investigations – Censorware Exposed website. Retrieved Mar. 2, 2006 from http://sethf.com/anticensorware/.
HOWTO Bypass Internet Censorship website. Retrieved Feb. 28, 2006 from http://www.zensur.freerk.com/.
Jaeger, P. T., McClure, C. R., Bertot, J. C., & Langa, L. A. (2005). CIPA: Decisions, implementations, and impacts. Public Libraries, 44(2), 105-109.
Kranich, N. (2005). Filtering materials on the Internet contradicts the value of open access to material. Public Libraries, 44(4), 198-200.
Legal Information Institute website, U.S. Constitution page. Retrieved Feb. 27, 2006 from http://www.law.cornell.edu/constitution/constitution.billofrights.html#amendmenti.
Meeder, R. (2005). Access denied: Internet filtering software in K-12 classrooms. TechTrends, 49(6), 56-58, 78.
Musgrove, M. (2006, January 21). Technology's seamier side: Fates of pornography and Internet businesses are often intertwined. Washington Post, p. D1.
National Institute on Drug Abuse website. Retrieved Mar. 3, 2006 from http://www.nida.nih.gov/.
Oder, N. (2006). Ripple effects. Library Journal, 131(1), 59-60.
Quittner, J. (1995). How parents can filter out the naughty bits. Time, 146(1), 39.
The Religious Society of Friends website. Retrieved Mar. 1, 2006 from http://www.quaker.org/.
Sex, Etc. website. Retrieved Mar. 2, 2006 from http://www.sxetc.org/.
Simmons, D. G. (2005). Internet filtering: The effects in a middle and high school setting. Meridian, 8(1). Retrieved Feb. 26, 2006 from http://www.ncsu.edu/meridian/win2005/Internetfiltering/index.html.
Southern Poverty Law Center website. Retrieved Mar. 1, 2006 from http://www.splcenter.org/.
Statistics Canada website. Study: Connectivity and learning in Canadian schools, academic year 2003/2004. Retrieved Mar. 2, 2006 from http://www.statcan.ca/Daily/English/040924/d040924a.htm.
StupidCensorship.com website: http://en.wikipedia.org/wiki/Internet_filter#Bypassing_filters.
Wikipedia, Censorware page. Retrieved Feb. 28, 2006 from http://en.wikipedia.org/wiki/Censorware.
Wikipedia, Child Online Protection Act page. Retrieved Mar. 2, 2006 from http://en.wikipedia.org/wiki/Child_Online_Protection_Act.
This work is licensed under a Creative Commons Attribution 3.0 United States License.