Children's Internet Protection Act (CIPA) Ruling Part 2

1. Internet Use Policies in Public Libraries

Approximately 95% of libraries with public Internet access have some form of "acceptable use" policy or "Internet use" policy governing patrons' use of the Internet. These policies set forth the conditions under which patrons are permitted to access and use the library's Internet resources. These policies vary widely. Some of the less restrictive policies, like those of the Multnomah County Library and the Fort Vancouver Regional Library, do not prohibit adult patrons from viewing sexually explicit materials on the Web, as long as they do so at terminals with privacy screens or recessed monitors, which are designed to prevent other patrons from seeing the material that they are viewing, and as long as it does not violate state or federal law to do so. Other libraries prohibit their patrons from viewing all "sexually explicit" or "sexually graphic" materials.

Some libraries prohibit the viewing of materials that are not necessarily sexual, such as Web pages that are "harmful to minors," "offensive to the public," "objectionable," "racially offensive," or simply "inappropriate." Other libraries restrict access to Web sites that the library just does not want to provide, even though the sites are not necessarily offensive.

For example, the Fulton County Public Library restricts access to the Web sites of dating services. Similarly, the Tacoma Public Library's policy does not allow patrons to use the library's Internet terminals for personal email, for online chat, or for playing games.

In some cases, libraries instituted Internet use policies after having experienced specific problems, whereas in other cases, libraries developed detailed Internet use policies and regulatory measures (such as using filtering software) before ever offering public Internet access. Essentially four interests motivate libraries to institute Internet use policies and to apply the methods described above to regulate their patrons' use of the Internet.

First, libraries have sought to protect patrons (especially children) and staff members from accidentally viewing sexually explicit images, or other Web pages containing content deemed harmful, that other patrons are viewing on the Internet. For example, some librarians who testified described situations in which patrons left sexually explicit images minimized on an Internet terminal so that the next patron would see them when they began using it, or in which patrons printed sexually explicit images from a Web site and left them at a public printer.

Second, libraries have attempted to protect patrons from unwittingly or accidentally accessing Web pages that they do not wish to see while they are using the Internet. For example, the Memphis-Shelby County (Tennessee) Public Library's Internet use policy states that the library "employs filtering technology to reduce the possibility that customers may encounter objectionable content in the form of depictions of full nudity and sexual acts."

Third, libraries have sought to keep patrons (again, especially children) from intentionally accessing sexually explicit materials or other materials that the library deems inappropriate. For example, a study of the Tacoma Public Library's Internet use logs for the year 2000 showed that users between the ages of 11 and 15 accounted for 41% of the filter blocks that occurred on library computers. The study, which we credit, concluded that children and young teens were actively seeking to access sexually explicit images in the library. The Greenville Library's Board of Directors was particularly concerned that patrons were accessing obscene materials in the public library in violation of South Carolina's obscenity statute.

Finally, some libraries have regulated patrons' Internet use to attempt to control patrons' inappropriate (or illegal) behavior that is thought to stem from viewing Web pages that contain sexually explicit materials or content that is otherwise deemed unacceptable.

We recognize the concerns that led several of the public libraries whose librarians and board members testified in this case to start using Internet filtering software. The testimony of the Chairman of the Board of the Greenville Public Library is illustrative. In December 1999, there was considerable local press coverage in Greenville concerning adult patrons who routinely used the library to surf the Web for pornography. In response to public outcry stemming from the newspaper report, the Board of Trustees held a special board meeting to obtain information and to communicate with the public concerning the library's provision of Internet access. At this meeting, the Board learned for the first time of complaints about children being exposed to pornography that was displayed on the library's Internet terminals.

In late January to early February of 2000, the library installed privacy screens and recessed terminals in an effort to restrict the display of sexually explicit Web sites at the library. In February 2000, the Board informed the library staff that they were expected to be familiar with the South Carolina obscenity statute and to enforce the policy prohibition on access to obscene materials, child pornography, or other materials prohibited under applicable local, state, and federal laws.

Staff were told that they were to enforce the policy by means of a "tap on the shoulder." Prior to adopting its current Internet Use Policy, the Board adopted an "Addendum to Current Internet Use Policy." Under the addendum, the Board temporarily instituted a two-hour daily time limit for Internet use; substantially reduced the number of computers with Internet access in the library; reconfigured the location of the computers so that librarians had visual contact with all Internet-accessible terminals; and removed the privacy screens from terminals with Internet access.

Even after the Board implemented the privacy screens and later the "tap-on-the-shoulder" policy combined with placing terminals in view of librarians, the library experienced a high turnover rate among reference librarians who worked in view of Internet terminals. Finding that the policies that it had tried did not prevent the viewing of sexually explicit materials in the library, the Board at one point considered discontinuing Internet access in the library. The Board finally concluded that the methods that it had used to regulate Internet use were not sufficient to stem the behavioral problems that it thought were linked to the availability of pornographic materials in the library. As a result, it implemented a mandatory filtering policy.

We note, however, that none of the libraries proffered by the defendants presented any systematic records or quantitative comparison of the amount of criminal or otherwise inappropriate behavior that occurred in their libraries before and after they installed Internet filtering software. The plaintiffs' witnesses also testified that because public libraries are public places, incidents involving inappropriate behavior in libraries (sexual and otherwise) existed long before libraries provided access to the Internet.

2. Methods for Regulating Internet Use

The methods that public libraries use to regulate Internet use vary greatly. They can be organized into four categories: (1) channeling patrons' Internet use; (2) separating patrons so that they will not see what other patrons are viewing; (3) placing Internet terminals in public view and having librarians observe patrons to make sure that they are complying with the library's Internet use policy; and (4) using Internet filtering software.

The first category, channeling patrons' Internet use, frequently includes offering training to patrons on how to use the Internet, including how to access the information that they want and to avoid the materials that they do not want. Another technique that some public libraries use to direct their patrons to pages that the libraries have determined to be accurate and valuable is to establish links to "recommended Web sites" from the public library's home page (i.e., the page that appears when patrons begin a session at one of the library's public Internet terminals). Librarians select these recommended Web sites by using criteria similar to those employed in traditional collection development. However, unless the library determines otherwise, selection of these specific sites does not preclude patrons from attempting to access other Web sites.

Libraries may extend the "recommended Web sites" method further by limiting patrons' access to only those Web sites that are reviewed and selected by the library's staff. For example, in 1996, the Westerville, Ohio Library offered Internet access to children through a service called the "Library Channel." This service was intended to be a means by which the library could organize the Internet in some fashion for presentation to patrons. Through the Library Channel, the computers in the children's section of the library were restricted to 2,000 to 3,000 sites selected by librarians. After three years, Westerville stopped using the Library Channel system because it overly constrained the children's ability to access materials on the Internet, and because the library experienced several technical problems with the system.

Public libraries also use several different techniques to separate patrons during Internet sessions so that they will not see what other patrons are viewing. The simplest way to achieve this result is to position the library's public Internet terminals so that they are located away from traffic patterns in the library (and from other terminals), for example, by placing them so that they face a wall. This method is obviously constrained by libraries' space limitations and physical layout.

Some libraries have also installed privacy screens on their public Internet terminals. These screens make a monitor appear blank unless the viewer is looking at it head-on. Although the Multnomah and Fort Vancouver Libraries submitted records showing that they have received few complaints regarding patrons' unwilling exposure to materials on the Internet, privacy screens do not always prevent library patrons or employees from inadvertently seeing the materials that another patron is viewing when passing directly behind a terminal. They also have the drawback of making it difficult for patrons to work together at a single terminal, or for librarians to assist patrons at terminals, because it is difficult for two people to stand side by side and view a screen at the same time. Some library patrons also find privacy screens to be a hindrance and have attempted to remove them in order to improve the brightness of the screen or to get a better view of it.

Another method that libraries use to prevent patrons from seeing what other patrons are viewing on their terminals is the installation of "recessed monitors." Recessed monitors are computer screens that sit below the level of a desk top and are viewed from above. Although recessed monitors, especially when combined with privacy screens, eliminate almost all of the possibility of a patron accidentally viewing the contents on another patron's screen, they suffer from the same drawbacks as privacy screens, that is, they make it difficult for patrons to work together or with a librarian at a single terminal. Some librarians also testified that recessed monitors are costly, but did not indicate how expensive they are compared to privacy screens or filtering software. A related technique that some public libraries use is to create a separate children's Internet viewing area, where no adults except those accompanying children in their care may use the Internet terminals. This serves the objective of keeping children from inadvertently viewing materials appropriate only for adults that adults may be viewing on nearby terminals.

A third set of techniques that public libraries have used to enforce their Internet use policies takes the opposite tack from the privacy screens/recessed monitors approach by placing all of the library's public Internet terminals in prominent and visible locations, such as near the library's reference desk. This approach allows librarians to enforce their library's Internet use policy by observing what patrons are viewing and employing the tap-on-the-shoulder policy. Under this approach, when patrons are viewing materials that are inconsistent with the library's policies, a library staff member approaches them and asks them to view something else, or may ask them to end their Internet session. A patron who does not comply with these requests, or who repeatedly views materials not permitted under the library's Internet use policy, may have his or her Internet or library privileges suspended or revoked. But many librarians are uncomfortable with approaching patrons who are viewing sexually explicit images, finding confrontation unpleasant. Hence some libraries are reluctant to apply the tap-on-the-shoulder policy.

The fourth category of methods that public libraries employ to enforce their Internet use policies, and the one that gives rise to this case, is the use of Internet filtering software.

According to the June 2000 Survey of Internet Access Management in Public Libraries, approximately 7% of libraries with public Internet access had mandated the use of blocking programs by adult patrons. Some public libraries provide patrons with the option of using a blocking program, allowing patrons to decide whether to engage the program when they or their children access the Internet. Other public libraries require their child patrons to use filtering software, but not their adult patrons.

Filtering software vendors sell their products on a subscription basis. The cost of a subscription varies with the number of computers on which the filtering software will be used.

In 2001, the cost of the Cyber Patrol filtering software was $1,950 for 100 terminal licenses. The Greenville County Library System pays $2,500 per year for the N2H2 filtering software, and a subscription to the Websense filter costs Westerville Public Library approximately $1,200 per year.

No evidence was presented on the cost of privacy screens, recessed monitors, and the tap-on-the-shoulder policy relative to the cost of filtering software. Nor did any of the libraries proffered by the government present any quantitative evidence on the relative effectiveness of privacy screens, as compared with the filters discussed below, in preventing patrons from being unwillingly exposed to sexually explicit material. No evidence was presented, for example, comparing the number of patron complaints in those libraries that have tried both methods.

The libraries whose librarians testified at trial and that use Internet filtering software all provide methods by which their patrons may ask the library to unblock specific Web sites or pages. Of these, only the Tacoma Public Library allows patrons to request that a URL be unblocked without providing any identifying information; Tacoma allows patrons to request a URL by sending an email from the Internet terminal that the patron is using that does not contain a return email address for the user.

David Biek, the head librarian at the Tacoma Library's main branch, testified at trial that the library keeps records that would enable it to know which patrons made unblocking requests, but does not use that information to connect users with their requests. Biek also testified that he periodically scans the library's Internet use logs to search for: (1) URLs that were erroneously blocked, so that he may unblock them; or (2) URLs that should have been blocked, but were not, in order to add them to a blocked category list. In the course of scanning the use logs, Biek has also found what looked like attempts to access child pornography. In two cases, he communicated his findings to law enforcement and turned over the logs in response to a subpoena.

At all events, it takes time for librarians to make decisions about whether to honor patrons' requests to unblock Web pages. In the libraries proffered by the defendants, unblocking decisions sometimes take between 24 hours and a week. Moreover, none of these libraries allows unrestricted access to the Internet pending a determination of the validity of a Web site blocked by the blocking programs. A few of the defendants' proffered libraries represented that individual librarians would have the discretion to allow a patron to have full Internet access on a staff computer upon request, but none claimed that allowing such access was mandatory, and patron access is supervised in every instance. None of these libraries makes differential unblocking decisions based on the patron's age. Unblocking decisions are usually made identically for adults and minors, and unblocking decisions even for adults are usually based on the suitability of the Web site for minors.

It is apparent that many patrons are reluctant or unwilling to ask librarians to unblock Web pages or sites that contain only materials that might be deemed personal or embarrassing, even if they are not sexually explicit or pornographic. We credit the testimony of Emmalyn Rood, discussed above, that she would have been unwilling as a young teen to ask a librarian to disable filtering software so that she could view materials concerning gay and lesbian issues. We also credit the testimony of Mark Brown, who stated that he would have been too embarrassed to ask a librarian to disable filtering software if it had impeded his ability to research treatments and cosmetic surgery options for his mother when she was diagnosed with breast cancer.

The pattern of patron requests to unblock specific URLs in the various libraries involved in this case also confirms our finding that patrons are largely unwilling to make unblocking requests unless they are permitted to do so anonymously. For example, the Fulton County Library receives only about 6 unblocking requests each year, the Greenville Public Library has received only 28 unblocking requests since August 21, 2000, and the Westerville, Ohio Library has received fewer than 10 unblocking requests since 1999. In light of the fact that a substantial amount of overblocking occurs in these very libraries, see infra Subsection II.E.4, we find that the lack of unblocking requests in these libraries does not reflect the effectiveness of the filters, but rather reflects patrons' reluctance to ask librarians to unblock sites.

E. Internet Filtering Technology

1. What Is Filtering Software, Who Makes It, and What Does It Do?

Commercially available products that can be configured to block or filter access to certain material on the Internet are among the "technology protection measures" that may be used to attempt to comply with CIPA. There are numerous filtering software products available commercially. Three network-based filtering products (SurfControl's Cyber Patrol, N2H2's Bess/i2100, and Secure Computing's SmartFilter) currently have the lion's share of the public library market. The parties in this case deposed representatives from these three companies.

Websense, another network-based blocking product, is also currently used in the public library market, and was discussed at trial.

Filtering software may be installed either on an individual computer or on a computer network. Network-based filtering software products are designed for use on a network of computers and funnel requests for Internet content through a centralized network device. Of the various commercially available blocking products, network-based products are the ones generally marketed to institutions, such as public libraries, that provide Internet access through multiple terminals.
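
To illustrate the network-based architecture described above, the following is a minimal sketch in Python of a centralized device through which patrons' plain-HTTP requests are funneled and checked before being forwarded. The host names, blocked list, and denial page are hypothetical, HTTPS is not handled, and the sketch is not a representation of any vendor's actual product.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlsplit
    from urllib.request import urlopen

    BLOCKED_HOSTS = {"blocked.example"}  # hypothetical control list

    DENIAL_PAGE = (b"<html><body>This site has been blocked by the "
                   b"library's Internet filtering software.</body></html>")

    class FilteringProxy(BaseHTTPRequestHandler):
        """Handles plain-HTTP GET requests routed through the central device."""

        def do_GET(self):
            # A forward proxy receives the full URL in the request line.
            host = urlsplit(self.path).hostname or ""
            if host in BLOCKED_HOSTS:
                self.send_response(403)
                self.send_header("Content-Type", "text/html")
                self.end_headers()
                self.wfile.write(DENIAL_PAGE)  # a simple block notice
                return
            try:
                with urlopen(self.path, timeout=10) as upstream:
                    body = upstream.read()
                    content_type = upstream.headers.get("Content-Type", "text/html")
                self.send_response(200)
                self.send_header("Content-Type", content_type)
                self.end_headers()
                self.wfile.write(body)
            except Exception:
                self.send_error(502, "Could not reach the requested site")

    if __name__ == "__main__":
        HTTPServer(("", 8080), FilteringProxy).serve_forever()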

Filtering programs function in a fairly simple way. When an Internet user requests access to a certain Web site or page, either by entering a domain name or IP address into a Web browser, or by clicking on a link, the filtering software checks that domain name or IP address against a previously compiled "control list" that may contain up to hundreds of thousands of URLs. The three companies deposed in this case have control lists containing between 200,000 and 600,000 URLs. These lists determine which URLs will be blocked.

Filtering software companies divide their control lists into multiple categories for which they have created unique definitions. SurfControl uses 40 such categories, N2H2 uses 35 categories (and seven "exception" categories), Websense uses 30 categories, and Secure Computing uses 30 categories. Filtering software customers choose which categories of URLs they wish to enable. A user "enables" a category in a filtering program by configuring the program to block all of the Web pages listed in that category.
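
The lookup and category-enabling steps described in the two preceding paragraphs can be sketched as follows (in Python). The control list, category names, and hosts below are hypothetical and vastly smaller than the proprietary lists the vendors maintain; the sketch is illustrative only.

    from urllib.parse import urlsplit

    # Hypothetical pre-compiled control list mapping hosts to a category tag.
    CONTROL_LIST = {
        "www.casino.example": "Gambling",
        "www.adult.example": "Adult/Sexually Explicit",
        "www.news.example": "News",
    }

    # Categories the customer (for example, a library) has chosen to enable.
    ENABLED_CATEGORIES = {"Adult/Sexually Explicit", "Gambling"}

    def is_blocked(requested_url: str) -> bool:
        """Return True when the requested host is listed under an enabled category."""
        host = urlsplit(requested_url).hostname
        category = CONTROL_LIST.get(host)
        return category is not None and category in ENABLED_CATEGORIES

    print(is_blocked("http://www.casino.example/slots"))  # True: "Gambling" is enabled
    print(is_blocked("http://www.news.example/today"))    # False: "News" is not enabled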

The following is a list of the categories offered by each of these four filtering programs. SurfControl's Cyber Patrol offers the following categories: Adult/Sexually Explicit; Advertisements; Arts & Entertainment; Chat; Computing & Internet; Criminal Skills; Drugs, Alcohol & Tobacco; Education; Finance & Investment; Food & Drink; Gambling; Games; Glamour & Intimate Apparel; Government & Politics; Hacking; Hate Speech; Health & Medicine; Hobbies & Recreation; Hosting Sites; Job Search & Career Development; Kids' Sites; Lifestyle & Culture; Motor Vehicles; News; Personals & Dating; Photo Searches; Real Estate; Reference; Religion; Remote Proxies; Sex Education; Search Engines; Shopping; Sports; Streaming Media; Travel; Usenet News; Violence; Weapons; and Web-based Email.

N2H2 offers the following categories: Adults Only; Alcohol; Auction; Chat; Drugs; Electronic Commerce; Employment Search; Free Mail; Free Pages; Gambling; Games; Hate/Discrimination; Illegal; Jokes; Lingerie; Message/Bulletin Boards; Murder/Suicide; News; Nudity; Personal Information; Personals; Pornography; Profanity; Recreation/Entertainment; School Cheating Information; Search Engines; Search Terms; Sex; Sports; Stocks; Swimsuits; Tasteless/Gross; Tobacco; Violence; and Weapons. The "Nudity" category purports to block only "non-pornographic" images. The "Sex" category is intended to block only those depictions of sexual activity that are not intended to arouse. The "Tasteless/Gross" category includes contents such as "tasteless humor" and "graphic medical or accident scene photos."

Additionally, N2H2 offers seven "exception categories." These exception categories include Education, Filtered Search Engine, For Kids, History, Medical, Moderated, and Text/Spoken Only.

When an exception category is enabled, access to any Web site or page via a URL associated with both a category and an exception, for example, both "Sex" and "Education," will be allowed, even if the customer has enabled the product to otherwise block the category "Sex." As of November 15, 2001, of those Web sites categorized by N2H2 as "Sex," 3.6% were also categorized as "Education," 2.9% as "Medical," and 1.6% as "History."
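
A minimal sketch of the exception-category rule described above, again using hypothetical tags and hosts; N2H2's actual data and matching logic are proprietary and may differ.

    # Each URL may carry several tags: ordinary categories and exception categories.
    URL_TAGS = {
        "www.sexed.example": {"Sex", "Education"},
        "www.porn.example": {"Sex"},
    }

    ENABLED_CATEGORIES = {"Sex"}                   # categories the library blocks
    ENABLED_EXCEPTIONS = {"Education", "Medical"}  # exception categories the library enables

    def is_blocked(host: str) -> bool:
        tags = URL_TAGS.get(host, set())
        if tags & ENABLED_EXCEPTIONS:
            return False  # an enabled exception category overrides the block
        return bool(tags & ENABLED_CATEGORIES)

    print(is_blocked("www.sexed.example"))  # False: the "Education" exception applies
    print(is_blocked("www.porn.example"))   # True: "Sex" is enabled and no exception applies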

Websense offers the following categories: Abortion Advocacy; Advocacy Groups; Adult Material; Business & Economy; Drugs; Education; Entertainment; Gambling; Games; Government; Health; Illegal/Questionable; Information Technology; Internet Communication; Job Search; Militancy/Extremist; News & Media; Productivity Management; Bandwidth Management; Racism/Hate; Religion; Shopping; Society & Lifestyle; Special Events; Sports; Tasteless; Travel; Vehicles; Violence; and Weapons. The "Adult" category includes "full or partial nudity of individuals," as well as sites offering "light adult humor and literature" and "[s]exually explicit language." The "Sexuality/Pornography" category includes, inter alia, "hard-core adult humor and literature" and "[s]exually explicit language." The "Tasteless" category includes "hard-to-stomach sites, including offensive, worthless or useless sites, grotesque or lurid depictions of bodily harm." The "Hacking" category blocks "sites providing information on or promoting illegal or questionable access to or use of communications equipment and/or software."

SmartFilter offers the following categories: Anonymizers/Translators; Art & Culture; Chat; Criminal Skills; Cults/Occult; Dating; Drugs; Entertainment; Extreme/Obscene/Violence; Gambling; Games; General News; Hate Speech; Humor; Investing; Job Search; Lifestyle; Mature; MP3 Sites; Nudity; On-line Sales; Personal Pages; Politics, Opinion & Religion; Portal Sites; Self-Help/Health; Sex; Sports; Travel; Usenet News; and Webmail.

Most importantly, no category definition used by filtering software companies is identical to CIPA's definitions of visual depictions that are obscene, child pornography, or harmful to minors. And category definitions and categorization decisions are made without reference to local community standards.

Moreover, there is no judicial involvement in the creation of filtering software companies' category definitions and no judicial determination is made before these companies categorize a Web page or site.

Each filtering software company associates each URL in its control list with a "tag" or other identifier that indicates the company's evaluation of whether the content or features of the Web site or page accessed via that URL meet one or more of its category definitions. If a user attempts to access a Web site or page that is blocked by the filter, the user is immediately presented with a screen that indicates that a block has occurred as a result of the operation of the filtering software. These "denial screens" appear only at the point that a user attempts to access a site or page in an enabled category.

All four of the filtering programs on which evidence was presented allow users to customize the category lists that exist on their own PCs or servers by adding or removing specific URLs.

For example, if a public librarian charged with administering a library's Internet terminals comes across a Web site that he or she finds objectionable that is not blocked by the filtering program that his or her library is using, then the librarian may add that URL to a category list that exists only on the library's network, and it would thereafter be blocked under that category.

Similarly, a customer may remove individual URLs from category lists. Importantly, however, no one but the filtering companies has access to the complete list of URLs in any category. The actual URLs or IP addresses of the Web sites or pages contained in filtering software vendors' category lists are considered to be proprietary information, and are unavailable for review by customers or the general public, including the proprietors of Web sites that are blocked by filtering software.
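
The local customization described in the preceding paragraphs can be sketched as follows, assuming a hypothetical vendor list and hosts. The point the sketch reflects is that the vendor's category list remains opaque to the customer, who can only layer local additions and removals on top of it.

    # The vendor-supplied list is opaque to the customer; its contents are unknown.
    VENDOR_BLOCKED = {"www.porn.example", "www.casino.example"}  # hypothetical

    class LocalOverrides:
        def __init__(self):
            self.locally_added = set()    # URLs the librarian has added to a local blocked list
            self.locally_removed = set()  # URLs the librarian has locally unblocked

        def add(self, host):
            self.locally_added.add(host)
            self.locally_removed.discard(host)

        def remove(self, host):
            self.locally_removed.add(host)
            self.locally_added.discard(host)

        def is_blocked(self, host):
            if host in self.locally_removed:
                return False
            return host in self.locally_added or host in VENDOR_BLOCKED

    overrides = LocalOverrides()
    overrides.add("www.objectionable.example")   # librarian blocks a site the vendor missed
    overrides.remove("www.casino.example")       # librarian honors an unblocking request
    print(overrides.is_blocked("www.objectionable.example"))  # True
    print(overrides.is_blocked("www.casino.example"))         # False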

Filtering software companies do not generally notify the proprietors of Web sites when they block their sites. The only way to discover which URLs are blocked and which are not blocked by any particular filtering company is by testing individual URLs with filtering software, or by entering URLs one by one into the "URL checker" that most filtering software companies provide on their Web sites. Filtering software companies will entertain requests for recategorization from proprietors of Web sites that discover their sites are blocked. Because new pages are constantly being added to the Web, filtering companies provide their customers with periodic updates of category lists. Once a particular Web page or site is categorized, however, filtering companies generally do not re-review the contents of that page or site unless they receive a request to do so, even though the content on individual Web pages and sites changes frequently.

2. The Methods that Filtering Companies Use to Compile Category Lists

While the way in which filtering programs operate is conceptually straightforward (comparing a requested URL to a previously compiled list of URLs and blocking access to the content at that URL if it appears on the list), accurately compiling and categorizing URLs to form the category lists is a more complex process that is impossible to conduct with any high degree of accuracy. The specific methods that filtering software companies use to compile and categorize control lists are, like the lists themselves, proprietary information. We will therefore set forth only general information on the various types of methods that all filtering companies deposed in this case use, and the sources of error that are at once inherent in those methods and unavoidable given the current architecture of the Internet and the current state of the art in automated classification systems. We base our understanding of these methods largely on the detailed testimony and expert report of Dr. Geoffrey Nunberg, which we credit. The plaintiffs offered, and the Court qualified, Nunberg as an expert witness on automated classification systems.

When compiling and categorizing URLs for their category lists, filtering software companies go through two distinct phases. First, they must collect or "harvest" the relevant URLs from the vast number of sites that exist on the Web. Second, they must sort through the URLs they have collected to determine under which of the company's self-defined categories (if any) they should be classified. These tasks necessarily result in a tradeoff between overblocking (i.e., blocking content that does not meet the category definitions established by CIPA or by the filtering software companies) and underblocking (i.e., leaving off a control list a URL whose content would meet the category definitions established by CIPA or the filtering software companies).
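
Because the classification phase is automated, the overblocking/underblocking tradeoff can be illustrated with a deliberately crude keyword classifier. This sketch does not reflect any company's actual harvesting or classification methods, which are proprietary; it merely shows how a mechanical rule can both sweep in benign pages and miss objectionable ones.

    # Hypothetical keyword rule for a single category.
    KEYWORDS = {"Sex": {"sex", "nude", "xxx"}}

    def classify(page_text):
        words = set(page_text.lower().split())
        return {category for category, terms in KEYWORDS.items() if words & terms}

    # Overblocking: a health-related page trips the keyword rule and is tagged "Sex".
    print(classify("breast reconstruction after cancer surgery and sex after treatment"))

    # Underblocking: an explicit page that avoids the listed keywords is missed entirely.
    print(classify("explicit adult imagery gallery"))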
