Filtering Software
Just as some countries wish to censor the objectionable material that is on the Internet, there are Christians who wish to do the same thing in their homes, public libraries, and schools. For these Christians, there are a variety of commercial software packages available, including Bess, Cyber Patrol, CYBERsitter, KinderGuard, Net Nanny, Surf Watch, and X-Stop. As some of the names imply, many of these packages are designed to protect children against the less savory aspects of the Internet.
While each differs in its details, most of these packages purport to "filter out" webpages containing objectionable material, and so are collectively known as filtering software. Such packages generally use one or both of the following techniques:
A keyword filter intercepts incoming webpages, scans their text, and relays only those pages that are free of offensive words. Most packages allow the user to customize the list of offensive words.
An address filter intercepts incoming webpages, checks their URLs, and relays only those pages that do not come from prohibited locations. Such a filter may be set to block an entire domain, a given website, and/or specific webpages. The list of prohibited websites is usually maintained and controlled by the software company that manufactures the package, though some packages allow a user to customize this list. (Both techniques are sketched below.)
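To make the two techniques concrete, here is a minimal sketch in Python. The word list, the blocked domain, and the function names are illustrative assumptions for this page, not features of any of the commercial packages named above.

    from urllib.parse import urlparse

    # Illustrative lists only; commercial packages ship their own,
    # much larger, vendor-maintained lists.
    OFFENSIVE_WORDS = {"couple"}              # the word from the example below
    BLOCKED_DOMAINS = {"blocked-example.com"} # a hypothetical prohibited site

    def keyword_filter_allows(page_text: str) -> bool:
        """Relay a page only if none of its words is on the offensive-word list."""
        words = {word.strip(".,;:!?\"'()").lower() for word in page_text.split()}
        return words.isdisjoint(OFFENSIVE_WORDS)

    def address_filter_allows(url: str) -> bool:
        """Relay a page only if it does not come from a prohibited location."""
        host = urlparse(url).hostname or ""
        # Block the domain itself and any subdomain or page beneath it.
        return not any(host == domain or host.endswith("." + domain)
                       for domain in BLOCKED_DOMAINS)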
The task of filtering offensive content from the Web is difficult for several reasons, including:
Keyword filters cannot block graphical images (e.g., pornographic pictures), because an image contains no text for the filter to scan.
Keyword filters invariably block sites that have no offensive content. One keyword filter set to block pages containing the word "couple" blocked access to the White House webpage, where the word was used to describe the President and First Lady (the sketch following this list reproduces this kind of false positive).
Address filters cannot keep up with the growth of the Web. So many new pages (many containing potentially offensive material) are added to the Web each day that it is impossible for filtering software to screen them all; some believe it is impossible even to screen a majority of them.
Ultimately, use of either a keyword or address filter is voluntary. Put differently, a person who is determined to circumvent the filter can generally find a means of doing so.
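Two of these weaknesses are easy to reproduce with the sketch above: the keyword filter blocks an innocent sentence like the one in the White House anecdote, yet a deliberate misspelling slips straight past it. Both test sentences here are hypothetical illustrations.

    # A false positive: an innocuous page is blocked because it contains "couple".
    innocent = "The President and the First Lady are a devoted couple."
    print(keyword_filter_allows(innocent))   # False -- the page is wrongly blocked

    # Trivial circumvention: a deliberate misspelling slips past the word list.
    evasive = "a c0uple of pages the filter will never catch"
    print(keyword_filter_allows(evasive))    # True -- the page is wrongly relayed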
At an even more basic level, it can be argued that filtering software is fundamentally flawed because it seeks to provide a technological solution to what is in essence a moral issue. That is, filtering software removes human decision-making from, and hence responsibility for, the moral temptation to access objectionable materials. In the words of Mitch Kapor of the Electronic Frontier Foundation,
we damage ourselves with technology when we use it as a substitute for higher human functions rather than a supplement. It's a tool and it can be used well or poorly.
To illustrate, suppose you click an innocent-seeming link and the entry point to a pornographic website appears on your screen. Since you arrived at this site by accident, you have done nothing morally wrong; and though you may be tempted to continue, temptation is not sin.
The critical moral question is this: What do you do next? If you decide to continue into the site to be titillated, then you've made a decision toward immorality and may be caught in a trap. If you hit the Back button and resume what you were doing, you've made a decision toward morality, avoided a trap, and are probably stronger for the experience. To the extent that it eliminates the need for Christians to make such moral decisions, filtering software can prevent Christians from being tempted to sin. However, to the same extent, it can also produce Christians who are less experienced at making moral decisions and less prepared to make them in the real world.
For all of these reasons, filtering software packages represent a less-than-perfect solution to the problem of how to handle the objectionable material on the Internet.
This page was written by Joel Adams.
© 2001 Calvin College, All Rights Reserved