Serena Armstrong
This paper will look briefly at the problem of attempting to prevent children from accessing harmful online materials. It will look at the two major approaches which have been taken in responding to this issue so far, namely software (such as PICS or filters) and law (through the development of the CDA and COPA). Finally it will discuss a solution for preventing children from accessing harmful material online. It is assumed that the reader will be familiar with the first three reading packets from the UMass Cyberlaw class.
   
    SOFTWARE DEVELOPMENTS There are two significantly different software solutions for ensuring children do not access harmful online materials. The older of the two is filtering. Filtering works either by using text recognition to block sites containing inappropriate words or by using lists of blocked sites. There are some extremely convincing arguments against employing filtering software. Perhaps the most convincing is the common accusation that filtering systems are "clumsy and imprecise, blocking huge amounts of information that is not pornographic while at the same time allowing some pornography to slip through" [Harmon, Librarians Search for Answers on Internet Censorship, p2]. Furthermore, the companies that select which sites to block don't "think about whether blocked pages are constitutionally protected or socially valuable" [Wallace, CyberPatrol: The Friendly Censor, p3]. Such a selection process becomes particularly contentious when government-funded bodies such as libraries and universities begin using these products.

PICS (Platform for Internet Content Selection) is a newer development. It works by using a set of descriptive labels. These labels may be assigned by numerous people and organizations, and each site may have several labels assigned to it. The labels are then used in combination with a "set of filtering rules which say what kinds of labels to pay attention to" [Resnick ed, PICS, Censorship, & Intellectual Freedom FAQ, p2]. The greatest criticism of PICS has been the suggestion that a "labeling system . . . will tend to stifle noncommercial communication." [Resnick ed, PICS, Censorship, & Intellectual Freedom FAQ, p8]. Because labels take time and energy to create, some sites may be unlabeled. These sites will not be accessed by those unwilling to risk viewing the contents of unlabeled sites. Finally, there are concerns that PICS use could become mandatory and unrated sites outlawed, thus creating a "bland and homogenized" Internet. [ACLU, Fahrenheit 451.2: Is Cyberspace Burning?, p3]
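To make these mechanisms concrete, the following is a minimal sketch in Python of all three; the banned-word list, blocked-site list, and label format are invented examples, not any vendor's actual data:

    # Minimal sketch of keyword filtering, list-based blocking, and
    # PICS-style labels. All names and values are invented examples.
    BANNED_WORDS = {"sex", "xxx"}
    BLOCKED_SITES = {"badsite.example.com"}
    PICS_LABELS = {"health.example.org": {"sex": 1, "nudity": 0}}

    def blocked_by_keywords(page_text):
        # Text recognition: block any page containing a banned word.
        return any(w in BANNED_WORDS for w in page_text.lower().split())

    def blocked_by_list(host):
        # List-based filtering: block known sites outright.
        return host in BLOCKED_SITES

    def blocked_by_pics(host, max_sex=1, block_unrated=True):
        # PICS: labels plus a local rule saying which labels matter.
        label = PICS_LABELS.get(host)
        if label is None:
            return block_unrated      # the unlabeled-site problem
        return label["sex"] > max_sex

Notice how the criticisms quoted above fall directly out of the code: the keyword check blocks any page mentioning a banned word, however innocently, and the PICS check makes every unlabeled site invisible by default.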
   
    LEGAL DEVELOPMENTS There have been two attempts at legislative solutions to ensure children do not access harmful online materials. The first of these, the CDA (Communications Decency Act), was declared unconstitutional by the Supreme Court in Reno v. ACLU. Here the court held that the "indecent transmission and patently offensive display provisions abridge the freedom of speech protected by the First Amendment". The court noted that the act failed to allow parents to consent to their children's use of restricted material, that it was not limited to commercial transactions, and that it failed to define key terms. Furthermore, it was a criminal statute which resulted in the suppression of a "large amount of speech that adults have a constitutional right to receive and to address to one another." [Reno v. ACLU] The son of CDA, COPA (Child Online Protection Act), has recently been the subject of a judgment in which a preliminary injunction against the act was granted. The injunction was granted on the basis that the act contained terms which were too vague, that its affirmative defenses were a "crippling requirement for small commercial organizations because of their cost, thus reduces speech which the person may be entitled to publish under the First Amendment.", and that it was not proven that "COPA is the least restrictive means available to achieve the goal of restricting the access of minors to this material." [ACLU v. Reno]
    PROPOSAL Any proposal which aims to regulate minors' access to "harmful" material online must consider carefully the legal and technological limitations that exist. Any legislation that carries criminal sanctions is more likely to be declared unconstitutional because of the burden it places upon members of society, particularly as it may be argued that citizens will be "scared" away from publishing material which they have a constitutional right to publish for fear that it might contravene the act. Without criminal sanctions, however, the effectiveness of the act is likely to be greatly reduced. I propose nonetheless to avoid criminal sanctions if possible and to rely upon society's fear of the law, since even criminal sanctions do not deter everyone. The use of common, clearly understood terms and thorough definitions for each important term would be essential for any legislative solution. While the use of an adult ID system or digital certificates ("encrypted digital objects that make it possible for the holder of the certificate to make credible assertions himself" [Lessig, What Things Regulate Speech, p28]) is an attractive solution, unless this proof can be obtained free of charge and installed extremely cheaply by any online publisher, the recent case concerning COPA suggests that it may well be ruled unconstitutional. Leaving some areas of the Internet unregulated would avoid various legal problems but would render the act less effective in protecting minors. I suggest that there are few solutions to the problem of children accessing harmful online materials that will stand up in the courts. Until cheaper and more effective technology is developed, the government must pour a great amount of money into its solution. It can do this through extensive policing and litigation. I would suggest that it would instead be preferable for the government to provide PICS for free, thereby creating standardized labeling and filtering. But given the exponential growth of the Internet, even this is simply too expensive. Then again, all the solutions are far too expensive. Perhaps it is about time for us to stop and realize that we simply haven't the money to regulate everything, and that education must replace regulation.
   
Jennifer Ausiello One of the most controversial issues surrounding the Internet today is the accessibility of pornographic material to minors under the age of 18. In an attempt to thwart children's viewing of these materials, several possible solutions have been introduced, some by the government, others by software companies. The Communications Decency Act (CDA), the Child Online Protection Act (COPA), also known as CDA II or Son of CDA, PICS, and filtering software are all potential solutions. All of these "solutions," however, have drawbacks. The problems with the CDA are its breadth and its ambiguity (which violate the First and Fifth Amendments, respectively). The CDA encompasses a great many categories, thus reducing Internet content to a child's level. It denies adult access to constitutionally protected speech.
The CDA uses the words "indecent" and "patently offensive" as standards by which certain material may be blocked. The government uses these words, however, without actually defining them. How, then, can material be rated without some sort of guideline? Even more, what may be viewed as indecent and offensive by some may not be by others.

The Child Online Protection Act, also known as the Son of CDA, is different. The vague terms "indecent" and "patently offensive" have been replaced with the words "harmful to minors," which has an actual legal definition. CDA II, unlike the CDA, would apply only to websites meeting criteria for "commercial." These websites would then be required to restrict access to their sites to adults only, by means of some sort of age verification technique or a credit card. CDA II also has criminal implications for mis-rating, not restricting access, and "knowingly" making this material available to any minor. The definition of "knowingly" also comes into question here; it is another vague term. Webspeakers can spend up to six months in jail, or they can be fined $50,000 per offense.

"Code," meaning filtering software or PICS, is another unreliable solution. Filtering software blocks websites based on key words, without actually viewing the site to see what is on it. There are simply too many sites to view. Therefore, sites containing the word "sex," for example, may be blocked even though they may not be pornographic at all. This software may also block educational sites such as those dealing with breast cancer, sex education, birth control, and gay and lesbian issues. This speech would be protected if it were in the form of a flyer, but because it is on the Internet, it may be blocked. This software is similar to the CDA because it is too broad. The technology has not yet been perfected, and there is no way to actually know what should and shouldn't be blocked based on these key words alone.

PICS, on the other hand, does not actually filter or block sites. Rather, it is a labeling system working in conjunction with the filtering software. Webspeakers would be required to self-rate their sites. If one rates one's birth control site as explicit, it may then be associated with something pornographic. If one opts not to rate one's site at all, it would still be blocked as an unrated site. It is basically a lose-lose situation for the webspeaker. Self-rating may shut down many non-commercial sites, as the rating process becomes very costly. PICS also runs into problems when dealing with art, news, and chat. These areas are very difficult to rate, especially chat. In the international arena, PICS may pose a problem as well. An international site may be blocked because it has not been rated, due to a speaker's ignorance of the rating system, for example. There would also be a penalty for those who mis-rate. This may discourage people from expressing themselves.
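A self-contained sketch of the failure modes just described; the word list and label data are hypothetical examples:

    # Keyword filtering never "views" the page; it only scans words.
    BANNED_WORDS = {"sex", "breast"}
    RATINGS = {"porn.example.com": "explicit"}   # self-rated sites only

    def keyword_blocked(page_text):
        return any(w in BANNED_WORDS for w in page_text.lower().split())

    def pics_blocked(site):
        # Unrated sites are blocked along with explicitly rated ones:
        # the webspeaker's lose-lose situation.
        return RATINGS.get(site, "unrated") in ("explicit", "unrated")

    print(keyword_blocked("breast cancer early detection"))    # True: educational, blocked
    print(keyword_blocked("sex education and birth control"))  # True: protected speech, blocked
    print(pics_blocked("unrated-artsite.example.org"))         # True: unrated, blocked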
Third-party ratings have also been considered, but these, too, may be subjective, and they may only deal with the larger, more popular websites. Valuable information, once again, will not be obtainable even to adults who have the right to access this information.
In the midst of this debate, my solution is to put filters only on computers in the children's areas of libraries. I am not fully comfortable even with this solution, due to the high margin of error within the software. Information that is truly not harmful to minors may be blocked. Filtering software is a good idea for use in one's own home, as long as it can be turned on and off.
The filtering companies should also be required by law to divulge to the public their lists of blocked sites. As consumers, we should be able to see what is being blocked and then make our own decisions as to whether or not we want those sites blocked. PICS should not be used at all. Ratings, too, are harmful, not only to the right of free speech, but also to the webspeaker, economically speaking. This disrupts the equal footing of the affluent and not so affluent on the Web.
The concept of adult access by use of credit cards or other age verification techniques is helpful, but again, not completely foolproof. A child may accidentally get his or her hands on a parent's credit card, thus allowing the child to enter these "adult" sites. This, too, becomes costly for the webspeaker.
Until the technology can effectively handle all of these problems, there should be no regulation or censorship on the Internet. The real solution lies with the parents. Parents should closely monitor their children, especially in their own home. Granted, a parent cannot be with their child at every given moment, but it is a parent's responsibility to babysit their children, not the government's.
Jill Bevis
The Child Online Protection Act (COPA) is Congress's new attempt to limit the amount of indecent material available to minors over the Internet. However, like its predecessor, the Communications Decency Act (CDA), which was struck down as unconstitutional, this Act is heading down the same path. Limiting access to certain websites to protect children is an extremely challenging task, and the moment any attempt to achieve this infringes on adult rights of free speech guaranteed in the Constitution, it is no longer valid. COPA has tried to deal with many of the problems the Court found with the CDA; however, the act still has many legal and technical problems and is not realistically enforceable. The vagueness of the wording and enforcement limitations are just some of the reasons this Act has failed, or will soon fail, as an attempt at controlling what kind of information children can receive via the Internet. In regard to vagueness, one example is Section 231(e)(6)(A) of the act, which states that when defining what is harmful to minors, people need to apply "contemporary community standards." This obviously poses a problem when you ask: what are the contemporary community standards? Being such an enormous country, we obviously contain a tremendous number of communities, each with its own standards, making it a problem to specify any universal ones. The very fact that the problem being discussed deals with the World Wide Web is another thing that makes this act extremely hard to enforce. The United States may create laws to prosecute those people allowing children access to pornographic material, but what happens when this material is posted by someone in another country? A young child using a computer does not receive information from only inside the United States; the child receives information from all over the world, and the government of the United States does not have any control over what citizens of other countries do. A solution to Web pornography cannot be achieved on a worldwide level; controlling it would basically be impossible. Even from just these few examples, it is easy to see that controlling the material people receive by punishing the people who send it is nearly impossible and presents a hopeless challenge to national lawmakers.
As a result of these and the numerous other problems that go along with trying to regulate information on the Internet through the individuals who send it out, I feel the only truly effective and constitutional way any limitations can be placed on the Internet is through the use of PICS, the Platform for Internet Content Selection. PICS is rapidly becoming the popular solution to the problem of the Internet and children. The most attractive feature of PICS is that it labels and filters material in accordance with programs the parents choose. No one will be prevented from seeing something unless it has been blocked out by the labeling program they have selected. The problem with this comes when deciding who should assign the labels and whether there should be criteria for the labels. At a national level, to be the most effective in protecting young children, there would obviously need to be some criteria set on how to rate certain web sites, and I do not really foresee any problem with having the standards used to label, though not the labeling itself, be in the control of the government, as long as the labels are consistent and available to the public. Once the criteria are set, the government can offer its own labeling program, in addition to the ones offered by companies or groups. By doing this there will be some effort by the government to control the harmful materials minors receive, but most of the control will fall into the hands of the parents. A parent can install whatever labeling system they want on the computer to protect their children from any material they feel their children should not have access to. Self-rating, I feel, would not be the correct approach to achieve this goal. Individuals should be allowed to rate their own sites, but so should others. There obviously needs to be some control outside those who produce the websites because, as we all know, no one is going to rate their own website as harmful if it means hardly anyone will be able to access it. Another attractive feature of PICS is that when adults want to use the computer, they can either use a secret password to get past the block or turn it off and on. So, when they want access to the things they do not feel are appropriate for their children, they still have the option to view those things themselves.
This solution is obviously far from perfect; however, I think it offers the best answer to the problem of indecent material falling into the hands of children. The government can offer the tools to protect children from what parents feel is pornographic and indecent, and it is then up to the parents to enforce this. All the responsibility cannot be put on the government, and with this type of system it is not, yet the appropriate steps will have been taken to help solve the problem. There is nothing wrong with a labeling or a filtering system, as long as the individual has a choice as to what system they use. A national law requiring everyone to own a certain filtering program would not be constitutional; however, a program that gives parents options is right now the best method available for controlling the problem of minors receiving harmful material over the Internet.
Gil Bartov The problem with the CDA and son of CDA is that they seem to have been created by people who don't have quite as much of an understanding of the Internet as one would like when writing these kinds of policies. Phrases such as "initiates the transmission of" or "knowingly permits" are actually quite vague and unclear when applied to the Internet. Once a person creates a web site, it is almost impossible for that person to control who can and cannot enter the site. Therefore, is that person really initiating a transmission, and how does one knowingly or unknowingly permit someone to view the site?
In addition, there is the problem of actually enforcing any U.S. law, because the Internet is global. Any legislation restricting or even pertaining to the Internet would have to be passed by every single country in the world in order to be enforceable. And what is the use of making laws that are not enforceable?
There are many other problems with restricting content on the Internet, such as determining what is pornography and what is art or education. Also, the term "patently offensive" leaves a lot open for discussion. Since almost everything is offensive to someone in the world, and since it is impossible to control where a web site will end up being viewed, restricting material which may be patently offensive would mean removing almost everything on the web.
In my opinion the best solution is to use software such as PICS and filters. While these systems are flawed due to the rating system (different organizations rate similar sites differently), they are still a very useful tool. At home, parents will have the option of whether or not to use this software. In public places such as libraries, there can be filtered computers in the children's section and unfiltered computers in the adult or research section. In cases where the library is too small and has only one or two computers, the computers can be unfiltered. It will then be the parents' responsibility to make sure their children do not get into web sites they shouldn't, which shouldn't be difficult in such a small library.
Just as movie theaters don't allow children under certain ages to see NC-17 or R rated movies without a parent, and porn and liquor shops don't sell to minors, public places will not allow minors to view unsuitable material. As for the question of what is unsuitable for minors, the answer is simple: patently offensive material. While it is impossible for the source of the transmission (the person or organization who created the web site) to determine what is patently offensive, since they don't know where it is going to end up, it is much simpler for a town or city to determine what is patently offensive in its own community. Each town or city council can decide which rating system it wants to use to filter its Internet, based on the common views of the community. Since adults will not be subject to filtering, and children will be able to view the unfiltered Internet with parental consent, no one can claim that it's book burning or censorship.

Mark DiAntonio

The question of how to stop children from viewing controversial material is an interesting one. The most basic answer lies in the most personal of situations. Families must address the issue of perusing graphic Web sites with their children. Better communication and understanding at home would help curb the issue of Web site content before it would develop into a problem.
It becomes a problem when parents rely on the computer, like the television, to help chaperone their children. This problem is compounded by the fact that computers are becoming more "user friendly". The concepts behind today's operating system design are so easy to understand that very young children can comprehend how to start up the computer and open up a program. Commercialization of technology has undermined the respect for the power that each user now wields.
What we're left with is dealing with the natural curiosity that every child has. It is a situation similar to the problems presented by late night cable television. Children, like adults, talk in social situations, whether it is the schoolyard or the peewee soccer game. Our gregarious nature as human beings and the peer pressures inherent in our society all but ensure that our children will find out about many immoral things far too early in life.
Education is the key to aiding our children in making the correct decisions or, at the very least, to helping them learn from their mistakes. As for how this pertains to the Web and the censorship of Web site content, we should focus on instilling healthy morals so our children can keep graphic content in perspective.
Approaching censorship must be done with extreme care. My own personal view on the subject is that the Web should remain as far removed from the government as possible. If censorship has to be dealt with it should be done by the people involved in the evolution of the technology.
The software used to view the content, browsers like Netscape and Internet Explorer, could be time-protected so parents could "lock out" sections of the day during which they don't want their children to use the program.
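A sketch of how such a time lock might work, assuming a browser exposed the setting; the allowed hours are arbitrary examples:

    # Hypothetical parental time lock for a browser.
    from datetime import datetime

    ALLOWED_HOURS = range(15, 20)   # e.g. 3 p.m. to 8 p.m., after school

    def browsing_allowed(now=None):
        now = now or datetime.now()
        return now.hour in ALLOWED_HOURS

    if not browsing_allowed():
        print("Browser locked for this part of the day.")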
You could possibly argue that the programmers could alter the coding so the browser could cut out questionable materials, but I don't believe that would work. Browsers read the scripting language that the page was written in and then translate it onto the screen, loading images and text as necessary. Currently most of the pages out there are written in HTML or XML with some added scripting like JavaScript. These languages load images called from any location that is live. The images do not need to be "local," that is, in the same directory or on the same server. Because of this, a person in Massachusetts could effortlessly load images from a server in Michigan and nobody would be the wiser. A person could even change the physical file names on the server and the HTML would not discriminate.
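A small illustration of the point, with hypothetical hosts; the script scans a page's markup the way a browser would, and the image reference points at an entirely different server:

    # The page "lives" on one server, but its image comes from another;
    # nothing in the HTML distinguishes local from remote sources.
    import re

    page_in_massachusetts = '''
    <html><body>
      <p>An innocent-looking page.</p>
      <img src="http://server-in-michigan.example/pictures/photo.jpg">
    </body></html>
    '''

    # A browser simply fetches every src URL it finds, local or not.
    for src in re.findall(r'src="([^"]+)"', page_in_massachusetts):
        print("browser would fetch:", src)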
Since Web sites need a server in order to reach the people on the other end of the browser, the servers themselves could be individually maintained and reviewed for questionable material. The problem is that with the evolution of high-bandwidth connections like newer cable modems and ISDN lines, individuals are increasingly capable of starting up their own servers. Governmental groups like the FCC would run themselves into the ground attempting to sort through the volumes of sites out there.

In closing, the Web is a resource that mimics prior media types because of the need for designers to reach massive audiences. Nothing dictates how we make use of the Web but our own sense of tradition, the fact that we are comfortable with these media already, and common sense. The Web is not television, it is not a newspaper, and it is not a radio channel. The tools it takes to create Web sites are available to everyone. No other medium has ever offered somebody the opportunity to publish something worldwide with the speed and ease that the Web does. That is why we have such a hard time attempting to govern it.
Dan Hahn
Parents have always played the role of policemen when it comes to screening their children's material on TV or anywhere in the media. Parents could choose whether a program or a book was appropriate for their child based on past experience or just a few minutes in front of the TV. But a whole new realm of media has developed; a realm in which, in many cases, the children have more knowledge than the parents. The Internet has grown into a great and mighty sea of information and knowledge within a keystroke of the nearest child. Children are on the crest of the Internet wave because the Internet is being taught in all public schools and is the topic of many playground discussions among children. Because the Internet is growing at an incredibly fast rate, those who are not in touch with the technology may not be familiar with what the Internet holds. The Internet has become an open forum for any topic and any subject that comes to a web creator's mind, and because children are more familiar with the net than their parents, parents have looked to resources outside of the home for supervision solutions.

   The United States government sought to protect children from harmful and sometimes obscene content on web sites by crafting the Communications Decency Act (CDA) in 1996. Unfortunately, the politicians who drafted the CDA were unfamiliar with the technology and how the web actually works. The language used in the law did not entirely encompass the whole process of posting pornographic material on the net and retrieving that information from a web site. The law placed blame on the Internet Service Provider (ISP) who granted the publisher of pornographic material space on its server rather than on the actual creator. Enforcement thus became a logistical nightmare, because an ISP often has no idea what its clients place on the net and does not have enough resources to monitor every web page on its server. Although the CDA was struck down, it opened the eyes of the general public, and parents in particular, to the issue of adult content on the web.

   Several other ideas have been proposed to keep kids from seeing harmful content on the web. A new and updated law, the son of the CDA, was passed by Congress. Once again the law was challenged, and the courts blocked its enforcement. The problem with the son of the CDA was the language used in the law. Once again words like "obscene" and "lewd" were used. In order to pass a federal law with these standards, a universal definition of the words would have to be created. Since community standards differ across the country, no common definition for "lewd" and "obscene" could possibly be created. We are left asking ourselves whether any sort of law or program can protect children from "harmful" content on the web.

   The best system for filtering the web for children is for site creators to rate their own sites using the PICS system. This allows creators to assign a rating to their site, and Internet browsers can then be set to allow only a certain rating, such as PG or G, to be seen at the home computer. Currently, over 300,000 sites have adopted this rating system, meaning that most reputable sites are rated. The web browser in the home can then be set to accept only rated sites, and unrated sites can be accessed only using a password. This system allows parents who have very little knowledge of computers and the Internet to censor what their kids are seeing in a very easy and ultimately free way. It also means that no mistakes will be made when a child types a wrong letter in a web address and a pornographic site would otherwise be displayed: if that site has an XXX rating, or is unrated, it will not be displayed. Because no policy is 100% foolproof, parents must realize that the best policy is still sitting with the children at the computer and sharing the Internet experience.
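A sketch of the browser-side policy just described; the rating scale and the password mechanism are invented for illustration:

    # Hypothetical rating scale and household policy.
    RATING_ORDER = ["G", "PG", "PG13", "R", "XXX"]

    def may_display(site_rating, max_allowed="PG", password_given=False):
        if site_rating is None:              # unrated site
            return password_given            # parent's password required
        return RATING_ORDER.index(site_rating) <= RATING_ORDER.index(max_allowed)

    print(may_display("G"))                          # True: within the limit
    print(may_display("XXX"))                        # False: over the limit
    print(may_display(None))                         # False: unrated, no password
    print(may_display(None, password_given=True))    # True: parent unlocked it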

   
   

Aparna Korwar
I think that a combination of legislation and controls implemented by the creators of web pages will be the most effective in ensuring that children don't have access to objectionable material. Or is it? PICS may be an appropriate option because of the potential benefits that come with having a gatekeeping presence on the information superhighway: kids are kept away from "bad" stuff, and adults don't have to see things that disturb them.

   
   I wonder whether it is reasonable to let someone wholly unconnected with me decide what I can and can't view on the Internet. This is certainly a dilemma, because I also think that adults should be able to see pornographic images online if that's what intrigues them, but kids should not have access to them.

   
   It's really tough to say whether a combination of new legislation to replace or supplement the CDA and COPA, coupled with self-imposed controls like PICS or some other mechanism implemented by the site creator, will actually deter kids from accessing harmful images online.

   
   While it's tough to say what the best solution is, what is rather clear to me is that something has to be done, in light of a recent report on a nightly newscast that increasing numbers of children are able to buy alcohol online and pay with an adult's credit card. Should something like this be considered harmful to kids and thus subjected to censoring? Before we can decide what to do, we have to ferret out exactly what is considered wrong for kids to see. The trouble is, who is allowed to make that decision? The government? Site creators? Religious leaders? This is the fundamental quandary, and until it is fully resolved, we can never adequately conquer the problem of kids accessing objectionable materials online.

Sarah Leeper
Censorship on the Internet presents more problems than it solves. Besides violating the Bill of Rights, statutes such as the Communications Decency Act (CDA) and the Child Online Protection Act (COPA) have left many of their proponents unsatisfied. Such statutes are based on the premise that certain material is deemed "indecent" or "patently offensive" and, being labeled as such, is considered unsuitable for minors to view. Anyone convicted of providing this material to a minor is subject to fines and prison sentences.
The general purpose of these statutes, the protection of children, is well intended but ultimately impossible to achieve without subjectivity. There is no general consensus about what deserves the label "indecent" or what constitutes "patently offensive".
In addition, the routes providers of indecent material must take to prevent minors from accessing their sites (i.e., requiring credit card numbers) exclude many adults as well (i.e., those without credit cards). Filtering, the alternative to these statutes, presents new problems. Filters block a majority of indecent materials, as well as a good amount of acceptable material, including breast cancer research. The connotation that the word "breast" carries is considered indecent without regard for the context in which it is found. At the same time, certain materials that some may feel are inappropriate for children are not blocked.
Recently a solution to the problems filters present has been found. PICS is a filter that not only blocks indecent materials but allows the user to choose which group decides what is indecent. For instance, a parent who does not want their child to view websites about gay rights can choose the Christian Rights Coalition version of PICS. The drawback to PICS is that it has the potential to filter worldwide. This would enforce the United States' standards of what is indecent on the entire world. This concept is frightening.
Overall, I feel that filtering should be left to the individual user and has no place in public space.
Scott Levine
    When it was first conceived, the Internet went largely ungoverned, and because usage was limited to a select group of people, problems were not very common. However, with the growing number of PCs and the advent of new technology, it is possible for just about anyone to get online. This ease of accessibility and its growing popularity have prompted the U.S. government to get involved. Similar to its intervention in radio and television, the government has decided that it needs to decide what people should and should not be allowed to access.
This erodes the whole notion of freedom that the Internet had provided. By imposing standards on what can and cannot be viewed, the government is extending a sort of censorship over a domain which has not been set up to adapt well to outside intervention. Cyberspace is the land of the future, with unlimited boundaries and endless possibilities. Governing such a body is a difficult task, to say the least. With the increase of children using the net, and its growing use in both the public and private sectors, concerns arise over people being accidentally (or purposely) exposed to obscene or shocking things. The idea of policing the Internet is premised on the notion that people need to be protected against some things that may be offensive. This needs to be balanced against the freedoms which we are afforded by the Constitution. The questions of where to draw the line, and by whom, are under heavy debate at the moment.
   First of all, the Internet is not something that is just stumbled upon. A person has to have access to a computer, and that computer needs to have the capability to get on-line. Assuming someone has all of this, they need to make a conscious decision to connect to the Internet via one of many portals, or Internet Service Providers. Being able to connect to the Internet is only the first step. Before you can access a web site, it is necessary to either type in the address, or to enter from a link. Both of these require the user to make a decision to visit a site. What site they are entering cannot be guaranteed, because at the present time, it is up to whoever is hosting the site to provide it with a name and description.
   The government is in the process of addressing this concern, but has nowhere to look for precedent. If and how sanctions are imposed on the Internet, and to what degree, is something that will come from the fresh ideas of modern visionaries. Courts have not been able to decide uniformly on matters regarding the Internet, and the CDA has left room for imprecision. It is hard to decide where information that is posted on the Internet originates. Is it connected to the person who actually posts it, or does the burden carry over to an ISP who carries that information? Is my viewing something indecent and having it in my possession equivalent to posting it in the first place? All of these issues make the task more difficult for the government. Where we begin is the hardest part. Once some sort of standard is devised, all future legislation will build upon the foundation which it lays. I find this to be a very sensitive area, because too much or too little restriction could be dangerous.
I think a balance needs to be struck between protecting innocent people from indecent speech or pictures and other people's right to say and see what they please. The Internet is also not just an American innovation; it is an international device that is used by people all over the world. How will our deciding what is acceptable in this country square with what is okay in other countries?
    Personally, I believe that the Internet is one area where the government does not belong, but as it expands and becomes a more common place for commerce and information, problems are going to arise, and these problems will need solutions. I think that some sort of device like PICS may be a step in the right direction, if we assume that some filtering mechanism is necessary. As far as how I would go about implementing something like this, I think that all interests need to be considered. I think that eliminating criminal penalties for online expression would be a good start. In my opinion it's okay to try to shield children from harmful things, but if a grown adult has some desire to exhibit deviant behavior, then so be it. After all, no one is forcing anyone to look at anything they don't want to, and the decision to go online is a choice. Parents do not have to allow their children access to the Internet, and those who do need to realize what is out there. I think instead of so much government intervention, it would be more feasible to place the burden on the individual as to what they choose to view.
A filtering device like PICS would help people avoid material which they find offensive. I think that every site should have a description detailing its contents, but rating such contents is highly objectionable. Creating an organization (or organizations) that would monitor and rate various sites seems impossible. With the size of the Internet growing constantly, every site could not be checked. I think we need to put our trust in people's good faith that they will be honest about site contents and leave it at that. As for regulations and laws, why not just wait until an issue arises before jumping to make a law? I think the government means well and is just looking to protect people, but I'm in favor of doing less to limit what I value as free speech. Let's try something like PICS, which would give a site description, and see how that goes. We can always do more if things get out of control, but by starting with mandates for all these things at once, we move away from the freedom that makes the Internet so great.


   
   
Kenneth McDonald

   UMass Amherst
Within the past five years, the number of people that have jumped on the "information superhighway," also known as the Internet, has grown at an astronomical rate. Some grandmothers who before did not like to use the telephone are checking the status of their 401k plans online and e-mailing grandchildren on other continents. However, along with the positive aspects of surfing the web, the negative ones are not far behind. It has occurred to groups such as the Christian Coalition that there may be objectionable or indecent material on the Internet that no one should view, and that it should therefore be banned. What they have not taken into account is that this country has laws protecting free speech, one in particular by the title of the First Amendment.

   
A law was proposed, and eventually struck down, entitled the Communications Decency Act (CDA). This act limited what could be published in various forms of communication, including the Internet. Eventually, this law was found unconstitutional because it prevented the free trade of ideas guaranteed by the Constitution. That someone might be offended by pornographic material does not make such material illegal, merely perverse. The Child Online Protection Act (COPA) was born out of the failure of the CDA. Its goal was based on the same principle, but a different premise. COPA's goal was to prevent "indecent" material from being shown to children. However, it is still in violation of the First Amendment. Right-wing groups claim that it is the industry's responsibility to monitor what is being published, while Internet providers feel that it is a parent's responsibility to watch what their kids view online.

   Such controversy has spawned software companies producing programs that can block certain "objectionable" material from being accessed by a child or patron of a given computer terminal. Two major problems have arisen from such types of software. The first is strictly technical: website addresses that happen to contain an objectionable word or phrase are being blocked. For example, www.SuperBowlXXXIII.com is prevented from being called up by these programs because it contains the "XXX" associated with hardcore pornography. Secondly, filtering devices such as CyberPatrol are limiting sites that discuss homosexual issues or other "controversial" topics. Valuable information on AIDS cannot be looked up on computers victimized by a net-filter.
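The technical failure is easy to reproduce with a naive substring check (hypothetical, but typical of how such filters worked):

    # A naive substring test flags the Super Bowl address.
    def naively_blocked(url):
        return "xxx" in url.lower()

    print(naively_blocked("www.SuperBowlXXXIII.com"))   # True: wrongly blocked
    print(naively_blocked("www.example-sports.com"))    # False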

   In one sense, these devices are too effective; they stop too much information that is not indecent in any way. There should be a combination of self-programmable filtering options for parents, not companies, to choose. That way, sites that could be helpful are not blocked. Legislators need to fight to make sure that the new Red Scare, the fear of the growing Internet, does not limit the valuable trade, discussion, and development of ideas.

   
   

Matthew Milos

   COPA v. CDA
What's Wrong

   For the purpose of this essay, I will be writing from the standpoint of a lowly peon in the grand scheme of things that is Washington, D.C.: a House Page. I will, in the form of a memo, a letter, or something to that effect, try to enlighten the Representative that employs me about COPA and the CDA, and why they just don't work.
---
To: Representative Haywood Jablome
From: Hugh Jasse (House Page)
Re: Why COPA & the CDA just do not work

   Upon further readings and studies of both the Child Online Protection Act (COPA) and the Communications Decency Act (CDA), I have come to the conclusion that regardless of intent and planning, it would be impossible to prevent minors from accessing pornography on the Internet and the World Wide Web. I say this for a variety of reasons, some of which were discussed in the above-mentioned documents, and others that were not.

   The first glaring problem that COPA and the CDA face is that they attempt to impose United States laws on an entity that exists globally. The name itself should give some hint as to its magnitude, the World Wide Web. Anyone, anywhere in the world, can visit any other web site, anywhere in the world, without restriction for the most part. There are no borders to cross; there are no nations on the World Wide Web. When I am looking at a web site about vacationing in Las Vegas there may be hundreds of others from around the world looking at the same site. Conversely, many native Australians could be looking at a web site that gives information on SCUBA diving vacations at the Great Barrier Reef at the same time that I am, along with millions of others around the world.

   As easy as it is to spread information about travel and tourism, it is just as easy to make porn available. Try as we might, we cannot enforce our laws and morals upon other sovereign nations. The Web and the Internet are not solely based in the United States. We may have many connections to these informational resources, but by no stretch of the imagination do we control them in any way, shape, or form.

   The notion of "contemporary community standards" is also in direct opposition to the global aspect of the World Wide Web and the Internet. What conservative Christians find "patently offensive," others will have no problem with, both here in the United States and most definitely abroad. There can be no application of "contemporary community standards" to a network that has no homogeneous community, for it is certain that no global "community" exists that complies with the definition of "community" found in COPA or the CDA.

   One of the issues addressed in COPA and the CDA was age verification, but while the system that was laid out seems rather effective, there is a major flaw in it. Passwords and age verifications are very good ideas, but they can be bypassed by anyone with access to the Internet, and I am not talking about hacking systems. I am referring to "password sites" that exist on the Internet and the World Wide Web.
These web sites are dedicated to hacking the passwords of password-protected web sites and posting those passwords for anyone who visits to see. The passwords are posted under the guise of "testing the security" of the hacked sites, with such a disclaimer posted somewhere on the site along with information for webmasters who want the passwords to their pay sites removed. As mentioned above, these sites are readily available to anyone who wishes to look for them, and they are free; there is no cost involved. Therefore, if a fifteen-year-old wanted to get into a "pay site" without the proper verification, he or she simply has to seek out a "password site" and obtain the password.
Overall, I believe that COPA and the CDA are noble causes with many positive aspects to them; however, we must be realistic when we discuss the World Wide Web and the Internet. They are simply far too expansive for one nation to take on from a moralistic viewpoint. COPA and the CDA will most definitely fail, and at this point in time I see no effective way to regulate what children see on the Internet and the World Wide Web other than direct parental involvement. The solution to this perceived problem must come from the home, for no government on the planet can possibly hope to impose its morals upon or regulate the Internet and the World Wide Web.


Erik Moore
Current efforts to prevent children from accessing harmful material online derive from the perception that the Internet is a single "network of interconnected computers" which "[i]ndividuals can obtain access to…from many different sources." [1] This belief surfaced in the 1980s, gained popularity in the early 1990s, and eventually evolved into the foundation for the various legal and technological debates presently occurring in and around cyberspace. Unfortunately, this definition incorrectly binds the structure of the Net to the act of accessing it, and is thus fundamentally misleading. Consequently, most of the software and legislation associated with online content has been ineffective and, at the extreme, unconstitutional.
The Internet is not a single system whose entire contents become instantly available to those with access, nor should it be. Instead, it is a framework that enables the creation of such a system. Within this framework, many different systems with various purposes can be created. Each system has unique participants and each participant has unique demands. When an individual goes online, he is accessing the system he specifically participates in, not the Internet itself. This is best clarified with an example.
Let's assume that three systems rely on the framework of the Internet to exist. The first is a medical system filled with volumes of information on reproductive organs. The second is a commercial system for adults that sells legal pornographic images online. The third is an educational system for minors that uses appropriate language and images to describe human reproduction. Each of these systems has a unique, willing set of participants and each of these participants is comfortable within their respective systems. A library that becomes a participant in the educational system is not obliged to participate in the other two systems. There is no need for any new legislation or software to regulate the different systems.
Now let's assume that we force all of these systems to coexist within one larger system. In this circumstance, each participant will have to sift through the content of two unrelated systems to find the information he needs. The minors, who have unknowingly become participants in a system that is not designed for them, now have access to pornographic material. Similarly, a library must purchase the entire system in order to access the small educational portion of it. The need for legislation and filtering software emerges and is quickly prevented by the Supreme Court for violating the First Amendment. In order to protect their rights, all of the participants are forced to participate in all of the systems. Those who refuse are excluded from participating.
While this example is extreme, its premise is not. By attaching a broad definition to the term "Internet," we have forced ourselves into a position that creates the need for filtration software and legislation. This situation could easily be fixed if we allowed ourselves to divide the Internet into unique systems based on participants. As long as the barriers to entry for each system are relatively low and the boundaries are very well defined, the efficiency of filtering software and cyberspace content will reach unprecedented heights. These groupings could be achieved using domain names. This spring, the ability to create and maintain new domain names will become more readily available. We need to develop a standardized domain name system that can associate specific domain names (.org, .buy, .adult, .minor, etc.) with specific industries, content levels, and participant groupings. If we did this, we could easily curb the problems currently associated with an Internet that is too broadly defined, too diversely populated, and too shielded from serious reformation.

[1] Reno v. ACLU, Section I, Paragraphs 3 and 5
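To illustrate the proposal above: access decisions under such a scheme would collapse to a domain-suffix check. The suffixes below are the hypothetical categories named in the essay:

    # Sketch of access control under participant-based domain groupings.
    SUFFIXES_FOR_MINORS = {".minor": True, ".org": True,
                           ".buy": False, ".adult": False}

    def minor_may_visit(hostname):
        for suffix, allowed in SUFFIXES_FOR_MINORS.items():
            if hostname.endswith(suffix):
                return allowed
        return False   # unknown suffix: default to caution

    print(minor_may_visit("health-lessons.minor"))   # True
    print(minor_may_visit("images.adult"))           # False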


Seth Presser

    MAYBE NOT WRONG:
JUST UPSIDE DOWN & BACKWARDS
   
   Outlawing "indecent" online communication with the introduction of the federal Communications Decency Act three years ago threw a spark into a pile of dry leaves, starting a forest fire that would engulf the online community. The federal case Reno v. ACLU gave the Supreme Court an opportunity to review the issues. The Supreme Court struck the law down, clearly stating that online speech is to receive the highest level of free speech protection. This puts online speech at the same level of protection that we afford to all printed materials.
The Communications Decency Act, which is supposed to protect minors from accessing controversial or sexually explicit material, outlaws "obscene" communication, which is already a crime (so the CDA is not needed for that), but also "...lewd, lascivious, filthy, or indecent" and even "annoying" "...comment[s], request[s], suggestion[s], proposal[s], image[s], or other communication" made using a "...telecommunications device," all of which are protected by the First Amendment. The Act is also unconstitutional because it does not follow the Supreme Court's decision in Sable Communications v. FCC, requiring that restrictions on speech use the "least restrictive means" possible. The Court has also stated that restrictions on indecency cannot have the effect of "reduc[ing] the adult population to only what is fit for children."
It is clear to most that the material in question on the Internet would clearly be protected by the First Amendment if it were found in a printed book or magazine. The issue at hand was the level of protection it would be afforded when placed in a medium that children may or may not have access to. The court was unwavering in its statement that you may not infringe upon the civil rights of an adult merely on the basis of "protecting children." In other words, you may not restrict or limit what children may see in such a way that restricts adults from access.
Many have offered solutions to the crisis of child protection we are facing as a society, but few have achieved even mediocre success in so diverse an organism, one in constant motion at incredible speed. Solutions that seemed to offer some hope simply have not withstood the First Amendment test. Some of these themes include both self-rating and third-party rating systems.
The ACLU believes "that the various schemes for rating and blocking, taken together, could create a black cloud of private 'voluntary' censorship that is every bit as threatening as the CDA itself to what the Supreme Court called 'the most participatory form of mass speech yet developed.'" In turn, the ACLU offers six reasons why self-rating schemes are wrong for the Internet. However, the arguments, reasons, and examples the ACLU offers are fatally flawed in more than one way, the most important being that they only take into account the currently "accepted" form of self-rating and do not address alternatives that might make many problems fall by the wayside and others simply melt away.
Before a critique of the essay can be made, though, it first must be determined what system we will use. This system must hold up to the six-part test of the ACLU and the Supreme Court's First Amendment test. Perhaps a system of "reverse censorship" is the hidden gold mine under the mountain of rating. Such a system seems confusing at first but overwhelmingly simple once in place. Rather than rating materials that are obscene, indecent, crude, etc. and denying access to children, let us rate materials that are child-friendly and allow children access to only those materials. Such systems are used in nearly every other rating area of our lives. Movies are rated PG-13, meaning the movie is appropriate for a thirteen-year-old or older. On the same note, toys are rated for children 6 and under or 10 and above. Likewise, web pages could be marked for appropriateness rather than indecency.
The question that is first and foremost is whether reverse censorship can withstand the Supreme Court's First Amendment test. The answer is an overwhelming yes. Since the First Amendment only guarantees the right of access to free speech to adults, and it has always been accepted that minors may be restricted in their access to printed and visual material, there would be no grounds to rule such a statute unconstitutional. In a simpler view, adults could access both unrated pages and pages marked for children, while children could access only pages marked for children. Since we allow the government to restrict children in other ways, this would not contradict the First Amendment.
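The logic of reverse censorship reduces to a default rule; here is a sketch, with a hypothetical tiered age label of the kind discussed later in this essay:

    # Rate the child-safe pages; everything else defaults to adults only.
    def may_view(viewer_age, rated_for_age=None, is_adult=False):
        if is_adult:
            return True                     # adults see rated and unrated pages
        if rated_for_age is None:
            return False                    # unrated pages default away from children
        return viewer_age >= rated_for_age  # "appropriate for this age and up"

    print(may_view(10, rated_for_age=6))     # True: marked safe for six and up
    print(may_view(10))                      # False: unrated, child blocked
    print(may_view(0, is_adult=True))        # True: adults unrestricted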
The test presented with six examples by the ACLU requires a bit more analysis. These questions address why self-rating systems are wrong for the Internet, not their legality. The stated reasons will lose much of their strength when the self-rating standard reverses to rating clean pages rather than indecent pages.
Reason #1: Self-Rating Schemes Will Cause Controversial Speech To Be Censored.
    Kiyoshi Kuromiya, founder and sole operator of Critical Path Aids Project, has a web site that includes safer sex information written in street language with explicit diagrams, in order to reach the widest possible audience. Kuromiya doesn't want to apply the rating "crude" or "explicit" to his speech, but if he doesn't, his site will be blocked as an unrated site. If he does rate, his speech will be lumped in with "pornography" and blocked from view. Under either choice, Kuromiya has been effectively blocked from reaching a large portion of his intended audience -- teenage Internet users -- as well as adults.
   
   The point made in the above example is well taken and clearly shows the problem with self-rating systems that rate explicit material. Under the reverse censorship rating system, Kuromiya would have two options. He could rate his page safe for children and risk the penalties of mis-rating. If he did this, he would not be grouped into the categories of pornography and indecent material, so everyone could access his site. If he were frightened by the penalties of mis-rating a site, he could choose not to rate. In this case, search engines and the like would block him only if a child were using the system. As soon as someone logged on to the computer with an adult identification, they would have full access to Kuromiya's site. If the standards of the rating system were taken one step further, and a tiered system were created, Kuromiya could mark the page accessible to a specific age and higher. He could therefore make his site available to sixteen-year-olds doing health class projects.
Reason #2: Self-Rating Is Burdensome, Unwieldy, and Costly.
    Art on the Net is a large, non-profit web site that hosts online "studios" where hundreds of artists display their work. The vast majority of the artwork has no sexual content, although there's an occasional Rubenesque painting. The ratings systems don't make sense when applied to art. Yet Art on the Net would still have to review and apply a rating to the more than 26,000 pages on its site, which would require time and staff that they just don't have. Or, they would have to require the artists themselves to self-rate, an option they find objectionable. If they decline to rate, they will be blocked as an unrated site even though most Internet users would hardly object to the art reaching minors, let alone adults.
   
   Just as with reason one, the reverse-censorship system relieves most of this problem. If the makers of the site are sure the material is safe for children, they could mark all 26,000 pages safe for children in a few keystrokes (a sketch of such bulk labeling follows). If the company does not feel comfortable carrying that much risk, then rather than reviewing every page it could rate them all safe for children and change the ratings only on the few questionable pages. If even this task is too great a burden on the budget, all pages could be left unrated. Again, this restricts access only for children; search engines would still show results for adults, and it would become the parents' responsibility to grant their child access to a constructive site such as this one. Perhaps not a perfect solution, but certainly a step forward. By shifting the burden so that parents no longer have to keep children away from pornography, but instead must take the initiative for a child to reach unrated material, the Internet becomes a safer place for a child without restricting the rights of an adult.
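To illustrate how cheap bulk labeling could be, here is a sketch under stated assumptions: the site is mirrored locally as HTML files, the label scheme is the hypothetical one above, and the few questionable pages are listed by hand:

    # Bulk-label every page "safe-for-children", overriding only the
    # hand-picked exceptions. Paths and label format are assumptions.
    import pathlib

    SITE_ROOT = pathlib.Path("artonthenet-mirror")  # hypothetical local copy
    QUESTIONABLE = {"studios/rubens/nude01.html"}   # pages left unrated

    def assign_labels(root):
        labels = {}
        for page in root.rglob("*.html"):
            rel = page.relative_to(root).as_posix()
            # None means "leave unrated": adults can still view the page.
            labels[rel] = None if rel in QUESTIONABLE else "safe-for-children"
        return labels

    labels = assign_labels(SITE_ROOT)

Maintaining a short exception list is a far smaller job than rating 26,000 pages one at a time.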
Reason #3: Conversation Can't Be Rated.
    You are in a chat room or a discussion group, one of the thousands of conversational areas of the Net. A victim of sexual abuse has posted a plea for help, and you want to respond. You've heard about a variety of ratings systems, but you've never used one. You read the RSACi web page, but you can't figure out how to rate the discussion of sex and violence in your response. Aware of the penalties for mis-labeling, you decide not to send your message after all.
The burdens of self-rating really hit home when applied to the vibrant, conversational areas of the Internet. Most Internet users don't run web pages, but millions of people around the world send messages, short and long, every day, to chat rooms, news groups and mailing lists. A rating requirement for these areas of the Internet would be analogous to requiring all of us to rate our telephone or streetcorner or dinner party or water cooler conversations.
   
   In this situation there is one drastic difference: the message you want to send to help this victim of sexual abuse actually gets sent. The fear of rating it wrong has melted away. If the composer of the message feels comfortable rating it as suitable for children of a certain age, s/he is welcome to do so; if not, the message can simply be left unrated. Adults will still have full access through all the proper search channels, and only children will be restricted.
Reason #4: Self-Rating Will Create "Fortress America" on the Internet.
    You are a native of Papua, New Guinea, and as an anthropologist you have published several papers about your native culture. You create a web site and post electronic versions of your papers, in order to share them with colleagues and other interested people around the world. You haven't heard about the move in America to rate Internet content. You don't know it, but since your site is unrated none of your colleagues in America will be able to access it.
    People from all corners of the globe, people who might otherwise never connect because of their vast geographical distance, can now communicate on the Internet both easily and cheaply. One of the most dangerous aspects of ratings systems is their potential to build borders around American- and foreign-created speech. It is important to remember that today, nearly half of all Internet speech originates from outside the United States.
   
   But if reverse censorship is used, this loss of communication takes place only in the realm of children. Even if only Americans understand, or have even heard of, the rating system, nothing is lost to an adult user. Admittedly this limits what children can discover of the wider world, but again, the responsibility for providing access to such areas would rest in the hands of parents and educational institutions.
Reason #5: Self-Ratings Will Only Encourage, Not Prevent, Government Regulation.
    The webmaster for Betty's Smut Shack, a web site that sells sexually explicit photos, learns that many people won't get to his site if he either rates his site "sexually explicit" or fails to rate at all. He rates his entire web site "okay for minors." A powerful Congressman from the Midwest learns that the site is now available to minors. He is outraged, and quickly introduces a bill imposing criminal penalties for mis-rated sites.
    The webmaster in this case rated his site safe for minors because he feared it was the only way he could provide access for adults. This is a great example of why a new rating system is so needed. Under the old system the webmaster grew increasingly frustrated because he knew his page was being lumped under "pornography" labels and people were being denied access, so he circumvented the system by re-rating his site with a lower rating. Under the new system he would never have reached that frustration level, because his site, as an unrated site, would still allow adults full access. If he mis-rated the site, he could be held to the same consequences as under the current system. Those who fear government tyranny and intervention would also have nothing to fear under the new system. Under the old form of rating indecent material, the government could target a site just because you rated it; in other words, you were placing yourself under government scrutiny simply by labeling your own speech. Under reverse censorship, leaving a site unrated invites no such attention.


    LaQuana Price
Both the CDA (Communications Decency Act) and COPA (Child Online Protection Act) were proposed in order to bring some law and regulation to the question of minors on the Internet. Both acts raise very good points about the need to protect minors from indecent and obscene material. But, as with everything Congress introduces, there are problems with them.
The CDA is very vague. It makes it a crime for anyone to knowingly transmit obscene or indecent material to minors on the Internet. But how is one to know whether the person on the other end is a minor? On the Internet, people are nameless, faceless, and ageless. The act also does not address people who use servers and Internet Service Providers to send indecent or obscene material over the Net.
On the other hand, you have COPA. Although it too is in some ways vague, it clarifies more things than the CDA does. It mentions Internet Service Providers in regard to the transmission of offensive materials, and it introduces the new concepts of "patently offensive" material and the use of that material for commercial purposes.
If I were a congressperson, I would push for more parental control, because not every parent protects his or her children to the same extent.
For home use, filtering software should still be available if that is what a parent chooses to keep their children away from harmful material. For public places such as libraries, on the other hand, I would propose some kind of new software, under which a code or identification number would have to be entered in order to access indecent or offensive sites.
I propose the new software because places like libraries are open to the public, meaning that most of the people who use them are taxpayers. A sketch of how such a gate might work follows.
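This is a minimal sketch of that library gate, assuming a list of flagged sites and adult access codes issued at the front desk (every name and value here is hypothetical):

    # Library terminal gate: flagged sites require an adult access
    # code issued at the desk; ordinary sites need nothing.
    FLAGGED_SITES = {"smut.example.com"}  # hypothetical flagged list
    VALID_ADULT_CODES = {"A1B2C3"}        # issued with proof of age

    def library_allows(host, access_code=None):
        if host not in FLAGGED_SITES:
            return True  # ordinary site: open to every patron
        return access_code in VALID_ADULT_CODES

    assert library_allows("www.umass.edu")
    assert not library_allows("smut.example.com")
    assert library_allows("smut.example.com", access_code="A1B2C3")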

Shripal Shah
   There have been many attempts to keep children from being able to access adult content on the web. They have come in the form of the CDA, the son of the CDA, PICS, and filtering software such as Net Nanny.
   I believe that you cannot govern the Internet. The reason can be found in its title: World Wide Web. If something is worldwide, then no one government can preside over it. Therefore I don't believe the CDA, or even the son of the CDA, would make much of a difference, because whatever is illegal in the United States under such a law would still be legal elsewhere. Companies can simply set up their servers in other countries where they have a legal right to do so. An example of this is online gambling: while it is illegal in the United States to run an online casino, it is not illegal elsewhere, so many companies have left the United States, set up servers in other countries, and made a lot of money. The adult entertainment industry is just as lucrative, and these companies are going to keep doing what they want. So far many have cooperated and put up credit card and age verification voluntarily, but if they did not want to do this, they could have gone somewhere else and set up their web sites.
   As a result, I believe a software solution is the best answer. However, I am against any type of automatic solution that has been created or proposed. It is because of these automatic filters that many groups, such as gay and lesbian organizations, have been filtered out. The PICS tags would probably have similar problems.
   The problem is that many parents today look at the computer and the Internet as a baby-sitter for their children; they feel their child will be occupied for hours without disturbing them. I believe this is totally wrong. Parents just let their children go onto the web without any supervision and then get upset when they find their child has gone to adult web sites. Would a parent allow their children to go play alone in a strange neighborhood? Then why do parents allow their children to visit strange web sites alone and unsupervised? I believe that is the main problem here. What these software applications should do is track where the children go on the web, and then, when they have time, parents should look at the web sites their children have visited. If there is an offensive web site that their child went to, they could place it on a banned list and then confront their child about what happened. This would keep the communication between parents and children open. Parents should also be able to look at their children's logs, see which sites they visit frequently, and have the software allow their children to visit only those frequently visited sites the parents approve of. If the child needs to go somewhere else, the parents could allow access to that site when they are able to monitor the child. A sketch of this arrangement follows.
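Here is a minimal sketch of that log-and-review arrangement (the data structures and function names are assumptions, not the API of any shipping product):

    # Log every visit for later parental review; the child may only
    # reach sites the parent has approved and not since banned.
    visit_log = []    # (child, site) pairs for the parent to review
    approved = set()  # sites the parent has okayed
    banned = set()    # sites the parent has banned after review

    def child_visits(child, site):
        visit_log.append((child, site))
        return site in approved and site not in banned

    def parent_review():
        for child, site in visit_log:
            print(f"{child} visited {site}")  # parent approves or bans each

    approved.add("www.pbs.org")
    assert child_visits("Ana", "www.pbs.org")
    assert not child_visits("Ana", "smut.example.com")

The software only records and enforces; every judgment call stays with the parent.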
   This, I believe, would be an effective and responsible approach to monitoring what a child does on the web. Parents have a certain responsibility toward their children and cannot rely solely on software to fulfill that responsibility for them.

Igor Moscvich1@yahoo.com
There is little doubt in my mind about the legality of the CDA and COPA; otherwise they would not have made it past the plenitude of lawyers representing the adult entertainment sites. But that only extends as far as the legal aspect of this debate.
But what we also have to bear in mind is the other side: the real-life implementation of those tools, as well as of PICS and other software filters designed to restrict access to "undesirable" sites. When you come right down to it, that is all they are: tools which, like any other, can be used for good or bad. The real question is what they are used for.
I anticipate the next question to be: "Who is to determine what is good and what is bad? What might be good for one person is just the opposite for the neighbor." That would be an absolutely justified question, and that is where individual values come into play. It should be left to individuals to decide whether or not to implement censorship software on their route to the Internet. The individual who carries the responsibility of raising the children, not a government that cannot please everybody, should make that ultimate decision.
Please don't misunderstand me; I do believe there is a place for the government in this issue. State institutions should be able to use the software in appropriate situations. But I do not mean using it as a sledgehammer; rather, as a surgical scalpel, carefully choosing the clearly obscene sites that would do more damage to a young, undeveloped mind than they could possibly educate it. (Which, I agree, is slippery ground by itself.)
When it comes to the use of this software to restrict any and all access to the World Wide Web, that is when I have a huge problem. That is exactly what I call inappropriate use of these tools.
So the bottom line would be: they are tools. They do nothing simply by existing. The real problem begins when they are misused, and the person misusing them is at fault.