Week 11: Law, Code, and Kids
November 19, 1998

Z: Good afternoon. Might as well go through some administrative stuff while waiting for our final guest of the day to emerge. So administrative thing number one, David Eddy Spicer is a doctoral student at the Ed School and wants to look at how the use of course tools on the Web has helped or not your learning in this class. David Eddy Spicer, where are you? There you are, in the back. So he has questions he'd like to ask. It actually dovetails nicely with classmate Elory Rozner's separate study through the Ed School of exactly that question as well. And I think he has a focus group he basically wants to run. It's entirely voluntary although he promises some kind of incitement. Enticement or incitement. I suppose it's enticement really. Some form of pizza and beverage.
__: A one way mirror?
Z: A one way mirror. That's right. There will be a duck blind of some kind. What do you think this is? And that's going to be some time next week according to how convenient it is for everybody's schedule. So if you are interested in participating in this focus group you should zero in on David Eddy Spicer either at the break or immediately following class. Do you have a sign up sheet of some kind?
__: ... (inaudible)
Z: So that is encouraged. And it may well fit in with some of the questions Elory wants to ask as well. For next week, in anticipation of our wrap-up class, where we pull everything magically together, there will be perhaps some questions on the site coming from David Eddy Spicer and from Elory about your experiences. They'll be of the multiple-choice variety. Again, as you may have noticed given the optional nature of the questions this week, you are now in the phase of the class where you can relax. You've done a lot of work; you should be proud. And now that all the other classes are actually kicking into exams and such, it's time to not have to worry as much about that. So the questions will be relatively painless and, again, optional.

I'm going to have extra office hours next week. I think it's Tuesday afternoon and Wednesday afternoon. And the week after, if you want to see me to talk about final drafts of papers, there will be other times available as well, just by call-in. And I'm delighted to see any of you on that. So that will be next week. There were supposed to be sign-up sheets of some kind but they appear to have not made their way over from Pound. So I think that's it on administration right now. Is there anybody who has something administrative that I said I would say?

__: Exchanging our papers?
Z: Oh, yes. At some point a paper exchange has to take place, doesn't it? So that seems like the right thing to do during the chaos of break. So before you go over to the Hark (?) and have a bad bagel you should find whoever it is whose paper you have and give to him or her your comments with the paper. Yes?
__: Most of us don't know the names, like don't know the faces of the names we have. So is there a way to ... (inaudible).
Z: What a sad fact.
__: ... (inaudible) would be happy.
Z: Would he be happy? He'd be smug, you mean. Yes, Alex?
__: There are two ... (inaudible). One is you could just go introduce yourself. The other is you could line up in a circle by first name and the person behind you will be the person that--
Z: You're putting this forward as an actual suggestion?
__: ... (inaudible).
__: How do we introduce ourselves (simultaneous conversation).
__: ... (inaudible)
Z: The free-for-all, how about that? So pick up your paper, not the paper with the best comments on it. And they'll all be up here. Why don't we do that actually after class? That seems better than trying to ruin the break with people mobbing the front table. Any other...?
__: I just had a question about the conference on Sunday. I got an e-mail about that. Are we invited to that?
Z: Yes, there's a conference on Sunday, thank you. (simultaneous conversation) Those who are participating in the conference are asking you to avert your eyes. We've actually had amazing uptake from the media. We invited several media people who are on the list. And there are people like flying in from Washington for this conference.
__: No pressure.
Z: So no pressure. There should be advice circulating around from me and others about just how to present. But that should be a great conference. It's an all day sort of affair. We have a nice poster, which of course I did not bring with me, but should be covering the walls of Hauser as we speak. It's all day. It's the result of a class that was joint between MIT and Harvard. Half MIT undergrads, half Harvard law students, some of whom are also students in this class, on the legal and technical architectures of cyberspace. And I believe the idea of this conference will be each of these student groups will have a spokesperson who presents certain policy prescriptions in a very interesting way to a panel of doubting Thomases who will promptly rend them apart. Moderated by somebody.
__: ... (inaudible)
Z: Yeah, so that should be actually good sport and good watching and I encourage you all to attend. And I believe actually there are three meals planned for the conference. There's actually a breakfast, lunch and dinner phase. So if you're going a little hungry and the stocking over the fireplace is empty this is one way to be treated for the day. I mean that only in an ecumenical sort of way. So good. I should ask this every week. Are there any other administrative announcements that I'm forgetting?
__: No.
Z: So the details on that conference, how about I will circulate it to the list, a formal invitation to the class about all the details of where and when? It's in the Ames Courtroom but everything else I'll circulate. David?
__: What's the final paper?
Z: The final version of the paper is due December 22, the last day of exam and paper period. And again, from the looks of the rough drafts and the comments, it should not be particularly onerous to get from where you are now to where you need to be by the 22nd of December. So before asking for an extension consider whether you really want this minor thing hanging over your head indefinitely.

We've stalled enough. We have two guests who will more than make up for the absence of our third. Today's topic, as you know, is Law, Code, and Kids, and Internet filtering. And let me just introduce our guests right now. We have John Roberts from the Massachusetts Civil Liberties Union. Thank you for being here today. As well as Dolores Greenwald, a librarian with the Boston Public Library, and somebody who's been following their Internet policies for a while. And I actually should allow each of you to augment your biographies as you may see fit with various items as need be.

So I wanted to get started actually by talking to our third guest. The person who's not here is Susan Getgood from Cyberpatrol's marketing division who was going to tell us all about Cyberpatrol. And I wanted to start just by getting into how Cyberpatrol works. I don't know from any of you visiting the site or reading about it, is there any lack of understanding about exactly what Cyberpatrol does? I guess you don't want to say so if so. So let me be the person who says I don't fully understand Cyberpatrol. And let me ask our local librarian. You may hear this a lot. Tell me, what is Cyberpatrol? How does this fit into what the library uses?

DG: Cyberpatrol is a program that will filter graphics and text, select words and that sort of thing. You have a choice. Actually I have a copy of Cyberpatrol's CyberNOT list, if you want to take a look at it. One of the options is violence and profanity, graphics or text; partial nudity; full nudity; sexual acts; gross depictions. All these are various filters that you can select to put on your computer or your network.
Z: So they're like a set of check boxes and you can say I want to filter out this, on the other hand I want to let the following stuff through, that kind of thing?
DG: Mm-hmm.
Z: And I don't know if that sheet you're talking about is actually online somewhere.
DG: Yeah, it's from the Cyberpatrol site. It's called their CyberNOT list.
Z: Let's see if it's on their fact sheet. So there's a screen snapshot there.
DG: The CyberNOT block list.
Z: We can see on this sample screen that Sunday through Saturday you can set hours of use, I guess. The filters kick in at certain times and then may leave once the kids get tired. And then up here there are different modes of the Net that you can filter. I guess World Wide Web access is the really central thing. But I guess if you're in chat or something -- I noticed in looking at the site, Cyberpatrol lets you -- you're talking about this not list, I guess it comes with some things built in. But then if you -- as the librarian installing it, or if it's a parent at home installing it, you can add your own things to it.

So for instance they advise that you add a fragment of your phone number to the not list so that your child will actually be incapable of putting the four digits together that spell the end of your phone number and sending that over an e-mail or over chat. One of the problems, you'll recall, that was raised on the other kids day we had, with the Federal Trade Commission. Does that sound right, how that works? So how much have you futzed with the not list that may have come with the software? When you configure this for the library what do you do?
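The mechanics just described -- a vendor-supplied blocklist organized by category, check boxes to enable categories, and locally added entries such as a fragment of the family phone number -- can be sketched roughly as follows. This is only an illustration of the general scheme; the site names, function names, and data layout are invented for the example and are not Cyberpatrol's actual implementation.

```python
# A hypothetical category-based filter in the style described in class.
# The vendor ships a list of sites, each tagged with categories; the
# installer checks off which categories to enforce and may add local
# strings (like a phone-number fragment) to block in outgoing text.

BUILTIN_BLOCKLIST = {
    "example-adult-site.com": {"full nudity", "sexual acts"},
    "example-hate-site.com": {"intolerance"},
}

def make_filter(enabled_categories, custom_blocked_strings=()):
    """Return a predicate that says whether a request should be blocked."""
    enabled = set(enabled_categories)

    def is_blocked(url, outgoing_text=""):
        # 1. Vendor-supplied list, restricted to the categories the
        #    installer checked off.
        for site, categories in BUILTIN_BLOCKLIST.items():
            if site in url and categories & enabled:
                return True
        # 2. Locally added strings -- e.g. a fragment of the family
        #    phone number, so a child can't send it out over chat
        #    or e-mail.
        return any(s in url or s in outgoing_text
                   for s in custom_blocked_strings)

    return is_blocked

# A library might enable only a few of the categories for the
# children's machine, leaving the rest unblocked.
kids_filter = make_filter({"full nudity", "sexual acts"},
                          custom_blocked_strings=("555-0134",))
```

With a filter built this way, a library committee could enable only the handful of categories it chose while leaving the other CyberNOT categories, like sex education, unfiltered.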

DG: The Internet committee met after we had a presentation from Cyberpatrol. And we looked through the CyberNOT list and decided exactly what filters we would select to put on the children's machines. Maybe one thing I need to say about the Boston Public Library is that we have a designated children's PC with access to the Internet and an adult PC at every branch. So in configuring Cyberpatrol we talked about just the children's machines. We never really discussed (simultaneous conversation) -- the discussions only touched a bit on filtering the adult PCs.
Z: And among these categories do you just know offhand -- I know you haven't actually on the kids' PC screened all of these out, is that right?
DG: No, we didn't. The best I can remember we selected four out of the list.
Z: Wow. Just trying to look. You let the satanic cult go through or was it the...? Just seeing among these things what would you want kids not to see or to see. I'm not sure what gross depictions is.
__: ... (inaudible)
Z: I guess intolerance is an interesting category.
DG: In an area such as intolerance or sex education, as a library we were very concerned about children doing research and needing access to information. We also, as librarians, understood that oftentimes children don't talk to their parents about sex but they will come to the library to get information. So we were very -- we didn't block the sex education part. Or the militant extremists. Because if you're conducting a study on hate groups you need to see how the Klan is represented on the Internet. So we didn't block that one either.
Z: When you do choose to turn one of these filters on is there a way to get down and see specifically what sites are being filtered? Or is it basically Cyberpatrol that says, "Trust us, we've found some stuff in these categories and now they'll be filtered"?
DG: I'm not actually one of the BPL employees that does this. But we download an updated list about twice a month from them. But we also -- we have forms and things I can pass out, but we also can add a list, add a site to it, or we can remove a site. So we do have some freedom with it.
JR: But Cyberpatrol does not give a list of the blocked sites?
DG: No, they don't.
Z: And indeed, so far as we know, Cyberpatrol considers those lists to be hard earned lists by the sweat of the brow, right? You've got to have a roomful of people combing the Net with these categories in mind hoping to -- you're not from Cyberpatrol. (simultaneous conversation) Hoping to categorize sites into one bucket or another. And then once they've gone to that effort the real value of the product from a purely economic standpoint is exactly these lists. So it's hard. I guess there's a way on the site to find out. You can query about a specific site and see whether it's filtered but there's no way to get the list as a whole. Tim?
__: Do you know when they -- the actual mechanics of how Cyberpatrol chooses which sites fall into these? I know they're surfing the Web -- they have employees surfing the Web -- looking for things that come under this general category. I'm just wondering, I guess we had ... (inaudible) but how the company actually decides, when somebody finds a site, whether to include that as something that will be blocked under its cybernot list?
Z: I'd bracket that question. I suspect she'll probably show up sooner or later and it would be good to ask her that as soon as she gets here. I mean she's best able to speak to it. And see if she just says, "That's a trade secret actually. I can't tell you how we come up with this list" or not. I think in some ways they rely upon people reporting various sites, which they then pump to the top of the list for scrutiny, and then otherwise maybe comb as best they can.

Now recall, from the day Joe Reagle was here, thinking about not just the privacy platform but the so-called PICS platform. The idea behind PICS was to make Cyberpatrol something readily scaled by allowing lots of different people to do a little bit of rating and then be able to concatenate all the ratings together. Or even better, have sites rate themselves. Now, how many sites would actually put themselves, for instance, into the intolerance category? You may have seen in the assignment we had one site to look at which was a family site. I don't know how many people saw that. Just see if we can pop up with that real quick.
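The scaling idea behind PICS described above -- many independent raters each labeling a few sites, with a filter concatenating their labels -- might be sketched like this. The category names, the merge rule (block if any rater flags the site under a banned category), and all identifiers here are hypothetical illustrations, not the actual PICS-1.1 label format.

```python
# Two hypothetical small-scale raters, each labeling only a few sites.
rater_a = {"example.com/forum": {"intolerance"}}
rater_b = {"example.com/forum": {"profanity"},
           "another.example": {"full nudity"}}

def merge_ratings(*raters):
    """Concatenate many small rating sets into one big lookup table."""
    merged = {}
    for labels in raters:
        for site, categories in labels.items():
            merged.setdefault(site, set()).update(categories)
    return merged

def blocked(site, merged, banned_categories):
    """Block a site if any rater put it in a banned category."""
    return bool(merged.get(site, set()) & banned_categories)
```

Self-rating is then just the degenerate case where the site itself acts as one of the raters -- which is exactly why a category like "intolerance" is unlikely to be self-applied.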

Hope it's still the current assignment. The bot may have gotten us again. Did I see, by the way, one of the papers is called The Evil Bot? Who wrote the Evil Bot paper? I can find it here. Don't make me do that. I'll do that over the break.

__: It's Enough is Enough.
Z: Clearly a well trafficked site. No, it's not Enough is Enough. Enough is Enough is the navigation (simultaneous conversation).
__: It's the American Family Association.
Z: Yes. This is the one for which if you look at the captain's message it's still under construction, even though the site has long ago set sail. But I think there was another one.
__: Enough is Enough is Donna Rice Hughes's (?) organization.
Z: Yes. Where is this thing? Oh, this is it. I'm pointing right at it. Here it is. Morality in Media. Cyberpatrol blocked them under the intolerance category. And these are the people that are trying to get homosexual speech blocked under the bad speech, or gay speech, category. So there's a long kind of choleric letter from them to the CyberNOT oversight committee imploring them to see that what they're doing isn't so bad and that even President Clinton has, out of at least one half of his mouth, said things that equate to what they believe, and that as a result they should not be blamed for intolerance.

And they also even show the company that they keep. These are other organizations that have been labeled as anti-gay. And they include such notables as the American Family Association, Christian Coalition, Phyllis Schlafly, the Promise Keepers. And it's the boldfaced ones they say you shouldn't block. They're silent about the Stormfront white nationalists and the Aryan Nations.

So we have some means by which Cyberpatrol blocks a set of sites using those criteria. You can manually add to it but primarily it's coming from Cyberpatrol and we're not sure how they get it. Now, do we know how easily the kid at the terminal can get around Cyberpatrol?

DG: Yes. I often oversee our Internet terminals at the branch where I work in the afternoons. And the only thing that has to be done is going into our browser and removing the proxies. And that will free it up.
Z: How many kids do you think know how to do it?
DG: I would say of the ones that come into the branch that I've met, very few of them actually. I'm beginning to learn it's a myth how well kids know computers, because they really just don't.
Z: This is another good factual question to bracket. We've seen it before. Every time we look at a question that has a code solution for which the answer is, "Yes, but hackers will get around it and then share the bounty with the rest of us", this is yet another example of you put on Cyberpatrol but the kids are still going to figure out how to get around it too.

Here is something off the Peacefire site. You saw there was a whole list of how to disable each and every piece of software. And here's one: how to disable Cybersitter '97 -- rename the file C:\Windows\System\WSOCK32.DLL (?), etc. You do two little commands and you've disabled Cybersitter '97. And notice it says, "You can help spread the information. Just download this banner and put it on your Web page." And it says, "It won't work if you simply copy the words `Rename the file blah, blah, blah' because Cybersitter can block strings of words from appearing on Web pages." So to prevent that kind of word-for-word excision they actually have it built into a little graphic here that you can put onto your page. But again, I don't know whether Peacefire itself is blocked by Cyberpatrol, since it contains within it information about how to unblock the blockers.
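The cat-and-mouse move described above -- the filter excising a known string from pages, Peacefire shipping the same words as a graphic -- turns on a basic limitation of keyword filtering: a string matcher has nothing to match against inside image data. A toy sketch, with all names and phrases hypothetical:

```python
# A hypothetical keyword filter of the kind described: it excises
# banned phrases from the text of any page it serves. It cannot touch
# the same words when they arrive as pixels in an image.

BANNED_PHRASES = ["rename the file"]

def scrub_page(html):
    """Excise banned phrases from page text, case-insensitively."""
    out = html
    for phrase in BANNED_PHRASES:
        lowered = out.lower()
        while phrase in lowered:
            i = lowered.index(phrase)
            out = out[:i] + out[i + len(phrase):]  # cut the phrase out
            lowered = out.lower()
    return out
```

An `<img>` tag whose pixels happen to render the banned instructions passes through untouched, which is exactly the loophole Peacefire's downloadable banner exploits.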

Todd Lappin (?), a Wired journalist, is saying yes, they're blocked.

__: Yeah. And also Cyberpatrol has cracked the encryption used to protect the blocked sites list. And they've gotten into some proprietary, intellectual property issues because they're publishing the lists of sites that are blocked and --
Z: You mean Peacefire is?
__: Peacefire has cracked Cyberpatrol's list.
Z: Which actually comes nicely to the trusted systems question of last week about what happens when you have the material on your hard drive but it's encrypted, and then you go to the trouble to break it so that you can then share the wealth with everybody else. Have you gotten any other complaints? I mean how long has this policy been in effect?
DG: Two years.
Z: And all quiet? I don't know if this is the first time you've met Mr. Roberts. And I'm just momentarily about to ask Mr. Roberts what he thinks about your policy. But have you heard a lot in channels other than law school classes?
DG: No, I personally haven't. We haven't got a lot of response from it. We have forms that parents can fill out if they see a site. If they happen to be at one of our branches and they're surfing the Internet they can fill out a form and give it to us if they think a site should be blocked. But actually we haven't got a lot of response from the public.
__: That's for ... (inaudible)
DG: Yes.
Z: And it may be that since adults can easily get to another terminal that's not filtered in the least, that that's not as much a problem.
__: Are children obligated to only use the children's room?
DG: No. With parental consent the child can use the unfiltered machine.
__: And the adult machines are totally unfiltered?
DG: Yes.
__: Do they have chat capacity?
DG: Yes.
Z: Emily.
__: How do you keep children off of the adult terminals? Like how are they separated in the library?
DG: They're designated. They're labeled adults, they're labeled children's.
__: So is there someone monitoring them at all times to make sure that children aren't using the adult computers?
DG: I wouldn't say that there is someone designated at every branch to keep an eye on it, but the staff -- if the circulation clerk notices a child is at one of the adult PCs they might say something. But we don't have a designated person in every branch to oversee that a child is not on the unfiltered machine.
Z: So John, what do you think? This is a policy that apparently doesn't block adults from getting anything that they might otherwise get, since the filtering doesn't apply to them. And adults can even allow their children to use the adult terminal if they fill out the proper paperwork.
JR: We oppose filtering software for either adults or children in a public library. I mean our view is that we have the greatest research tool ever devised, and that the role of the public library really is to make that available to citizens, and it should. We always have the problem -- the thing that's always raised, at least historically, is what about the kids? I think our feeling is that there really are ways that you can, in a positive way, direct kids to things, as for instance the Cambridge Public Library does in their children's room, without blocking the capacity for kids to use the Internet.

Again, I use the same rap whenever I talk about censorship, and that is a couple things. One is I think if we are going to have freedom of expression in this country and the right to academic freedom, however you want to talk about it, we take a real risk, I think, in terms of allowing the kind of speech that a lot of us don't like to take place. We have to put up with obscene speech, sexist speech or whatever. And I think it's the same with the Internet. Of course, it depends on who you talk to, what librarian you talk to, as to whether or not this is a problem. Some people think it is a problem. A lot of librarians think it is not a problem.

My concern is that the fact that a few kids may go in and see explicit sex is not a reason to dumb down the mission of the library or the ability of kids to use a very effective research tool. I mean we've talked to the Boston Public Library. We met twice with Bernie Margolis (?) about this. We've had some long sessions. And we were in very, very early when the Boston Public Library announced it. I don't know if you know the story about that, and I can go into it if you want to, but it's a very interesting story which really involves more about the politics of Boston than it does about censorship.

The mayor announced originally that there was going to be filtering of course for adults and for everyone. And then when Bernie Margolis came in he really negotiated that out so that it was just the children's side. But we oppose that approach.

Z: Can I ask, would you defend the right of a parent to ask that her child be allowed to excuse himself from class when class turns to some particular subject, say sex ed? Would you go into court to make sure that that parent could do that?
JR: Yes, we allow that. We allow an individual choice, but not the right of a parent to dictate a change in the curriculum for others. In other words, they can make an individual decision about their own kid but not about others -- not one that would affect others.
Z: So is the real difference just whether it's opt in or opt out? If the Boston library had a policy that said when you sign up as a kid to get your library card and to be entitled access to the kid's machine you must also have a card from your parent indicating whether your parent opts in or opts out of a scheme that will filter your stuff? (simultaneous conversation)
JR: -- talk about the policy of the Boston Public Library because the Boston Public Library definition of a child is below the age of 18. So that theoretically there are some really serious teenagers who I think are really quite capable of handling an awful lot of information that we might say--
Z: So if it's an 11th grade sex ed class you'd want to talk to the parent and the child before marching into court and defending the parent's right to ask that the child be excused from class? Because it's an older child now?
JR: Our argument is it is precisely the parent that does have the right to make the decision.
Z: But what do you say to the parent who says, "If you want to remove filtering from the library my only decision now is whether to send the kid to the library or not. I can't send the kid to the library and know that my kid is going to get stuff filtered as I, parent, would like it to be in my choice."
JR: Then the parent shouldn't send the kid to the library. If the--
Z: But then why shouldn't the parent not send the kid to the public school? It's the same thing, isn't it?
JR: Because what you're doing in this instance is you are forcing a number of kids that are using this to be limited, in order to accommodate this one kid that may need that, or whose parents want that. So what you're doing again is you're making decisions based on protecting this child, and that's a decision. We tried to get the Boston Public Library to say, why don't you do this? Instead of saying that unless you have an adult card or unless you have a note from home you can't use it, why don't you do it the opposite? And that is to say it's available to everybody. If you're going to have this policy -- and we don't like the policy. But if you're going to have the policy then maybe you should say it takes an affirmative action on the part of a parent--
Z: An opt in scheme?
JR: That's right.
Z: Where the kid can, all else being equal, get to an unfiltered terminal. But if a note is presented to the librarian from the kid's parent that says, "Please restrict Joey" then the kid has to only use the restricted terminal. How does that sound?
DG: We discussed that in the Internet committee. And for several reasons we decided not to do that. The librarians on the committee had discovered that parents are not real active in what their children do at the library. And it would take too much proactive response from the parent, and administrative paperwork. We were concerned about that, because we were dealing with 26 branches and confusion in the afternoons when the children come in. Do they have the form, can they use the filtered software machine, can they use the adult machine? So we thought it would be confusion on behalf of the staff as well. But we did discuss a more proactive part for parents.
Z: Andrew.
__: In your hypothetical I think you implicated and then by assumption trashed the basic question of the course, which is whether the Internet is different than anything else. Because you've created this potential catastrophe where the parent doesn't know; the parent can send the kid into a library laden with unfiltered Internet terminals. Whereas in like real space the parent doesn't know if the kid's going to go read a book. There are books with content similar to what would be alarming to the parent on the Internet. And so I guess the question that your--
Z: Really? You think the local public library has books with content similar to what you get on an unfiltered computer? (simultaneous conversation)
__: There may be explicit sex acts depicted in text in the library--
Z: Medical atlases, things like that. (simultaneous conversation)
__: Alt.pictures.bestiality or whatever.
Z: That just rolled off the tongue. (simultaneous conversation)
__: Search my cookies. There may not be an analog to that in the text at the library. But so that's the basic question is whether it's necessary to your alarming situation that the Internet be ... (inaudible).
Z: This is good. This is an empirical question about just how different what you can get on the Net from what you might find in the library is. There's a cartoon that Larry Lessig has fallen in love with that shows a kid being banished from the local sex shop. And you see him going next door and it says alt.sex.stories at the top and he's happily welcomed there. Which is a partial answer by the New Yorker to your question. But we've actually evolved a sort of praxis in this class as to how we answer questions of fact like that. And we tend to consult the Fox 25 News At Ten. So that said, this may actually--
JR: There's another major problem. And that is, at schools there's compulsory education. The kids have to go to school. And I think we can talk about the real differences between a school and (simultaneous conversation).
Z: As necessary to a child's existence compared to a library.
JR: Even in terms of filtering software use in schools than in the public library. I think there's a real difference.
__: So you (simultaneous conversation) support that use of filtered terminals in schools? Or ... (inaudible)?
JR: I think schools can -- in certain situations, yes. I think if you have a certain curriculum for which you want a certain use -- the kids to be in a library doing research on a particular kind of thing -- it seems to me the school has the ability to say, "You can't be surfing the Internet." I would hope -- I press my son on this all the time. My son is a school teacher at Provincetown High School and he works out of the library, and he's also the computer person down there.

And I said to him, "Do you use filtering software?" And he said, "We turn it on and off. We use it at some times." And I said, "What's the real problem?" He said, "The real problem is not sex. The real problem is games." Kids are supposed to be doing research and they're playing games. So what they do is at times they can turn it on, but then when the kids are using the library for whatever, they can just turn it off and let them go. So it depends on how it's being used.

Z: There's another empirical question. We could actually do a quick snapshot to see which of the machines here are tuned to solitaire and which are actually people taking notes. But I won't ask people to rotate their machines 180 degrees and see what's there. My Fox 25 News At Ten comment was quite serious, and I now offer actually what Fox 25 News At Ten has to say on this issue. You should take notes as to what they're assuming as they report this. There are two segments, one the problem, the other the solution. [video played]
__: Now we know why people watch the news.
Z: No, it's too bad we don't have it actually out of the newscast because it would be, "Now we return you to more stories of mayhem and violence." So I'm interested, from the people who are familiar with the politics behind this, how well did this story, which we know comes from the lurid side of broadcast press, how well did this cover the situation?
JR: It didn't cover it very well. I mean what was going on, if you want to hear the quick version of it -- and I got this from Bernie Margolis himself -- Feeney -- you ought to understand that the Feeneys and the Meninos have been warring political camps for a while. She caught wind of the fact that some kids had seen some pornography or something and gone home and their parents were upset about it. And it got to her. And so she announced that she was going to have some hearings. Within 45 minutes after Menino heard about the hearings he was on the front steps of City Hall announcing this policy about the blocking software. There was a kind of one-upmanship going on.

So he announced it really prematurely. I think he hadn't really thought it through. But that was the policy that was announced. The hearings never took place until after Bernie Margolis came from Colorado to assume the presidency of the Boston Public Library. And in fact, the day of the hearings, the City Council hearings, he was talking to the mayor about this and came out actually at the hearing and announced the policy that in fact it would only be children's computers that would be monitored. But it was this sparring between really a couple of political camps that forced the issue so quickly onto the public as to who was really going to get credit for filtering the computers at the library.

Z: As long as we have the VCR behaving I want to quickly show the segment that happened just about a month later, after the policy was announced, and Fox's take on the resolution of it. And the great thing to look out for is how they describe how the software works, what it does and how it works. [video played] What's amazing in this segment is the idea that the filtering is perfect. Not only does it actually get to every site that might be pornographic but it properly avoids taking out sites that, under the relevant standards doctrinally speaking, would not be considered harmful to minors. In fact, it seems to be adjudging cases or controversies before they're even brought before federal courts and decided as to which side of a particular line something falls.

Other thoughts on these segments before we-- Joe?

__: I wanted to follow up with the question I asked you originally with respect to the difference with the schools. Because you seem to say that in schools there's an intent and there's a constrained resource. That being the students' and teachers' time and the availability of the terminal. And with the number of terminals in the library I wouldn't imagine it to be too difficult to argue that those computers there are for a purpose. That they're a scarce resource and that children shouldn't be on there looking at potentially pornographic material, not with respect to free speech, because it's a scarce resource and other people need to use those terminals.
JR: Let's talk about that. You say it's a scarce resource and you're also assuming therefore that when people come in and use it there are good things to look at and not good things to look at. And if you're looking at sex that's an inappropriate use of the computer. And why do we say that?
__: You've granted that for the schools' content.
JR: I'm really drawing a distinction between the schools and the library. I'm talking about the library.
__: Right, but I'm saying that if you grant the fact that you make a discrimination in the school context with respect to appropriateness, given that it is a scarce resource in the library I think you might--
JR: It's a scarce resource and therefore what you do is you say everyone can use it for an hour. And so when you get your hour, I mean if you're looking at sex I think what you're saying is you're wasting the library's time. And the guy that wants to get to the computer and do research on aerodynamics or something. But I mean the library has all kinds of books. You can go and read for pleasure or you can go and do research. So if you're using the computer for pleasure at the library, why do you want someone telling you how you're going to use your hour? That's the problem. That's what censorship is all about.
Z: Of course, notice a distinction, it's one that Andrew is pointing out, in the cyber context versus real world, in that the computer is both a source of content and itself a tool. And by that I mean suppose the library decided to celebrate minerals month by having a little display all about minerals. And they have some books that they've taken out that are about minerals. They put it there, please browse the minerals books and here's a minerals resource station. Turns out to be a computer and it's specifically designed to take you to minerals related sites. And they ask that you not surf elsewhere because the whole point of the exhibit is to help with that.
JR: I have no problem with that, if there is a terminal available for everybody to do their thing. If it ties up the only terminal then I guess I do have a problem with it. But if you have other terminals that anyone can go and then use but you say, "We're going to designate this one for this month on black history or minerals or whatever and we have on it software that's going to lead you through this or whatever" I don't have any problem with that. Only when you take away the availability of an unfiltered computer.
Z: Of course the library could choose not to have any Internet access at all. Would that be OK with you?
JR: Yeah. They don't have to have it, no. There's nothing that says you have to have it.
__: Pragmatically speaking, is that situation better? From the civil liberties point of view, from a pragmatic point of view, ... (inaudible). Is it better that a library not have Internet access than have Internet access?
JR: I don't think it's better, no. But a library can decide whether to have it or not to have it. It's the content issue. If you have it then you can't be making decisions about the content.
Z: Dolores, you're shaking your head. I'm just curious.
DG: I don't think it's much of a choice any more whether or not a library has the Internet. It's a necessity. With the limited funds for books, periodicals, it doesn't make sense any more for a library not to have Internet access. I can get more government information, more periodical information at my branch with the Internet than I ever could.
Z: Let's work our way over. Scott and then Alex.
__: Also get into how or if the Internet is different. I would imagine that you agree that the library should be selective in what books it purchases and that there are certain books that are inappropriate to have on the shelves. So do you think that--
JR: Is that analogous?
__: Yeah, is that analogous?
JR: I don't think that's a good analogy of the Internet. I don't know if there really is an analogy for the Internet. But if there was one I would say (simultaneous conversation) librarians make decisions all the time about the content. On the one hand you've got a certain budget and you've got certain space problems, etc, that you don't have with the Internet. But I think the Internet analogy is more like a set of encyclopedias. Because it's something that gives you access to a tremendous number of topics. It would be analogous to having a set of encyclopedias in the library and having the librarian then going through and cutting out certain pieces.
__: Of course encyclopedias are already heavily edited for--
Z: This is an interesting point because if you actually look at the legal doctrine, the legal doctrine has this bizarre split in it that says libraries are entitled to spend their money acquiring books however they like. There's no compulsion to acquire this if you're getting that. And presumably they could order volume A of the Encyclopedia Britannica and skip over volume S. However, once they've acquired something they're not allowed to throw it away or to censor it, censor access to it, in ways that amount to selecting among material that they've already acquired.

That bizarre dichotomy, when you play it out on the Internet: is the Internet, once you hook up the pipe, acquiring everything, so that filtering is to start tossing? Or is it just a conduit through which you can elect what should be acquired? That's of course a question there's no way of answering. Does that map to what you hear, Dolores? And then we should introduce Susan.

DG: Yeah. But I do think that there is a difference between encyclopedias and what you see on the Internet.
JR: Oh, yeah.
DG: I mean you won't see-- we wouldn't have to be concerned with Penthouse being in an encyclopedia. So I see it as a totally different thing.
Z: -- Andrew's question of whether the library's holdings are changed in kind as soon as you have the Internet connected. Or is it just a way of having around a mirror of the sorts of things that are already there on somebody's shelf. Why don't I let Todd get in and then introduce Susan.
__: Just a quick rhetorical point, there is a guy who runs a site called Filtering Facts who's a filtering advocate. I know, we won't even get into it, he's a bit of a crank. But he actually calls it the Hustler challenge. And this is his way of dealing with a bunch of these issues, including the resource issue. You know, how do you decide what's in your collection. He's basically making this argument. He's offered to buy a Hustler magazine subscription for any library that wants it. Saying look, it shouldn't be a financial issue. Under the argument that look, if we take away resource constraints they still don't want this stuff.
Z: If you drop it on their front door and they throw it away isn't that filtering (simultaneous conversation).
__: -- so he's basically got this thing going called the Hustler challenge and nobody is taking him up on it. And he uses that as an argument to say there are some things that libraries just don't want.
Z: Would-be private librarians take note. This is an opportunity to start your holdings. So at this point I think we have with us Susan Getgood from Cyberpatrol. Maybe you can give us just a quick encapsulation of who you are and ... (inaudible). We've already talked a little bit about Cyberpatrol.
SG: I'm sure you have. I see old friends in the audience who already know a little bit about it. So ... (inaudible) had some experience.
Z: Are they on the not list or on the yes list?
SG: I wouldn't know. We don't publish it. I gather I walked right into, you're in the middle of discussing the issue of libraries and filtering which is very timely right now given the Loudoun case last week. The real nut of it is that there is no right answer about how you filter and how you protect kids in libraries. And we've always taken the position that it's really for the library to make that decision. To look at its community, to look at what they need to do to protect the children within their community within the realm of not infringing on the rights of adults.

So we look at things like the way Boston has implemented filtering is if there is a right way that's the right way. You filter children and you give choices to parents about whether their children go to the library and if they do they can be filtered. And you don't infringe on the rights of the adults to get that same material.

Z: Are libraries a growth industry for Cyberpatrol? As a marketer is that an area that you'd be targeting specifically?
SG: We never have and I doubt that we ever will. We consciously made a decision over three years ago not to market the product to libraries. We felt that it was a decision that the library in its community should be making. And then if they chose to filter, I mean clearly we're a business. If they call us up and say, "We'd like to buy your product," we're going to sell the product. But we're not going to spend time and energy pursuing a market that is so torn on the issue. That would be silly. There are far more lucrative ways of making a business.
Z: And there have been a couple of empirical questions just raised before you got here. One is, how easy is it for a child to get around Cyberpatrol?
SG: I guess I don't know how easy it is to get around it. We certainly worked very hard to make it what I would call tamper resistant. I mean you never say anything is tamper proof because that's like asking for trouble. There have been instances where a particular operating system changes and you have to retool your product to fit the operating system. What we really believe is actually that filtering is only part of the solution, whether it's in the home or in the school or in the library. What we're really dealing with is understanding that there are a lot of competing objectives here. And this objective happens to be to protect children from inappropriate Internet content, by whatever definition you have of inappropriate.

Whether it's, "I don't want my kid to see Penthouse" or "I don't want my kid to know how to roll a joint" or whatever it is that you view as inappropriate content.

Z: That definition of inappropriate content is the different categories you offer, and then you fill in under each category. And I'm just curious about the mechanics of how that's done. Without having to divulge the contents of the list, how do you go about identifying sites and putting them into categories?
SG: I'll put the train I was on into the station and I'll move on to this one. Specifically, mechanically, Cyberpatrol, as many of the other filtering products do as well, has categories of content which we define. And in our case it's everything from partial nudity to nudity to intolerance to drugs, a host of others. I used to know them all by heart and I don't any more. But I'm sure you can find them.
Z: We were just looking at the list actually.
SG: And our researchers actually go through and, using defined criteria, view the Web sites and put them on the list according to whether or not they meet our criteria. The person using the product, in this case Cyberpatrol, has the opportunity and ability to select which categories they use or don't use. They have the opportunity to add or restrict additional sites based on their own criteria. So if for example they have a 15 year old boy who's old enough for Playboy but not Penthouse they can add Playboy to the allowed list.
Z: And that overrides the restriction that's built in?
SG: That overrides the restrictions.
Z: They might not know it's restricted until they try to visit. But once they do they can say, all right, I'm going to let this in.
SG: There's a certain assumption, I think, on a class of sites that we know are going to be on most lists of most filtering products. Your average commercial pornography, which is where most 14 year old boys are going to go and pretty well known.
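The mechanics Susan describes-- curated category lists, parent-selected categories, and per-site overrides that win over the built-in list-- can be sketched roughly like this. The category names and sites are invented for illustration; they are not Cyberpatrol's actual lists or API:

```python
# Hypothetical sketch of category-list filtering with parent overrides.
# Researchers assign sites to named categories; the parent chooses which
# categories to enforce and can override individual sites either way.

CATEGORY_LISTS = {
    "nudity": {"example-pinup.com"},
    "drugs": {"example-growguide.com"},
}

class Filter:
    def __init__(self, enforced_categories):
        self.enforced = set(enforced_categories)
        self.always_allow = set()   # parent override: permit despite a list match
        self.always_block = set()   # parent override: block despite no match

    def is_blocked(self, site):
        if site in self.always_allow:   # overrides are checked before the lists
            return False
        if site in self.always_block:
            return True
        return any(site in CATEGORY_LISTS[c] for c in self.enforced)

f = Filter(["nudity", "drugs"])
f.always_allow.add("example-pinup.com")       # the "Playboy but not Penthouse" case
print(f.is_blocked("example-pinup.com"))      # False: parent override wins
print(f.is_blocked("example-growguide.com"))  # True: on an enforced list
```

The key design point is precedence: the parent's explicit decision is checked before the vendor's lists, which is what makes "add Playboy to the allowed list" override the built-in restriction.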
Z: Are the criteria themselves published, the criteria used to go into each one?
SG: Yes. Are you on line?
Z: Yeah.
SG: You can get to ours on our Web site. They're also in the product. So you can indeed--
Z: And when you say researchers, how do you acquire researchers? Who are they?
SG: We hire parents and teachers who are trained in our criteria to locate the sites.
Z: And do they work from home or do they come in?
SG: No, they work in our office.
Z: How much do they get per hour approximately?
SG: I don't reveal that information. Would you like someone to reveal your salary?
Z: Maybe. Frankly haven't checked my hourly wage lately. If I were doing this I would probably be interested in having it revealed. I mean I ask really because do you think there is an expertise in classification developing? For instance, once upon a time I worked for the government and the government classifies materials. Some of it is unclassified, some of it is confidential, some of it is top secret. And there are paid people who just classify stuff. It comes in one end, they look at it, they classify it, it goes out the other according to stated criteria. And I'm wondering if you think that's a distinct profession that might develop in our cyber economy where somebody becomes really good at classifying things for your categories.
SG: Probably, but not from the point of view of restricting material. I think that actually the value of classifying and understanding what is out there comes from classifying it in the positive sense. Where can I find information on X, Y and Z? Being able to aggregate content in such a way that you can find the material that you're looking for. A whole component of our business is based on building white list communities-- inclusive lists for kids, generally geared at younger children, little kids-- so that parents can actually let them go on line, visit these communities of sites and not feel like they've been blocked. Because really they're just surfing in a safe environment. That's not a solution for a 15 year old. That's a solution for a six year old.

So there is some of that, and that's the model really that America Online uses. That's a model that AT&T uses, creating ... (inaudible) kindergartens if you like. So there's certainly some component of that. We decided at the very beginning that we wanted it to be parents and teachers who are doing the research for our product because, cycling back to what our real objective here is, it is to create a tool for parents, for teachers, that they can use to help protect kids from inappropriate content.

So to some level parents and teachers come into that with that as part of their mindset anyway because that's what they do, that's what they're interested in doing.

Z: To an order of magnitude approximately how many sites do you block?
SG: About 60,000.
Z: And how many per month get added to that list? And do they tend to cycle around? I mean somebody will have a home page, they'll just move it to another site and you have to find it again and block it again?
SG: We have some tools in place that let you do that relatively painlessly. I don't know how many they're adding per week any more, truthfully.
Z: And if a site is blocked I don't presume there's any particular notification to the site. You know, congratulations, you've been blocked in the following categories. You've just earned your fourth category.
SG: It's interesting because no, there is no process of notifying. But again, these are commercial products developed by companies, sold to people who choose to buy them. And it's back to the train I was on before. In context they're part of what parents are doing to protect kids, or teachers using. And if you just had filtering software in isolation it would be the wrong thing. But if you have filtering software in the context of education for children, education for teachers, information about the Internet, safety rules for the Internet, and a whole host of things all geared to the same objective of protecting kids, then it is an effective way of protecting kids. But it shouldn't be done alone.
Z: So something to bracket then is, you'll recall, Molly Shaffer Van Houweling's axe to grind. Her paper was assigned, I think, either as background or primary reading on state action. From a constitutional point of view I think Susan is right. She's saying look, this is not state action. This is private actors coming together to make judgements about things. In fact, in some ways it sounds like the W3C's view as well when they talk about PICS. They're just people getting together to facilitate a sharing of judgements so that a parent can use Cyberpatrol's researchers as proxies for what they want their kids to look at when they themselves don't have time to go through and do it.

Now, it does perhaps raise questions of what the proper boundary of state action in this realm should be if the primary means of filtering is going to be through things like this being installed. And that was actually the question asked for this week, about what if it comes bundled with the OS or something like that. So I don't know if there's-- yeah, Ellory.

__: It also raises questions about the boundaries of those categories, though. And I'm not sure if I should ask this now. But why then are sites relating to breast cancer, say, blocked? If you're trying to create kindergartens or environments of education, why would a site on breast cancer be blocked?
Z: It might actually just be good to ask as a threshold question, are such sites-- if there were a site about breast cancer would it fall into any of the categories?
SG: No.
Z: It wouldn't be?
SG: No.
Z: And if it were blocked it would be a mistake. Somebody would be notified and fix it.
SG: The breast cancer example specifically-- and we could talk about it a bit-- is an old example. Because it did happen in the very beginnings of America Online. They actually did block, I think it was a message board, based on the use of the word breast. And it was a breast cancer message board. That's not how the majority of commercial filtering products work. They wouldn't block material about breast cancer. We wouldn't even block material about the therapeutic use of marijuana. We would block material about the non therapeutic use of it.
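The AOL story illustrates the difference between naive keyword matching and the human-reviewed site lists Susan describes. A minimal sketch of why bare substring matching overblocks (the banned word and the sample text are invented for illustration):

```python
# Naive keyword filtering: a substring test cannot distinguish a medical
# support page from pornography, which is why list-based products rely
# on human review of whole sites rather than word matching.

BANNED_WORDS = {"breast"}

def keyword_blocked(text):
    return any(word in text.lower() for word in BANNED_WORDS)

print(keyword_blocked("Breast cancer support group, Tuesdays 7pm"))  # True -- overblocked
print(keyword_blocked("Presidential pets of the United States"))     # False
```

A curated list sidesteps this failure mode entirely: a reviewer who reads the support-group page simply never adds it to any category.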
Z: That's amazing. I mean it really asks for a degree of attention of the researcher paid to just what is this site about. Can you partially block a site or does it have to be all or nothing?
SG: No, you can partially block a site.
Z: And that means certain pages versus certain other pages or you can block portions of the page?
SG: You don't block portions of a page. Not cleanly anyway. I mean some products talk about being able to grep out-- technical word-- peel out the nasty words and just leave you with the good ones. But I don't know if there is really any efficiency there.
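Blocking at the page level but not the word level suggests URL-prefix matching: a list entry can name a whole host or only a path under it, so part of a site can be blocked while the rest stays reachable. A rough sketch, with invented list entries:

```python
# Hypothetical page-level blocking via URL prefixes. An entry that ends
# at the host blocks the whole site; an entry with a path blocks only
# that section. Entries here are made up for illustration.

BLOCK_LIST = [
    "example-zine.com/adult/",   # only this section of the site
    "example-porn.com/",         # the entire site
]

def is_blocked(url):
    # Strip the scheme, then test each list entry as a prefix.
    bare = url.split("://", 1)[-1]
    return any(bare.startswith(entry) for entry in BLOCK_LIST)

print(is_blocked("http://example-zine.com/adult/pics.html"))  # True
print(is_blocked("http://example-zine.com/reviews.html"))     # False
```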
Z: How many people have complained to you, or how many complaints do you get in a given period that you've blocked my site and that's unfair, don't block it?
SG: A lot of questions. Always a lot of questions. Particularly in the early days of filtering people didn't know what does it mean when my site is blocked by Cyberpatrol. And I'm sure the other companies had similar experiences. But for the most part once you explain this is a product, people buy it, it's free choice, most people understood why their site was blocked.
Z: Do you write back to everybody who comes in with a complaint of that nature?
SG: Anyone who comes in gets a response. Then we've had a few relatively, shall we say, larger appeals. We have a whole appeals process in our product where we have an oversight committee who actually assists us. They are not employees. They are people from the community around where the company started. But they represent a variety of civic and political and social points of view. And they come together every quarter and help us deal with some of these issues, like why was my site blocked. In the past year we've had discussions from nudists. We've discussed that issue. The pagans, we've discussed that issue. We blocked the American Family Association for intolerance to homosexuality. That was not a popular decision.
Z: Yes, we saw their letter today about that.
SG: And the key that we try to do is be consistent in our application. And that in particular was not the easiest decision. I mean it would have been much easier to not stick to what we believe was right.
__: Are you blocking pagans?
SG: No. That was a discussion about (simultaneous conversation). One of the things, part of what we do, is we have people from-- in the case of those two things it was those two special interest groups coming in to explain what their group was about. The religious nature. So that our researchers would understand so that if we did hit a pagan site they would understand that it was religion, it wasn't something else. Because we do have a cult category.
Z: As a matter of company policy do you reserve the right to block any site if you so choose regardless of whether it fits into the criteria? Or have you made a promise only to block if you believe it fits within the criteria?
SG: Only to block if we believe it fits within the criteria.
Z: Does that include sites that tell you how to unblock the software?
SG: That's illegal. It's hacking. There's a category for that.
Z: We saw one example of just renaming two files that might disable one of your competitors, Cybersitter.
SG: Right.
Z: It's not clear that that would be illegal to rename the two files.
SG: You know, I can't comment to what Cybersitter might or might not do. If someone is hacking or hacking into our software we would block that site.
Z: And is illegality (simultaneous conversation) and gambling.
SG: And hacking is in there.
Z: So any site that talks about disabling Cyberpatrol is questionable or illegal?
SG: Questionable in my mind certainly.
Z: Let's pick up where we left off. Alex, did you get your dig in? I don't remember.
__: No, but it's all right.
__: I don't really know how this is related but--
Z: You always preface your comments with that.
__: I was just wondering, isn't this sort of a balancing issue? The Internet to me is different than the library. Because the Internet is not catalogued. So the problem that we ran into the first day of class when you typed in whitehouse.com and came up with a pornographic site. Which I'm assuming is blocked. And that's different than a kid who goes to the library. You're not going to just randomly come across a site like that because libraries are catalogued. So that if you want to find something about government buildings or presidents it's all in a certain section and you're never going to come across some sort of pornographic site like that.

I mean if you just randomly search through a library maybe you'll come across some books. But the books about anatomy are in a certain section, the books about presidents are in a certain section. And so isn't what Cyberpatrol really does sort of, in a way, cataloging things? It says that if you're just looking for books on presidents it's in this section. You're not going to come across the pornographic sites. So it prevents the kid who's really just going-- irrespective of the kid who goes to look at whitehouse.com-- it prevents the kid who really just wants to go to whitehouse.org. It says we will only send you here, to whitehouse.org. You don't have to fear that you're going to end up at a porno site.

Or even if you go into Yahoo sometimes, something that seems completely benign like--

Z: I just tried that with X-Men actually. And it turns out it just showed me X-Men.
__: Which is what you wanted so that you don't have to (simultaneous conversation).
SG: But Yahoo is not the best example of a search engine.
Z: I redid it with Altavista actually.
SG: I mean just because Yahoo does imply some editorial judgement on what they put on their...
__: Take any search engine.
SG: No, but your example, I think, is very good. Again, we're trying to put something in place that prevents the inadvertent, the inappropriate access by kids to whitehouse.com when they're trying to find out what the name of the president's cat is.
__: What would you say? I forgot your name.
JR: We have no problem with filtering software in the hands of parents and in the private sector. It's the state action issue that's our concern when you get to libraries. I want to make that clear. If people want to go out and buy software to filter for their kids, fine. And that's what Cyberpatrol-- we do have a problem when it gets into a public institution like a public library that is the resource in the community that one goes to get information to do research. We have a problem with that and it's a state action issue. That is, if someone is making decisions about what you are able to see and what you are able to reach. That's the problem. And we don't want to give the government or the government agents that power to be making those decisions.
Z: I'm really loath actually to take a break at this point since we got started a little late and we're on a roll. So let me just alert people who were waiting for the break to do some necessities that I welcome you to do that quietly now out of the room and that there will be a reception afterwards. So if it's just nourishment you want it's going to be brought to the door. So with that how about Antoun and then Paul.
__: I'm wondering, is there anyone in the room who is a parent? You don't count.
Z: Bill's a parent too. Bill McSwain.
__: He's six months old and doesn't surf the Web.
__: What I'm trying to get at, I'm curious as to whether real parents would find that the categories offer them the kind of granularity in judging what their child sees. I mean I'm trying to put myself in the position of having a kid who's of surfing age and think would I buy Cyberpatrol? And I think these categories are way too low resolution to capture the kinds of judgements that I would want to raise my child with. About what's good to read and what's bad to read and what's good to know that's out there but not necessarily something you should wallow in. This doesn't get it and I would question whether really any piece of software is going to get that.
SG: Absolutely right. And if they were just relying on Cyberpatrol or any other filtering software to perform that function for their family they would be making a serious mistake. If they're going to use a product of any kind, any kind of tool for that matter that sits in loco parentis, whether it's filtering software or the television or the computer itself, sending them to the library, whatever it is that's got that kind of role in their life, you ought to make sure that it's being used in a way that's positive-- and my company makes tons of other kinds of computer software. If this were all it would be wrong. The fact is that it's a pretty good and effective solution for parents who, in addition to raising their kids, probably have jobs and a lot of other things, and this helps them get started in any case.
__: But that's the ABC After School Special argument. I mean parents please watch this with your children. What percentage of parents who purchase your software sit with their children as their children surf or have some sort of frequent contact with their child that gives them a sense for what's going on there?
SG: I think a fair number. I think this is an engaged parent who buys or installs filtering software. It's not a casual parent. I mean if it were-- I think it's an engaged parent.
Z: Paul and then Jen.
__: I'd like to go back to the choice ... (inaudible) for a couple of questions. First is--
Z: One to a customer.
__: They're kind of related. Kind of. First is, would you censor-- excuse me, filter-- that ... (inaudible) to information that you would permit but a lot of links to information you won't? For example, if a site came to act as a resource for political dissent and it provided some links or a lot of links to sites about ... (inaudible) and porn, but also a lot of links to sites like the NAACP and similar sites. Would you filter that?
SG: Depends.
__: On?
SG: On the content on that particular site, the context of the materials presented in--
__: What if there's no content? What if it's just a list of links? Here's where you can go to get more information on this.
SG: You really look at each one of these things, especially a case like that, as an individual decision. And it would really depend, again, on what the organization was, what the site really was. And here's your example. If it was really a site about how to find information about making illegal bombs but there are a few links to some good organizations at the bottom just as cover, would you want me to not block it? I don't know. So you have to really look at what the content of the site is. The other thing is, generally speaking, sites that have just links are not the only way of getting somewhere.
__: They may be the easiest way, especially if--
SG: That's why you have to look at each case. That's question one. What's question two?
__: So you basically answered question one as to it's the judgement of the parent who's clicking on it.
SG: Right. And you can go in and say your kid gets blocked and they come to you and say, "Mommy, daddy, I got blocked on this site, I need it to do my homework." Parent has the ability to look at the site and say, "This is fine, we're going to allow it in our family" and allow the site. The parental judgement is an important part of the whole thing.
__: Susan, just to be clear, you do block links, not just content?
SG: We block content but if you had a site that was (simultaneous conversation) a bunch of links and had it organized in such a way that it was here's your resource for making bombs, it depends on what the editorial content is. If it's just links probably not. We're not going to junk up Cyberpatrol's list with a lot of link sites. But if it's something problematic that they're going to add to all the time, you're right. I don't know. That's why I say you have to look at everything in the context of what is on that site. It's very hard to answer any questions about would you block this, would you block that in the abstract, because we don't do it in the abstract. We do it by looking at a site and deciding whether or not it's going to meet the criteria.
__: The second question?
Z: Paul. When it comes around again. Come on, Jen.
__: What pops up on the screen when a kid types in a URL that's blocked?
SG: I think it now says, "Cyberpatrol check point" or something like that.
__: I guess I'm trying to figure out what your concept of the parent working with Cyberpatrol (simultaneous conversation) So ideally the parent is keeping track of what's been blocked and then--
SG: No, we don't do that. The concept is if the child is surfing away doing their homework on, I don't know, pick something, and they get blocked by a site, if the parents are doing what we hope they're doing in terms of working with their children, in terms of using the Internet in general, they're going to say, "Mom, I got blocked by going to www.dogs.com" and it turns out that it's OK.
__: But there's no way for the parents to know exactly what is being blocked by your software?
__: Does it log it?
SG: No, we don't log.
__: And how do you deal with the situation of dynamic Web sites? You said that you block the use of marijuana except for therapeutic marijuana. So how do you deal with a Web site that your researchers have seen and thought was OK at the time, but whose content has since changed, or vice versa?
SG: It doesn't happen as much as everyone claims it happens. Generally speaking, the content may change on a Web site but when you think about the overall goals and what the Web site was intended to achieve, that doesn't really change. So if it's a positive Web site about the therapeutic uses of marijuana logically it's probably going to stay that way. And the content may change but the goals of the Web site don't change. We do spot check certain categories all the time. And we have some tools that we go back in and use to see if sites are still there.
Z: Jen does raise an interesting question about a different kind of software that might come about. Not like you have now. That instead of, in the words of the Loudoun county court, it being a prior restraint to block these sites before they're even presented you can just imagine software analogous to the little thing you can put on a car where if your teen drives too quickly somebody can dial a number and report your teen and then you find out about it later and punish the kid. Similarly, you could imagine actually somebody that kept a fairly good history file and threw up a flag so the parent could say is there anything I need to worry about. And it would say yes, in fact your kid visited the following sites that fit under these criteria. And then the parent could see what was going on.
SG: Those products already exist.
__: Yeah, they exist.
SG: We made a conscious decision three years ago not to add that feature to the product that we sell into homes, because our belief was that if you are helping parents restrict children from accessing inappropriate content, kids are kids, but they have a right to privacy as well. So if you're by definition blocking them from the inappropriate content, we'd like to leave a little bit of privacy for the children, so we would not log.
Z: Joe.
__: Dolores ... (inaudible)
DG: Yes.
__: Are you familiar with the American Library Association's policy on blocking? As I understand it they've actually taken an extremely principled stand where they say the American Library Association believes that no library should filter.
DG: Yes.
__: So I was wondering how you saw yourselves in the context of that position. Are you an outlier?
DG: We discussed the ALA policy. And librarians generally have a rule against censorship. But we chose to make the decision to go against the ALA policy at that time and go ahead and do the filtering.
__: Do you think the ALA just had it wrong, was incorrect and ... (inaudible)?
DG: I think this is new to the library profession and it was a new issue for the ALA as well. And I think that probably in time libraries are going to start changing and maybe the ALA will change ... (inaudible).
__: I was thinking in the broader context of copyright and intellectual property in general. If we made all these specific niches for libraries for copyright protection for ... (inaudible) and other things but the libraries turn around and start to filter content on their own, should they still be available to use those doctrines? I mean should they be protected and be able to use copyrighted material and at the same time turn around and become censors of their own? I'm not sure that I'm making that question quite clear. I mean should we continue to protect libraries as a protected class if they're not going to distribute the information that they're supposed to be distributing to individuals. I mean the very reason why we have ... (inaudible) for libraries is so that everybody will have access to it. If you turn around and block access should you still have a fair use doctrine to protect you?
DG: But it's like what we discussed earlier. That libraries, we don't really call it censorship but libraries make decisions about the information they choose to put in their library all the time.
__: Sure. Then should you receive that information for free and have fair use? I mean the whole idea behind fair use is that the public has access to all the information. And if you're going to decide that they don't have access to parts of it should you benefit from a law that says you can have it for free?
DG: You're not necessarily talking about just the public. We're talking about children as well as adults. Adults can come and look at anything on the Internet that they choose to look at. It's more a protection tool for kids than it is, in my opinion, and it is my opinion, a censorship issue.
__: When we talk about kids we're talking about a very specific class of children. We're not talking about children that have the Internet at home obviously. We're talking about mostly inner city children who don't have access to that at home. So just another something to think about.
JR: Why didn't the library distinguish between a 13 year old and a 17 year old?
DG: We discussed that as well. Because normally the library does. We classify children as 8th grade and under. We classify young adults as 9th grade and above to high school. But we looked at the censorship issue as age 18 and under because of the classification of adult.
JR: Again, the American Library Association has taken a very strong position in terms of the access of library materials to children, to young people and to children. And that they can't make a distinction between a 12 or 13 year old and a 17 year old is kind of astonishing. We had an incident in Andover, the Andover library. This was pre-Internet days but (simultaneous conversation) two 13 year old kids wanted to go into the stacks and they had a policy there that if you're, I think it was 13 and under, 12 and under, whatever it was, you could only go into the children's room.

Again, that seemed to be very, very limiting for a 13 year old. And I think you have that in spades with the Internet. Because I know that at the library there's three classifications of cards. At the Boston Public Library. So you have the classification of a young adult. And I don't understand if you're going to have a policy why at least you wouldn't allow the young adult ... (inaudible).

DG: The Internet committee, as we were evaluating this, was looking at what Mayor Menino was saying and what the public attitude toward filtering seemed to be. And the public attitude, we determined, was that parents and Mayor Menino were not really distinguishing between young adults and children. It was all under 18.
__: I have another one.
Z: But it highlights it in this case. It's a decision made at the highest political levels of the city which is then handed to the library to implement and we even have heard (simultaneous conversation). And we've heard a story here today about a way of having the professional staff, as it were, nibble away at the originally declared policy by the mayor. It's not going to be every computer. It's just going to be in these two categories or something. And that may well represent tradeoffs among the professional staff as against their boss, their elected boss by the citizens of the city, as to how they want to do it. And whether that's a problem or not of course depends on how you feel about the policy. And what the right feeling about the policy is again something either left to the polity or left to something that has a floor when you talk about rights. No matter what the polity wants to do we have a right to X, Y or Z. Jennifer.
__: I just have a question about the potential impact of Cyberpatrol and if you've already seen this happening or if you expect to see different sites actually altering their content out of the reaction of being blocked by Cyberpatrol. I'm thinking in particular of an analogous situation in real space where we see like movie producers that choose not to include certain content in their films and they can get a PG rating. Do you expect that this is something that might happen?
SG: No. The extent that we've seen of that is "We were blocked by Cyberpatrol." But other than that I don't--
Z: Has anybody asked you to block them on general principles even though they have something entirely innocuous?
SG: Actually we've been asked to block sites all the time.
Z: By the sites themselves?
SG: By the sites themselves. Particularly in the time period of the first Communications Decency Act. Tons. Pornographers were standing in line to report themselves to the Internet filtering products. Because that was their best defense against being prosecuted. Or they felt it was a defense against being prosecuted. I am not a lawyer.
Z: We know that there are affirmative good faith defenses that were discussed in the CDA that was struck down and/or in the new Child Online Protection Act, which says, among other things, there's a catchall third. And our moot court people may be thinking reminiscently about this. There's a third that says you'll use available technologies to make a good faith effort to block your harmful to minors material from minors. And it does seem that you could imagine a kind of synchronization between the government passing that law and a company like Cyberpatrol that can then offer itself as a way for somebody running a site to make the good faith effort. To actually have them working in concert. Which again raises the state action question about how easily you can pull apart the puzzle and say that piece isn't state action, that piece isn't, but wait, when it starts at a library now we worry.
JR: And librarians have a ... (inaudible) both under the assembly statute and under the harmful to minors statute.
Z: Todd.
__: I was just mentioning this to Joseph. And having sat through this debate for like four years, the thing--
Z: Come on, it's only been an hour and a half. (simultaneous conversation)
__: It's been going on for like four years. I've seen so many different permutations on it that the one thing that I'm increasingly finding makes some sense, although it's a technological solution ... (inaudible) is something like a dot xx or dot sex domain. Because there's a type of thing to say isn't that going to require a form of prior restraint because people are going to have to self select? My feeling is that on some level if you just build it they will come. Because I mean exactly the kind of reaction you saw is actually quite common. I've spoken with people in the adult sites industry and they would love to have something like that. Because from the way they look at it, most of them, is if you have kids visiting the site the cops are right behind the kids. And they just don't want-- they want to just run a business. They're not really interested in the kids. So if there's some really relatively straightforward way to keep the kids out they're actually pretty game for that. (simultaneous conversation)
SG: Commercial pornographers, people who are running a business, would welcome an xxx type domain. My personal problem with xxx or any kind of labeling scheme that selects people into an environment is if your objective is, back to where I began, is to help parents protect their kids you're not getting all the stuff on the fringes. You're not getting the amateurs and the amateurs are worse than the professionals. You're not getting autopsy photos on line, you're not getting things that are far more disturbing to children than a little tits and ass. You're getting real problematic stuff. And those people are never going to self select into any kind of domain because they're the folks that call their sites whitehouse.com and bambi.com because they want to titillate and they want to shock.
Z: And also it points out that there are business models that don't require a cash transaction for it to be profitable for, at least allegedly profitable, right? We don't know if Web advertising is going to work yet.
__: But under the Carol (?) legislation, for example under CDA2, the amateur sites are not necessarily prosecutable. Assuming they don't carry any sort of advertising.
Z: That's the question. Are they commercial sites?
__: So if there is no money being transacted then the law--
Z: Then her answer is, which I suppose I shouldn't offer for her--
SG: You can give it because I've heard it three or four times. It's about helping parents and teachers protect kids.
Z: And she says then COPA doesn't go far enough because it only covers the commercial (simultaneous conversation).
SG: -- do anything to that objective. The laws don't really solve the problem that we're trying to get at, which is where our space is at, which is the issue of parents and kids. So the laws are almost noise to some degree.
Z: Greg.
__: I've been thinking for a while here. How is this technology any different than what government does all the time? Government, at least to me, in several cases says that children aren't suited to hear such material and (simultaneous conversation). That's why you're not allowed to play Love Lines at 2:00 in the afternoon. The FCC says that if you're going to have a broadcast like this it's got to be after 11:00 and up until 5:00 in the morning. It's got to be between these hours.
Z: This is partially a doctrinal question which Andrew McLaughlin is incredibly well suited to answer, having worked on the CDA case for ... (inaudible).
__: Right. So the CDA case boiled down the question of like is the Internet more like a telephone, a newspaper or a television? Because those are areas of first amendment doctrine that are well established. And so the Internet purist answer was, "Dummy, it's not like any of those." But the position that we took in the litigation and the Supreme Court bought was it's like a newspaper and then some. And what you do is you look at the technological features that underlie each of those three technologies. And you say a television is limited band width allocated by the government for the public interest.

And it's got the channel dial problem, that a kid can just turn the dial and skip past anything that's on there. So they said in the case of TV then we think it's reasonable to protect kids by saying that you can limit certain kinds of shows to certain hours. They've never actually taken the extreme position of saying the government can prohibit certain shows from being shown. They've said over the public air waves which permeate every household in America you can have certain kinds of time, place and manner restrictions. But then--

__: These are basically the two doctrines that make it legal to censor television at all, which are the pervasiveness doctrine and the scarcity doctrine. Am I correct?
__: Yeah.
__: So the question is do those doctrines apply to the Internet? And the court said no. I mean again, the pervasiveness argument has nothing to do with how many people have TVs. The pervasiveness scenario is you leave Junior in the crib with the TV on and you left and it was Captain Kangaroo but by 2:00 in the afternoon it turns into Deathrays 2000 or something like that. So again, it's like coming at you. And the argument was no, you have to affirmatively request a Web site. So the content isn't changing without you actually doing anything. So no, it's not pervasive.

And then there's the scarcity argument. Is there a band width scarcity? Again, in terms of the original context of 13 channels. Then the answer was no. So that was the reasons why it was decided that no, you can't apply the same sort of, again, time, space and so on and so forth restrictions to the Internet.

Z: Geocus (?).
__: I want to go back to Cyberpatrol. You said that it's really subjective and really hard to choose which sites you want to filter.
SG: I didn't say that but that's OK.
__: I thought you said there are certain sites that are really hard to decide whether they're--
SG: Some things are more clear cut than others.
__: So I wonder whether you have any sort of procedure to review your reviewers.
SG: Yes.
__: You do?
SG: Our reviewers all obviously operate independently as they're doing their job. But part of the process is, and particularly in the categories that we call the gray areas which are not (simultaneous conversation). It's not too hard to figure out what you're looking at. So most of that is relatively straightforward. When you get into the gray areas, is it intolerance, is it satanic cult material, whatever the various things are, there's a policy that they go through and they review those as a group to make sure that everyone agrees that the decision that was made was the reasonable one. And then on top of that, that's just what we do before we put it on the list. Then there's obviously the whole process of appeal which could happen afterwards if a site was put on the list.
SG: But they don't know. They don't receive any kind of notification. But they can certainly find out. There's tons of folks whose business in life it is to inform people if they're blocked by filtering software.
Z: Leah.
__: It seems like to me, going back to the last question, that there's another distinction between what's going on in the Boston Public Libraries and what's going on in government regulation of speech in traditional media in terms of time, place and manner restrictions. And that's that it's not really the government making the determinations here. It's they've effectively delegated these constitutional determinations to a private entity, Cyberpatrol. And the problem that I see is that, it goes back to the secret list. There's really not an easy way for the library to know whether or not-- I mean there are two separate lines. There's a line that Cyberpatrol is drawing and there's the constitutional one. And it's not a government employee in any way determining whether those two lines are the same. So I guess I'm wondering if there's any way that the government, under limited circumstances, might get a list from Cyberpatrol of the sites that are blocked, if there were some way that the government could keep it confidential.
DG: I'd like to see it. I mean it was a concern of ours too that we were delegating this responsibility to someone else. And we felt like we didn't have as much control as we would have liked. And those of us on the Internet committee, especially when we first started using Cyberpatrol, did a lot of testing on our own. Surfed sites for information, checked to see if the old example of breast cancer actually did work the way it was supposed to. We have ways to correct the problem if it occurs to us. But I would like to have seen the list myself and had a little bit more control and a little bit more knowledge of what it was.
Z: In your absence we surmised that the reason the list is a secret, a trade secret at that, is because it represents the sweat of the brow of the researchers who have painstakingly put it together. Is there any other reason?
SG: Sure. There's a really big reason. The two reasons are the one you point out, that it's proprietary information. It's taken a number of years to build and it is something that is part of our company's revenue line. However, it's also a list of inappropriate sites and we really don't want to publish a directory of inappropriate sites.
Z: Like the-- If you saw it you'd see you wouldn't want to see it.
SG: So we also would make it available under the appropriate circumstances, for appropriate reasons, to the government. But they haven't asked. So there you go.
__: I mean it seems to me it could be important to look at, maybe because of some of the concerns about what is blocked. For instance, you get the criticisms, and I don't know whether this is still true of Cyberpatrol, but the criticism that those who criticize the blocking software people, if you are critical of what you're doing then your site is blocked. A number of people that have, Peacefire and some of the ones that are really out there--
Z: Is that true? If I were to make a site that were critical of filters but didn't include how to crack them would that be questionable?
SG: No. And as a matter of fact the specific example you gave is absolutely true. Up until the time that Peacefire published a way to crack Cyberpatrol it was not blocked. But when it did we felt that that merited our category and so we put it on the list. So the criticism is fine. It's just our right to protect the parents who trust that our product is protecting their kids. And we've also indeed fixed the problem that he exposed as well. So there you go.
Z: Let's get Tim and then Alex.
__: I think what bothers me and building upon Leah's point and Greg's point is it seems like not only is there a delegation but perhaps a difference between the different companies, like Cyberpatrol and Netsitter and so forth, is that you're really selling a political ideology in a way. Because I agree with you that there are some sites which are clear cut. This is just not appropriate. But you said that there's a gray area. And I would imagine it's perhaps the gray areas, especially on the Internet where everyone has their own viewpoint and so forth, is a lot bigger or at least as big as the very clear cut area. And so then the question becomes OK, a judgement call must be made.

As Greg was saying it's a balancing. A judgement call has to be made as to where the gray area begins and ends. What's clearly not appropriate and what isn't. And at that point political ideology comes in. What is creating a balanced viewpoint to some person is actually-- what is intolerance to one person is actually just the alternative viewpoint. And the same reason that the Supreme Court said Nazis are allowed to parade in Skokie is because even though it's offensive to other people this is their constitutional right to be heard and to parade.

SG: And that's why you can both select which categories you want to use-- so if you're using this in a public setting, and indeed most libraries that use our product, use Cyberpatrol, only use those four or five categories, or a maximum of those four or five categories, that relate to the sexually explicit material. Very few of the public libraries that use our product, I can't speak to any others, actually use the other categories. I know Boston doesn't. Austin public libraries don't.
__: So in fact only the clear cut areas are the ones--
SG: If you want to call them clear cut. I mean only the ones that are clearer. I mean the fact is partial nudity and nudity is pretty easy to determine if it's there or not.
__: But even let's say things like the famous Supreme Court case with G strings and pasties. I mean do you consider that-- you make a political or an ideological decision that a certain amount of nudity, a line has to be drawn at some point, that that amount of nudity, only showing her whatever, is wrong, it's inappropriate. And especially, I think it's even more troubling when a public library which is being funded by the city is delegating an ideological decision to a private institution. They're saying to you, "Look, we're going to buy your software and you've made the line drawing here. You made the line drawing in these areas." Which I can argue till the day is long that there are gray areas in these categories that you're saying, even those ones that libraries are picking out. And that's troubling to me that my tax dollars or you-- not necessarily you, I'm sorry--
Z: But your best friends.
__: Private companies are making these ideological decisions for the state in state run, state funded institutions.
Z: But notice that in constitutional discourse we have a word for these kinds of ideological decisions. And that's content based or even, god forbid, viewpoint based. And as soon as something hits in that category and you have government making decisions on the basis of it, or delegating such decisions, that raises all sorts of strict scrutiny red flags. But now when you narrow it just to the question of pornography the decision as to what is obscene and what is not, if it turns out to be obscene it's not a content based thing because we know from our doctrine that that's just non-speech as far as the constitution is concerned, at least as far as the Supreme Court is concerned.

So under American law, and realize too we haven't even hit the international angle that for any given country there's going to be some non-ideological hot button that for us is pornography, at least under American law you could argue that a decision whether to filter out obscene or harmful to minors stuff is not as much a content based decision. Even though we know you're looking right at the content and making a decision based on it.

__: But in those cases the court or a state agency--
Z: No, big difference. The court is always entitled to make content based decisions about content. That's what they do. State based agencies are going to be scrutinized.
__: But at least they will be scrutinized. There is some scrutinization [sic] of those decisions. And in this case there is none. The scrutinization process occurs at Cyberpatrol's offices where their people are making those decisions. That's where it's going on. There is no-- they said your internal appeal process and review the decision. But where's the-- this hasn't, maybe it has come up in Loudoun county but the point is--
SG: Not our product.
__: But the court is not in their offices deciding whether their scrutiny of something as, or their definition of something as obscenity actually is right.
Z: And you see it happening--
SG: We're also not dealing, and never claimed to be dealing with obscenity. Remember, we don't claim that we're only blocking or that we can only protect you from obscene material.
Z: But you do claim to at least do that, right?
SG: I don't think you'd find anything in us that ever really refers to that particular standard.
Z: So as not to, if you fail in that, somehow be in trouble or have somebody who was relying on you be in trouble for obscene material (simultaneous conversation).
SG: Actually focusing just on the fact that what we're really dealing with is inappropriate for children. That's--
Z: Surely material that is of no artistic, literary, scientific merit--
SG: In which community? I'm not going there. We're not going there.
__: You're not arguing that obscenity, ... (inaudible) obscenity would be, something within the circle of obscenity would also be within the circle of something appropriate for children?
SG: No. I'm making the clear statement that filtering software that would claim to only block obscene material would be saying, would be untrue. We do not claim that.
Z: But she's also making the claim, I think, that if it is obscene she can't say whether it's appropriate for children or not because the--
SG: I can't say whether it's obscene. Can't do that.
__: You don't make legal determinations.
SG: Don't make legal determinations.
__: You just describe what's on the pages in 12 categories.
SG: Yeah.
__: That ... (inaudible). I mean I know you said the government wouldn't want to look over your list but it seems like they have a duty to. And as you pointed out, those libraries take four categories, not 15. So I'm sure that would narrow the list somewhat and they could review some of the sites without necessarily visiting all.
Z: In your conception, Leah, could the government add to the list or just take away? Could the government say, "You forgot to add the following sites"?
__: Isn't that technologically possible?
Z: Sure. I'm just wondering. You're thinking of it being a better situation in which the government is standing over their shoulder and editing the list.
__: The reason it's better is because the government is constrained by constitutional principles.
__: Right.
__: And you can sue and do those type of things.
Z: In which case they could add but any decision to add would be known, it wouldn't be in the proprietary part of their list. And then we'd have four year litigations over a given addition to the list and we'd all feel protected.
__: Yeah.
Z: OK, gotcha. Alex.
__: If there's two things I've learned from this course one is the benefit of Fox news and the second is (simultaneous conversation) three people, three types of people who use the Internet. And the first are kids, which we've obviously discussed. The second are pornographers, which we've obviously discussed. And the third are gamblers and gambling sites. So the question is this is a way in which the Internet is not like the newspaper. What would the three panelists say to gambling sites? Does the librarian want to block people gambling in her library? Would Cyberpatrol be able to say whether gambling is legal or illegal in a nationally distributed product? And would the civil libertarian have any trouble with blocking gambling sites in a library?
__: Isn't that illegal?
Z: Not everywhere, right?
__: In this country, yeah? (simultaneous conversation)
Z: Last time I was in Las Vegas there was some--
SG: I meant Internet gambling.
__: No.
__: Internet gambling is only illegal if it is illegal in both jurisdictions where the person is placing the bet and where the server is located. So if gambling is legal in say California and Texas ... (inaudible) and the server is in Texas and the bettor is in California then it is completely legal.
__: One particular kind of gambling, for example horse racing. I mean it's type of gambling by type of gambling.
__: Cyberpatrol says ... (inaudible) distinction but you call hacking illegal.
Z: Here's the criteria right here. Pictures or text advocating materials or activities of a dubious nature which may be illegal in any or all jurisdictions, such as illegal business schemes. So they're saying at least it might be. They're not making a final determination but if it might be, you could imagine them saying if it might be obscene and similarly crossing that line. They don't want to do that, though.
__: Civil libertarians have no trouble with library (simultaneous conversation).
JR: No.
__: And why is that ... (inaudible).
JR: We think the Internet is a tool that people ought to be able to freely use. And the library, the government, shouldn't get into it. They shouldn't be policing it. So that if someone gets on the Internet and uses it for whatever purpose then that's their, they're free to do that. There are things that you can do on the Internet that are illegal. I'm not saying that they should be able to do that. You can't access child pornography and there are certain things that are crimes. But certainly if someone gets on the Internet and is legally gambling, why not? Or looking at sexual images and not bothering anyone, why not? Why should the library or the government or anyone else give a damn about that?
SG: And from my point of view, the legality of gambling in certain jurisdictions for adults is maybe different in different places but gambling is always illegal for children. So circling back to where I come from, there is no question about whether or not I would block material that was of gambling nature. And that's why it's a category on the list.
Z: Andrew.
__: I have a fact question for Dolores. At the adult terminals is there any image or words that the library would stop me from viewing?
Z: Assuming you're over 18.
__: Yeah, let's assume I'm an adult and I'm sitting there. And I ask this not in an aspirational sense, like I'm hoping to find out what I can get there. But is there anything on the basis of which content I would get kicked out of the library?
DG: I mean I can't speak for all 26 branches. I can speak for the branch that I work in. If you come in you can, as an adult, you can look at anything you choose to without being asked to.
__: That implicates then the is the Internet different point. And your earlier point about whether or not this is like, bringing the Internet into the library is like buying books. Because there's actually a somewhat more charitable view of the first amendment doctrine that you criticized as being incoherent with regard to libraries. Which is that the law simply says that the library has to be content neutral and viewpoint neutral in all its decisions. And it's, I agree, somewhat conceptually tenuous to say that the act of choosing which books to purchase is content neutral. But the courts, and at least Justice Brennan in his, I think it was the Pico (?) decision--
Z: Right. Pico v. Board of Education.
__: He basically skips over that by saying that what they're doing is buying books on the basis of their presence in the canon, the best-seller lists. That there are these objective, non-viewpoint-based and non-content-based reasons that you go out and purchase books. In essence what they're doing is applying policies, even if they're unwritten policies, about which books are of the quality to go into the library. And so it seems to me that the Boston Public Library is actually adhering to this policy of content neutrality in saying that adults can go and look at anything on the Internet terminals. I just wanted to point out that it's not necessarily as weird a doctrine as you make it sound.
Z: I guess whatever contradictions it has, trying to distinguish between acquisition and then selection once it's acquired, are more clearly highlighted in the networked context, where you're not sure whether it fits into the box of acquisition or of selection to filter something out at the door as somebody is asking to have it brought in for them, when it costs the library nothing to bring it in. But even on Brennan's account you might say it's clearly a fiction. It's a legal fiction indulged in, perhaps, for good reasons. There might be libraries that really do consult completely content-neutral criteria. What that would mean is that librarians are so ignorant of the content of the books they're ordering that they say, I guess we'll just get whatever is on the list. You know, what got circulated here today?

But there also would be librarians who are intimately familiar with what they think would make for a comprehensive community library and what wouldn't help so much in that regard. And those decisions, even if Brennan knew that, aren't meant to be taken off the table as something the library is entitled to make. Surely it wouldn't be an adverse admission by the library to say, "Yeah, I read that book and I thought it stunk. That's why I didn't get it." Even though that's clearly a content-based distinction. Meg, did you have something?

__: I was just going to say, one of the librarians I spoke to at the BPL main branch said that one of the concerns they have is that if a patron comes in and is looking at pornographic images on a library computer, does that raise issues of creating a hostile environment for other patrons and for library employees? Does that make the library responsible for the content being displayed to the entire library, because the computers are in the middle of the room?
Z: And we saw that in Loudoun where they discussed whether there would be screens to protect people.
__: And they shouldn't be in the middle of the room. I mean that's easily dealt with. And I think libraries are beginning to deal with it, so that the computers are now actually flat and you look down at them.
Z: It's more and more like a peep show now.
__: That's the concern, and there's a privacy interest here also. Then you've got to deal with the privacy interest. But that should not be used as a censorship argument.
Z: Right. And also notice that earlier we read Julie Cohen's right to read, or at least right to read anonymously. The Loudoun decision talks about a policy that's more restrictive than the Boston Public Library's: it basically doesn't have the two-tier adult-kid system, and it was struck down by the district court again recently. In that instance, I mean I don't know, since it's restricting everybody, they look at it as a limited public forum in which speech is being restricted by the government and therefore it's not allowed. If you actually cut through it all and talked about a right to read, it would make that case come out much more neatly, even though the judge got to the same kind of result using rights to speak and rights to a more generic forum.
SG: I don't have my notes with me. I was talking to somebody today about, didn't Loudoun just revise its policy, like two days ago?
Z: I think they just canceled all Internet access.
SG: No, I was talking to someone from NPR today and he said that they had a meeting a couple of days ago and had just come up with a new policy: two tiers of access, and they were looking at privacy screens. But one of the issues that came up with the privacy screens was that if a parent was in the library wanting to work with their child on the computer, the privacy screen could potentially cause a problem, because you have to be pretty much dead-on straight for it to work. I can't remember all the details because I don't have my notes now. But they have got a new policy that they're going to be working on.
Z: The last I had seen there had actually been some controversy because they just pulled the plug on everything and the judge was not (simultaneous conversation).
SG: -- but I would be lying if I said anything more than what I said.
Z: I want to wrap this up by doing a couple of things. One, I want to ask Todd Lappin, who has been following this debate for a number of years as a writer and editor: what do you think is the next phase of this debate? To the extent that what we've just heard today is the familiar battle lines being drawn, familiar arguments lodged back and forth, what awaits us around the corner that amounts to new dynamics in this area?
__: Let me just think. There's always the ongoing death by 10,000 cuts, which we were talking about, where you haggle over a million and one different specific circumstances. Interesting cases will be, perhaps again, broadband access through-- I mean you could see something, for example, like let's say you have a set-top box and my service provider is @Home. Does that somehow change a pervasiveness scenario? Does that somehow change the dynamics of--
Z: Once it's Web TV isn't it much more TV?
__: Once it's more Web TV, or some medium that actually looks an awful lot like television, does that start to change the legal standards? I mean it's always the technology question. What's technologically possible is always what determines what's going on. I was just talking about this with Professor Lessig last night: for example, in the COPA TRO they basically stipulated the 1995 definition of what the Internet was like. Which I thought, hey, that sounds great, because those were great stipulations. But he said arguably it makes the TRO more vulnerable, because then they can come in and say, "The Net is not the same. There's all sorts of things going on now that we couldn't do in 1995." In 1995, those of us who were following it knew there was a lot of wink-wink going on in terms of what's technologically possible.

And so again, for example, take push. We say push is dead now; it didn't really go anywhere. But push is a great example: at the same time push technology was emerging, there was an argument going on that (simultaneous conversation) affirmatively request it. We're all looking at each other going, "Jesus, I hope they don't find out about push technology." Because push is exactly, that's exactly what it is. You're getting stuff pumped at you and you're not picking it. So it's always, where's the-- and in the same way that I don't have any solid sense of where the technology is going, I certainly know that there's all sorts of possible configurations.

Z: Can you tell us what the state of play is? And maybe you could fill us in too, Susan, quickly on the likelihood that say an upcoming version of Windows will fairly tightly integrate into it perhaps even a choice of a particular filtering company. And then when you buy Windows and you configure it all you've got to do is check a box and you've already installed and bought--
__: Susan probably knows more than I do.
SG: I'll tell you that generally speaking, and this is from our experience, the large online companies, whether it be Microsoft or America Online or anybody, really want to have an arm's-length relationship with these kinds of decisions, for the reasons we've been talking about. And so they prefer to go to an outside company. America Online uses Cyber Patrol's technology within its product, and they're one example that prefers a separate company doing that work. So I don't know that you'll see anything more than what you already see in browsers or online. Which is tight links to commercial products, or noncommercial things like labeling systems that aren't associated directly with companies. (simultaneous conversation)
__: PICS was the obvious thing that I was thinking as well, which is integrated into Internet Explorer. Of course nobody is really using PICS but I mean--
SG: That's why I didn't talk--
Z: Poor Joe, who worked on PICS.
SG: Was it your baby?
__: That goes back again to what I was saying, which is you cannot define what the Internet is. I mean that's the thing. So it becomes this rhetorical straw man that we can say, "We know what it is now." But again, one of the arguments that's being used to possibly challenge COPA, by Professor Lemley and Professor Lessig, was actually Professor Lemley's argument, which is theoretically we can create tokens. You know, a token that sort of says, "I am a child." And if you don't have the token it assumes you're an adult. So OK, yes, theoretically that's totally possible, and he says that would be a less restrictive means of effectively achieving the same result.
SG: But unacceptable to parents, because no parent is really going to want to identify, for whoever else might come to know that bit of information, that there's a child at the other end of the computer. But it raises the other question, which we haven't addressed and this session wasn't designed to talk about, which is the privacy of children. I mean if you identify that there's a kid here, that opens up all kinds of parental fears. Real or unfounded, it doesn't matter; they're still there.
Z: So let me, I guess, just leave us with the thought that, in a sense, squaring the civil libertarian with the librarian, or with a politician who's taking a particular view rather than this librarian, is in a way the old battle line. And it's one that is argued using familiar forms of doctrine. I mean if you read the Loudoun case you'll see all the things about standing to bring the suit, and then walking right through all the First Amendment doctrinal boxes to see where the policy fits. With it being obvious that there's state action, because once it's in the library, the public library, it's the government involved.

And a theme that I think has run through the day today is that the real battle lines of the future, I think, are not drawn in such clearly governmental contexts. Instead the government is going to be able to encourage certain behaviors in the private sector, be it through CDA2 or other forms of legislation or cajoling. And when that's done, you have the private sector taking on responsibilities that do seem somewhat government-like: deciding what is harmful to children and what is not, decisions normally left, say, to school boards and librarians. As they take on those decisions, they even evolve structures that start looking very administrative-law-like: we have an appeals board, and then we have a committee that looks at it, and if you file this we're bound to listen and give you a reply.

It's still all a new rubric, as they will no doubt try to argue, of private actors doing private things, were anybody to try to lodge a constitutional challenge, either about being blocked or about having a right to see that which their parents, or some other private actor rather than their public librarian, has sought to block from them. And whether that's a way of escaping the First Amendment values that we thought were so important when the government was the steward, just by eliminating the state action question, is a good question. So with that, I think there's food outside by now. And please join me in thanking our guests for a great day.

Don't forget to sign up for the focus group if you want. Thank you all very much. Please drop the papers off in front so we can do the magic exchange.