Introduction
Using the Internet depends, in the first instance, on access to
the network. The Internet initially emerged in the early 1990s
from the increasing connectivity of a series of university and
government networks, alongside private services like America Online,
Prodigy, and CompuServe, almost entirely across slow dial-up modem
connections over telephone wires. Sufficient for email, Usenet news
groups, transferring relatively small files, and later viewing simple
web pages, slow transfer made consumption of data-rich content
infuriating and its provision unprofitable.
There was, however, an important compatibility between the Internet
architecture and the plain old telephone system. The basic protocols
of the Internet (click here
for a tutorial on TCP/IP) treat all information as equal. They do
not recognize rich content or poor content, content owned by one
person or another. So too, the basic telephone network, because
it is regulated as a common
carrier by the FCC, was
required to treat all these data calls alike. The combination meant
that in this new medium, unlike in the mass media of the 20th century
(television, cable, and newspapers), no one had much of an advantage
over anyone else in communicating their views to the world. The low
bandwidth available also meant that production values (expensive
sets and cameras), which also limited access to the opportunities
to speak in traditional mass media, were less important. The result
was a substantially more egalitarian communications medium than
any the 19th and 20th centuries had known. For a while, and for
limited communications applications.
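The claim that the basic protocols treat all information as equal can be made concrete by looking at what an IP header actually carries. The short Python sketch below builds a minimal IPv4 header with invented values; note that its fields are only addressing and transport metadata, with no field for the kind of content, its commercial value, or its owner.

```python
import struct

# A minimal IPv4 header for a hypothetical packet. The fields are:
# version/IHL, type of service, total length, ID, flags/fragment
# offset, TTL, protocol, checksum, source address, destination address.
header = struct.pack(
    "!BBHHHBBH4s4s",
    0x45,                   # version 4, header length 5 words
    0,                      # type of service
    40,                     # total length in bytes
    1,                      # identification
    0,                      # flags / fragment offset
    64,                     # time to live
    6,                      # protocol 6 = TCP
    0,                      # checksum (left zero in this sketch)
    bytes([10, 0, 0, 1]),   # source IP (invented)
    bytes([10, 0, 0, 2]),   # destination IP (invented)
)

fields = struct.unpack("!BBHHHBBH4s4s", header)
# Routers see only addresses and a protocol number; nothing in the
# header distinguishes rich content from poor, or one owner's data
# from another's.
print(len(header))   # 20-byte header
```

This is, of course, only the network layer; the point is that content-based discrimination is not built into the protocol itself.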
In the mid-1990s, as the number of users grew and the graphical
user interfaces to the World Wide Web emerged, the variety and complexity
of the applications that could be used over the Internet increased
dramatically. While modem speeds also increased, there were serious
limitations on what traditional phone lines with traditional modems
could do. These constraints, and the prediction that the Internet
would make the market in high speed broadband access
(synchronous transfer speeds in excess of 200 kilobits per second)
immensely valuable, led to substantial investments and something
of a race to capture the broadband market. The vision
for what was then called the National
Information Infrastructure (NII) was that an entirely new network,
eventually all in fiber optic, would one day emerge for high speed
communications. In the long-haul network this was already happening,
with redundant fiber optic systems in place from AT&T,
MCI, and Sprint, as well as others. The problem was how to fiber
the last mile, and the U.S. strategy in the first half of the 1990s
was that the two main players, the telephone companies and the cable
companies, would enter into competition with each other in all services,
which would eventually result in two high-capacity fibers being
connected to every home.
This vision for the emergence of an NII was an important driving
force behind, or at least an important justification offered for,
the passage of the Telecommunications
Act of 1996. The idea was to allow incumbent local exchange
carriers to enter into the long distance and video delivery markets,
to allow the cable carriers into the telephone market, and to allow
the long distance carriers into both. In this competition of all
players for all services, prices would decline, and redundant ubiquitous
high-speed infrastructure, it was thought, would be built.
As things progressed over the course of the later 1990s, however,
it became clear that most high speed access services would be delivered
initially through a variety of innovations in how legacy infrastructure
(the copper wires of the telephone system and the coaxial cable of
the cable companies) was used. Competition increased in some areas
of telephone service, but local telephone and cable markets seemed
resistant to substantial competition. In particular this seemed
to be true in competition for high speed Internet connections to
residences. Owners of existing conduits to the home have a clear
advantage, but their networks require adaptation. Working on the
assumption that whoever controlled the physical network would have
a head start in the 'battle for eyeballs', consolidation followed:
AT&T
sought to enter the local loop by expanding its cable network through
the acquisition of TCI and Media One. AOL
merged with Time Warner. The local exchange carriers merged as well:
the seven Regional Bell Operating Companies and the largest non-Bell
local exchange carriers (GTE and SNET) have since combined into
four: Verizon, SBC, BellSouth, and US West/Qwest.
In the most recent FCC
report on high-speed access, the Commission estimated that there
are currently 9.6 million subscribers to high-speed data services,
of which 5.2 million, roughly 54%, subscribe through a cable service.
Only roughly half of that, 2.7 million or roughly 28%, subscribe
to DSL; of these, 93% subscribe through incumbent LECs.
The remainder of the lines are almost entirely business and university
oriented T1 and better lines. Satellite and fixed licensed wireless
put together, the two other sources of high speed connectivity often
cited, account for 0.2 million lines, or roughly 2%.
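The market shares in the Commission's figures follow directly from the subscriber counts; the arithmetic can be checked in a few lines (all numbers in millions, as reported above):

```python
# Reproducing the shares behind the FCC's high-speed subscriber figures.
total = 9.6                # total high-speed subscribers (millions)
cable = 5.2                # cable modem subscribers
dsl = 2.7                  # DSL subscribers
wireless_satellite = 0.2   # satellite and fixed licensed wireless

def share(n):
    """Percentage of all high-speed lines, rounded to whole percent."""
    return round(100 * n / total)

print(share(cable))               # 54% of high-speed lines
print(share(dsl))                 # 28%
print(share(wireless_satellite))  # 2%
```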
The question of access to the legacy infrastructure has been pervaded
by concern over attempts by incumbents to use their position to gain
control of the market for high-speed Internet access, whether over
telephone or cable facilities. The Telecommunications Act of 1996
itself required invasive regulation of the incumbent local telephone
companies' relationships with their competitors in order to assure
reasonably equal access to competitors. Its implementation has been hard fought
and difficult. As for access to the cable infrastructure, the need
for regulation has been a central issue of public policy debate
in 2000-2001. Section 706 of the 1996 Act (see notes to 47
U.S.C. § 157) directs the FCC to ensure the reasonable
and timely provision of advanced telecommunications capability
and to take steps to accelerate its deployment if necessary. In
its First
Report under this section, in February 1999, the FCC concluded
that no regulatory action was warranted with respect to broadband,
finding its rate of deployment adequate. A Second
Report, based upon more extensive data, was issued in August
2000. One month later the FCC released a Notice
of Inquiry Concerning High-Speed Access to the Internet over Cable
and other Facilities, leaving open the question as to whether
access to cable infrastructure would be mandated.
In the early part of 2002 the Commission initiated a major push
towards resolving the regulatory questions it faces. It issued a
Third
Report on deployment in February 2002. More importantly, it
issued a series of proposed rulemakings regarding cable,
incumbent
telephone carriers, and in particular the unbundling
requirements placed on them. It is also attempting to develop an
appropriate general
framework for wireline-based broadband. These seem to be
tending towards skepticism with respect to open access regulations,
and an increased reliance on competition between cable and telephone,
now increasingly being called "intermodal competition,"
rather than competition among the incumbent platform owners and
unaffiliated ISPs on both the telephone platform and the cable platform,
now usually termed "intramodal competition."
The difficulties raised by providing open access over legacy
infrastructure raise the question of whether there are alternatives.
Perhaps the most promising avenue to circumvent the problem of the
proprietary last mile has been the emergence of the possibility
of license-free wireless infrastructure development, which is explored
in the third segment of the module. Another, as yet undeveloped,
approach is the deployment of publicly owned and operated infrastructure
at the municipal level, perhaps using the sewage
systems as the primary conduit.
Finally, even if the physical layer of the infrastructure for accessing
the Internet is not controlled by any single entity, there remains
the problem of the logical layer of the infrastructure. Just as
at the physical layer, if any single company controls the logical
layer (the browser or operating system, the ISP software or the
messaging platform), it can exert tremendous control over the way
in which that infrastructure is used. This problem of access
to the soft infrastructure is explored in the fourth
segment of the module.
The Open Access debate unites three distinct concerns.
The first is to neutralize the anti-competitive risks constituted
by concentrated control of key facilities. This aspect will be the
primary focus of the discussion of the telecommunications sector
policy of unbundling. The second centers on the architecture of
the network and the desire to maintain its end
to end (papers)
architecture as the structure best suited to support innovation.
Finally, there is a wish to maintain and nurture a conduit of communication
uniquely democratic in the modern era, affording comparable communicative
power to commercial and individual speech, amateur and professional.
Regulatory choices adjusting the market environment and future technological
evolution will ultimately determine whether the communications paradigm
remains mired in the broadcast era, where a few speakers dominate
content provision and the masses merely consume, or whether a richer
system may emerge, capable of providing greater diversity of voices
and individual expressive autonomy. The choices determining the
outcome of these challenges are occurring at three layers in the
information access process: the physical layer (computers,
communications paths, routers); the logical layer (operating
systems and software), and the content layer (informational
and cultural inputs). As information is a composite good, control
over any of the three layers of production threatens extension of
control to the whole. (If you are interested in how the physical,
logical, and content layers of the infrastructure interact, you
can explore here.)
Much that has been covered in the other modules concerns the content
layer. The following segments of this module address the contested
control of the first two layers.
1. Access to Physical Layer Infrastructure (1) - Copper Wires
Throughout most of the 20th century the Bell System provided telephone
service in the United States as a regulated monopoly. As competition
for elements of telephone service (primarily customer equipment
and long distance) began to emerge in the 1960s, a fifteen-year
process of antitrust enforcement began. This culminated in the breakup
of AT&T in 1984 under what has come to be known as the Modified
Final Judgment, or the MFJ: the order that settled the AT&T
antitrust case. After the breakup, local telephone service continued
to be a monopoly, with the seven Regional Bell Operating Companies
(RBOCs), created by the MFJ from AT&T's local phone divisions
and prohibited from most other lines of business, offering most of the
nation's local telephone service. Long distance service was
retained by AT&T, and was quite soon thereafter subject to competition,
primarily from MCI and Sprint. The RBOCs and AT&T were regulated
for the next 12 years by a combination of the court that enforced
the MFJ under antitrust law, and the FCC, enforcing the 1934 Communications
Act.
The Telecommunications Act of 1996 replaced this arrangement. Its
goal was to open up the local telephony market to competition. This
departed radically from the conception that dominated local telephony
regulation throughout most of the 20th century, namely, that local
telephone service was a natural monopoly to be regulated, but not
to be subject to competition. Recognizing the interest of the incumbent
local telephone companies to maintain their monopoly, the 1996 Act
created a series of obligations on local exchange carriers in general,
and on the RBOCs in particular, to cooperate with new entrants, their
emerging competitors. In order to create a competitive marketplace,
the legislation offered the RBOCs
the opportunity to enter the long-distance market once they had
demonstrated that the structural steps had been taken to irreversibly
guarantee local competition.
Three mechanisms were provided to this end: mandated sale of telephone
services at wholesale rates; leasing of access to unbundled elements
of the network by competitors; and interconnection. Failing private
negotiation between the parties, arbitration by the state public
utility commissions, or by the FCC, is provided for by section 251
of the 1996 Act. The basic idea underlying these three mechanisms is
that it not be necessary for a competitor to have in place a full
network, completely redundant to the incumbent's network, before
being able to offer competitive services. Any competitor can therefore
enter by simply buying minutes and reselling them more
efficiently; this is the resale component of the arrangement.
It is intended largely to give an entrant an opportunity to build
local reputation while preparing to build its competing facilities.
Second, a competitor can buy, at cost, access to any network element
of the incumbent. A competitor, for example, can simply deploy switches
and some large trunks, but then lease the lines from the switch to
customers' homes from the incumbent at cost. This is known as unbundling.
Finally, a competitor that has a fully functional network to some
customers in an area has a right to interconnection: to
be connected at cost to the incumbent's network, so as to be
able to provide its customers connections to all of the customers
of the incumbent. The latter provision is crucial, because it allows
an entrant to compete with an incumbent on an equal footing with
regard to the value of telephony as a network good, a good whose
value increases the more others consume the same good. If an incumbent
were not required to interconnect, a competitor could only offer
its customers the value of connecting to its other customers, and
not to everyone on the planet. Denial of interconnection was a central
strategy that the Bell System had used early in the 20th century
to eliminate competitors in local markets, and again to prevent
competition in long distance in the 1960s and 1970s. The 1996 Act
sought to assure that it would not be used so again.
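The network-good logic behind mandatory interconnection can be illustrated with a toy calculation. The subscriber counts below are invented, and the number of reachable parties is only a crude proxy for the value of a subscription, but the asymmetry it exposes is the one the text describes:

```python
# Illustrative figures only: a large incumbent and a small entrant.
incumbent_subscribers = 1_000_000
entrant_subscribers = 10_000

# Without interconnection, the entrant's customers can reach only
# the entrant's other customers.
value_without = entrant_subscribers - 1

# With mandated interconnection, they can reach every subscriber
# on both networks.
value_with = incumbent_subscribers + entrant_subscribers - 1

print(value_without)   # 9999 reachable parties
print(value_with)      # 1009999 reachable parties
```

On this crude measure the entrant's offering is two orders of magnitude less valuable without interconnection, which is why denial of interconnection was such an effective exclusionary strategy.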
The incumbents fought implementation of the requirements to open
their networks on a variety of fronts, largely through litigation.
The most important of these efforts was the Iowa Utilities litigation.
In the first iteration, the incumbents succeeded in winning much
of what they wanted from the court of appeals, but were largely
reversed by the Supreme Court in AT&T v. Iowa Utilities Board,
525 U.S. 366 (1999). After the
case was sent back to the court of appeals, that court upheld the
FCC's general approach, but invalidated the specific requirements
that the Commission had imposed on the incumbents. In May 2002,
however, in Verizon
Communications, Inc., v. FCC, the Supreme Court finally reversed
the court of appeals on all the items on which it had held for the
incumbent exchange carriers, and granted the FCC's cross-petition,
both as to methodology and as to the reasonableness of the actual requirements.
Six years after the 1996 Act was passed, with much litigation and
stalling, the Supreme Court seems to have, for the moment, allowed
the FCC quite significant leeway to regulate the rate and structure
of the unbundling requirement.
The delay, however, was sufficient to allow political winds to
change. The mood at the FCC seems to have turned from one vigorously
dedicated to opening up the telephony platform to competition not
only in voice telephony, but also in broadband/DSL delivery, to
one that is much more hesitant to treat these types of services
as common carriers. It is this issue, the way in which the measures
that the 1996 Act introduced to help competitors (primarily unbundling)
are applied to broadband over telephone infrastructure, that matters
most to understanding the problem of Internet access over this platform.
Through a variety of
approaches for high speed data transmission over copper wires, generally
known as DSL, telephone infrastructure can be used to offer
broadband Internet access. The question becomes whether the mechanisms
adopted by the 1996 Act, as implemented by the Commission, can ensure
that, at least insofar as broadband Internet access is deployed over
telephone systems, access is competitive and open. A glimpse
at attempts that RBOCs have made to receive permission to offer
long distance service suggests caution with regard to that question.
The 1996 Act had conditioned RBOC entry into long distance markets
on FCC approval, in consultation with the US Department of Justice.
Approval depended on a finding that the applying RBOC had indeed
opened its local network to competition.
The first successful application, after a number had failed, was
Bell Atlantic's application to provide long distance service
in the New York area, the most lucrative market in the United States.
Bell Atlantic had taken a more cooperative tack than all its sister
RBOCs, and had advanced further in opening up its network to interconnection
and unbundling by competitors than had been the case in other regions.
The FCC eventually approved Bell Atlantic's
application, a decision upheld
by the Federal Court of Appeals for the D.C. Circuit. Nonetheless,
the Department
of Justice Evaluation of Bell Atlantic's request, which
counseled against granting it, reveals the
level of detail at which an incumbent can effectively thwart competition.
The quality of the graphical user interface for receiving orders
for entry, the speed of deployment of service personnel, and the
design of order forms came under scrutiny, suggesting that for all
practical purposes the opportunities for an incumbent to make the
lives of competitors unbearable were infinite.
The lesson from these efforts, and the fragility of competition
that requires access to a competitor's infrastructure, need
not be that competition or open infrastructure are impossible. But
neither does it suggest reason to be sanguine that a formal requirement
of openness will necessarily lead to competition and an open network.
Even where access to broadband service operates on a common carriage
basis, with explicit and extensive requirements for competitors
to cooperate, there continues to be wide opportunity for incumbents
to capture and control large segments of the Internet access market.
Against this background, the recent developments in this area both at
the FCC level and at the Court of Appeals for the D.C. Circuit leave
one with little cause to be confident that significant competition
among DSL providers will emerge. The two most relevant FCC documents
are the Appropriate
Framework for Broadband Access to the Internet over Wireline Facilities
NPRM and the Incumbent
LEC Broadband Telecommunications Services NPRM. A part of the
UNE
Triennial Review is also relevant. The D.C. opinion is U.S.
Telecomm Ass'n v. FCC.
The "Appropriate Framework" NPRM generalizes the most
important conceptual move in this area by tentatively categorizing
broadband services over a wire as "information services"
under the Communications Act. This categorization is immensely important,
because there is no statutory category of regulated information
services. The definition is in contradistinction to "telecommunications
services," or common carriers, which are subject to regulation.
The explicit intent of the Commission in making this classificatory
move was to allow itself to build an appropriate regulatory framework
from the ground up, without being encumbered by trying to fit new
services into old regulatory boxes. The result is likely to be to
make any form of regulation very difficult. The definition of "information
service" in the Act is:
The term `information service' means the offering of a
capability for generating, acquiring, storing, transforming, processing,
retrieving, utilizing, or making available information via telecommunications,
and includes electronic publishing, but does not include any use
of any such capability for the management, control, or operation
of a telecommunications system or the management of a telecommunications
service.
Plainly, this is a very broad definition, and explicitly encompasses
even online newspapers. The Commission will therefore have to be
highly solicitous of the providers of broadband, and will likely
regulate very lightly, if at all. The definition is also likely
to be regarded by courts reviewing any regulation as indicative
that the Commission believes these entities to be traditional media
organs, which receive the highest First Amendment protection. A
court is then likely to treat any form of access or unbundling requirement
as functionally equivalent to requiring a newspaper to devote
some of its editorial page to a common carrier model,
an analogy that would surely prevent any access regulation.
An alternative approach would be to look at the functional architecture
of the service, and to see that it comprises two distinct
services: a platform service that in principle looks and behaves
like a telecommunications service, and an ISP/value-added service
that looks and feels like an ISP service. The latter is quite properly
designated an information service. The former is not. Treating their
combination as purely an information service is no more necessary
than treating local and long distance service as a single "telephone
service" category would have been. On this view, the Commission's
definitions seem to run contrary to the original model of the 1996
Act, which sought to move away from regulation of types of companies
towards regulation of specific functions that different companies
combine. Had the Commission indeed defined broadband services as
involving both a telecommunications service and an information service,
it could easily have achieved its goal in this definition--to allow
these services not to be encumbered by inappropriately applicable
regulatory constraints developed for plain old telephone service.
The Commission is empowered, under the Act, to refrain from regulating
an advanced telecommunications service when it deems that refraining
would serve the public interest. Had it defined the platform component
of the broadband service as a "telecommunications service,"
it could have started with the regulatory baseline applicable to
common carriers and telecommunications services, and then decided
to forbear from imposing regulatory components of that framework
that would be inappropriate to broadband service. As things stand,
however, the recent move to classifying all broadband over wireline
as an information service tends to suggest there will be little
regulation to force incumbents to open their networks to competitors.
The Incumbent LEC Broadband NPRM applies this framework to incumbent
local exchange carriers, tentatively finding that these too are
information services. The most important focus here is on the presence
of substantial intermodal competition--that is, largely, that DSL
has strong competition from cable broadband, which does not allow
the incumbent telephony carriers to behave like monopolists.
The UNE Triennial Review refers to a comprehensive review of the
Commission's entire approach to unbundling in the telephony infrastructure.
The most important component for purposes of broadband is the question
of whether the high frequency portion of a wire will be required
to be unbundled. In an earlier order, the Line
Sharing Order, the FCC had required the telephone companies
to unbundle the high frequency portion of their wires. This is a
metaphorical way of referring to the following technical situation.
Voice telephony encodes the voice it carries as a series of low
frequency electromagnetic signals. Data communications encode data
using high frequency signals. We are used to thinking of "frequencies"
as a physical thing in the airwaves, but that is entirely false.
Frequency measures a property of any electromagnetic signal, irrespective
of whether it travels in the air or in a wire. Because voice and
data use electromagnetic signals with such different characteristics,
they can use the same wire without confusing equipment connected
to the wire as to which signals are voice, and which are data. Competitors
of the incumbents wanted to rely on this fact to purchase from the
incumbent LECs the right to send data over the same wires that the
incumbents were sending voice. This would enable them to sell data
services without buying a whole dedicated line. The incumbents wanted
to sell the competitive entrants a much more expensive "network
element": the whole wire. The Commission conceptualized this
as though there were a pipeline with different frequencies in it,
and required incumbents to unbundle the higher frequency portion.
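The frequency-division idea behind line sharing can be sketched in a few lines of Python. The frequencies and the single-bin DFT below are illustrative placeholders rather than the actual DSL band plan; the point is only that two signals at well-separated frequencies share one wire and can each be recovered without disturbing the other.

```python
import math

# Illustrative line-sharing sketch: a low-frequency "voice" tone and a
# high-frequency "data" tone travel on the same wire, and a receiver
# tuned to each band sees only its own signal. All values are invented.
sample_rate = 1_000_000              # samples per second
n = 1000                             # samples examined
voice_hz, data_hz = 1_000, 100_000   # placeholder band centers

wire = [math.sin(2 * math.pi * voice_hz * t / sample_rate) +
        math.sin(2 * math.pi * data_hz * t / sample_rate)
        for t in range(n)]

def energy_at(signal, hz):
    """Normalized DFT magnitude of `signal` at a single frequency bin."""
    s = sum(signal[t] * complex(math.cos(2 * math.pi * hz * t / sample_rate),
                                -math.sin(2 * math.pi * hz * t / sample_rate))
            for t in range(len(signal)))
    return abs(s) / len(signal)

# Each band carries its own tone strongly; an empty band carries nothing.
print(round(energy_at(wire, voice_hz), 2))   # ~0.5 (voice band)
print(round(energy_at(wire, data_hz), 2))    # ~0.5 (data band)
print(round(energy_at(wire, 50_000), 2))     # ~0.0 (empty band)
```

Real DSL deployments put voice below roughly 4 kHz and data above roughly 25 kHz, with filters (splitters) doing the separation at each end.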
From a policy perspective, this was clearly the right choice if
the correct goal is to lower the entry costs to competitors of the
incumbents, without harming the incumbents (beyond the harm they
suffer from the cheaper introduction of competition). In the UNE
Triennial Review, the FCC is requesting comment on whether it should
reopen this requirement. This inquiry may, however, have been mooted
by a decision rendered in late May 2002 by the D.C. Circuit, which
invalidated the Line Sharing Order as unreasonable. It is not entirely
clear that this opinion quite fully squares with the relatively
broad discretion and deference that could be read into the U.S.
Supreme Court decision in Verizon, although it is not directly
in conflict with that holding.
The conclusion to all this is, unsurprisingly, that things are in
flux in the telephony-based infrastructure for broadband Internet
access. While the Supreme Court seems to have recently given the
Commission broader powers to use unbundling as a means of facilitating
entry by competitors, both the Commission itself and the D.C. Circuit
are actually moving away from regulating the use of shared facilities
to promote competition among competitors using the incumbent telephony
infrastructure. Instead, they seem to be relying more heavily than
ever on the presence and long term beneficial effect of intermodal
competition from cable.
2. Access to Physical Layer Infrastructure (2) - Coaxial Cable
At least 80% of US households have access to cable service. This
infrastructure is currently being upgraded from coaxial cable,
designed for the one-way transmission of video programming, to
hybrid fiber-coaxial for the purpose of broadband Internet access.
Data carriage capacity is higher for cable broadband than for comparably
priced DSL services. It is also available more widely throughout
the cable service area, because DSL availability is limited by distance
from a central office. Cable transfer speeds, however, fluctuate
with the number of users online at a given time, while DSL
is a dedicated line and more predictable. The Cable Services Bureau's
report Broadband
Today, though slightly dated, still provides a good starting
point for understanding the forces operating in this area.
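The shared-versus-dedicated contrast described above can be put into a toy model. The capacity figures below are hypothetical, chosen only to show the shape of the tradeoff: cable is faster when few neighbors are online, but can fall below a dedicated DSL line at peak times.

```python
# Illustrative contrast between shared cable capacity and dedicated DSL.
# Both capacity figures are hypothetical, for demonstration only.
cable_node_kbps = 27_000   # one cable node's downstream capacity, shared
dsl_line_kbps = 1_500      # a dedicated per-subscriber DSL line

def cable_per_user(active_users):
    """Crude model: node capacity split evenly among active users."""
    return cable_node_kbps // active_users

print(cable_per_user(4))    # 6750 kbps off-peak: far faster than DSL
print(cable_per_user(50))   # 540 kbps at peak: below the DSL line
print(dsl_line_kbps)        # 1500 kbps, regardless of the neighbors
```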
The open access debate arises from the cable network owners' practice
of bundling cable modem access and Internet service provider (ISP)
selection. They had initially signed exclusive contracts with affiliated
ISPs, Excite@home and Roadrunner. More recently, these cable providers
have been providing their own systems. Cable providers, unlike the
telephone companies, are regulated under Title VI of the Communications
Act. Most importantly, this means that they have never been
subject to common carrier restrictions, and the various requirements
imposed on the incumbent local exchange carriers to cooperate with
competitors do not apply to them. Independent ISPs feared that exclusion
from the cable network would leave them incapable of competing in
quality of service with the cable operators' affiliated ISPs.
The DSL providers, in particular the incumbent local telephone carriers,
objected to the fact that they must provide access to their network
to competitors, whereas the cable companies need not. In opposition
to the demands of open access advocates, cable network owners and
their allies argued that imposing forced access upon them would
slow the availability of high-speed access. The cable
infrastructure requires adaptation to be used interactively, and
their claim is that in order to finance the adaptation they need
exclusive control. They have also asserted that technical and security
questions impede the opening of access to other ISPs. The politics
of the debate are complex, and heavily influenced by interest. AOL,
for example, was initially a major participant in the OpenNet
coalition's efforts to persuade the FCC to impose open
access requirements on cable service providers. It reversed its
position after it announced its merger with Time Warner. The merger,
however, provided the first opportunity to impose a limited open
access requirement as part of the merger approval.
A central concern with cable access is that an owner of physical
infrastructure that also controls Internet access can, to a large
extent, affect the content of the information its customers will
see, and how they will use it. (These concerns were given grounding
and heightened by a White Paper from Cisco extolling the benefits
of such a strategy as part of a sales pitch offering the tools for
its execution: Controlling
Your Network - A Must for Cable Operators.) Over and above
the concerns about parity in competition, there is a concern over
the eventual bifurcation of the Net, with high-end access available
to commercial content, and low-level access for non-commercial content.
The emergence of commercial services such as Akamai
accelerating access to e-commerce and commercial sites points to
the tendency towards bifurcation present even in the absence of
infrastructure-based handicapping. With an added layer of preference
for commercial content by the infrastructure itself, we could find
ourselves with a medium much more like the mass media of the 20th
century than the Internet of the 1990s. Elements of the approach
are evident in the self-description of At Home's network. Restrictive
end-user contracts from cable companies indicate that they are concerned
to avoid competition not only from their industry competitors, but
also from production of content by users. End-user contracts forbid
media streaming of more than ten minutes' duration and the operation
of a server over the cable modem network. Other means by which proprietors
can exploit their position include selective caching of content
produced by them or their partners, so that it is easier to access
and cleaner to watch, and the collection of data on customer use
patterns.
The general response at the national level to efforts to open access
to the cable infrastructure was cool. The FCC, for example, refused
to impose open access requirements upon AT&T in reviewing its
mergers with TCI and MediaOne. (See
Lessig on access in the MediaOne Merger.)
The first successes that open access advocates enjoyed were at
a local level. Cable companies are regulated as much, if not more,
by local franchising authorities as they are by the FCC. Portland,
Oregon was among the first locales whose local
franchising authority mandated open access by ordinance.
These requirements have been overturned in court in holdings of
two varieties. In the case of Portland, an interpretive approach
to the 1996 Act deemed cable broadband access not within the
definition of cable service. In Broward County, Florida, a Federal
District Court took a constitutional approach, and found that broadband
access was a cable service, but that imposition of conditions on
its provision violated the First Amendment.
47 USC Section 541(c) of the Communications Act states: "Any
cable system shall not be subject to regulation as a common carrier
or utility by reason of providing any cable service." But in
many instances, the franchisee needs the franchising authority's
approval in case of sale. This, at least in principle, opens the
possibility of the imposition of conditions upon the grant of that
approval under 47
USC 537, and under 47
USC 533(d)(2), which states: "Nothing in this section shall
be construed to prevent any State or franchising authority from
prohibiting the ownership or control of a cable system in a jurisdiction
by any person ... (2) in circumstances in which the State or
franchising authority determines that the acquisition of such a
cable system may eliminate or reduce competition in the delivery
of cable service in such jurisdiction."
(a) AT&T
Corp. v. City of Portland
When AT&T bought TCI, it sought to transfer to it the cable
franchises that TCI had previously held. In Portland, this was used
as an opportunity to get AT&T to comply with the city's
requirement that AT&T open up the cable broadband infrastructure
to competitors offering Internet service. AT&T rejected Portland's
open access requirement, and its request to transfer the franchise
was accordingly denied. AT&T sued the city, seeking invalidation
of the ordinance. The District
Court upheld its legality but was overruled by the 9th Circuit
on appeal. The Court
of Appeals rejected the classification of cable broadband access
as a 'cable service', distinguishing it from the 'one-way transmission'
of programming with highly limited subscriber interaction.
Consequently Portland did not have any legal basis to regulate the
delivery of the service under the franchise agreement. Indeed the
court held that no cable franchise was required for delivery of
broadband services. The court decided that broadband internet access
was a hybrid service, constituted in one part (the 'pipeline') by
a telecommunications service and in the other (the ISP function)
by an information service. Accordingly, regulation of the service by the cable franchising
authority was struck down.
(b) Comcast
Cablevision of Broward County, Inc. v. Broward County, Florida
Broward County in Florida adopted an open access ordinance requiring
the cable franchise to provide access to its broadband network 'on
rates, terms and conditions at least as favorable as those on which
it provides such access to itself.' The ordinance was the subject
of a successful First Amendment challenge for interference with cable
operators' power to determine the programming available on their
systems. Judge Middlebrooks likened open access to the right-to-reply
statute found unconstitutional in Miami
Herald Pub. Co. v Tornillo, and dismissed the argument that
cable companies enjoyed a bottleneck monopoly over access to the
Internet on the grounds that most users connect over phone lines.
Access requirements at the Federal Level--Limited and Diminishing
One of the determinative factors in both these cases was the FCC's
unwillingness to support open access. An important, though little
stated, concern faced by the courts was whether such
a question of national communications policy ought to be decided
on a local basis by hundreds of franchising authorities. The sustained
public pressure seemed to yield some change in national policy when
it came time for the relevant federal agencies to approve the AOL-Time
Warner merger. In a sense, that merger crystallized the concerns
with a fully integrated cable broadband infrastructure. The nation's
largest ISP bought one of the two largest cable operators, and one
of the nation's largest content libraries. Together, the picture
was one of a fully integrated network. Consumers would be presented
with an infrastructure and an interface controlled by one corporation.
This corporation also controlled many media properties and would
have an incentive to use the opportunities that control over the infrastructure
gave it to focus users' attention and time on its content to
the detriment of others. To obtain FTC
approval of its proposed merger with AOL, Time Warner entered
into a five-year consent
agreement to allow access by rival ISP Earthlink to its broadband
network in 70% of its markets, prior to making available AOL's
broadband ISP service (November 2000). In addition, it must contract
with at least two other ISPs within ninety days of the launch of
AOL's service. In the remaining markets the company must contract
with three non-affiliated ISPs within ninety days of the launch
of AOL's service, on terms and conditions as favorable as those
provided to AOL and affiliated ISPs. Network flow data and interconnection
points must be provided to affiliated and non-affiliated providers
on a non-discriminatory basis. While this is only a limited requirement,
and only applies to AOL Time Warner, it was interpreted at the time
as an important first instance of such a requirement. Following
this decision, the FCC opened a Notice
of Inquiry regarding high-speed Internet access
over cable, focused in large measure on the question of whether
something like the moderated open access requirement imposed on
AOL Time Warner should be generalized.
The most recent FCC activity in this area seems, however, to begin
a tentative move away from open access. As part of the series of
rulemakings issued in early 2002, the FCC issued a Declaratory
Ruling and NPRM that makes two important moves away from open
access in cable. Both involve the declaration that broadband internet
access over cable is not a service that combines telecommunications
services with information services, but is, rather, purely an information
service. The first, direct effect of this declaration is
to federalize the issue of access to the cable broadband
platform. By defining cable broadband as an interstate information
service, the FCC asserted sole jurisdiction over it, and thereby
preempted action by local franchising authorities. Given that these
authorities were the primary sources of efforts to regulate cable
companies' broadband provision, this decision was an important conclusion
to the move to federalize this question already implicit in AT&T
v. Portland. The second, probably more important though less direct,
effect of this declaration was to make it very difficult for
the Commission to adopt a seriously invasive access model. As an
information service, cable broadband is categorized in a very broad
category that could apply to many things, in particular online newspapers,
that could never constitutionally be required to open their facilities
to others. The result is that AOL Time Warner is subject to some
access regulation because of the merger. If other cable providers
similarly merge, the Federal Trade Commission might impose
some access requirement, as it did in the AOL Time Warner merger.
But the FCC at this stage seems to have decided that competition
from the Incumbent LECs will suffice to secure non-monopolistic
behavior.
The result of the cable decisions, given the parallel move away
from requiring extensive unbundling and forced sharing of inputs
on the telephony side, is that the FCC is preparing to make peace
with a duopoly situation based on pipeline ownership. Almost all
current broadband over telephone lines is offered by incumbent LECs.
Almost all cable broadband access is offered by ISP affiliates of
the cable operators. Each pipeline owner will, for all practical
purposes, enjoy a monopoly over its own pipeline. Even assuming
the best of all possible worlds, where the reach of each pipe is
equal and equally comprehensive, we are left with a duopoly. Two
are better than one, perhaps, but to describe such a system as achieving
the open, competitive system that the Telecommunications Act of
1996 envisioned would be something of an exaggeration.
The threat to innovation
While many of the arguments about open access are concerned with
traditional antitrust concerns (consumer welfare, quality and
price of service) and with the capacity of users to speak, Mark
Lemley and Larry Lessig formulated a different argument in support
of access. They focus on the potential for cable access bundling
to wipe out the layer of ISPs that have provided much of the innovative
impetus for the flourishing of the web. In addition, they identify
this as a step towards undermining the end-to-end
design principle that undergirded the network architecture of the
Internet, which ensured maximum accessibility by placing all intelligence
at the end of the network rather than at its core, where, it has
been said, stupidity
is its strength.
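The end-to-end idea can be sketched in a few lines of Python (a toy model invented for illustration, not any real protocol): the network core merely forwards packets, treating them all alike and sometimes losing them, while all the intelligence -- sequencing, retransmission, reassembly -- lives in code at the endpoints.

```python
import random

def dumb_network(packet):
    """The 'stupid' core: forwards any packet without inspecting it,
    treating all content as equal, and sometimes loses it."""
    return packet if random.random() > 0.3 else None

def send_reliably(chunks, max_tries=50):
    """All intelligence sits at the endpoints: the sender numbers each
    chunk and retransmits until the receiver has it; the receiver
    reassembles the chunks in order."""
    received = {}
    for seq, chunk in enumerate(chunks):
        for _ in range(max_tries):
            packet = dumb_network((seq, chunk))
            if packet is not None:        # receiver acknowledges
                received[seq] = chunk
                break
    return [received[i] for i in sorted(received)]

random.seed(0)
print(send_reliably(["all", "packets", "are", "equal"]))
# → ['all', 'packets', 'are', 'equal']
```

Because the core never inspects content, any new application can be deployed by changing only endpoint code; this is the design property the open-access debate is about preserving.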
Intro | Phones | Cable | Wireless | Software Platforms | Universal Service | Discussion | Additional Materials
3.
License-Free Wireless Spectrum
Many of the problems that plague the deployment of high-speed internet
access over the legacy wired infrastructure have to do with the
very high costs of construction and the difficulty of acquiring
the necessary rights of way to deploy the system. One infrastructure
for communications that does not suffer from similarly high set-up
costs, at least not necessarily, is offered by wireless communications.
At the moment, the limited, controlled nature of wired infrastructure
is being replicated in the air by licensing policies that limit
the number of providers who can offer Internet access over the air.
But this regulatory creation of a controlled-infrastructure model
for wireless communications is technically obsolete. (For a detailed
analysis, go here.)
Until the last decade, the universe of options for regulating spectrum
was technologically limited. Given the relative crudeness of reception
devices that could be made cheaply enough for consumer markets,
the generally accepted understanding was that the only way for a
signal from a transmitter to be received by a receiver was for the
transmitter to be louder than all other sources of radiation in
a given frequency. This meant that if two or more transmitters tried
to be "heard" over that frequency, there would be "interference":
neither would be sufficiently louder than the other. This basic technological-economic
fact limited the menu of institutional options open for government
in regulating wireless transmission. Someone had to be given
the exclusive right to transmit loudly over a given narrow frequency,
at a given time, in a given location, and at a given power. Policy debates in
the area of spectrum management have therefore focused for almost
half a century on whether that someone should be chosen by licensing
or by auctioning, and to what extent the use to which they put their
license should be determined by the FCC as opposed to by the licensees
responding to market signals.
Advances in computer processing, network communications, and wireless
transmission technologies, mostly spread
spectrum and software defined radios, have now made possible
a third alternative. Receivers and transmitters can be made intelligent
enough to share spectrum. Multiple users can and do share wide swaths
of spectrum simultaneously. Allocation and assignment are achieved
on a packet-by-packet basis, using equipment-embedded protocols
rather than organizational decisions by a licensee or spectrum owner.
"Spectrum" is "managed" not by licensing, but
on a license-free model, where all equipment that complies with
some standard for sharing--at a minimum complying with a power limit--is
permitted to operate without a license.
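The sharing logic described here can be illustrated with a toy simulation (a slotted-ALOHA-style sketch invented for illustration, not any actual certified protocol): unlicensed devices all follow the same equipment-embedded rule -- transmit in a given time slot only with low probability -- and the band carries traffic whenever exactly one device talks, with no licensee coordinating them.

```python
import random

def share_band(n_devices, n_slots, p_transmit=0.2):
    """Toy slotted-ALOHA-style model of license-free sharing: in each
    time slot every device transmits with a small probability fixed by
    an equipment-embedded rule. A slot succeeds when exactly one device
    transmits; two or more talking at once is 'interference'."""
    successes = 0
    for _ in range(n_slots):
        talkers = sum(1 for _ in range(n_devices)
                      if random.random() < p_transmit)
        if talkers == 1:
            successes += 1
    return successes

random.seed(1)
# Ten unlicensed devices share one band; no one holds an exclusive right.
delivered = share_band(n_devices=10, n_slots=1000)
print(f"{delivered} of 1000 slots carried traffic")
```

The per-slot success probability here is about 10 x 0.2 x 0.8^9, roughly 0.27, so about a quarter of slots carry traffic. Real standards such as 802.11's carrier sensing do far better, but the institutional point is the same: coordination comes from rules embedded in the equipment, not from an exclusive license.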
The technological shift has begun to bring about some initial changes
in spectrum policy, and somewhat more significant developments in
the equipment and service markets. Traditional spectrum policy permitted
operation of low-power devices (such as cordless phones, garage
door openers, and microwave ovens) in very narrow bands of spectrum,
like the now much-used 900 MHz and 2.4 GHz bands. These were generally
dumping grounds for sundry useful low-power emitters, rather than a focus
of communications policy. As the 1990s progressed, the FCC permitted
operation in the U-PCS band and the U-NII band (see the FCC's Unlicensed
National Information Infrastructure Order of 1997), as two instances
where the FCC self-consciously permitted operation of low-power
devices without a license, with the expectation that communications
services would be offered in reliance on a license-free model. The
problem with these was that license-free service was permitted and
designed not as a primary focus, but primarily by reference to its
potential interference with licensed incumbent services. With the
U-NII band in particular, the 300 MHz in which license-free operation
was permitted was sliced and regulated in ways seemingly guided largely
by concern for incumbent licensed services, not by what would
optimize operation of the license-free devices themselves. More
recently, the FCC has begun to look at flexible transceivers
using license-free spectrum as a serious direction. Most important
in this regard are the Commission's adoption of the Ultra-Wide
Band order and the permission to manufacture Software Defined
Radios. These are substantial moves towards permission for equipment
manufacturers to go ahead and experiment with wireless communications
that utilize a combination of techniques to break away from the
old model of wireless communication and create an entirely new model.
Market actors have been less reticent about license-free spectrum,
and some have taken advantage of the limited spectrum that is available
to develop quite significant applications, utilizing a variety of
standards and approaches. Apple's AirPort
is a particularly visible example, but many PC producers are now
integrating license-free wireless networking into their computers,
and some equipment manufacturers, like Nokia, have
developed equipment that could be the basis for large-scale, high-speed
data networks owned by no one. Actual networks
built around license-free equipment have emerged on a nonprofit
basis around the world.
The year 2001 seemed to have been for license-free wireless what
1999 was for free software. Equipment using standards developed
for the scraps of spectrum in which license-free operation is permitted
is beginning to become ubiquitous. Different standards--802.11,
Bluetooth, and HomeRF, for example--sought acceptance in devices capable of transmitting
at much higher speeds than those currently available over cable
or DSL, using spectrum owned by no one. In conjunction with TCP/IP
protocols, these devices can provide the last ten miles, or mile,
or at least 100 feet, of the connection to the Internet, and make it mobile
to boot. If this should happen, it would radically alter the lay
of the land on Internet access. By late 2001 and early 2002 this
possibility began to be appreciated even in some media accounts.
To get a flavor of the possible, you can browse around work done
by Dave Hughes, a researcher
who conducted numerous field tests connecting schools in rural areas,
you can read about them here,
or listen to him give a presentation.
Another important locus of implementation projects is the Dandin
Group, headed by Dewayne
Hendricks. Important and accessible explanations are available
in Reed's
Locus. The intrepid may wish to examine the technical detail
provided by the FCC Technological
Advisory Committee's working group on spectrum management (SWG).
Hardware manufacturers such as Nokia and Apple are driving the evolution
of these forms with systems such as Wireless
Rooftop Routing (technical papers) and Airport. You can also
read the CarNet proposal for a mobile
wireless network. You can read about policy proposals generated
by these technical possibilities. Eli
Noam has proposed a market model which would clear competing
uses based on willingness to pay a price which would vary depending
upon the volume of traffic on the network at the time. Benkler
argues for a system more akin to the Internet protocol itself, without
reference to payment at all, as the benefits of such an arrangement
to democracy and autonomy outweigh its functional deficiencies: the
wireless spectrum can provide a commons network of last resort
coexisting with the traditional alternatives. Critics, such as Thomas
Hazlett, are skeptical of the capacity of wireless to meet
broadband requirements. Given that broadcast regulation by the FCC
has always been predicated upon the assumption of scarcity and a
consequent need to limit users so as to prevent interference, the
implications of spread spectrum technology are profound. Some, like
Noam, and Benkler and Lessig, have argued that refusal to permit
widespread use of license-free devices could make FCC
regulation of radio unconstitutional.
4.
Access to the Logical Layer: ‘Soft’ Infrastructure
In digital networks, this is the key layer: this is where
network configuration is defined, where interconnection between
separate physical networks is made possible or prevented, and where
co-existence of various service providers is permitted or denied.
Bar & Sandvig, "Rules
from Truth: Post-Convergence Policy for Access"
(a) Network Effects, Tippy Markets & Lock-in
While obstacles to market entry at the physical layer derive from
the high costs and time required to establish alternative facilities,
soft infrastructure erects barriers through a series of positive
feedback effects that permit one company to lock
in an early advantage and make entry by competitors difficult.
As the economists' amicus brief in the case of Lotus v. Borland describes,
the nature of software markets means that products whose initial
success may be serendipitous and contingent can capture a market
and exclude even products that are intrinsically better by some
functional measure.
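The dynamic the brief describes can be made concrete with a toy model (the product names, numbers, and utility function below are all invented for illustration): each new user weighs a product's intrinsic quality against the benefit of joining its installed base, and a serendipitous early lead beats a functionally better rival.

```python
def adopt(n_users=10_000, network_weight=3.0):
    """Toy lock-in model: each arriving user picks the product offering
    the highest utility = intrinsic quality + a network benefit
    proportional to installed-base share."""
    quality = {"Early": 1.0, "Better": 1.5}   # Better wins on merit alone
    base = {"Early": 50, "Better": 1}         # Early's serendipitous lead
    for _ in range(n_users):
        total = sum(base.values())
        utility = {p: quality[p] + network_weight * base[p] / total
                   for p in base}
        winner = max(utility, key=utility.get)
        base[winner] += 1                     # the rich get richer
    return base

print(adopt())                      # → {'Early': 10050, 'Better': 1}
print(adopt(network_weight=0.4))    # → {'Early': 50, 'Better': 10001}
```

With strong network effects the market tips to the incumbent despite its inferior quality; weaken the feedback and quality wins every user. Interoperability requirements change the outcome in just this way, by letting a rival share the incumbent's installed base.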
Moreover, markets in software that performs discrete functions
are often closely related to each other. Very often providing a
given functionality can depend on access to another functionality
performed by software in a market already captured by someone. This
characteristic provides an opportunity for a company that has become
dominant in one market to leverage that position to extend its
monopoly into markets that rely on access to the segment of the
logical layer that it controls. This was the argument the government
made in the case against Microsoft and its Windows OS, vis-à-vis
web browsers. The District Court's findings
of fact in the Microsoft case document in detail the
strategies used to produce this effect: bundling of the web browser
with the OS, control over the active desktop, and manipulating user
reliance on default settings. The Microsoft case illustrates that
in principle, antitrust policy can work in 'tippy' markets by enforcing
interoperability, which largely parallels the requirement in telecommunications
markets of enforcing interconnection. Copyright law could attenuate
this danger to some extent, though not completely, by privileging
reverse engineering and refusing protection to interface aspects
essential for the viability of new market entrants, ensuring
backwards compatibility and horizontal interoperability, as the
court in Lotus
v Borland did.
(b) Software based platforms for content delivery
The insight is by no means limited, at least in principle, to Microsoft
and operating systems. Prior to the approval of their merger with
Time Warner, America Online resisted opening its Instant Messaging
systems to interoperability
with other instant messaging systems. AOL IM and ICQ dominate the
IM market with 90% of installed clients. AOL's refusal to interoperate
with other providers means that users must run parallel instant
messaging clients to communicate with "buddies" using
services offered by other providers such as Odigo,
iCast, Yahoo and Microsoft. AOL actively obstructed
competitors' attempts to interoperate. It argued
that the swift growth of its competitors' user base demonstrates
the absence of any anti-competitive threat, and that control over
interoperation was necessary to assure the integrity of its service.
The success of IM has attracted numerous application
developers aiming to capitalize on its potential as a content
delivery platform, incorporating telephony, video conferencing and
a variety of point-of-presence driven services. Control over the
IM platform would, one might be concerned, give AOL an opportunity
to exert power over these applications similar to that which Microsoft's
power gave it vis-à-vis application developers who needed access
to its operating system. Subsequent to the FTC's imposition
of approval conditions on the AOL/Time Warner merger, the FCC
appended a requirement that AOL provide
access to IM should the service be combined with cable broadband
to offer 'advanced instant messaging'. So long, however, as
IM remains limited to phone and DSL lines, interoperability will
not be mandated and AOL will retain full control. An example of
the effect of this on the compatibility of new programs can be seen
in the fact that Jabber,
the open-source project that provides an instant messaging functionality,
is compatible only with Yahoo and MSN's instant messaging, not AOL's
AIM and ICQ. One should, however, remain somewhat cautious about
the claim that AOL's instant messaging programs will in fact hold
bottleneck control. Firms attempting to offer client-side interoperability
suggest that, with some initial start-up costs, it is possible that
AOL's hold will be substantially less complete than Microsoft's.
This is an example.
(c) Overcoming Exclusivity: Open Standards, Free
Software & Open
Source Definition
One solution to the problem of closed platforms at the logical
layer is to develop open public standards for how applications operate
and interoperate with each other. A standard is a codified way of
performing a function. It allows anyone who needs to perform the
function as part of a function being provided by their software
to do so based on the standard, and be confident that the application
will work with equipment and other applications that similarly adhere
to the standard. If many who need to use the functionality can participate
in formulating the standard, and it is open for anyone to read and
understand fully, then different applications, created by different
companies or individuals can all interoperate without giving anyone
the power to control communications that depend on the functionality.
Organizations like the IETF,
W3C, The
Internet Society and the IEEE
provide for standard setting that, when followed, provides an open
framework for software, whether or not proprietary, to interoperate.
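What an open standard buys can be shown in miniature (the "standard" below is invented for illustration; real bodies like the IETF publish far richer specifications): two independently written clients from rival vendors interoperate because both follow the same published message format rather than one vendor's secret.

```python
import json

# A hypothetical published standard: a message is a JSON object with
# 'from', 'to', and 'body' fields. The spec is open for anyone to read
# and implement; no vendor controls who may interoperate.
def encode(sender, recipient, body):
    return json.dumps({"from": sender, "to": recipient, "body": body})

class VendorAClient:
    """One vendor's independently written implementation."""
    def receive(self, wire):
        msg = json.loads(wire)           # parses per the open spec
        return f"{msg['from']} says: {msg['body']}"

class VendorBClient:
    """A rival's implementation: different code, same open standard."""
    def receive(self, wire):
        msg = json.loads(wire)
        return f"to {msg['to']} from {msg['from']}: {msg['body']}"

wire = encode("alice", "bob", "hello")
print(VendorAClient().receive(wire))   # → alice says: hello
print(VendorBClient().receive(wire))   # → to bob from alice: hello
```

Neither vendor needs the other's permission or source code; the published format, not a gatekeeper, is what makes the messages legible to both.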
A more radical remedy, more complete in its promise to maintain
an open logical layer, is the placement of free software or open
source software in critical junctures of the logical layer. Already,
the Apache server software
that most web servers
use, and Perl, the programming
language used for most scripts and many other cross platform functionalities
on the Web, are free software. In September 2000, the President's
Information Technology Advisory Committee produced a report
on open source recommending concerted federal support. As more
of the core of the logical layer is free software, open for all
to see, use, and modify, the logical layer becomes an open layer
that cannot be manipulated by any one entity to control Internet
access.
5.
Universal Service: progressive social policy or fig-leaf for incumbents'
interests?
While most of the focus in this module has been on the structure
of access to the Internet and the ways in which it does or does
not give infrastructure owners the power to control use of the Internet,
an important social policy concern with Internet access has been
the distribution of access across class, race, gender, age, and
educational attainment. Treated mostly in the latter part of the 1990s
as the problem of the Digital Divide, the problem is one that in
traditional telecommunications policy has been treated under the
umbrella of universal
service.
At inception, universal service referred to the need for a unified
service in a context of a fragmented phone system without interconnection
requirements, rather than a conscious policy to promote residential
telephone penetration and rural telecommunications facilities through
subsidies gleaned from long-distance. It was introduced
as a concept by AT&T's president, Theodore Vail, in 1907,
largely as a justification for the Bell System's push to monopolize
telephone service throughout the United States. According to Milton
Mueller, as competition began to enter the long-distance market
in the 1970s, Bell invoked universal service as part of its mission
for reasons of political expediency, namely to retain its position
as a regulated monopoly.
Coming from these inauspicious roots, the solution developed to
the problem in the Telecommunications Act of 1996 was incomplete.
The FCC operates targeted schemes to bring access to specified groups
under programs like Lifeline and Link-Up, which provide plain old
telephone service of some degree to the poorest users. E-Rate
is the program aiming to bring schools online, and the one most
closely related to Internet access. Under the 1996 Act, the FCC and a
joint federal-state board are supposed to develop a standard of "an
evolving level of telecommunications services," to be
reconsidered and updated periodically, that would be subsidized
for poor users. This evolving definition of universal service is
intended to comprise services subscribed to by a majority of Americans,
but the inclusion of Internet access for the general user remains undetermined.
Any service brought within the definition becomes eligible for subsidy.
The FCC released a first definition
in May 1997 and presented a detailed Report
to Congress on the functioning of universal service in 1997.
Nonetheless, while much political airtime has been spent on the
Digital Divide, relatively little by way of active subsidization
or implementation of ubiquitous access has in fact been achieved
beyond the declining price of computers and Internet connections.
Discussion
Topics
1. AT&T
Corp. v. City of Portland. Imagine that you are a Supreme Court
justice hearing arguments from AT&T and the City of Portland
on appeal. You know that the Federal
Trade Commission has required AOL Time Warner to open its network
to some competitors, and that the FCC is considering the issue more
broadly. How would you rule and why? Do you think these issues should
be decided at a national or local level? How do you see your decision
affecting the architecture of Internet access over cable?
2. You have just read papers describing Nokia's Rooftop
Networks. Imagine that you are: (a) a staff member at the FCC's
Office of Engineering and Technology responsible for spectrum management
decisions; or (b) a lawyer for an industry group of wireless providers
who have spent billions of dollars purchasing spectrum licenses
at auctions in the past few years. Imagine that the FCC is considering
whether the UHF bands that will be released by television stations
in 2006 should be auctioned off to providers of 3G wireless services
or retained as license-free spectrum. What would your arguments
be about whether the spectrum should, or should not, be auctioned?
3. Local exchange carriers often argued for regulatory parity.
Why, they asked, is it fair to require them to provide their competitors
in high-speed Internet access equal access to their facilities,
without similarly requiring cable companies to do the same? This
fairness objection could be resolved in one of two opposite ways.
Cable companies could be required to offer competitors unbundled
access to their broadband platform, or telephone companies could
be relieved of their obligations. An argument in favor of relieving
the telephone companies of their obligations is that having two
wires in each home--the DSL provider and the cable provider--would
suffice to assure that consumers benefit from competition, while
leaving both telephone companies and cable operators enough incentives
to make the huge investments necessary to bring broadband to the
home. The presence of competitors who can free ride on the investments
of the infrastructure owner, paying only cost, dampens incentives for investment
in infrastructure. If, this argument goes, consumers value highly
being able to access many internet service providers, then either
the telephone company or the cable company will likely open their
system up to other ISPs as a competitive strategy against the other
infrastructure owner, to capture those consumers who value choice
of ISPs. The contrary argument is that neither cable nor telephone
companies should be allowed to exclude competitors, and that there
should be competition not only between two wires, but also within
each, by regulating them so as to make space for any number of ISPs
to compete with each other and with the infrastructure owner. Only
thus, goes this argument, will we get enough competitors. These
are precisely the questions that the FCC must face now, as it decides
how to regulate the newly defined category of "information
services" into which it has now put both cable broadband and
DSL. What do you think?
4. The District Court in the Broward
County case held that requiring a cable operator to open its
network to competing providers violated AT&T's first amendment
rights by forcing them to carry the speech of another. Public
interest advocates have argued that, to the contrary, open access
is mandated by the first amendment, because it is necessary to enable
individuals and many diverse groups to speak, and that permitting
a cable company to control Internet access over its infrastructure
is what harms speech. Which is the more persuasive argument?
5. One approach to building high-speed Internet access is to construct
municipal networks. Chicago, for example, is working on what it
calls CivicNet.
Some municipalities are building on their existing network of sewers,
working with a company called CityNet,
which uses robots to pull fiber conduits through sewers to every
home. Deployment has already begun in a number of U.S. cities, and
the company is seeking contracts to do the same elsewhere, beginning
with a contract to pull fiber through the sewers of Vienna.
Is this a solution to the Internet access problem? Is it appropriate
for government to "compete" with private broadband infrastructure
providers? Will it help or harm speech and innovation to have local
governments owning broadband municipal networks?
6. What would you say to the argument that the debate over Internet
access and the precise architecture of the network and the market
is really only the concern of a small privileged class, both nationally
and internationally? The really important issue, this argument goes, is the distribution
of access. With so many people on the planet lacking the most basic
telecommunications facilities, it is much more important to make
sure that more people get some Internet access than to spend all
this political and regulatory energy on the precise details of how
Internet access is configured.
7. Should universal service impose an obligation on federal and
local government to finance public provision of fiber optics and
wireless spread spectrum devices for all? Dave
Hughes argues that at least the E-Rate program should be usable
as a public subsidy for purchasing wireless equipment for building
an unowned infrastructure. What do you think?
8. In approving the AOL Time Warner merger, the FCC deferred open
access requirements to future services that combine instant messaging
with cable broadband to deliver advanced instant messaging.
Should the FCC have paid more attention to interoperability in messaging
systems today, rather than focusing on the future? Is there a concern
that customers will become committed to AOL's software, so that
opportunities for competition later will become moot because the
market will have tipped? Would such a development affect, perhaps,
the viability of services built around instant messaging facilities,
like Madster?
9. Lemley and Lessig contend that: "If a regulated entity
threatens to force the adoption of an architecture which is inconsistent
with the Internet's basic design, and if that action affects
a significant portion of a relevant Internet market, then the burden
should be on the party taking that action to justify this deviation
from the Internet's default design. The presumption should
be against deviating from these principles. As with any principle,
these presumptions should apply unless there is clear evidence that
displacing them in a particular case would be benign." Is it
the proper role of regulation to adopt and support a particular
architecture, even if it is open and conducive to innovation? Can
policy afford to be otherwise, if Lemley and Lessig are correct
in their predictions?
Go to the Discussion Board
Intro | Phones | Cable | Wireless | Software Platforms | Universal Service | Discussion | Additional Materials
Additional Materials
Copper Wires & Telephones
AT&T v. Iowa Utilities Board, 525 U.S. 366 (1999)
Department of Justice, Evaluation of Bell Atlantic
Eben Moglen, The Invisible Barbecue, 97 Colum. L. Rev. 945 (1997)
LATA = Local Access and Transport Area
James Speta, Handicapping the Race for the Last Mile?: A Critique of Open Access Rules for Broadband Platforms, 17 Yale J. on Reg. 39 (2000)
Jim Wagner, Verizon DSL Reductions Prompt ISP Outrage, InternetNews, September 1, 2000
Notice of Inquiry Concerning High-Speed Access to the Internet over Cable and Other Facilities
Kevin Werbach, The Digital Tornado, OPP Working Paper Series 29, 1997
Coaxial Cable
Broadband Today, FCC Staff Report (October 1999)
Tech Law Journal, Summary of AT&T v. City of Portland
Patricia Aufderheide, The Threat to the Net, Metrotimes, February 2000
Joel Garreau and Linton Weeks, Visions of a World That's Nothing but Net, Washington Post, January 11, 2000, at C1
Cisco, Controlling Your Network - A Must for Cable Operators, 1999
Peter S. Goodman and Craig Timberg, AOL Ends Lobbying for Open Access, Washington Post, February 12, 2000
Lawrence Lessig, Broadband Blackmail, The Industry Standard, November 14, 1999
Lawrence Lessig, The Cable Debate, Part II, The Industry Standard, November 14, 1999
Peter S. Goodman, Is It Cable or Is It Internet?, Washington Post, November 2, 1999
Peter S. Goodman, AT&T Rivals Cautious on Cable Access, Washington Post, December 6, 1999
James Speta, The Vertical Dimension of Cable Open Access, 71 Colo. L. Rev. 975 (2000)
Amicus Curiae Brief of the Federal Communications Commission, AT&T Corp. v. City of Portland, On Appeal from the United States District Court for the District of Oregon, No. 99-35609 (ONLY READ Point I of the Argument Section)
Shawn O'Donnell, Broadband Architectures, ISP Business Plans, and Open Access
François Bar and Christian Sandvig, "Rules from Truth: Post-Convergence Policy for Access," paper presented at the 28th Annual Telecommunications Policy Research Conference, Arlington, VA, September 23-25, 2000
François Bar, Stephen Cohen, Peter Cowhey, Brad DeLong, Michael Kleeman, and John Zysman, "The Open Access Principle: Cable Access as a Case Study for the Next Generation Internet" (February 21, 2001 draft), forthcoming in Lee W. McKnight and John Wroclawski, eds., The Economics of Quality of Service in Networked Markets, MIT Press (2001)
Jason Whiteley, AT&T Corp. v. City of Portland: Classifying Internet Over Cable in the "Open Access" Fight
Lessig, In the Matter of the AT&T/MediaOne Merger
Jerome H. Saltzer, "Open Access" Is Just the Tip of the Iceberg, October 22, 1999
FCC Cable Services Bureau Homepage for America Online, Inc. and Time Warner, Inc. Proposed Transfer of Control
Wireless
Kevin Werbach, Here's a Cure for Bandwidth Blues, ZDNet, November 28, 2001
Paul Davidson, Airwaves Battles Mount Before the FCC, USA Today, March 13, 2002
Reed's Locus
Dandin Group
David Hughes and the NSF in Colorado
Apple AirPort
Cellnet Data Systems
David Hughes on E-Rate
Dave Beyer, Mark D. Vestrich, and J.J. Garcia-Luna-Aceves, The Rooftop Community Network: Free, High-Speed Network Access for Communities
Rooftop Networks, Harvard IIP
Morris, Jannotti, Kaashoek, et al., CarNet: A Scalable Ad Hoc Wireless Network System
Paul Baran, Visions of the 21st Century Communications: Is the Shortage of Radio Spectrum for Broadband Networks of the Future a Self-Made Problem?, Keynote Talk Transcript, 8th Annual Conference on Next Generation Networks, Washington, DC, November 9, 1994
FCC, Further NPRM & Order In Re: Amendment of Part 15 Regs. Regarding Spread Spectrum Devices & Wi-LAN, Inc., May 11, 2001
Federal Communications Commission, Technological Advisory Committee working group on spectrum management (SWG)
Yochai Benkler, Overcoming Agoraphobia: Building the Commons of the Digitally Networked Environment, 11 Harv. J. L. & Tech. 287 (1998)
Eli M. Noam, Taking the Next Step Beyond Spectrum Auctions: Spectrum Access
Yochai Benkler, A Speaker's Corner Under the Sun, in The Commodification of Information: Political, Social, and Cultural Ramifications (N. Elkin-Koren, N. Netanel, eds.), Kluwer, 2000
Thomas Hazlett, The Wireless Craze, The Unlimited Bandwidth Myth, The Spectrum Auction Faux Pas, and the Punchline to Ronald Coase's "Big Joke": An Essay on Airwave Allocation Policy, AEI-Brookings Joint Center for Regulatory Studies Working Paper 01-02, January 2001
Yochai Benkler and Lawrence Lessig, Will Technology Make CBS Unconstitutional?, Net Gains, December 14, 1998
Plumbers Crack the Last Mile, Wired, April 2001
Soft Infrastructure
François Bar et al., Defending the Internet Revolution in the Broadband Era: When Doing Nothing Is Doing Harm, August 1999
François Bar, Islands in the Bit-Stream: Charting the NII Interoperability Debate, BRIE Working Paper 79 (1995)
James Boyle, Missing the Point on Microsoft, Salon, April 7, 2000
Steven Weber, The Political Economy of Open Source Software, June 2000
Shankland, Kanellos, and Wong, Sun, Microsoft Settle Java Suit, CNET News.com, January 23, 2001
Tony Kontzer, Time Warner/AOL Merger Conditions, InformationWeek, January 2001
Barbara Darrow, Instant Messaging Combatants Rail Against AOL, TechWeb, May 24, 2000
David S. Isenberg, Dawn of the Stupid Network
Lotus v. Borland, amicus brief of economists, 1995
President's Information Technology Advisory Committee, Developing Open Source Software to Advance High-End Computing, October 2000
Universal Service
François Bar and Annemarie Munk Riis, "Tapping User-Driven Innovation: A New Rationale for Universal Service," The Information Society, 16:1-10, 2000
Falling Through the Net
Adrian Herbst, Developing Public Telecommunication Systems
Ephraim Schwartz, Utilities, Municipalities Partner to Build Fiber Links, InfoWorld, January 28, 2000
FCC, E-Rate Program
FCC, Report to Congress on Universal Service, April 1998
Mueller, Universal Service and the New Telecommunications Act: Mythology Made Law
For more background, see the papers at http://www.vii.org/afuniv.htm
Computer Professionals for Social Responsibility, Serving the Community: A Public Interest Vision of the National Information Infrastructure, 1996
Organizations
OpenNET Coalition
Free Software
Open Source
IEEE
The Internet Society
Alliance for Community Media
Benton Foundation
Center for Media Education
Civil Rights Forum on Communications Policy
Computer Professionals for Social Responsibility
Consumer Project on Technology
Consumers Union
Media Access Project
National Association of Counties
OMB Watch
Toward Utility Rate Normalization
http://www.netaction.org/ (Pro Open Source, Anti Open Access)
Utility Consumer Action Network
http://www.starband.com/
David Hughes
Apple's AirPort
Public Knowledge
New America Foundation