Award Winning Blog

Wednesday, December 8, 2010

No Free Lunch in Internet Peering or Transit

              Like many of you, I am keenly following the Comcast-Level 3 dispute and am trying to make sense of it all.   The dispute confirms several universal principles about Internet traffic routing that have passed the test of time:

1)         Consumers pay Internet Service Providers (“ISPs”) a monthly subscription with the expectation that the fee covers access to available content, i.e., the conduit.  As the World Wide Web evolves and content options diversify to include full motion video, consumers simply expect their ISPs to make sure the download distribution pipes are sufficiently robust to handle high bandwidth requirements and commensurately large monthly download volume.  Cable modem service agreements may have a cap on downloading per month, but consumers generally assume “All You Can Eat” access rights, plus the expectation that video streaming will work, i.e., no blurring, frozen frames, or blue screens.

2)         Because upstream requests for content are narrowband and because the typical consumer downloads much more content than he or she uploads, ISPs serving end users, such as Comcast, typically will have a large traffic imbalance, with far more downstream traffic to deliver to their subscribers than upstream traffic they need other ISPs, such as Level 3, to handle.

3)         Until such time as Comcast’s “Television Anywhere” takes off and generates lots more traffic that Comcast will need other ISPs to handle—whether on a peering or transit basis—Level 3 adds substantially to Comcast’s download “surplus” delivery burden to end users.  Of course Level 3 replaces another content distribution network, so the total volume of Comcast’s downloading burden does not change in the short term. However, in the context of peering and transit between Comcast and Level 3, the traffic volume relationship changes, with a greater imbalance resulting from the new Netflix traffic Level 3 now delivers to Comcast.

4)         The Comcast-Level 3 dispute distills to a disagreement over whether and how much either should pay in light of changed traffic patterns.  Because the parties already have traffic agreements, modification of terms might require additional payments from Level 3 to Comcast, absent Comcast’s need for Level 3’s upstream transmission services.  Of course Comcast does need the services of Tier 1 ISPs like Level 3, but until Comcast starts distributing lots more of its cable television video product over the Web, Netflix downloading to Comcast subscribers will predominate.

5)         Cooperative ISPs typically align inbound and outbound peering traffic with an eye toward creating a balance, but either or both ISPs might also want to expand transiting services, as these paid arrangements are based on the unlikelihood of balanced traffic loads.   Digital Society Policy Director George Ou reports that Comcast and Level 3 have both peering and transit agreements; see http://www.digitalsociety.org/2010/12/video-level-3-versus-comcast-peering-dispute/.  George lays blame on Level 3 for expecting Comcast to absorb the newly increased volume of traffic Level 3 delivers without additional payment, or an offer of additional free upstream capacity.
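
For readers who want to see the mechanics behind points 2 and 5, here is a minimal sketch, in Python, of the kind of traffic-ratio test peering partners often apply.  The volumes and the 2:1 threshold are hypothetical illustrations; the parties' actual figures and contract terms remain confidential.

# Hypothetical illustration only: neither company's actual traffic volumes or
# contract terms are public.  Many peering agreements key settlement-free
# status to an out/in traffic ratio; the 2:1 threshold below is assumed.

def peering_balance(sent_tb, received_tb, max_ratio=2.0):
    """Return the out/in ratio and whether it stays within the assumed limit."""
    ratio = sent_tb / received_tb
    return round(ratio, 2), ratio <= max_ratio

# Roughly balanced exchange (made-up volumes, in terabytes per month).
print(peering_balance(sent_tb=100, received_tb=90))    # (1.11, True)

# After Level 3 begins handing off Netflix traffic bound for Comcast
# subscribers, the imbalance grows (again, made-up volumes).
print(peering_balance(sent_tb=500, received_tb=90))    # (5.56, False)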

            Reasonable people can disagree as to the mutual exclusivity or substitutability of peering versus transit.  George considers the two types of traffic arrangements mutually exclusive and has chided me for thinking that the parties could recalibrate both to mitigate the traffic imbalance if they wanted to.  See http://www.digitalsociety.org/2010/12/many-analysts-wrong-on-comcast-versus-level-3/. 

            The Comcast-Level 3 dispute confirms that there is no such thing as a free lunch.  It also highlights disagreement over who has to pay when consumers’ download requirements increase with full motion video access.  George considers it a nonstarter for Comcast to raise end users’ cable modem rates, despite a vast increase in the value proposition created by IPTV.   Some economists consider it a given that Comcast has the “right” to demand compensation from both sides of its market position, upstream from Level 3—and possibly the real instigators of greater bandwidth requirements, Netflix and Google—and also downstream from end users, i.e., cable modem subscribers, co-conspirators with Netflix and Google.

            Bottom line: one or more players in the Internet “network of networks” will have to pay for greater capacity.  Early on in the Internet’s development, strategies for avoiding payment were depicted as “hot potato routing.”  Carriers unwilling to upgrade facilities to accommodate greater demand sought to hand off traffic as soon as possible.  Level 3 has no such option of passing the packets off to several different carriers for the last mile to end users.  Comcast knows this and, true to form, the company exploits its position to the fullest extent possible.

Thursday, December 2, 2010

Pick Your Poison: FCC Chairman Genachowski’s New Network Neutrality Strategy

FCC chairman Julius Genachowski appears set to abandon a strategy of applying selective portions of Title II regulatory safeguards, relying instead on general Title I ancillary jurisdiction. Either strategy appears likely to fail upon review by a court or Congress.
           
What makes this matter so difficult is that while an appellate court might try to consider the issue narrowly in terms of whether sufficient statutory authority exists, broader business and political factors matter as well.  Recall that the FCC was able to justify substantial deregulation of DSL, replacing Title II with Title I oversight, based on changed circumstances, chiefly concerns about "regulatory parity" with largely unregulated cable modem service.  In this politicized and super-charged environment, coupled with the Comcast court decision, the FCC cannot readily reassert Title II based on changed circumstances supporting light-handed government oversight and the public interest, e.g., evidence that Internet access has become an essential public need coupled with proof of discriminatory conduct.
       
There is much speculation that Chairman Genachowski has abandoned his Third Way link to streamlined Title II authority, replacing it with Title I ancillary jurisdiction based on language contained in Sec. 706 of the Telecommunications Act of 1996.  This section requires the FCC and states to encourage ubiquitous access to "advanced telecommunications capability."  The Commission probably will face judicial skepticism about whether and how Sec. 706 confers statutory authority to encourage Internet access through selective regulation.
       
I can appreciate that Chairman Genachowski would want to apply a streamlined version of Title II.  It provides the direct statutory authority a reviewing court requires, and before changed circumstances provided the basis for its abandonment, Title II required nondiscrimination, transparency and the other Internet Freedoms.  But the political impracticality of re-regulation, coupled with the Supreme Court's Brand X decision affirming the FCC's functional abandonment of Title II by classifying cable modem Internet access as an information service, makes reliance on Title II a sure loser on appeal.
       
Chairman Genachowski appears to have acknowledged this by returning to Title I ancillary jurisdiction.  There is case precedent for judicial deference to the FCC's expertise to fashion public interest serving remedies under Title I, e.g., the Commission's imposition of cable television regulations in advance of having received explicit statutory authority.  But as emphasized by the D.C. Circuit in the Comcast case, the link to some sort of statutory authority must exist.  The D.C. Circuit likely will remain quite skeptical about an FCC claim of ancillary jurisdiction premised simply on Title I's general oversight duty over "wire and radio" communications, or on the advanced telecommunications capability promotion elements of Section 706.
       
From my vantage point, it looks like the Commission loses either way, should some aggrieved party appeal.  Since Congress has a near zero likelihood of passing explicit statutory authority, the status quo remains.  This means that companies, such as Comcast, which can't help but push the envelope, will exploit the absence of rules to their financial advantage.  The demand for a video carriage surcharge from Level 3 provides an example of how an ISP can raise a rival's cost of doing business.  Expect Comcast and others to raise the cost of doing business both for content delivery networks, which generate traffic for Comcast to deliver, and for content producers, like Netflix, that compete with Comcast's video products.

Monday, November 29, 2010

Comcast’s Demand for a Video Surcharge From its Level 3 “Peer”

According to Level 3, a major long haul Internet Service Provider, Comcast has demanded a “recurring fee” when Level 3 hands off movie and other high capacity video traffic for delivery by Comcast to one of the cable company’s subscribers.  See http://lb.vg/46734.  This demand warrants scrutiny, perhaps less in the context of Network Neutrality and more in terms of further diversification (unraveling) of the peering process.

I will leave to others the advocacy for and against another Comcast innovation in non-neutrality.  The company must consider its merger with NBC a done deal as it continues to maintain a high profile for provocative actions that raise rates to rivals and subscribers alike.

My interest lies in the evolution of peering, a process that used to be symmetrical and largely uniform between similarly sized ISPs.  Under the old school model, Level 3 would have similar peering agreements with Comcast as with other national cable operators.  Likewise Level 3 would have symmetrical terms for the carriage of its traffic downstream via a “peering partner,” such as Comcast, and for Level 3’s carriage upstream of traffic originated or passed onto Level 3 by Comcast.  So under the old model, if Comcast wants to single out a particular type of traffic for a surcharge payment from Level 3, then all things being equal at least in terms of traffic volume, Level 3 could require a similar payment from Comcast. 
           
Under the traditional peering model, if traffic volumes are roughly equal, the surcharge Level 3 pays Comcast would be offset by a roughly equivalent credit owed to Level 3 for video traffic originating on, or transiting through, the Comcast network. If Comcast unilaterally has demanded and received the right to a video delivery surcharge without a reciprocal payment to Level 3, then Comcast either has eliminated the conventional symmetry in peering, or much more traffic originates or transits through Level 3 networks destined for Comcast subscribers than Comcast hands off to Level 3.  The fact that Level 3 has capitulated to Comcast’s surcharge demand points to a significant imbalance in traffic flow and commensurate negotiating clout.
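
A minimal sketch of that offset arithmetic, assuming a hypothetical reciprocal per-gigabyte surcharge and invented traffic volumes:

# Illustrative arithmetic only: the per-gigabyte surcharge and the traffic
# volumes are invented, not terms of any actual Comcast or Level 3 agreement.

SURCHARGE_PER_GB = 0.01  # hypothetical reciprocal video-delivery surcharge

def net_settlement(l3_to_comcast_gb, comcast_to_l3_gb, rate=SURCHARGE_PER_GB):
    """Net payment from Level 3 to Comcast when both peers charge the same rate."""
    return (l3_to_comcast_gb - comcast_to_l3_gb) * rate

# Roughly balanced traffic: reciprocal charges largely cancel out.
print(net_settlement(1_000_000, 950_000))   # 500.0 -- close to a wash

# Heavily imbalanced traffic: the same symmetric surcharge produces a large
# one-way payment, so the imbalance, not the surcharge itself, drives the outcome.
print(net_settlement(5_000_000, 950_000))   # 40500.0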
           
Much of the Network Neutrality debate has focused on end user access, while peering changes are negotiated agreements about access upstream from end users.  The peering process is obscured by Nondisclosure Agreements and the lack of readily available data on traffic flows.  Comcast may be engaging in a shakedown designed to handicap competitive alternatives to Video on Demand, but the possibility exists that the company is responding to unequal traffic volumes.  We may never know which.

Wednesday, November 17, 2010

New Publication--Invoking and Avoiding the First Amendment: How Internet Service Providers Leverage Their Status as Both Content Creators and Neutral Conduits

The University of Pennsylvania's Journal of Constitutional Law (Vol. 12, No. 5) has published my analysis of the diverging roles of Internet Service Providers as neutral conduit and content aggregator.  Here's the abstract:

Much of the policy debate and scholarly literature on network neutrality has addressed whether the Federal Communications Commission (“FCC”) has statutory authority to require Internet Service Providers (“ISPs”) to operate in a nondiscriminatory manner.   Such analysis largely focuses on questions about jurisdiction, the scope of lawful regulation, and the balance of power between stakeholders, generally averse to government oversight, and government agencies, apparently willing to overcome the same inclination.  The public policy debate primarily considers micro-level issues, without much consideration of broader concerns such as First Amendment values.

While professing to support marketplace resource allocation and a regulation-free Internet, the FCC has selectively imposed compulsory duties on ISPs who qualify for classification as largely unregulated information service providers.  Such regulation can tilt the competitive playing field, possibly favoring some First Amendment speakers to the detriment of others.  Yet the FCC has summarily dismissed any concerns that the Commission’s regulatory regime inhibits First Amendment protected expression.

For their part, ISPs have evidenced inconsistency in how seriously they value and exercise their First Amendment speaker rights.  Such reticence stems, in part, from the fact that ISPs combine the provision of conduits, using telecommunications transmission capacity, with content.  While not operating as regulated common carriers, the traditional classification of conduit-only providers, ISPs can avoid tort and copyright liability when they refrain from operating as speakers and editors of content.   In other instances, the same enterprise becomes an aggressive advocate for First Amendment speaker rights when selecting content, packaging it into an easily accessible and user friendly “walled garden,” and employing increasingly sophisticated information processing techniques to filter, prioritize and inspect digital packets.

Technological and marketplace convergence creates the ability and incentive for ISPs to operate as publishers, editors, content aggregators, and non-neutral conduit providers.  No single First Amendment media model (print, broadcast, cable television and telephone), or legislative definition of service (telecommunications, telecommunications service and information service), covers every ISP activity.  Despite the lack of a single applicable model and the fact that ISPs provide different services, the FCC continues to apply a single, least regulated classification.  The inclination to classify everything that an ISP does into one category promotes administrative convenience, but ignores the complex nature of ISP services and the potential to harm individuals, groups and First Amendment values absent government oversight.  For example, the information service classification enables ISPs to engage in price and quality of service discrimination that network neutrality advocates worry will distort a free marketplace of ideas.

This paper will examine the different First Amendment rights and responsibilities borne by ISPs when they claim to operate solely as conduits and when they combine conduit and content.  The paper will show that ISPs face conflicting motivations with light FCC regulation favoring diversification into content management services, like that provided by editors and cable television operators, but with legislatively conferred exemptions from liability available when ISPs avoid managing content.  The paper concludes that current media models provide inconsistent and incomplete direction on how to consider ISPs’ joint provision of conduit and content.  The paper provides insights on how a hybrid model can address media convergence, and promote First Amendment values while imposing reasonable nondiscrimination responsibilities on ISPs.      

Monday, October 25, 2010

Comments Filed on Improving International Comparisons of Broadband Development

I filed comments in the FCC International Bureau's inquiry (IB Docket No. 10-171) into how it can improve data collection and assessment of international broadband development.  Here is a summary of my recommendations:

(1)        Rather than compare the United States with other nations using composite national data, disaggregate the international data into several geographic and demographic categories.  The International Bureau should benchmark urban, suburban, exurban and rural communities in foreign locales in terms of broadband cost, transmission speeds, download caps, and other variables.

(2)       Use a credible average of delivered broadband speeds rather than advertised speeds.  A variety of demand and technological factors affect broadband service performance.  Advertised bit rates typically contain a disclaimer stating that actual performance may vary.  Because nations typically do not sanction carriers for overstating what subscribers can expect, the Commission should try to determine what bit rates subscribers can reasonably expect to receive.

(3)       The FCC must separate the data collection process from policy making so that data collection can occur without implicit or explicit coercion to support a pre-determined outcome.  Rather than consider the data collections and statistical compilations of other organizations a threat or challenge, Commission staff should try to replicate such findings and identify factors contributing to any disparities.  U.S. government officials have challenged the OECD and other organizations in efforts to save face, or mitigate the political damage from reports showing mediocre national performance.  Instead, the Commission should try to understand the basis for disparity in performance statistics.

(4)       Limit redactions, trade secret designations, and other sanitization of broadband information so that researchers have access to useable and replicable data.  No trade secrets would be disclosed if the Commission identified the types of carriers serving specific areas by technology used.

(5)       Expand broadband data collection and benchmarking in the context of overall broadband leadership, quality and national readiness to compete in information industries.  Commission staff should examine the variables used in the comparative assessments generated by such organizations as Cisco, the Internet Innovation Alliance, and the Information Technology and Innovation Foundation.

(6)       Balance more easily quantifiable supply-side measurements of the broadband market with demand-side measurements that attempt to assess national digital literacy, computer ownership and access, e-government and other technology incubation efforts, as well as private/public partnerships in stimulating interest in, and use of, Internet-mediated services.

(7)       Conduct a thorough literature review with an eye toward identifying best practices in broadband data collection and dissemination.  This exercise will help Commission staff determine the key variables for multi-year tracking.

Thursday, October 14, 2010

The Verizon Wireless Data Rip Off—A Case Study

            For over three years, without unilateral amends by the company or intervention by the FCC, Verizon Wireless has profited handsomely when subscribers push a wrong button on their handsets and unintentionally access the Internet.  Fifteen million subscribers initiated data sessions generating over $90 million in revenues for Verizon.  The revenue number is so high because many handsets offer one-button Internet access, and even a few seconds of access generated a $1.99 fee: data users lacking a monthly plan trigger a per-Megabyte fee regardless of whether only a few bytes got transmitted.  See Data Fee Mystery.
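
A quick back-of-envelope check of those figures, assuming the charges were spread roughly evenly across the affected subscribers:

# Back-of-envelope arithmetic using only the figures cited above, assuming the
# charges were spread roughly evenly across the affected subscribers.

total_revenue = 90_000_000   # dollars, "over $90 million"
affected_subs = 15_000_000   # "15 million subscribers"
fee_per_session = 1.99       # per-Megabyte fee billed even for a few bytes

avg_charge = total_revenue / affected_subs    # about $6 per affected subscriber
avg_sessions = avg_charge / fee_per_session   # roughly 3 accidental sessions each

print(f"Average overcharge per affected subscriber: ${avg_charge:.2f}")
print(f"Implied accidental sessions per subscriber: {avg_sessions:.1f}")
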
            Okay, we have an honest mistake, apparently easily made by lots of people texting in the dark and otherwise pushing the wrong number.  Verizon’s wireline venture typically provides a zero cost exit for misdialing.  For example, if you mistakenly end up at a dial-a-porn site, the meter does not start until after you are notified about charges for the call.  Verizon Wireless simply started the meter.
            What I find remarkable about this rip off is the “mystery” of how the charges rose to $90 million without either the company or the FCC doing something about it.  Might Verizon have grown to assume such revenues would contribute to “making its numbers”?  Might the FCC get all too easily persuaded that Verizon Wireless eventually would do the right thing?
            As a multi-decade observer of carriers and FCC behavior, I can readily attribute a cynical, but possibly on-point, explanation to such cavalier attitudes.  So in the spirit of trying to make sense out of how this $90 million overcharge could continue for so long as an unsolvable “mystery,” I have a few rationales to suggest:
1)         FCC inaction
            The FCC has clear statutory authority (Truth in Billing laws) to investigate carrier billing anomalies and overcharges, regardless of whether the carrier offers a telecommunications service, an information service, or both.  Similarly the FCC has statutory authority under Section 208 of the Communications Act to investigate complaints about carrier behavior.  Yet the Commission did nothing for three years even after having received ample notice, through consumer complaints, that unjustified data billing charges were accumulating.
            I believe the FCC wants to serve the public interest, but often fears it cannot do the right thing if such action comes across to Congress and other stakeholders, like Verizon, as too aggressive.  Put another way, without a forceful trigger, the FCC cannot intervene and order refunds, or secure a Consent Decree to achieve the same outcome without the carrier’s acknowledgement of guilt. The trigger occurs when the Commission receives a sufficient number of consumer complaints, media inquiries, congressional letters and the like to outweigh carrier claims that they are “working on the problem.”  Until such time as a critical mass of complaints arrives, the Commission can defer to carrier claims that no problem exists, or that a minor billing anomaly will get fixed soon.
            Perhaps as well, on a philosophical basis, the Commission is reluctant to act when there are stakeholders that have framed every policy and regulatory issue in terms of whether the marketplace offers a better solution.  Applying that premise, the FCC should not intervene because consumers can vote with their feet and subscribe to another carrier that does not impose such false charges.  Alternatively, at least until so-called tort reform all but eliminates class action lawsuits, subscriber representatives can sue for collective refunds.
            The marketplace reliance rationale fails if all carriers offer similar handsets with Internet access buttons and similarly start the meter without providing subscribers notification that a charge will result if they stay online.  The rationale also suffers if the cost of litigation vastly exceeds the likely refund any single subscriber would win in litigation.
2)         Verizon Wireless’ Inaction
            I find it hard to believe that Verizon Wireless could have generated $90 million without ever asking why so many subscribers continuously triggered a $1.99 charge, but did not come close to using the 1 Megabyte allocation.  Perhaps Verizon lacked the metering or monitoring capability to detect such user mistakes.  If so, the company could be excused for not installing an intermediary web page warning that additional charges will ensue.  Again, Verizon Wireline does this when, for example, one calls a busy telephone number.  A recorded message offers to call you back when the line becomes available and dutifully notifies you that “an additional charge may apply.”
            I conclude that the folks at Verizon Wireless assumed they had few regulatory responsibilities to rectify the problem and that subscribers bore the obligation (“caveat emptor”—buyer beware) to detect overcharges and to invest the time and effort to dispute them.  Because $1.99 probably is too little over which to quibble, it is quite possible that Verizon Wireless management grew to expect revenues to accrue from subscriber button pushing mistakes.  In turn this revenue enhancer becomes “baked into” revenue projections.
            So for different reasons both the FCC and Verizon were willing to leave well enough alone.  But doesn’t this $90 million false mystery evidence the need for a cop on the beat with a sufficiently stiff backbone to act on less than three years’ notice?

Tuesday, September 21, 2010

The Pennsylvania Broadband Summit

This week's Pennsylvania Broadband Summit brought together experts with many different interests and perspectives.  See Pa. BB Summit Site.

The site contains slides I prepared on network neutrality including two case studies.

Wednesday, September 15, 2010

Which is the Primary Driver of Telecom Investment: Strategic Opportunities or Deregulation?

Incumbent carriers have spent millions on a campaign aiming to convince legislators and regulators that regulation all but eliminates incentives to invest in plant—particularly next generation networks.  The campaign also tries to make deregulation appear as the single greatest “incentivizer” for such investments.

Forget about strategic opportunities, the broader business cycle, the cost of capital, technological change and declining market share in core industry sectors.  What really matters is coming up with a way to dislodge the FCC and other government agencies from regulating.  Then and only then can the market drive investment decisions.

So let’s look at recent instances where incumbent carriers want to make investments.  Using the simplistic premise these ventures have spent millions to pitch, money should flow more freely into sectors recently subject to less regulation.  If deregulation is the primary driver—or apparently the only one—then investment should take the route where regulation offers the least degree of resistance and “disincentivization.”
           
Consider the primary multi-billion dollar investment goals of Comcast and Bell Canada Enterprises (BCE), Canada's largest phone company and 9th largest corporation.  If deregulation drives investment decisions, then Comcast must want to acquire NBC because Congress and the FCC have streamlined and reduced regulatory oversight.  Similar deregulation must be occurring in Canada as BCE wants to acquire complete control of CTV, a major broadcast television network.
           
In reality two major cable and telephone companies want to vertically integrate and acquire content for strategic reasons having very little to do with regulation and recent changes in the scope of government oversight.  Broadcast deregulation did not make NBC and CTV more attractive.  The long term viability of Comcast and BCE drove these companies to think control of content might provide greater profitability in the long run.

Comcast and other incumbents have successfully framed regulation and deregulation as the primary drivers of whether such companies will employ more people and invest more money, at the very same time as billions in retained earnings flow to buying still highly regulated assets.
           

Wednesday, September 8, 2010

How Granular Do FCC Orders Have to Be?

When the FCC went about the task of executing its statutory mandate to promote unbundled access to incumbent carrier facilities and services, appellate courts chided the Commission for the lack of “granularity,” geographic specificity and market-based analysis in orders requiring nationwide access. Despite an ongoing commitment to promote access and competition, the FCC has eliminated any such requirement, either out of exhaustion, or the flawed conclusion that facilities-based competition exists or soon will thrive.

While researching what few access and interconnection requirements remain, I again looked at a 2009 D.C. Circuit case where the court affirmed the FCC’s elimination of dominant carrier regulation of special access, i.e., local and middle mile interconnections. See Ad Hoc Telecommunications Users Committee v. FCC, 572 F.3d 903 (D.C. Cir. 2009) at Ad Hoc Telecommunications Users Committee.

In concluding the FCC did not act arbitrarily, the court quoted from the case where it validated the FCC’s decision not to require unbundling of broadband access service elements: “the [Communications Act] does not compel a ‘particular mode of market analysis or level of geographic rigor.’” But when the court reviewed the FCC’s unbundling requirements, the Commission had to show market-specific and geographic-specific rigor:

“Instead, the FCC must establish unbundling criteria that take into account ‘relevant market characteristics,’ which capture ‘significant variation’” [quoting United States Telecom Association v. FCC, 359 F.3d 554, 576 (D.C. Cir.), cert. denied sub nom. Nat’l Ass’n of Regulatory Utility Comm’rs v. United States Telecom Ass’n, 543 U.S. 925, 125 S.Ct. 313, 160 L.Ed.2d 223 (2004)]. Covad Communications, Inc. v. FCC, 450 F.3d 528 (D.C. Cir. 2006) (affirming FCC’s abandonment of most unbundling requirements). The court required the FCC to sensibly define the relevant markets, connect those markets to the FCC’s determination whether a competitive local carrier would be competitively impaired if it lacked such access, and consider whether the element in question is “significantly deployed on a competitive basis.” 359 F.3d at 574.

So if I get this correctly: when it comes to requiring incumbent carriers to perform common carrier interconnection duties, for compensation at rates never deemed insufficient by the FCC, the Commission had better calibrate tightly its market and geographic scope of coverage analysis. But when it comes to deregulation, the Commission can point to the possibility of competition arising somewhere and somehow.

Friday, September 3, 2010

Broadband Penetration in the U.S.: Saturated, Recession-Affected, or Pricing Out Many?

In addition to providing a better sense of what specific broadband service options consumers have in more narrowly drawn geographic areas, the FCC’s most recent statistics on broadband show a significant decline in new subscriptions. See INTERNET ACCESS SERVICES: STATUS AS OF JUNE 30, 2009 (September 2010); available at: June 2009 FCC BB Stats.

The Commission also reports that as of June 2009 there were 61 reportable residential fixed-location connections per 100 households, with 56 connections per 100 households operating at advertised speeds in excess of 768 kbps downstream and only 27 connections per 100 households operating at advertised speeds near the broadband availability target--actual download speeds of at least 4 Mbps and actual upload speeds of at least 1 Mbps--recommended in the National Broadband Plan.
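
Restated as rough shares (simple arithmetic on the Commission's per-100-household counts):

# A simple restatement of the Commission's per-100-household counts as shares.

fixed_connections = 61   # residential fixed connections per 100 households
over_768kbps      = 56   # of which, advertised downstream speed above 768 kbps
near_target       = 27   # near the 4 Mbps down / 1 Mbps up plan target

print(f"Households with a fixed connection: {fixed_connections}%")
print(f"Connections above 768 kbps: {over_768kbps / fixed_connections:.0%}")
print(f"Connections near the 4/1 Mbps target: {near_target / fixed_connections:.0%}")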

Does a downturn in new broadband statistics point to market saturation? It sure seems as though major broadband carriers are content with their subscription numbers. For example, Comcast recently raised the price of both its service tiers by $2. In light of the comparatively high rates charged in the United States, $30-60 a month, a significant portion of Americans do not appear willing to pay. Alternatively, current economic conditions might have forced prospective users to hold back.

Wednesday, September 1, 2010

Network Neutrality Debate Down to Two Issues?

A few days after Google and Verizon announced their qualified open Internet proposal, the FCC has opted to issue a Public Notice seeking comment on the two most controversial aspects of the proposal: the exclusion of specialized and wireless services from open access requirements. [1] The Public Notice appears to infer that the compromises reached by Google and Verizon have broader acceptance and represent “narrowed disagreement on many of the key elements of the [open Internet] framework proposed [by the Commission in its] NPRM.” [2]

The Commission appears to infer that the Google and Verizon proposal, along with comments filed in the Open Internet NPRM, evidence consensus on much of what the Commission had proposed:

1) that broadband providers should not prevent users from sending and receiving the lawful content of their choice, using the lawful applications and services of their choice, and connecting the nonharmful devices of their choice to the network, at least on fixed or wireline broadband platforms;

2) that broadband providers should be transparent regarding their network management practices;

3) that with respect to the handling of lawful traffic, some form of anti-discrimination protection is appropriate, at least on fixed or wireline broadband platforms;

4) that broadband providers must be able to reasonably manage their networks, including through appropriate and tailored mechanisms that reduce the effects of congestion or address traffic that is unwanted by users or harmful to the network; and

5) that in light of rapid technological and market change, enforcing high-level rules of the road through case-by-case adjudication, informed by engineering expertise, is a better policy approach than promulgating detailed, prescriptive rules that may have consequences that are difficult to foresee.

Whether the product of wishful thinking, or proper confidence in the ability of Google and Verizon to persuade all other stakeholders that consensus has been reached, the FCC seems to think only two issues remain unresolved and in need of “further inquiry.” The FCC wants to receive further comments on specialized services so that the Commission might find a way to promote investment and use of such options, but not in ways that bypass open Internet protections, provide a loophole that causes the open Internet to wither as an open platform and/or provide an opportunity for broadband providers to engage in anticompetitive conduct. The Commission emphasizes the need for clarity as to what types of services qualify as “specialized” as well as transparent and full disclosure by providers of specialized services.

Noting that Google and Verizon have proposed to exempt wireless broadband from all open Internet requirements, except for transparency, and acknowledging that new pricing plans that cap broadband usage could violate open Internet principles, the FCC invited more comments about mobile wireless platforms. The Commission framed its request for additional comments in terms of what would constitute adequate transparency in disclosing limits and non-neutral restrictions, whether new technologies harm or facilitate the ability of subscribers to attach devices to mobile wireless networks, and what restrictions service providers should be able to impose on open access to any available software or application.

[1] Public Notice, Further Inquiry into Two Under-Developed Issues in the Open Internet Proceeding, GN Docket No. 09-191, WC Docket No. 07-52, DA 10-1667 (rel. Sep. 1, 2010); available at: http://hraunfoss.fcc.gov/edocs_public/attachmatch/DA-10-1667A1.doc.

[2] Id. at 1. The Open Internet NPRM refers to Preserving the Open Internet; Broadband Industry Practices, GN Docket No. 09-191, WC Docket No. 07-52, Notice of Proposed Rulemaking, 24 FCC Rcd 13064 (2009).

Friday, August 20, 2010

Improving the FCC’s Data Collection and Disclosure Practices

I filed comments in the FCC’s inquiry into how it can improve its data collection practices in the Media, Wireless Telecommunications and Wireline Competition Bureaus (MB Docket No. 10-103, WT Docket No. 10-131 and WC Docket No. 10-132); available at: FCC Data Inquiry.

Here are my suggestions:

1) Refuse to grant blanket trade secret/confidentiality requests from stakeholders, particularly where a statutory mandate obligates the Commission to identify instances where the lack of competition or availability of even a single service provider frustrates achievement of national goals. The Commission should not redact, sanitize and obscure data, the disclosure of which would serve the public interest, help the Commission achieve statutory goals, and would not cause any financial or competitive harm to the reporting party;

2) Establish a rebuttable presumption that the public is entitled to understandable, credible, granular, and reproducible statistics, based on reasonable benchmarks that can help the Commission and users of the data make valid comparisons;

3) Place the burden on acquiring ventures to demonstrate that acquisitions will not adversely impact competition and the public interest;

4) Distinguish between data and sponsored research/advocacy;

5) Use peer review and third party research; and

6) Eschew reliance on ex parte presentations and brokering deals/concessions among major stakeholders; return to hearings, fact finding and creation of a comprehensive evidentiary record.

Monday, August 16, 2010

Does Granny Need a Platinum Plan to Get Her Mission Critical Medical Bits Timely Delivered?

In previous posts and academic writings I have parted with network neutrality advocates who want absolute parity of access. I agree with Wall Street analyst Craig Moffett (see http://tech.fortune.cnn.com/2010/08/11/net-neutrality-for-wireless-dont-count-on-it/) that heart pacemaker telemetry should get priority over surfing squirrels on the Internet, but only when current network conditions necessitate such prioritization. This is key: if under current network conditions the telemetry feed would get near instantaneous routing, then what good would absolute prioritization do?

Generally, the surfing squirrel video clip and the telemetry feed experience no congestion and hold up fine. If and only if delays, dropped packets, resend commands and any other problem would likely occur under the best efforts norm, then I would support better than best efforts routing.

Even the Google-Verizon proposal acknowledges that the prioritization accorded specialized and managed traffic should not become so widespread as to eliminate the plain vanilla best efforts option. These two players appear to state that better than best efforts should apply solely to “additional or differentiated services” and that the exception should not swallow up the norm.

But of course this requires everyone to be on their best, yankee doodle-dandy behavior, eschewing any and all opportunities to tilt the competitive playing field in favor of a corporate affiliate or favored venture. I have noted that at least insofar as cable television operators are concerned, the FCC worries when a venture has both the incentive and the ability to act anticompetitively. No one has to disparage Comcast and the character of its managers to note that blocking, distorting and throttling peer-to-peer traffic might directly or indirectly handicap a technological alternative to video on demand and pay per view.

I would like to think that the template Google and Verizon now join in advocating is not a front for partitioning ISP networks to all but guarantee that medical telemetry subscribers will have to buy the platinum plan.

Tuesday, August 10, 2010

How Clever is That?—More Thoughts on the Google-Verizon Deal

As I reflect on the Google-Verizon “Legislative Framework Proposal” I increasingly marvel at its cleverness. First consider the title. As legislation, much less a bill, appears quite unlikely for the foreseeable future, Google and Verizon actually have targeted the FCC. Of course if the ventures had gone by the book they would have petitioned the FCC for rulemaking and the Commission would have invited comments from interested parties with an eye toward generating a complete evidentiary record. That won’t happen in this case.

Others have noted that the managed services exception creates a gaping loophole. When I wrote about allowing ISPs to offer “better than best efforts routing,” I considered ad hoc, special events, such as college basketball tournaments, not a bifurcated Internet. Additionally I specified that ISPs should not partition their networks—even if it is their property—in such a way as to guarantee that “best efforts” regularly results in dropped packets and degraded service. The deal exempts “additional or differentiated services” “distinguishable in scope and purpose from broadband Internet access service.” With that kind of definition we can expect disagreements. Consider how stakeholders manipulate words like “robust competition.”

The proposal contains the following language that appears to differentiate between oversee and regulate. The FCC “would have exclusive authority to oversee broadband Internet access service,” but “[r]egulatory authorities would not be permitted to regulate broadband Internet access service.” Are Google and Verizon trying to make a metaphysical distinction between regulation with a direct statutory link and stakeholder consent to oversight? Or does the deal have a missing first word in the second sentence: State?

Monday, August 9, 2010

The Good, the Bad and the Ugly in the Google-Verizon Legislative Framework

Google and Verizon have developed a “Proposal” on Internet access which I am sure they expect to serve as a template, starting point and frame of reference going forward. See Google-Vz Deal. In light of the FCC’s judicial reversal in the Comcast case, the absence of substantive progress at the FCC and the unlikelihood of congressional action, two major stakeholders can and have taken the lead.
It should come as no surprise that Verizon and Google have emphasized and begrudgingly compromised on their corporate interests. Any support for an “open Internet” and consumer protection is subordinate, or the product of serendipity.

The Good.

The proposal embraces three of the four Internet policies briefly articulated by the FCC in 2005. They pass on endorsing competition, but at least indirectly now agree that the 1968 Carterfone policy applies to the Internet, wired and wireless. Additionally, they support non-discrimination, albeit not applicable to wireless and conditioned by the ambiguous modifier “undue.” Google and Verizon accept exclusive FCC jurisdiction to oversee broadband Internet access service along with hefty fines for violations. The companies wisely exempt software applications, content or services from FCC jurisdiction.

The Bad.

Google apparently caved on applying the conditional non-discrimination policy to wireless, an increasingly important broadband medium. The rationale for exempting wireless does not pass the smell test: “the unique technical and operational characteristics of wireless networks, and the competitive and still-developing nature of wireless broadband.” The technical and operational aspects of wireless strongly necessitate the non-discrimination requirement. Spectrum scarcity, ISPs’ incentive and ability to discriminate in favor of corporate affiliates or favored third parties, deep packet inspection, traffic throttling, and the ability of ISPs to obscure discriminatory practices necessitate FCC scrutiny of discriminatory and anticompetitive practices. Wireless carriers in the U.S. operate in a mature market with near saturation in the voice market. The top four carriers control over 92% of the market, and the top two, with over 60% share (Verizon and AT&T), are vertically integrated and have substantial wireline broadband market power. This is hardly an infant industry in need of government nurturing.

Increasingly consumers will use wireless broadband as their preferred medium for Internet access. I have parted with network neutrality advocates by supporting some types of price and Quality of Service discrimination, including “better than best efforts” routing. However, abandoning scrutiny of discriminatory practices all but guarantees that ISPs will migrate from controversial, but lawful practices, into the realm of what Comcast did and beyond.

The Ugly.

A vacuum of leadership, initiative and follow through has provided Google and Verizon with this opportunity to help shape the agenda and frame the issues. The FCC has a history of deferring to industry to compromise and reach consensus. In old school telephony, the major interconnection and revenue sharing arrangements for decades occurred when the National Association of Regulatory Utility Commissioners decided it was time. The NARUC-managed deals took the name of the location where the association members met, e.g., The Ozark Plan. So there is a history of stakeholders making the deal.

Still I feel as though the “fix is in” when major stakeholders can cut a deal and move on to the main goal of “enhancing shareholder value.” I would like to see the addition of “in a socially responsible manner,” but that may be too much to expect even from “do no evil” Google.

Thursday, August 5, 2010

About Those “Mission Critical” Bits

News that Google and Verizon are negotiating “better than best efforts” Internet routing probably comes across as a betrayal of sorts to network neutrality advocates. See http://www.nytimes.com/2010/08/05/technology/05secret.html?partner=rss&emc=rss. Bear in mind that Information Service Providers (“ISPs”) do not file public contracts known as tariffs and have the freedom to negotiate deals with individual clients. On the other hand ISPs, regardless of their FCC regulatory classification, cannot engage in unfair trade practices that achieve anticompetitive goals such as tilting the competitive playing field in favor of a corporate affiliate, or special third party.


In my work on network neutrality, I have considered what ISPs can do to provide upstream content providers service enhancements. When March Madness arrives with the college basketball tournaments, I want content providers to have the option of securing priority treatment of their video bits. Streaming video has greater bandwidth and bit rate requirements, and ISPs should have the option of providing greater assurances that the bits will arrive on time. IPTV consumers will not tolerate a slide show presentation of video streams coupled with frozen frames, artifacts and other service glitches.

So what’s fair and what’s not?

It should be a foregone conclusion by now that ISPs have the option of diversifying service away from a plain vanilla, “one size fits all” business model. Better than best efforts traffic routing can represent a legitimate response to consumer requirements. Put another way, many forms of price and quality of service differentiation seem reasonable if ISPs operate in a transparent and nondiscriminatory manner. This means that ISPs should have the option of offering better than best efforts, but not solely to one “most favored” venture, and in a way that guarantees increasingly inferior service to everyone else. ISPs should not deliberately drop packets to discipline or punish specific content providers who have opted not to pay more for superior service. ISPs should not partition their bandwidth so that the plain vanilla users face certain congestion.
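
Here is a toy sketch, with invented loads and a strict-priority scheduler, of why bandwidth partitioning can guarantee congestion for the plain vanilla tier:

# A toy model, with invented numbers, of why partitioning matters: a link
# serves "priority" traffic first each interval and best-effort traffic gets
# whatever capacity remains.

def serve_interval(capacity, priority_load, best_effort_load):
    """Return (priority_served, best_effort_served, best_effort_dropped)."""
    priority_served = min(priority_load, capacity)
    leftover = capacity - priority_served
    best_effort_served = min(best_effort_load, leftover)
    return priority_served, best_effort_served, best_effort_load - best_effort_served

# Modest priority load: best-effort traffic still gets through.
print(serve_interval(capacity=100, priority_load=30, best_effort_load=60))   # (30, 60, 0)

# Priority sold up to nearly the whole link: best-effort packets are dropped
# every interval -- the "certain congestion" worry described above.
print(serve_interval(capacity=100, priority_load=95, best_effort_load=60))   # (95, 5, 55)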

I do not recall reading or hearing any network neutrality advocate condemning the services of Akamai and other ventures that enhance Internet traffic routing. Perhaps all these companies limit their enhancement to reducing the number of router hops and distributing content closer to end users so that the final leg or two, still routed via best efforts, will not degrade performance. So if Akamai offers permissible enhancements, what is wrong with ISPs themselves providing similar enhancements? Perhaps it is ISPs’ ability and incentive to engage in harmful meddling with traffic coupled with opportunities to do so in a stealth mode not easily detected, or remedied by regulatory agencies. So distrust, and at least some instances that corroborate it, drive some of the network neutrality advocacy.

ISPs cannot simply provide reassurances, voiced by CEOs, that they would never block or degrade service, particularly now that the FCC lacks jurisdiction and the FTC apparently lacks interest in enforcing such commitments. Similarly I am not keen on the FCC brokering some grand deal negotiated by select stakeholders. Ideally Congress should enact a specific and narrow mandate for the FCC to enforce ISP transparency and non-discrimination, not as Title II regulated common carriers, but as information service providers subject to specific, straightforward and reasonable expectations.

If the Google-Verizon deal results in legislation, or specific and enforceable ISP commitments, then the outcome won’t be all bad.

Monday, August 2, 2010

Political and Economic Lessons from a Spent Water Heater

If you have the privilege of owning a home for more than a few years, you probably have encountered a water heater that has reached “end of life.” More often than not, water leaks (flows!) from the unit. As with downed trees from a thunderstorm—another of my homeownership travails—one needs to act quickly.
Lucky for me, my eight-year-old water heater had a nine-year warranty providing for a replacement of like quality. A manager of the big box store where I bought the unit first attempted to “honor” the warranty by applying the cost of the old water heater against the higher cost of the replacement. Unlike tire warranties, the manufacturer warranted a specific longevity and guaranteed a replacement “free of charge.” The replacement water heater cost twice as much as the original unit.

Here’s where the water heater replacement becomes instructive: the store manager finally agreed to replace the water heater, but offered insights on why the price had doubled. The federal government “took over” and now “controls” the water heater marketplace with its efficiency mandates and more demanding criteria to qualify units for the Energy Star rating.

My big box store manager friend shows how issue framing works. Government regulation, which he readily acknowledged will ensure less electricity use by the water heater, has forced his employer to double the price of the unit. Worse yet, consumers might not realize any out-of-pocket saving from the lower electricity use, because the replacement heater cost $200 more than the original one. The store manager and consumers buying into the government control frame ignore long term benefits like marginally less demand for fossil fuels, slightly less need for more electricity generators, and a small offset to global warming—if such a thing exists.

The regulatory frame considers efficiency requirements as helping society. The government takeover frame shows government subverting the marketplace. Bear in mind that an unregulated marketplace for water heaters does not guarantee choice of units based on efficiency, particularly if no government regulation mandates disclosure of an efficiency rating and estimated annual cost of operation. And at least in my small central Pennsylvania town, the market does not guarantee competitive prices: the two major big box stores across the street from each other remarkably had nearly identical prices, just like those “robustly competitive” wireless carriers.

The unfettered marketplace probably would offer consumers choices based on length of warranty and herein lies a lesson that supports either frame one buys. While I ultimately got the free replacement (plus $200 if I wanted installation), the new water heater warranty has three new conditions should the replacement unit fail: 1) consumers can only apply the original price they paid against whatever the price is when a replacement is needed; 2) the warranty only applies if the consumer stays in the house where the water heater is installed; and 3) consumers can only qualify for one warranty replacement.

So are these warranty restrictions necessary safeguards manufacturers must impose to guard against government regulations that are guaranteed to trigger a doubling of costs? Or might water heater manufacturers become more clever about finding ways to market warranties without having to honor them, like rebates that companies don’t have to honor because a buyer failed to comply with some bogus condition?

I now better understand the importance of how regulatory and policy issues are framed by stakeholders and the media. Opponents to regulation can frame it as government controlling yet another sector of the economy. Proponents of an unregulated marketplace can readily ignore the fact that if regulation had controlled or taken over the water heater business, the government would not permit manufacturers complete freedom to limit and further limit their warranties to a point where they come close to a “bait and switch” tactic, necessitating additional “buyer protection” payments.

Wednesday, July 28, 2010

Lies, Damn Lies and Statistics at the Federal Communications Commission

The Federal Communications Commission recently discovered that 14 to 24 million Americans, located in 1,024 out of the nation’s 3,230 counties, do not have access to any broadband service at any price. This finding greatly contrasts with the Commission’s numerous previous statements that an unregulated and robustly competitive marketplace has provided universally accessible broadband at affordable rates just about everywhere. In reality the FCC could conclude that “broadband is being reasonably and timely deployed to all Americans” only by using false data.

It should come as no surprise that the FCC could so miss the mark on actual broadband access. The agency is awash in partisanship, pseudo science, fuzzy math, creative interpretation of economic principles and legal concepts, selective interpretation of the facts, innovative collection of statistics, and flawed thinking. These defects support results-driven decision making where FCC managers first reach a decision and subsequently support that outcome by framing the policy issues, “finding” facts and compiling data in ways that rationalize the preordained conclusion.

The FCC lacks the resources or resolve to compile a record independent of what parties with a financial stake file when the Commission seeks public comments. This means that the FCC does not have an unbiased, empirical record that would meet a threshold standard of fairness and reliability assessed by independent third parties, a process known as peer review. Because the FCC relies on data compiled by stakeholders, the Commission typically lacks the ability to differentiate credible research from “cooked books.” By relying on data compiled by the companies it regulates, the Commission regularly agrees to treat the information as proprietary, making it impossible for third parties to corroborate or refute the evidence used by the FCC to support its decisions.

In the case of broadband the FCC’s commitment to confidentiality has gone so far as to deem as “trade secrets” data about whether a carrier does or does not operate in a specific locality. Trade secrets typically refer to essential business information such as food and beverage recipes, but the FCC has managed to equate information about broadband accessibility with a company’s most essential assets. Bear in mind that the Commission must act on a congressional mandate to identify and remedy broadband access scarcity.

Notwithstanding a statutory obligation to track broadband access closely, the FCC purposely overstated the scope of market competition and how well carriers had made service available. The FCC defined broadband in 1999 as a bit transmission speed of at least 200 kilobits per second in one direction. The FCC retained that now woefully inadequate bit rate until this year when it acknowledged that many Internet services require higher speeds. The Commission also used zip codes as the most focused geographical measure for broadband market penetration until this year. The Commission could reach its conclusion of 99+ percent market penetration by claiming “mission accomplished” for the entire zip code if at least one subscription opportunity existed somewhere within the zip code.
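
A hypothetical illustration of how that zip-code-level test inflates the headline number; the household counts are invented for the example:

# Invented household counts illustrating the methodology described above: if
# one subscription opportunity anywhere in a zip code marks the whole zip code
# as served, the headline statistic can vastly overstate actual access.

zip_codes = {
    # zip: (total households, households a provider could actually serve)
    "zip_A": (10_000, 10_000),
    "zip_B": (10_000, 150),   # one small pocket of service
    "zip_C": (10_000, 40),    # a single served street
}

total = sum(households for households, _ in zip_codes.values())

# Zip-code-level test: "served" if at least one household could subscribe.
zip_level = sum(households for households, served in zip_codes.values() if served > 0)

# Household-level test: count only households that could actually subscribe.
household_level = sum(served for _, served in zip_codes.values())

print(f"Zip-code method:  {zip_level / total:.0%} 'covered'")      # 100%
print(f"Household method: {household_level / total:.0%} covered")  # 34%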

What statistics the FCC compiles and how the Commission interprets the data have a substantial impact on how the agency interprets its regulatory mission. If the FCC wants to deregulate and abandon existing public interest safeguards, the Commission can claim evidence proves a robustly competitive marketplace can self-regulate. Until this year the FCC considered the wireless marketplace so competitive that the Commission could deem procompetitive numerous horizontal mergers where one competitor buys out another and acquires additional market share. The Commission’s most recent analysis of the wireless marketplace makes only passing reference to the fact that the national carriers Verizon and AT&T have over a 60% market share, four national carriers control over 90%, and the rate of market concentration has grown in light of FCC-approved acquisitions so much so that it now well exceeds the Justice Department’s threshold for a “highly concentrated” market. The Commission also reports that U.S. wireless carriers enjoy healthy returns led by Verizon with an enviable 46.3% margin for the second quarter of 2009.
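
For readers who want the arithmetic behind the concentration point, here is a sketch using the Herfindahl-Hirschman Index with hypothetical market shares chosen only to roughly match the shares described above:

# Hypothetical shares chosen only to roughly match "top two over 60%, top four
# over 90%"; they are not the FCC's actual figures.  HHI is the sum of squared
# percentage shares; the antitrust agencies treat an HHI above roughly 1,800
# (2,500 under the 2010 guidelines revision) as highly concentrated.

shares = [33, 29, 17, 12, 9]   # percent of the market, illustrative only
hhi = sum(share ** 2 for share in shares)

print(f"Top two: {sum(shares[:2])}%  Top four: {sum(shares[:4])}%  HHI: {hhi}")
# Top two: 62%  Top four: 91%  HHI: 2444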

If the FCC wants to expand its regulatory wingspan, the Commission can claim the evidence supports the need to curb market power. A former FCC Chairman, normally averse to regulatory expansion, nevertheless wanted to further regulate cable television operators based on his perception that the industry had become too dominant. Relying on data that FCC staff did not compile, and that were highly questionable in light of market conditions favoring more competition from satellite and telephone companies, this Chairman believed that cable market penetration had reached a congressionally drawn threshold. Neither the Chairman nor his staff could generate empirical data to support this conclusion.

The FCC can rely on poor fact finding only if reviewing courts accept such practices as worthy of judicial deference to the agency’s expertise. Some courts appear unwilling to second-guess the Commission, but others readily find flaws. Examples of the latter include a court’s refusal to allow the FCC to count every media outlet in a market as an equal, regardless of significance and market share.

As information, communications and entertainment become an increasingly significant component of the economy, we cannot afford to have the FCC ignore instances where market self-regulation does not serve the national interest. The FCC has undertaken some recent efforts to improve its statistical compilations, but longstanding institutional flaws remain.

Thursday, July 22, 2010

Identifying Areas in the U.S. Lacking Any Broadband Options

Despite previous proclamations of near ubiquitous broadband access in the United States, the FCC, now using smaller and more numerous counties instead of zip codes and requiring far more than the previous 200 kilobits per second floor, acknowledges that significant numbers of Americans residing in largely rural, low-income areas lack any access at all. [1] The Commission now acknowledges “that broadband deployment to all Americans is not reasonable and timely. This conclusion departs from previous broadband deployment reports, which held that even though certain groups of Americans were not receiving timely access to broadband, broadband deployment ‘overall’ was reasonable and timely.” [2]


The Sixth Broadband Deployment Report confirms that a sizeable number of Americans have no broadband access whatsoever, or have access that does not meet the National Broadband Plan goal of affordable service with download speeds of at least 4 megabits per second (“Mbps”) and upload speeds of at least 1 Mbps. [3] The FCC recognized that the prior 200 kilobit per second rate, in either direction, “simply is not enough bandwidth to enable a user, using current technology, ‘to originate and receive high-quality voice, data, graphics, and video telecommunications,’ as section 706 [of the Telecommunications Act of 1996] requires of such services.” [4]

Using the higher bit rate threshold, the FCC estimates that 1,024 of the 3,230 counties in the United States and its territories are unserved by broadband, [5] and that between approximately 14 and 24 million Americans do not have access to broadband today. [6] The Commission makes a number of candid acknowledgements:

The . . . [unserved] group appears to be disproportionately lower-income Americans and Americans who live in rural areas. The goal of the statute, and the standard against which we measure our progress, is universal broadband availability. We have not achieved this goal today, nor does it appear that we will achieve success without changes to present policies. The evidence further indicates that market forces alone are unlikely to ensure that the unserved minority of Americans will be able to obtain the benefits of broadband anytime in the near future. Therefore, if we remain on our current course, a large number of Americans likely will remain excluded from the significant benefits of broadband that most other Americans can access today. Given the ever-growing importance of broadband to our society, we are unable to conclude that broadband is being reasonably and timely deployed to all Americans in this situation. [7]

As evidenced by the ambitious goals in the National Broadband Plan, the Commission aspires to do a better job of promoting affordable and ubiquitous access going forward.

[1] Inquiry Concerning the Deployment of Advanced Telecommunications Capability to All Americans in a Reasonable and Timely Fashion, and Possible Steps to Accelerate Such Deployment Pursuant to Section 706 of the Telecommunications Act of 1996, as Amended by the Broadband Data Improvement Act, GN Docket No. 09-137, Sixth Broadband Deployment Report (rel. July 20, 2010); available at: http://hraunfoss.fcc.gov/edocs_public/attachmatch/FCC-10-129A1.pdf [hereinafter cited as Sixth Broadband Deployment Report].


[2] Id. at ¶2.

[3] See FCC, OMNIBUS BROADBAND INITIATIVE (OBI), CONNECTING AMERICA: THE NATIONAL BROADBAND PLAN, GN Docket No. 09-51 (2010) (NATIONAL BROADBAND PLAN); Inquiry Concerning the Deployment of Advanced Telecommunications Capability to All Americans in a Reasonable and Timely Fashion, and Possible Steps to Accelerate Such Deployment Pursuant to Section 706 of the Telecommunications Act of 1996, as Amended by the Broadband Data Improvement Act; A National Broadband Plan for Our Future, GN Docket Nos. 09-51, 09-137, 2010 W.L. 972375 (rel. March 16, 2010). See also, National Broadband Plan, World Wide Web Site, http://www.broadband.gov/plan/.

[4] Sixth Broadband Deployment Report at ¶10.

[5] Id. at ¶22.

[6] Id. at ¶28. The Commission previously reported that about 80 million Americans either do not have access to, or do not subscribe to, an available broadband service.

[7] Id.

Wednesday, July 21, 2010

Comcast Violates an Unwritten Law

When a venture has petitioned the FCC to approve an acquisition, or to provide greater regulatory relief, an unwritten law requires the petitioner to lie low. Comcast has violated this law by imposing, for the second time this year in Pennsylvania, a significant rate hike well in excess of the rate of inflation at the very same time it wants the FCC to approve the company’s acquisition of NBC. Not smart.


Comcast surely must know that the optics of buying NBC do not look great, no matter how much Comcast says the transaction will promote competition. Raising both cable and Internet access rates reinforces the notion that the company has market power and faces limited competitive constraints. Of course this is the very company that does not want to pay anything for retransmission consent to carry broadcast television stations, even as it charges subscribers $14.50 for the privilege. This company has fought a decade-long battle to stymie alternatives to lucrative set top box rentals and now wants the FCC to abandon any efforts to promote CableCards and other alternatives.

It seems as though Comcast cannot lie low even at a time when it wants a big gift from the FCC. The least the company could do is throw a gold-plated crumb to subscribers.

Monday, July 19, 2010

Content Deregulation and the Second Circuit Court Opinion on Fleeting Expletives

Acting on a remand from the Supreme Court, which had upheld on procedural grounds the FCC’s increasingly stringent rules limiting broadcast indecency, [1] the Second Circuit Court of Appeals considered First Amendment arguments and concluded that the FCC’s new rules are unconstitutionally vague. [2] Unlike the Supreme Court, which had deferred to the FCC’s decision to tighten its rules, including sanctioning broadcast stations for failing to bleep out spontaneously uttered, “fleeting” expletives, the Second Circuit held that the FCC created a chilling effect [3] that goes far beyond whether broadcasters can let slip the occasional profane word.


The Second Circuit held that the FCC owes broadcasters the same degree of clarity about the Commission’s rules regardless of whether content regulations trigger a lower level of scrutiny by courts, [4] as the Supreme Court had deemed appropriate in F.C.C. v. Pacifica Foundation, 438 U.S. 726 (1978), where the Court approved an FCC fine for the 2 p.m. broadcast of a George Carlin monologue containing several expletives, based on the unique pervasiveness of broadcasting and its accessibility by children.

The Second Circuit documented how the FCC shifted its enforcement without clear guidance and with inconsistent application. With an emphasis on context, the FCC found no violation when profane words were used in the film Saving Private Ryan, yet the very same words were considered actionable when uttered in a documentary about the Blues music genre. Such inconsistency “raises grave concerns under the First Amendment” [5] because “nothing would prevent the FCC from applying its indecency policy in a discriminatory manner in the future.” [6]

With such a clear endorsement of media First Amendment freedom, some observers have speculated or hoped that the judiciary, including the Supreme Court, will be inclined to reject more limitations on expression and medium-based analysis of the First Amendment. [7] While acknowledging that it must apply Supreme Court precedent, the Second Circuit strongly implied that “today’s realities” [8] do not support a dichotomy in the degree of permissible government regulation and nature of judicial scrutiny between broadcasting and other media such as the Internet and cable television. [9] Asserting that broadcast television no longer has a uniquely pervasive presence, the court emphasized the substantial changes in the media landscape since the Supreme Court decided the Pacifica case:

The past thirty years has seen an explosion of media sources, and broadcast television has become only one voice in the chorus. Cable television is almost as pervasive as broadcast - almost 87 percent of households subscribe to a cable or satellite service - and most viewers can alternate between broadcast and non-broadcast channels with a click of their remote control. . . . The internet, too, has become omnipresent, offering access to everything from viral videos to feature films and, yes, even broadcast television programs. . . . As the FCC itself acknowledges, “[c]hildren today live in a media environment that is dramatically different from the one in which their parents and grandparents grew up decades ago.” [10]

The Second Circuit argues that broadcasters should qualify for the same insulation from government content regulation as other media in light of enhanced consumer choice and the ability of parents to program their televisions, using the V-Chip, to block children’s access to harmful content. [11]

Notwithstanding the Second Circuit’s determination of changed circumstances and the diminution of broadcasters’ market power and influence, [12] the Supreme Court may not follow through with decisions that diminish continuing government oversight and regulation of content potentially harmful to children. The FCC may decline to appeal the case, and the Court may not grant certiorari if it does. Despite significant changes in the media marketplace, the Court did not grant certiorari in Cablevision Systems Corp. v. F.C.C., [13] where a lower court affirmed the FCC’s must-carry rules that resulted in the carriage of an upstate New York television station by cable systems serving cities on Long Island.

Bear in mind that when the Supreme Court first considered the fleeting expletives case, a majority of the Court endorsed the FCC’s actions, albeit on administrative law grounds. Justice Scalia, the author of the majority decision, made it clear that he believes broadcasters continue to have the ability to harm children. Justice Scalia believes that the harm of subjecting this assumption to scientific inquiry obviates the need for the FCC to acquire empirical evidence:

There are some propositions for which scant empirical evidence can be marshaled, and the harmful effect of broadcast profanity on children is one of them. One cannot demand a multiyear controlled study, in which some children are intentionally exposed to indecent broadcasts (and insulated from all other indecency), and others are shielded from all indecency. . . . Here it suffices to know that children mimic the behavior they observe-or at least the behavior that is presented to them as normal and appropriate. Programming replete with one-word indecent expletives will tend to produce children who use (at least) one-word indecent expletives. Congress has made the determination that indecent material is harmful to children, and has left enforcement of the ban to the Commission. If enforcement had to be supported by empirical data, the ban would effectively be a nullity. [14]

Justice Scalia’s concern for children may have constituted one of the reasons the Court has agreed to consider whether states can restrict minors’ access to potentially harmful video games. [15]

[1] F.C.C. v. Fox Television Stations, Inc., 129 S.Ct. 1800, 1813 (2009) (holding not arbitrary or capricious the FCC’s shift from a policy deeming single, non-literal use of an expletive not actionably indecent to one making such use actionable, subject to possible exceptions for bona fide news coverage and artistic necessity).


[2] Fox Television Stations, Inc., v. F.C.C., __F.3d __, 2010 W.L. 2736937 (2d Cir. July 13, 2010).

[3] “Under the current policy, broadcasters must choose between not airing or censoring controversial programs and risking massive fines or possibly even loss of their licenses, and it is not surprising which option they choose. Indeed, there is ample evidence in the record that the FCC's indecency policy has chilled protected speech.” Id. at *14 (Westlaw pagination).

[4] “Broadcasters are entitled to the same degree of clarity as other speakers, even if restrictions on their speech are subject to a lower level of scrutiny. It is the language of the rule, not the medium in which it is applied, that determines whether a law or regulation is impermissibly vague.” Id. at *10.

[5] Id. at *14.

[6] Id.

[7] “The Supreme Court, if it takes up the case, should end all government regulations on the content of broadcasts. Technological change has undermined any justification for limiting the First Amendment rights of broadcast media outlets but not others.” Free Speech for Broadcasters, Too, THE NEW YORK TIMES, editorial (July 16, 2010); available at: http://www.nytimes.com/2010/07/18/opinion/18sun1.html?_r=1&ref=todayspaper.

[8] Fox Television Stations, Inc. 2010 W.L. 2736937 at * 8.

[9] The court compared the assumptions made by courts when evaluating the degree of First Amendment protection applied to indecent speech transmitted by different media, with Internet speakers entitled to full protection, citing Reno v. ACLU, 521 U.S. 844, 874-75 (1997), telephone companies not required to ban entirely “dial-a-porn” calling, citing Sable Communications of California v. F.C.C., 492 U.S. 115, 131 (1989), and cable television companies free to transmit sexually oriented content at any hour. First Amendment infringing broadcast regulation is subject to a lower level of judicial scrutiny based on “the twin pillars of pervasiveness and accessibility to children.” Fox Television Stations, Inc., 2010 W.L. 2736937 at *7, citing Pacifica, 438 U.S. at 748-49.

[10] Fox Television Stations, Inc. 2010 W.L. 2736937 at * 7 (citations omitted).

[11] “We can think of no reason why this rationale for applying strict scrutiny in the case of cable television would not apply with equal force to broadcast television in light of the V-chip technology that is now available.” Id. at *8.

[12] Other courts have undertaken a more nuanced assessment of media competition and market power. For example, in Prometheus Radio Project v. FCC, 373 F.3d 372 (3d Cir. 2004), the Third Circuit Court of Appeals rejected the FCC’s methodology for determining media market competitiveness, which counted the number of outlets regardless of size and impact. The court also noted that major incumbent media ventures had established a significant presence on the Internet, thereby challenging the assumption that the Internet provides new content independent of other media.

[13] 570 F.3d 83 (2d Cir. 2009), cert. den. Cablevision Systems Corp. v. F.C.C., __ S.Ct.__, 2010 W.L. 322891, (U.S. May 17, 2010) (NO. 09-901).

[14] F.C.C. v. Fox Television Stations, Inc., 129 S.Ct. 1800, 1813 (2009).

[15] Video Software Dealers Ass’n v. Schwarzenegger, 556 F.3d 950 (9th Cir. 2009) , cert. granted sub nom., Schwarzenegger v. Entertainment Merchants Ass’n, 130 S.Ct. 2398, (U.S. Apr 26, 2010) (NO. 08-1448).

Monday, June 7, 2010

AYCE and the Third Screen

AT&T recently announced that it plans to abandon All You Can Eat (“AYCE”) unmetered data pricing in favor of usage-based plans. One certainly can appreciate a strategy that eliminates cross-subsidies flowing from light to heavy users. But consider what eliminating AYCE does to the overall conceptualization of wireless broadband.

Metering data consumption promotes efficient, “non-wasteful” consumption, but unmetered use is exactly what subscribers have come to expect. Broadcast and cable television (“first screen”) consumption is not metered, and the degree of financial support from advertisers is based on the amount of consumption. More consumption is better, and the incremental cost to serve such additional demand is nil. Cable modem and DSL wired broadband carriers also offer AYCE to “second screen” computers, presumably because the incremental cost of an additional hour of consumption, while not zero, is either not worth metering, or because the carriers appreciate the commercial and public relations benefits of providing AYCE access.

When consumers have to consider a ticking consumption meter, they likely will consume less of the Internet, so AT&T benefits by disciplining heavy users. But metering also reduces the overall utility most users will accrue. If I am mindful that video downloads will quickly exhaust my monthly downloading (throughput) allowance, I am not going to view or seek out full motion video advertisements that supplement carrier subscription revenues. What AT&T gains from additional gigabyte downloading sales, it might lose in advertising revenues as subscribers exercise greater vigilance to conserve bandwidth.
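
The arithmetic behind that meter mindfulness is stark. The sketch below uses assumed numbers, a hypothetical 2 GB monthly allowance and a roughly 500 kilobit per second mobile video stream, rather than AT&T’s actual plan terms, to show how little video fits under a metered plan.

    # Rough, hypothetical arithmetic: hours of streaming video a metered wireless
    # data allowance can absorb. The cap and bit rate are assumptions for
    # illustration, not AT&T's plan terms.
    cap_gb = 2.0         # assumed monthly allowance in gigabytes (10^9 bytes)
    video_kbps = 500     # assumed mobile video stream, kilobits per second

    bytes_per_hour = video_kbps * 1000 / 8 * 3600   # kilobits/sec -> bytes/hour
    hours_of_video = cap_gb * 1e9 / bytes_per_hour

    print(f"{hours_of_video:.1f} hours of video fit within a {cap_gb:.0f} GB cap")
    # ~8.9 hours per month, or roughly 18 minutes a day, before every trailer,
    # advertisement, or serendipitous click starts to feel like a billable event.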

We can applaud so-called efficient use of broadband, but “meter mindfulness” takes away some of the pleasure and serendipity the Web offers. Additionally, AT&T’s strategy comes across as an acknowledgement that wireless Internet access cannot become the competitive and functional equivalent of wired options: carriers cannot keep up with demand, cannot afford to accommodate heavy users’ demand, or wireless networks simply cannot scale up to accommodate heavy full motion video demand. If any one of these three conditions exists, then wireless devices do not fully operate as “third screens” in light of carrier, bandwidth, and throughput limitations.

Wednesday, May 19, 2010

Handicapping the Viability of the Third Way Model for Internet Access Regulation

Like many I have grave doubts about the viability of the Third Way regulatory model proposed by FCC Chairman Genachowski and General Counsel Austin Schlick; see Federal Communications Commission, Chairman Julius Genachowski, The Third Way: A Narrowly Tailored Broadband Framework (May 6, 2010); available at: http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-297944A1.doc (rejecting both a renewed attempt to find a way to extend Title I ancillary jurisdiction and reclassification of Internet access as a telecommunications service); Austin Schlick, General Counsel, A Third-Way Legal Framework for Addressing the Comcast Dilemma (May 6, 2010); available at: http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-297945A1.doc (providing the legal rationale for narrow application of selected sections of Title II regulatory authority over Internet access). The model comes across as after-the-fact scrambling to rearrange the wingspan of Title II jurisdiction.

On further reflection, I am coming around to the possibility that the Chairman may be able to pull this off. In the absence of regulatory clarity provided by amendments to the Communications Act, the FCC might successfully apply Title II, subject to forbearance, to Internet access bitstreams using four rationales (only one of which Austin Schlick identified in his Third-Way Legal Framework paper of May 6, 2010):

1) Despite a preference for applying a single, least regulatory classification to convergent services, e.g., Internet access, the FCC has explicitly applied two different regulatory classifications to the telephony and Internet access offerings a single venture provides over the same wireless network. When wireless carriers offer telephony (whether to other wireless subscribers or terminating to the wireline Public Switched Telephone Network (“PSTN”)), the Commission applies Title II, subject to forbearance. This is the gist of the Third Way model, and it unquestionably applies to wireless cellular carriers in light of their classification as providers of Commercial Mobile Radio Service (“CMRS”), a Title II service subject to forbearance.

When wireless carriers offer Internet access, or for that matter video services that increasingly look like Internet Protocol Television and Title VI cable communications, the FCC does not apply Title II. The Commission has stated that the information service classification (and Title I alone) applies to wireless Internet access, but it has tried to avoid having to make a call on wireless IPTV/video/cable service.

2) The Commission can explicitly state that it does not have to make an either/or determination about which single statutory classification applies to a convergent service such as Internet access which combines telecommunications and information services along with content. Why not declare that not all forms of Internet access always constitute an information service? The Commission has stated that it thought it needed to sustain and apply a telecommunications service/information service dichotomy, but nothing in the Communications Act explicitly requires it. The Commission has stated only that it thinks “[t]he language and legislative history of [the Communications Act of 1996] indicate[s] that the drafters . . . regarded telecommunications services and information services as mutually exclusive categories.” In re Federal-State Joint Board on Universal Service, 13 F.C.C.R. 11501, 11522-23 (1998); see also Vonage Holdings Corp. v. Minn. PUC, 290 F. Supp. 2d 993, 994, 1000 (D. Minn. 2003).

In essence the FCC would have to acknowledge that Justice Scalia was correct in his dissent in the Brand-X case where he chided the FCC for ignoring the inconvenient truth that ISPs combine telecommunication and information services. Justice Scalia used pizzerias and pizza delivery for his primary analogy and asserted that one could not ignore the fact that pizza baking and pizza delivery constitute two separate elements of the pizza business. He concluded, “[i]t is therefore inevitable that customers will regard the competing cable-modem service as giving them both computing functionality and the physical pipe by which that functionality comes to their computer—both the pizza and the delivery service.” National Cable & Telecommunications Association v. Brand X Internet Services, 125 S. Ct. 2688, 2715 (2005).

3) No one disputed the FCC’s jurisdiction and authority to sanction the Madison River telephone company when the company blocked DSL subscribers’ access to VoIP services. See http://www.fcc.gov/eb/Orders/2005/DA-05-543A2.html. The matter resulted in a voluntary forfeiture of $15,000 by the company in lieu of litigation, and without a complete examination of the jurisdictional basis for asserting authority over a DSL information service. However, the Commission did reserve the option of reviewing any complaints against the company—presumably retroactively and prospectively—under its authority in Sec. 208 (Title II) of the Communications Act. Madison River provides some basis for FCC intervention to safeguard the public interest and assert jurisdiction over the telecommunications links used to provide DSL Internet access.

4) If the FCC can rely on changed circumstances to justify a re-classification from regulated telecommunications service (DSL) to unregulated information service, might further changed circumstances justify another re-classification? Bear in mind that the information service classification applies to competitively provided, non-essential services. Arguably Internet access has become more essential even as it remains a monopoly or duopoly-provided service in many parts of the nation. Perhaps eventually terrestrial and satellite wireless will provide a competitive option, in terms of price, bit rate and monthly throughput quotas, but that is not the case now.

Tuesday, May 18, 2010

New Publication: Case Studies in Abandoned Empiricism and the Lack of Peer Review at the Federal Communications Commission

The Journal on Telecommunications and High Technology Law at the University of Colorado law school has published my article that provides ample evidence to prove that the FCC rarely does an adequate job at fact finding when making decisions having substantial impact on our national economy; see http://jthtl.org/content/articles/V8I2/JTHTLv8i2_Frieden.PDF.

I examine six case studies in which the Commission deregulates, approves a merger, or relaxes ownership and spectrum restrictions. In each instance the FCC makes expansive claims about how the proposed transaction or shift in policy will promote competition, serve the public interest and enhance consumer welfare. The Commission often makes these claims based exclusively on the filings of stakeholders in lieu of internal fact finding and analysis. Reviewing courts often refrain from questioning the FCC’s findings or from rejecting the Commission’s lack of empirical analysis subject to statutorily required peer review.

I offer recommendations for an improved process at the FCC. However, no one should expect progress unless and until reviewing courts do a better job of requiring a match between a policy outcome and the evidence supporting it.

Thursday, May 6, 2010

Network Neutrality and the FCC’s Inability to Calibrate Regulation of Convergent Operators

That FCC Chairman Julius Genachowski is struggling to find a way to calibrate network neutrality and Title I ancillary jurisdiction confirms the difficulty in regulating operators that seamlessly blend carriage and content. See, THE THIRD WAY: A NARROWLY TAILORED BROADBAND FRAMEWORK; available at: http://voices.washingtonpost.com/posttech/genachowski.doc.


Internet Service Providers offer convergent services that blend telecommunications, as in bit transport, with telecommunications services, such as telephony and arguably first and last mile Internet access, with video services, such as Internet Protocol Television, and with information services that ride on top of the bit transmission link. For administrative convenience and not as required by law, the FCC likes to apply an either/or single regulatory classification to convergent operators. Having classified ISPs as information service providers, the Commission unsuccessfully sought to sanction Comcast’s meddling with subscribers’ peer-to-peer traffic. Now Chairman Genachowski wants to further narrow and nuance regulatory oversight without changing the organic information service classification.

Some network neutrality advocates have urged the FCC simply to abandon the information service classification and reclassify aspects of ISP Internet access as Title II, common carrier regulated telecommunications service. Why use tortured and legally suspect analysis to craft an absolute dichotomy?

What is wrong with the FCC acknowledging that providers of convergent services trigger different regulatory classifications as a function of what service they provide? Even though the FCC largely emphasizes wireless carriers’ information services, the Commission occasionally reminds cellular radio service providers that they still operate as common carriers that, for example, have to interconnect with other wireless carriers to provide seamless roaming opportunities for users. So it’s possible for the FCC to recognize that, in the case of wireless carriers, a single venture using the same conduit can configure both regulated telecommunications services and generally unregulated information services.

The FCC has to confront the messy reality that when ventures offer convergent services that combine conduit and content and when these ventures vertically and horizontally integrate throughout many market segments, the Commission cannot rely on absolute either/or service dichotomies to classify everything a venture provides. Even to this day the Commission cannot bring itself to confront this reality as evidenced by its utter silence on what regulatory regime should apply to Voice over the Internet Protocol and Internet Protocol Television.

It’s time to recognize that layered and convergent services defy compartmentalization into convenient, single regulatory classifications and regimes.

Determining Causality in Telecommunications

With the FCC and most government actors obsessed with incentive creation, it makes sense to determine whether and how a regulatory or deregulatory action causes some desired outcome. Consider the creation of incentives to invest in physical plant. Incumbent carriers have spent a lot of time, money and effort arguing that regulation creates investment disincentives and deregulation does the desired opposite. This simplistic and not always correct premise constitutes the prevailing wisdom in the U.S.

Using this mindset, sponsored researchers have argued that investment in next generation network plant skyrocketed soon after the FCC abandoned local loop unbundling and other “sharing” requirements. Let’s probe this assertion. First, recall that local loop unbundling was not something incumbent Local Exchange Carriers (“ILECs”) gave away or shared. Resellers and repackagers of local switching and routing plant paid the incumbents, albeit at a rate below what the ILECs would like to have been paid. Second, I have found—deep, deep, deep in the FCC’s obscure statistics and data collection process—that compulsory rentals from incumbents to newcomers peaked at 12% of lines, a level never close to forcing incumbents to invest in plant that they would have to make available solely to competitors. See Trends in Telephone Service (Aug. 2008), at p. 8-8; available at http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-284932A1.pdf.

The FCC stopped preparing this helpful source of information, but the percentage of resold ILEC lines has since declined below the 8% reported in 2007, in light of the fact that wholesale rates charged to Competitive Local Exchange Carriers (“CLECs”) can exceed retail rates to end users, a price squeeze about which neither the FCC nor the Supreme Court, in the Linkline case, expressed concern.
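
To make the price squeeze concrete, here is a toy calculation with invented rates; none of the dollar figures below come from FCC or carrier filings. It simply shows that when the wholesale rate approaches or exceeds the incumbent’s own retail price, a reseller cannot profitably match that price.

    # Toy illustration of a wholesale-retail price squeeze (hypothetical rates).
    ilec_retail = 25.00        # assumed ILEC retail price per line per month
    wholesale_to_clec = 27.50  # assumed wholesale rate charged to the CLEC
    clec_other_costs = 4.00    # assumed CLEC billing, marketing, and support costs

    margin = ilec_retail - wholesale_to_clec - clec_other_costs
    print(f"CLEC margin when matching the ILEC retail price: ${margin:.2f}")
    # -$6.50 per line per month: the reseller loses money on every line it wins.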

Let’s assume that ILECs actually did increase their aggregate plant investment after the FCC abandoned local loop unbundling, bearing in mind that the Commission never required leasing of next generation plant such as dark or even lit fiber. Did deregulation cause all of the new investment? Of course not. Might the business cycle have had something to do with it? Might the cost of capital have had something to do with it? Might competitive necessity have had something to do with it? Oh and might declining market share and revenues in core business lines such as Plain Old Telephone Service have had something to do with it?

Whatever disincentive local loop unbundling imposed paled in comparison to incumbents’ need to find new revenues. Giving the ILECs due credit, they have invested in next generation networks, mostly wireless and video plant to which no unbundling requirement ever applied. As to their newfound zeal in investing in Digital Subscriber Line services, might the ILECs simply want to make a relatively small additional investment in already amortized copper plant to secure some of the broadband growth market?

In a nutshell: do not buy the assertion that carriers make go/no-go investment decisions based solely on the state of regulatory oversight. Carriers make sound business decisions, affected more by business conditions than by the relatively minor impact of any FCC regulatory or deregulatory decision.