Reforming the Federal Communications Commission

By TAP Staff

Posted on May 24, 2010



By Kaleb A. Sieh, Rapporteur

On March 5, 2010, Silicon Flatirons, Public Knowledge, and the Information Technology & Innovation Foundation (ITIF) hosted a conference in Washington, D.C. on reforming the FCC. Two prior events on this subject explored new models of self-governance and standard setting. This conference asked where we stand on FCC reform and how new models of self-governance and standard setting fit into the reform efforts. Dale Hatfield, Executive Director of Silicon Flatirons, Adjunct Professor of Telecommunications at the University of Colorado, and former Chief Engineer of the FCC, delivered the opening remarks.

There is overall consensus that the FCC has serious procedural, organizational, and cultural problems, said Hatfield. To combat these problems, he said, open, transparent, predictable, objective, and pragmatic policy-making processes are critical. Hatfield also felt these reforms are especially important to people’s faith in government, because biased or less-than-objective regulation can destroy the sense of fairness in governance. Finally, according to Hatfield, it is important to explore new models of governance in order to maintain the balance between too much regulation and too little.


Panel I—The Present and Future of FCC Reform

There is a consensus that the FCC is broken. It has serious procedural, organizational, and cultural problems that must be addressed by the agency's incoming leaders. Reforms aimed at addressing these concerns are key to the credibility of any actions the agency might take. The new FCC Chairman has made reform a priority—but will it be enough or should Congress step in and make some changes?

Open, Transparent, and Data-Driven Reform Proposals by the FCC Chairman

The panel began by discussing the "open, transparent, and data driven" reforms proposed by Chairman Genachowski. Mary Beth Richards, Special Counsel for the FCC, highlighted how the reforms share the underlying goals of openness, transparency, and encouraging public input. She described seven broad areas of reform:

  1. Public Safety Readiness;
  2. Data Collection, Analysis, and Dissemination—the FCC collects a large amount of data;
  3. Systems—complaint filing, data access/collection, licensing, and working towards a standard public interface;
  4. Communication—both within and without the agency;
  5. Workforce and Organization—increasing the size and technical proficiency of the staff in order to truly be an expert agency;
  6. Rules and Processes;
  7. Financial—how the FCC spends the money it gets.

Austin Schlick, General Counsel for the FCC, drilled down into the proposed changes “in the works” for the FCC’s rules and processes. He said there are three changes in this area, all aimed at maximizing transparency, efficiency, and accessibility. First, the FCC will provide the public with draft rules in its Notices of Proposed Rulemaking (NPRMs)—though there may be some efficiency losses here, since building the draft rules will require additional Notices of Inquiry. Second, a procedures NPRM: the FCC has commenced a proceeding to improve its own procedures. Here there is movement toward broader use of electronic filing and documentation, because “when you file it online it is available online.” According to Schlick, a “vast” number of proceedings are undocumented because of the separate bureaus within the FCC, and the agency is moving as close to full online filing and availability as it can get.


Finally, Schlick pointed to the ex parte NPRM, in which the FCC has commenced a proceeding to revise its ex parte rules. Currently, he said, the ex parte rules do not require a notice to be filed if no “new information” is being provided; the rules are also vague as to the substance and degree of disclosure required when there actually is new information. The proposed reform, according to Schlick, would require more disclosure, state a preference for electronic filing, and extend the filing period to accommodate the increased disclosure requirement.

The panelists generally approved of the proposed reforms and felt they were moving in the “right direction,” but had some additional suggestions. Matthew Hussey, Telecommunications Legislative Assistant for Senator Olympia Snowe, said more could be done to make the FCC the expert agency it needs to be. Most of the other panelists were very concerned about ex parte contacts.

Ex Parte Communications

The FCC’s use, and some might say abuse, of ex parte communications was a serious concern for all the panelists. Nicholas Johnson, Professor of Law at the University of Iowa and former FCC Commissioner, was against the “continuation” of ex parte contacts and wondered why face-to-face contact needed to be with an individual commissioner rather than in front of the entire Commission. Mark Cooper, Senior Adjunct Fellow at Silicon Flatirons and Director of Research at the Consumer Federation of America, gave a withering critique of ex parte contacts at the FCC.

The essence of democracy, according to Cooper, is the ability of the people to “write the rules” under which they live. In order to have change, he said, the rules must be changed, and in order to change the rules you must have a proceeding. In this context, said Cooper, ex parte contacts are an “affront” to democracy, because certain individuals are inherently more able to get access than others. He felt strongly that an agency’s communications should be a matter of public record and that commissioners should not have meetings with anyone but their own staff. As an aside, Cooper pointed out that ex parte communications are usually frowned upon in other contexts; in most court cases, for example, an ex parte communication results in a mistrial.

Susan Crawford, Professor of Law at the University of Michigan, had concerns over the ex parte process as well, but pointed out how it seems to be a “central” part of the FCC’s decision-making process. In the same vein, Austin Schlick said the ex parte meetings help commissioners because the filings by themselves always leave questions unanswered. He felt the barriers to ex parte communications should be lowered in order to allow fuller access to the FCC for all interested parties, and not just those with significant budgets and a full-time Washington, D.C. presence. Additionally, Schlick felt one way to address the ex parte concerns could be a rule requiring full disclosure of the substance of the ex parte communication.

Gigi Sohn, Senior Adjunct Fellow at Silicon Flatirons, President and Co-founder of Public Knowledge, and moderator of the panel, asked about lax enforcement of the ex parte rules as they currently stand. She said a large number of ex parte filings were late, and wondered what was done about the violations. According to Schlick, a study showed the vast majority of ex parte filings, approximately 96 percent, were filed on time. The remaining filings were often only a couple of days late, and usually for good faith or innocent reasons. Schlick felt there was more of a “specificity” problem in the disclosure rule itself: currently there is a lack of clarity as to exactly what needs to be disclosed in an ex parte filing. Once there is specificity as to what should be disclosed, he said, the FCC can enforce the “clear” rule.

The FCC’s Relationship with the White House

The panelists were asked whether the FCC should be an executive agency rather than the independent agency it currently is, and whether it is normal for the FCC to be “cozy” with the President, given that he appoints the Commissioners.

Susan Crawford said there is always “pressure” when the appointment power is present, but the real pressure on the FCC comes from industry—and it is this dynamic that makes the FCC’s decisions suspect. She pointed out that the telecommunications world is a very close community where everyone knows each other, and highlighted the steep learning curve and the large number of acronyms in the industry. Crawford then asked about the revolving door when it came to hiring—what if FCC staff members could not turn around and work for the industry they had been regulating? Finally, she wondered whether the FCC’s policy role should be taken away and given to the Executive instead, which would make the FCC simply a regulator. Mark Cooper said there are two accepted ways of insulating people from influence: life appointments and term limits. He asked whether it would make sense to give FCC Commissioners fixed terms of longer duration than the President’s. On the revolving door, Austin Schlick pointed out that all FCC employees are barred from lobbying the agency on any matter they personally worked on or that the Commission handled during their employment, and that senior officials face a one-year ban.

Matthew Hussey felt the FCC’s jurisdiction was a more important issue. He pointed out that the FCC has jurisdiction over non-federal spectrum licensees while the National Telecommunications and Information Administration (NTIA) manages federal licensees. This split in jurisdiction makes overall spectrum reform more difficult to accomplish because the two groups often have competing interests in spectrum management and usage. Additionally, the FCC was created for a “relatively simple” wireline era, and today’s telecommunications world is significantly more technologically complicated. The FCC, according to Hussey, needs to be streamlined in order to keep up with the pace and innovative dynamics of the industry it has jurisdiction over.

Data and the FCC

The FCC is a data-intensive agency, and the quality and accessibility of its data are important to its decision-making processes.

Mark Cooper applauded the FCC’s efforts to “open up” the data it uses but criticized the FCC for not following the White House Office of Management and Budget (OMB) guidelines on “influential scientific information.” Cooper gave a number of suggestions: (1) there should be a formal process of peer review; (2) the FCC should post influential scientific information on peer review sites; (3) there should be a minimal amount of time between the creation and release of a report; (4) for confidential information, there should be a way to review it in some protected, offline manner; and (5) influential scientific information should generally be subjected to as much “sunshine” as possible.

Mary Beth Richards said the FCC has the opportunity to make significant change in how it uses data. She felt the data the FCC uses to make decisions should be published and the public should be allowed to respond to it—with conflicting data if that happens to be the case. Austin Schlick pointed out that many reports are “gifts” to the FCC—as long as the group providing the report is not affiliated with any telecom company—and that a report is often submitted into a proceeding’s docket by a professor or academic. Other times, he said, there are Commission-funded studies for major reviews and proceedings.


Panel II—Regulatory Reforms: Standard-Setting and Mediating Institutions

The second panel explored the purpose and guiding principles of standard setting in the broadband context, as well as new models for governance. Rob Atkinson, President of ITIF, began the panel by asking the “meta-question” of where different types of governance are most appropriate: which areas should use regulation, which co-regulation, and which self-regulation.

Pierre de Vries, Senior Adjunct Fellow at Silicon Flatirons, began by asking whether it is best to view the Internet as an “ecosystem.” Comparing the Internet to an ecosystem, he said, is like comparing an elephant to a whale: both have deep, shared characteristics—they are both large mammals—but they are also very different. One characteristic the Internet shares with an ecosystem is that both are complex adaptive systems made up of smaller autonomous or semi-autonomous parts that interact and react to each other. In managing a complex adaptive system, de Vries said, we should look for the “point of resilience”—the point at which there are still booms and busts but nothing that will threaten the entire system—and our approach should follow a few overarching principles: (1) Flexibility; (2) Delegation of Responsibility; and (3) Diversity of Solutions. Our mindset, he felt, should be that this is an eternal experiment: the “ecosystem” keeps changing, and we must keep learning. Principles are much more important than rules in this kind of system, said de Vries, and this is a good argument for a more “common law” style of governance and problem solving.

Rick Whitt, Washington Telecom and Media Counsel for Google, said “emergence economics” and other newer economic thinking should inform our policy-making to a greater extent. In terms of complex adaptive systems and viewing the Internet as an “ecosystem,” he wondered what kinds of frameworks and tools should be applied to them, since the telecommunications “ecosystem” is evolving very quickly from a technology perspective. Whitt felt regulators often overlooked tools that could be applied to complex adaptive systems—especially if those tools were outside the agency. When it comes to choosing between co-regulation, self-regulation, or regulation, he said, there is a tension between standing back and taking a hands-off approach versus having the government step in to write the rules, but “no standard whatsoever” leaves the maximum amount of uncertainty for businesses operating in that space. He argued for balancing flexibility and adaptability with responsibility and enforceability. Finally, he suggested a movement away from formality and rigidity in regulation. Some formality and rigidity is necessary, according to Whitt, to enable the process to be effective, but too much would hinder the regulator’s ability to learn.

Giving an example of a successful self-governance approach, Kathryn Brown, Senior Vice President for Public Policy & Corporate Responsibility at Verizon, pointed to website privacy policies. As the web emerged, she said, it quickly became clear that a large amount of personal information was being shared. Instead of mandating a solution through legislation, industry turned to a self-governance approach that relied on collaborative norm setting. Kathleen Wallman, CEO of Wallman Strategic Consulting, felt that protecting the public interest in any self-governance regulatory regime is very important because the standard-setting process, just one example of a self-governance approach, can be “chaotic” and “cacophonic.” Wallman felt the public’s interest in the broadband debate is twofold: (1) “things need to work”—the user must be able to turn on the product and connect with little trouble; and (2) “things need to not be expensive”—which should result from a well-functioning system that prevents holdups and allows for cyclical upgrades.

Paul de Sa, Chief of the Office of Strategic Planning & Analysis at the FCC, talked about the interaction between innovation and standards. He felt we should be careful about two things when it comes to standards: (1) the “tightness” of the standards—in that if the standards are too tight and difficult to comply with they will ultimately hinder innovation and (2) the number of standards—because if there are too many different standards it will become difficult for smaller companies to comply with them all.

Self-Regulation, Co-Regulation, or Regulation—Which Type Is Best, and Where?

Pierre de Vries asked a definitional question about self-regulation: was the panel talking about standards of behavior? As for the norms governing how different companies and players interact with each other, or in his words how they “divide up the pie,” he felt that because the industry moves so fast, it made sense to allow the companies themselves to figure that out. Under co-regulation—where the regulator has a “big stick” and steps in only if the group cannot decide among themselves—the regulator simply plays the role of a backstop.

Paul de Sa felt co-regulation was an interesting notion for a regulatory method “in between” regulation and self-regulation, but asked what the actual rules would be. If co-regulation, in its application, means a large number of meetings over a long period of time, then it inherently favors larger players with more resources and can thus disadvantage smaller, more disruptive players. According to de Sa, this may have negative implications for innovation.

Rick Whitt felt there needed to be greater clarity around the term “regulation,” since the FCC seemed to be talking about some sort of “common law” approach. He pointed out that new online tools could reduce the difficulties and costs of involvement in any self- or co-regulation process. The FCC could also have access to these same tools and give feedback or guidance to the self-regulation group as the standards evolved. Paul de Sa asked whether the regulator should be “seeding” these self-regulation groups from the beginning.


Closing

The conference came to a close with a round of thanks to the Public Knowledge staff, the Silicon Flatirons staff, and the panelists.


Video of the conference is available from the Silicon Flatirons site.


Conference summary provided by Kaleb A. Sieh, Silicon Flatirons Research Fellow and 2009 graduate of the University of Colorado Law School.

