Over the years, activists have repeatedly warned that the actions of government agencies or Congress will “break the internet.” This past year, we’ve mostly seen how huge surges of traffic have overwhelmed poorly designed or ancient government websites, from Healthcare.gov to the Federal Communications Commission’s (FCC’s) Electronic Comment Filing System (ECFS).
This summer, the FCC’s ancient online system for filing comments went down when comedian John Oliver encouraged viewers to comment on the agency’s controversial proposed rules for regulating how broadband internet service providers handle traffic. A week later, the FCC’s system went down again on the last day of the comment period, leading the agency to extend the deadline for submitting net neutrality comments. Months later, “Internet Slowdown Day” sent hundreds of thousands of comments to the same online system, with much the same result.
This weekend, I joined Government Matters, a public affairs show produced by DC’s local ABC News affiliate and distributed on the Armed Forces Network, to talk about how moving rulemaking online has led millions of people to participate in the process. While there are reasons to be cautious about the ease of sending millions of form letters over email and about advocates’ use of templates, the number of unique comments filed to date also highlights the promise of involving more engaged citizens in governance processes using new technologies.
Today, on the last day of the reply comment period, more comments will flow to the FCC, potentially adding up to four million by the midnight deadline (the FCC has received three million comments on this docket to date). Despite the campaign and increased public awareness, however, there’s no certainty that the regulator will shift its policy based upon the prevailing sentiment, an outcome that could well drive even more cynicism in an electorate that is already deeply mistrustful of the federal government and its policies toward powerful corporate interests. The resulting dynamic seems perfectly designed for comedic satire and public frustration: the US seems to be a nation based upon 19th century laws, run by government entities using 20th century technology, and populated by citizens using 21st century tools to organize and communicate but not, just yet, shifting the balance of power between the government and the governed.
The FCC’s experience isn’t an outlier, although the choices it faces in maintaining or upgrading its IT infrastructure to deal with the influx of interest may be useful to other government entities (more on that later). It turns out that if you connect a lot of citizens to government websites and activists tell them that they can weigh in on proposed rules and decisions of federal agencies, they will. Americans are participating in political and governmental processes in unprecedented numbers, along with starting and signing e-petitions, reading and “liking” legislation, and commenting on the social media profiles of their elected representatives.
As Gautham Nagesh reported for The Wall Street Journal, the campaigns of networked activists, advocates, and concerned citizens have flooded federal agencies with record numbers of comments on new rules. The Environmental Protection Agency has received hundreds of thousands of comments on emissions rules for power plants; the State Department has received over 2.5 million comments regarding the construction of an oil pipeline from Canada; and the FCC has received at least 1.75 million comments on net neutrality. Participatory e-democracy turns out to be messy, complicated, conflicted, biased, not universally accessible, and unevenly effective, at its best. It’s also never been tried at this scale before, and that makes it more than a little interesting.
In browsing through these filings this summer, I found an intriguing suggestion from the Mozilla Foundation, the group that makes the Firefox web browser. Like others, Mozilla’s policy team suggested in its public filing that the FCC use Title II authority for net neutrality, but made a specific suggestion about how it does so: it asked the agency to define the offerings of internet service providers (ISPs) to “edge providers” like Netflix or Dropbox as a separate service, distinct from the service ISPs sell to consumers. I’m not going to dive deep into the intricacies of telecommunications law, review the past decade of legal wrangling over the FCC’s regulatory authority, or explain what net neutrality means (if you follow these links and the ones below, you’ll find resources, authors, or documents that accomplish those goals). I won’t dwell on who came up with the principle that ISPs should enable access to all content and applications equally, without favoring or blocking particular services. I won’t review the timeline of net neutrality history, the controversial proposed Open Internet rules the agency promulgated this summer, or the case for reclassifying ISPs as telecommunications providers under Title II of the Communications Act to enforce the principle of net neutrality, making them “common carriers.” I’m also not going to dive into the significance of FCC chairman Tom Wheeler suggesting that broadband rules could cover wireless networks, though it’s a significant development.
The subjects for today’s column are what the CIO of the FCC could tell me about the infrastructure that’s buckling under sustained public interest in that outcome, what releasing these comments as open data enabled us to learn, and whether networked movements trying to influence the decision of an independent federal regulator and its commissioners will, can, or even should do so.
What’s wrong with the FCC’s online comment system?
On Wednesday, September 10, 2014, the FCC’s ECFS couldn’t keep up with the volume of comments catalyzed by Fight for the Future and dozens of websites participating in Internet Slowdown Day. The official number of comments that made it into the FCC’s system, which I estimated that night based upon agency statements, was only a fraction of the total collected that day, estimated at 777,364.
“The FCC’s website was just falling to pieces, so we turned off submissions at their request,” related Fight for the Future’s Co-Founder and Co-Director Tiffiniy Cheng, via email. “They asked us to turn it back on in the evening, but it wasn’t even worth it. The site was just up and down even when we weren’t submitting comments. They have a hosting service that could not scale up, and a method of accepting submissions that needs to be re-written.”
Note: The FCC disputes this account. The agency says that they never asked the organization to turn off comments.
As a result, while hundreds of thousands of new comments were filed in the ECFS the next morning, many were not. “In response to our urgent requests for a solution for submitting our 760,000+ comments, David Bray issued these instructions for submitting comments in volume given the problem with the website,” said Cheng.
The FCC’s alternative option is basic but effective: send the public comments as .csv files attached to an email. While the approach may seem somewhat retro, it’s both a simple and smart way of electronically transferring the comments through an established channel (firstname.lastname@example.org) that enables the agency to quickly ingest them as open data. The approach could also make releasing the comments online as structured data for third-party analysis easier; if the FCC wanted to, it could publish them almost as quickly as the comments came in. Email also enables the FCC to monitor the size of files, get the email address for the sender, and identify repeat senders.
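The CSV-over-email workaround is easy to picture in code. Here is a minimal sketch of serializing bulk comments to a CSV attachment; the column names are hypothetical, since the FCC’s actual bulk-filing template may specify different fields:

```python
import csv
import io

# Hypothetical field names -- the FCC's actual bulk-filing
# template may differ; this only illustrates the approach.
FIELDS = ["docket_no", "name", "email", "comment_text"]

def comments_to_csv(comments):
    """Serialize a list of comment dicts to a CSV string
    suitable for attaching to an email."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    for c in comments:
        writer.writerow({k: c.get(k, "") for k in FIELDS})
    return buf.getvalue()

sample = [{
    "docket_no": "14-28",
    "name": "Jane Q. Public",
    "email": "jane@example.com",
    "comment_text": "Please reclassify ISPs under Title II.",
}]
print(comments_to_csv(sample))
```

Because the format is structured from the start, the agency (or a third party) can ingest the same file directly, which is what makes the “retro” channel surprisingly friendly to open data.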
While the FCC didn’t specifically refer to the vast number of comments Fight for the Future has collected, the method should solve the issue of getting them into the record by tomorrow morning. It’s necessary: if Fight for the Future’s higher projections are correct, there may have been more than four million total comments submitted on the Open Internet (14-28) docket since it opened.
While this bandaid approach may work now, it’s not a permanent solution. As I learned in researching the issues behind the ECFS this summer, despite its aspirations to be a 21st century digital agency, the FCC’s approach to collecting and sharing comments has shown that the lawyers controlling the rulemaking process are still working within a 20th century paradigm.
“Mostly their whole workflow is oriented around paper processes, so everything gets turned into PDFs even if it was an electronic submission originally, which is a huge pain, so fixing that would go a long way towards making things easier,” said Sunlight Foundation web developer Andrew Pendleton, in an email. “Less technically, I think the submission process is pretty intimidating, which is probably part of what encourages some organizations to aggregate submissions and send them in bulk rather than encouraging constituents to submit directly. The email@example.com email submission option was aimed to address this, but came late in this process and was poorly integrated into their infrastructure.”
That IT infrastructure is groaning today, and has not proven amenable to scaling easily. From the public perspective, this looks like a simple problem: enable people to publish a document online and submit it for public comment to email or an online docket in a way that’s accessible to everyone. That’s the fundamental need. What does the FCC need to execute against that, so it can take public comments online? Why can’t an agency that requested a $360 million annual budget (give or take) pull this off?
“The story there is actually quite simple,” said FCC CIO David Bray, in an interview. “With sequestration, we had to make budget cuts. And we can’t cut our workforce, and so the first thing that got cut was IT. The philosophy was, if it’s not broken, don’t fix it. Well, at a certain point, when your roof gets to be 20 years old, you don’t wait for it to leak. You replace it. So, that’s more of a look to Congress. The FCC does bring in money through licensing and auctions. Raising our budget is a little easier, because I think we’re CBO-neutral, as opposed to other agencies, if that money that we brought in through licenses could then be used to make it easier for the public. That’s really the story: Can we lift our eyes off the immediate, bandaid solutions, or look to the horizon? We need to invest now so that we can be ready for the future, and not always have systems that are bandaids on top of bandaids — or I guess the chairman’s philosophy of baling wire and glue, stuck together. What I’m trying to do is also encourage a culture change that does look beyond the immediate, but again, sequestration enforced the message of just get by, and if it’s not broken, don’t fix it — but of course then when it does break, you’re always fighting fires.”
The limitations of the ECFS are now painfully apparent to everyone.
“The code was written 17 years ago, and so it literally is a 32-bit application,” said Bray. “As you know, we’ve moved beyond that. As much as we want to adopt cloud technology — and in fact, we are, for some of our more recent applications — you just can’t lift and shift a 17 year old application straight to the cloud. We have to think about porting the data and then we have to think about does the code actually allow itself, the way it was written, to work well in the cloud. Quite frankly, it was designed 17 years ago, which was client/server, and that was about it.”
When FCC chairman Tom Wheeler requested funds to modernize the FCC’s IT systems earlier this summer, he was frank about both the constraints and the national embarrassment of a federal communications regulator being unable to communicate bidirectionally using the same systems it oversees, funds, and promotes. He wrote:
The FCC has been forced by budget restrictions to operate with an IT infrastructure that would be unacceptable to any well-managed business. Efforts to upgrade this IT capability were a casualty of sequestration. Most recently, the agency requested of Congress approximately $13 million for IT upgrades in the FY 2015 appropriation. I appreciate that the Senate subcommittee has provided the Commission with full funding in its FY 2015 spending bill, so that we can make these important upgrades. Unfortunately, the appropriations bill passed by the House today would fund the FCC at $17 million below current levels and $53 million below our overall budget request, dramatically undermining any effort to modernize our IT systems.
The ability to improve the FCC’s internal procedures — an important priority for Congress — will be hurt without 21st Century IT infrastructure.
The ability of the public to communicate with their government has — as we have seen — already been hurt by the inability of the FCC to receive all of their comments without complication.
The ability of those companies the FCC regulates to express their views is similarly hurt by an infrastructure none of them would tolerate in their own companies, even though their fees pay for the FCC budget without touching tax dollars.
It is particularly distasteful that the FCC — the agency entrusted with promoting a world-class broadband infrastructure for the nation — could ever be incapable of dealing with Americans expressing themselves via that broadband capability.
What’s going on here? One possibility could be “regulatory capture,” where an agency created to act in the public interest instead prioritizes the business or policy interests of the special interests that control a given sector. This is a danger that technology journalists, civil liberties advocates, and activists have warned about at the FCC for years, exacerbated by a revolving door in Washington between telecom companies and the agency that regulates them. Investing in and building up the FCC’s technical capacity to receive and analyze public comments, the thinking goes, could shift the dynamic of power between the public and the industry. That’s one potential explanation for why the FCC and Congress, which provides its budget, didn’t take steps to improve ECFS long ago.
A more banal explanation is that once the ECFS software was built, it was “good enough” for commissioners and Congress to let stand until quite recently, when the number of people connected to the internet started placing unprecedented demand on it.
Bray repeatedly emphasized how proud he was that his team was able to get a system built in the 1990s to scale up from 15,000 comments in a day to over 200,000 per day this fall. Unlike this fall, the issues from this summer weren’t a matter of public interest but rather the result of a heavy volume of searches. As Sam Gustin reported for Vice, unknown actors locked up the ECFS through a technique similar to a distributed denial of service attack.
“Without too much detail there, the issue was with search, and scaling that search,” said Bray. “If you send in a heavy volume of searches, it’ll actually delay the application as it was originally designed to slow down everyone else from being able to submit a comment. The thing that the work team did was figure out a way that people could still submit comments and at the same time search can proceed.”
The FCC CIO estimates that if they get the funds, they could have an agile development team rewrite it in four months. We could “give it a much better user interface and have it be located in the cloud, so it can obviously scale, and obviously decouple filing the comments from the search,” said Bray.
When I asked Bray about that funding, he reflected upon the challenge the agency faces in making its case for more funding on Capitol Hill.
“Whenever we go to Congress, Congress says, ‘Well, aren’t you supposed to save money going to the cloud?’, and it’s like, well, that’s only if we’re spending money to maintain it,” he said. “Part of the challenge that the FCC has is that about 80% of its budget is spent on people. The remaining 20% is then divided amongst facilities costs, services, and then, eventually, IT. It’s not a large percentage for IT, especially stretched across 207 different systems, of which about 50% or more are more than 10 years old. We’ve tried to do the best we can, given the ‘do more, plus’ mentality that sequestration incurred, but I don’t think it’s had a lot of maintenance at all.”
When I asked Bray about whether regulations.gov or other agency commenting technology could be applied, he told me something I’d missed: that the use of the Portable Document Format, abbreviated and well-known as PDF, was required in FCC rulemakings.
“One of the joys I’ve found is that some of our practices and our specific systems that need to be used for rulemaking are actually specified in law, including that it has to be in a PDF form, for example,” said Bray, in our interview. “That’s actually part of the rulemaking process. I’ve actually sponsored people coming in to the FCC to give demos for programmatic elements. We’d be happy to have other people come in. I do think, as a good CIO, my goal is to provide a choice architecture to different bureaus and officers. Ultimately, at the end of the day, they have to provide the demand signal, and then I’d be happy to make it happen.”
As I learned, it is not just a question of funding (over to you, Congress) or inertia that’s keeping the FCC from switching over from ECFS to regulations.gov: it’s the underlying systems and culture.
“My read (though I’m not a lawyer, so caveat emptor) of the executive order encouraging agencies to use a shared comment platform exempts independent regulatory agencies, of which the FCC is one, so there’s less impetus for them to switch than there has been for other agencies,” said Pendleton. “Also, regulations.gov is just a public front-end on top of FDMS (the Federal Docket Management System), which is an internal tool for agencies to manage regulatory comment submissions, which they’d probably need to adopt in its entirety. To complicate things even more, the FCC uses their system for regulatory dockets, but also other kinds of dockets like enforcement actions, which don’t cleanly map to the FDMS model. The regulations.gov platform (though not the CFPB [Consumer Financial Protection Bureau] stuff) is run by a contractor who would have to do at least some of the work to port them over, so some sort of procurement would probably be necessary. In other words, the FCC tech folks couldn’t do it unilaterally even if they wanted to and the organizational will existed.”
If Congress approves more funding, Bray expressed confidence that they could build a better system but emphasized that the process would need to be driven by internal and external stakeholders, not IT.
“The approach would be to recognize that the data is really important with ECFS. To pretty much scrap that app wouldn’t be too expensive. A specific estimate has not been done, but it’s basically four months, and that wouldn’t be too much of a burn, in terms of the actual costs of developers. The biggest thing that we have to do with whatever solution we work with — whether it’s something that is specifically built outside of the FCC or a service built inside of the FCC — fit it to the FCC’s rulemaking process. I, as CIO, try not to put the metaphorical cart before the horse. We’d try to engage what our stakeholders want, both internal to FCC and externally, in a storyboarding process. If we were to do that, then the question would be what external services, like regulations.gov, which I agree is very innovative, does that fit what legal professionals at FCC say they need in ECFS and the FCC’s rulemaking process? If not, how can it be adjusted? How fast can it be adjusted? I want to make sure that it’s something that comes as a programmatic demand, not something that myself as CIO is coming in saying you all will use this, and then I’m told by the programmatic side that it doesn’t fit our needs, or processes, etc.”
With sufficient funding, Bray aspires to do much more than upgrade ECFS: he’d like to fundamentally update the FCC enterprise to a data-centric operational model.
“IT is probably less than 10% of the overall spend from the FCC,” he said. “Of those numbers, about 80-85% is currently spent on maintaining existing systems, the 207 we have. Oftentimes, that’s split: one person doing five or six systems, if not more. We have put forward a request for both Fiscal Year 2015 and 2016 to do an overall lift and shift for the FCC. The first stage would be, instead of replicating the 207 systems in the cloud (which would be a really bad approach), to have an enterprise data view, a common data platform across all the different systems the FCC has. One thing with ECFS, if we do use shared services from the outside, is we need to make sure it has hooks to our edocs, as well as hooks to our efforts as well within the FCC. As you know, the FCC has systems for licensing, it has systems for auctions, it has systems for public comment, it’s got systems for presenting the actual procedures and rulemaking that the FCC does. We want to make sure that the data is unified across the different systems. When we move to the cloud, we just have a single data platform, and then on top of that we can actually build instantiations using a very thin user interface with very simple modular code that we remix with open APIs. We can then have that exposed to other search services, to other public partners, and to the private sector. Really, where the FCC should be two or three years from now, if we actually do the modernization, is really to be a data broker between the public and the private sector, and other government agencies.
“I’m all for shared services. There are solutions that make sense to be custom-tailored and not redundant, and fit the needs of whatever the organization is. If we’re really embracing APIs, really we just need to have a way of doing them universally, not just receiving data but exchanging data across these different systems. If the data is the most important thing, how can we have modular reusability across each different instantiation, as opposed to trying to figure out a single solution? You may have heard about what was called the consolidated licensing system that was attempted. That tried to get everyone to agree to a single way to do licensing at the FCC. Again, my job is about 80-90% diplomacy, maybe 20-10% technology. Let’s just say if you spend a lot of time trying to get people to agree to one way to do something, you probably would have been much better off having different ways of doing processes, so then it’s unique to their needs. Behind the scenes, you have the exposed APIs and the data so that is common, and can be used and reused, regardless of the platform it’s sitting on top of.”
What can open data of public comments tell us?
On August 5, 2014, the FCC tried something new and potentially important, setting a bar for opening up a rulemaking process that other federal agencies might consider in the future: it made the Open Internet comments filed by the public more open to the public. In a blog post at FCC.gov, Gigi Sohn, special counsel for external affairs, announced that chairman Wheeler had asked the agency’s IT team to make the comments available to the public in the form of six XML files that added up to over 1.4 gigabytes of data.
“The release of the comments as open data in this machine-readable format will allow researchers, journalists and others to analyze and create visualizations of the data so that the public and the FCC can discuss and learn from the comments we’ve received,” she said. According to the FCC, the six ECFS files with comments were downloaded 2,357 times between August 8 and September 2014.
“The only thing that we had challenges with is that some of the comments were submitted as image files,” said Bray. “Where we could, we did OCR [optical character recognition] but we also, in some cases, just included links to the image, so that individuals analyzing it can make sure they provide their own, in some cases, OCR, where it’s just not possible.”
“David and the tech team have combined as many of the Open Internet comments as they could from the ECFS database and from the firstname.lastname@example.org email address into six XML files,” said FCC spokesman Bartees Cox. “These can be used to describe the comments in a way that could help people that might not have the expertise or ability to do visualizations, but help them understand what’s happening in the comments a little bit better. This was an attempt by the FCC to be more transparent in making sure that everyone has a chance to look at these files and manipulate these files and do things with them so they can serve their audiences as well.”
According to Bray, the agency chose to release the data as XML after consulting external experts because it was the best open standard that didn’t tie the agency to any single product or proprietary software.
“We can’t pick products that are winners or losers for the analysis,” he said. “XML is universal in nature; it allows structure, so you can actually have the individual comments nested in the XML. That did seem to be the preference.”
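For files that large, loading an entire XML export into memory is impractical; a streaming parser can pull the nested comments out one at a time. Here is a sketch in Python, with the element names assumed for illustration (the actual FCC export uses its own schema, so inspect the files for the real tags):

```python
import xml.etree.ElementTree as ET

def iter_comments(xml_path):
    """Stream comments out of a large XML export without
    loading the whole file into memory at once."""
    # "comment", "filer", and "text" are assumed tag names,
    # not the FCC's actual schema.
    for event, elem in ET.iterparse(xml_path, events=("end",)):
        if elem.tag == "comment":
            yield {
                "filer": elem.findtext("filer", default=""),
                "text": elem.findtext("text", default=""),
            }
            elem.clear()  # release the parsed element's memory
```

With an iterator like this, a 1.4 GB export can be processed comment by comment on a laptop, which is roughly what third-party analysts had to do with the real files.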
Over a month later, it’s fair to say the agency and the public were well-served by that release. If you browse the open docket in ECFS, you’ll find filings from the telecommunications giants that would be directly affected by regulation and oppose reclassification under Title II, alongside comments from nonprofits like Public Knowledge or New America Foundation’s Open Technology Institute that support it. (There’s also the full text of “War and Peace,” which I did not read again.)
To date, the most useful analyses of the comments that I’ve seen have come from NPR, which published a fascinating look inside of the Open Internet comments on August 12, 2014, and the Sunlight Foundation, which not only shared an analysis of what we can learn from public comments on the FCC’s net neutrality plan but published cleaned up versions of the data and the open source code their labs team used to scrape, parse, and make sense of them. Here, third parties were crucial in enabling public (and media) understanding of these comments, at least at a level beyond counting the frequency of different keywords in the data.
Here’s a quick rundown of the highlights, quoted from Sunlight’s analysis (also explore an interactive of the FCC Open Internet comments):
- Less than 1% of the 800,000 comments analyzed were clearly opposed to net neutrality.
- At least 60% of the comments submitted were form letters written by organized campaigns.
- More than 200 comments originated in law firms, submitted on behalf of the firms or their clients.
- About two-thirds of commenters opposed paid prioritization of internet traffic, or “tiering.”
- About two-thirds of comments asked the FCC to reclassify ISPs as common carriers under the 1934 Communications Act.
- More than half of commenters described internet access as an essential freedom.
- About 5% of comments had anti-regulatory messages.
As NPR technology journalist Elise Hu explained, Quid, a San Francisco-based company, was commissioned by the Knight Foundation to parse comments, tweets, and news coverage about net neutrality since January 2014. The firm was able to apply machine learning and sentiment analysis to explore where the comments originated and what arguments were shared between them.
Both the Quid and Sunlight analyses showed a majority of the public comments were based upon templates. As Hu reported, the use of templates is far from unusual, from form letters and faxes of last century to the email campaigns of today. The fact that so many comments did not derive from templates was notable, however.
The proportion of templates was “actually low compared to analyses of other rule-making — upwards of 80 percent of comments on financial regulation were templates,” she wrote. (According to the Sunlight Foundation, the two largest dockets in Docket Wrench, the nonprofit’s regulatory analysis tool, were dominated by comments produced by templates: upwards of 75% of the comments in the Department of State’s Keystone XL rulemaking and in the Internal Revenue Service docket on tax-exempt social welfare organizations and political activity were classified as form letters.)
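A crude way to see how form-letter detection works is to normalize each comment’s text and count duplicates. Sunlight and Quid used far more sophisticated similarity clustering, so treat this as an illustrative floor rather than their method; the threshold parameter here is my own invention:

```python
import re
from collections import Counter

def normalize(text):
    """Collapse whitespace and case so near-identical
    form letters map to the same key."""
    return re.sub(r"\s+", " ", text).strip().lower()

def template_share(comments, min_cluster=3):
    """Rough form-letter estimate: the fraction of comments
    whose normalized text appears at least `min_cluster`
    times. Exact-duplicate counting misses lightly edited
    templates, so real analyses cluster on similarity."""
    if not comments:
        return 0.0
    counts = Counter(normalize(c) for c in comments)
    templated = sum(n for n in counts.values() if n >= min_cluster)
    return templated / len(comments)
```

Running a function like this over the cleaned data is enough to confirm the broad finding that a majority of comments derived from campaign templates, even before doing fuzzier matching.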
Hu also highlighted something else in the Quid analysis that was perhaps even more interesting and unexpected: “how the Internet affirms American principles”:
“One cluster focused on preserving net neutrality to maintain a diversity of opinion. Commenters argued that biasing faster traffic to the content providers that can pay for it removes a set of voices that should have a fair shake in sharing content. ‘It’s the idea that America is America because you can connect to different opinions,’ Quid’s Sean Gourley says.
“The related but separate cluster of arguments had to do with the American dream. Commenters believe America should be a meritocracy, and that everyone should be able to compete equally with everyone else. Not preserving net neutrality, commenters argue, tilts the playing field away from everyone and toward firms in special positions of power.”
Both Quid and Sunlight Labs had to apply non-trivial processing and time to derive these insights, given some “dirtiness to the data.” The latter estimated the total time spent playing “data janitor” at four to five full days of developer work.
“The 1:1 XML to JSON part was pretty fast (maybe a man-day),” estimated Pendleton. “The harder part was that there were a bunch of individual submissions that were a whole bunch of comments glommed together, and splitting those out into individual comments was much more time-consuming. All of the email submissions to the email@example.com emails were aggregated into groups of 10,000, then printed to PDF and submitted as PDFs, and then converted back into text again for the bulk release (so there were page numbers interspersed within the text). A couple of submitters (like CREDO Action) also did bulk submissions where they sent in 100k comments as one packet, and one submission purportedly from Bernie Sanders’ office submitted 12 thousand as a spreadsheet, which was printed to PDF. So, including all of that the cleaning was probably four or five man-days.”
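The “1:1 XML to JSON” step Pendleton describes can be sketched as a recursive conversion. This is an illustration of the general technique, not Sunlight’s actual code: leaf elements become strings and repeated child tags become lists.

```python
import json
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Naive 1:1 XML-to-JSON mapping: leaf elements become
    strings; repeated child tags become lists. Attributes
    are ignored for simplicity."""
    children = list(elem)
    if not children:
        return elem.text or ""
    out = {}
    for child in children:
        value = element_to_dict(child)
        if child.tag in out:
            if not isinstance(out[child.tag], list):
                out[child.tag] = [out[child.tag]]
            out[child.tag].append(value)
        else:
            out[child.tag] = value
    return out

xml = "<filing><filer>Jane</filer><text>Comment one</text></filing>"
print(json.dumps(element_to_dict(ET.fromstring(xml))))
```

As Pendleton notes, this mechanical part was quick; the expensive work was un-glomming bulk submissions that had been printed to PDF and converted back to text.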
A lot of that friction was created by a technical decision that only a regulatory lawyer could love.
As Bray noted earlier, the PDF requirement isn’t a preference: the formats and systems used in FCC rulemakings are specified in law.
Process and technology aside, the bigger question of how, where, when, or even if public comments should affect a rulemaking process remains open. On the one hand, while some polls show that most Americans don’t want internet “fast lanes,” the regulatory outcome will be based upon evidence, not popular sentiment, referendum, or the number of signatures on petitions that advocates can collect. Despite the context of a democratic society, rulemaking is based upon what independent regulators find the law allows, not what people demand or how loudly they do so.
As federal judge Ken Starr wrote in a 1987 decision, “Natural Resources Defense Council, Inc. v. United States Environmental Protection Agency” for the U.S. Court of Appeals for the District of Columbia, “the substantial-evidence standard has never been taken to mean that an agency rule-making is a democratic process by which the majority of commenters prevail by sheer weight of numbers. Regardless of majority sentiment within the community of commenters, the issue is whether the rules are supported by substantial evidence in the record. The number and length of comments, without more, is not germane to a court’s substantial-evidence inquiry.”
Sunlight Labs put it another way, stripping out the legalese: “rulemakings are about arguments, not volume. Not a popular vote at all.” In that context, whether the FCC receives 10,000 or one million or 10 million comments is less important than the quality of the reasoning contained in them, including the legal and technical viability of the solutions proposed. While comments are “still great measures of popular sentiment and its potential disconnect from what policymakers are doing,” as Sunlight Labs noted, “this isn’t direct democracy and it won’t be unusual if popular sentiment doesn’t map to outcomes.”
It’s worth making a final, crucial point based upon Sunlight’s analysis: their “back-of-the-envelope” estimate of the number of expert submissions made to the FCC in the first round of comments was 600, or just 0.08% of the 800,959 comments analyzed. It’s here that influence on the rules is most likely to be found, as well as comments that explore the legal substance of the matter. On that count, the FCC will host a public Open Internet roundtable discussion tomorrow in Washington, DC that will feature views on all sides of the issue.
In August 2014, a prominent citizen who lives a few blocks north of the FCC’s headquarters made a notable public comment on where he stands on net neutrality, reiterating a position he took in 2007: “in the United States, one of the issues around net neutrality is whether you are creating different rates or charges for different content providers,” said President Barack Obama, in answer to a question at the U.S.-Africa Business Forum. “That’s the big controversy here. You have big, wealthy media companies who might be willing to pay more but then also charge more for more spectrum, more bandwidth on the Internet so they can stream movies faster or what have you. And I personally — the position of my administration, as well as I think a lot of companies here — is you don’t want to start getting a differentiation in how accessible the Internet is to various users. You want to leave it open so that the next Google or the next Facebook can succeed.”
Disclaimer: TechRepublic and CNET are CBS Interactive properties.