Google sells out

Google and Verizon came out with their joint statement on Net Neutrality on Monday. It is reasonable and idealistic in its general sentiments, but contains several of the loopholes Marvin Ammori warned us about. It was released in three parts: a document posted to Google Docs, a commentary posted to the Google Public Policy Blog, and an op-ed in the Washington Post. Eight paragraphs in the statement document map to seven numbered points in the blog. The first three numbered points map to the six principles of net neutrality enumerated by Julius Genachowski [jg1-6] almost a year ago. Here are the Google/Verizon points as numbered in the blog:

1. Open access to Content [jg1], Applications [jg2] and Services [jg3]; choice of devices [jg4].
2. Non-discrimination [jg5].
3. Transparency of network management practices [jg6].
4. FCC enforcement power.
5. Differentiated services.
6. Exclusion of Wireless Access from these principles (for now).
7. Universal Service Fund to include broadband access.

The non-discrimination paragraph is weakened by the kinds of words that are invitations to expensive litigation unless they are precisely defined in legislation. It doesn’t prohibit discrimination, it merely prohibits “undue” discrimination that would cause “meaningful harm.”

The managed (or differentiated) services paragraph is an example of what Ammori calls “an obvious potential end-run around the net neutrality rule.” I think that Google and Verizon would argue that their transparency provisions mean that ISPs can deliver things like FIOS video-on-demand over the same pipe as Internet service without breaching net neutrality, since the Internet service will commit to a measurable level of service. This is not how things work at the moment; ISPs make representations about the maximum delivered bandwidth, but for consumers they don’t specify a minimum below which the connection will not fall.

The examples the Google blog gives of “differentiated online services, in addition to the Internet access and video services (such as Verizon’s FIOS TV)” appear to have in common the need for high bandwidth and high QoS. This bodes extremely ill for the Internet. The evolution to date of Internet access service has been steadily increasing bandwidth and QoS. The implication of this paragraph is that these improvements will be skimmed off into proprietary services, leaving the bandwidth and QoS of the public Internet stagnant.

Many consider the exclusion of wireless egregious. I think that Google and Verizon would argue that there is nothing to stop wireless being added later. In any case, I am sympathetic to Verizon on this issue, since wireless is so bandwidth constrained relative to wireline that it seems necessary to ration it in some way.

The Network Management paragraph in the statement document permits “reasonable” network management practices. Fortunately the word “reasonable” is defined in detail in the statement document. Unfortunately the definition, while long, includes a clause which renders the rest of the definition redundant: “or otherwise to manage the daily operation of its network.” This clause appears to permit whatever the ISP wants.

So on balance, while it contains a lot of worthy sentiments, I am obliged to view this framework as a sellout by Google. I am not alone in this assessment.

Net Neutrality heating up

I got an email from Credo this morning asking me to call Julius Genachowski to ask him to stand firm on net neutrality.

The nice man who answered told me that the best way to make my voice heard on this issue is to file a comment at the FCC website, referencing proceeding number 09-191.

So that my comment would be a little less ignorant, I carefully read an article on the Huffington Post by Marvin Ammori before filing it.

My opinion on this is that ISPs deserve to be fairly compensated for their service, but that they should not be permitted to double-charge for a consumer’s Internet access. If some service like video on demand requires prioritization or some other differential treatment, the ISP should only be allowed to charge the consumer for this, not the content provider. In other words, every bit traversing the subscriber’s access link should be treated equally by the ISP unless the consumer requests otherwise, and the ISP should not be permitted to take payments from third parties like content providers to preempt other traffic. If such discrimination is allowed, the ISP will be motivated to keep last-mile bandwidth scarce.

Internet access in the US is effectively a duopoly (cable or DSL) in each neighborhood. This absence of competition has caused the US to become a global laggard in consumer Internet bandwidth. With weak competition and ineffective regulation, a rational ISP will forego the expense of network upgrades.

ISPs like AT&T view the Internet as a collection of pipes connecting content providers to content consumers. This is the thinking behind Ed Whitacre’s famous comment, “to expect to use these pipes for free is nuts!” Ed was thinking that Google, Yahoo, and Vonage are using his pipes to his subscribers for free. The “Internet community” on the other hand views the Internet as a collection of pipes connecting people to people. From this other point of view, the consumer pays AT&T for access to the Internet, and Google, Yahoo and Vonage each pay their respective ISPs for access to the Internet. Nobody is getting anything for free. It makes no more sense for Google to pay AT&T for a subscriber’s Internet access than it would for an AT&T subscriber to pay Google’s connectivity providers for Google’s Internet access.

Network Neutrality – FCC issues NPRM

I wrote earlier about FCC chairman Julius Genachowski’s plans for regulations aimed at network neutrality. The FCC today came through with a Notice of Proposed Rule Making. Here are the relevant documents from the FCC website:

Summary Presentation: Acrobat
NPRM: Word | Acrobat
News Release: Word | Acrobat
Genachowski Statement: Word | Acrobat
Copps Statement: Word | Acrobat
McDowell Statement: Word | Acrobat
Clyburn Statement: Word | Acrobat
Baker Statement: Word | Acrobat

The NPRM itself is a hefty document, 107 pages long; if you just want the bottom line, the Summary Presentation is short and a little more readable than the press release. The comment period closes in mid-January, and the FCC will respond to the comments in March. I hesitate to guess when the rules will actually be released – this is hugely controversial: 40,000 comments filed to date. Here is a link to a pro-neutrality advocate. Here is a link to a pro-competition advocate. I believe that the FCC is doing a necessary thing here, and that the proposals properly address the legitimate concerns of the ISPs.

Here is the story from Reuters, and from AP.

Femtocell pricing chutzpah

It’s like buying an airplane ticket then getting charged extra to get on the plane.

The cellular companies want you to buy cellular service then pay extra to get signal coverage. Gizmodo has a coolly reasoned analysis.

AT&T Wireless is doing the standard telco thing here, conflating pricing for different services. It is sweetening the monthly charge option for femtocells by offering unlimited calling. A more honest pricing scheme would be to provide femtocells free to anybody who has a coverage problem, and to offer the femtocell/unlimited calling option as a separate product. Come to think of it, this is probably how AT&T really plans for it to work: if a customer calls to cancel service because of poor coverage, I expect AT&T will offer a free femtocell as a retention incentive.

It is ironic that this issue is coming up at the same time as the wireless carriers are up in arms about the FCC’s new network neutrality initiative. Now that smartphones all have Wi-Fi, if the handsets were truly open we could use our home Wi-Fi signal to get data and voice services from alternative providers when we were at home. No need for femtocells. (T-Mobile@Home is a closed-network version of this.)

Presumably something like this is on the roadmap for Google Voice, which is one of the scenarios that causes the MNOs to fight network neutrality tooth and nail.

FCC to issue Net Neutrality rules

In a speech to the Brookings Institution today, FCC Chairman Julius Genachowski announced that the FCC is initiating a public process to formulate net neutrality rules for broadband network operators based on six principles:

  1. Open access to Content
  2. Open access to Applications
  3. Open access to Services
  4. Freedom for users to attach devices to the network
  5. Non-discrimination for content and applications
  6. Transparency of network management practices

The first four of these principles were initially articulated by former FCC Chairman Michael Powell in 2004 as the “Four Freedoms.” Numbers 5 and 6 are new. The forthcoming rules will apply these six principles to all broadband access technologies, including wireless.

Genachowski made the case that Internet openness is essential and that it is threatened. He acknowledged that network providers need to manage their networks, and said that they can control spam and help to maintain intellectual property integrity without compromising these principles.

The threats to Internet openness come from reduced competition among ISPs and conflicts of interest within the ISPs, because they are also trying to be content providers.

Genachowski rightly sees these threats as serious:

This is not about protecting the internet against imaginary dangers. We’re seeing the breaks and cracks emerge, and they threaten to change the Internet’s fundamental architecture of openness. This would shrink opportunities for innovators, content creators and small businesses around the country, and limit the full and free expression the internet promises. This is about preserving and maintaining something profoundly successful and ensuring that it’s not distorted or undermined.

These rules will be very tough to enforce. The fundamental structure of the business works against them. A more effective approach may be to break up the ISPs into multiple independent companies, for example: Internet access operations, wide area network operations, and service/content/application operations. The neutrality problem is in the access networks – the WANs and the services are healthier. With only the telcos (DSL and fiber) and the MSOs (cable) there is not enough competition for a free market to develop. This is why Intel pushed so hard for WiMAX as a third mode of broadband access, though it hasn’t panned out that way. It is also why municipal dark fiber makes sense, following the model of roads, water and sewers.

Sharing Wi-Fi 2 – Atheros turns a cellphone into an access point

There are several smartphone applications that allow a cell phone to act as a wireless WAN router and Wi-Fi access point, creating a wireless LAN with Internet access. For the (jailbroken) iPhone there’s PDAnet, for Windows Mobile there’s WM Wi-Fi Router and for Symbian there’s Walking HotSpot and JoikuSpot. Now Atheros has proposed to bake this functionality into their low power Wi-Fi chipset.

An idea that is, as the patent jargon goes, “obvious to one skilled in the art” can sometimes have handicaps that are equally obvious to one experienced in the industry. While exposing a broadband wireless data connection through a smartphone’s Wi-Fi radio is massively useful to consumers, it is unlikely to appeal to network service providers, who would prefer you to buy a wireless data card (and an additional service subscription) for your laptop rather than to simply use the wireless data connection that you are already paying for on your phone.

It will be interesting to see where this goes. I will be stunned if Atheros’ implementation appears on any phone subsidized by (or even distributed by) a wireless carrier, until they can figure out a way to charge extra for it. As Tim Wu says in his Wireless Carterfone paper (the Wireless Carterfone concept was promoted by Skype, and rejected by the FCC last April):

carriers are in a position to exercise strong control over the design of mobile equipment. They have used that power to force equipment developers to omit or cripple many consumer-friendly features.

The billing issue may not be that intractable. Closely related models already exist. You can get routers from Cisco and other vendors that have a slot for a wireless WAN card, and the service providers have subscription plans for them. More similarly, this could be viewed as a kind of “tethering.” But tethering only lets one PC at a time access the wireless WAN connection – unless that PC happens to support My Wi-Fi.

Update: Marvell has announced a similar capability for its 88W8688 chip.

Transparency and neutrality

Google and the New America Foundation have been working together for some time on White Spaces. Now they have (with PlanetLab and some academic researchers) come up with an initiative to inject some hard facts into the network neutrality debate.

The idea is that if users can easily measure their network bandwidth and quality of service, they will be able to hold their ISPs to the claims in their advertisements and “plans.” As things stand, businesses buying data links from network providers normally have a Service Level Agreement (SLA) which specifies minimum performance characteristics for their connections. For consumers, things are different. ISPs do not issue SLAs to their consumer customers. When they advertise uplink and downlink speeds, these speeds are “typical” or “maximum,” but they don’t specify a minimum speed, and they don’t offer any guarantees of latency, jitter, packet loss or even integrity of the packet contents. For example, here’s an excerpt from the Verizon Online Terms of Service:

VERIZON DOES NOT WARRANT THAT THE SERVICE OR EQUIPMENT PROVIDED BY VERIZON WILL PERFORM AT A PARTICULAR SPEED, BANDWIDTH OR DATA THROUGHPUT RATE, OR WILL BE UNINTERRUPTED, ERROR-FREE, SECURE…

Businesses pay more than consumers for their bandwidth, and providing SLAs is one of the reasons. Consumers would probably not be willing to pay more for SLAs, but they can still legitimately expect to know what they are paying for. The Measurement Lab data will be able to confirm or disprove accusations that ISPs are intentionally impairing traffic of some types.

This is a complicated issue, because one man’s traffic blocking is another man’s network management, and what a consumer might consider acceptable use (like BitTorrent) may violate an ISP’s Acceptable Use Policy (Verizon: “…it is a violation of… this AUP to… generate excessive amounts of email or other Internet traffic;”). The arguments can go round in circles until terms like “excessive” and “unlimited” are defined numerically and measurements are made. So Measurement Lab is a great step forward in the Network Neutrality debate, and should be applauded by consumers and service providers alike.
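To make the point concrete, here is a toy sketch (my own illustration; the function name and sample data are hypothetical, not Measurement Lab’s actual API) of the kind of numeric summary such measurement tools can produce, turning adjectives like “excessive” into arguable numbers: mean latency, jitter, and packet loss from a series of round-trip-time probes.

```python
# Toy illustration of the connection metrics a Measurement Lab-style tool
# could report. Names and sample data are hypothetical, for illustration only.

def summarize_probes(rtts_ms):
    """Summarize a list of RTT probes in milliseconds; None marks a lost packet."""
    received = [r for r in rtts_ms if r is not None]
    loss_pct = 100.0 * (len(rtts_ms) - len(received)) / len(rtts_ms)
    mean_rtt = sum(received) / len(received)
    # Jitter here is the mean absolute difference between consecutive probes
    diffs = [abs(b - a) for a, b in zip(received, received[1:])]
    jitter = sum(diffs) / len(diffs)
    return {"mean_rtt_ms": round(mean_rtt, 1),
            "jitter_ms": round(jitter, 1),
            "loss_pct": round(loss_pct, 1)}

samples = [30.0, 32.0, None, 31.0, 90.0, 33.0]  # one lost probe, one latency spike
print(summarize_probes(samples))
# → {'mean_rtt_ms': 43.2, 'jitter_ms': 29.8, 'loss_pct': 16.7}
```

With numbers like these in hand, a consumer could compare measured performance against the advertised “up to” speeds, which is exactly the accountability the initiative is after.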

White Spaces Videos

I found this “grass roots” video on Google’s Public Policy Blog. That blog also has some interesting posts on related issues by Richard Whitt and Vint Cerf.

Looking at this provoked me to go to YouTube and search for other White Spaces related videos. I was interested to find a coordinated (by Google) effort by the proponents of White Spaces, and on the other side basically nothing – just this incredibly lame video that takes 7 minutes to tell us that microphones are used in sports broadcasting (don’t waste your time watching more than a few seconds – it’s the same all the way through).

It’s odd that the main opponents of White Spaces (NAB and MSTV) haven’t put rebuttal videos on YouTube yet, and even odder that they haven’t found a need to present any more thoughtful analyses of the issue, equivalent (but presumably opposite) to those of Chris Sacca or Tim Wu. Instead, I have the impression that their strategy rests on the two prongs of public fear-mongering and bare-knuckled political lobbying.

White Space update

The forthcoming transition to digital TV transmissions will free up about half the spectrum currently allocated to TV broadcasters. This freed-up spectrum was the subject of the FCC’s just-concluded 700MHz Auction, which yielded about $20 billion in license fees to the government. The fate of the other half of the TV spectrum, the part that will remain assigned to TV broadcasts after the digital transition, remains in contention.

This spectrum will be shared by licensed TV broadcast channels and wireless microphones, but even so, much of it will remain mostly unused. These chunks of spectrum left idle by their licensees are called “White Spaces.” The advent of “spectrum sensing” radio technology means that it is now theoretically possible for transmitters to identify and use White Spaces without interfering with the licensed use of the spectrum.

The FCC has issued a Notice of Proposed Rulemaking and a First Report and Order to explore whether this is a good idea, and if so, how to handle it.

The potential users of the White Spaces have formed roughly two camps, those who see it best suited for fixed broadband access (similar to the first version of WiMAX), and those who see it as also suited for “personal/portable” applications (similar to Wi-Fi).

Google, along with Microsoft and some other computer industry companies, advocates the personal/portable use. The FCC’s Office of Engineering and Technology (OET) is currently lab-testing some devices from Microsoft and others to see if their spectrum-sensing capabilities are adequate to address the concerns of the broadcast industry, which fears that personal/portable use will cause interference.

Google filed an ex-parte letter with the FCC on March 24th, weighing in on the White Spaces issue. The letter is well worth reading. It concedes that in the introductory phases it makes sense to supplement spectrum sensing with other technologies, like geo-location databases and beacons. The letter asserts that these additional measures render moot the current squabble over a malfunction in the devices in the first round of FCC testing, and that the real-world data gathered in this introductory phase would give the FCC confidence ultimately to repeal the supplemental measures, and perhaps to extend open spectrum-sensing uses to the entire radio spectrum, leading to a nirvana of effectively unlimited bandwidth for everybody.

The kicker is in the penultimate paragraph, where Google recycles an earlier proposal it made for the 700MHz spectrum auction, suggesting a real-time ongoing “dynamic auction” of bandwidth. Google now suggests applying this dynamic auction idea to the white spaces:

For each available spectrum band, the licensee could bestow the right to transmit an amount of power for a unit of time, with the total amount of power in any location being limited to a specified cap. This cap would be enforced by measurements made by the communications devices. For channel capacity efficiency reasons, bands should be allocated in as large chunks as possible. The airwaves auction would be managed via the Internet by a central clearinghouse.

Current expectations are for spectrum-sensing use of the white spaces to be unlicensed (free, like Wi-Fi). Now Google appears to be proposing “sub-licensed” rather than unlicensed spectrum use. The word “auction” implies that this could be a revenue producer for TV broadcast licensees, who received their licenses free from the government. This is a very different situation from the original context of the dynamic auction proposal, which applied to spectrum for which licensees paid $20 billion. Depending how it is implemented, it could fulfill the telcos’ dream of directly charging content providers for bandwidth on a consumer’s Internet access link, a dream that Google has opposed in the network neutrality wars. Google may ultimately regret opening the door to this one, even though it presumably sees itself cashing in as the ideal candidate to operate the “central clearinghouse.”
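The mechanics Google describes can be sketched in a few lines (my own illustration under assumed rules, not anything from the filing): a clearinghouse accepts bids for transmit power in a band and grants them in descending price order until the location’s total power cap is exhausted.

```python
# Hypothetical sketch of the "dynamic auction" clearinghouse idea: bids for
# transmit power are granted highest-price-first until a per-location power
# cap is reached. Bidder names, units, and numbers are purely illustrative.

def run_auction(bids, power_cap_mw):
    """bids: list of (bidder, power_mw, price_per_mw). Returns granted bids."""
    grants = []
    remaining = power_cap_mw
    # The highest price per milliwatt wins first; partial grants are allowed
    for bidder, power, price in sorted(bids, key=lambda b: -b[2]):
        granted = min(power, remaining)
        if granted > 0:
            grants.append((bidder, granted, price))
            remaining -= granted
    return grants

bids = [("carrier_a", 400, 0.05), ("startup_b", 300, 0.09), ("carrier_c", 500, 0.02)]
print(run_auction(bids, power_cap_mw=600))
# → [('startup_b', 300, 0.09), ('carrier_a', 300, 0.05)]
```

Even this toy version shows why the proposal matters: whoever sets the cap and runs the clearinghouse controls who transmits, which is exactly the leverage the net neutrality debate is about.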

Update April 10th: Interesting related posts from Michael Marcus and Sascha Meinrath.