Femtocell pricing chutzpah

It’s like buying an airplane ticket then getting charged extra to get on the plane.

The cellular companies want you to buy cellular service and then pay extra to get signal coverage. Gizmodo has a coolly reasoned analysis.

AT&T Wireless is doing the standard telco thing here, conflating pricing for different services. It is sweetening the monthly charge option for femtocells by offering unlimited calling. A more honest pricing scheme would be to provide femtocells free to anybody who has a coverage problem, and to offer the femtocell/unlimited calling option as a separate product. Come to think of it, this is probably how AT&T really plans for it to work: if a customer calls to cancel service because of poor coverage, I expect AT&T will offer a free femtocell as a retention incentive.

It is ironic that this issue is coming up at the same time as the wireless carriers are up in arms about the FCC’s new network neutrality initiative. Now that smartphones all have Wi-Fi, if the handsets were truly open we could use our home Wi-Fi signal to get data and voice services from alternative providers when we were at home. No need for femtocells. (T-Mobile@Home is a closed-network version of this.)

Presumably something like this is on the roadmap for Google Voice, which is one of the scenarios that causes the MNOs to fight network neutrality tooth and nail.

FCC to issue Net Neutrality rules

In a speech to the Brookings Institution today, FCC Chairman Julius Genachowski announced that the FCC is initiating a public process to formulate net neutrality rules for broadband network operators based on six principles:

  1. Open access to Content
  2. Open access to Applications
  3. Open access to Services
  4. Freedom for users to attach devices to the network
  5. Non-discrimination for content and applications
  6. Transparency of network management practices

The first four of these principles were initially articulated by former FCC Chairman Michael Powell in 2004 as the “Four Freedoms.” Numbers 5 and 6 are new. The forthcoming rules will apply these six principles to all broadband access technologies, including wireless.

Genachowski made the case that Internet openness is essential and that it is threatened. He acknowledged that network providers need to manage their networks, and said that they can control spam and help to maintain intellectual property integrity without compromising these principles.

The threats to Internet openness come from reduced competition among ISPs and conflicts of interest within the ISPs, because they are also trying to be content providers.

Genachowski rightly sees these threats as serious:

This is not about protecting the internet against imaginary dangers. We’re seeing the breaks and cracks emerge, and they threaten to change the Internet’s fundamental architecture of openness. This would shrink opportunities for innovators, content creators and small businesses around the country, and limit the full and free expression the internet promises. This is about preserving and maintaining something profoundly successful and ensuring that it’s not distorted or undermined.

These rules will be very tough to enforce. The fundamental structure of the business works against them. A more effective approach may be to break up the ISPs into multiple independent companies, for example: Internet access operations, wide area network operations, and service/content/application operations. The neutrality problem is in the access networks – the WANs and the services are healthier. With only the telcos (DSL and fiber) and the MSOs (cable) there is not enough competition for a free market to develop. This is why Intel pushed so hard for WiMAX as a third mode of broadband access, though it hasn’t panned out that way. It is also why municipal dark fiber makes sense, following the model of roads, water and sewers.

Sharing Wi-Fi 2 – Atheros turns a cellphone into an access point

There are several smartphone applications that allow a cell phone to act as a wireless WAN router and Wi-Fi access point, creating a wireless LAN with Internet access. For the (jailbroken) iPhone there’s PDAnet, for Windows Mobile there’s WM Wi-Fi Router and for Symbian there’s Walking HotSpot and JoikuSpot. Now Atheros has proposed to bake this functionality into their low power Wi-Fi chipset.

An idea that is, as the patent jargon goes, “obvious to one skilled in the art” can sometimes have handicaps that are equally obvious to one experienced in the industry. While exposing a broadband wireless data connection through a smartphone’s Wi-Fi radio is massively useful to consumers, it is unlikely to appeal to network service providers, who would prefer you to buy a wireless data card (and an additional service subscription) for your laptop rather than simply use the wireless data connection that you are already paying for on your phone.

It will be interesting to see where this goes. I will be stunned if Atheros’ implementation appears on any phone subsidized by (or even distributed by) a wireless carrier, at least until the carriers figure out a way to charge extra for it. As Tim Wu says in his Wireless Carterfone paper (the Wireless Carterfone concept was promoted by Skype, and rejected by the FCC last April):

carriers are in a position to exercise strong control over the design of mobile equipment. They have used that power to force equipment developers to omit or cripple many consumer-friendly features.

The billing issue may not be that intractable. Closely related models already exist. You can get routers from Cisco and other vendors that have a slot for a wireless WAN card, and the service providers have subscription plans for them. More similarly, this could be viewed as a kind of “tethering.” But tethering only lets one PC at a time access the wireless WAN connection – unless that PC happens to support My Wi-Fi.

Update: Marvell has announced a similar capability for its 88W8688 chip.

Transparency and neutrality

Google and the New America Foundation have been working together for some time on White Spaces. Now they have (with PlanetLab and some academic researchers) come up with an initiative to inject some hard facts into the network neutrality debate.

The idea is that if users can easily measure their network bandwidth and quality of service, they will be able to hold their ISPs to the claims in their advertisements and “plans.” As things stand, businesses buying data links from network providers normally have a Service Level Agreement (SLA) which specifies minimum performance characteristics for their connections. For consumers, things are different. ISPs do not issue SLAs to their consumer customers. When they advertise uplink and downlink speeds, these speeds are “typical” or “maximum,” but they don’t specify a minimum speed, and they don’t offer any guarantees of latency, jitter, packet loss or even integrity of the packet contents. For example, here’s an excerpt from the Verizon Online Terms of Service:

VERIZON DOES NOT WARRANT THAT THE SERVICE OR EQUIPMENT PROVIDED BY VERIZON WILL PERFORM AT A PARTICULAR SPEED, BANDWIDTH OR DATA THROUGHPUT RATE, OR WILL BE UNINTERRUPTED, ERROR-FREE, SECURE…

Businesses pay more than consumers for their bandwidth, and providing SLAs is one of the reasons. Consumers would probably not be willing to pay more for SLAs, but they can still legitimately expect to know what they are paying for. The Measurement Lab data will be able to confirm or disprove accusations that ISPs are intentionally impairing traffic of some types.
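
To make the idea of consumer-side measurement concrete, here is a toy sketch of the sort of probe involved: it times repeated TCP connection setups to a well-known host and reports latency, spread and failures. This is not a Measurement Lab tool – the host, port and sample count are arbitrary – but even something this crude begins to put numbers on what an ISP is actually delivering.

```python
# Toy consumer-side probe: time repeated TCP connection setups and report
# latency, spread and failures. Host, port and sample count are arbitrary.
import socket
import statistics
import time

HOST, PORT, SAMPLES = "www.example.com", 80, 20

def probe_once(timeout=2.0):
    """Time one TCP connection setup in milliseconds; None means it failed."""
    start = time.monotonic()
    try:
        with socket.create_connection((HOST, PORT), timeout=timeout):
            return (time.monotonic() - start) * 1000.0
    except OSError:
        return None

results = [probe_once() for _ in range(SAMPLES)]
good = [r for r in results if r is not None]

print(f"probes: {SAMPLES}, failed: {SAMPLES - len(good)}")
if good:
    print(f"median connection latency: {statistics.median(good):.1f} ms")
if len(good) > 1:
    print(f"latency spread (stdev):    {statistics.stdev(good):.1f} ms")
```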

This is a complicated issue, because one man’s traffic blocking is another man’s network management, and what a consumer might consider acceptable use (like BitTorrent) may violate an ISP’s Acceptable Use Policy (Verizon: “…it is a violation of… this AUP to… generate excessive amounts of email or other Internet traffic;”). The arguments can go round in circles until terms like “excessive” and “unlimited” are defined numerically and measurements are made. So Measurement Lab is a great step forward in the Network Neutrality debate, and should be applauded by consumers and service providers alike.

Fixed Mobile Substitution and Voice over Wi-Fi

Getting rid of your land-line phone and relying on your cell phone instead is called Fixed Mobile Substitution (FMS).

A report from the National Center for Health Statistics of the Centers for Disease Control (CDC) shows a linear increase in the number of households that have a cell phone but no land-line, starting at 4.4% in 2004 and reaching 16.1% in the first half of 2008.
[Figure: US Fixed Mobile Substitution 2005-2008 – source: CDC]

These numbers match those in a recent Nielsen report on FMS.

FMS will most likely accelerate in 2009 because of the recession; it will be interesting to see by how much. We will reach a tipping point soon: 13% of households already have a landline that they don’t use.

There are about 112 million occupied housing units in the US, and about 71 million broadband subscribers.
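
For a sense of scale, here is a back-of-the-envelope extrapolation from the numbers above. It assumes the trend stays linear, which is the conservative case if the recession accelerates cord-cutting; the arithmetic is illustrative, not a forecast.

```python
# Back-of-the-envelope extrapolation of the CDC wireless-only trend,
# using only the figures quoted above. Illustrative, not a forecast.
pct_2004, pct_2008_h1 = 4.4, 16.1        # CDC: wireless-only household share
slope = (pct_2008_h1 - pct_2004) / 4.0   # roughly 2.9 percentage points per year
pct_2009 = pct_2008_h1 + slope           # if the linear trend simply continues

households = 112e6                       # occupied housing units, cited above
wireless_only = households * pct_2009 / 100

print(f"projected wireless-only share, 2009:  {pct_2009:.1f}%")
print(f"projected wireless-only households:   {wireless_only / 1e6:.0f} million")
```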

So what does this mean for Wi-Fi VoIP? One of the primary reasons for FMS is to save money; it is more prevalent in lower-income households. There are two kinds of phone that do VoWi-Fi: smartphones and UMA phones. Smartphones are expensive, and probably less common among the cord-cutting demographic – except that that demographic is also younger and better educated as well as having modest incomes; many are students.

Wi-Fi VoIP in smartphones is still negligible, but the seeds are planted: vigorous growth of smartphones, Wi-Fi attach rates in smartphones trending to 100%, a slow but steady opening up of smartphones to third-party applications, broadband in most homes, and Wi-Fi growing in all markets.

Self-configuring Femtocells

Rethink Wireless reports that picoChip has added cognitive capabilities to their femtocells. Related “sniffing” technology is used in White Spaces radios and in the UNII-2 band by Wi-Fi. The idea is to check how the spectrum is currently being used, and to arrange matters to interfere as little as possible. With White Spaces and Wi-Fi the sniffing is used to avoid spectrum occupied by a primary user. PicoChip uses it to create self-configuring networks:

As well as handling configuration, synchronization and hand-off – and reporting metrics on the cell to help network planning – the sniff function will support entirely self-organizing networks of the type Vodafone has outlined in recent presentations. Currently, most of the interference management these require is handled in different ways by the femtocell OEMs, but each has its own proprietary algorithms, making mixed-vendor networks difficult. The picoChip designs also allow the femto silicon to run the manufacturer-specific code.
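
To make the “sniff” idea concrete, here is a toy sketch of the general approach: scan the candidate channels, estimate how busy each one is, and bring the cell up on the quietest. This is emphatically not picoChip’s algorithm – measure_energy() below is a hypothetical stand-in for whatever the femtocell’s radio actually reports.

```python
# Toy "sniff then configure" loop. measure_energy() is a hypothetical
# stand-in for a real spectrum measurement; here it just returns noise.
import random

CANDIDATE_CHANNELS = [1, 3, 5, 7, 9]

def measure_energy(channel, samples=50):
    """Hypothetical: average detected energy on a channel, in dBm."""
    return sum(random.uniform(-100, -60) for _ in range(samples)) / samples

def pick_channel(channels):
    """Sniff every candidate channel and return the least occupied one."""
    readings = {ch: measure_energy(ch) for ch in channels}
    return min(readings, key=readings.get), readings

channel, readings = pick_channel(CANDIDATE_CHANNELS)
print(f"configuring cell on channel {channel}")
for ch, energy in sorted(readings.items()):
    print(f"  channel {ch}: {energy:.1f} dBm average")
```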

Wi-Fi certification for voice devices

In news that is huge for VoWi-Fi, the Wi-Fi Alliance announced on June 30th a new certification program, “Voice-Personal.” Eight devices have already been certified under this program, including enterprise access points from Cisco and Meru, a residential access point from Broadcom, and client adapters from Intel and Redpine Signals.

Why is this huge news? Well, as the press release points out, by 2011 annual shipments of cell phones with Wi-Fi will be running at roughly 300 million units. The Wi-Fi in these phones will be used for Internet browsing, for syncing photos and music with PCs, and for cheap or free voice calls.

The certification requirements for Voice-Personal are not aggressive: only four simultaneous voice calls in the presence of data traffic, with a latency of less than 50 milliseconds and a maximum jitter of less than 50 milliseconds. These numbers will produce an acceptable call under most conditions, but a network round-trip delay of 300 ms is generally considered to approach the limit of acceptability, and with a Wi-Fi hop at each end running at the limit of these specifications there would be no room in the latency budget for any additional delays in the voice path. The packet loss requirement, 1% with no burst losses, is a very good number considering that modern voice codecs from companies like GIPS can yield excellent sound quality in the presence of much higher packet loss. This number is hard to achieve in the real world, as phones encounter microwave ovens, move through spots of poor coverage and transition between access points.
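
A quick worked version of that latency budget, using only the figures above (a 300 ms round trip as the rough ceiling of acceptability, and a Wi-Fi hop at each end running right at the 50 ms limit):

```python
# Rough one-way latency budget for a call with a Voice-Personal Wi-Fi hop
# at each end, using the figures cited above. Illustrative arithmetic only.
one_way_budget_ms = 300 / 2    # 300 ms round trip ~= 150 ms each way
wifi_hops_ms = 2 * 50          # a Wi-Fi hop at each end, at the 50 ms limit
remaining_ms = one_way_budget_ms - wifi_hops_ms

print(f"left for codec, jitter buffer and WAN transit: {remaining_ms:.0f} ms")
# A jitter buffer sized to absorb the allowed 50 ms of jitter would consume
# that remainder on its own -- hence no room for any additional delays.
```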

Since this certification is termed “Voice-Personal,” four active calls per access point is acceptable; a residence is unlikely to need more than that. Three of the four access points submitted for this certification are enterprise access points, which will be expected to handle many more calls than that, and probably can. The Wi-Fi Alliance is planning a “Voice-Enterprise” certification for 2009.

There are several things that are good about this certification. First, the WFA has seen fit to highlight voice as a primary use for Wi-Fi, and has set a performance baseline. Second, this certification requires some other certifications as well, like WMM Power Save and WMM QoS. So far in 2008, of the 99 residential access points certified only 6 support WMM Power Save, and of the 52 enterprise access points only 13 do. One of the biggest criticisms of Wi-Fi in handsets is that it draws too much power. WMM Power Save yields radical improvements in battery life – better than doubling talk time and increasing standby time by over 30%, according to numbers in the WFA promotional materials.

White Space update

The forthcoming transition to digital TV transmissions will free up about half the spectrum currently allocated to TV broadcasters. This freed-up spectrum was the subject of the FCC’s just-concluded 700MHz Auction, which yielded about $20 billion in license fees to the government. The fate of the other half of the TV spectrum, the part that will remain assigned to TV broadcasts after the digital transition, remains in contention.

This spectrum will be shared by licensed TV broadcast channels and wireless microphones, but even so, much of it will remain mostly unused. These chunks of spectrum left idle by their licensees are called “White Spaces.” The advent of “spectrum sensing” radio technology means that it is now theoretically possible for transmitters to identify and use White Spaces without interfering with the licensed use of the spectrum.

The FCC has issued a Notice of Proposed Rulemaking and a First Report and Order to explore whether this is a good idea, and if so, how to handle it.

The potential users of the White Spaces have formed roughly two camps, those who see it best suited for fixed broadband access (similar to the first version of WiMAX), and those who see it as also suited for “personal/portable” applications (similar to Wi-Fi).

Google, along with Microsoft and some other computer industry companies, advocates the personal/portable use. The FCC’s Office of Engineering and Technology (OET) is currently lab-testing some devices from Microsoft and others to see if their spectrum-sensing capabilities are adequate to address the concerns of the broadcast industry, which fears that personal/portable use will cause interference.

Google filed an ex-parte letter with the FCC on March 24th, weighing in on the White Spaces issue. The letter is well worth reading. It concedes that in the introductory phases it makes sense to supplement spectrum sensing with other technologies, like geo-location databases and beacons. The letter asserts that these additional measures render moot the current squabble over a malfunction in the devices in the first round of FCC testing, and that the real-world data gathered in this introductory phase would give the FCC confidence ultimately to repeal the supplemental measures, and perhaps to extend open spectrum-sensing uses to the entire radio spectrum, leading to a nirvana of effectively unlimited bandwidth for everybody.

The kicker is in the penultimate paragraph, where Google recycles an earlier proposal it made for the 700MHz spectrum auction, suggesting a real-time ongoing “dynamic auction” of bandwidth. Google now suggests applying this dynamic auction idea to the white spaces:

For each available spectrum band, the licensee could bestow the right to transmit an amount of power for a unit of time, with the total amount of power in any location being limited to a specified cap. This cap would be enforced by measurements made by the communications devices. For channel capacity efficiency reasons, bands should be allocated in as large chunks as possible. The airwaves auction would be managed via the Internet by a central clearinghouse.

Current expectations are for spectrum-sensing use of the white spaces to be unlicensed (free, like Wi-Fi). Now Google appears to be proposing “sub-licensed” rather than unlicensed spectrum use. The word “auction” implies that this could be a revenue producer for TV broadcast licensees, who received their licenses free from the government. This is a very different situation from the original context of the dynamic auction proposal, which applied to spectrum for which licensees paid $20 billion. Depending on how it is implemented, it could fulfill the telcos’ dream of directly charging content providers for bandwidth on a consumer’s Internet access link, a dream that Google has opposed in the network neutrality wars. Google may ultimately regret opening the door to this one, even though it presumably sees itself cashing in as the ideal candidate to operate the “central clearinghouse.”
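
To make the quoted mechanism concrete, here is a toy model of such a clearinghouse: bidders ask to transmit some amount of power in a location for one time slot, and the highest per-watt offers are admitted until the local power cap is exhausted. The bidders, prices and cap below are invented purely for illustration.

```python
# Toy clearinghouse for a power-capped "dynamic auction" time slot.
# All numbers are invented for illustration.
POWER_CAP_WATTS = 10.0   # total transmit power allowed in this location/slot

# (bidder, requested watts, offered price per watt for the slot)
bids = [("A", 4.0, 0.50), ("B", 6.0, 0.80), ("C", 5.0, 0.30)]

def allocate(bids, cap):
    """Grant power to the highest per-watt bids until the cap is exhausted."""
    granted, remaining = [], cap
    for name, watts, price in sorted(bids, key=lambda b: b[2], reverse=True):
        share = min(watts, remaining)
        if share > 0:
            granted.append((name, share, share * price))
            remaining -= share
    return granted

for name, watts, cost in allocate(bids, POWER_CAP_WATTS):
    print(f"{name}: granted {watts:.1f} W, pays {cost:.2f}")
```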

Update April 10th: Interesting related posts from Michael Marcus and Sascha Meinrath.

FCC 700 MHz “Open Platform” Auction Completed

It took a while, and 261 rounds of bidding, but it’s over. Click here for the write-up from Wired.

The attractive thing about the 700 MHz spectrum that was freed up by the move to digital TV broadcasting is that transmissions at these frequencies pass through walls. The unusual thing about the “C Block” of this spectrum is that the FCC attached “open access” conditions to the license. This was at the behest of the computer industry, spearheaded by Google, who may even have made a bid on this block. But as the Wired story points out, Google had already won their victory with the imposition of the open access rules – winning the spectrum would have been more of a headache for them than losing it.

Don’t confuse the spectrum licensed in this auction with the White Spaces spectrum. The White Spaces spectrum is the spectrum that the TV broadcasters retained for their transition to digital transmissions in February 2009. The White Spaces issue is still unresolved by the FCC. The FCC is deliberating over whether to allow unlicensed use of the digital TV spectrum when it is not being used by a TV broadcast (hence “White Spaces.”) This use depends on effective functioning of “cognitive radio,” which lets transmitters sense by listening (and other means) when spectrum is available for use. If the FCC allows it, they still have to decide whether to allow only fixed broadband replacement like 802.22, or to allow “Personal and Portable” use as well.

Intel’s Primary Wireless Campus

Intel published a white paper last year about a trial deployment of 802.11a as a replacement for wired Ethernet at a 5,000-person campus. The results were lower costs and happier workers. This was just for PC connectivity; the dual-mode phone phase of the deployment is still to come.

There are several interesting findings in the white paper. First, while the latency of the network increased somewhat, the difference was imperceptible to the users. Second, Intel chose to abandon the VPN, relying on 802.11i for security. This made joining the network faster and easier.

The decision to use 802.11a was presumably for the greater capacity (more non-interfering channels than 11g) and for the cleaner spectrum. 802.11n is superior to 802.11a in both capacity and rate at range, so what was doable with 11a will be even easier with 11n.