HD Voice Numbers

According to Tom Lemaire, Sr. Director of Business Development at Orange/France Telecom North America, speaking at the CES HD Voice Summit this week, 50% of consumers say they would change their telephone service provider to get better sound quality (Orange/France Telecom has the largest HD Voice deployment of any traditional telco). Rich Buchanan, Chief Marketing Officer at Ooma, said at the same session that his surveys put the figure at 65%.

Bearing in mind that we know from observation that consumers value both mobility and price above call quality, these survey numbers fall into the “interesting if true” category.

Lemaire and Buchanan also said that their logs show that the average call in HD lasts longer than the average narrowband call, though they didn’t give numbers.

Top ten uses for an Internet Tablet/Web Slate

The tablet wars are imminent, with Microsoft, Google and Apple breaking out their big guns. Here’s what you will be doing with yours later this year:

  1. Internet browser, of course: think of the iPhone experience with a bigger screen. It will be super-fast over 802.11n in your home, and somewhat slower when you are out and about, tethered to your cell phone for wide-area connectivity. You don’t need a cellular connection in the Internet Tablet itself, though the cellcos wish you did.
  2. TV accessory: treat it as a personal picture-in-picture display. View the program guide without disturbing the other people watching the main screen. Use it for voting on shows like American Idol. Use it as a remote to change channels and set up recordings.
  3. TV replacement: a 10-inch screen at two feet subtends the same angle as a 50-inch screen at ten feet (see the sketch after this list). Use it with Hulu and the other streaming video services.
  4. Video iPod, but with a much nicer screen. Say goodbye to portable DVD players.
  5. Videophone: some Internet Tablets will have hi-res user-facing cameras and high-definition microphones and speakers: the perfect Skype phone to keep on your coffee table. How about one on your fridge door for an always-on video connection to the grandparents? Or, in a suitable charging base, a replacement for the office desk phone.
  6. Electronic picture frame: sure, it’s overkill for such a trivial application, but when the tablet isn’t doing anything else, why not?
  7. eBook reader: maybe not in 2010, but as screen and power technology evolve, the notion of a special-function eBook reader will become as quaint as a Wang word processor. (Never heard of a Wang word processor? I rest my case.)
  8. Home remote: take a look at AMX. This kind of top-of-the-line home control will be available to the masses. Set the thermostat, set the burglar alarm, look at the front door webcam when the doorbell rings…
  9. Game console: look at all the games for the iPhone. Many of them will work on Apple’s iSlate from day one. And you can bet there will be plenty of cool games for Android, and even Windows-based Internet Tablets.
  10. PND (personal navigation device) display: Google Maps on the iPhone is miraculously good, but it’s not perfect: the display is way too small for effective in-car navigation. It’s possible that some Internet Tablets will have GPS chips in them (GPS adds only a few dollars to the bill of materials), but for this application there’s no need. Tether the tablet to your cell phone for Internet connectivity and GPS, and use it for display and input only.
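As promised in item 3, here is the trigonometry behind the relative-size claim; a minimal illustrative sketch in Python:

```python
import math

def visual_angle_deg(diagonal_in, distance_ft):
    """Angle (degrees) subtended by a screen of the given diagonal
    (inches) viewed from the given distance (feet)."""
    distance_in = distance_ft * 12
    return math.degrees(2 * math.atan((diagonal_in / 2) / distance_in))

print(visual_angle_deg(10, 2))   # 10-inch tablet at 2 feet: ~23.5 degrees
print(visual_angle_deg(50, 10))  # 50-inch TV at 10 feet:    ~23.5 degrees
```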

2010 will be the year of the Internet Tablet. The industry has pretty much converged on the form factor: ten-inch-plus screen, touch interface, Wi-Fi connectivity. What remains up in the air are the details that will provide differentiation: cellular connectivity, cameras, speakers and microphones. Apple will jump-start the category, but there will quickly be a slew of contenders at sub-$200 price points.

Several technology advances have converged to make now the right time. Low-cost, low-energy ARM processors like the Qualcomm Snapdragon have enough processing muscle to drive PC-scale applications, and their pricing piggybacks on the manufacturing scale of phones. 802.11n is fast enough for responsive web-based applications and HD video streaming. LCD screens continue to get cheaper. Personal Wi-Fi networks enable tethering and wireless keyboards for when you need them.
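As a rough sanity check on that 802.11n claim, here is a back-of-envelope sketch; the throughput and stream-bitrate figures are assumptions, not measurements:

```python
# Assumed figures: real-world 802.11n TCP throughput on the order of
# 100 Mbps; a streaming HD video feed of the era on the order of 4 Mbps.
wifi_n_mbps = 100
hd_stream_mbps = 4

print(f"~{wifi_n_mbps // hd_stream_mbps} simultaneous HD streams")  # ~25
```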

This is also the perfect form factor for grade school kids. Once screen resolutions get high enough, books will disappear almost overnight. No more backs bent under packs laden with schoolbooks. Just this.

First 802.11n handset spotted in the wild – what took so long?

The fall 2009 crop of ultimate smartphones looks more penultimate to me, with its lack of 11n. But a handset with 802.11n has come in under the wire for 2009. Not officially, but actually. Slashgear reports a hack that kicks the Wi-Fi chip in the HTC HD2 into 11n mode. And the first ultimate smartphone of 2010, the HTC-built Google Nexus One, is also rumored to support 802.11n.

These are the drops before the deluge. Questions to chip suppliers have elicited mild surprise that there are still no Wi-Fi Alliance certifications for handsets with 802.11n. All the flagship chips from all the handset Wi-Fi chipmakers are 802.11n. Broadcom is already shipping its BCM4329 11n combo chip in volume to Apple for the iTouch (and, I would guess, the new Apple tablet), though the iPhone 3GS still sports the older BCM4325.

Some fear that 802.11n is a relative power hog, and will flatten your battery. For example, a GSMArena report on the HD2 hack says:

There are several good reasons why Wi-Fi 802.11n hasn’t made its way into mobile phones hardware just yet. Increased power consumption is just not worth it if the speed will be limited by other factors such as under-powered CPU or slow-memory…

But is it true that 802.11n increases power consumption at a system level? In some cases it may be: the Slashgear report linked above says: “some users have reported significant increases in battery consumption when the higher-speed wireless is switched on.”

This reality appears to contradict the opinion of one of the most knowledgeable engineers in the Wi-Fi industry, Bill McFarland, CTO at Atheros, who says:

The important metric here is the energy-per-bit transferred, which is the average power consumption divided by the average data rate. This energy can be measured in nanojoules (nJ) per bit transferred, and is the metric to determine how long a battery will last while doing tasks such as VoIP, video transmissions, or file transfers.

For example, Table 1 shows that for 802.11g the data rate is 22 Mbps and the corresponding receive power-consumption average is around 140 mW. While actively receiving, the energy consumed in receiving each bit is about 6.4 nJ. On the transmit side, the energy is about 20.4 nJ per bit.

Looking at these same cases for 802.11n, the data rate has gone up by almost a factor of 10, while power consumption has gone up by only a factor of 5, or in the transmit case, not even a factor of 3.

Thus, the energy efficiency in terms of nJ per bit is greater for 802.11n.

Here is his table illustrating the point:

[Table: Effect of Data Rate on Power Consumption. Source: Wireless Net DesignLine, 06/03/2008]
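McFarland’s arithmetic is easy to reproduce. Here is a minimal sketch using the figures from the quote; the 802.11n inputs are extrapolated from his “factor of 10” and “factor of 5” remarks rather than read from his table:

```python
def energy_per_bit_nj(power_mw, rate_mbps):
    """Energy per bit in nanojoules: average power over average data rate."""
    return (power_mw * 1e-3) / (rate_mbps * 1e6) * 1e9

# 802.11g receive, per the quote: 22 Mbps at ~140 mW
print(energy_per_bit_nj(140, 22))           # ~6.4 nJ per bit

# 802.11n receive: ~10x the rate for ~5x the power (extrapolated inputs)
print(energy_per_bit_nj(140 * 5, 22 * 10))  # ~3.2 nJ per bit
```

Fewer nanojoules per bit means the radio finishes the same payload sooner and can go back to sleep, which is the sense in which 802.11n is more efficient.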

The discrepancy between this theoretical superiority of 802.11n’s power efficiency and the complaints from the field can be explained in several ways. The power efficiency may actually be better and the field reports wrong. Or there may be some error in the HD2’s particular implementation of 802.11n, a problem that led HTC to disable it for the initial shipments.

Either way, 2010 will be the year for 802.11n in handsets. I expect all dual-mode handset announcements in the latter part of the year to have 802.11n.

As to why it took so long, I don’t think it did, really. The chips only started shipping this year, and there is a manufacturing lag between chip and phone. I suppose a phone could have started shipping around the same time as the latest iTouch, which was September, but three months is not an egregious lag.

All you can eat?

The always good Rethink Wireless has an article, “AT&T sounds deathknell for unlimited mobile data.”

It points out that with “3% of smartphone users now consuming 40% of network capacity,” the carrier has to draw a line. Presumably because if 30% of AT&T’s subscribers were to buy iPhones, they would consume 400% of the network’s capacity.

Wireless networks are badly bandwidth constrained. AT&T’s woes with the iPhone launch were caused by lack of backhaul (wired capacity to the cell towers), but the real problem is on the wireless link from the cell tower to the phone.

The problem here is one of setting expectations. Here’s an excerpt from AT&T’s promotional materials: “Customers with capable LaptopConnect products or phones, like the iPhone 3G S, can experience the 7.2 [megabit per second] speeds in coverage areas.” A reasonable person reading this might take it as an invitation to do something like video streaming. Actually, a single user running at this speed would consume the entire capacity of a cell-tower sector:
[Chart: HSPA cell capacity per sector per 5 MHz. Source: High Speed Radio Access for Mobile Communications, edited by Harri Holma and Antti Toskala.]
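Here is an illustrative sketch of the mismatch; the sector-capacity figure is an assumption, taking the chart’s point that aggregate sector throughput is on the order of a single advertised peak-rate user:

```python
advertised_peak_mbps = 7.2  # the rate AT&T advertises

# Assumption: total sector throughput is roughly one peak-rate user's worth.
sector_capacity_mbps = advertised_peak_mbps

for active_users in (1, 5, 20):
    share = sector_capacity_mbps / active_users
    print(f"{active_users:>2} active user(s) -> ~{share:.2f} Mbps each")
```

One continuous streamer takes the whole sector; twenty simultaneously active users would each see about a third of a megabit.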

This poses a dilemma, not just for AT&T but for all wireless service providers. Ideally you want the network to be super responsive, for example when you are loading a web page; that requires a lot of bandwidth in short bursts. So imposing a bandwidth cap, throttling download speeds to some arbitrary maximum, would give users a worse experience. But users who consume a lot of bandwidth continuously, streaming live TV for example, make things bad for everybody.

The cellular companies think of users like this as bad guys, taking more than their share. But actually they are innocently taking the carriers up on the promises in their ads. This is why the Rethink piece says “many observers think AT&T – and its rivals – will have to return to usage-based pricing, or a tiered tariff plan.”

Actually, AT&T already appears to have such a policy, reserving the right to charge more if you use over 5 GB per month. That is a lot, unless you are using your phone to stream video: it’s over 10,000 average web pages or 10,000 minutes of VoIP. You can avoid running over the cap by saving streaming video and videophone calls for when you are in Wi-Fi coverage. You can still watch videos when you are out and about by downloading them in advance, iPod style.
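The back-of-envelope arithmetic behind those two figures, assuming an average web page of roughly 500 KB and a VoIP stream of roughly 64 kbps (both assumed values):

```python
cap_bytes = 5e9      # the 5 GB monthly cap

page_bytes = 500e3   # assumed average web page: ~500 KB
voip_bps = 64e3      # assumed VoIP stream: ~64 kbps

print(f"~{cap_bytes / page_bytes:,.0f} web pages")               # ~10,000
print(f"~{cap_bytes * 8 / voip_bps / 60:,.0f} minutes of VoIP")  # ~10,417
```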

This doesn’t seem particularly burdensome to me.

Why are we waiting?

I just clicked on a calendar appointment enclosure in an email. Nothing happened, so I clicked again. Then suddenly two copies of the appointment appeared in my iCal calendar. Why on earth did the computer take so long to respond to my click? I waited an eternity – maybe even as long as a second.

The human brain has a clock speed of about 15 Hz. So anything that happens in less than 70 milliseconds seems instant. The other side of this coin is that when your computer takes longer than 150 ms to respond to you, it’s slowing you down.

I have difficulty fathoming how programmers are able to make modern computers run so slowly. The original IBM PC ran at well under 2 MIPS. The computer you are sitting at is around ten thousand times faster. It’s over 100 times faster than a Cray-1 supercomputer. This means that when your computer keeps you waiting for a quarter of a second, equally inept programming on the same task on an eighties-era IBM PC would have kept you waiting 40 minutes. I don’t know about you, but I encounter delays of over a quarter of a second with distressing frequency in my daily work.
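Here is that arithmetic, using the figures above:

```python
# One "tick" of a ~15 Hz brain clock: the threshold for "instant"
print(f"{1000 / 15:.0f} ms")                # ~67 ms

# A quarter-second delay today, scaled to a machine ~10,000x slower
print(f"{0.25 * 10_000 / 60:.0f} minutes")  # ~42 minutes on an 80s-era PC
```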

I blame Microsoft. Around Intel, the joke about performance was “what Andy giveth, Bill taketh away.” This was actually a winning strategy through decades of Microsoft success: concentrate on features and speed of implementation, and never waste time optimizing for performance, because the hardware will catch up. It’s hard to argue with success, but I wonder whether a software company obsessed with performance could be even more successful than Microsoft.

Network Neutrality – FCC issues NPRM

I wrote earlier about FCC chairman Julius Genachowski’s plans for regulations aimed at network neutrality. The FCC today came through with a Notice of Proposed Rulemaking (NPRM). Here are the relevant documents from the FCC website:

Summary Presentation: Acrobat
NPRM: Word | Acrobat
News Release: Word | Acrobat
Genachowski Statement: Word | Acrobat
Copps Statement: Word | Acrobat
McDowell Statement: Word | Acrobat
Clyburn Statement: Word | Acrobat
Baker Statement: Word | Acrobat

The NPRM itself is a hefty document, 107 pages long; if you just want the bottom line, the Summary Presentation is short and a little more readable than the press release. The comment period closes in mid-January, and the FCC will respond to the comments in March. I hesitate to guess when the rules will actually be released; this is hugely controversial, with 40,000 comments filed to date. Here is a link to a pro-neutrality advocate, and here is a link to a pro-competition advocate. I believe that the FCC is doing a necessary thing here, and that the proposals properly address the legitimate concerns of the ISPs.

Here is the story from Reuters, and from AP.

Dual mode phone trends update 3

I last looked at dual mode phone certifications on the Wi-Fi Alliance website almost a year ago.

Here’s what has happened since, through the first three quarters of 2009:
[Chart: Wi-Fi Alliance Dual-Mode Phone Certifications, 2005-2009]

There are still no certifications for 802.11 draft n, and almost none for 802.11a.

Here’s another breakdown, by manufacturer and year. It shows that the Wi-Fi enthusiasts have been pretty constant over the years: Nokia, HTC, Motorola and Samsung, joined more recently by Sony Ericsson and LG. Note that the 2009 figures only run through Q3, so the growth is even more impressive than the chart suggests.
[Chart: Wi-Fi Alliance Dual-Mode Phone Certifications, 2005-2009, by OEM]

The all-time champion is Samsung, with a total of 84 phone models certified for Wi-Fi, followed by Nokia with 68 and HTC with 54. The ranking changes if you look just at smartphones, where Nokia has 61 certifications to HTC’s 34 and Samsung’s 29.

3G network performance test results: Blackberries awful!

ARCchart has just published a report summarizing the data from a “test your Internet speed” applet that they publish for iPhone, Blackberry and Android. The dataset comprises millions of readings from every country and carrier in the world. The highlights from my point of view:

  1. 3G (UMTS) download speeds average about a megabit per second; 2.5G (EDGE) speeds average about 160 kbps and 2G (GPRS) speeds average about 50 kbps.
  2. For VoIP, latency is a critical measure. The average on 3G networks was 336 ms, with a variation between carriers and countries ranging from 200 ms to over a second. The ITU reckons latency becomes a serious problem above 170 ms. I discussed the latency issue on 3G networks in an earlier post.
  3. According to these tests, Blackberries average only half the download and upload speeds of iPhones and Android phones on the same networks. The Blackberry situation is complicated because RIM claims to compress data streams, and because all data normally passes through Blackberry servers. The ARCchart report looks into the reasons for Blackberry’s poor showing:

The BlackBerry download average across all carriers is 515 kbps versus 1,025 kbps for the iPhone and Android – a difference of half. Difference in the upload average is even greater – 62 kbps for BlackBerry compared with 155 kbps for the other devices.
Source: ARCchart, September 2009.

Femtocell pricing chutzpah

It’s like buying an airplane ticket then getting charged extra to get on the plane.

The cellular companies want you to buy cellular service and then pay extra to get signal coverage. Gizmodo has a coolly reasoned analysis.

AT&T Wireless is doing the standard telco thing here, conflating pricing for different services. It is sweetening the monthly-charge option for femtocells by bundling unlimited calling. A more honest scheme would be to provide femtocells free to anybody who has a coverage problem, and to offer the femtocell-plus-unlimited-calling bundle as a separate product. Come to think of it, this is probably how AT&T really plans for it to work: if a customer calls to cancel service because of poor coverage, I expect AT&T will offer a free femtocell as a retention incentive.

It is ironic that this issue is coming up just as the wireless carriers are up in arms about the FCC’s new network neutrality initiative. Now that smartphones all have Wi-Fi, if the handsets were truly open we could use our home Wi-Fi signal to get data and voice services from alternative providers when at home. No need for femtocells. (T-Mobile@Home is a closed-network version of this.)

Presumably something like this is on the roadmap for Google Voice, which is one of the scenarios that cause the MNOs (mobile network operators) to fight network neutrality tooth and nail.