Big Brother

Some ideas are so obvious once you hear them that you feel like you already had them yourself. One such is a new application for Wi-Fi from a company called Euclid Analytics.

Euclid’s idea is to provide Google Analytics-style information on foot traffic in retail stores. They implement it using the Wi-Fi on smartphones. This is technologically trivial: if you leave the Wi-Fi on your phone turned on, it will periodically transmit Wi-Fi packets, for example ‘probe requests.’ Every packet transmitted by a device contains a unique identifier for that device, the MAC address. So by gathering this information from a Wi-Fi access point, Euclid can tell how often and for how long each device is in the vicinity. Presumably enough people have Wi-Fi on their phones by now to gather statistically representative data for analytics purposes.
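To make the mechanism concrete, here is a minimal sketch (not Euclid’s actual system; the timestamps and MAC addresses are made up for illustration) of the kind of aggregation an access point’s probe-request log makes possible – visit counts and dwell times per device, with no opt-in required:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical probe-request log: (timestamp, MAC address) pairs, as an
# access point might record them. The MACs here are invented.
observations = [
    (datetime(2012, 9, 1, 10, 0),  "aa:bb:cc:dd:ee:01"),
    (datetime(2012, 9, 1, 10, 12), "aa:bb:cc:dd:ee:01"),
    (datetime(2012, 9, 1, 10, 3),  "aa:bb:cc:dd:ee:02"),
    (datetime(2012, 9, 2, 15, 0),  "aa:bb:cc:dd:ee:01"),  # a repeat visit
]

def visits(observations, gap=timedelta(minutes=30)):
    """Group each device's sightings into visits: a new visit starts
    whenever more than `gap` elapses between successive sightings.
    Returns {mac: (visit_count, [duration of each visit])}."""
    by_mac = defaultdict(list)
    for ts, mac in observations:
        by_mac[mac].append(ts)
    result = {}
    for mac, times in by_mac.items():
        times.sort()
        count, start, last = 1, times[0], times[0]
        durations = []
        for ts in times[1:]:
            if ts - last > gap:          # long silence: visit ended
                durations.append(last - start)
                count += 1
                start = ts
            last = ts
        durations.append(last - start)
        result[mac] = (count, durations)
    return result

stats = visits(observations)
# Device ...01: seen twice on Sept 1 within 12 minutes (one 12-minute
# visit), then again on Sept 2 -- a second, repeat visit.
```

Note that nothing here needs the phone owner’s cooperation beyond leaving Wi-Fi switched on.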

The Euclid technology doesn’t require your opt-in, and it doesn’t need to be tied to Wi-Fi. The concept can trivially be extended to any phone (not just Wi-Fi equipped ones) by using cellular packets rather than Wi-Fi, and for people with no phone, face recognition with in-store cameras. For this kind of application even 90% accuracy on the face recognition would be useful.

One of only four choices on the navigation menu of Euclid’s website is Privacy. Privacy gets this prominent treatment because the privacy issues raised by this technology are immense.

Gathering this kind of information for one store – anonymous traffic by time, duration of stay, repeat visits and so on – doesn’t seem too intrusive on individuals, but Euclid will be tempted to aggregate it across all the stores in the world, and to correlate its data with other data that stores already gather, like point of sale records.

Many technology sophisticates I talk with tell me that it is naive to expect any privacy whatsoever in the Internet age, and I guess this is another example. Euclid will effectively know where you are most of the time, but it won’t know much more than your cellular provider, or any of the app vendors to whom you have given location permission on your phone.

ABI projects rapid uptake of 802.11ac

An interesting graphic from ABI research projects ongoing rapid growth in Wi-Fi in phones out to 2017, when the penetration will be approaching 100%. This doesn’t seem to be unrealistically fast to me, since the speed at which feature phones have been displaced by low-cost smartphones implies that by that time practically all phones will be smartphones, and since carriers are increasingly attracted to Wi-Fi for data offload from their cellular networks.

The graphic also shows rapid transition from 802.11n to 802.11ac starting next year.

Wi-Fi Penetration in Mobile Handsets by Protocol, World Market, Forecast 2010–2017

ITExpo: Anatomy of Enterprise Mobility: Revolutionizing the Mobile Workforce

If you are going to ITExpo West 2012 in Austin, make sure you attend my panel on this topic at 10:00 am on Friday, October 5th.

The panelists are Brigitte Anschuetz of IBM, Akhil Behl of Cisco Systems, John Gonsalves of Symphony Teleca Corporation, Sam Liu of Partnerpedia and Bobby Mohanty of Vertical.

The pitch for the panel is:

Enterprise mobility is one of the fastest growing areas of business, allowing companies to virtually connect with customers and employees from anyplace in the world. CIOs are facing more decisions than ever when it comes to managing their mobile workforce. Employees expect to be able to do their work on multiple platforms, from desktops and laptops to tablets and smartphones.

This session will dive into the various components of an enterprise mobility solution, provide best practices to ensure they are successful and explain how they integrate together to enable companies to grow their business. Topics will include: mobile enterprise application platforms, enterprise app stores, mobile device management, expense management, and analytics.

ITExpo: BYOD – The New Mobile Enterprise

If you are going to ITExpo West 2012 in Austin, make sure you attend my panel on this topic at 1:30 pm on Wednesday, October 3rd.

The panelists are Jeanette Lee of Ruckus Wireless, Ed Wright of ShoreTel and John Cash of RIM.

The pitch for the panel is:

BYOD (Bring Your Own Device) has been in full swing for a couple of years now, and there’s no going back. Enterprises have adopted a policy of allowing users to use their own devices to access corporate networks and resources. With it comes the cost savings of not having to purchase as many mobile devices, and user satisfaction increases when they are able to choose their preferred devices and providers (and avoid having to carry multiple devices). But the benefits don’t come without challenges — the user experience must be preserved, security policies must accommodate these multiple devices and operating systems, and IT has to contend with managing applications and access across different platforms. This session looks at what businesses can do to mitigate risks and ensure performance while still giving their users the device freedom they demand.

802.11ac and 802.11ad. Why both?

The current generation of Wi-Fi, 802.11n, is soon to be superseded by not one, but two new flavors of Wi-Fi: 802.11ac and 802.11ad (also known as WiGig).

802.11n claims a maximum raw data rate of 600 megabits per second. 802.11ac and WiGig each claim over ten times this – about 7 gigabits per second. So why do we need them both? The answer is that they have different strengths and weaknesses; to some extent they can fill in for each other, and while they are useful for some of the same things, each has use cases for which it is superior.

The primary difference between them is the spectrum they use. 802.11ac uses 5 GHz, WiGig uses 60 GHz. There are four main consequences of this difference in spectrum:

  • Available bandwidth
  • Propagation distance
  • Crowding
  • Antenna size

Available bandwidth:

There are no free lunches in physics. The maximum bit-rate of a wireless channel is limited by its bandwidth (i.e. the amount of spectrum allocated to it – 40 MHz in the case of old-style 802.11n Wi-Fi), and the signal-to-noise ratio. This limit is given by the Shannon-Hartley Theorem. So if you want to increase the speed of Wi-Fi you either have to allocate more spectrum, or improve the signal-to-noise ratio.
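To see what the theorem implies, here is a quick Python sketch of the Shannon-Hartley limit. The 25 dB signal-to-noise ratio is an assumption chosen for illustration, not a figure from any specification:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + SNR), with SNR as a
    linear power ratio (converted here from decibels)."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 40 MHz 802.11n-style channel at an assumed 25 dB SNR:
c_40 = shannon_capacity(40e6, 25)    # ceiling of roughly 330 Mbit/s
# Quadruple the bandwidth at the same SNR and the ceiling quadruples:
c_160 = shannon_capacity(160e6, 25)  # roughly 1.3 Gbit/s
```

The capacity scales linearly with bandwidth but only logarithmically with SNR, which is why the extra spectrum at 60 GHz matters so much.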
The amount of spectrum available to Wi-Fi is regulated by the FCC. At 5 GHz, Wi-Fi can use 0.55 GHz of spectrum. At 60 GHz, Wi-Fi can use 7 GHz of spectrum – over ten times as much. 802.11ac divides its spectrum into five 80 MHz channels, which can be optionally bonded into two and a half 160 MHz channels, as shown in this FCC graphic:

Channels in 5 GHz

802.11ad has it much easier:

“Worldwide, the 60 GHz band has much more spectrum available than the 2.4 GHz and 5 GHz bands – typically 7 GHz of spectrum, compared with 83.5 MHz in the 2.4 GHz band.
This spectrum is divided into multiple channels, as in the 2.4 GHz and 5 GHz bands. Because the 60 GHz band has much more spectrum available, the channels are much wider, enabling multi-gigabit data rates. The WiGig specification defines four channels, each 2.16 GHz wide – 50 times wider than the channels available in 802.11n.”

So for the maximum claimed data rates, 802.11ac uses channels 160 MHz wide, while WiGig uses 2,160 MHz per channel. That’s almost fourteen times as much, which makes life a lot easier for the engineers.

Propagation Distance:

60 GHz radio waves are absorbed by oxygen, but for LAN-scale distances this is not a significant factor. On the other hand, wood, bricks and particularly paint are far more opaque to 60 GHz waves:
Attenuation of various materials by frequency

Consequently WiGig is most suitable for in-room applications. The usual example for this is streaming high-def movies from your phone to your TV, but for their leading use case WiGig proponents have selected an even shorter range application: wireless docking for laptops.

Crowding:

5 GHz spectrum is also used by weather radar (Terminal Doppler Weather Radar or TDWR), and the FCC recently ruled that part of the 5 GHz spectrum is completely prohibited to Wi-Fi. These channels are marked “X” in the graphic above. All the channels in the graphic designated “U-NII 2” and “U-NII 3” are also subject to a requirement called “Dynamic Frequency Selection” or DFS, which says that if radar activity is detected on a channel, the Wi-Fi device must stop using it.

5 GHz spectrum is not nearly as crowded as the 2.4 GHz spectrum used by most Wi-Fi, but it’s still crowded and cramped compared to the wide open vistas of 60 GHz. Even better, the poor propagation of 60 GHz waves means that even nearby transmitters are unlikely to cause interference. And with beam-forming (discussed in the next section), even transmitters in the same room cause less interference. So WiGig wins on crowding in multiple ways.

Antenna size:

Antenna size and spacing are proportional to the wavelength of the spectrum being used. This means that 5 GHz antennas would be about an inch long and spaced about an inch apart. At 60 GHz, the antenna size and spacing would be about a tenth of this! So for handsets, multiple antennas are trivial to do at 60 GHz, more challenging at 5 GHz.
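The arithmetic is easy to check: a half-wave antenna element is half a wavelength long, c/(2f). A quick Python sketch:

```python
# Half-wave antenna length scales inversely with frequency: L = c / (2f).
C_LIGHT = 299_792_458  # speed of light, m/s

def half_wavelength_cm(freq_ghz):
    """Length in centimeters of a half-wave element at freq_ghz."""
    return C_LIGHT / (2 * freq_ghz * 1e9) * 100

l_5ghz = half_wavelength_cm(5)    # ~3.0 cm, roughly an inch
l_60ghz = half_wavelength_cm(60)  # ~0.25 cm, a twelfth of that
```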

What’s so great about having multiple antennas? I mentioned earlier that there are no free lunches in physics, and that maximum bit-rate depends on channel bandwidth and signal to noise ratio. That’s how it used to be. Then in the mid-1990s engineers discovered (invented?) an incredible, wonderful free lunch: MIMO. Two adjacent antennas transmitting different signals on the same frequency normally interfere with each other. But the MIMO discovery was that if in this situation you also have two antennas on the receiver, and if there are enough things in the vicinity for the radio waves to bounce off (like walls, floors, ceilings, furniture and so on), then you can take the jumbled-up signals at the receiver, and with enough mathematical computer horsepower you can disentangle the signals completely, as if you had sent them on two different channels. And there’s no need to stop at two antennas. With four antennas on the transmitter and four on the receiver, you get four times the throughput. With eight, eight times. Multiple antennas receiving, multiple antennas sending. Multiple In, Multiple Out: MIMO. This kind of MIMO is called Spatial Multiplexing, and it is used in 802.11n and 802.11ac.
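Here is a toy Python illustration of spatial multiplexing, with a made-up 2×2 channel matrix standing in for the multipath environment. The receiver sees two jumbled weighted sums, but knowing the channel matrix H (in practice learned from training symbols), it can solve the 2×2 system and recover both streams:

```python
# Two antennas transmit x1, x2 on the same frequency at the same time.
x = (1.0, -1.0)                  # the two transmitted symbols

# Made-up channel gains: H[i][j] is the path from transmit antenna j
# to receive antenna i (reflections off walls etc. make these differ).
H = ((0.9, 0.4),
     (0.3, 1.1))

# What the two receive antennas actually see: y = H x, jumbled together.
y = (H[0][0] * x[0] + H[0][1] * x[1],
     H[1][0] * x[0] + H[1][1] * x[1])

# The receiver inverts the 2x2 system (Cramer's rule) to disentangle it.
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
x1 = (y[0] * H[1][1] - H[0][1] * y[1]) / det
x2 = (H[0][0] * y[1] - y[0] * H[1][0]) / det
# (x1, x2) matches the transmitted (1.0, -1.0): two streams recovered
# from one shared channel -- the MIMO free lunch.
```

Real receivers face noise and a channel matrix that may be nearly singular, which is why rich multipath (lots of things to bounce off) is what makes spatial multiplexing work.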

Another way multiple antennas can be used is “beam-forming.” This is where the same signal is sent from each antenna in an array, but at slightly different times. This causes interference patterns between the signals, which (with a lot of computation) can be arranged in such a way that the signals reinforce each other in a particular direction. This is great for WiGig, because it can easily have as many as 16 of its tiny antennas in a single array, even in a phone, and that many antennas can focus the beam tightly. Even better, shorter wavelengths tend to stay focused. So most of the transmission power can be aimed directly at the receiving device. So for a given power budget the signal can travel a lot further, or for a given distance the transmission power can be greatly reduced.
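A small Python sketch shows the effect for a hypothetical 16-element linear array with half-wavelength spacing: when the per-element phase delays are chosen to steer the beam at the receiver, the signals add coherently in that direction and largely cancel elsewhere:

```python
import cmath
import math

def array_gain(n_elements, steer_deg, look_deg, spacing_wavelengths=0.5):
    """Magnitude of the array factor for a uniform linear array whose
    element phases are set to steer the beam toward steer_deg, observed
    from direction look_deg (both measured from broadside)."""
    k_d = 2 * math.pi * spacing_wavelengths
    phase = k_d * (math.sin(math.radians(look_deg))
                   - math.sin(math.radians(steer_deg)))
    return abs(sum(cmath.exp(1j * n * phase) for n in range(n_elements)))

# 16 tiny 60 GHz antennas steered at a receiver dead ahead (0 degrees):
on_target = array_gain(16, 0, 0)    # all 16 elements add coherently
off_target = array_gain(16, 0, 20)  # only a small fraction leaks this way
```

`on_target` comes out to 16 (every element in phase), while `off_target` is a small fraction of that, which is why little power is wasted in directions away from the receiver.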

How does 802.11ac get to 6.9 Gigabits per second?

You know from a previous post how 802.11n gets to 600 megabits per second. 802.11ac does just three things to increase that by 1,056%:

  1. It adds a new Modulation and Coding Scheme (MCS) called 256-QAM. This increases the number of bits transmitted per symbol from 6 to 8, a factor of 1.33.
  2. It increases the maximum channel width from 40 MHz to 160 MHz (160 MHz is optional, but 80 MHz support is mandatory.) This increases the number of subcarriers from 108 to 468, a factor of 4.33.
  3. It increases the maximum MIMO configuration from 4×4 to 8×8, increasing the number of spatial streams by a factor of 2. Multi-User MIMO (MU-MIMO) with beamforming means that these spatial streams can be directed to particular clients, so while the AP may have 8 antennas, the clients can have fewer, for example 8 clients each with one antenna.

Put those factors together and you have 1.33 x 4.33 x 2 = 11.56. Multiply the 600 megabits per second of 802.11n by that factor and you get 600 x 11.56 = 6,933 megabits per second for 802.11ac.
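The same numbers can be checked from first principles: peak PHY rate is coded bits per OFDM symbol divided by the symbol time. A Python sketch, using the standard parameters (108 or 468 data subcarriers, rate-5/6 coding, 3.6 µs symbols with the short guard interval):

```python
def max_phy_rate(data_subcarriers, bits_per_subcarrier, coding_rate,
                 spatial_streams, symbol_time_s=3.6e-6):
    """Peak PHY rate = coded bits per OFDM symbol / symbol time.
    3.6 us is the OFDM symbol duration with the short guard interval."""
    bits_per_symbol = (data_subcarriers * bits_per_subcarrier
                       * coding_rate * spatial_streams)
    return bits_per_symbol / symbol_time_s

# 802.11n: 40 MHz channel, 64-QAM (6 bits), rate-5/6 coding, 4 streams
n_rate = max_phy_rate(108, 6, 5/6, 4)   # 600 Mbit/s
# 802.11ac: 160 MHz channel, 256-QAM (8 bits), rate-5/6 coding, 8 streams
ac_rate = max_phy_rate(468, 8, 5/6, 8)  # ~6,933 Mbit/s
```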

Note that nobody does this yet, and 160 MHz channels and 8×8 MIMO are likely to remain unimplemented for a long time. For example Broadcom’s recently announced BCM4360 and Qualcomm’s QCA9860 do 80 MHz channels, not 160 MHz, and 3×3 MIMO, so they claim maximum raw bit-rates of 1.3 gigabits per second. Which is still impressive.

Maximum theoretical raw bit-rate is a fun number to talk about, but of course in the real world it will (almost) never happen. What’s more important is the useful throughput (raw bit-rate minus MAC overhead) and rate at range – the throughput you are likely to get at useful distances. Rate at range is very difficult to engineer, and it is where the manufacturers can differentiate with superior technology. For phone chips, power efficiency is also an important differentiator.

Droid Razr first look.

First impression is very good. The industrial design on this makes the iPhone look clunky. The screen is much bigger, the overall feel reeks of quality, just like the iPhone. The haptic feedback felt slightly odd at first, but I think I will like it when I get used to it.

I was disappointed when the phone failed to detect my 5 GHz Wi-Fi network. This is like the iPhone, but the Samsung Galaxy S2 and Galaxy Nexus support 5 GHz, and I had assumed parity for the Razr.

Oddly, bearing in mind its dual core processor, the Droid Razr sometimes seems sluggish compared to the iPhone 4. But the Android user interface is polished and usable, and it has a significant user interface feature that the iPhone sorely lacks: a universal ‘back’ button. The ‘back’ button, like the ‘undo’ feature in productivity apps, fits with the way people work and learn: try something, and if that doesn’t work, try something else.

The Razr camera is currently unusable for me. The first photo I took had a 4 second shutter lag. On investigation, I found that if you hold the phone still, pointed at a static scene, it takes a couple of seconds to auto-focus. If you wait patiently for this to happen, watching the screen and waiting for the focus to sharpen, then press the shutter button, there is almost no shutter lag. But if you try to ‘point and shoot’ the shutter lag can be agonizingly long – certainly long enough for a kid to dodge out of the frame. This may be fixable in software, and if so, I hope Motorola gets the fix out fast.

While playing with the phone, I found it got warm. Not uncomfortably hot, but warm enough to worry about the battery draining too fast. Investigating this, I found a wonderful power analysis display, showing which parts of the phone are consuming the most power. The display, not surprisingly, was consuming the most – 35%. But the second most, 24%, was being used by ‘Android OS’ and ‘Android System.’ As the battery expired, the phone kindly suggested that it could automatically shut things off for me when the power got low, like social network updates and GPS. It told me that this could double my battery life. Even so, battery life does not seem to be a strength of the Droid Razr. Over a few days, I observed that even when the phone was completely unused, the battery got down to 20% in 14 hours, and the vast majority of the power was spent on ‘Android OS.’

So nice as the Droid Razr is, on balance I still prefer the iPhone.

P.S. I had a nightmare activation experience – I bought the phone at Best Buy and supposedly due to a failure to communicate between the servers at Best Buy and Verizon, the phone didn’t activate on the Verizon network. After 8 hours of non-activation including an hour on the phone with Verizon customer support (30 minutes of which was the two of us waiting for Best Buy to answer their phone), I went to a local Verizon store which speedily activated the phone with a new SIM.

Deciding on the contract, I was re-stunned to rediscover that Verizon charges $20 per month for SMS. I gave this a miss since I can just use Google Voice, which costs $480 less over the life of the contract.

MIMO for handset Wi-Fi

I mentioned earlier that the Wi-Fi Alliance requires MIMO for 802.11n certification except for phones, which can be certified with a single stream. This waiver was for several reasons, including power, size and the difficulty of getting two spatially separated antennas into a handset. Atheros and Marvell appear to have overcome those difficulties; both have announced 2×2 Wi-Fi chips for handsets. Presumably TI and Broadcom will not be far behind.

The Atheros chip is called the AR6004. According to Rethink Wireless,

The AR6004 can use both the 2.4GHz and the 5GHz bands and is capable of real world speeds as high as 170Mbps. Yet the firm claims its chip consumes only 15% more power than the current AR6003, which delivers only 85Mbps. It will be available in sample quantities by the end of this quarter and in commercial quantities in the first quarter of next year.

The AR6004 appears to be designed for robust performance. It incorporates all the optional features of 802.11n intended to improve rate at range. Atheros brands this suite of features “Signal Sustain Technology.” The AR6004 is also designed to reduce the total solution footprint, by including on-chip power amplifiers and low-noise amplifiers. Historically on-chip CMOS power amplifiers have performed worse than external PAs using GaAs, but Atheros claims to have overcome this deficiency, prosaically branding its solution “Efficient Power Amplifier.”

The 88W8797 from Marvell uses external PAs and LNAs, but saves space a different way, by integrating Bluetooth and FM onto the chip. The data sheet on this chip doesn’t mention as many of the 802.11n robustness features as the Atheros one does, so it is unclear whether the chip supports LDPC, for example.

Both chips claim a maximum 300 Mbps data rate. Atheros translates this to an effective throughput of 170 Mbps.

Of course, these chips will be useful in devices other than handsets. They are perfect for tablets, where there is plenty of room for two antennas at the right separation.

ITExpo East 2011: C-01 “Connecting the Distributed Enterprise via Video”

I will be moderating this panel at IT Expo in Miami on February 3rd at 9:00 am:

In today’s distributed workforce environment, it’s essential to be able to communicate with employees and customers across the globe both efficiently and effectively. Until recently, doing so was far more easily said than done because not only was the technology not in place, but video wasn’t accepted as a form of business communication. Now that video has burst onto the scene by way of Apple’s FaceTime, Skype and Gmail video chat, consumers are far more likely to pick video over voice – both in their homes and at their workplaces. But, though demand has never been higher, enterprise networks still experience a slow-down when employees attempt to access video streams from the public Internet because the implementation of IP video is not provisioned properly. This session will provide an overview of the main deployment considerations so that IP video can be successfully deployed inside or outside the corporate firewall, without impacting the performance of the network, as well as how networks need to adapt to accommodate widespread desktop video deployments. It will also explore the latest in video compression technology in order to elucidate the relationship between video quality, bandwidth, and storage. With the technology in place, an enterprise can efficiently leverage video communication to lower costs and increase collaboration.

The panelists are:

  • Mike Benson, Regional Vice President, VBrick Systems
  • Anatoli Levine, Sr. Director, Product Management, RADVISION Inc.
  • Matt Collier, Senior Vice President of Corporate Development, LifeSize

VBrick claims to be the leader in video streaming for enterprises. Radvision and LifeSize (a subsidiary of Logitech) are oriented towards video conferencing rather than streaming. It will be interesting to get their respective takes on bandwidth constraints on the WLAN and the access link, and what other impairments are important.

IT Expo East 2011: NGC-04 “Meeting the Demand for In-building Wireless Networks”

I will be moderating this panel at IT Expo in Miami on February 2nd at 12:00 pm:

Mobility is taking the enterprise space by storm – everyone is toting a smartphone, tablet, laptop, or one of each. It’s all about whatever device happens to be the most convenient at the time, and the theory behind unified communications: anytime, anywhere, any device. The adoption of mobile devices in the home and their relevance in the business space has helped drive a new standard for enterprise networking, which is rapidly becoming a wireless opportunity: Wi-Fi networks offer not only the convenience and flexibility of in-building mobility, but are also much easier and more cost-effective to deploy than Ethernet. Furthermore, the latest wireless standards largely eliminate the traditional performance gap between wired and wireless and, when properly deployed, Wi-Fi networks are at least as secure as wired. This session will discuss the latest trends in enterprise wireless, the secrets to successful deployments, and how to make the most of your existing infrastructure while moving forward with your Wi-Fi installation.

The panelists are:

  • Shawn Tsetsilas, Director, WLAN, Cellular Specialties, Inc.
  • Perry Correll, Principal Technologist, Xirrus Inc.
  • Adam Conway, Vice President of Product Management, Aerohive

Cellular Specialties in this context is a system integrator, and one of their partners is Aerohive. Aerohive’s special claim to fame is that they eliminate the WLAN controller, so each access point controls itself in cooperation with its neighbors. The only remaining centralized function is the management. Aerohive claims that this architecture gives them superior scalability, and a lower system cost (since you only pay for the access points, not the controllers).

Xirrus’s product is unusual in a different way, packing a dozen access points into a single sectorized box, to massively increase the bandwidth available in the coverage areas.

So is it true that Wi-Fi has evolved to the point that you no longer need wired ethernet?