Mobile Malware Update

Blue Coat Systems has published an interesting report on the state of mobile malware. The good news is that in the words of the report “the devices’ security model” is not yet “broken.” This means that smartphones and tablets are still rarely hijacked by viruses in the way that computers commonly are.

Now for the bad news. On the Android side (though apparently not yet on the iOS side), virus-style hijackings have begun to appear:

Blue Coat WebPulse collaborative defense first detected an Android exploit in real time on February 5, 2009. Since then, Blue Coat Security Labs has observed a steady increase in Android malware. In the July-September 2012 quarter alone, Blue Coat Security Labs saw a 600 percent increase in Android malware over the same period last year.

But this increase is from a minuscule base, and this type of threat is still relatively minor on mobile devices. Instead the report says, “user behavior becomes the Achilles heel.” The main mobile threats are from what the report calls “mischiefware.”

Mischiefware works by enticing the user into doing something they did not intend. The two main categories of mischiefware are:

  1. Phishing, which tricks users into disclosing personal information that can be used for on-line theft.
  2. Scamming, which tricks users into paying far more than they expect for something – like for-pay text (SMS) messages or in-app purchases. Even legitimate service providers can be guilty of this type of ‘gotcha’ activity, with rapacious international data roaming charges, or punitive overage charges on monthly ‘plans.’

“User behavior becomes the Achilles heel” is hardly a revelation. A more appropriate phrase would be “user behavior remains the Achilles heel,” since in this respect the mobile world is no different from the traditional networking world.

Mobile Security and HTML5

Smartphones and tablets have plenty of computing power to host malware, and they are simultaneously connected to the Internet via a cellular connection and to the LAN via Wi-Fi. So everybody in your organization carries, in their pocket, a device capable of bypassing your firewall.

The good news is that smartphone OSes were designed recently enough that their creators were able to build security into the platforms, using techniques like ARM TrustZone and a “chain of trust.” Technologies of this type are merely optional on PCs. Plus, the Android and iPhone app stores tightly control the applications they distribute, and most people don’t take the trouble to circumvent this protection. With these system-level and application-level protections, smartphones and tablets are intrinsically less vulnerable than PCs.

But there’s plenty of bad news, too. The chain of trust isn’t foolproof, and malicious code can get through the app store certification process.

On top of these traditional threats, a new one looms: HTML5. Adobe Flash is so notoriously vulnerable that Steve Jobs refused to let it onto the iPhone. Adobe has now thrown in the towel and committed to HTML5 instead. HTML5 is presumably safer than Flash, but it is untried, and it has powerful access to the platform, more akin to that of a native app than of traditional HTML.

This means that we can expect a rising tide of smartphone-related security breaches.

Big Brother

Some ideas are so obvious once you hear them that you feel like you already had them yourself. One such is a new application for Wi-Fi from a company called Euclid Analytics.

Euclid’s idea is to provide Google Analytics-style information on foot traffic in retail stores. They implement it using the Wi-Fi on smart phones. This is technologically trivial: if you leave the Wi-Fi on your phone turned on, it will periodically transmit Wi-Fi packets, for example ‘probe requests.’ Every packet transmitted by a device contains a unique identifier for that device, the MAC address. So by gathering this information from a Wi-Fi access point, Euclid can tell how often and for how long each device is in the vicinity. Presumably enough people have Wi-Fi on their phones by now to gather statistically representative data for analytics purposes.
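The analytics layer on top of this data is straightforward. Here is a minimal sketch of how dwell time and repeat visits could be derived from a log of (timestamp, MAC address) probe-request sightings – the function name, field layout and five-minute gap threshold are my own illustrative assumptions, not Euclid's:

```python
from collections import defaultdict

def summarize_visits(sightings, gap_seconds=300):
    """Group probe-request sightings into visits per device.

    sightings: iterable of (timestamp_seconds, mac_address) tuples.
    A new visit starts when a device goes unseen for > gap_seconds.
    Returns {mac: [(visit_start, visit_end), ...]}.
    """
    by_mac = defaultdict(list)
    for ts, mac in sorted(sightings):
        by_mac[mac].append(ts)

    visits = defaultdict(list)
    for mac, stamps in by_mac.items():
        start = prev = stamps[0]
        for ts in stamps[1:]:
            if ts - prev > gap_seconds:   # device left and came back later
                visits[mac].append((start, prev))
                start = ts
            prev = ts
        visits[mac].append((start, prev))
    return dict(visits)

# One phone seen twice, two hours apart: counted as two separate visits.
log = [(0, "aa:bb"), (60, "aa:bb"), (120, "aa:bb"),
       (7200, "aa:bb"), (7260, "aa:bb")]
print(summarize_visits(log))   # {'aa:bb': [(0, 120), (7200, 7260)]}
```

From per-device visit lists like this, store-level statistics (traffic by time of day, average dwell, repeat-visit rate) fall out with simple aggregation.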

The Euclid technology doesn’t require you to opt in, and it doesn’t need to be tied to Wi-Fi. The concept can trivially be extended to any phone (not just Wi-Fi equipped ones) by using cellular packets rather than Wi-Fi, and to people with no phone at all by using face recognition with in-store cameras. For this kind of application even 90% accuracy on the face recognition would be useful.

Privacy is one of only four choices on the navigation menu of Euclid’s website. It gets this prominent treatment because the privacy issues raised by this technology are immense.

Gathering this kind of information for one store – anonymous traffic by time, duration of stay, repeat visits and so on – doesn’t seem too intrusive on individuals, but Euclid will be tempted to aggregate it across all the stores in the world, and to correlate its data with other data that stores already gather, like point of sale records.

Many technology sophisticates I talk with tell me that it is naive to expect any privacy whatsoever in the Internet age, and I guess this is another example. Euclid will effectively know where you are most of the time, but it won’t know much more than your cellular provider, or any of the app vendors to whom you have given location permission on your phone.

ABI projects rapid uptake of 802.11ac

An interesting graphic from ABI research projects ongoing rapid growth in Wi-Fi in phones out to 2017, when the penetration will be approaching 100%. This doesn’t seem to be unrealistically fast to me, since the speed at which feature phones have been displaced by low-cost smartphones implies that by that time practically all phones will be smartphones, and since carriers are increasingly attracted to Wi-Fi for data offload from their cellular networks.

The graphic also shows rapid transition from 802.11n to 802.11ac starting next year.

Wi-Fi Penetration in Mobile Handsets by Protocol World Market, Forecast 2010-2017

ITExpo: Anatomy of Enterprise Mobility: Revolutionizing the Mobile Workforce

If you are going to ITExpo West 2012 in Austin, make sure you attend my panel on this topic at 10:00 am on Friday, October 5th.

The panelists are Brigitte Anschuetz of IBM, Akhil Behl of Cisco Systems, John Gonsalves of Symphony Teleca Corporation, Sam Liu of Partnerpedia and Bobby Mohanty of Vertical.

The pitch for the panel is:

Enterprise mobility is one of the fastest growing areas of business, allowing companies to virtually connect with customers and employees from anyplace in the world. CIOs are facing more decisions than ever when it comes to managing their mobile workforce. Employees expect to be able to do their work on multiple platforms, from desktops and laptops to tablets and smartphones.

This session will dive into the various components of an enterprise mobility solution, provide best practices to ensure they are successful and explain how they integrate together to enable companies to grow their business. Topics will include: mobile enterprise application platforms, enterprise app stores, mobile device management, expense management, and analytics.

ITExpo: BYOD – The New Mobile Enterprise

If you are going to ITExpo West 2012 in Austin, make sure you attend my panel on this topic at 1:30 pm on Wednesday, October 3rd.

The panelists are Jeanette Lee of Ruckus Wireless, Ed Wright of ShoreTel and John Cash of RIM.

The pitch for the panel is:

BYOD (Bring Your Own Device) has been in full swing for a couple of years now, and there’s no going back. Enterprises have adopted a policy of allowing users to use their own devices to access corporate networks and resources. With it comes the cost savings of not having to purchase as many mobile devices, and user satisfaction increases when they are able to choose their preferred devices and providers (and avoid having to carry multiple devices). But the benefits don’t come without challenges — the user experience must be preserved, security policies must accommodate these multiple devices and operating systems, and IT has to contend with managing applications and access across different platforms. This session looks at what businesses can do to mitigate risks and ensure performance while still giving your users the device freedom they demand.

ITExpo: Enterprise SBC and UC Security Essentials

If you are going to ITExpo West 2012 in Austin, make sure you attend my panel on this topic at 10:00 am on Wednesday, October 3rd.

The panelists are Scott Beer of Ingate Systems, Jeff Dworkin of Sangoma, Eric Hernaez of NetSapiens, Mykola Konrad of Sonus Networks, Jack Rynes of Avaya and John Nye of Genband.

The pitch for the panel is:

Supported by Session Border Controllers (SBCs) and Unified Communications (UC), enterprises can enable workers to essentially carry their desk phone extensions and features with them, wherever they are working on any given day – via VoIP clients and other UC applications on smartphones, tablets, and other mobile devices. With rich UC application features such as call transfer, conference calling, corporate directory listings, and presence, workers can collaborate and communicate in real time, increasing productivity by maintaining an always-on presence.

But wireless and Internet connected mobile devices present unique security challenges that differ dramatically from traditional communications and data security methods that rely on firewalls, user authentication, and encryption. Further, these mobile devices can expose sensitive network traffic, and proprietary or confidential data and communications, to multiple vulnerabilities.

Enterprises that have embraced SBCs, and other components of UC security, are proving they can securely protect and extend communications to external parties, unlocking new ways of collaborating with clients, partners, distributed employees and the supply chain. This session will consider the Enterprise SBC as a means of satisfying security and privacy requirements, with signaling and traffic encryption, media and signaling forking, network demarcation, and threat detection and mitigation, enabling enterprises to capture the cost benefits of VoIP and UC, while maintaining essential security postures and access to multi-mobile communications across the network, anytime, anywhere.

802.11ac and 802.11ad. Why both?

The current generation of Wi-Fi, 802.11n, is soon to be superseded by not one, but two new flavors of Wi-Fi: 802.11ac and 802.11ad (also known as WiGig).

802.11n claims a maximum raw data rate of 600 megabits per second. 802.11ac and WiGig each claims over ten times this – about 7 gigabits per second. So why do we need them both? The answer is that they have different strengths and weaknesses; to some extent they can fill in for each other, and while they are useful for some of the same things, each has use cases for which it is superior.

The primary difference between them is the spectrum they use. 802.11ac uses 5 GHz, WiGig uses 60 GHz. There are four main consequences of this difference in spectrum:

  • Available bandwidth
  • Propagation distance
  • Crowding
  • Antenna size

Available bandwidth:

There are no free lunches in physics. The maximum bit-rate of a wireless channel is limited by its bandwidth (i.e. the amount of spectrum allocated to it – 40 MHz in the case of old-style 802.11n Wi-Fi), and the signal-to-noise ratio. This limit is given by the Shannon-Hartley Theorem. So if you want to increase the speed of Wi-Fi you either have to allocate more spectrum, or improve the signal-to-noise ratio.
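The Shannon-Hartley limit is simple to compute. A quick illustration – the 25 dB SNR figure is an arbitrary assumption of mine, chosen only to show the shape of the trade-off:

```python
from math import log2

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley theorem: C = B * log2(1 + S/N)."""
    return bandwidth_hz * log2(1 + snr_linear)

# A 40 MHz 802.11n channel at an assumed SNR of 25 dB (~316:1 linear):
snr = 10 ** (25 / 10)
print(shannon_capacity_bps(40e6, snr) / 1e6, "Mbit/s theoretical ceiling")

# Doubling the bandwidth doubles the ceiling, but doubling the SNR
# only adds one more bit per symbol -- hence the appetite for spectrum.
```

This is why the two routes to faster Wi-Fi are exactly the ones described above: more spectrum (linear payoff) or better signal-to-noise ratio (only logarithmic payoff).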

The amount of spectrum available to Wi-Fi is regulated by the FCC. At 5 GHz, Wi-Fi can use 0.55 GHz of spectrum. At 60 GHz, Wi-Fi can use 7 GHz of spectrum – over ten times as much. 802.11ac divides its spectrum into five 80 MHz channels, which can be optionally bonded into two and a half 160 MHz channels, as shown in this FCC graphic:

Channels in 5 GHz

802.11ad has it much easier:

“Worldwide, the 60 GHz band has much more spectrum available than the 2.4 GHz and 5 GHz bands – typically 7 GHz of spectrum, compared with 83.5 MHz in the 2.4 GHz band.
This spectrum is divided into multiple channels, as in the 2.4 GHz and 5 GHz bands. Because the 60 GHz band has much more spectrum available, the channels are much wider, enabling multi-gigabit data rates. The WiGig specification defines four channels, each 2.16 GHz wide – 50 times wider than the channels available in 802.11n.”

So for the maximum claimed data rates, 802.11ac uses channels 160 MHz wide, while WiGig uses 2,160 MHz per channel. That’s almost fourteen times as much, which makes life a lot easier for the engineers.

Propagation distance:

60 GHz radio waves are absorbed by oxygen, but for LAN-scale distances this is not a significant factor. On the other hand, wood, bricks and particularly paint are far more opaque to 60 GHz waves:

Attenuation of various materials by frequency

Consequently WiGig is most suitable for in-room applications. The usual example for this is streaming high-def movies from your phone to your TV, but for their leading use case WiGig proponents have selected an even shorter range application: wireless docking for laptops.


Crowding:

5 GHz spectrum is also used by weather radar (Terminal Doppler Weather Radar or TDWR), and the FCC recently ruled that part of the 5 GHz spectrum is completely prohibited to Wi-Fi. These channels are marked “X” in the graphic above. All the channels in the graphic designated “U-NII 2” and “U-NII 3” are also subject to a requirement called “Dynamic Frequency Selection” or DFS, which says that if radar activity is detected on a channel, the Wi-Fi device must stop using it.

5 GHz spectrum is not nearly as crowded as the 2.4 GHz spectrum used by most Wi-Fi, but it’s still crowded and cramped compared to the wide open vistas of 60 GHz. Even better, the poor propagation of 60 GHz waves means that even nearby transmitters are unlikely to cause interference. And with beam-forming (discussed in the next section), even transmitters in the same room cause less interference. So WiGig wins on crowding in multiple ways.

Antenna size:

Antenna size and spacing is proportional to the wavelength of the spectrum being used. This means that 5 GHz antennas would be about an inch long and spaced about an inch apart. At 60 GHz, the antenna size and spacing would be about a tenth of this! So for handsets, multiple antennas are trivial to do at 60 GHz, more challenging at 5 GHz.
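The "inch versus a tenth of an inch" claim falls straight out of the wavelength formula λ = c/f, taking a half-wave element as a rough proxy for antenna size:

```python
C = 299_792_458  # speed of light in m/s

def half_wavelength_mm(freq_hz):
    """Rough antenna element size: half the free-space wavelength."""
    return (C / freq_hz) / 2 * 1000

print(half_wavelength_mm(5e9))    # ~30 mm at 5 GHz: about an inch
print(half_wavelength_mm(60e9))   # ~2.5 mm at 60 GHz: roughly a tenth of that
```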

What’s so great about having multiple antennas? I mentioned earlier that there are no free lunches in physics, and that maximum bit-rate depends on channel bandwidth and signal to noise ratio. That’s how it used to be. Then in the mid-1990s engineers discovered (invented?) an incredible, wonderful free lunch: MIMO. Two adjacent antennas transmitting different signals on the same frequency normally interfere with each other. But the MIMO discovery was that if in this situation you also have two antennas on the receiver, and if there are enough things in the vicinity for the radio waves to bounce off (like walls, floors, ceilings, furniture and so on), then you can take the jumbled-up signals at the receiver, and with enough mathematical computer horsepower you can disentangle the signals completely, as if you had sent them on two different channels. And there’s no need to stop at two antennas. With four antennas on the transmitter and four on the receiver, you get four times the throughput. With eight, eight times. Multiple antennas receiving, multiple antennas sending. Multiple In, Multiple Out: MIMO. This kind of MIMO is called Spatial Multiplexing, and it is used in 802.11n and 802.11ac.
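The "disentangling" step can be sketched in a few lines. In this toy model – an idealized, noise-free 2×2 channel whose coefficients I have invented for illustration – the receiver recovers both transmitted streams by inverting the channel matrix it has estimated from training symbols:

```python
def solve_2x2(H, y):
    """Recover x from y = H*x by inverting a 2x2 channel matrix."""
    (a, b), (c, d) = H
    det = a * d - b * c
    return [( d * y[0] - b * y[1]) / det,
            (-c * y[0] + a * y[1]) / det]

# Two different streams sent on the same frequency from two antennas...
x = [1.0, -1.0]

# ...mixed together by the multipath environment (coefficients invented):
H = [[0.9, 0.4], [0.3, 1.1]]
y = [H[0][0] * x[0] + H[0][1] * x[1],
     H[1][0] * x[0] + H[1][1] * x[1]]   # the jumbled signals at the two receive antennas

x_hat = solve_2x2(H, y)   # "mathematical horsepower": undo the mixing
print(x_hat)              # recovers [1, -1] (up to rounding)
```

Real MIMO receivers face noise and imperfect channel estimates, so they use more robust estimators than a bare matrix inverse, but the principle is the same: enough antennas and enough computation turn one jumbled channel into several independent ones.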

Another way multiple antennas can be used is “beam-forming.” This is where the same signal is sent from each antenna in an array, but at slightly different times. This causes interference patterns between the signals, which (with a lot of computation) can be arranged in such a way that the signals reinforce each other in a particular direction. This is great for WiGig, because it can easily have as many as 16 of its tiny antennas in a single array, even in a phone, and that many antennas can focus the beam tightly. Even better, shorter wavelengths tend to stay focused. So most of the transmission power can be aimed directly at the receiving device. So for a given power budget the signal can travel a lot further, or for a given distance the transmission power can be greatly reduced.
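The focusing effect of a 16-element array can be seen with a little arithmetic: sum the per-antenna phase offsets in each direction and the signals reinforce on-axis and cancel off-axis. This is an idealized sketch (uniform linear array, half-wavelength spacing, no real RF detail), with function and parameter names of my own invention:

```python
import cmath
import math

def array_gain_db(n_elements, steer_deg, look_deg):
    """Idealized uniform linear array with half-wavelength spacing.
    Each element is phase-shifted so the signals add up toward steer_deg;
    returns the relative gain seen from direction look_deg."""
    total = 0
    for i in range(n_elements):
        # math.pi = (2*pi/wavelength) * (wavelength/2 element spacing)
        phase = math.pi * i * (math.sin(math.radians(look_deg)) -
                               math.sin(math.radians(steer_deg)))
        total += cmath.exp(1j * phase)
    return 20 * math.log10(abs(total) / n_elements + 1e-12)

# A 16-element array steered at 0 degrees:
print(array_gain_db(16, 0, 0))    # ~0 dB: full reinforcement on-axis
print(array_gain_db(16, 0, 30))   # deep null: tens of dB down off-axis
```

More elements make the main lobe narrower, which is exactly why 16 tiny 60 GHz antennas can aim most of the power at one receiver.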

How does 802.11ac get to 6.9 Gigabits per second?

You know from a previous post how 802.11n gets to 600 megabits per second. 802.11ac does just three things to increase that by 1,056%:

  1. It adds a new Modulation and Coding Scheme (MCS) called 256-QAM. This increases the number of bits transmitted per symbol from 6 to 8, a factor of 1.33.
  2. It increases the maximum channel width from 40 MHz to 160 MHz (160 MHz is optional, but 80 MHz support is mandatory.) This increases the number of subcarriers from 108 to 468, a factor of 4.33.
  3. It increases the maximum MIMO configuration from 4×4 to 8×8, increasing the number of spatial streams by a factor of 2. Multi-User MIMO (MU-MIMO) with beamforming means that these spatial streams can be directed to particular clients, so while the AP may have 8 antennas, the clients can have fewer – for example, 8 clients each with one antenna.

Put those factors together and you have 1.33 x 4.33 x 2 = 11.56. Multiply the 600 megabits per second of 802.11n by that factor and you get 600 x 11.56 = 6,933 megabits per second for 802.11ac.
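The multiplication above can be checked in a couple of lines:

```python
# The three 802.11ac speed-ups, expressed as multipliers over 802.11n.
bits_per_symbol = 8 / 6      # 256-QAM vs 64-QAM: 1.33x
subcarriers = 468 / 108      # 160 MHz vs 40 MHz channels: 4.33x
spatial_streams = 8 / 4      # 8x8 MIMO vs 4x4: 2x

factor = bits_per_symbol * subcarriers * spatial_streams
print(round(factor, 2))      # 11.56
print(round(600 * factor))   # 6933 megabits per second
```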

Note that nobody does this yet, and 160 MHz channels and 8×8 MIMO are likely to remain unimplemented for a long time. For example, Broadcom’s recently announced BCM4360 and Qualcomm’s QCA9860 do 80 MHz channels, not 160 MHz, and 3×3 MIMO, so they claim maximum raw bit-rates of 1.3 gigabits per second. Which is still impressive.

Maximum theoretical raw bit-rate is a fun number to talk about, but of course in the real world that will (almost) never happen. What’s more important is the useful throughput (raw bit-rate minus MAC overhead) and rate at range, the throughput you are likely to get at useful distances. This is very difficult, and it is where the manufacturers can differentiate with superior technology. For phone chips power efficiency is also an important differentiator.