Three short stories on today’s Mobile Networks Performance

Ensuring network quality for an optimal end-user experience is often a challenging task for mobile network operators. While the carriers’ engineers tune the systems for the most efficient usage given the required load, they may be degrading the quality of subscribers’ service under particular conditions, depending on the applications being used, the coverage and access technologies available in given locations, or even the not-always-optimal policies used for access technology selection.

Evolved QoE – Application Performance who?

Nowadays, delivering quality services to mobile subscribers has evolved beyond traditional network availability and quality. Today’s users demand sufficient performance for each type of application they use, leading to profile-based modelling of the traffic and increasing the complexity of the Quality of Experience (QoE) evaluation for the carriers. Evaluating QoE is hard for operators, as published by the GSA and Ericsson this month (here): “A 2012 study from the University of Massachusetts Amherst and Akamai Technologies found that internet users start abandoning attempts to view online videos if they do not load properly within two seconds. As time goes on, the rate at which viewers give up on a given video increases”, and “with the rise of mobile-broadband and smartphone usage over the past few years, the meaning of user experience has changed dramatically”.


What used to be measured with coverage and bandwidth capacity now extends to performance per application and end-user experience, involving signal coverage maps, latency analysis, QoS, security features, and loading speed for web pages, online multimedia content (e.g. HD video) and apps, among others. As explained and exemplified in a recent Ericsson whitepaper on network performance (here): “Network performance cannot be generalized because the only true measurement of performance is the experience of those who use it.”, and “App coverage is one way we describe this performance. It refers to the proportion of a network’s coverage that has sufficient ability to run a particular app at an acceptable quality level. For example, an area that has 95 percent coverage for voice calls, may have 70 percent coverage for streaming music and only 20 percent coverage for streaming HD video. A consumer’s usage patterns will determine their preferred type of coverage”
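The app coverage idea above can be illustrated with a toy calculation: given throughput samples measured across an area, the coverage for an app is simply the fraction of locations meeting that app’s requirement. The sample values and per-app thresholds below are hypothetical, for illustration only:

```python
# Toy illustration of "app coverage": the fraction of measured locations
# whose downlink throughput meets a given app's requirement.
# Thresholds and samples are hypothetical, not from any real network.

APP_REQUIREMENTS_MBPS = {
    "voice": 0.1,
    "music streaming": 0.5,
    "HD video": 5.0,
}

def app_coverage(samples_mbps, required_mbps):
    """Share of location samples where throughput meets the app's requirement."""
    meeting = sum(1 for s in samples_mbps if s >= required_mbps)
    return meeting / len(samples_mbps)

# Hypothetical drive-test throughput samples (Mbps) across an area.
samples = [0.2, 0.8, 1.5, 3.0, 6.0, 0.05, 12.0, 4.0, 0.6, 9.0]

for app, req in APP_REQUIREMENTS_MBPS.items():
    print(f"{app}: {app_coverage(samples, req):.0%} coverage")
```

The same area yields very different coverage figures per app, which is exactly the point of the Ericsson example: one network, three very different answers to “am I covered?”.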


Indoor small cells – Please mind the gap between the macro and small cells platforms

Evolved small cells for indoor installations are arriving to fill the coverage gap between the macro networks (i.e. 4G/LTE, 3G, 2G, etc.) and small cell technologies (i.e. pico and femto cells, etc.). Ericsson recently announced a new solution called the Radio Dot System (here), which according to them is “The most cost-effective, no-compromise solution to indoor coverage challenges”. It is well known that operators struggle to cover indoor areas and buildings in a cost-effective manner, even though more than 70% of the traffic is generated in this domain. The solution is ultra-small, light and scalable, with fast deployment, and relies on an Ethernet connection to integrate with the existing mobile network.

Although Ericsson’s solution should not be available before next year, we can expect to see similar solutions on the market in the near future. This trend would potentially take over part of the usage currently carried on WiFi technologies, preferred by most users for indoor communications.


Smart access network selection – The seamless cellular and WiFi access marriage

A recent report from 4G Americas (here) analyses the role of WiFi technology in current mobile data services, and the methods for overcoming the challenges that arise from the integration and mobility between cellular technologies and WiFi. As stated by them, “with smartphone adoption continuing to rise and the increasing prevalence of bandwidth-intensive services such as streaming video, the limited licensed spectrum resources of existing cellular networks are as constrained as ever. Wi-Fi, and its associated unlicensed spectrum, presents an attractive option for mobile operators – but improved Wi-Fi/cellular network interworking is needed for carriers to make optimal use of Wi-Fi.”


The so-called interworking between traditional mobile access technologies and WiFi networks must be seamless and transparent to the end users. Service continuity must be assured when a subscriber moves, for example, from 4G/LTE coverage to WiFi-covered zones and back, using methods such as an automatic offload policy. Different methods are currently used for this interworking, such as session continuity, client-based mobility, or network-based mobility. One of the most popular and accepted, also standardized by the 3GPP, is the network-based Access Network Discovery and Selection Function (ANDSF), which is already supported by most WiFi devices and network elements, including policy managers and specific network gateways. Other innovations have been made available to address the seamless interworking issues, in standards like Hotspot 2.0 and seamless SIM-based authentication.


As I commented in my previous post “The top 10 fast facts you should know about LTE today”, 5G will be a combination of access technologies jointly fulfilling the requirements of the future. In these scenarios, seamless network selection and mobility become even more important, beyond the classical offload scenarios, and 4G Americas and vendors like Ericsson point out some particular issues:

  • Premature WiFi selection (shifting access technology while the coverage is still too weak due to distance);
  • Unhealthy choices (offloading traffic to systems that are already overloaded);
  • Lower capabilities (offloading to alternative technologies with lower-performing networks); and
  • Ping-pong effects (frequent access technology shifting due to mobility, affecting the QoE).
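The premature-selection and ping-pong issues can be mitigated with hysteresis in the selection policy. The sketch below is a minimal client-side illustration; the RSSI thresholds are hypothetical, and in a real deployment such policies would typically be provisioned by the network (e.g. via ANDSF) rather than hard-coded:

```python
# Minimal sketch of an access-selection policy with hysteresis.
# Thresholds are hypothetical, chosen only to illustrate the idea.

WIFI_JOIN_RSSI_DBM = -65.0    # don't offload to Wi-Fi weaker than this
WIFI_LEAVE_RSSI_DBM = -75.0   # don't leave Wi-Fi until it drops below this

class AccessSelector:
    def __init__(self):
        self.current = "cellular"

    def update(self, wifi_rssi_dbm):
        """Pick the access network for the next interval. Because the join
        threshold is higher than the leave threshold, small signal
        fluctuations around a single cutoff cannot cause ping-pong."""
        if self.current == "cellular" and wifi_rssi_dbm >= WIFI_JOIN_RSSI_DBM:
            self.current = "wifi"
        elif self.current == "wifi" and wifi_rssi_dbm < WIFI_LEAVE_RSSI_DBM:
            self.current = "cellular"
        return self.current

selector = AccessSelector()
for rssi in [-80, -70, -64, -70, -74, -76, -60]:
    print(rssi, "->", selector.update(rssi))
```

Note how the device stays on cellular at -70 dBm (avoiding a premature switch) but remains on Wi-Fi at the same -70 dBm once joined, only falling back when the signal drops below the lower threshold.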

A. Rodriguez

Wi-Fi services re-invented

Since the first Wi-Fi networks appeared around the year 2000, the technology has been evolving according to the Wi-Fi Alliance and the IEEE 802.11 family of standards. In recent years the most common Wi-Fi standards have been 802.11b/g and, more recently, 802.11n, operating in the 2.4 and 5 GHz bands and allowing data rates of up to 600 Mbps. In the near future, the 802.11ac standard should also become common in Wi-Fi-capable devices, operating in the 5 GHz band and allowing data rates over 1 Gbps… and so on; the evolution of Wi-Fi technology and devices is most likely to continue.

Nevertheless, despite this evolution, the Wi-Fi services offered by the network operators over the last decade remained pretty much the same as those offered since day one. Wi-Fi was typically seen as a technology that allowed subscribers to connect from their homes and/or offices, without the operator being able to monetize the usage beyond a flat rate, or to provide any interesting services for subscribers outside those places. This was mainly because of the lack of control over subscribers’ traffic in their homes and offices, due to the underlying access technology used (e.g. ADSL, cable, fibre, etc.), and because offering Wi-Fi in public areas was neither profitable nor feasible in business and technical terms.

Luckily this situation has been changing in recent years, and let me say it was about time. Network operators and service providers are realizing the potential of Wi-Fi, and the impact it has on subscribers’ lives. The fact that more than 70% of total data traffic is currently carried over Wi-Fi networks is a solid reason to support it. Hotels, stores and coffee shops were the first to spot this opportunity, giving their customers free Wi-Fi access as a way to attract consumers, among other strategies. The operators also started to get creative with their Wi-Fi offers, making sure solid business opportunities exist to support the investment in Wi-Fi access points in public places. The classic example is the open zones, such as those from BT in the UK, covering high-density usage zones, subway stations, etc. with free Wi-Fi for customers who already have a contracted service with BT, as a way to reward loyalty, reduce churn, or however you prefer to call it.

Recently, a more creative and innovative strategy has been adopted by operators like O2, also in the UK, allowing any user (from any carrier) to use their Wi-Fi network, composed of hundreds of access points deployed in the main districts of London. In this case their business is directly with the stores and companies in those districts, to which O2 offers a chargeable service to place access points near their store locations, attracting customers for them and providing detailed statistics of the usage made by these subscribers on the Wi-Fi network, enabling targeted commercial campaigns, etc. This service is also improving O2’s brand image and recruiting churners from other companies… and, I would say, still rewarding loyalty (here). Another example is Google, which proposes a different approach in its recent agreement with Starbucks. In this case they are taking over the access previously provided by AT&T in more than 7,000 branches and replacing it with a much faster Wi-Fi service by increasing the backhaul capacity tenfold, rewarding people for consuming at Starbucks while improving Google’s image, not to mention the obvious economic benefits of the agreement (here).

The Wi-Fi opportunity is not only about increasing revenue from a service provider’s or operator’s point of view; it can also be used as a way to encourage development. A recent example is South Korea, where the government has announced it will expand the current free Wi-Fi network coverage to provide nationwide service within the next four years. A Ministry of Science, ICT and Future Planning official said, “Expansion of Wi-Fi areas will ease financial burdens on consumers and help narrow the information gap between people in Seoul and other cities” (here). Examples like this, and the deployment seen during the Olympics in London last year, show how technology, and particularly Wi-Fi, is becoming a necessity for users in today’s connected world.

The European race for Wireless Spectrum

As the mobile telecommunications market gets more crowded every year, and the technologies for delivering mobile cellular services evolve, wireless spectrum has become one of the most precious assets for the telecom carriers. Government agencies are making efforts to ensure fair competition and a fair split of the wireless spectrum within each country and region, via public spectrum auctions and regulation mechanisms.


When an operator plans a new technology deployment, as in the most recent case of 4G/LTE, the process must start by securing the spectrum to operate in. That is why it is so important to invest in the spectrum auctions on time. In Europe, we have seen cases where an operator gains a huge competitive advantage by having wireless spectrum granted earlier than the competition, as in the case of EE in the UK (here). UK regulator Ofcom approved the use of part of the 1800 MHz band for LTE in August 2012, ahead of the proper auction held later, in February 2013, allowing EE to claim to be the first 4G operator in the UK and to win a significant number of churning subscribers. We have also seen cases where a European operator is left behind in the LTE offer for not being able to use the spectrum it was granted, as in the case of Telefonica Movistar in Spain (here). In July 2013, Telefonica was forced into a network infrastructure sharing agreement with Yoigo to allow a late 4G rollout in the near future, while its granted 800 MHz band is freed from Terrestrial Digital Television (TDT) use and it builds its own LTE infrastructure.

The issue of the 800 MHz spectrum is quite important in Europe these days. As we know, the bands available for LTE are in the ranges of 800, 1800 and 2600 MHz, but the lower the frequency, the easier and cheaper it is to cover wide geographical areas, making the 800 MHz band strategically important. For example, most operators plan or have rolled out 4G/LTE coverage for large cities using the higher 1800 and 2600 MHz bands, and use the 800 MHz band to ensure the rest of each country’s territory is fully covered. That is mainly why the European Commission decided that every country in the European Union should have the 800 MHz band liberated by January 2013; as stated by the Commission, “Opening up the 800 MHz band is an essential for expanding use of popular wireless broadband services”. However, recent statements from the European Commission indicate that 17 out of the 28 European Union member states have not been able to meet the January 2013 deadline (here), with some of them asking for postponements or derogations due to exceptional reasons, such as having the spectrum occupied by previously agreed uses for TV, etc. So far, the only EU countries with the 800 MHz band liberated from other uses, and able to offer LTE services on it, are Denmark, France, Germany, Italy, the Netherlands, Portugal, Sweden, the UK, Luxembourg, Croatia, and Ireland. A full table of the operators in Europe and the band used for 4G/LTE is shown below.

(List of LTE deployments per operator and country)
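The strategic value of the lower band can be illustrated with a back-of-the-envelope path-loss calculation. Real cellular propagation models (e.g. Okumura-Hata) are more elaborate, but the free-space formula already shows the frequency dependence:

```python
import math

# Free-space path loss, with distance in km and frequency in MHz:
#   FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# At the same distance, the loss difference between two bands depends
# only on frequency: 20*log10(2600/800), about 10 dB in favour of 800 MHz.
delta = fspl_db(10, 2600) - fspl_db(10, 800)
print(f"Extra path loss at 2600 MHz vs 800 MHz: {delta:.1f} dB")

# Equivalently, for the same loss budget an 800 MHz cell reaches
# 2600/800 = 3.25x the radius, i.e. roughly 10x the area per site.
```

Roughly ten times the area per site in this simplified model is why 800 MHz is the band of choice for covering rural territory, and why its liberation from TV use matters so much to the operators.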

Apart from that, the European Commission also highlighted the poor LTE coverage in Europe compared with the USA: “Three out of every four people living in the EU can’t access 4G/LTE mobile connections in their hometowns, and virtually no rural area has 4G. In the United States over 90% of people have 4G access” (here). This was said in response to the early victory claims of some operators with advanced 4G roll-outs, which according to the Commission are still far from covering all the territory as expected. It is simply true that in recent years the USA has taken giant steps forward in mobile communications, while Europe struggles to catch up.

Looking beyond the 4G/LTE spectrum issues, small cells are one way to relieve the spectrum shortage in the macro networks. A recent study (here), also from the European Commission, reveals that “71% of all EU wireless data traffic in 2012 was delivered to smartphones and tablets using Wi-Fi, possibly rising to 78% by 2016”. The results, far from shocking, are entirely expected, considering the low cost that small cell technologies, and Wi-Fi in particular, represent both to the end user and to the operators delivering the service. The recommendations made by the Commission are at least encouraging: “The study recommends:

  • to make spectrum from 5150 MHz to 5925 MHz available globally for Wi-Fi;
  • to continue making the 2.6 GHz and the 3.5 GHz bands fully available for mobile use and to consult on future licensing options for 3.5 GHz and other potential new licensed mobile frequency bands; and
  • to reduce the administrative burden on the deployment of off-load services and networks in public locations.”

As a matter of fact, the future of telecoms is most likely a combination of macro networks and small cells, with the transitions between them handled by seamless offload functionalities that are available now and being evolved every day by the incumbent vendors.