Every ten years or so, something big happens in mobile: a new generation of network technology comes along. The first mobile networks appeared in the 1980s, GSM followed in the 1990s, 3G arrived at the turn of the century, and LTE began rolling out in 2010.
Each generation has set out to fix the flaws of its predecessor: GSM fixed the security weaknesses of analogue telephony, 3G was meant to sort out GSM’s lack of mobile data and, since it never quite succeeded, 4G was needed to finally make consuming data less of an unpleasant experience.
Now, with the turn of a new decade approaching, 5G is emerging as the next big change to hit mobile. But what’s the problem that 5G’s meant to fix?
Here’s the thing: no one’s too sure about 5G, not really, not yet. The main gripes that people have with their mobile service today are coverage and price – neither of which is a problem that needs a new generation of mobile tech to solve. Throw a bit of cash into building out LTE and LTE-A and many of those headaches would go away. Yet the industry is ploughing full steam ahead into 5G, hoping it will solve not the problems we have today, but those that could hold us back years in the future.
The process of building each new mobile standard begins years before it’s put into use, and once up and running, those standards will remain in place in various forms for a decade or more. With 5G, we’re having to build a standard that will still be in use in 2030 and beyond – and the mobile industry has a terrible track record when it comes to future-gazing.
Back at the start of 2000, with 3G just about to launch, who could have predicted how the mobile world would look in 2010? At the turn of this century, we all packed candy bar feature phones; now most of us have feature-packed smartphones.
Figuring out what uses 5G will be put to is the equivalent of trying to predict the rise of the iPhone five years before it launched. No one foresaw its arrival, how the market would change in response to it, or how we’d end up where we are now. We’re facing the same situation again: trying to imagine how the mobile world will look 10 years from now and design a standard to fit it.
If history is any guide, we’re going to fail spectacularly again. That doesn’t mean that the industry isn’t going to try.
Setting a standard
“5G” is something of a misnomer: the standard doesn’t exist yet. It will be months, likely years, before it’s finally defined. In the meantime, organisations, governments, and academics are working on the technologies that will form the standard, but today, 5G is purely a concept, and one that needs to go from vapourware to real-world rollout in the next six years.
According to prevailing wisdom, the first networks built on the standard will be rolled out in 2020. In order to meet that deadline, most of the hard yards in the 5G standardisation process will have to be completed over the next two to three years, with standards bodies including the 3GPP, ITU, and IEEE, as well as universities, public bodies, and special interest groups, all having their input.
The idea that the 5G standard is already a fait accompli may come from the industry’s fondness for trumpeting new partnerships and headline-grabbing tests of all stripes as “5G” breakthroughs, without acknowledging the standard has yet to be finalised.
Nonetheless, all those experiments and partnerships are laying the foundations for the standard ahead, and funding is beginning to roll in around the world for 5G technology projects.
The EU and South Korea, for example, have signed a deal to work together on 5G development, promising €700m and $1.5bn respectively in funding for local 5G projects. In the UK, £70m of funding is going to build a 5G research facility, known as the 5G Innovation Centre (5GIC).
The 5GIC, based in the market town of Guildford, is set to open its doors in April of next year. Once it’s up and running, the centre will provide a working testbed for early 5G technology projects. The idea is to provide a heterogeneous environment of 41 access points — small cells, macro cells, indoors and outdoors, and so on — for companies to come and trial their products on. Guinea pig users will be supplied by students and staff from the University of Surrey, of which the 5GIC is a part.
The US has its own test bed too, courtesy of NYU Wireless, part of the Polytechnic Institute of New York University. There, researchers are gathering data from the Big Apple using prototype base stations and mobile units that they hope will help in the development of 5G channel models.
As early networks and testbeds show, the momentum around 5G is building. The standard may be unwritten, but the industry has a fair idea what it must deliver.
So far, three main criteria for the 5G standard have been established:
1. It should be capable of delivering a 1Gbps downlink to start with and multi-gigabits in future
2. Latency must be brought under one millisecond
3. It should be more energy efficient than its predecessors (though there’s no agreement yet on just how much more)
Despite never managing to successfully predict what each forthcoming generation of mobile technology should deliver in order to satisfy future users, the industry has nonetheless reached some consensus on the use cases for 5G. Machine-to-machine communications is one: 5G should enable the internet of things, the future where all our online-enabled objects will quietly pass on data to our tech overlord of choice. Facilitating the use of mobile networks by connected or autonomous cars, remotely controlled industrial robots, telehealth systems, and smart city infrastructure is also expected to figure large in 5G thinking.
There are more familiar experiences, too, that are often cited as upcoming uses for 5G — the ability to download 4K or 8K video at speed, for example — and occasionally those that are more forward-looking. Tactile web, anyone?
Despite this emerging understanding of what 5G should look like, there’s much still up for debate around the standard, including which technology should form part of it.
Won’t somebody think of the spectrum?
Every new mobile standard brings with it calls from operators for more spectrum. 5G is no exception. If mobile operators want to deliver more and more capacity, they’re going to need more and more wireless spectrum to do it.
And, with every generation of mobile tech, governments around the world must identify what spectrum those operators will need, whether anyone’s using those bands and how to move them off them if so, then find the best way to sell that spectrum at the right price, and finally make sure that all the operators are meeting the obligations that buying the spectrum imposed on them. The history of the wireless industry is littered with tales of fouled-up spectrum auction procedures, delays to network rollouts, mud-slinging between mobile companies, obligations not met and clean up procedures not followed. It’s a dirty, expensive business.
5G probably won’t diverge from the age-old pattern, but it does come with one added hassle: we just don’t have enough spectrum to go around any longer, according to wireless analysts. Roaming in particular could be problematic.
“Spectrum is and will remain a major challenge for the success and early rollout of 5G. We don’t have enough spectrum in general and 5G is a lot about optimising the use of spectrum. But clearly, allocating more spectrum to 4G and later 5G would help and this is a global challenge… An additional challenge will be to find a globally harmonised band for 5G roaming since all suitable spectrum is already in use in one or another part of the world,” said Thibaut Kleiner, head of the European Commission’s CONNECT (Communications Networks, Content, and Technology) Directorate-General.
One solution to the spectrum crunch could be to look beyond the lower-frequency spectrum — between 700MHz and 2.6GHz — used by most carriers today, and move towards higher spectrum bands such as 6GHz, 28GHz, and 38GHz.
At the top end, beyond 30GHz, these extremely high frequency bands are known as millimetre wave. Bringing those bands into use is both one of the most exciting, and least guaranteed, areas of 5G development.
Asha Keddy, general manager of standards and advanced technology in Intel’s mobile communication group, describes the difference between microwaves and millimetre waves by comparing them to light: “Let’s say I’m driving and when I’m pulling up late at night, I have floodlights on to light up the entire garage. When I have floodlights in the garage, I can see everything, because the properties are about range. When you [use] millimetre wave, depending on the frequency and how high you go, it’s more like you have pinpointed beams.”
The analogy holds true on two fronts: traditional spectrum allows you to transmit data over a longer distance but at lower capacity, while millimetre wave offers greater bandwidth but shorter reach. Millimetre wave also brings with it the possibility of beamforming — rather than broadcasting or receiving signals in all directions, they’re sent directly where they need to go, be that a handset, a router, or a base station.
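That trade-off falls straight out of the free-space path-loss formula, where loss grows with the square of frequency. Here’s a minimal Python sketch (the 200m distance and the bands chosen are illustrative values, not from any operator’s plans):

```python
import math

C = 3e8  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB, per the Friis formula."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# Compare a typical LTE band with two millimetre-wave candidates over 200 m
for f_ghz in (2.6, 28, 38):
    print(f"{f_ghz:>5.1f} GHz: {fspl_db(200, f_ghz * 1e9):.1f} dB")
```

Each tenfold rise in frequency adds 20dB of loss over the same distance – roughly the gap that beamforming’s array gain is meant to claw back.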
“Today, we use pretty much omnidirectional antennas in base stations and mobiles in the cellular and Wi-Fi worlds, so there’s not a lot of directionality and beamforming being done. LTE in its future states will have some limited amount, but when you go up to millimetre wave, you have the ability to put dozens or hundreds of antennas in a very small space, and that allows you to provide very narrow energy beams,” said Ted Rappaport, a professor at New York University and director of its NYU Wireless unit.
“This changes the dynamics of wireless systems completely. It takes them from today’s interference limited environment, where interference from other mobiles radiates everywhere, like everyone yelling on the corner of a street, and now makes radio energy very focused like megaphones, with everyone talking with a megaphone to only where they want,” Rappaport said.
Millimetre wave isn’t a new discovery. Research into these frequencies has been going on since the 1980s, but work on the 60GHz band found a good deal of attenuation — a weakening of the signal over distance — due to oxygen in the air. As a result, its potential for wireless communications was largely ignored at the time, which is partly why it remains unlicensed today.
So why, thirty years later, is millimetre wave back on the table? Subsequent research found that certain frequencies don’t suffer the attenuation problem that put the mobile industry off.
According to Rappaport, in some frequencies, the problem falls away. “If I go up to 28, 38 or even 70GHz, I no longer have that excess oxygen absorption seen at 60GHz, and the radio channel behaves remarkably similarly to today’s cellular and wi-fi frequencies with no additional atmospheric loss; in fact, most of the spectrum has very, very little attenuation compared to today’s free space absorption, meaning mm waves really don’t suffer from additional attenuation other than rain attenuation and standard free space propagation.”
That was one blocker out of the way for millimetre wave’s use in 5G, but others remained. Until recently, one of the greatest perceived challenges to using millimetre wave was how it would function outdoors.
One criticism levelled at millimetre wave is its attenuation in rain: that is, how the signal can suffer if there’s a downpour. It’s true that millimetre wave will attenuate in rain, a phenomenon that has caused disruption in the financial services industry in the past, where millimetre wave bands are used to carry high-frequency trading data between financial services’ datacentres. According to a paper by NYU academics, the attenuation is in the region of 1.4dB over 200m — the average distance between one cell and the next. In other words, yes, there is attenuation, but it’s not really at a high enough level to worry about. Good news for the mobile users of London, then.
“Work by many researchers has confirmed that for small distances (less than 1 km), rain attenuation will present a minimal effect on the propagation of mm-waves at 28GHz to 38GHz for small cells,” the paper says. Directional antennas will also help make up for some of the loss too.
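The paper’s 1.4dB figure is simply a per-kilometre rain rate scaled down to small-cell distances. A quick back-of-the-envelope check in Python (the 7dB/km specific attenuation is an illustrative value for heavy rain near 28GHz, not a figure from the paper):

```python
def rain_loss_db(specific_atten_db_per_km: float, distance_m: float) -> float:
    """Rain attenuation over a link, given a specific attenuation in dB/km."""
    return specific_atten_db_per_km * distance_m / 1000.0

print(rain_loss_db(7.0, 200))   # 1.4 dB over a 200 m small cell
print(rain_loss_db(7.0, 3000))  # 21.0 dB over a 3 km trading link
```

Over small-cell distances the loss is a rounding error next to 100dB-plus of free-space loss; over the multi-kilometre hops used for trading links, the same downpour costs tens of dB, which is why the financial industry noticed.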
Surfing the (millimetre) technology wave
While the attenuation argument appears to be largely over, millimetre wave’s place in the upcoming 5G standard is far from certain.
Because of its relative novelty in mobile, a whole bunch of research is going on into the basics of using millimetre wave, including the channel model, its propagation, what millimetre wave antennas might look like, what impact they might have on handset design, and even what effect they might have on the human body. And, as with any new technology, there’s an ecosystem that needs to build up around it, too: someone has to start manufacturing all the network kit and handsets to cope with it.
All of that means that we’re somewhere between five and ten years from being able to use millimetre wave commercially, according to Nokia. Nonetheless, early experiments seem promising, and the company has achieved 115Gbps speeds over a distance of 15 metres using 70GHz spectrum.
Several other big names in tech have dipped a toe in the water of millimetre wave, including Samsung, which has produced what it claims is the first millimetre wave hardware: a 64-element adaptive array transceiver. Operating in the 28GHz band, the transceiver can handle over 1Gbps over 2km, according to Samsung, which is hoping to take the technology commercial before the all-important 2020 timeframe.
Initial Google experiments in several millimetre wave bands surfaced in an FCC filing recently, after the search giant acquired millimetre wave research company Alpental Technologies earlier this year. There were suggestions at the time that Google’s interest in millimetre wave stems from a view that it could be used as an eventual replacement for fibre broadband. While that might be something of a stretch in the short term, there are those who think that millimetre wave bands could lend themselves to backhaul for small cells.
5G and small cells
There are traditionally three ways the mobile industry can add more capacity to its network: by adding more spectrum, by improving spectrum efficiency, or by rolling out more infrastructure. As we’ve seen, no one’s quite sure how the spectrum arm-wrestling will play out. As for improving spectrum efficiency, according to Volker Ziegler, technology and innovation chief architect at Nokia’s Networking arm, every generation of mobile tech brings a threefold improvement in efficiency — that is, you can get three times as many bits through on the same slice of spectrum. Perhaps, he said, we could get that to five, 10 or 20 times, but it still wouldn’t be enough to hit the multi-gigabit future that 5G foresees.
That leaves installing more infrastructure. But the idea of more base stations going up in high-footfall areas is unlikely to be a popular prospect in most towns and cities. Small cells — shrunk-down base stations — offer a more palatable alternative for both operators and town planners.
Small cells help fill in gaps in coverage left by the full-fat base stations that underpin a mobile macrocell. Up until now, small cells have chiefly been installed in business premises and homes to bolster dodgy in-building mobile coverage; with 5G, the idea is to throw up loads of small cells in densely populated, high-data-demand urban areas.
Unlike full-fat base stations, small cells are, as their name suggests, far more petite — some even smaller than your home router — and don’t need to be installed as high up as normal mobile masts. That means that, far from drawing the grumbles about blights on the landscape that towers do, small cells can be made almost unnoticeable: strapped to lamp posts, or in future even built into the bricks of buildings. They’re also cheaper than the macro alternative, can help lower latency, and improve coverage at the cell edge. What’s not to love?
Of course, because of their reduced size, small cells have a much reduced range compared to their bigger siblings, at around 200 or 300 metres. That means there’s a potential challenge with handover: if you’re in a car speeding through town you could be passing through several small cells, and with each handover you risk packet loss or distortion — a royal pain in the ass if you’re in the middle of a call. There are already suggested ways around the problem though: using small cells for data only, and identifying those subscribers moving between many cells and putting them back onto the macrocell.
The idea of ultra dense networks also brings with it issues of energy consumption, 5G’s other cause célèbre. Sure, small cells are far lower powered than macrocells, but a network with huge numbers of them dotted around will still need more energy to run than one without. So how can you minimise power consumption and still roll out small cells?
One suggestion is a fundamental change to mobile architecture, with a greater separation between the network’s control plane (which plans how data will move through a network) and its data plane (which actually does the data moving).
“Once you separate the control and data plane, you can do all kinds of things, like energy efficiency. You can turn the small cells on and off, but keep the anchor on, so you don’t miss calls. Today, you have to shut the base station down, but that has its own problems. With this, you can still keep the anchor on, or you can have the pilot from the devices do it, but you can turn particular coverage areas in the data plane on and off,” Intel’s Keddy said.
The idea, sometimes referred to as ultra-lean design, is a major change from our current, non-ultra-dense networks. Today, cellular systems transmit data all the time. As the number of transmitters in the network grows, that’s going to lead to more and more interference. In 5G, while the anchor stays on, small cells can be shut down or awakened for the tiniest slices of time.
“Cutting always-on transmissions to a bare minimum, so that communication only occurs when there is user data to deliver, allows the transmitter to dynamically – on a millisecond basis – switch off and be silent,” Ericsson says. That means less energy used, and less interference.
MIMO is another technology likely to arrive in a big way with 5G. Rather than having a single antenna in the receiver and one in the transmitter as is the case now, MIMO (which stands for multiple input, multiple output) envisages a scenario where both sets of equipment have tens, or even a hundred, antennas or more. That translates into better data rates for users, and helps with both spectral and energy efficiency for operators.
It should work in concert with millimetre wave and small cells too. “Massive MIMO base stations and small-cell access points are two promising approaches for future cellular. Massive MIMO base stations allocate antenna arrays at existing macro base stations, which can accurately concentrate transmitted energy to the mobile users. Small cells offload traffic from base stations by overlaying a layer of small cell access points, which actually decreases the average distance between transmitters and users, resulting in lower propagation losses and higher data rates and energy efficiency. Both of these important trends are readily supported and, in fact, are enhanced by a move to mm-wave spectrum, since the tiny wavelengths allow for dozens to hundreds of antenna elements to be placed in an array on a relatively small physical platform at the base station, or access point, and the natural evolution to small cells ensures that mm-wave frequencies will overcome any attenuation due to rain,” the NYU academics’ paper says.
And, once MIMO becomes common — be it massive or otherwise — beamforming can follow. Put simply, rather than broadcasting a signal in all directions, the beam is directed towards the equipment that’s meant to receive it.
“What you really do is you smartly integrate elements of control that optimise the beam patterns and the way that energy is radiated,” Nokia’s Ziegler said. “It’s a processing technique that optimises the way electromagnetic waves are being transmitted and/or received, it goes back to that very basic notion of combining elements in a phased array, like a multi-element antenna setup.” The result is better coverage, better throughput, and reduced interference.
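The phased-array idea Ziegler describes can be sketched in a few lines: each element’s signal is phase-shifted so contributions add coherently in the steered direction and cancel elsewhere. Below is a simplified uniform-linear-array model in Python (the 64-element count echoes Samsung’s transceiver; half-wavelength spacing and ideal elements are assumptions for illustration):

```python
import cmath
import math

def array_factor(n: int, spacing_wl: float, steer_deg: float, look_deg: float) -> float:
    """Normalised power gain of a uniform linear array steered to steer_deg,
    observed at look_deg (angles from broadside, spacing in wavelengths)."""
    phase_step = 2 * math.pi * spacing_wl * (
        math.sin(math.radians(look_deg)) - math.sin(math.radians(steer_deg))
    )
    # Sum each element's contribution as a complex phasor
    total = sum(cmath.exp(1j * phase_step * i) for i in range(n))
    return abs(total) ** 2 / n ** 2

print(array_factor(64, 0.5, 0, 0))   # full gain towards the steered direction
print(array_factor(64, 0.5, 0, 30))  # practically nothing 30 degrees off-beam
```

More elements mean a narrower beam: the megaphone in Rappaport’s analogy gets more directional as the array grows.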
While neither beamforming nor MIMO is new — some Wi-Fi kit has the former and a handful of operators have rolled out the latter in a limited fashion as part of LTE-A — both will be greatly facilitated by the arrival of 5G, and are likely to become far more widespread as a result.
The new risks with 5G
5G makes all sorts of technologies possible – but also raises the stakes. If your car is being operated via a cloud-based autonomous driving system over 5G, you don’t want to lose the signal right at the precise moment it’s about to tell your vehicle to slam on the brakes. Operators and technology companies know that (and are perhaps considering the insurance implications). So they are aiming to cut network latency to make sure such an event doesn’t happen.
The need for low latency will have a profound effect on how networks are developed, according to Huawei’s head of wireless research, Dr. Wen Tong. Packet error rates will need sorting out.
With “one per cent of packet error rate, the human [user] won’t even have a perception of that bad packet, but if you want to use this packet to drive a car, that [error rate] has to be one in a million, ten million, or a hundred million, because you can easily cause accidents,” he said.
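The jump from a one-per-cent error rate to one-in-a-hundred-million is easier to feel with concrete numbers. A small sketch, assuming a hypothetical control link sending 1,000 packets a second (that rate is an illustration, not a figure from Huawei):

```python
def expected_errors_per_second(per: float, packets_per_second: float) -> float:
    """Expected number of bad packets per second at a given packet error rate."""
    return per * packets_per_second

print(expected_errors_per_second(1e-2, 1000))  # 10 bad packets every second
print(expected_errors_per_second(1e-8, 1000))  # one bad packet roughly every 28 hours
```

At one per cent, a voice call shrugs off the losses; a car taking steering commands over the same link would be swerving constantly.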
Being the operator whose network didn’t deliver the signal to turn right at the crucial moment or left a family stranded on a remote road because of connectivity problems is a possibility no one wants to face.
Network access time will have to be cut dramatically. Network controllers, for example, will need to become far more local to you — think single digit kilometres — to make autonomous cars a reality. Advances in coding and modulation will also be necessary.
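The ‘single digit kilometres’ figure follows from simple propagation arithmetic: signals in fibre travel at roughly two-thirds the speed of light, so distance alone eats into a 1ms budget before any processing happens. A quick sketch (the fibre speed is the usual rule-of-thumb value):

```python
C_FIBRE = 2e8  # rough speed of light in optical fibre, m/s

def round_trip_ms(distance_m: float) -> float:
    """Round-trip propagation time over fibre, ignoring all processing delay."""
    return 2 * distance_m / C_FIBRE * 1000

print(round_trip_ms(5_000))    # 0.05 ms: a controller a few km away leaves headroom
print(round_trip_ms(100_000))  # 1.0 ms: a distant datacentre blows the whole budget
```

In practice, coding, queuing, and processing delays consume most of the millisecond, which squeezes the allowable distance down even further.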
“With LTE, when you touch your screen, you wait for the web response, and it’s fast enough. For example, if it’s less than 16ms, you feel like there’s no delay and everything’s a very good experience. If you use GSM, it’s 500 to 600 ms. You have a half second of latency… It’s improved [with subsequent generations of mobile tech], but if you’re driving a car and you ask a car to turn left, if you still have 100ms delay, that car will be in trouble. That needs a very fast response, and that’s something we don’t have today with LTE,” Tong said. “We need to reduce latency to less than 1ms. That means that you need a new design.”
That could mean either hardening the network from start to finish — not a practical measure by any means — or introducing software-defined networking, a novel technology yet to be rolled out on any significant scale.
Software-defined networking (SDN) would effectively allow operators to better determine the path that data travels, choosing the shortest route, compared to the data-agnostic paths we have today.
Lower latency will mean an increase in edge computing. Rather than have data travel all the way to a datacentre at the centre of the network, data can be stored or processed at the base station, radio network controller, or similar.
According to the European Telecommunications Standards Institute industry specification group for mobile edge computing, the technology will mean greater convergence of IT and telecoms, with carriers opening up the edge of their network to other service providers.
For sub-1ms latency, “the options to stay centralised are constrained, because you get [a] round trip that will take 50ms or 100ms. From this, we have to empower operators to find ways to dynamically allocate resource, centralised or decentralised, and that goes to the very heart of the idea of what we call edge computing,” Nokia’s Ziegler said.
“The idea is we would have compute and processing, if you look at the resource layer, performed at the cell site, so you would co-locate in the most simple scenario a server at the cell site, but you would still keep up that notion of control and coordination more centrally so as to ensure seamless integration with a wide area network.”
As well as speeding up data round-trip times for time-critical services, pushing more work onto the edge of the network reduces congestion elsewhere, meaning a speed boost overall for users.
For Huawei’s Tong, latency is one of the risk areas for 5G. If it isn’t dealt with properly, it could be a real problem for the standard, given the cars and all the other industries that will depend on it. “If we don’t build it into the fundamental baseline release, it will always be difficult to fix in later releases,” he said.
5G and the internet of things
Historically, mobile data was something that human-controlled devices, not autonomous machines, consumed, and networks were designed accordingly, catering to the usage patterns of phones and, later, laptops and tablets. Now, the mobile industry is trying to work out how machines, not least those latency-loathing autonomous cars, will want to consume data. That means getting vertical companies involved in the standardisation process — companies that have historically never had to take an interest in networking, and whose core competencies don’t involve mobility.
It’s another way 5G represents a break from previous mobile standards. “All these different use cases put very different diverse requirements [on 5G]. It’s not that elements of these didn’t exist before, but we never had one technology or one element that had such extreme use cases — ranging from IoT to having a good experience, to wearables, and even cars,” said Intel’s Keddy.
“This diversity really puts in a lot of different tension,” said Keddy. “For example, a good M2M solution has to be low cost, cheap and high range, which is a very different thing from what I want in my high end phone. Those are the diverse requirements that are at play.”
This need to include more companies is yet another challenge on the way to standardisation — a process that’s known for being fraught with disagreements and geographies jockeying for the upper hand at the best of times.
Traditionally, mobile standards vary from continent to continent as different operators around the world opt for the version of the technology that suits them best. In 3G’s case, for example, UMTS was popular in Europe, China went for TD-SCDMA, while CDMA cropped up in the US. While the number of variants has been falling with each generation of mobile standard, it’s hoped there will be just one single version of 5G used around the world.
“5G will require major efforts in terms of research. Governments need to support this research effort financially. We also need a global commitment to avoid competing standards, so that we have one global 5G. This requires agreement on a set of policies, like spectrum allocation. We are preparing joint actions with South Korea and Japan and we will explore future cooperation with China and the US. The next step for 2015 is to agree on a common global definition of the 5G technical parameters,” the EC’s Kleiner said.
Putting the 5G pieces together
It’s thought that the first networks will arrive in 2018 — timed to coincide with the Winter Olympics in PyeongChang, South Korea — before the standard itself is finalised. Pre-standard networks are traditionally something of a rarity and bring their own set of problems. Take Japan’s FOMA, for example. It may have been the first 3G service on the market, but as it was pre-standard, it wasn’t compatible with the more popular UMTS variant of 3G, meaning the range of compatible handsets was initially small and unattractive.
It’s also been suggested that, rather than the Winter Olympics, the next major sporting event that will fire the starter’s gun on 5G will be the 2022 World Cup. As well as avoiding the issues with pre-standard networks, those extra four years will also give operators a little longer to make back the money they’ve spent on the previous generation of mobile network technology.
“There’s a long way to go with 4G, in terms of adding capabilities, efficiency, performance,” Luke Ibbetson, head of R&D at Vodafone, said at a recent 5GIC event. “We don’t see 4G running out of steam for a long time.”
According to figures from the GSM Association, operators will spend $1.7 trillion on their LTE networks between 2014 and 2020, and clearly they’ll want to see some return on that investment before they begin rolling out any replacement – particularly given that pricing models have been a source of discomfort for mobile operators for some time. When many carriers switched on 4G, they upgraded their customers’ access for free, offering 5GB of 4G data at the same cost as 5GB of 3G data. To make their money back, they parcelled out data on a volume basis, hoping that those who enjoyed better speeds would consume more data, and so move to more and more expensive packages.
The theory is that 5G will allow mobile carriers to figure out new and different ways to part users from their money, not necessarily because the new network permits it, but because buyers will be far more comfortable swallowing a price rise if it’s accompanied by a perceived leap in technology. Among the potential pricing changes may be data tiered by speed, a practice we’re already starting to see in Europe with the advent of LTE and LTE-A networks. Another option is that flat-rate data packages disappear, with pricing simply stepped up incrementally as users consume more and more. Packages may be segmented along service lines – all the video or music streaming you can eat for a certain price, with quality of service thrown in. Either way, as one wit put it, it will end in tiers.
Data plans aren’t the only way 5G will have consumers putting their hands in their pockets. Every new generation of mobile technology also means a new smartphone – usually another chance for operators to extract cash from their customers, either as an upfront fee or in exchange for signing a long-term contract.
While this upgrade cycle may be one we recognise, 5G may be the last G we ever see. As EE’s principal network architect Andy Sutton put it, “if we get 5G right, there won’t necessarily be a 6G.”
While 1Gbps is the baseline for 5G, multi-gigabit speeds are the aim. Consumers, according to industry types, should have the feeling that they always have all the capacity and speed they need, however much data they’re using and wherever they are. If 5G networks are built well, there will be no need to upend mobile tech every decade, just update it bit by bit as new needs, and technology, arise.
If, adds Sutton, “5G always delivers sufficient rates such that the end consumer perceives there’s infinite capacity, we’ll continue to evolve our networks, but we’re moving away from this major generational shift every 10 years to a more general evolutionary capability. Whether we call that something new or not will become more of a marketing question than a technology question. I can’t foresee a time in 100 years when my grandchildren’s children are standing up and saying ‘We do need to start work on 15G because 14G’s not really up to it’.”
Of course, if mobile has taught us anything, it’s that when it comes to predicting the future of our networks, all bets are off.