EDN's Larry Desjardin roams the halls of the International Microwave Symposium to explore the drivers and applications behind 5G mmWave communications.
5G has lately emerged as the buzzword in any publication about communication networks. Current 4G LTE networks have been expanded and updated, but the industry is hard at work defining the next leap in wireless communication. 5G promises future users 10Gb/s data rates and under 1ms latency. The breakthrough is achieved by jumping to the millimeter wave frequencies, approximately 30GHz and beyond, where spectrum is plentiful but ill-behaved.
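A quick Shannon-limit calculation shows why that 10Gb/s promise forces the jump to mmWave spectrum: even at a healthy link SNR, the channel must be gigahertz wide, and contiguous spectrum that wide simply doesn't exist in the crowded bands below 6GHz. Here is a back-of-the-envelope sketch; the 20dB SNR is my own illustrative assumption, not a figure from any 5G specification:

```python
import math

def shannon_min_bandwidth_hz(target_bps: float, snr_db: float) -> float:
    """Minimum channel bandwidth (Hz) needed to reach target_bps at a
    given SNR, per the Shannon capacity limit C = B * log2(1 + SNR)."""
    snr_linear = 10 ** (snr_db / 10)
    return target_bps / math.log2(1 + snr_linear)

# 10 Gb/s at an assumed (optimistic) 20 dB SNR
bw = shannon_min_bandwidth_hz(10e9, 20.0)
print(f"Minimum bandwidth: {bw / 1e9:.2f} GHz")  # ~1.50 GHz
```

Roughly 1.5GHz of spectrum for a single user, as an absolute theoretical floor; real modems fall short of Shannon, so the practical requirement is wider still.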
Frequent readers of this column, Test Cafe, have seen me comment on the implications for test equipment: higher frequencies, huge bandwidths, new protocols, and overwhelmingly modular architectures due to the multi-channel nature of the beast. I stand by those predictions. 5G offers both opportunities and challenges for test equipment vendors. The opportunity is that new technology waves enable large swings in market share in an otherwise static marketplace. The challenge is that the measurements are really, really difficult.
Figure 1: 5G mmWave communication will rely heavily on beamforming, making mobile communications challenging. Are the results worth the costs? (Image courtesy of Keysight Technologies)
5G measurements are difficult because the networks themselves face very difficult challenges. When I say mmWaves are ill-behaved, here's what I mean. mmWaves suffer high attenuation, so active beamforming is a necessity to get the needed gain. On top of that, almost everything becomes a reflector: trees, lampposts, windows, people. So, in addition to simple beamforming, the base station will often have to perform sophisticated bank shots to maintain a connection as a user walks behind a post or tree. Even with all that, the maximum cell range is in the neighborhood of 200m, basically two football fields. This means that cell density must be very high, coupled with sophisticated hand-off mechanisms between cells. And yet, with all these difficulties, operators are racing to define, design, and deploy 5G.
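The attenuation penalty is easy to quantify with the standard Friis free-space path-loss formula. Comparing a 200m link at a familiar sub-6GHz carrier against 28GHz (a candidate mmWave band, chosen here purely for illustration) shows a gap of more than 20dB before rain, foliage, or body blockage even enter the picture, which is exactly why beamforming gain is mandatory:

```python
import math

C = 3.0e8  # speed of light in m/s (approximate)

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

# A 200 m cell edge: sub-6 GHz carrier vs. an assumed 28 GHz mmWave carrier
loss_sub6 = fspl_db(200, 2.4e9)    # ~86 dB
loss_mmwave = fspl_db(200, 28e9)   # ~107 dB
print(f"2.4 GHz: {loss_sub6:.1f} dB, 28 GHz: {loss_mmwave:.1f} dB, "
      f"penalty: {loss_mmwave - loss_sub6:.1f} dB")
```

The ~21dB penalty scales as 20log10 of the frequency ratio, so it holds at any distance; antenna arrays must claw that margin back through beamforming gain.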
Or are they?
It is with this question in mind that I attended the annual International Microwave Symposium (IMS), this year held in San Francisco. I was there primarily to look at what the test vendors were offering, and with what type of architectures. I will cover that in Part 2 of my IMS coverage. But I also wanted to check on the business case for 5G itself. After all, the investments made by service providers will be in the tens of billions of dollars. And that is only if they can deploy this difficult technology at all. I’ve seen technology cycles delayed (40Gb/s optical in the early 2000s), so it would not be unprecedented for the industry to take a pause between their major refresh cycles if the value proposition wasn’t compelling enough.
So, wherever I went, I asked this intentionally provocative question: Given that peak data rates of 4G are already sufficient for almost all known applications, and coverage is the major dissatisfier for cellular users, why would a service provider commit billions of dollars to something so ill-behaved that only goes 200 meters?
Here’s what I found…
My first stop was a breakfast press event hosted by Keysight Technologies just outside the conference. The keynote speaker was Mark Pierpoint, VP and General Manager of the Internet Infrastructure Solutions Group at Keysight. In full disclosure, I've known Mark for many years, and worked with him before I retired from Agilent.
Mark gave a very engaging and thought-provoking presentation about new developments in communication networks. Examples ranged from low-power ubiquitous communications for IoT (Internet of Things), to satellite-enabled services, to a mmWave bandwidth explosion. While all of these have sometimes been referred to as "5G," it is the latter that I truly equate with 5G. At the end of the presentation, I asked what specific applications require this phenomenal bandwidth and would justify such a massive investment.
While Mark had hinted at many applications in the long run (e.g., automated semi trucks arranged in a train line on a freeway), he selected two that would need the bandwidth right away: stadiums and virtual reality.
The stadium application is real. There is a very high density of users, and many sporting events cause the cellular network to collapse to a snail's pace. In stadiums, conventions, and indeed anywhere there is a high density of users, the 5G value proposition seems valid. Toss in extra features, like personal instant replay or views from a different perspective, and 5G could be critical in supplying new experiences. It is no coincidence that Samsung has promised that its first 5G public demo will be deployed at the 2018 Winter Olympics in PyeongChang, South Korea.
But virtual reality? True, real-time VR systems require a lot of bandwidth, but what are the applications? Is the experience that much better than, well, watching a video? In all honesty, I had never put on a virtual reality headset, so I was skeptical.
Fortunately, the IEEE was offering a VR demo on the conference floor, and I jumped at the chance to experience it myself. Donning the headset, I and two other volunteers sat in chairs that simulated the pilots' seats of a spacecraft. I looked above, below, and behind, and saw that I was in a detailed, realistic spacecraft alongside two other suited space pilots. Below and to my left was a cup of coffee, still steaming, with the IEEE logo printed across it. We were given two minutes of instructions on how to use the controls before the real simulation began.
Figure 2: Here the author experiences VR (virtual reality) for himself. It may appear he is on the floor of a conference, but he is actually piloting a shuttlecraft across the surface of Mars. Fortunately, no people were harmed in the simulation, nor any virtual coffee spilled.
Check out part 2 of the series here.