Google's Pixel 4 is the first smartphone to feature a radar IC. We spoke with Infineon, which supplied the chip.
Google last week rolled out its Pixel 4 smartphone, whose claim to fame is a radar-based technology that makes it the first smartphone featuring “Motion Sense” capabilities.
Under a project named “Soli,” Google’s Advanced Technology and Projects (ATAP) group embarked on the development of a miniature radar chip five years ago. Google wanted to design a radar technology that tracks human gestures, everything from a finger-tap to whole-body movement.
For Google, the goal of Project Soli isn’t just embedding the radar technology inside smartphones. Thinking bigger, Google saw its mission as making radar a foundational technology for “the language of non-verbal communication.”
Combining Soli radar chips with Google’s home-grown machine learning and a newly created large database, Google set its sights high: to develop a new human-machine interface that can be applied pervasively to a variety of devices, from watches, glasses and lighting to commercial and industrial systems.
To be clear, though, Google’s so-called Soli chip and its underlying hardware technologies were developed by Infineon Technologies, not Google.
While Pixel 4 surely creates a big-volume design-win opportunity for Infineon’s radar chips, that is hardly the whole story. Unanswered questions include: How did the German chip company find its way into Google’s campus in Mountain View? How closely did Infineon work with the Project Soli team? Did Infineon have to compete with other radar chip companies to win the socket for Pixel 4?
EE Times last week caught up with Andreas Urschitz, Infineon’s division president for power management and multimarket, who happened to be visiting Silicon Valley. Our goal was to see if Urschitz might pull back the curtain to show the inner workings of the Infineon-Google relationship.
EE Times: Infineon has been in the radar chip market for a long time with a strong presence in the automotive market. But you’re not alone. Others, including NXP, Texas Instruments and STMicroelectronics, are also competing in the radar market. How did you get in on the Soli Project? Did Google invite you?
Urschitz: No, not really. We approached Google to see if they might be interested in radar-enabled gesture detection/ambient sensing ideas.
EE Times: Wait, so you are saying that Infineon initiated this partnership?
Urschitz: We’ve been working with radar technologies for a while. Our radar chips have been used in automotive and industrial applications, and we’ve been quite successful. But we’ve always thought applying the mmWave radar technology to non-verbal communication detection was an area that's overlooked by the industry.
EE Times: In other words, Infineon had the kernel of an idea for Project Soli, even before Google launched it. But my question is, how did you approach Google? You just walked into Google’s Mountain View campus — uninvited?
Urschitz: We first went to see the managing director of Google ATAP. We explained our idea. Then, he introduced us to one of the ATAP groups.
EE Times: And you guys hit it off?
Urschitz: They liked the idea, and that’s how the project started.
EE Times: Obviously, Google didn’t just pick up your chip, as is, and run with it. Google must have had a lot of demands about the specs and capabilities of the radar chips it wanted. What were its requirements?
Urschitz: First, they wanted the radar chip to be as small as possible, small enough to fit inside a smartphone.
EE Times: How much further did Infineon have to go with the miniaturization of the chip? How big was your radar chip when you started, and how small did it get eventually?
Urschitz: We squeezed it down from a 15mm x 12mm radar chip to a 6mm x 5mm chip.
EE Times: What was the hardest part in the course of miniaturization?
Urschitz: The hardest thing was to shrink the multiple antennas and embed them inside the radar chip. This called for a lot of IP development on our part.
EE Times: What other challenges did you face in making a 60 GHz mmWave radar chip that met Google’s requirements?
Urschitz: They demanded super efficiency in power consumption. While it depends on the use case, the power consumption of the 60 GHz radar chip is now down to between 1mW and 10mW. In some use cases, it consumes only 0.5mW.
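Those figures suggest aggressive duty cycling: the chip sleeps between short sensing bursts, so the average power lands well below the active power. A back-of-envelope sketch (the duty cycle and sleep-power numbers below are illustrative assumptions, not Infineon's published figures):

```python
def average_power_mw(p_active_mw: float, p_sleep_mw: float, duty_cycle: float) -> float:
    """Average power of a duty-cycled sensor: weighted sum of active and sleep power."""
    return p_active_mw * duty_cycle + p_sleep_mw * (1.0 - duty_cycle)

# Hypothetical numbers: 10 mW while actively chirping, 0.05 mW asleep,
# sensing only 5% of the time.
print(average_power_mw(10.0, 0.05, 0.05))  # 0.5475 -> roughly the 0.5 mW figure cited
```

With those assumed numbers, the average lands near the 0.5mW low end Urschitz mentions, which is why duty cycling matters so much for an always-on sensor in a phone.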
EE Times: What other demands did Google impose?
Urschitz: They wanted to make sure that we enable our radar chip to see 180 degrees.
EE Times: A radar chip with a 180-degree field of view was important to Google, because Google wanted the Pixel 4 to have spatial awareness everywhere in front and off to the sides. Right?
Urschitz: Yes, they wanted a wide-angle view, but it was tricky, because it directly affected the antenna designs. This was all about physics.
EE Times: So, even though Google calls this device "Google’s Soli chip," Infineon developed the hardware. Under Project Soli, both Infineon and Google must have developed a host of new technologies. Of all the innovations, could you tell me which IPs belong to Infineon, and which IPs Google owns?
Urschitz: Google owns all the software. The way the antennas send and receive signals and the algorithms that interpret the signals are Google’s IP. But Infineon owns all the hardware-related radar technology IP.
EE Times: One thing that surprised me about Project Soli is Google’s effort to develop a machine learning [ML] model for radar applications. My understanding is that computer vision-based AI has already progressed a great deal thanks to the broader availability of CMOS image sensors. It allowed AI researchers to get a head start in creating a large visual database used for the development of visual object recognition software. ImageNet is a good example. But for radar, the industry hasn’t seen a big database or AI model equivalent to that of ImageNet. Is that correct?
Urschitz: As far as I know, yes, very little has been done on ML models for radar applications.
EE Times: Presumably, that’s where Google comes in. Under Project Soli, Google is building a large database to design a robust ML model to interpret gestures sensed by radar, according to Google. Beyond smartphones, where do you think your radar chip is going and what will it enable?
Urschitz: We anticipate radar technology being used not just in smartphones but also in speakers and TV sets. Radar can perceive movement in a room or measure distances to objects with millimeter precision. Aside from using micro-gestures to control a tiny consumer device such as a smartwatch, we expect industrial applications such as air conditioning equipment and smart lighting to leverage radar technology effectively. If radar can sense the presence of people in a room and how its air is flowing, it could make air conditioners work a lot more efficiently, for example.
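The distance measurement Urschitz describes is typically done with frequency-modulated continuous-wave (FMCW) chirps: the radar sweeps its carrier frequency, and the frequency offset ("beat frequency") between the transmitted and reflected signal is proportional to target range. A sketch of the standard textbook relationships, using made-up chirp parameters rather than Soli's actual ones:

```python
C = 299_792_458.0  # speed of light, m/s

def fmcw_range_m(f_beat_hz: float, chirp_slope_hz_per_s: float) -> float:
    """Target range from an FMCW beat frequency: R = c * f_beat / (2 * slope)."""
    return C * f_beat_hz / (2.0 * chirp_slope_hz_per_s)

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest resolvable range separation: dR = c / (2 * B)."""
    return C / (2.0 * bandwidth_hz)

# Hypothetical 60 GHz chirp: 6 GHz sweep over 100 microseconds.
slope = 6e9 / 100e-6                 # Hz per second
print(fmcw_range_m(240e3, slope))    # ~0.6 m to the target
print(range_resolution_m(6e9))       # ~0.025 m, i.e. ~2.5 cm resolution
```

Note the distinction: the chirp bandwidth limits how far apart two targets must be to appear separately (centimeters here), while tracking the phase of the return across chirps is what lets a radar detect millimeter-scale displacement of a single target, such as a moving finger.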
EE Times: Given that Google is leading the industry in designing an ML model for radar applications, do you think Google will open it up and allow other system companies to use their software?
Urschitz: That’s for Google to answer. But it makes sense for Google to do so for other Android smartphone vendors.
EE Times: As you pitch your radar chips for getting designed into products beyond Pixel 4, what is your strategy?
Urschitz: Independent of Google, we are developing software for our radar chips that allows system vendors to measure the speed, angle, distance and size of objects. We will make an SDK available so that our customers can develop their own algorithms.
EE Times: Is your new radar chip, designed for Project Soli, already in volume production?
Urschitz: Of course. [The Pixel 4 is set for release on October 24 in the United States.]
EE Times: Where are Soli radar chips [complete with RF front-end, baseband and ADCs] being manufactured?
Urschitz: It’s been fabricated at our 300mm-wafer fab in Dresden using a BiCMOS process.