The Groq funding round (amount undisclosed) includes new investor TDK Ventures which picked Groq for its sustainability credentials...
AI accelerator chip designer Groq has closed a third round of funding but declined to reveal how much was raised. The Silicon Valley startup, which raised $67 million in its previous two rounds, said that it intends to double its 60-person head count in 2020 and again in 2021. Groq will also use the funding to work on a second-generation version of its architecture.
Groq’s latest funding round was led by new investor D1 Capital Partners. Other new investors span early-stage business angels to firms that typically favor later-stage companies, according to the company.
One of Groq’s new strategic investors, TDK Ventures, told EE Times that from the first meeting with Groq to closing the investment took 25 days, expeditious by any standard. The first venture fund for component maker TDK is looking to invest in companies that have synergy with its parent, as well as those that “solve a meaningful problem in the world”.
“We are looking in places where we are not yet in the market, or we want to be providing a stronger solution to the market with technology we may or may not already have. So we have this exploration purpose,” said TDK Ventures’ managing director Nicolas Sauvage. “[Partnering with Groq will mean] we are able to learn much earlier about future requirements, as well as the current requirements of customers. And what that means is we can start to have our roadmaps intersecting these requirements at the right time.”
Sauvage’s take on “solving a meaningful problem in the world” is about reducing the amount of power consumed by AI hardware in the data center and beyond, something he says Groq is well-positioned to do.
“AI inference will scale to an unprecedented level. Every other compute platform has been limited by the number of offices, desks, homes or people — or if you really want to push it, the number of pockets a person has. Maybe they have two or three smartphones, but that’s the limit,” he said. “For AI inference there is no cap, there’s no limit to how many inferences will happen every day… We believe that worldwide energy consumption for hyperscalers and AI inference use cases, and therefore the carbon emissions, will see double digit [growth] within five or 10 years. If we can invest in a company which we believe is the best in terms of number of operations per Watt, this is extremely meaningful.”
Hyperscale data centers, Groq’s key market (at least up to now; see below), are notoriously power-hungry. But do they really care about sustainability?
“We may want a more sustainable world, but if there is no alignment with the [key performance indicators] that customers care about, the sustainability element is not going to work,” Sauvage agreed. “What we believe will be successful and a really meaningful contribution is that for power consumption, being much lower translates to better [total cost of ownership], which means we believe Groq will be successful and win huge market share. And at that point the sustainability can kick in. So there is a very nice alignment between what the customer cares about in terms of KPI, and the sustainability elements.”
In an interview with EE Times, Groq was vocal about the benefits its AI accelerator architecture, previously marketed for data center inference, could bring to automotive applications.
According to Groq VP of products and marketing Bill Leszinske, Groq’s architecture has a very large dynamic range: running the same hardware at lower voltage and lower frequency makes it suitable for autonomous vehicles, and future process technology nodes will amplify the effect, he said. Determinism, a key facet of Groq’s architecture, also suits it to the automotive world, which depends on predictability for safety reasons.
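Leszinske’s “large dynamic range” claim tracks the standard CMOS dynamic power relation, P ≈ αCV²f: power falls with the square of voltage but only linearly with frequency, so scaling both down buys a superlinear power saving for a linear throughput cost. A minimal sketch of that arithmetic (normalized, illustrative operating points — not Groq’s actual specifications):

```python
def dynamic_power(voltage: float, frequency: float,
                  capacitance: float = 1.0, activity: float = 1.0) -> float:
    """Relative CMOS dynamic power: alpha * C * V^2 * f (all values normalized)."""
    return activity * capacitance * voltage ** 2 * frequency

# Hypothetical operating points: full-speed data center mode vs. a
# scaled-down low-power mode at 70% voltage and 50% clock frequency.
p_full = dynamic_power(voltage=1.0, frequency=1.0)
p_low = dynamic_power(voltage=0.7, frequency=0.5)

print(p_low / p_full)  # 0.245: roughly 4x less power for 2x less throughput
```

The ratio shows why the same silicon can plausibly serve both markets: halving the clock while dropping the voltage cuts dynamic power by about four times, a far better trade than frequency scaling alone would give.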
“Because it’s deterministic, [Groq’s chip] appeals to the autonomous folks because it simplifies their software design,” Leszinske said. “They have hours of video footage that they use to train their models every night, to make incremental improvements. And so you enable training and deployment in the field on the same hardware, which simplifies their development cycle as well.”
Has Groq pivoted away from the data center towards the (totally different) automotive market?
“Not at all,” insisted Leszinske, saying that the company is merely expanding the application areas it is engaging with, which include any application with a power envelope above 10 W.
While hyperscale data centers are a tough market for a startup, automotive is traditionally seen as even more challenging. Hyperscalers are already starting to adopt AI accelerator chips in real applications, whereas the autonomous vehicle chip market is still emerging. Plus, while data centers might refresh hardware every four years or so, design cycles in automotive can be much longer, which means a longer wait for revenue, a critical issue for any startup.
Leszinske acknowledged that automotive is a complicated market to get into.
“One of the reasons why we continue to stay engaged with the autonomous driving and autonomous vehicle people is, frankly, they keep coming to us, because the deterministic nature of our architecture greatly simplifies what they’re doing from a development and algorithm management standpoint,” he said.
TDK Ventures’ Nicolas Sauvage had a slightly different take on it.
“The reason you have a beautiful opportunity [in automotive] is that no one is happy with the current solutions,” Sauvage said. “And that’s why you could end up with a very short design cycle versus having a great product when the incumbent’s already there, and it’s very hard to displace them.”
Groq’s VP of corporate development Adam Tachner said that while autonomous driving may become a revenue stream in the longer term, the company would continue to focus on short- and medium-term opportunities in parallel.
“All of those will stack up into a very nice revenue ramp over the course of the first several years of the company’s revenue,” he said.
In the short term, Groq is working with customers using its 1-PetaOPS chip for various high-performance computing applications.
“There are customers that are not particularly cost sensitive, that are leveraging large linear algebra engines to do deep quantitative analysis – that could be in the financial sector, or the pharmaceutical space for drug discovery,” Tachner said. “Fundamentally in these spaces, the speed of innovation is limited by the speed of compute. They are very excited about our ability to do multivariate analysis and other things they want to do at massive scale. That’s a much more direct revenue opportunity which will be the focus of our first two years.”