A "multi-decade era" of evolution has begun, aiming to boost production and drive down costs
SAN JOSE, Calif. — Deep neural networks are crawling toward the factory floor.
For several early adopters, neural nets are the new intelligence embedded behind the eyes of computer-vision cameras. Ultimately, the networks will snake their way into robotic arms, sensor gateways, and controllers, transforming industrial automation. But the change is coming slowly.
“We’re still in the early phases of what’s likely to be a multi-decade era of advances and next-generation machine learning algorithms, but I think we’ll see enormous progress in the next few years,” said Rob High, chief technology officer for IBM Watson.
Neural networks will nest in growing numbers of Linux-capable, multicore x86 gateways and controllers appearing on and around the factory floor. Emerging 5G cellular networks will one day give neural nets ready access to remote data centers, said High.
Auto and aircraft makers and health-care providers are among those taking early steps, mainly with smart cameras. Canon is embedding Nvidia Jetson boards in its industrial cameras to switch on deep learning. Industrial camera vendor Cognex Corp. is ramping up its own offerings. And Chinese startup Horizon Robotics is already shipping surveillance cameras that embed its deep-learning inference accelerators.
“All the early adopters have deployed deep learning for visual perception, and others are starting to notice them,” said Deepu Talla, general manager of autonomous machines at Nvidia. “Perception is reasonably easy to do, and researchers see it as a solved problem.
“Now the big problems are in using AI for interaction with humans and more detailed actuation — these are 10-year research problems. In areas such as drone and robot navigation, we are more in the stage of prototypes.”
Talla calls robotics “the intersection of computers and AI,” but many industrial uses of deep learning will be less glamorous — and will arrive sooner.
Factory robots are not using AI yet, said Doug Olsen, chief executive of Harmonic Drive LLC, a leading supplier of robotic components. In the short term, look not for smart robotic arms but for embedded “machines on the factory floor that can predict failures, gathering data about daily use to determine when systems need preventative maintenance,” said Olsen. “That’s where AI can take hold first.”
Some big chipmakers agree. Renesas started experimenting three years ago, putting microcontrollers supporting AI at end nodes to detect faults and predict maintenance needs in production systems at one of its semiconductor fabs.
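Renesas has not published how its end nodes detect faults, but the general idea — a node that watches its own sensor stream and flags readings that stray from the recent baseline — can be sketched in a few lines. The example below is a hypothetical illustration, not Renesas’s method; the class name, window size, and z-score threshold are all assumptions chosen for clarity.

```python
# Hypothetical sketch of end-node fault detection for predictive
# maintenance: flag sensor readings that deviate sharply from a
# rolling baseline. Not Renesas's actual implementation.
from collections import deque
import math

class VibrationMonitor:
    """Keeps a sliding window of readings and flags statistical outliers."""

    def __init__(self, window=50, threshold=3.0):
        self.readings = deque(maxlen=window)
        self.threshold = threshold  # z-score beyond which we flag a fault

    def update(self, value):
        """Add a reading; return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 10:  # need a baseline before judging
            mean = sum(self.readings) / len(self.readings)
            var = sum((x - mean) ** 2 for x in self.readings) / len(self.readings)
            std = math.sqrt(var) or 1e-9  # avoid divide-by-zero on a flat signal
            anomalous = abs(value - mean) / std > self.threshold
        self.readings.append(value)
        return anomalous

monitor = VibrationMonitor()
for v in [1.0, 1.1, 0.9, 1.05, 0.95] * 4:  # steady baseline
    monitor.update(v)
print(monitor.update(5.0))  # a spike well outside the baseline -> True
```

In a real deployment, the threshold crossing would trigger a maintenance ticket rather than a print, and the statistics might run on an MCU in fixed point — but the shape of the logic is the same.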
In October, the Japanese chip giant rolled out its first MCUs with dynamically reconfigurable processor blocks for real-time image processing. It aims to follow up with controllers that can support real-time cognition in 2020 and incremental learning in 2022.
Rival STMicroelectronics is taking a similar approach with its STM32 chips. In February, it announced a deep-learning system-on-chip and an accelerator under development, aimed in part at fault detection on the factory floor.
Berkeley professor Pieter Abbeel formed startup covariant.ai to help factories use deep learning to train robots for multiple tasks.
The smart robots will come eventually. Startup covariant.ai, for one, is working to enable them with reinforcement learning. “Equipping robots to see and act on what they see will be one of the biggest differences that deep learning will make in the next few years,” said Pieter Abbeel, an AI researcher who founded covariant and runs a robotics lab at the University of California at Berkeley.
Abbeel shows jaw-dropping simulations of robots learning to run using neural-net techniques, but it’s still early days. “In fact, we started covariant in part because the industrial AI space is not that crowded yet,” he said.
Deep learning must clear three substantial hurdles on its way to the factory floor, said Abbeel. “What’s needed is a combination of the right annotated data, expertise that knows what to do with it, and money to pay for the compute [resources].”
Industrial companies looking to hire face “a shortage of AI experts,” said Abbeel. “Berkeley graduates about 30 Ph.D.s a year in the field. Many end up in places focused on research, such as DeepMind at Google, so that doesn’t leave many people going into industry.”
The experts are needed to collect and label data and to build and train models — data-science tasks that traditionally haven’t been an in-house priority at industrial companies. That’s slowly changing, however, with the expanding use of computer models for everything, said Colin Parris, vice president of software research at GE.
“Industry has long used models and simulations for safety and cost, but now they are maintaining ‘living’ models [that are] continuously updated,” said Parris. For example, “whole airline fleets are modeled for effects flying into hot and cold zones from the Middle East to Scandinavia.”
Such models — so-called digital twins — are becoming tools for predicting behavior changes, a job that “requires collecting more data faster and using it in different ways,” said Parris. “The trouble is that everyone is trying to sell a modeling tool, and the industry winds up with issues around data interoperability.”
Chinese startup Horizon Robotics is already shipping smart cameras embedded with its Sunrise deep-learning accelerator. (Image: Horizon Robotics)
For example, deep learning has led to the rapid rise of software frameworks such as TensorFlow, Caffe, and Chainer, each with its own pros and cons. Developers see a need for hybrid frameworks to support nuanced AI abstractions, and researchers are just starting to work on them.
Others noted that neural networks are nondeterministic. That creates two problems: Their results are useful but cannot be explained, and their accuracy is often good but never perfect. “Even research paper [results] have success rates that may be 80% to 99%, so you need to take care around situations when failures would create a high cost,” said Abbeel.
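Abbeel’s caution translates into a common deployment pattern: act automatically only on high-confidence predictions and defer the rest to a person. The snippet below is an illustrative sketch of that routing logic, not any vendor’s API; the function name, inputs, and 0.95 threshold are assumptions.

```python
# Illustrative sketch: when a model is right "80% to 99%" of the time,
# gate automated action on prediction confidence and send borderline
# cases to human review instead of acting on them.
def route_prediction(label, confidence, threshold=0.95):
    """Return ("accept", label) to act automatically, or ("review", label)
    to defer to a human inspector. `label` and `confidence` stand in for
    a real model's top class and softmax score."""
    if confidence >= threshold:
        return ("accept", label)
    return ("review", label)

print(route_prediction("defect", 0.99))  # ('accept', 'defect')
print(route_prediction("defect", 0.80))  # ('review', 'defect')
```

The right threshold depends on the cost of a miss: a cosmetic-inspection line might accept at 0.8, while a safety-critical check might review everything below 0.99.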
Like the algorithms, the hardware is still evolving. “Calculations in deep learning need changes in how you lay out transistors for features such as matrix multiplication,” said Abbeel, who was an early investor in Graphcore, one of nearly 50 startups designing AI accelerators. “[Deep learning’s needs] do not match up with today’s CPUs and GPUs, which are good but not optimal.”
Machine learning is starting to make an impact in industrial applications ranging from painting jet turbines to packing Christmas presents.
Parris said that GE uses robots outfitted with computer vision to apply luminescent paint to jet engine turbines and inspect them for cracks. “It takes people out of that depressing environment, and we get better information,” he said.
Nvidia’s chips support automated optical inspection of parts from Japan’s Musashi Seimitsu. They also run safety cameras that keep Komatsu diggers from bumping into workers in construction zones.
John Deere uses Nvidia’s Jetson boards to help tractors distinguish between crops and weeds so that fertilizer isn’t wasted on the latter. The boards also power an automated harvesting system made by Agrobot.
A handful of startups are pioneering the trickier tasks of helping robots automatically navigate around warehouses or even cities. Fellow Robots has a system that cruises aisles, taking inventory, in some Lowe’s and BevMo retail stores. Robby Technologies has pilots in the San Francisco area making deliveries to homes, and Aerialtronics makes commercial drones with computer vision for law enforcement, search and rescue, and construction.
Giant fulfillment operations run by the likes of Alibaba and Amazon will be big drivers for smart robots, said Rodney Brooks, the father of the Roomba and chairman of Rethink Robotics. “Amazon Robotics employs 700 people just in Boston, but the robots can’t do pick-and-pack operations, so they are hiring every Christmas,” said Brooks at an event last year.
Robots produced by startup Robby Technologies are already in field tests, making home deliveries in the San Francisco area. (Image: Robby Technologies)
“These fulfillment centers have an incredible need for pick-and-pack operations where every package is unique,” he said. “That will drive development in robotic arms in ways we haven’t seen, and that will, in turn, impact factory automation in ways we can’t predict.” He added, however, that the “tremendous” demand “will disintermediate many people in robotics.”
Samsung employs a team of data scientists at its US$17 billion U.S. chip foundry, said Jonathan Taylor, vice president of manufacturing for Samsung Austin Semiconductor. He would not say whether those scientists use deep-learning techniques, but the facility did announce a deal with AT&T to test a private 5G network.
The fab may use the network to link vibration sensors to utility systems, such as water pumps, in remote parts of its campus. It may also use wireless links to 4K video cameras for security and to confirm the whereabouts of 3,000 employees in case of emergency, a face-recognition app ideal for neural nets.
The facility already has “a lot of wired connectivity, so we’re looking to connect areas we have not been able to connect to before,” said Taylor.
Factories are getting more connected, providing a foundation for AI. A third of industrial IoT networks now link more than a thousand nodes, according to a 2017 survey from market watcher ON World. That’s twice the level of wired connections that it found in a 2014 survey, but only 12% of the factories in the latest survey have deployed that many wireless nodes at a single site.
The expanding networks are laying pathways through which deep-learning neural nets will someday crawl, bringing new levels of intelligence for every system connected to them.
— Rick Merritt, Silicon Valley Bureau Chief, EE Times