Japan Showcases its AI Tech

Article By : Junko Yoshida, EE Times

A mammoth report on the hottest happenings from Yokohama's Embedded Technology 2018

YOKOHAMA, Japan — As with any trade show featuring “embedded technology” anywhere in the world, the Embedded Technology 2018 Exhibition in Yokohama earlier this month got hijacked by today’s two hot topics: AI and IoT. 

On one hand, Japanese electronics heavyweights — most notably Fujitsu, NEC and Toshiba — showcased new materials and wireless technologies they deem critical to the spread of IoT applications.

On the other hand, this year’s Embedded Technology/IoT show trotted out a host of Japanese startups, including Ascent Robotics, LeapMind, Robit and others with an intense business and technology focus on AI. While these AI rookies in Japan are largely unknown to the world, their motivation — what they do, how they do it, and why they do it — is no different from startups everywhere. They are eager, ambitious and trained for fast decision making. They want to make AI useful and prove its effectiveness in real-world industrial and consumer applications.

Japanese startups tend to differ from startups elsewhere in their commitment to leverage Japan’s decades of experience in building robots and automobiles. They want to use their proximity to automated manufacturing sites and to experienced factory managers as a head start toward developing AI algorithms for industrial applications.

A common thread tying both incumbents and startups who gathered at the Embedded Technology show was their active promotion and development of “edge technology.” While Google, Facebook, Amazon and others in the United States may have already established a stronghold in areas like big data, data centers and deep learning, Japan’s hopes focus on making edge devices smarter, more connected and autonomous.

Exhibition show floor of Embedded Technology/IoT Technology in Yokohama this year (Photo: Japan Embedded Systems Technology Association)

In the following pages, EE Times shares highlights of the Embedded Technology show this year — what we spotted and what we learned.

Deep Learning Gets Embedded

Detecting cracks in concrete (Photo: EE Times)

Tech managers often talk about introducing deep learning into their businesses. But few discuss the real challenges of collecting data, implementing deep learning and embedding inferences on edge devices.

Thus far, the deep learning practices that have proven effective in the real world are limited to a handful of applications implemented by large, deep-pocketed corporations.

For LeapMind, a Tokyo-based startup, the mission is to make deep learning ubiquitous in edge devices. LeapMind helps clients by first asking what specific problems they want to solve, testing if deep learning is ideally suited to solve such issues, helping them build big data for deep learning, and finally designing AI implementations on embedded devices placed on the edge.

So far, LeapMind has applied deep learning to anomaly detection — to locate cracks, spots or rust zones — and to object detection where it can detect cars, traffic lights or traffic signs, etc. The company has also used deep learning in hazard prediction, using big data to foresee a possible malfunction.

Flying a drone along electric lines (Photo: NTT Data)

LeapMind has been working with NTT Data, which flies a drone to capture images of electric lines for anomaly detection. This job isn’t as simple as it seems, because electric lines tend to sag between poles.

But what if a drone, equipped with intelligence, can see and identify an electric line that doesn’t stay strictly straight? That’s what LeapMind is working on, using deep learning, with NTT Data.

FPGA is its underlying hardware

LeapMind’s underlying technologies include the low-cost, low-power Cyclone V FPGA (Intel is one of the investors in LeapMind) and a low-bit quantization technology (“Quantization enables LeapMind to use bit manipulation operations to accelerate neural networks,” according to the company). It is also developing “original neural network architectures that run optimally on LeapMind’s own target hardware.”
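
To see why low-bit quantization maps so well onto bit-manipulation hardware, consider the extreme 1-bit case. The sketch below is purely illustrative (it is not LeapMind’s actual implementation): with weights and activations quantized to +1/−1, a dot product collapses into an XNOR followed by a population count, operations that FPGAs execute very cheaply.

```python
def quantize_sign(vector):
    """Quantize a float vector to {+1, -1} and pack it as a bitmask (1 bit per element)."""
    bits = 0
    for i, v in enumerate(vector):
        if v >= 0:          # non-negative -> +1 -> bit set
            bits |= 1 << i
    return bits

def binary_dot(a_bits, b_bits, n):
    """Dot product of two packed {+1, -1} vectors of length n.
    XNOR counts matching signs; dot = matches - mismatches = 2*matches - n."""
    matches = bin(~(a_bits ^ b_bits) & ((1 << n) - 1)).count("1")
    return 2 * matches - n

a = [0.7, -1.2, 0.3, -0.5]   # signs: +, -, +, -
b = [1.1, 0.4, -0.2, -0.9]   # signs: +, +, -, -
n = len(a)
dot = binary_dot(quantize_sign(a), quantize_sign(b), n)  # = 0 here: 2 matches, 2 mismatches
```

Real systems such as LeapMind’s use wider (but still low) bit-widths and trained quantization schemes, but the hardware advantage shown here, replacing multiply-accumulate with bitwise logic, is the same.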

LeapMind’s deep learning architecture, based on the company’s quantization to lower bit-widths, has already proven to work well with limited memory while running at high speed.

Automated flow from TensorFlow to FPGA (Source: LeapMind)

To make this work, LeapMind had to develop a special training method for quantization, along with original networks designed to be highly compatible with FPGA devices. Its recently launched product, called “Blueoil,” is a software stack dedicated to neural networks, released as open source.

LeapMind claims that the new models, based on Blueoil, “can be trained easily by only preparing data.” The finished model “can be converted into a binary file that runs on FPGA or CPU devices with a single command.”

Printable sensors

NEC rolled out pressure-sensitive sheet sensors at the Embedded Technology show. The sensors are formed on top of thin-film transistors. The thin, light-weight and bendable sheet sensors are made by using printing technology.

Pressure-sensitive sheet sensors (Photo: EE Times)

The thin-film transistors detect the current value, which fluctuates in accordance with the pressure applied to each sensor element. The result is a mechanism that senses the distribution of pressure across the entire sheet.

The sheet sensors capture the pressure of a plastic bottle and a human hand. (Photo: EE Times)

The high-density sheet sensors pack 34,560 sensor cells per sheet, with sheet dimensions as large as 288 × 172.8 mm. Laid on shelves in a store, the sheet sensors can monitor any changes in the location and weight of the displayed goods.
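
The shelf-monitoring idea can be sketched in a few lines. The code below is a hypothetical illustration, not NEC’s implementation: it assumes a simple linear mapping from each cell’s current deviation to pressure, then flags cells whose pressure crosses a threshold, which is how displaced or removed goods would show up.

```python
def pressure_map(currents, baseline, gain=1.0):
    """Convert per-cell current deltas into pressure estimates.
    The linear current-to-pressure calibration (gain) is an assumption."""
    return [[gain * (c - b) for c, b in zip(row_c, row_b)]
            for row_c, row_b in zip(currents, baseline)]

def occupied_cells(pressures, threshold=0.5):
    """Return (row, col) of cells under load, e.g. goods sitting on a shelf."""
    return [(r, c)
            for r, row in enumerate(pressures)
            for c, p in enumerate(row) if p > threshold]

baseline = [[1.0, 1.0], [1.0, 1.0]]   # currents with an empty shelf
reading  = [[1.2, 1.0], [1.0, 2.5]]   # currents with an item at one corner
cells = occupied_cells(pressure_map(reading, baseline))  # -> [(1, 1)]
```

Comparing `occupied_cells` snapshots over time would reveal both location changes and, via the pressure magnitudes, weight changes of the displayed goods.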

AI startups to mine Japan’s cars & robotics

Ascent to develop self-driving algorithms (Photo: EE Times)

Ascent Robotics, founded in 2016, had to be one of the least likely startups in Japan at the time. Co-founded by a Canadian machine-learning expert, the Tokyo-based robotics/AI startup runs an English-only workspace populated almost exclusively by young non-Japanese programmers. The startup’s focus isn’t hardware; it is fixed on developing algorithms for “intelligent autonomous vehicles and next-generation industrial robotics solution.”

Building momentum for Ascent is what looks like the runaway success of Preferred Networks, a fellow Japanese AI startup valued at more than $2 billion. Ascent now calls itself Japan’s second-largest AI company, after Preferred Networks.

Ascent, however, is no copycat. Preferred Networks has Toyota Motor Co., with a more than $110 million outlay, as its lead investor. Fanuc, the world’s largest maker of industrial robots based in Japan, is also behind Preferred Networks, giving the startup access to the vast data generated by thousands of robots running on Fanuc’s factory lines.

In contrast, Ascent has thus far established no association with, or direct investment from, Japan’s automakers or robot manufacturers.

It’s all about simulations
Ascent is confident, however, that its complex simulators are exactly what Japan’s automotive and robotics industries are clamoring for. In Ascent’s view, such simulators, used to develop AI algorithms for deep reinforcement learning and generative models, will ultimately separate it from the pack.

Ascent’s own learning architecture, called “Atlas,” is where the company’s AI training simulations will run. Atlas will enable more “efficient and more intelligent training of AIs for a wide variety of tasks,” according to the company. Atlas is critical to substantially “reduce the need for handcrafted code, labeled training data and costly real-world testing,” the startup added.

Ascent’s vehicle loaded with sensors (Photo: EE Times)

Since last summer, Ascent has been running a fleet of four vehicles, each equipped with eight lidars, eight cameras, four millimeter-wave radars and one infrared sensor. Companies such as Waymo or Uber need their fleets to hit the streets in heavy volume so that their autonomous vehicles can learn how to drive. In contrast, Ascent uses its fleet simply to test and verify the algorithms developed in its simulators, according to the company.

Battery-free, flexible beacons can be attached anywhere

Pulsar Gum is flexible and bendable. (Photo: EE Times)

“Pulsar Gum,” developed by Fujitsu, is a solar-cell-powered beacon, measuring 19 × 72 × 3 mm, that can transmit ID and positioning information without batteries. The electronics are covered by silicone rubber (as seen below). The bendable Pulsar Gum is durable enough even for outdoor use. It can serve as a wearable device, attached as a beacon to a curved surface such as a helmet.

The electronics are sandwiched between silicone rubber layers. (Photo: EE Times)

As the Internet of Things (IoT) proliferates on factory floors, in building corridors and in distribution warehouses, so do beacons that can broadcast an identifier to nearby portable devices. Installing beacons in any work environment, though, requires a power source for each beacon, plus the time it takes to install them all.

Fujitsu’s battery-free beacons can make installation a snap.

Meanwhile, at the Embedded Technology show, Fujitsu, in partnership with Alps Electric and NTT Data, presented a new, improved version of Pulsar Gum.

The Alps/NTT Data team leveraged “backscattering” to develop a new ultralow-power wireless communication technology. The beacon, based on Pulsar Gum, can communicate with a host of tags in the field to gather data, while the new technology consumes only 1/1,000th of the power used by wireless technologies such as BLE, IEEE 802.15.4 and EnOcean.

The Alps/NTT tags need almost no power of their own because they communicate by reflecting the incoming radio signal back toward its source — the essence of backscattering. Communication distance is five to 10 meters.

The combination of battery-less beacons and ultralow-power wireless communication makes the pairing ideal for monitoring everything from aging infrastructure to manufacturing floors and goods in warehouses.

The backscattering-based wireless technology can collect data from 30 tags within 20 milliseconds. The technology also ensures secure communication between a beacon and a tag by enabling identification and encryption. Existing RFID technologies offer neither high-speed data collection nor secure wireless communication.
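
A quick back-of-the-envelope check puts that collection rate in perspective: 30 tags in 20 ms implies roughly a 0.67-ms time budget per tag and an aggregate rate of 1,500 tag reads per second, assuming the tags are polled sequentially (the actual channel-access scheme was not disclosed).

```python
tags = 30
window_ms = 20.0

slot_ms = window_ms / tags                    # per-tag time budget if polled one by one
tags_per_second = tags / (window_ms / 1000.0) # aggregate read rate

# slot_ms ~ 0.67 ms; tags_per_second == 1500.0
```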

Can AI make visual inspection easy?

Robit, a Japanese startup founded in 2014, takes pride in its abilities in fast prototyping, unique design, application development, embedded systems and manufacturing.

Robit’s first product was a consumer product called “Wake-up Call Alarm Curtain Opener Mornin’.” It connects to a phone by Bluetooth and then physically to the user’s curtain rods. By synchronizing an alarm with the sudden opening of the bedroom curtains, this robotic butler adds sunlight to the process of waking up.

Robit demonstrates its visual inspection equipment (Photo: EE Times)

While this seems to have nothing to do with AI or visual inspection, the idea of bringing AI to the inspection process hit Robit’s young engineering team when they were talking to manufacturing partners. Today’s manufacturing industries are suffering from a chronic shortage of skilled, experienced human inspectors. More important, automating the visual inspection process by using AI isn’t as easy as it’s cracked up to be.

On one hand, neither inspection equipment vendors nor robot manufacturers are accustomed to leveraging AI. On the other hand, AI suppliers have little to no experience in grasping how a manufacturing site is set up and how hardware is deployed under what sort of lighting conditions.

Robit’s AI-based visual algorithm detects anomalies on a surface (Photo: EE Times)

Robit rolled out a newly automated solution for visual inspection, called Tesray. The startup’s focus is on robots designed for visual inspection. For video processing, it applies Robit’s home-grown AI algorithms, specifically designed to detect the depth and width of nicks and chips on the surface of a product.

Japan Embedded Systems Technology Association (JASA)

What’s the point of attending the Embedded Technology show in Yokohama without bumping into an over-the-top Cosplay girl representing the Japan Embedded Systems Technology Association (JASA)?

She represents Japan Embedded Systems Technology Association (JASA). Really? (Photo: EE Times)

— Junko Yoshida, Global Co-Editor-In-Chief, AspenCore Media, Chief International Correspondent, EE Times
