The EU-funded project delivers an embedded vision reference platform comprising a full development kit and real-world use cases.
A three-year European Union (EU) funded project has concluded with the delivery of a comprehensive reference platform for vision-based embedded system designers. It will enable those engineers to balance the design constraints of low power, low latency, high performance and real-time image processing.
The Tulipp (Towards Ubiquitous Low-power Image Processing Platforms) Consortium, which received €4 million (approximately US $4.5 million) from the EU’s Horizon 2020 research and innovation program, began in January 2016 to develop high-performance, energy-efficient systems for complex, vision-based image processing applications.
The resulting reference platform includes a full development kit, comprising an FPGA-based embedded multicore computing board, a parallel real-time operating system and a development tool chain with guidelines, coupled with ‘real-world’ use cases focusing on medical X-ray imaging, driver assistance and autonomous drones with obstacle avoidance. The complete Tulipp ecosystem was demonstrated earlier in the year to vision-based system designers in a series of hands-on tutorials.
Philippe Millet of Thales, the Tulipp project coordinator, said, “By taking a diverse range of application domains as the basis for defining a common reference processing platform that captures the commonality of real-time, high-performance image processing and vision applications, it has successfully addressed the fundamental challenges facing today’s embedded vision-based system designers.”
Developed by Sundance Multiprocessor Technology, the Tulipp processing platform is 40mm x 50mm and is compliant with the PC/104 embedded processor board standard. The hardware platform utilizes the multicore Xilinx Zynq UltraScale+ MPSoC, which contains, along with the Xilinx FinFET+ FPGA fabric, an ARM Cortex-A53 quad-core CPU, an ARM Mali-400 MP2 graphics processing unit (GPU), and a real-time processing unit (RPU) containing a dual-core ARM Cortex-R5 32-bit real-time processor based on the ARMv7-R architecture. A separate expansion module (VITA 57.1 FMC) allows application-specific boards with different input and output interfaces to be created while keeping the interfaces with the processing module consistent.
Coupled with the Tulipp hardware platform is a parallel, low-latency embedded real-time operating system developed by Hipperos specifically to manage complex multi-threaded embedded applications in a predictable manner.
Additionally, in order to help designers understand the impact of their functional mapping and scheduling choices on the available resources, the Tulipp reference platform has been extended with performance analysis and power measurement features developed by Norges Teknisk-Naturvitenskapelige Universitet (NTNU) and Technische Universität Dresden (TUD) and implemented in the Tulipp STHEM toolchain.
The insights of the Tulipp Consortium have been captured in a set of guidelines, consisting of practical advice, best-practice approaches and recommended implementation methods, to help vision-based system designers select the optimal implementation strategy for their own applications. These will become a Tulipp book, to be published by Springer by the end of 2019 and supported by endorsements from the growing ecosystem of developers currently testing the concept.
Use Case Demos: Medical, ADAS and UAVs
Tulipp also developed three ‘real-world’ use cases in medical X-ray imaging, automotive advanced driver assistance systems (ADAS) and unmanned aerial vehicles (UAVs).
The medical X-ray imaging use case demonstrates advanced image enhancement algorithms for X-ray images running at high frame rates. It focuses on improving the performance of mobile X-ray imaging C-arms, which provide an internal view of a patient’s body in real-time during the course of an operation: this increases surgeon efficiency and accuracy with minimal incision sizes, aids faster patient recovery, lowers nosocomial disease risks and reduces by 75% the radiation doses to which patients and staff are exposed.
ADAS adoption is dependent on the implementation of vision systems, or on combinations of vision and radar, and the algorithms must be capable of integration into a small, energy-efficient electronic control unit (ECU). An ADAS algorithm should be able to process a video image stream with a frame size of 640×480 at the full 30Hz camera rate, or at least at half that rate. The Tulipp ADAS use case demonstrates real-time pedestrian recognition based on the Viola-Jones algorithm. Using the Tulipp reference platform, it achieves a processing time of 66ms per frame, which means the algorithm reaches the target of running on every second image when the camera runs at 30Hz.
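The arithmetic behind that half-rate target can be sketched as follows. The 30Hz camera rate and 66ms processing time come from the use case above; the helper function itself is purely illustrative.

```python
import math

def frames_per_run(camera_hz: float, processing_ms: float) -> int:
    """How many camera frames elapse during one processing run
    (1 = every frame is processed, 2 = every second frame, ...)."""
    frame_period_ms = 1000.0 / camera_hz  # 30 Hz -> ~33.3 ms per frame
    return math.ceil(processing_ms / frame_period_ms)

# Tulipp ADAS figures: 30 Hz camera, 66 ms per frame for
# Viola-Jones pedestrian detection on the reference platform.
skip = frames_per_run(30.0, 66.0)
print(skip)               # 2 -> the detector runs on every second image
print(30.0 / skip)        # 15.0 -> effective detection rate in Hz
```

A 66ms run spans two 33.3ms frame periods, so the detector meets the half-rate (15Hz) target but not the full 30Hz rate.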
For UAVs, Tulipp demonstrates a real-time obstacle avoidance system based on a stereo camera setup with the cameras orientated in the direction of flight. Although drones are often described as autonomous, most current systems are still remotely piloted by humans. This use case uses disparity maps, computed from the camera images, to locate obstacles in the flight path and to automatically steer the UAV around them — a key step towards fully autonomous drones.
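How a disparity map translates into obstacle distance can be illustrated with the standard stereo triangulation relation depth = f·B/d. The focal length and baseline values below are hypothetical, not figures from the Tulipp drone.

```python
# Stereo triangulation: depth_m = focal_length_px * baseline_m / disparity_px.
# Nearby obstacles produce large disparities; distant ones, small disparities.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float,
                         baseline_m: float) -> float:
    """Distance to a scene point given its disparity between the two cameras."""
    if disparity_px <= 0:
        return float("inf")  # zero disparity: point at (effectively) infinity
    return focal_length_px * baseline_m / disparity_px

# Hypothetical rig: 700 px focal length, 12 cm baseline.
F, B = 700.0, 0.12
for d in (84.0, 28.0, 8.4):
    print(f"disparity {d:5.1f} px -> {depth_from_disparity(d, F, B):.1f} m")
# -> 1.0 m, 3.0 m and 10.0 m respectively
```

Thresholding the nearest depths in the flight-path region of the disparity map is what lets the system decide when to steer around an obstacle.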
Thales’ Philippe Millet added, “As image processing and vision applications grow in complexity and diversity, and become increasingly embedded by their very nature, vision-based system designers need to know that they can simply and easily solve the design constraint challenges of low power, low latency, high performance and reliable real-time image processing that face them. The EU’s Tulipp project has delivered just that. Tulipp will truly leave a legacy.”
The reference platform will be sold by Sundance, with the hardware also shared on CERN’s open hardware repository, and the tools freely available on GitHub. The Tulipp consortium members are: Thales, Efficient Innovation SAS, Fraunhofer IOSB, Hipperos, Norges Teknisk-Naturvitenskapelige Universitet, Technische Universität Dresden, Sundance Multiprocessor Technology and Synective Labs.