ETRI Develops Self-Driving Car Processor with Nine Brains
Single Chip to Process All Data Collected from Self-Driving Cars
The processor was named Aldebaran in 2016, after the giant star. Self-driving cars have largely relied on overseas technology for their core processors; the processor developed in Korea will now free the industry from that dependence.
Through their continued efforts at innovation, the researchers successfully increased the number of processor cores from four to nine this year. With more brains, the processor runs faster and produces larger, clearer images (1280 x 960).
The recognition function has been significantly improved as well. The processor now handles real-time UHD images while simultaneously recognizing pedestrians, vehicles, lanes, and other movements. The researchers have also successfully tested radar and GPS signal processing, and the technology will be extended to Lidar and ultrasonic sensors in the coming years. ETRI's achievement is characterized by its one-chip processor format: the researchers added camera image processing to the chip and improved the driver assistance system to enable motion recognition.
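The article does not disclose how Aldebaran's firmware schedules these recognition workloads across its nine cores, but the idea of fanning one time step out to parallel per-sensor tasks can be sketched in plain Python. All function names here (detect_pedestrians, process_radar, and so on) are hypothetical stand-ins for illustration, not ETRI APIs:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-sensor recognition tasks. The real firmware is not
# public; these stand-ins only illustrate how independent cores could
# process each sensor stream for the same time step in parallel.
def detect_pedestrians(frame): return {"task": "pedestrians", "frame": frame}
def detect_vehicles(frame):    return {"task": "vehicles", "frame": frame}
def detect_lanes(frame):       return {"task": "lanes", "frame": frame}
def process_radar(sweep):      return {"task": "radar", "frame": sweep}
def process_gps(fix):          return {"task": "gps", "frame": fix}

TASKS = [detect_pedestrians, detect_vehicles, detect_lanes,
         process_radar, process_gps]

def process_frame(frame_id, workers=5):
    """Fan one time step out to parallel workers, one per recognition task."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        outputs = list(pool.map(lambda task: task(frame_id), TASKS))
    return {out["task"]: out for out in outputs}

results = process_frame(frame_id=0)
print(sorted(results))  # every task produced a result for the frame
```

On a single chip the equivalent work would be scheduled across hardware cores rather than threads, but the fan-out/fan-in structure is the same.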
The newly developed chip also incorporates a black-box function that stores and plays back driving footage for vehicle security and accident evidence, supporting UHD-level resolution in compliance with the HEVC (High Efficiency Video Coding) standard. The researchers also increased the number of processor cores satisfying the international functional-safety standard ISO 26262 from two to four this year. This makes it much easier to run the separate software programs required for different functional-safety tasks, such as recognizing collision risks.
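A black-box recorder of this kind is typically a loop recorder: a fixed-capacity buffer that always holds only the most recent footage and is frozen as evidence when an event is detected. A minimal sketch, assuming HEVC encoding happens upstream in hardware and representing encoded frames as plain strings:

```python
from collections import deque

class DashcamBuffer:
    """Toy black-box recorder: keeps only the most recent `capacity`
    encoded frames, dashcam-style. HEVC encoding itself is assumed to
    happen upstream; this class only models the loop storage."""
    def __init__(self, capacity):
        self.frames = deque(maxlen=capacity)

    def record(self, frame):
        self.frames.append(frame)   # oldest frame is dropped automatically

    def save_event_clip(self):
        return list(self.frames)    # snapshot preserved as accident evidence

box = DashcamBuffer(capacity=3)
for t in range(5):
    box.record(f"frame-{t}")
clip = box.save_event_clip()
print(clip)  # only the three most recent frames survive
```

Capacity in a real device would be sized in seconds of encoded video rather than a frame count, but the drop-oldest behavior is the core of the feature.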
The processor-embedded chip satisfies the international error-prevention criteria at a level above 99%. In other words, the Aldebaran processor detects and resolves malfunctions of electronic devices, such as those causing sudden unintended acceleration, in 99% of cases. This represents revolutionary automotive semiconductor technology that lets a vehicle check itself for breakdowns. By handling image processing in a one-chip format, ETRI can also lower the cost of the chips. The chip measures 7.8 x 6.7 mm, smaller than a fingernail. It will be mounted on an ECU (Electronic Control Unit) board (10 cm x 10 cm) and then embedded in the vehicle console after packaging.
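The article does not say which mechanism performs this self-checking; ISO 26262-class designs often rely on redundancy, for example running the same computation on two cores in lockstep and flagging any divergence as a fault. A generic illustration of that idea (lockstep_check and throttle are hypothetical names, not Aldebaran functions):

```python
def lockstep_check(compute, inputs, faulty_copy=None):
    """Generic dual-redundancy self-check: run the same computation twice
    and flag any divergence as a fault. `faulty_copy` lets the example
    simulate a corrupted replica."""
    primary = compute(inputs)
    shadow = (faulty_copy or compute)(inputs)
    return {"value": primary, "fault_detected": primary != shadow}

# Hypothetical actuator command: map pedal position to a throttle value.
throttle = lambda pedal: min(100, pedal * 2)

ok = lockstep_check(throttle, 30)
bad = lockstep_check(throttle, 30, faulty_copy=lambda pedal: pedal * 3)
print(ok["fault_detected"], bad["fault_detected"])  # False True
```

In hardware the comparison happens cycle by cycle in dedicated logic rather than in software, and a detected fault would trigger a safe-state response instead of a flag.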
Outstanding Competitive Power and Technical Skills
The research team explained that, compared with using separate cores to pre-process data from each of the sensors in a self-driving car, the one-chip processor remarkably improves efficiency.
This technology will be used, among other applications, for ADAS (Advanced Driver Assistance System), which requires a great deal of image processing, and for conditional self-driving (Level 3). The research team plans to supply semiconductor technology particularly needed for high-end vehicles. ETRI announced that the Aldebaran chip offers world-leading performance and notable price competitiveness compared with global competitors' modules built from separate chips. The researchers say their next move will be to use neural network technology to develop a chip incorporating an image-recognition engine and high-performance AI (Artificial Intelligence) technology.
ETRI has completed a design that performs image-recognition intelligence in real time and at low power, aiming to develop an application processor for various information devices in the era of AI. The institution added that the research team will introduce an AI processor by next year with an image-recognition engine that performs 100 times better than the current level. “Our goal is to develop technology that accurately recognizes all moving objects, just as we humans do,” explains Dr. Yeong-su Kwon of ETRI’s Processor Research Group. “We will soon be able to develop a chip that can set a destination through machine-human conversation and autonomously navigate the route.”