For self-driving car systems, one of the most basic and most challenging capabilities is detecting and classifying objects. A self-driving car must first be able to accurately assess its surrounding environment before it can safely adjust its driving behavior in response to traffic flow, road rules, or obstacles.
Today, the high-precision sensors in advanced driver assistance systems (ADAS) already help save lives on the road. These sensors span a range of technologies: cameras, lidar, radar, compute, and mapping.
Self-driving car platform
The sensors, hardware, and software provided by Intel and Mobileye give autonomous vehicles the ability to perceive their surroundings. Together they form the basic building blocks of a self-driving car: a suite of cameras, lidar, radar, and computing and mapping technologies.
Cameras
Twelve cameras are deployed around the vehicle for full 360-degree coverage. Eight of them support automated driving, while four short-range cameras handle near-field sensing for both automated driving and automated parking. The cameras are the highest-resolution sensors on the vehicle (hundreds of millions of samples per second), and they are the only sensors that can detect both shapes (vehicles, pedestrians, etc.) and textures (road markings, traffic sign text, traffic light colors, etc.). Advanced artificial intelligence and vision algorithms use these cameras to build a complete picture of the scene. This end-to-end capability, combined with the other sensor types, is the key to achieving "true redundancy."
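To see where the "hundreds of millions of samples per second" figure comes from, the back-of-the-envelope sketch below multiplies camera count by resolution and frame rate. The specific resolutions and frame rates are assumptions chosen for illustration, not published specifications.

```python
# Rough estimate of the camera suite's aggregate pixel sample rate.
# All resolutions and frame rates below are illustrative assumptions,
# not published Mobileye specifications.

CAMERAS = [
    # (count, horizontal px, vertical px, frames per second)
    (8, 1920, 1080, 30),   # assumed driving cameras
    (4, 1280, 960, 30),    # assumed short-range parking/near-field cameras
]

total_samples_per_second = sum(
    count * w * h * fps for count, w, h, fps in CAMERAS
)

print(f"Aggregate pixel samples per second: {total_samples_per_second:,}")
# With these assumed numbers the suite produces roughly 6.5e8 samples
# per second, i.e. "hundreds of millions of samples per second".
```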
Lidar
Six "sector" lidars will be deployed throughout the vehicle; three of them are located in the front and three are located in the rear. Lidar sensors can detect objects by measuring the reflection of laser pulses. The combination of lidar and radar can be used by the system to provide a completely independent source of shape detection. Lidar assists the camera system. Since we use a camera-centric approach, lidar only needs to be used in some specific situations-mainly for long-distance ranging and road contouring. Compared with lidar-centric systems, limiting the workload of lidar will help significantly reduce costs, simplify the manufacturing process and enable mass production.
Radar
Six radar units (a mix of short-range and long-range) provide 360-degree coverage around the car. Radar is a mature technology that detects objects and measures their speed from reflected radio waves. It is particularly sensitive to metallic objects and remains effective in severe weather. Together, radar and lidar form a completely independent object detection system; combined with the camera system, they deliver true redundancy.
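Radar recovers an object's speed from the Doppler shift of the returned wave. The sketch below shows that generic relationship; the 77 GHz carrier (a common automotive radar band) and the example shift are assumptions for illustration, not parameters of the hardware described here.

```python
# Generic Doppler-radar sketch: relative (radial) speed is recovered from
# the frequency shift of the reflected wave.

SPEED_OF_LIGHT_M_S = 299_792_458.0

def radial_speed(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial speed of the target in m/s; the factor of two accounts for
    the wave traveling to the target and back."""
    return doppler_shift_hz * SPEED_OF_LIGHT_M_S / (2.0 * carrier_hz)

# Example: a 77 GHz automotive radar observing a 10.3 kHz Doppler shift
# implies a closing speed of roughly 20 m/s (~72 km/h).
print(f"{radial_speed(10_300, 77e9):.1f} m/s")
```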
Computing: Intel® Atom and Mobileye EyeQ system-on-chip
The Mobileye EyeQ system-on-chip uses proprietary compute cores (accelerators) optimized for deep neural networks, computer vision, signal processing, and machine learning workloads. Integrating hardware and software on a single chip developed for a specific end use (ADAS and autonomous driving) makes its computing performance, power consumption, and cost far more competitive than the general-purpose chips offered by competitors. Mobileye's self-driving fleet is currently equipped with four EyeQ4 chips, roughly 10% of the computing power that will eventually be deployed in the production L4/L5 system.
The production version of the central autonomous-vehicle compute will pair an Intel® Atom-based chip with two Mobileye EyeQ5 systems-on-chip. Just as importantly, Intel is building a professional, powerful software development platform so that Mobileye can offer an open version of the EyeQ5, allowing customers to collaborate on sensor fusion and driving policy and to deploy their own code.
Roadbook
Roadbook is a high-definition map of driving path geometry and other static scene semantics, including lane markings, road boundaries, and traffic sign information. This semantic (i.e., non-shape) information provides true redundancy for the camera system. The map is unique because it is produced by crowdsourcing: a large fleet of cars equipped with a front-facing camera and a Mobileye EyeQ system-on-chip for their ADAS features act as data "collectors" for map making. These "non-self-driving" cars send low-bandwidth packets of roughly 10 KB per kilometer to the cloud, where the information is aggregated into high-definition maps used by higher-level (L2+ and above) automated vehicles. Thanks to this crowdsourced approach, map generation is very cheap, can be updated in near real time, and covers a wide range of roads.
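To illustrate why the crowdsourced approach is so cheap in bandwidth, the sketch below scales the roughly 10 KB-per-kilometer figure quoted above to a daily trip and a large fleet. The trip length and fleet size are assumptions chosen purely for illustration.

```python
# Back-of-the-envelope Roadbook upload volume.
# KB_PER_KM is the figure quoted in the text; trip length and fleet size
# are illustrative assumptions, not Mobileye data.

KB_PER_KM = 10          # collection packet size per kilometer driven
trip_km = 50            # assumed daily distance of one ADAS-equipped car
fleet_size = 1_000_000  # assumed number of contributing "collector" cars

per_car_per_day_kb = KB_PER_KM * trip_km
fleet_per_day_gb = per_car_per_day_kb * fleet_size / 1_000_000

print(f"Per car: {per_car_per_day_kb} KB/day")          # 500 KB/day
print(f"Fleet:   {fleet_per_day_gb:.0f} GB/day total")  # ~500 GB/day across 1M cars
```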