The development of automated driving functions is among the most complex tasks that academia and industry have focused on in the recent decade. The advantages of automated driving are enabled by achievements at many levels, such as the development of sensors, object tracking algorithms, car2x communication, planning and decision making, and many others. Automated driving should contribute to a higher level of safety by reducing human driving errors, to reduced congestion and CO2 emissions by optimizing the traffic flow, and finally to lower stress and higher comfort for the vehicle occupants. It will also enable new business models in the area of mobility. These benefits are possible due to the broad development of new technologies and the subsequent tests required for validation at the system and sub-system levels.

This paper presents an open vehicle platform for the development and testing of automated driving functions and their applications. These range from driving functions at SAE level 2, in which the driver is present and must monitor the environment, to higher levels of automation such as automated valet parking. Another application field is teleoperated driving based on efficient car-to-infrastructure (c2i) communication. The platform is based on a production vehicle, a Renault Twizy, extended with a drive-by-wire system for actuation and control of the longitudinal and lateral dynamics; the low-level drive-by-wire controller is connected to the control devices via CAN bus. The open vehicle interface enables access to the actuators from different hardware (HW) and software (SW) configurations. Universal sensor mounts on the modified car body allow flexible modification of the sensor configuration: the vehicle can be equipped with a LIDAR and up to 8 cameras and can also be extended with radars, GNSS, and ultrasonic sensors.
The platform also contains an in-vehicle computer, which processes all environment detection data in real time and outputs the desired commands to the drive-by-wire control unit. The open software interface, in this case an open-source autonomous driving stack, provides predefined modules for perception, decision, and planning, and additionally allows users to implement their own modules. In the perception module, the LIDAR is responsible for localization against pre-recorded point cloud maps and for the detection and tracking of objects in the surroundings. It is also possible to perform sensor fusion with camera data for object recognition. With these data, the system plans the best trajectory and can avoid unexpected obstacles. Finally, the software outputs a command set of velocity, angular velocity, wheel angle, and curvature, which is transferred via CAN bus to the low-level drive-by-wire controller, which in turn sends the commands to the actuators.
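To illustrate the last step of this pipeline, the following sketch shows how such a command set could be packed into an 8-byte CAN payload before transmission to the low-level controller. The frame ID, fixed-point scaling factors, and byte layout are illustrative assumptions, not the platform's actual protocol; on the vehicle, a library such as python-can would then transmit the resulting frame on the bus.

```python
import struct

# Assumed arbitration ID for the drive-by-wire command frame (hypothetical).
CMD_FRAME_ID = 0x101

def pack_command(velocity_mps, yaw_rate_rps, wheel_angle_rad, curvature_1pm):
    """Pack the four high-level commands (velocity, angular velocity,
    wheel angle, curvature) into an 8-byte CAN payload as four signed
    16-bit fixed-point fields. Resolutions are assumptions."""
    raw = (
        round(velocity_mps * 100),     # 0.01 m/s resolution
        round(yaw_rate_rps * 1000),    # 0.001 rad/s resolution
        round(wheel_angle_rad * 1000), # 0.001 rad resolution
        round(curvature_1pm * 1000),   # 0.001 1/m resolution
    )
    return struct.pack(">4h", *raw)    # big-endian, 4 x int16 = 8 bytes

# Example: 2.5 m/s forward, 0.1 rad/s yaw rate, -0.05 rad wheel angle,
# 0.02 1/m curvature.
payload = pack_command(2.5, 0.1, -0.05, 0.02)
print(payload.hex())  # -> 00fa0064ffce0014
```

On the real platform, the equivalent of `can.Message(arbitration_id=CMD_FRAME_ID, data=payload)` would be sent cyclically, and the low-level controller would unpack the fields and drive the steering and traction actuators accordingly.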
Ing. Thiago de Borba, Technische Hochschule Ingolstadt, GERMANY
Prof. Ondrej Vaculin, Technische Hochschule Ingolstadt, GERMANY
Mr. Parth Patel, Technische Hochschule Ingolstadt, GERMANY