Object-level data-driven sensor simulation for automotive environment perception
The gradual evolution of automated driving and ADAS functions demands increasingly capable environment perception. The key to reliable environment perception is large amounts of data, which are hard to collect. Several simulators provide realistic raw sensor data based on physical sensor models. However, besides their high price, they also require very high computational capacity. Furthermore, most sensor suppliers provide high-level data, such as object detections, which is complicated to reproduce from simulated raw sensor data. This paper proposes a method that directly simulates the detections or object tracks provided by smart sensors. The model incorporates several sensor uncertainties, such as missed detections, false detections, and measurement noise. In contrast to conventional sensor models, this method employs a state-dependent clutter model and accounts for the field of view in the detection model. The parameters of the proposed model are identified for an automotive smart radar and a camera based on pre-evaluated real-world measurements. The resulting model provides synthetic object-level data with higher fidelity than conventional probabilistic models, differing by less than 2% from the precision and recall metrics of the actual sensors.
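The ingredients named above (field-of-view gating, missed detections, measurement noise, and clutter) can be illustrated with a minimal sketch. This is not the paper's identified model: all function and parameter names are hypothetical, and the clutter here uses a simplified constant Poisson rate rather than the state-dependent clutter model the paper proposes.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_detections(objects, fov_range=100.0, fov_half_angle=np.deg2rad(30.0),
                        p_detect=0.9, noise_std=0.5, clutter_rate=1.0):
    """Object-level sensor sketch: FoV gating, missed detections,
    additive Gaussian position noise, and Poisson-distributed clutter.
    `objects` is a list of (x, y) ground-truth positions in the sensor frame."""
    detections = []
    for x, y in objects:
        r, phi = np.hypot(x, y), np.arctan2(y, x)
        # Field-of-view check: objects outside the FoV produce no detection
        if r > fov_range or abs(phi) > fov_half_angle:
            continue
        # Missed detection with probability 1 - p_detect
        if rng.random() > p_detect:
            continue
        # Additive Gaussian measurement noise on the reported position
        detections.append((x + rng.normal(0.0, noise_std),
                           y + rng.normal(0.0, noise_std)))
    # Clutter: Poisson number of false detections, uniform over the FoV sector
    for _ in range(rng.poisson(clutter_rate)):
        r = fov_range * np.sqrt(rng.random())  # area-uniform radius
        phi = rng.uniform(-fov_half_angle, fov_half_angle)
        detections.append((r * np.cos(phi), r * np.sin(phi)))
    return detections
```

In the paper's setting, parameters such as `p_detect`, `noise_std`, and the clutter model would be identified from pre-evaluated real-world radar and camera measurements rather than chosen by hand, and the detection probability and clutter intensity would depend on the object state (e.g. position within the field of view).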