Analysis of Efficient Transmission Methods of Grid Maps for Intelligent Vehicles

Robin Dehler, Dominik Authaler, Aryan Thakur, Thomas Wodtko, Michael Buchholz

Abstract

Grid mapping is a fundamental approach to modeling the environment of intelligent vehicles or robots. Compared with object-based environment modeling, grid maps offer the distinct advantage of representing the environment without requiring any assumptions about objects, such as type or shape. For grid-map-based approaches, the environment is divided into cells, each containing information about its respective area, such as occupancy. This representation of the entire environment is crucial for achieving higher levels of autonomy. However, it has the drawback that modeling the scene at the cell level results in inherently large data sizes. Patched grid maps tackle this issue to a certain extent by adapting cell sizes in specific areas. Nevertheless, the data sizes of patched grid maps are still too large for novel distributed processing setups or vehicle-to-everything (V2X) applications. Our work builds on a patch-based grid-map approach and investigates the size problem from a communication perspective. To address this, we propose a patch-based communication pipeline that leverages existing compression algorithms to transmit grid-map data efficiently. We provide a comprehensive analysis of this pipeline for both intra-vehicle and V2X-based communication. The analysis is verified for these use cases with two real-world experiment setups. Finally, we summarize recommended guidelines for the efficient transmission of grid-map data in intelligent transportation systems.


Paper Structure

This paper contains 18 sections, 7 equations, 6 figures, 2 tables.

Figures (6)

  • Figure 1: V2X use case example for the presented approach. The analyzed compression step, including decompression, enables V2X applications for local grid maps (green and yellow). Then, the grid fusion of multiple grid maps from multiple CAVs can be done on the server. The fused grid (red) can then be returned to the CAVs for various applications. The wireless transmission is indicated with the dashed arrows.
  • Figure 2: Simple overview of patch-wise compression. Each patch, indicated with different colors, is compressed separately before the whole ROS 2 message is serialized. For reconstruction, the whole ROS 2 message is deserialized, and each patch is again decompressed separately.
  • Figure 3: Comparison of the summed time of (i) serialization, (ii) compression, (iii) decompression, and (iv) deserialization of the whole ROS 2 message to the patch-wise compression pipeline with (i) patch-wise compression, (ii) serialization, (iii) deserialization, and (iv) patch-wise decompression. Patch-wise compression is indicated with the subscript $P$.
  • Figure 4: Time and compressed size comparison of different acceleration parameters for LZ4 compression for both normal (left) and quantized (right) data.
  • Figure 5: Time and compressed size comparison of different compression level parameters for Zstd compression for normal (left) and quantized (right) data.
  • ...and 1 more figure
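
The patch-wise pipeline of Figure 2, compressing each patch independently before the whole message is serialized, and decompressing each patch after deserialization, can be sketched as below. This is a minimal illustration, not the authors' implementation: Python's built-in zlib stands in for the LZ4/Zstd codecs analyzed in the paper, raw byte strings stand in for patch cell data, and the function names are hypothetical.

```python
import zlib


def compress_patches(patches, level=6):
    # Compress each patch's cell data independently (cf. Figure 2);
    # zlib is used here as a stand-in for LZ4 or Zstd.
    return [zlib.compress(p, level) for p in patches]


def decompress_patches(blobs):
    # Reverse step after the received message has been deserialized.
    return [zlib.decompress(b) for b in blobs]


# Hypothetical patches: two uniform areas and one mixed-content area.
patches = [bytes(1024), bytes([255]) * 1024, bytes(range(256)) * 4]
blobs = compress_patches(patches)
assert decompress_patches(blobs) == patches
```

Compressing per patch lets uniform regions (e.g. fully free or fully occupied areas) shrink to a few bytes each, while only the mixed-content patches carry substantial payload.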