Lightweight traffic sign detection algorithm with noise suppression and semantic enhancement.
Existing traffic sign detection algorithms, when deployed on edge computing devices, often face high computational overhead and insufficient detection capability for small or deformed targets. To address these limitations, this paper proposes LNSE-YOLO, a lightweight traffic sign detection algorithm based on YOLOv11 that incorporates novel noise suppression and semantic enhancement techniques. First, building on the YOLOv11n architecture, we design a feature fusion module named NSSE (Noise Suppression and Semantic Enhancement) to optimize multi-scale feature representation and effectively mitigate background interference. Next, an Edge-Driven Feature Enhancement Network (ED-FEN) is integrated into the backbone and a Local Deformable Attention (LDA) module is incorporated into the detection head; these additions strengthen the model's perception of small-target edges and improve its robustness to geometric distortions, yielding a high-precision model, NSE-YOLO. Finally, to reduce model size, a channel pruning strategy based on BatchNorm scaling factors is applied to compress NSE-YOLO, producing the final LNSE-YOLO model. Experimental results on the TT100K and CCTSDB datasets show that, compared to the baseline YOLOv11n, the high-precision NSE-YOLO model improves mAP@50 by 5.5 and 2.5 percentage points, respectively. The pruned LNSE-YOLO model retains this accuracy advantage while substantially reducing parameters at a computational cost comparable to the baseline. These results validate the effectiveness and practical utility of the proposed method.
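The channel pruning strategy mentioned above follows the widely used "network slimming" idea, in which each BatchNorm channel's learned scaling factor γ is treated as an importance score. The abstract does not give the authors' exact selection rule, so the following is only a generic sketch of that technique; the function name `select_channels` and the single global prune ratio are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def select_channels(gammas, prune_ratio):
    """Rank BatchNorm scaling factors by magnitude and return a boolean
    keep-mask: the channels with the smallest |gamma| are pruned.
    (Generic network-slimming sketch; not the paper's exact procedure.)"""
    gammas = np.asarray(gammas, dtype=float)
    n_prune = int(len(gammas) * prune_ratio)
    # Indices of channels ordered by increasing |gamma|
    order = np.argsort(np.abs(gammas))
    keep = np.ones(len(gammas), dtype=bool)
    keep[order[:n_prune]] = False  # drop the least important channels
    return keep

# Example: 8 channels, prune the half with the smallest |gamma|
gammas = [0.9, 0.01, 0.5, 0.02, 0.7, 0.03, 0.4, 0.05]
mask = select_channels(gammas, 0.5)
# mask → [True, False, True, False, True, False, True, False]
```

In practice, such a mask would be used to slice the corresponding convolution and BatchNorm weights layer by layer, after which the compact network is fine-tuned to recover accuracy.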
(Copyright: © 2026 Wang et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.)
The authors have declared that no competing interests exist.