
Real-Time Bi-Directional Traffic Counting: A Comparative Study on the Efficiency and Accuracy Trade-offs of YOLOv8 and Advanced Association Algorithms


DOI: 10.23977/jipta.2025.080120

Author(s)

Yuhong Li 1

Affiliation(s)

1 Shenzhen Wisdom Nebula AI Technology Co., Ltd., Xili Street, Nanshan District, Shenzhen, China

Corresponding Author

Yuhong Li

ABSTRACT

Vehicle tracking and counting are essential components of Intelligent Transportation Systems (ITS), yet achieving an optimal balance between high accuracy and real-time processing remains a critical challenge, especially in high-density aerial surveillance scenarios. This paper presents a robust framework for vehicle tracking and bi-directional counting, validated on a challenging 35-second aerial traffic segment. We systematically evaluate the You Only Look Once (YOLOv8) architecture, comparing the YOLOv8n and YOLOv8m variants to establish the trade-off between detection precision and inference speed. We further investigate three distinct tracking mechanisms: simple IOU-based tracking and two state-of-the-art association algorithms, BoT-SORT and ByteTrack. The study's core contribution is an optimized vector-based counting logic that significantly improves the robustness of bi-directional counting by minimizing false positives caused by trajectory jitter and occlusion. Experiments conducted on a MacBook Air M3 CPU demonstrate that the heavier YOLOv8m paired with ByteTrack achieved the highest counting accuracy, a perfect 100.0%, while YOLOv8n paired with ByteTrack offered the best trade-off for real-time applications, reaching 97.6% accuracy at 20.3 FPS. This work confirms that advanced, high-performance tracking is indispensable for high-accuracy counting and provides a practical benchmark for selecting efficient models and trackers for real-time aerial traffic video analytics.
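The pipeline described in the abstract (YOLOv8 detection, ByteTrack association, and vector-based bi-directional line crossing) can be approximated with the public Ultralytics YOLOv8 API. The sketch below is illustrative only and is not the paper's implementation: the video file name, counting-line coordinates, COCO class IDs, and the jitter margin are all assumptions introduced for the example.

```python
# Illustrative sketch only (not the authors' code): YOLOv8 detection + ByteTrack
# association via the Ultralytics API, with a vector-based bi-directional
# line-crossing counter. Line coordinates, class IDs, file name, and the
# jitter margin below are assumptions, not values from the paper.
import math

from ultralytics import YOLO

# Virtual counting line from point A to point B (pixel coordinates, assumed).
LINE_A, LINE_B = (100.0, 540.0), (1820.0, 540.0)
LINE_LEN = math.hypot(LINE_B[0] - LINE_A[0], LINE_B[1] - LINE_A[1])
MARGIN = 3.0  # px of hysteresis around the line to suppress trajectory jitter

def signed_dist(pt):
    """Signed perpendicular distance of pt from the counting line (2-D cross product)."""
    return ((LINE_B[0] - LINE_A[0]) * (pt[1] - LINE_A[1])
            - (LINE_B[1] - LINE_A[1]) * (pt[0] - LINE_A[0])) / LINE_LEN

model = YOLO("yolov8n.pt")       # swap in "yolov8m.pt" for the higher-accuracy variant
counts = {"up": 0, "down": 0}    # bi-directional totals
last_dist = {}                   # track_id -> last signed distance

# classes=[2, 3, 5, 7] keeps COCO car/motorcycle/bus/truck detections only.
for result in model.track("aerial_traffic.mp4", tracker="bytetrack.yaml",
                          persist=True, stream=True, classes=[2, 3, 5, 7]):
    if result.boxes.id is None:  # no confirmed tracks in this frame
        continue
    ids = result.boxes.id.int().tolist()
    for track_id, (x1, y1, x2, y2) in zip(ids, result.boxes.xyxy.tolist()):
        cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0   # box centre as the track point
        d = signed_dist((cx, cy))
        prev = last_dist.get(track_id)
        # Count only when the track clearly crosses the line (sign change plus
        # a small margin), so oscillation near the line is not double counted.
        if prev is not None and prev * d < 0 and abs(d) > MARGIN:
            counts["down" if d > 0 else "up"] += 1
        last_dist[track_id] = d

print(f"Counts per direction: {counts}")
```

Passing tracker="botsort.yaml" to the same call selects BoT-SORT; the simple IOU-based tracker compared in the paper is not part of the Ultralytics tracker configs and would need a custom association step.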

KEYWORDS

YOLOv8, Multi-Object Tracking, Real-Time Efficiency, Traffic Counting

CITE THIS PAPER

Yuhong Li, Real-Time Bi-Directional Traffic Counting: A Comparative Study on the Efficiency and Accuracy Trade-offs of YOLOv8 and Advanced Association Algorithms. Journal of Image Processing Theory and Applications (2025) Vol. 8: 170-177. DOI: http://dx.doi.org/10.23977/jipta.2025.080120.

REFERENCES

[1] Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (2016) You Only Look Once: Unified, Real-Time Object Detection. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 779-788.
[2] Wang, C. Y., Bochkovskiy, A., and Liao, H. Y. M. (2022) YOLOv7: Trainable Bag-of-Freebies Sets New State-of-the-Art for Real-Time Object Detectors. arXiv preprint arXiv:2207.02696.
[3] Bochinski, E., Eiselein, V., and Sikora, T. (2017) High-Speed Tracking-by-Detection without Using Image Information. International Conference on Advanced Video and Signal Based Surveillance (AVSS), 1-6.
[4] Gunjal, P. R., Gunjal, B. R., Shinde, H. A., Vanam, S. M. and Aher, S. S. (2018) Moving Object Tracking Using Kalman Filter. International Conference on Advances in Communication and Computing Technology (ICACCT), 544-547.
[5] Zhang, Y., Sun, P., Jiang, Y., Yu, D., Weng, F., Yuan, Z., Luo, P., Liu, W., and Wang, X. (2022) ByteTrack: Multi-Object Tracking by Associating Every Detection Box. European Conference on Computer Vision (ECCV), 1-21.
[6] Aharon, N., Orfaig, R., and Bobrovsky, B. Z. (2022) BoT-SORT: Robust Associations Multi-Pedestrian Tracking. arXiv preprint arXiv:2206.14651.
[7] Zhu, P., Wen, L., Du, D., Bian, X., Fan, H., Hu, Q., and Ling, H. (2018) VisDrone-DET2019: The Vision Meets Drone Object Detection in Image Challenge Results. International Conference on Computer Vision Workshops (ICCVW).


All published work is licensed under a Creative Commons Attribution 4.0 International License.
