Title:
Enhancing underwater image quality with the resnet method yields better color balance and fusion accuracy than with the region-based convolutional neural network algorithm.
Source:
AIP Conference Proceedings; 2026, Vol. 3345 Issue 1, p1-7, 7p
Database:
Complementary Index


This work aims to improve underwater image enhancement using Color Balance & Fusion. The primary data source was a dataset from Kaggle. The samples were divided into two groups of 20 each, for a total of 40. Group I used ResNet, while Group II employed the Region-Based Convolutional Neural Network. Python was used to compare speed and to compute statistics for the analysis, with ClinCalc.com as the statistical analysis tool. Alpha was set at 0.05 and beta at 0.2, and the statistical power (G-power) was set at 85%. The performance of ResNet was compared against the other algorithm with accuracy as the primary metric: ResNet achieved an accuracy of 94.507%, whereas the Region-Based Convolutional Neural Network achieved 87.521%. A two-tailed test yielded a significance value of less than 0.001 (p < 0.05). [ABSTRACT FROM AUTHOR]
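The abstract describes a two-group comparison (20 accuracy samples per algorithm, alpha = 0.05, two-tailed), which suggests an independent-samples t-test. The sketch below illustrates such a test in pure Python. The per-sample accuracy values are hypothetical placeholders constructed around the reported group means (94.507% and 87.521%); the paper's actual samples, and whether it used a pooled-variance test, are assumptions not stated in the abstract.

```python
import statistics

# Hypothetical per-sample accuracy scores (%) for two groups of 20.
# Only the approximate group means (ResNet ~94.5, R-CNN ~87.5) come from
# the abstract; the spread is invented for illustration.
resnet = [94.5 + 0.1 * ((i % 5) - 2) for i in range(20)]
rcnn = [87.5 + 0.1 * ((i % 5) - 2) for i in range(20)]

def pooled_t(a, b):
    """Independent-samples t statistic with pooled variance."""
    na, nb = len(a), len(b)
    mean_a, mean_b = statistics.fmean(a), statistics.fmean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    # Pooled variance across both groups (df = na + nb - 2).
    sp2 = ((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2)
    return (mean_a - mean_b) / (sp2 * (1 / na + 1 / nb)) ** 0.5

t = pooled_t(resnet, rcnn)
# Two-tailed critical value for df = 38 at alpha = 0.05 is about 2.024.
print(f"t = {t:.2f}, significant at alpha=0.05: {abs(t) > 2.024}")
```

With group means roughly 7 percentage points apart and small within-group variance, the t statistic far exceeds the critical value, consistent with the reported p < 0.001.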

Copyright of AIP Conference Proceedings is the property of American Institute of Physics and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)