Using image segmentation models to analyse high-resolution earth observation data: new tools to monitor disease risks in changing environments.
Background: In the near future, the incidence of mosquito-borne diseases may expand to new sites due to changes in temperature and rainfall patterns caused by climate change. There is therefore a need to use recent technological advances to improve vector surveillance methodologies. Unoccupied Aerial Vehicles (UAVs), often called drones, have been used to collect high-resolution imagery to map detailed information on mosquito habitats and to direct control measures to specific areas. Supervised classification approaches have been widely used to automatically detect vector habitats, but the manual data labelling required for model training limits their use in rapid responses. Open-source foundation models such as the Meta AI Segment Anything Model (SAM) can facilitate the manual digitization of high-resolution images. This pre-trained model can assist in extracting features of interest from a diverse range of images. Here, we evaluated the performance of SAM through the Samgeo package, a Python-based wrapper for geospatial data, as it has not previously been applied to analyse remote sensing images for epidemiological studies.
Results: We tested the identification of two land cover classes of interest, water bodies and human settlements, using UAV-acquired imagery from five malaria-endemic areas in Africa, South America, and Southeast Asia. We employed manually placed point prompts and text prompts associated with specific classes of interest to guide the image segmentation, and assessed performance across the different geographic contexts. Using point prompts, the average Dice coefficient was 0.67 for building segmentation and 0.73 for water bodies. Using text prompts, the highest Dice coefficient reached 0.72 for buildings and 0.70 for water bodies. However, performance depended strongly on the individual object, the landscape characteristics, and the chosen prompt words, and therefore varied considerably.
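The Dice coefficient used in these evaluations measures the overlap between a predicted segmentation mask and a reference mask, 2|A∩B| / (|A| + |B|), ranging from 0 (no overlap) to 1 (perfect agreement). A minimal NumPy sketch (the function name and the empty-mask convention are ours, not from the paper):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    if total == 0:
        return 1.0  # both masks empty: treated here as perfect agreement
    return 2.0 * intersection / total

# Example: two 2x3 masks sharing 2 of their 3 positive pixels each
pred = np.array([[1, 1, 0], [0, 1, 0]])
truth = np.array([[1, 0, 0], [0, 1, 1]])
score = dice_coefficient(pred, truth)  # 2*2 / (3+3) ≈ 0.667
```

A score of roughly 0.67–0.73, as reported above, thus corresponds to about two-thirds to three-quarters overlap between the model's mask and the manually digitized reference.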
Conclusions: Recent models such as SAM can assist vector control programmes in the manual digitization of imagery, quickly identifying key features when surveying an area of interest. However, precise segmentation still requires user-provided prompts and manual corrections. Further evaluations are necessary, especially for applications in rural areas.
(© 2024. The Author(s).)