Multimodal fusion of UAV-based computer vision and plant water content dynamics for high-throughput soybean maturity classification

Yaxin Li, Tingting Li, Yanqiang Zhao, Kunpeng Jiang, Yuying Ye, Shuai Wang, Zhipeng Zhou, Qiaorong Wei, Rongsheng Zhu, Qingshan Chen, Limin Hu, Mingliang Yang, Le Xu

Crop and Environment, 2025, Vol. 4, Issue 4: 241-255. DOI: 10.1016/j.crope.2025.07.001

Research article

Abstract

Soybean is the most important oilseed and forage crop globally. Advances in high-throughput phenotyping technology are critical for accelerating genetic improvement in modern breeding research. However, conventional methods for assessing soybean maturity remain labor-intensive. This study developed high-throughput phenotyping algorithms based on unmanned aerial vehicle (UAV) multispectral imagery combined with machine learning to monitor the maturation of 30 soybean cultivars in large-scale breeding trials. UAV images and plant water content (PWC) data were collected to classify soybean maturity into four distinct phases: immaturity (i.e., the period before the R5 stage), late pod filling (i.e., R5 to R6), physiological maturity (i.e., R7), and harvesting maturity (i.e., R8). We evaluated the performance of three classification approaches: (1) a computer vision model using UAV-derived color features, (2) a PWC-based model retrieving PWC dynamics from UAV-derived features, and (3) a multimodal fusion model integrating computer vision and PWC dynamics. The computer vision model effectively distinguished immature from mature plants but had limited ability to resolve specific maturity phases owing to genetic variation in canopy color among cultivars (training set accuracy: 0.71; validation set accuracy: 0.69). The sensitive UAV-derived features were used to build a convolutional neural network model for predicting PWC, which achieved the highest agreement between predicted and measured PWC (training set: R² = 0.95; validation set: R² = 0.86). The PWC-based algorithm outperformed the computer vision approach, achieving higher classification accuracy (training set: 0.78; validation set: 0.79). Strong correlations of PWC with pod, stem, and leaf water contents underscored the physiological relevance of PWC for tracking maturation dynamics. Classification accuracy improved further with the multimodal fusion model (training set: 0.84; validation set: 0.83), which combined the information from computer vision and PWC dynamics. The multimodal fusion model also achieved the lowest misclassification rate in the validation analysis across diverse soybean cultivars. These findings emphasize the potential of integrating UAV-based computer vision and PWC features to improve the accuracy and efficiency of soybean maturity classification. The proposed multimodal approach offers a robust framework for phenotypic selection and trait evaluation, providing valuable insights for soybean breeding programs.
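The fusion idea summarized in the abstract — a canopy-color signal breaking ties that PWC dynamics alone cannot resolve, and vice versa — can be caricatured in a few lines. This is a hypothetical sketch, not the paper's trained model: the feature names (`greenness`, `predicted_pwc`), the thresholds, and the rule structure are illustrative assumptions standing in for the learned classifier that combines UAV color features with CNN-predicted PWC.

```python
# Illustrative sketch of multimodal maturity classification.
# Assumptions (not from the paper): a single canopy "greenness" score in
# [0, 1] from UAV color features, and a CNN-predicted plant water content
# (PWC) expressed as a fraction of fresh weight. PWC declines steadily as
# the crop matures; greenness helps separate visually similar late phases.

PHASES = [
    "immaturity",             # before R5
    "late_pod_filling",       # R5 to R6
    "physiological_maturity", # R7
    "harvesting_maturity",    # R8
]

def classify_maturity(greenness: float, predicted_pwc: float) -> str:
    """Toy fused rule: both modalities vote on one of the four phases."""
    if predicted_pwc > 0.70 and greenness > 0.5:
        return "immaturity"
    if predicted_pwc > 0.55:
        return "late_pod_filling"
    # A still-green canopy or moderate PWC suggests R7 rather than R8.
    if predicted_pwc > 0.30 or greenness > 0.2:
        return "physiological_maturity"
    return "harvesting_maturity"
```

In the paper itself the fusion is learned from data rather than hand-thresholded; the sketch only shows why combining the two feature streams resolves phases that either one alone misclassifies.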

Keywords

High-throughput phenotyping / Maturity classification / Multimodal fusion / Plant water content / Soybean / UAV

Cite this article

Yaxin Li, Tingting Li, Yanqiang Zhao, Kunpeng Jiang, Yuying Ye, Shuai Wang, Zhipeng Zhou, Qiaorong Wei, Rongsheng Zhu, Qingshan Chen, Limin Hu, Mingliang Yang, Le Xu. Multimodal fusion of UAV-based computer vision and plant water content dynamics for high-throughput soybean maturity classification. Crop and Environment, 2025, 4(4): 241-255. DOI: 10.1016/j.crope.2025.07.001


