
    Deciphering the contributions of spectral and structural data to wheat yield estimation from proximal sensing

2022-10-12
The Crop Journal, 2022, Issue 5

Qing Li, Shichao Jin b,*, Jingrong Zang, Xiao Wang, Zhuangzhuang Sun, Ziyu Li, Shan Xu, Qin Ma, Yanjun Su, Qinghua Guo, Dong Jiang *

a Plant Phenomics Research Centre, Academy for Advanced Interdisciplinary Studies, Regional Technique Innovation Center for Wheat Production, Ministry of Agriculture, Key Laboratory of Crop Physiology and Ecology in Southern China, Ministry of Agriculture, Collaborative Innovation Centre for Modern Crop Production co-sponsored by Province and Ministry, Jiangsu Key Laboratory for Information Agriculture, Nanjing Agricultural University, Nanjing 210095, Jiangsu, China

b Jiangsu Provincial Key Laboratory of Geographic Information Science and Technology, International Institute for Earth System Sciences, Nanjing University, Nanjing 210023, Jiangsu, China

c Department of Forestry, Mississippi State University, Mississippi State, MS 39759, USA

d State Key Laboratory of Vegetation and Environmental Change, Institute of Botany, Chinese Academy of Sciences, Beijing 100093, China

e Department of Ecology, College of Environmental Sciences, and Key Laboratory of Earth Surface Processes of the Ministry of Education, Peking University, Beijing 100871, China

Keywords: LiDAR, Multispectral, Yield, Phenotype, Hyper-temporal

ABSTRACT Accurate, efficient, and timely yield estimation is critical for crop variety breeding and management optimization. However, the contributions of proximal sensing data characteristics (spectral, temporal, and spatial) to yield estimation have not been systematically evaluated. We collected long-term, hyper-temporal, and large-volume light detection and ranging (LiDAR) and multispectral data to (i) identify the best machine learning method and prediction stage for wheat yield estimation, (ii) characterize the contribution of multi-source data fusion and the dynamic importance of structural and spectral traits to yield estimation, and (iii) elucidate the contribution of time-series data fusion and 3D spatial information to yield estimation. Wheat yield could be accurately (R2 = 0.891) and timely (approximately two months before harvest) estimated from fused LiDAR and multispectral data. The artificial neural network model and the flowering stage were always the best method and prediction stage, respectively. Spectral traits (such as CIgreen) dominated yield estimation, especially in the early stages, whereas the contribution of structural traits (such as height) was more stable in the late stages. Fusing spectral and structural traits increased estimation accuracy at all growth stages. Better yield estimation was realized from traits derived from complete 3D points than from canopy surface points, and from integrated multi-stage data (especially from the jointing to heading and flowering stages) than from single-stage data. We suggest that this study offers a novel perspective on deciphering the contributions of spectral, structural, and time-series information to wheat yield estimation and can guide accurate, efficient, and timely estimation of wheat yield.

    1.Introduction

Wheat is one of the main staple food crops maintaining food supply and security around the world [1]. Accurate, efficient, and timely prediction of wheat yield is critical for variety breeding and management optimization and requires an understanding of wheat yield estimation methods.

Methods for yield assessment comprise mainly direct measurement and indirect estimation. Conventional direct measurement consists of harvesting, threshing, and weighing, which is labor-intensive and time-consuming. Indirect methods are based mainly on prediction models, especially machine learning models, employing phenotypic traits (such as height and biomass) provided by remote sensing data [2,3]. For example, national-level wheat yield in the United States was successfully estimated (R2 = 0.76) from vegetation indices derived from MODIS satellite imagery at 500 m spatial resolution using machine learning models [4]. Similar studies [5,6] have demonstrated the feasibility of national-scale yield estimation for policy management. More recent efforts have focused on estimating yield in plot-scale field phenotyping studies using higher-resolution proximal sensing data.

Proximal sensing refers to the acquisition of information about an object using sensors mounted on near-surface platforms. Compared with contact measurement and remote observation, proximal sensing provides data with higher spatial, temporal, and spectral resolution. Maimaitijiang et al. [7] estimated soybean yield (R2 = 0.72) from optical (RGB, multispectral, and thermal) imagery collected by an unmanned aerial vehicle (UAV) platform at the beginning pod stage. Similar studies have achieved accurate yield estimation from fine-spatial-resolution optical images at one or several critical stages [8,9], but they did not exploit higher-temporal-resolution (weekly or daily) data that might increase yield estimation accuracy. They also rarely used light detection and ranging (LiDAR), whose rich three-dimensional (3D) spatial information can increase estimation accuracy [10].

Compared to popular UAV-borne systems, emerging gantry-based platforms, such as FieldScan [11] and Crop3D [12], permit multi-source data collection at a daily or even hourly frequency. This type of platform can (1) collect high-precision data using high-resolution sensors mounted at close range; (2) simultaneously collect multi-sensor data, benefiting from its high loading capacity; and (3) observe repeatedly if equipped with a stable and long-term power supply. Although these gantry-based systems permit continuous monitoring of crop growth status [13] and can exploit time-series phenotypic data for yield estimation, their capabilities have not been fully assessed.

Besides the use of time-series data, the application of multi-source data to yield estimation is also gaining increasing attention. Multi-source data provide rich information, such as structural and spectral information, that can be closely associated with plant morphological and physiological status [7,14]. Canopy structure information derived from LiDAR or from 3D points reconstructed by digital photogrammetry has been used for estimating biomass [15] and grain yield [16]. 3D points can be used to accurately extract yield-related structural traits (such as volume and height) [17], but points alone cannot completely capture plant physiological status. In contrast, canopy spectral information from multispectral image-derived vegetation indices has been used to estimate grain yield for various crops [8,18], but these indices often suffer from the saturation effect [19] and are sensitive to light conditions [20]. Fusion of canopy spectral and structural information has the potential to improve crop yield prediction accuracy [7,21]. However, current data fusion studies have focused on fusing different sources of optical imagery, and the structural information they use is mainly canopy height estimated from sparse point clouds reconstructed from digital images [16]. Despite great interest in fusing optical images and LiDAR data, few studies have applied them to yield estimation, let alone evaluated their contributions to yield estimation.

Machine learning has higher applicability than deep learning in non-big-data contexts and, in particular, higher interpretability [22], making it a favorable choice for yield estimation and data interpretation [23]. Previous studies [21,24] have compared several widely used models, including random forest (RF), artificial neural network (ANN), support vector machine (SVM), and stepwise multiple regression (SMR), but no single model is always the best [25]. Two recent studies [7,21] tried to evaluate the influence of data volume and feature importance on machine learning models, but they used mainly optical images with limited time coverage.

Whether hyper-temporal LiDAR and multispectral data are applicable to selecting the optimal model and prediction stage and to deciphering the contributions of multiple data features merits systematic investigation. The present study aimed to elucidate the contributions of various factors to yield estimation from hyper-temporal LiDAR and multispectral data. The objectives included (i) finding the best method and growth stage for yield estimation by comparing estimation accuracies of four common machine learning models throughout the growth period, (ii) investigating the contribution of spectral and structural data fusion and their dynamic contributions to yield estimation, and (iii) identifying the advantages of time-series data fusion and 3D information for yield estimation.

    2.Data and methods

    2.1.Study area and experimental design

The study area was located at the Baima Experimental Station (119°10′37′′E, 31°37′09′′N) of Nanjing Agricultural University, Nanjing, Jiangsu Province, China. The experimental field has an area of approximately 2000 m2, which was divided into two replications (Fig. 1). Two nitrogen (N) treatments were imposed on each replication: no N fertilizer (0 kg ha-1, N deficiency) and normal N fertilizer (240 kg ha-1, control group). The control group was supplied with half of the N as urea (46% N content) before sowing and the other half during the jointing stage. The soil nutrient content (in the 0-25 cm soil layer) was measured before sowing. The total N and alkali-hydrolyzable N of the soil were 0.67 g kg-1 and 66.03 mg kg-1, indicating that the soil was nitrogen-deficient and suitable for the designed nitrogen treatments. Phosphate fertilizer (P2O5, 12%, 120 kg ha-1) and potassium chloride (K2O, 60%, 120 kg ha-1) were applied as basal fertilizers before sowing.

Under each N treatment, 120 wheat varieties with diverse yield characteristics were selected and planted. The plot size for each variety was 1 × 1 m, with a row spacing of 0.25 m and a density of 300 seeds m-2. Both N treatment and variety selection were used to generate different levels of above-ground biomass (AGB) and yield. The 480 plots (Fig. 1A) were manually planted in November 2019 and harvested in May 2020.

    2.2.Data collection

    2.2.1.Field measurement

To collect ground-truth AGB and yield at the plot level, all plants in each plot were sampled manually at the maturity stage. Because the sample size was large (480 plots), fresh samples were first dried at 105 °C for 30 min for storage. All samples were then dried at 65 °C to constant weight, which was recorded as AGB. Kernels were manually threshed from the wheat ears of each plot for dry weight measurement, representing yield. The total time for the measurement of AGB and yield was approximately two months. The harvest index of each variety was calculated as the grain yield divided by the corresponding AGB.

    2.2.2.LiDAR and multispectral data

LiDAR and multispectral imagery were collected four times per day using the FieldScan platform [11]. The four fixed data collection periods were 6:00-11:00, 11:00-17:00, 17:00-23:00, and 23:00-05:00. The platform was equipped with two sets of equipment, one for each experimental replication. Each set contained two commercial high-resolution 3D laser scanners and two multispectral sensors with artificial light sources (PlantEye F500, Phenospex, Heerlen, the Netherlands) (Fig. 1B).

Fig. 1. Study area and data collection. (A) The experimental layout. (B) The FieldScan system equipped with both dual-sensor LiDAR and multispectral sensors. (C) A condensed view of LiDAR data of 480 plots collected at the grain-filling stage. Colors from blue to red represent height from low to high. (D) A zoomed-in view of the LiDAR point clouds in a 1 m × 1 m plot.

The laser sensor emits a single-line pulse in the near-infrared (NIR, 940 nm) region of the light spectrum at plants. Distance and direction between the sensor and target are recorded and transformed into coordinates in point-cloud format. Sunlight and background noise points are automatically filtered out by optical and algorithm-based sunlight filters during data collection [11]. The acquired point resolutions in the X, Y, and Z directions are approximately 0.8, 0.8, and 0.2 mm, respectively. A condensed view of the LiDAR data of the 480 plots is shown in Fig. 1C, and a side view of the LiDAR data of one plot is presented in Fig. 1D.

The multispectral sensor is a line-scan camera that contains red (R, 620-645 nm), green (G, 530-540 nm), blue (B, 460-485 nm), and NIR (720-750 nm) bands, which were radiometrically calibrated by the manufacturer. We further verified the sensor reliability by calculating the correlation between the reference RGB values of a standard color card and the corresponding values captured by the PlantEye sensor (Fig. S1). The correlations in the R, G, and B bands were 0.951, 0.909, and 0.934, respectively. The line-scan camera works simultaneously with the laser scanner. Plants were scanned four times near-daily from the tillering stage (107 days after seeding, DAS) to harvest (191 DAS). Excluding equipment interruptions (such as power failures), the total data collection time was 55 days. The total size of the LiDAR and multispectral data was 2 TB. To reduce the influence of wind and sunlight conditions, data collected at night under artificial light sources were selected for further analysis.

    2.3.Methods

    2.3.1.Data preprocessing

The collected multispectral and point cloud data were fused using a web-based tool, HortControl (Phenospex, Heerlen, the Netherlands). After data fusion, the 3D points contain not only their basic geometric attributes (XYZ coordinates) but also spectral attributes (R, G, B, and NIR) (Fig. 2A). The fused point clouds were preprocessed by registration, filtering, denoising, and normalization. First, the fused point cloud data collected by the two laser scanners over a plot were registered according to the sensors' relative positions as well as point features using HortControl. The registered points were filtered to obtain vegetation-only points using a similar-height threshold method [12] widely used in terrestrial LiDAR data filtering. After filtering, points were denoised using a statistical outlier removal method implemented in the Point Cloud Library (https://pointclouds.org/). Clipping was not performed because the collected data were saved by plot according to the barcode set in the field. Finally, the denoised points of each plot were height-normalized by subtracting from each point's height the height of its horizontally nearest ground point. The spectral values of each band were normalized to the interval 0-255. The normalized data were used to extract plant structural and spectral traits.
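The height-normalization and band-scaling steps can be sketched as follows. This is a minimal illustration with NumPy/SciPy; `normalize_plot` is a hypothetical helper, since the paper performs these steps inside HortControl and the Point Cloud Library rather than in user code:

```python
import numpy as np
from scipy.spatial import cKDTree

def normalize_plot(points, ground_points, spectra):
    """Sketch of plot-level normalization (hypothetical helper).

    points: (N, 3) XYZ of denoised vegetation points
    ground_points: (M, 3) XYZ of ground returns
    spectra: (N, 4) raw R, G, B, NIR values per point
    """
    # Height normalization: subtract the z of the horizontally nearest ground point.
    tree = cKDTree(ground_points[:, :2])
    _, idx = tree.query(points[:, :2])
    normalized = points.copy()
    normalized[:, 2] -= ground_points[idx, 2]

    # Min-max scale each spectral band to the interval 0-255.
    lo, hi = spectra.min(axis=0), spectra.max(axis=0)
    scaled = 255.0 * (spectra - lo) / np.maximum(hi - lo, 1e-9)
    return normalized, scaled
```

The KD-tree lookup keeps the nearest-ground search O(N log M), which matters at the 2 TB data volumes reported above.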

Fig. 2. A workflow diagram of (A) data collection, (B) phenotype extraction, (C) model building and evaluation, and (D) model analysis.

    2.3.2.Phenotypic trait extraction

2.3.2.1. Structural traits. For each plot, 27 structural traits were extracted from the 3D point cloud (Fig. 2B): maximum height (Hmax), mean height (Hmean), height quantiles (from the 99%-quantile height, H99, to the 80%-quantile height, H80, in 1% steps), canopy cover (CC), projected leaf area (PLA), plant area index (PAI), 3D profile index (3DPI), and volume. Hmax is defined as the maximum z value of all normalized points. Hmean is the mean z value of all normalized points. CC is the proportion of canopy per unit area. PLA and PAI are highly correlated with plant growth and yield, owing to their good representation of leaf area [13]. 3DPI and volume have been used for AGB estimation [12,26]. Specific methods for calculating structural traits can be found in our previous studies [12,13].
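The height-based subset of these traits follows directly from the definitions above and can be sketched as (a minimal NumPy example; `height_traits` is an illustrative helper, not the authors' implementation):

```python
import numpy as np

def height_traits(z):
    """Height traits from the normalized z values of one plot:
    Hmax, Hmean, and the 1%-step quantile heights H80..H99."""
    traits = {"Hmax": float(z.max()), "Hmean": float(z.mean())}
    for q in range(80, 100):                 # H80, H81, ..., H99
        traits[f"H{q}"] = float(np.percentile(z, q))
    return traits
```

This yields 22 of the 27 structural traits; CC, PLA, PAI, 3DPI, and volume need the full 3D geometry described in [12,13].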

2.3.2.2. Spectral traits. In addition to structural traits, 14 vegetation indices that have been widely used in recent decades were selected (Table S1) and calculated from the R, G, B, and NIR spectral values of the point cloud. Most of these vegetation indices are highly correlated with plant pigment content, such as the green chlorophyll index (CIgreen) [27], chlorophyll vegetation index (CVI) [28], and anthocyanin reflectance index (ARI) [29]. Some describe both plant function and structure, such as the normalized difference vegetation index (NDVI) [30], near-infrared reflectance of vegetation (NIRv) [31], and plant senescence reflectance index (PSRI) [32]. These vegetation indices have shown great potential for AGB estimation and may also contribute to grain yield prediction [7].
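Two of the 14 indices can be computed from the per-point band values as follows, using the standard published formulas (the full list and exact formulas used in the study are in Table S1, which is not reproduced here):

```python
import numpy as np

def vegetation_indices(r, g, nir):
    """Plot-mean NDVI and CIgreen from per-point band values.
    NDVI = (NIR - R) / (NIR + R); CIgreen = NIR / Green - 1 (standard forms)."""
    ndvi = (nir - r) / (nir + r)
    ci_green = nir / g - 1.0
    return {"NDVI": float(np.mean(ndvi)), "CIgreen": float(np.mean(ci_green))}
```

Averaging per-point index values over the plot is one plausible aggregation; the paper does not state whether indices are computed per point or from plot-mean reflectances.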

    2.4.Model building

Four machine learning models (Fig. 2C), ANN, RF, SMR, and SVM, were fitted using the phenotypic traits at each time point. In each model, the phenotypic traits were normalized and used as independent variables, and the yield (or AGB) was used as the dependent variable.

ANN is a popular approach for nonlinear feature learning and is composed of an input layer, hidden layer(s), and an output layer [33]. The number of hidden layers was set to 1 according to our previous study [12] estimating AGB from phenotypic traits. Each layer contains a different number of neurons. The neuron numbers of the input and output layers were determined by the numbers of independent predictors (phenotypic traits) and dependent variables (yield or AGB). The neuron number of the hidden layer is uncertain but important and is usually optimized by a trial-and-error method [34]. To find the hidden-layer size giving the best model prediction accuracy, we tested neuron numbers from 1 to 20 in steps of 1. For the best model, the connection weight analysis method [35] was used to evaluate trait importance. Building the ANN model and evaluating trait importance were conducted with the R neuralnet package.
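The trial-and-error search over hidden-layer sizes can be sketched as below. The paper uses the R neuralnet package; this sketch substitutes scikit-learn's `MLPRegressor` and synthetic data standing in for the 480 plots × 41 traits, so it illustrates the procedure rather than reproducing the study:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Synthetic stand-in for 480 plots x 41 traits (the real data are not public).
X, y = make_regression(n_samples=480, n_features=41, noise=10, random_state=0)
y = (y - y.mean()) / y.std()          # traits/targets were normalized in the paper
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

best_n, best_r2 = 0, -np.inf
for n in range(1, 21):                # trial-and-error over 1-20 hidden neurons
    ann = MLPRegressor(hidden_layer_sizes=(n,), max_iter=500, random_state=0)
    r2 = ann.fit(X_tr, y_tr).score(X_te, y_te)
    if r2 > best_r2:
        best_n, best_r2 = n, r2
```

A single hidden layer is kept fixed, matching the design choice taken from [12]; only its width is searched.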

RF is also a nonlinear statistical learning method featuring ensemble learning of decision trees and has been widely used in the remote sensing and plant science communities [36] for classification and regression owing to its lower risk of overfitting. It is well known [37] that ntree and mtry are two vital parameters that influence RF model performance. ntree defines the number of trees in the model, and mtry is the number of variables randomly sampled as candidates at each splitting node. In the present study, a grid search was adopted to select the combination of ntree and mtry leading to the maximum prediction accuracy [12]. ntree was varied from 100 to 1000 in steps of 100 and combined with mtry from 1 to the number of independent variables in steps of 1 [36]. For the selected best model, trait importance was also evaluated following Hutengs and Vohland [38]. RF model building and trait importance analysis were performed with the R randomForest package.
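The ntree/mtry grid search maps directly onto scikit-learn's `n_estimators`/`max_features`. The paper uses R's randomForest; the sketch below shows the same idea on synthetic data, with a deliberately reduced grid (the full grid is 10 ntree values × up to 41 mtry values):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=200, n_features=10, noise=5, random_state=0)
# The paper scans ntree = 100..1000 (step 100) and mtry = 1..n_traits (step 1);
# a reduced grid keeps this sketch fast.
param_grid = {"n_estimators": [100, 300],        # ntree
              "max_features": [3, 6, 10]}        # mtry
search = GridSearchCV(RandomForestRegressor(random_state=0), param_grid,
                      cv=3, scoring="r2")
search.fit(X, y)
best_ntree = search.best_params_["n_estimators"]
best_mtry = search.best_params_["max_features"]
```

`GridSearchCV` adds cross-validation on top of the plain grid scan described in the paper, which is a reasonable way to avoid tuning to one particular split.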

SMR is a simple but effective method for multiple linear regression that automatically selects independent variables according to their influence on the dependent variable using the Akaike information criterion (AIC) and F tests. This study adopted the backward elimination method, which fits all variables in the first step and then iteratively eliminates the variables whose removal reduces the AIC value [39]. The process was implemented using the R stepwise package.

SVM is a supervised machine learning method based on computational learning theory that can achieve optimal prediction accuracy while taking into account model complexity [40]. This study used SVM with a linear kernel for yield estimation, inspired by a previous study [41]. Two parameters that influence model performance, the loss function parameter ε and the error penalty factor C, were adjusted by a grid search. ε was varied from 0.05 to 0.5 in steps of 0.05 and combined with C from 0.001 to 100 in multiplicative steps of 10. Other model parameters were left at their defaults, as in the previous study [41]. The best combination of ε and C was determined according to the best prediction accuracy. The process was implemented using the sklearn package in Python 3.7.
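Since the paper implements this step in sklearn, the grid described above translates almost literally into a `GridSearchCV` over `SVR` (synthetic data substitutes for the real traits):

```python
from sklearn.datasets import make_regression
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

X, y = make_regression(n_samples=200, n_features=10, noise=5, random_state=0)
y = (y - y.mean()) / y.std()                        # work on normalized targets

param_grid = {
    "epsilon": [round(0.05 * k, 2) for k in range(1, 11)],  # 0.05 .. 0.5, step 0.05
    "C": [0.001, 0.01, 0.1, 1, 10, 100],                    # multiplicative steps of 10
}
search = GridSearchCV(SVR(kernel="linear"), param_grid, cv=3, scoring="r2")
search.fit(X, y)
```

Note that the ε range only makes sense on normalized targets, consistent with the trait normalization described in Section 2.4.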

    2.5.Model accuracy evaluation

The phenotypic traits of all 480 plots at each growth stage were combined with the final yield (or AGB) for building the abovementioned prediction models. At each time point, 70% of the data was randomly selected for training and the remaining 30% for testing. The coefficient of determination (R2) and root mean square error (RMSE) were selected as model evaluation criteria to select the best prediction model and prediction time, which were further used to estimate the contributions of various factors to yield estimation.
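The evaluation loop at one time point can be sketched as below (scikit-learn, with synthetic data and an RF model standing in for whichever of the four models is being scored):

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for 480 plots x 41 traits at one time point.
X, y = make_regression(n_samples=480, n_features=41, noise=10, random_state=0)
# 70/30 random split at each time point, as in the paper.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestRegressor(random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
r2 = r2_score(y_te, pred)                        # coefficient of determination
rmse = mean_squared_error(y_te, pred) ** 0.5     # root mean square error
```

Repeating this per model and per observation day produces the accuracy-versus-DAS curves shown in Fig. 3.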

    2.6.Model analysis

The model analysis included two major aspects: feasibility analysis and contribution analysis (Fig. 2D). Model feasibility analysis aims to determine whether wheat yield can be estimated from phenotypic traits using machine learning models, and whether the estimated yield is comparable to that derived from the aboveground biomass (hereafter biomass). Four models (ANN, RF, SVM, SMR) using both structural and spectral traits during the entire growth season were fitted, and the model yielding the highest prediction accuracy (R2) was selected. Based on this model, the accuracy of direct estimation was compared with that of indirect estimation, i.e., multiplying the predicted biomass by the harvest index.

Based on the best prediction model, contribution analysis was designed to estimate the contributions of hyper-temporal data, multi-source data, multi-stage combinations, and 3D information to yield estimation (Fig. 2D). First, to evaluate the contribution of hyper-temporal data, we determined the prediction accuracy of the best model over the entire growth period. Comparison of the model results for each day throughout the growth period revealed the best prediction time. Second, the best model was built with multi-source data (both spectral and structural traits) or single-source data to evaluate the contributions of the two data sources. Third, to evaluate the contribution of combining multi-stage data to yield prediction, the best prediction data (with the highest yield estimation accuracy) from the tillering, jointing, booting, heading, flowering, early filling, middle filling, and maturity stages were selected and combined for modeling. The specific DAS corresponding to the growth stages were as follows: 107-124 DAS, tillering (T); 125-137 DAS, jointing (J); 138-146 DAS, booting (B); 147-153 DAS, heading (H); 154-163 DAS, flowering (FW); 164-189 DAS, filling (F); 190-191 DAS, maturity (M). Comparing the prediction accuracy of models fitted with combinations of multiple consecutive stages and with single-stage data revealed the contribution of multi-stage data. Finally, to demonstrate the influence of 3D information on yield estimation, we extracted canopy surface point clouds from the full plant point cloud data using a grid sampling method with a pixel size of 0.1 m, sufficient to sample hundreds of points for analysis. Models fitted with traits derived from the canopy surface points and from the complete 3D points were compared to reveal the contribution of 3D information to model accuracy.
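The canopy-surface extraction step can be sketched as keeping the highest return per 0.1 m horizontal grid cell. This is one plausible reading of the grid sampling method, since the paper does not spell out the per-cell rule; `canopy_surface` is an illustrative helper:

```python
import numpy as np

def canopy_surface(points, cell=0.1):
    """Keep the highest point in each horizontal grid cell of size `cell` (m).
    One reading of the paper's grid-sampling step; the exact rule is unstated."""
    ij = np.floor(points[:, :2] / cell).astype(int)
    surface = {}
    for k, key in enumerate(map(tuple, ij)):
        if key not in surface or points[k, 2] > points[surface[key], 2]:
            surface[key] = k
    return points[sorted(surface.values())]
```

For a 1 × 1 m plot this yields at most 100 surface points per plot, discarding the within-canopy returns whose value Section 3.4 then quantifies.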

    3.Results

    3.1.Contribution of hyper-temporal data to selection of the best yield prediction model and stage

All models showed similar temporal patterns, with prediction accuracy increasing with DAS (Fig. 3). Prediction accuracy increased fastest during the tillering stage and reached a local maximum (R2 of 0.866 in ANN) in the middle of the jointing stage, a value close to the prediction accuracy at the maturity stage. Prediction accuracy improved steadily from the jointing to the flowering stage, reaching the global maximum (R2 = 0.891) at the flowering stage. Accuracy decreased slightly from the flowering to the maturity stage (Fig. 3).

The best time for yield estimation was approximately 154 DAS, at the flowering stage, and the best model was ANN (R2 = 0.891), followed by RF (R2 = 0.877), SVM (R2 = 0.871), and SMR (R2 = 0.839) (Fig. S2). The ANN model built with the data at 154 DAS was used for further analysis.

Fig. 3. Wheat yield prediction accuracies indicated by (A) R2 and (B) RMSE, using ANN, RF, SVM, and SMR models based on all phenotypic traits derived from fused LiDAR and multispectral data. Dots are prediction accuracies (R2 and RMSE), and curves are fitted to the dots using a Savitzky-Golay filter. T, J, B, H, FW, F, and M represent the tillering, jointing, booting, heading, flowering, filling, and maturity stages, respectively.

    3.2.Contribution of spectral and structural traits to wheat yield estimation

Spectral traits showed high accuracy for wheat yield estimation in the ANN model, with R2 ranging from 0.684 to 0.876 (Fig. 4). In contrast, structural traits (height and canopy cover) resulted in slightly lower estimation accuracy, with R2 varying from 0.418 to 0.819. The fusion of spectral and structural traits increased the accuracy throughout the entire growing season, with R2 varying from 0.697 to 0.891. The contribution of data fusion was estimated at the best prediction time (154 DAS), when the predicted R2 using structural traits, spectral traits, and all traits was 0.819, 0.876, and 0.891, respectively (Fig. S3). The contribution of data fusion was more pronounced at the maturity stage than at the best prediction time (154 DAS) (Fig. 4).

Trait importances during the entire growth period were dissected, and their dynamic changes were normalized using min-max normalization at each prediction time, as shown in Fig. 5. Among the 41 input variables, spectral traits (upper part of Fig. 5) showed dominant importance, especially CIgreen, ARI, and SIPI. These spectral traits were important throughout almost the entire growth period. Two structural traits, Hmean and volume, were also important, but mainly during the tillering and jointing stages (107-134 DAS).
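The per-time-point min-max normalization used for Fig. 5 is the standard rescaling to [0, 1], applied to the vector of 41 importances at each prediction time:

```python
import numpy as np

def minmax(importances):
    """Min-max normalize the trait importances at one prediction time,
    so the most important trait maps to 1 and the least to 0."""
    v = np.asarray(importances, dtype=float)
    return (v - v.min()) / (v.max() - v.min())
```

This makes importance patterns comparable across days even though the raw importance scales differ between model fits.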

Trait importance was estimated using the best model (ANN) at the best prediction time (154 DAS). The 20 most important traits are ranked in Fig. S4. For the ANN model using both spectral and structural traits (Fig. S4A), the first four variables were spectral traits: CIgreen (3.87%), MCARI (2.48%), CVI (2.09%), and NCPI (1.41%), followed by the two main structural traits: H85 (1.29%) and volume (1.25%). For the spectral-traits-only model (Fig. S4B), the four most important traits were again CIgreen, CVI, MCARI, and NCPI. For the structural-traits-only model (Fig. S4C), canopy cover and volume appeared along with the most important height traits (H85 and Hmean).

    3.3.The contribution to yield prediction of combining multi-stage data

By comparing the accuracies of models built with single-stage data and with combinations of multiple consecutive stages (Table 1), the most suitable consecutive stage combination for yield estimation was identified. There was no overfitting problem, because the testing accuracy was similar to the training accuracy even for the highest-dimensional combination (tillering-maturity) (Table S2).

Fig. 4. Wheat yield prediction accuracies ((A) R2 and (B) RMSE) using the best-selected ANN model built with either multispectral data (spectral traits), LiDAR data (structural traits), or both LiDAR and multispectral data (all traits). Dots are prediction accuracies (R2 and RMSE), and curves are fitted to the dots using a Savitzky-Golay filter. T, J, B, H, FW, F, and M represent the tillering, jointing, booting, heading, flowering, filling, and maturity stages, respectively.

Fig. 5. Normalized trait importance dynamics using the ANN model with fused LiDAR and multispectral data. T, J, B, H, FW, F, and M represent the tillering, jointing, booting, heading, flowering, filling, and maturity stages, respectively.

Almost all combinations of consecutive growth stages led to marked improvement in prediction accuracy compared with using only single-stage data (Table 1). The benefits of data combination were more pronounced for structural traits than for spectral traits or all traits. Among the all-trait models, the highest prediction accuracy was achieved with the combination of the jointing, booting, and heading stages (jointing-heading), with an R2 of 0.904. Among the spectral-trait-only models, the highest prediction accuracy was achieved by the combination of the jointing, booting, heading, and flowering stages (jointing-flowering), with an R2 of 0.900. Among the structural-trait models, the prediction accuracy of the combination of all eight stages was highest, with an R2 of 0.857.

Table 1. Yield prediction accuracies of models built with spectral, structural, or both structural and spectral traits (all traits) from a single stage or a combination of multiple consecutive stages (e.g., tillering-maturity represents the data combination from the tillering to the maturity stage, including tillering, jointing, booting, heading, flowering, early filling, middle filling, and maturity).

    3.4.Contribution of 3D canopy traits to wheat yield estimation

To evaluate the contribution of 3D information to wheat yield estimation, we compared the prediction accuracy of ANN using all the phenotypic traits extracted from the complete 3D points or from the canopy surface points (Fig. 6). Although the temporal pattern of model prediction accuracies using canopy surface points was similar to that using complete 3D points, the accuracies were overall lower during the whole growth period.

When canopy surface points were used, the accuracy of yield estimation using both spectral and structural traits was relatively high, with R2 ranging from 0.304 to 0.770. The accuracy using only spectral traits was slightly lower, with R2 varying from 0.175 to 0.748, and that using only structural traits was lowest, with R2 varying from 0.156 to 0.706. The maximum prediction accuracies (R2) using all traits, only structural traits, and only spectral traits from canopy surface points were 0.121, 0.162, and 0.170 lower, respectively, than those from complete 3D points.

    4.Discussion

    4.1.Contribution of hyper-temporal data to selection of the best yield prediction model and stage

In comparison with previous studies that focused on biomass estimation [12,20,42] or further combined biomass with harvest index to achieve yield estimation [43,44], this study found that the estimation accuracy patterns of the models (Fig. S5) using different traits (Fig. S6), as well as the trait importance analysis (Fig. S7), for AGB were similar to the yield prediction results, implying that wheat yield can be accurately and timely estimated from fused LiDAR and multispectral data. The yield estimation accuracy (R2 = 0.891, Fig. 7A) was similar to that of the method (R2 = 0.895, Fig. 7C) of multiplying the predicted biomass (R2 = 0.875, Fig. 7B) by the harvest index.

Fig. 6. Wheat yield prediction accuracies ((A) R2 and (B) RMSE) using the best-selected ANN model built with either multispectral data (spectral traits), LiDAR data (structural traits), or both LiDAR and multispectral data (all traits) from the complete points (solid lines) or canopy surface points (CP_, dashed lines). Dots are prediction accuracies (R2 and RMSE), and curves are fitted to the dots using a Savitzky-Golay filter. T, J, B, H, FW, F, and M represent the tillering, jointing, booting, heading, flowering, filling, and maturity stages, respectively.

Fig. 7. (A) Correlation between measured yield and directly predicted yield, (B) correlation between measured AGB and predicted AGB, and (C) correlation between measured yield and yield predicted by multiplying predicted AGB by harvest index (HI). These results were derived from the ANN model built with both LiDAR and multispectral data collected at 154 DAS. The solid line represents the fitted line and the dashed line the 1:1 line.

All models constructed from the 41 extracted spectral and structural traits achieved satisfactory accuracy (R2 > 0.839) for yield estimation (Fig. S2). The best model was ANN, followed by RF, SVM, and SMR. These findings are consistent with those of previous studies suggesting that model performance corresponds to model complexity. ANN performed better in a multi-trait scenario and outperformed RF and SVM [21], and RF performed better than SVM and SMR [24]. However, in another study, ANN performed more poorly than SMR and RF [24], perhaps owing to insufficient training data for the ANN [12]. In the present study, the gantry-based system provided unprecedented long-term, hyper-temporal (near-daily, 55 days of observation over the 84-day growing season), and large-volume LiDAR and multispectral data. The use of multi-source and large-volume data may explain why all the models achieved high accuracy and why the data-driven ANN was the best model.

    The finding that the best prediction time was 154 DAS, in the flowering stage, is consistent with agronomic experience and with previous conclusions that the accuracy of yield prediction is time-dependent [45-47]. Plant structural and biochemical traits such as height and chlorophyll content have reached nearly their maximum levels and remain stable at the flowering stage [25,48]. Zhou et al. [8] found that booting was the optimal stage when using vegetation indices for grain yield prediction. Zhang et al. [45] found that the heading stage made the highest contribution to yield estimation, followed by the grain-filling and jointing stages. The mid-filling stage was the best stage for predicting yield when the green ratio vegetation index (GRVI) was used [47]. In another study [45] using vegetation indices, the best prediction time was around initial grain filling. The best prediction stage found in this study (the flowering stage) was thus broadly in agreement with previous studies but a few days earlier than the grain-filling stage reported by most of them. The higher frequency of data observations used in this study may have permitted predicting yield at an earlier stage. The earlier prediction may also be attributed to the use of 3D canopy traits, such as plant height and PAI, which reached nearly their maximum values at the flowering stage but were rarely considered in previous studies [49].

    4.2. Contribution of spectral and structural traits to wheat yield estimation

    Previous studies have used spectral traits (vegetation indices) as predictors for crop yield estimation [7,8]. With the development of measuring methods for structural traits, such as structure from motion (SfM), simple structural traits (such as plant height) have also been used for yield estimation. However, few studies have systematically compared the contributions of structural and spectral traits to yield estimation. In the present study, spectral traits outperformed structural traits for yield estimation during the entire growth period, especially at the tillering stage. The reason may be that structural differences among plots with different yield levels are not yet expressed during the vegetative growth stage, whereas spectral traits can distinguish plot differences earlier [50]. However, structural traits show relatively small changes compared with spectral traits in the late stages of crop growth, making prediction accuracy based on structural traits more stable after the flowering stage.

    Because structural and spectral data have unique and additive values at different growth stages, the fusion of multi-source data is conducive to achieving higher accuracy than single-source data [16,51]. Although the overall improvement from data fusion relative to using only spectral traits was slight, it was marked in the early and late growth stages. In contrast, data fusion markedly improved yield estimation accuracy relative to using only structural traits.

    Identifying the contributions of individual traits is important for improving yield estimation accuracy. Among all spectral traits, CIgreen, MCARI, CVI, and NCPI were identified as the most important predictors for yield estimation (Fig. S4A) at 154 DAS. A possible reason is that all the identified key spectral traits are closely correlated with leaf chlorophyll content [36,52], which is an effective indicator of plant photosynthetic efficiency and is ultimately highly correlated with yield [53]. Among all structural traits, height-related traits (such as H85 and Hmean) were the most important, followed by canopy cover and volume, in agreement with previous findings in biomass estimation [12]. In this study, the trait importance pattern was similar for biomass and yield estimation (Figs. S4, S7).
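    For concreteness, two of the key traits named above can be computed as follows. The (NIR/Green - 1) form of CIgreen and the 85th-percentile reading of H85 follow common usage (e.g., Gitelson's green chlorophyll index); the band values and definitions are assumptions, not the paper's exact formulations:

```python
import numpy as np

def ci_green(nir, green):
    """Green chlorophyll index: CIgreen = NIR / Green - 1 (Gitelson et al.)."""
    return np.asarray(nir, dtype=float) / np.asarray(green, dtype=float) - 1.0

def height_traits(z, ground_z=0.0):
    """Height summaries from LiDAR point elevations (z) above a ground level."""
    h = np.asarray(z, dtype=float) - ground_z
    return {"Hmean": h.mean(), "H85": np.percentile(h, 85)}

print(ci_green(0.45, 0.09))                       # reflectances -> CIgreen of about 4.0
print(height_traits([0.2, 0.5, 0.9, 1.0, 1.1]))   # per-plot height traits
```

    In practice both functions would be applied plot by plot, with `ground_z` taken from a terrain model rather than assumed zero.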

    In addition to the analysis of trait importance for yield estimation, we answered (Fig. 5) a rarely studied question: how does trait importance change during the entire growth period? Based on the trait importance analysis of the best ANN model built with the fused hyper-temporal LiDAR and multispectral data, we found that spectral traits dominated yield estimation throughout the entire period, and structural traits were important mainly in the early stages. A possible reason is that plant structure changes greatly in the early stages, whereas the spectrum is sensitive to changes in both internal plant physiology and external morphology throughout the growth period [45]. Although structural trait importance was relatively low after the heading stage, it was more stable than that of the spectral traits (Fig. 5). A possible reason is that canopy structure changes only slightly after the flowering stage, whereas canopy spectral information changes greatly with plant senescence. The dynamic change in the contribution of spectral and structural traits to yield estimation further confirmed the necessity and benefit of data fusion (Fig. 4).
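    One way to quantify how trait importance shifts across stages is permutation importance applied stage by stage. The sketch below is illustrative rather than the paper's exact procedure: synthetic data make the two "structural" columns progressively less informative at later stages, and the importance scores track that decline.

```python
import numpy as np
from sklearn.inspection import permutation_importance
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
stages = ["tillering", "jointing", "flowering"]
importance = {}
for i, stage in enumerate(stages):
    X = rng.normal(size=(200, 4))  # columns: [spectral1, spectral2, struct1, struct2]
    w = np.array([1.0, 1.0, 1.0 - 0.4 * i, 1.0 - 0.4 * i])  # structural signal decays
    y = X @ w + rng.normal(scale=0.3, size=200)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32,),
                                       max_iter=2000, random_state=0)).fit(X, y)
    # mean drop in R2 when each column is shuffled, averaged over repeats
    result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
    importance[stage] = result.importances_mean
    print(stage, np.round(result.importances_mean, 2))
```

    The same loop applied to real stage-wise trait matrices would reproduce a time-resolved importance curve like Fig. 5.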

    4.3. The contribution of combining multi-stage data to yield prediction

    Multi-stage data fusion showed higher prediction accuracy than the use of only single-stage data (Table 1), in agreement with previous findings that cumulative vegetation indices increased yield prediction accuracy and that the best time interval was from the heading to maturity stages [47] or from the jointing to initial filling stages [46]. The present study revealed additional value contributed by multi-stage combinations owing to the use of multi-source data. The best consecutive combination for spectral traits was from the jointing to flowering stages (Table 1), which may share the same explanation as the trait importance results. Compared with the real-time response of spectral information, the effect of structure on yield is more of a cumulative effect, possibly explaining why structural traits achieved their best accuracy when data from all stages were combined. Similarly, because the fusion of spectral and structural traits contains more diverse features, the best combination of consecutive stages was the shortest when both structural and spectral traits were used. This shortest combination covers only the vegetative stages (from jointing to heading) that are crucial for the formation of grain sink capacity and leaf source activity [54]. Yield formation may depend mainly on the vegetative growth stage under the premise of normal grain filling. In this case, the final yield could be accurately predicted (R2 = 0.866 at 126 DAS) approximately two months before harvest, in agreement with a previous study [5]. Early yield prediction is needed for early field management and variety selection. However, how far in advance yield can be predicted depends on the experimental location (such as northern or southern China) and the data source. Additionally, it should be noted that the use of spectral information was essential for early prediction of yield, whereas structural traits alone were insufficient for accurate prediction (Fig. 4).
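    The search over consecutive stage windows described above can be sketched as an exhaustive enumeration. The stage names follow the paper, while the synthetic data, the Ridge scorer, and the number of traits per stage are placeholders:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

stages = ["tillering", "jointing", "booting", "heading",
          "flowering", "filling", "mature"]
rng = np.random.default_rng(2)
per_stage = {s: rng.normal(size=(150, 6)) for s in stages}   # 6 traits per stage
# synthetic yield driven by one jointing trait and one heading trait
y = per_stage["jointing"][:, 0] + per_stage["heading"][:, 1] \
    + rng.normal(scale=0.3, size=150)

best = None
for i in range(len(stages)):
    for j in range(i, len(stages)):
        window = stages[i:j + 1]
        X = np.hstack([per_stage[s] for s in window])        # fuse stage blocks
        r2 = cross_val_score(Ridge(), X, y, cv=5, scoring="r2").mean()
        if best is None or r2 > best[0]:
            best = (r2, window)
print(best)   # highest-scoring consecutive window
```

    On such data the winning window must span both informative stages; wider windows add only noise features and are penalized by cross-validation, mirroring why a short jointing-to-heading window can win in practice.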

    4.4. The contribution of 3D canopy traits to wheat yield estimation

    As a prelude to scaling up the yield estimation method, this study compared the accuracies achieved using complete 3D points with those using only canopy surface points, which can be acquired easily with recently developed technologies such as SfM with UAV images [55]. The pattern of yield estimation accuracy using canopy surface points was similar to that using the complete 3D points during the entire growing season. Yield estimation accuracy using both spectral and structural traits extracted from canopy surface points was as high as 0.77 (Fig. 6). In one study [25], 3D phenotypic traits derived from the point cloud indeed showed advantages over the widely used 2D traits (vegetation indices), implying that 3D technology (such as LiDAR) can strengthen phenotyping studies. With the integration of large-scale canopy point clouds and spectral data [56], the method proposed in this study may be easily, quickly, and cost-effectively extended to large-scale yield prediction, facilitating large-scale field management and variety selection.
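    A "canopy surface point" set of the kind compared here can be derived from a complete cloud by keeping the highest return in each horizontal grid cell. This rasterization approach and the 2 cm cell size are illustrative assumptions, not the paper's exact extraction procedure:

```python
import numpy as np

def canopy_surface(points, cell=0.02):
    """points: (N, 3) array of x, y, z; returns the highest point per grid cell."""
    points = np.asarray(points, dtype=float)
    ij = np.floor(points[:, :2] / cell).astype(np.int64)     # horizontal cell indices
    order = np.lexsort((points[:, 2], ij[:, 1], ij[:, 0]))   # sort by cell, then z
    pts, ij = points[order], ij[order]
    # the last point of each (i, j) run is the highest return in that cell
    last = np.r_[np.any(np.diff(ij, axis=0) != 0, axis=1), True]
    return pts[last]

cloud = np.array([[0.00, 0.00, 0.1], [0.01, 0.01, 0.8],     # two returns in one cell
                  [0.05, 0.00, 0.5]])                        # one return in another cell
print(canopy_surface(cloud))
```

    Traits recomputed from this thinned cloud can then be compared against those from the full cloud, as in Fig. 6.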

    4.5. Contributions and future work

    This study demonstrated the feasibility of accurate and timely yield prediction and systematically evaluated the contributions of multiple factors to yield estimation from hyper-temporal LiDAR and multispectral data using multiple machine learning methods. The conclusions are: (i) wheat yield can be estimated accurately and in a timely manner from fused LiDAR and multispectral data, with accuracy comparable to that of multiplying predicted biomass by laboriously measured harvest index; (ii) the ANN model using flowering-stage data was best for wheat yield estimation; (iii) in multi-source data fusion, spectral traits dominated wheat yield estimation, especially in the early stages, whereas structural traits were more stable in the late stages; (iv) yield could be accurately predicted approximately two months before harvest; and (v) we untangled the potential value of the 3D point cloud for yield estimation, which may guide future research toward using less information but lower-cost equipment according to accuracy requirements.

    We suggest that these contributions will be beneficial to modern crop breeding and precision crop management, and we emphasize the importance of crop phenotyping for building a practical technical framework. For crop breeding, the prerequisite for identifying superior genes is distinguishing the phenotypic diversity of crop germplasm [57]. The distinct contributions of spectral and structural data to yield estimation may be a valuable reference for selecting ideal traits predictive of high yield [7]. The time-series importance of traits to yield estimation may help breeders identify genes that act throughout the whole growth period or only at specific times [58]. Dynamic trait importance may also contribute to the selection and design of dynamic ideal plant types. For field management, the timely and accurate acquisition of crop information is the key to decision making [4,6]. Final yield can be accurately predicted around two months before harvest by multi-source or multi-stage data fusion, facilitating early field management decisions. Comparison of yield estimation accuracy using complete 3D points with that using only canopy surface points provides an objective basis for selecting more accurate or more cost-effective equipment. Still, the tradeoff between yield estimation accuracy and efficiency deserves future study.

    However, overlaps between structural and spectral information and the intertwinement of trait dynamics pose challenges to estimating the contributions of multiple factors to yield prediction. Although this study was conducted with 480 plots comprising 120 varieties and two nitrogen levels, these were used mainly to collect samples with various yield levels. It is worth verifying our findings by first building models for multiple varieties and then deciphering the model mechanisms under multiple varieties and environments [7]. In addition, some emerging sensing technologies, such as thermal and fluorescence sensors, could provide canopy physiology information and are worth future consideration. Canopy surface points were virtually generated from 3D points and could be substituted by digital aerial photogrammetry points derived from UAV images. Finally, collinearity among variables is a confounding factor in multivariate modeling. The RF, ANN, and SMR models are insensitive to variable autocorrelation [59], and SVM can be less influenced once reasonable parameters are set by grid search. Methods such as principal component analysis are useful for eliminating multicollinearity, but the resulting features can be difficult to interpret. For this reason, the present study directly used all biologically meaningful traits as predictors, as done in similar studies [12].
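    As a quick diagnostic for the collinearity issue noted above, variance inflation factors (VIF) can be read off the diagonal of the inverse correlation matrix. The synthetic data and the VIF > 10 rule of thumb are illustrative:

```python
import numpy as np

def vif(X):
    """Variance inflation factor per column: the diagonal of the inverse
    correlation matrix, equivalent to VIF_j = 1 / (1 - R2_j)."""
    corr = np.corrcoef(np.asarray(X, dtype=float), rowvar=False)
    return np.diag(np.linalg.inv(corr))

rng = np.random.default_rng(3)
a = rng.normal(size=300)
b = rng.normal(size=300)
X = np.column_stack([a, b, a + 0.05 * rng.normal(size=300)])  # col 2 nearly duplicates col 0
v = vif(X)
print(np.round(v, 1))   # columns 0 and 2 far exceed 10; column 1 stays near 1
```

    Traits flagged by high VIF could be pruned or combined before modeling, although, as noted above, tree ensembles and well-regularized models tolerate such redundancy.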

    5. Conclusions

    This study systematically deciphered the contributions of plant spectral and structural traits to yield prediction and found that: (i) the best machine learning model for yield estimation was ANN (R2 of 0.697-0.891) during the entire growth period, and the best time for yield estimation was the flowering stage (154 DAS); (ii) yield estimation from multi-source data (spectral + structural) was more accurate than that from either spectral or structural data alone; spectral traits dominated wheat yield estimation, especially in the early stages, while structural traits were more stable in the late stages; and (iii) higher yield estimation accuracy was achieved using traits derived from complete 3D points than from canopy surface points, and from integrated multi-stage data than from single-stage data.

    These findings demonstrate that long-term, hyper-temporal, and large-volume LiDAR and multispectral data are valuable for estimating wheat yield and effective for deciphering the contributions of multiple prediction factors. The present study is conducive to earlier and more accurate yield estimation, facilitating early selection of varieties and timely field management.

    Declaration of competing interest

    The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

    CRediT authorship contribution statement

    Qing Li: Investigation, Visualization, Formal analysis, Writing - original draft. Shichao Jin: Conceptualization, Supervision, Writing - original draft, Writing - review & editing, Funding acquisition. Jingrong Zang: Investigation. Xiao Wang: Writing - review & editing, Resources. Zhuangzhuang Sun: Investigation, Formal analysis. Ziyu Li: Investigation. Shan Xu: Writing - review & editing. Qin Ma: Writing - review & editing. Yanjun Su: Writing - review & editing, Funding acquisition. Qinghua Guo: Writing - review & editing. Dong Jiang: Funding acquisition, Supervision, Resources, Writing - review & editing.

    Acknowledgments

    This study was supported by the Jiangsu Agricultural Science and Technology Independent Innovation Fund Project (CX(21)3107), the National Natural Science Foundation of China (32030076), the High Level Personnel Project of Jiangsu Province (JSSCBS20210271), the China Postdoctoral Science Foundation (2021M691490), Jiangsu Planned Projects for Postdoctoral Research Funds (2021K520C), the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA24020202), and the Jiangsu 333 Program.

    Appendix A. Supplementary data

    Supplementary data for this article can be found online at https://doi.org/10.1016/j.cj.2022.06.005.
