
    Research on Automatic Diagnostic Technology of Soybean Leaf Diseases Based on Improved Transfer Learning


    Yu Xiao, Jing Yong-dong, and Zheng Lu-lu

    1 College of Electrical and Information, Northeast Agricultural University, Harbin 150030, China

    2 School of Computer Science and Technology, Shandong University of Technology, Zibo 255049, Shandong, China

    Abstract: Soybean diseases and insect pests are important factors affecting the yield and quality of soybean, so accurate detection and diagnosis are necessary. For this reason, a classification method for soybean leaf diseases based on improved transfer learning was proposed in this paper. In detail, this method first removed the complicated background in the images and separated the leaves from the entire image; second, a data-augmentation method was applied to enlarge the separated leaf disease image dataset and reduce overfitting; finally, the automatically fine-tuning convolutional neural network (AutoTun) was adopted to classify the soybean leaf diseases. The proposed method reached validation accuracy rates of 94.23%, 93.51% and 94.91% on VGG-16, ResNet-34 and DenseNet-121, respectively, and it was compared with the traditional fine-tuning method of transfer learning. The results indicated that the proposed method was superior to the traditional transfer learning method.

    Key words: transfer learning, deep convolutional neural network, classification recognition, soybean disease

    Introduction

    Soybean is one of the leading seed crops in the world and a primary source of edible oil; soybean oil accounts for 25% of total edible oil. At the same time, it is the main component in poultry and fish formula fodders, accounting for 60% of worldwide livestock fodder (Agarwal et al., 2013). Besides, with its comprehensive nutrition and abundant content, soybean contributes to preventing heart disease and diabetes. China is currently the fourth-largest soybean producing country in the world, next only to the USA, Brazil and Argentina. The cultivation and plantation of soybean can be traced back to the agricultural era of ancient China, and the northeast is the leading soybean planting area of China. Different prevalent diseases and insect pests occur every year and, in severe cases, result in more than 30% of output loss (Niu et al., 2019). Insufficient soybean plant protection procedures, the increase of fungal and viral pathogen categories and poor cultivation methods all increase the damage caused by soybean plant diseases and insect pests. The categories of soybean leaf diseases have been studied and can be divided into multiple types, including anthracnose, bacterial blight, bacterial leaf spot, soybean mosaic virus, copper poison disease, charcoal rot, frogeye, leaf blight, wind blight, downy mildew, powdery mildew, rust disease and tan disease (Wang et al., 2014).

    At present, recognition of soybean leaf diseases relies on human visual inspection, which is influenced by the subjective judgment of crop disease professionals and can result in misjudgment (Barbedo and Arnal, 2016). Moreover, for most small and medium-sized farmers, it is difficult to contact professionals, leading to delays in finding the causes of leaf disease symptoms and in taking preventive measures. This severely affects the quality and output of soybeans. As a result, an automatic and reliable computer-assisted system is needed to solve the efficiency issue of soybean leaf disease detection and recognition. Establishing accurate technology to identify soybean leaf diseases is the key to preventing soybean leaf diseases and insect pests. The detection of plant diseases requires many researchers to apply image processing technology to ease this difficult task. Researchers have put forward multiple recognition and detection technologies for various plant diseases and insect pests, and have reviewed both traditional and innovative detection technologies from multiple aspects (Bagde et al., 2015; Rastogi et al., 2015; Prasad et al., 2016; Khirade and Patil, 2015; Martinelli et al., 2015; Sankaran et al., 2010).

    The traditional detection technologies involve molecular, serological and DNA methods, while the innovative detection technologies include volatile organic compounds, spectral technology and the convolutional neural network (CNN), etc. A digital image processing technique (Shrivastava and Hooda, 2014) detects and classifies tan disease and frogeye; the recognition accuracy rates of tan disease and frogeye respectively reach 70% and 80%. However, this method shows some limitations: it only considers two kinds of soybean diseases and insect pests, and its recognition accuracy rate is not sufficient for the current research field. A soybean leaf disease and insect pest detection method based on salient regions extracts disease areas from soybean leaf disease images (Gui et al., 2015); however, this method does not report the soybean disease types or a recognition accuracy rate. A semi-automatic soybean disease and insect pest recognition system based on the K-means algorithm identifies three diseases, namely downy mildew, frogeye and leaf blight (Kaur et al., 2018); its average maximum accuracy rate reaches 90%, which still cannot satisfy the high recognition accuracy required in today's research fields. A digital image processing technology (Araujo and Peixoto, 2019) combined with color moments, local binary pattern (LBP) and bag-of-visual-words (BoVW) recognizes eight kinds of leaf diseases, including bacterial blight, rust disease, copper poison disease, soybean mosaic virus, target leaf spot, downy mildew, powdery mildew and tan disease, showing a classification accuracy rate of 75.8%.

    For this reason, follow-up studies should pay much attention to improving the recognition accuracy rate while identifying multiple types of soybean leaf diseases. As a modern image processing and data analysis method, deep learning has a good image analysis effect and huge development potential. With the successful application of deep learning in many fields, it has gradually been applied in the agricultural field (Kamilaris and Prenafeta, 2018). In the past several years, deep learning has gained extremely good performance, especially the deep neural network (DNN) (Lecun et al., 2015). In the image recognition field, the CNN is developing rapidly and can extract key features from large numbers of input images. Through the CNN, researchers can accurately classify soybean leaf diseases, but the CNN needs large amounts of computation resources and time, as well as huge datasets of input images. To cope with the above-mentioned shortcomings of the CNN, transfer learning has come into general use. Transfer learning uses a DNN pre-trained on super-large-scale datasets to solve specific model training tasks with limited data (Tan et al., 2018). This research regarded the automatically fine-tuning convolutional neural network (AutoTun) as the backbone and put forward a soybean disease recognition system composed of two modules, an image processing module and a classification module (Basha et al., 2021). The image processing module aimed to extract the leaf area from a leaf image with a complicated background, i.e., to remove the background from the entire image. The classification module used AutoTun to fine-tune the pre-trained CNN on the soybean leaf disease dataset, so that the classification model trained on the original dataset showed excellent performance on the target dataset. In other words, more complicated features could be learned from the soybean leaf disease dataset to improve the leaf disease recognition accuracy rate. The results of the proposed recognition system were compared with the traditional fine-tuning methods of transfer learning, using the same dataset for both. The experimental and comparative analysis indicated that the proposed method showed excellent performance, namely, relatively inconspicuous and more detailed leaf structure features were successfully learned, while a higher classification accuracy rate was obtained on the validation set.

    Materials and Methods

    Dataset

    The soybean leaf disease image data used in this research were obtained from the digipathos plant dataset images provided by Embrapa (Brazil), including 459 images and 11 types, which were subdivided into bacterial leaf spot, southern blight, target leaf spot, rust disease, powdery mildew, downy mildew, copper poison disease, grey speck disease, soybean mosaic virus, healthy and unknown diseases. During model training, 80% of the images were used for training, while 20% of the images were used for validation. Table 1 showed each leaf disease's image quantity in the training stage and validation stage in the original dataset. Fig. 1 showed sample images of soybean diseases.

    Because the dataset was small, transfer learning was trained on a small-scale dataset. Meanwhile, data-augmentation technology was applied to enlarge the dataset and reduce overfitting and underfitting. In the computer vision field, each dataset in an image classification problem needs a specific data-augmentation strategy to obtain the best classification effect. A high-efficiency automatic data-augmentation method based on a search algorithm, AutoAugment (Cubuk et al., 2018), was applied. Data augmentation in the field of image recognition is generally applied through equalization, horizontal flip, cropping and rotation. The data-augmentation scheme was composed of multiple strategies, where each strategy included two different image processing methods (equalization, horizontal flip, cropping and rotation), as well as the order of use and probability of each image processing method. A genetic algorithm (Schulman et al., 2017) was chosen to find the best data-augmentation scheme in the search space.
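    As a minimal sketch of how one such augmentation sub-policy could be applied, assuming a PyTorch/torchvision pipeline (the concrete operations, probabilities and magnitudes below are illustrative placeholders, not the policies actually found by the search):

```python
import random
from PIL import Image
from torchvision.transforms import functional as F

# One illustrative sub-policy: (operation, probability, magnitude) triples.
# The real sub-policies are found by the policy search; these values are placeholders.
SUB_POLICY = [
    ("equalize", 0.6, None),   # histogram equalization
    ("rotate",   0.4, 15),     # rotate by 15 degrees
]

def apply_op(img: Image.Image, op: str, magnitude):
    if op == "equalize":
        return F.equalize(img)
    if op == "hflip":
        return F.hflip(img)
    if op == "rotate":
        return F.rotate(img, magnitude)
    if op == "crop":
        w, h = img.size
        return F.center_crop(img, [int(h * 0.9), int(w * 0.9)])
    return img

def augment(img: Image.Image) -> Image.Image:
    """Apply each operation of the sub-policy with its probability, in order."""
    for op, prob, mag in SUB_POLICY:
        if random.random() < prob:
            img = apply_op(img, op, mag)
    return img
```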

    Table 1 Image quantity in training sets and verification sets

    Fig. 1 Soybean leaf disease sample

    Image processing module

    By observing the sample images in Fig. 1, it was found that the leaf area was much smaller than the background area. Before the images were input to the classifier, the leaves were separated from the entire image, so that the classifier could accurately extract features related to leaf diseases and perform accurate leaf disease recognition. The image processing module consisted of four submodules, with the components and output images shown in Fig. 2.

    The first submodule converted the RGB color space image into the L*a*b color space, a three-dimensional real space in which L represented luminance, a stood for the component from green to red, and b was the component from blue to yellow. The output of the b channel component was the input of the second submodule, as shown in Fig. 2. The second submodule applied the K-means clustering algorithm (Hartigan and Wong, 1979) to cut the b channel component images into two clusters (k=2). One cluster corresponded to the leaf area (the area of interest), while the other cluster corresponded to the background area excluding leaves. In detail, white (pixel value 255) was used to represent the foreground (the leaf area), while black (pixel value 0) was used to represent the background. The L and a channels did not give good output for image segmentation, so their components did not participate in the K-means clustering. However, after the K-means clustering algorithm cut out the soybean leaves, the image background area still contained some non-leaf areas; in other words, the K-means clustering algorithm cut some areas of the background into the foreground, as shown in Fig. 2.
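    A minimal sketch of this segmentation step, assuming an OpenCV + scikit-learn pipeline (the function name is illustrative, and the rule for picking which cluster is the leaf is an assumption rather than taken from the paper):

```python
import cv2
import numpy as np
from sklearn.cluster import KMeans

def segment_leaf_b_channel(bgr_image: np.ndarray) -> np.ndarray:
    """Cluster the b* channel of the L*a*b* image into 2 clusters and
    return a binary mask (255 = leaf foreground, 0 = background)."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    b = lab[:, :, 2].reshape(-1, 1).astype(np.float32)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(b)
    labels = labels.reshape(bgr_image.shape[:2])

    # Heuristic assumption: the cluster with the higher mean b* value is the leaf.
    # In practice the foreground cluster has to be identified per dataset.
    leaf_cluster = np.argmax([b[labels.ravel() == k].mean() for k in (0, 1)])
    return np.where(labels == leaf_cluster, 255, 0).astype(np.uint8)
```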

    Beyond that, the third submodule took the K-means clustering output images as its segmented input images. The main purpose of the third submodule was to find and cut out the maximum connected domain, i.e., the soybean leaf area. The images contained multiple connected domains, meaning image regions composed of adjacent foreground pixels with the same pixel value; these included the leaf area as well as connected domains that had been wrongly cut into the foreground and were not needed. The connected domain labeling algorithm (Cabaret, 2010) was adopted to obtain the number of connected domains in the input images, and different connected domains were marked with different colors, as shown in Fig. 2. The maximum connected domain could then be extracted from the marked connected domains as a binary image. After normalization, pixel value 1 represented the leaf area, while pixel value 0 represented the background area. Finally, the binary images were mapped back to the original input RGB images, producing RGB images containing only the leaf areas. As input to the fourth submodule, the mapping algorithm converted the normalized binary images into three-channel images and then multiplied them with the original input RGB images.
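    A hedged sketch of the connected-component labeling and mask-mapping steps, assuming OpenCV (the function name is illustrative):

```python
import cv2
import numpy as np

def keep_largest_component(mask: np.ndarray, bgr_image: np.ndarray):
    """Keep only the largest connected foreground component of a binary mask
    and map it back onto the original three-channel image."""
    num, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)

    # Label 0 is the background; pick the foreground label with the largest area.
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])
    leaf_mask = (labels == largest).astype(np.uint8)      # 1 = leaf, 0 = background

    # Broadcast the single-channel mask to three channels and multiply element-wise.
    leaf_only = bgr_image * leaf_mask[:, :, None]
    return leaf_mask, leaf_only
```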

    The fourth submodule took the binarized, single-channel input images, as shown in Fig. 2, and ran a simple iterative algorithm using two lists. The horizontal list traversed the x-axis of the input images, while the vertical list traversed the y-axis. Coordinate points with a pixel value of 254 were added to the lists. The minimum of the horizontal list was the left boundary and the maximum was the right boundary; the minimum of the vertical list was the bottom boundary and the maximum was the top boundary. The leaf area images were then cropped to the corresponding interval, as shown in Fig. 2.
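    A minimal sketch of this boundary search and crop (the 254 threshold follows the text; names are illustrative):

```python
import numpy as np

def crop_to_leaf(binary: np.ndarray, rgb: np.ndarray) -> np.ndarray:
    """Crop the RGB image to the bounding box of foreground pixels in a
    single-channel binary image (foreground pixels assumed to be >= 254)."""
    ys, xs = np.where(binary >= 254)           # coordinates of foreground pixels
    left, right = xs.min(), xs.max()           # horizontal boundaries
    top, bottom = ys.min(), ys.max()           # vertical boundaries
    return rgb[top:bottom + 1, left:right + 1]
```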

    The above-mentioned image processing module was applied to all the images of the dataset, leaving only the leaf area and removing the irrelevant background. For a tiny minority of images, this module wrongly cut part of the leaf area into the background or part of the background area into the foreground. After cutting and clipping, the leaf-area images differed in total pixel count, and the original image size was too large (4 128*3 096*3 pixels), so all images were resized to 224*224*3 pixels.

    Fig. 2 Image processing module

    AutoTun

    In recent years, deep learning has achieved remarkable results in target detection, computer vision, natural language processing, automatic speech recognition and semantic analysis (Zhou et al., 2017). Compared with the shallow algorithm models of traditional machine learning, deep learning shows remarkable superiority in feature extraction and modeling. At present, the CNN is one of the primary forms of deep learning, featuring local connections, weight sharing, pooling operations and a multi-layer structure, which makes it suitable for the image classification field. Models with excellent performance include VGG-16 (Simonyan and Zisserman, 2014), ResNet-34 (He et al., 2016), Inception (Szegedy et al., 2015), DenseNet (Huang et al., 2016) and Xception (Chollet, 2016). They generally need large amounts of training data and computation resources to achieve excellent performance; unfortunately, new research fields often do not have sufficient data to support such models. Facing the above-mentioned challenges, transfer learning is a common solution and can provide favorable performance on small-scale datasets. Transfer learning exploits the advantages of the CNN and fine-tunes a CNN pre-trained on source tasks on the target dataset to satisfy the demands of the target task (Zhuang, 2015). However, in actual applications, the target dataset has a limited scale relative to the source dataset, so the feature capacity of the model exceeds the needs of the target task, which causes overfitting. Based on improved transfer learning, Bayesian optimization was applied to fine-tune the pre-trained CNN to fit the soybean disease classification task. This section focuses on the experimental method. Fig. 3 showed the improved strategies for CNN training and traditional transfer learning.

    Fig. 3 Transfer learning comparison

    The AutoTun method was applied so that the hyper-parameter search space was not limited to a fixed last layer or last several layers. This method first removed the pre-trained CNN's softmax layer and replaced it with a new softmax layer whose number of neurons was equal to the number of categories in the target dataset. Meanwhile, Bayesian optimization was used to automatically adjust the CNN layers. The front layers of a CNN represent general low-level features, such as edges and spots, which are common to most tasks; the features specific to the target task are extracted by the last several layers. As a result, this method adjusted the CNN layers from right to left (from the last layer towards the initial layer). As shown in Fig. 3 (C), the locked symbol meant that the layer was frozen, while the unlocked symbol meant that the layer was fine-tuned. The TPE algorithm took the pre-trained CNN model, the hyper-parameter search space, the training set, the validation set and the number of training cycles as input, and selected a Gaussian mixture model (GMM) as the prior distribution of the objective function. The program constantly updated the posterior distribution of the objective function F with each iteration over the hyper-parameter search space, selected the maximum value for the next iteration each time, and finally output an improved CNN model suited to the target dataset.
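    A minimal PyTorch sketch of the head-replacement and layer-freezing step, assuming a torchvision VGG-16 backbone (the number of categories and the choice of which layers to unfreeze are illustrative; which layers actually get unfrozen is exactly what AutoTun searches over):

```python
import torch.nn as nn
from torchvision import models

num_classes = 11                      # illustrative: categories in the soybean leaf dataset

model = models.vgg16(pretrained=True)

# Replace the final classification layer so its output size matches
# the number of target categories.
model.classifier[-1] = nn.Linear(model.classifier[-1].in_features, num_classes)

# Freeze all layers, then unfreeze the tail (from the last layer towards the front).
for param in model.parameters():
    param.requires_grad = False
for param in model.classifier.parameters():
    param.requires_grad = True
```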

    In this paper, the hyper-parameter search space involved different CNN layers. VGG-16, ResNet-34 and DenseNet-121 were used; these networks were pre-trained on the ImageNet (Deng et al., 2009) dataset, and AutoTun was applied to fine-tune them on the soybean leaf disease dataset to obtain better performance. The hyper-parameter search space for fine-tuning is shown in Table 2 and involved six operations of the convolutional layer and pooling layer. The fully-connected hyper-parameters included the number of layers and neurons. Dropout was adopted on the fully-connected layer and dense layer; Dropout is a mainstream regularization method with a factor to be adjusted in the range of [0, 1]. The step was 0.1, namely the Dropout factor value was {0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1}.
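    A hedged sketch of how such a search could be wired up with the hyperopt library's TPE optimizer (the search space below is illustrative and only loosely mirrors Table 2; finetune_and_validate is a hypothetical stand-in for rebuilding the tail of the pre-trained CNN from a configuration, fine-tuning it on the training set and returning the validation error):

```python
from hyperopt import fmin, tpe, hp, Trials

# Illustrative search space: how many tail layers to fine-tune and how to rebuild them.
space = {
    "num_finetuned_layers": hp.choice("num_finetuned_layers", [1, 2, 3, 4]),
    "fc_neurons":           hp.choice("fc_neurons", [256, 512, 1024]),
    "dropout":              hp.quniform("dropout", 0.0, 1.0, 0.1),
    "conv_filter_size":     hp.choice("conv_filter_size", [2, 3, 5]),
    "pooling":              hp.choice("pooling", ["max", "avg"]),
}

def finetune_and_validate(config):
    # Placeholder: in practice this rebuilds the CNN tail from `config`,
    # fine-tunes it on the training set and returns 1 - validation accuracy.
    return 1.0 - 0.01 * config["num_finetuned_layers"]

def objective(config):
    # TPE minimizes this value, so the validation error is returned.
    return finetune_and_validate(config)

trials = Trials()
best = fmin(fn=objective, space=space, algo=tpe.suggest,
            max_evals=50, trials=trials)
print("Best configuration found:", best)
```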

    Throughout the experiments, the connectivity of the pre-trained CNN was kept unmodified while the model was adapted to the soybean leaf disease dataset. Two operators, a 1*1 convolution and an upsampling operation, were used to resolve tensor mismatches in the depth and spatial dimensions, respectively.
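    A minimal PyTorch sketch of these two shape-matching operators, on an illustrative feature map whose channel count and spatial size both need adapting:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(1, 256, 7, 7)         # example feature map: 256 channels, 7x7 spatial

# 1x1 convolution changes the depth (channel) dimension without touching spatial size.
to_512 = nn.Conv2d(in_channels=256, out_channels=512, kernel_size=1)
x = to_512(x)                          # -> (1, 512, 7, 7)

# Upsampling changes the spatial dimensions without touching the channel count.
x = F.interpolate(x, size=(14, 14), mode="nearest")   # -> (1, 512, 14, 14)
print(x.shape)
```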

    Bayesian optimization

    On the target dataset, automatic fine-tuning of the pre-trained CNN was regarded as a black-box optimization problem; in other words, the target function could not be accessed directly. Bayesian optimization was applied to fine-tune the CNN. F was set as the target function, with the mathematical form of formula (1):

    Where x* was the optimal input. In formula (2), S represented the hyper-parameter search space, as shown in Table 2.
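    Formulas (1) and (2) themselves are not reproduced in this extract; a plausible reconstruction, assuming the standard Bayesian-optimization formulation implied by the surrounding text, is:

```latex
% Formula (1): F maps a hyper-parameter configuration x to the validation
% performance obtained after fine-tuning the CNN with that configuration.
F : S \rightarrow \mathbb{R}, \qquad y = F(x)

% Formula (2): the optimal configuration x* maximizes F over the search space S.
x^{*} = \arg\max_{x \in S} F(x)
```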

    Table 2 Relevant hyper-parameter search space

    Evaluating the target function F at any point in the search space consumed computation resources, because its value was obtained by fine-tuning (re-training) the corresponding CNN layers on the target dataset. x* represented the optimal estimate of the relevant hyper-parameters of the CNN layers after fine-tuning.

    Bayesian optimization was composed of a surrogate model and an acquisition function (Pelikan et al., 2005). The surrogate model was a Bayesian statistical model, and Gaussian process (GP) regression was used to build an approximation of the target function F; the GP was assumed to be the prior distribution of the target function F. The acquisition function used formula (2) to find the global maximum of the target function F.
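    A hedged sketch of this surrogate/acquisition loop on a toy one-dimensional search space, using scikit-learn's Gaussian process regressor with an expected-improvement acquisition (the objective below is a cheap stand-in for the expensive fine-tuning run):

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Cheap stand-in for the expensive target function F (fine-tune + validate).
    return -(x - 0.3) ** 2 + 0.9

def expected_improvement(candidates, gp, best_y, xi=0.01):
    # Expected improvement for maximization of F.
    mu, sigma = gp.predict(candidates, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best_y - xi) / sigma
    return (mu - best_y - xi) * norm.cdf(z) + sigma * norm.pdf(z)

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(3, 1))                 # initial random evaluations
y = np.array([objective(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(10):
    gp.fit(X, y)                                   # update the GP posterior over F
    candidates = np.linspace(0, 1, 200).reshape(-1, 1)
    ei = expected_improvement(candidates, gp, y.max())
    x_next = candidates[np.argmax(ei)]             # point maximizing the acquisition
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next[0]))

print("Best x found:", X[np.argmax(y), 0], "with F(x) =", y.max())
```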

    Results

    This research was applied to the classification and identification of bacterial leaf spot, southern blight, target leaf spot, rust disease, powdery mildew, downy mildew, copper poison disease, grey speck disease and soybean mosaic virus.

    The output of the image processing module of the soybean leaf disease recognition system used in this research is illustrated in Fig. 2. The experimental results indicated that this method was effective for background segmentation of soybean leaf disease images. The areas of interest, i.e., the leaf areas, were separated from the complicated background. The separated leaf images preserved the features of the disease- and pest-infected parts, so the output images of the image processing module were suitable for feature extraction by the classification module.

    After obtaining the leaf images without the complicated background, AutoAugment technology was used for data augmentation. 80% of the images were randomly selected as the training set, while 20% of the images were used as the validation set to verify the accuracy rate of the classifier. The accuracy rate was defined as formula (3):

    Where x represented the leaf disease type, and Tpx and Tnx respectively represented the numbers of successes and failures in identifying leaf disease x in the entire system; the overall metric was the average classification accuracy rate of each leaf disease across the entire system.
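    Formula (3) itself is not reproduced in this extract; a plausible reconstruction from the definitions of Tpx and Tnx above, with N disease classes, is:

```latex
% Per-class accuracy for leaf disease x, averaged over the N classes
% to give the overall system accuracy.
\mathrm{Accuracy}_x = \frac{Tp_x}{Tp_x + Tn_x}, \qquad
\mathrm{Accuracy} = \frac{1}{N}\sum_{x=1}^{N}\mathrm{Accuracy}_x
```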

    The AutoTun classifier was used to learn, through Algorithm 1, a CNN structure fitted to the soybean leaf dataset, improving the training of transfer learning. The Bayesian optimization algorithm automatically fine-tuned the CNN to obtain different layer structures and the optimal configurations of the relevant hyper-parameters shown in Table 3. On the soybean leaf disease dataset, fine-tuning the VGG-16 network (pre-trained on the ImageNet dataset) achieved a validation accuracy rate of 92.55% in the experiment, adding a fully-connected layer with 1 024 neurons; the Dropout factor of the original fully-connected layer was 0.6, and a max pooling layer was added. For fine-tuning the ResNet-34 network, the filter size was 3*3 and the Dropout factor of the original fully-connected layer was 0.3; the filter size of the last two convolutional layers was 3*3, with 512 and 256 filters respectively. Through fine-tuning the DenseNet-121 network, a validation accuracy rate of 92.29% was obtained in the experiment, adding a new fully-connected layer with 1 024 neurons; the Dropout factor of the original fully-connected layer was 0.4, and the filter sizes of the last two convolutional layers were 5*5 and 2*2, with 512 and 128 filters respectively. The fully-connected layers and parameters listed excluded the output fully-connected layer, because its number of neurons was determined by the number of output categories. Table 3 indicated that the three CNN structures obtained by Bayesian optimization achieved excellent performance on the soybean leaf disease dataset.

    Table 4 showed the accuracy rates of the traditional fine-tuning method of transfer learning and of AutoTun on the VGG-16, ResNet-34 and DenseNet-121 structures for the soybean leaf disease dataset. To demonstrate the improvement of AutoTun over transfer learning, the fine-tuned layers of the traditional method were kept consistent with the layers obtained by AutoTun's automatic fine-tuning. The findings indicated that, compared with the traditional fine-tuning method of transfer learning, AutoTun significantly improved the performance (validation accuracy rate) while the number of trained parameters was relatively reduced. For the pre-trained VGG-16 model, traditional fine-tuning achieved a validation accuracy rate of 85.52% on the soybean leaf disease dataset, requiring 1.33 million parameters across the three layers. The AutoTun method automatically fine-tuned the pre-trained VGG-16 model and achieved a validation accuracy rate of 94.23% on the soybean leaf disease dataset, using only 823 000 trained parameters, a reduction of 38% compared with the traditional fine-tuning method. When the AutoTun method fine-tuned the pre-trained ResNet-34 and DenseNet-121 models on the soybean leaf disease dataset, the numbers of trained parameters were slightly higher than those of the traditional fine-tuning method of transfer learning, because the extra fully-connected layer was included in the parameter search space. The method achieved validation accuracy rates of 93.51% and 94.91% respectively on the soybean leaf disease dataset, while the traditional fine-tuning method achieved 87.13% and 87.74% respectively. This comparison proved that the extra fully-connected layer could improve performance on the validation data.

    In the comparative analysis between Tables 3 and 4, the AutoTun method and the traditional fine-tuning method used the same source dataset (the ImageNet dataset) and target dataset (the soybean leaf disease dataset) for fine-tuning the VGG-16, ResNet-34 and DenseNet-121 networks. At the same time, the number of fine-tuned layers in the traditional method was kept the same as the layers obtained by automatic fine-tuning with the AutoTun method. Analysis of the trained parameters and validation accuracy rates from the contrast experiment proved that the AutoTun method showed excellent performance on the soybean leaf disease dataset, indicating that the AutoTun method had excellent generalization ability for CNNs with different structures.

    Table 3 Optimal hyper-parameter configuration

    Table 4 Classification performance comparison

    Discussion

    The quality and quantity of soybeans are affected by diseases and insect pests. In this paper, an identification method was proposed for nine kinds of leaf diseases, including bacterial leaf spot, southern blight, target leaf spot, rust disease, powdery mildew, downy mildew, copper poison disease, grey speck disease and soybean mosaic virus, as well as healthy leaves and unknown leaf diseases, while improving the traditional transfer learning method to identify soybean diseases and insect pests.

    The image segmentation method was applied to remove the complicated background and crop out the leaf areas, and AutoAugment technology was used to augment the data and reduce overfitting. The findings indicated that this method of background segmentation for soybean leaf disease images was effective; the areas of interest, i.e., the leaf areas, were separated from the complicated background, and the segmented leaf images preserved the disease and insect pest features. The AutoTun method improved transfer learning: on the soybean leaf disease dataset, the VGG-16, ResNet-34 and DenseNet-121 network models (pre-trained on ImageNet) were automatically fine-tuned. The findings indicated that the proposed method achieved validation accuracy rates of 94.23%, 93.51% and 94.91% on the VGG-16, ResNet-34 and DenseNet-121 networks respectively, showing better performance than the traditional fine-tuning method of transfer learning with fewer trained parameters. Future research will consider how to use existing soybean leaf disease databases to train and test the AutoTun method, in order to verify its generalization ability on different datasets.

    Conclusions

    In this paper, a new methodology was presented for identification of nine soybean diseases. The proposed methodology consisted of extraction of color features using the color moments technique and extraction of texture features by the K-means clustering algorithm, with classification performed by using AutoTun to fine-tune the pre-trained CNN. The proposed methodology identified a single disease per leaf, and the experimental and comparative analysis indicated that it showed excellent performance. If there were multiple diseases, the system would classify only one of them, namely the one with the greatest prominence in the leaf image. In addition, if a disease was observed that was not among the nine diseases investigated in this study, the system's output would indicate the training disease most similar to the new disease being considered.
