
    Inversion of Oceanic Parameters Represented by CTD Utilizing Seismic Multi-Attributes Based on Convolutional Neural Network

Journal of Ocean University of China, 2020, Issue 6


AN Zhenfang 1), 2), ZHANG Jin 1), 2), *, and XING Lei 1), 2)


In recent years, seismic data have been widely used in seismic oceanography for the inversion of oceanic parameters represented by conductivity-temperature-depth (CTD) measurements. Using this technique, researchers can identify the water structure with high horizontal resolution, which compensates for the deficiencies of CTD data. However, conventional inversion methods, such as constrained sparse spike inversion (CSSI) and full waveform inversion (FWI), are model-driven and typically require prior deterministic mapping operators. In this paper, we propose a novel inversion method based on a convolutional neural network (CNN), which is purely data-driven. To reduce the problem of multiple solutions, we use stepwise regression to select the optimal attributes and their combination and take two-dimensional images of the selected attributes as input data. To prevent vanishing gradients, we use the rectified linear unit (ReLU) function as the activation function of the hidden layers. Moreover, the Adam and mini-batch algorithms are combined to improve stability and efficiency. The inversion results of field data indicate that the proposed method is a robust tool for accurately predicting oceanic parameters.

    oceanic parameter inversion; seismic multi-attributes; convolutional neural network

    1 Introduction

Oceanic parameters, including temperature, salinity, density, and velocity, can be obtained directly by conductivity-temperature-depth (CTD) instruments or indirectly by inversion utilizing seismic data. Although the vertical resolution of CTD data is higher than that of seismic data, its horizontal resolution is far lower. Joint CTD-seismic inversion combines the advantages of both to obtain oceanic parameters with high resolution.

However, traditional inversion methods, such as constrained sparse spike inversion (CSSI) and full waveform inversion (FWI), typically assume the existence of prior deterministic mapping operators between the geophysical responses and the geophysical parameters, such as a convolution operator or a wave-equation operator. For some oceanic parameters, however, such as temperature and salinity, it is difficult to establish mapping relationships between the parameters and the seismic responses by mathematical modeling.

A recent trend in many scientific fields has been to solve inverse problems using data-driven methods and a revived use of deep neural networks (DNNs). According to the universal approximation theorem (Hornik et al., 1990), a DNN can theoretically approximate any continuous function when there are enough neurons in the hidden layer. Machine learning based on a DNN is usually referred to as deep learning. The convolutional neural network (CNN) is a DNN with two special characteristics, local connections and shared weights, which improve computational efficiency by reducing the number of weights. Owing to significant advances in image processing and speech recognition, CNNs have attracted widespread attention and have been successfully applied in the fields of agriculture (Kamilaris and Prenafeta-Boldú, 2018; Qiu et al., 2018; Teimouri et al., 2018), medicine (Liu et al., 2017; Acharya et al., 2018; Wachinger et al., 2018), and transportation (Li and Hu, 2018; Li et al., 2018; Wang et al., 2018a).

Here, we introduce CNNs into marine geophysics to solve the inverse problem described above. Using deep learning, a CNN can automatically search for and gradually approximate the inverse mapping operators from the seismic responses to the oceanic parameters, thus eliminating the need for prior deterministic mapping operators. In other words, the CNN approach is purely data-driven.

In solid-earth geophysics, CNNs are mainly applied to classification problems, such as fault interpretation (Guo et al., 2018; Ma et al., 2018; Wu et al., 2018a), first-break picking (Duan et al., 2018; Hollander et al., 2018; Yuan et al., 2018), seismic facies identification (Dramsch and Lüthje, 2018; Zhao, 2018), and seismic trace editing (Shen et al., 2018). Inversion belongs to the category of regression techniques. Generally, seismic inversion based on a CNN takes seismic records as input data and stratigraphic parameters as output data. In one study, generated normal-incidence synthetic seismograms served as input data and the acoustic impedance served as output data (Das et al., 2018). In another study, synthetic two-dimensional multi-shot seismic waves were encoded into a feature vector, and the feature vector was then decoded into a two-dimensional velocity model (Wu et al., 2018b). In a third study, a modified fully convolutional network was tuned to map pre-stack multi-shot seismic traces to velocity models (Wang et al., 2018b).

The inversion methods mentioned above use one- or two-dimensional seismic records as input data and the corresponding subsurface models as output data. To obtain a large number of samples, many subsurface models are built to generate synthetic seismic records. These methods have made some progress but still have several disadvantages. First, they require prior deterministic forward operators to synthesize the seismic records. Second, accurate inversion results cannot be obtained from a small number of samples. Third, the problem of multiple solutions becomes more prominent when only seismic records are used as input data.

The samples for our study were obtained from actual data, including both CTD and seismic data acquired from the East China Sea. We took the CTD curves as output data and the seismic traces near the CTDs as input data. If only seismic records are used as input data, the problem of multiple solutions inevitably arises; to reduce the multiplicity of solutions, we therefore used stepwise regression to select the optimal attributes and their combination and took the two-dimensional images of the selected attributes as input data. Because of the frequency difference between the CTD and seismic data, we assumed that one sampling point on the CTD curve corresponded to multiple adjacent sampling points on the seismic trace near the CTD.

    2 Theory of Convolutional Neural Networks

CNNs (Venkatesan and Li, 2018) consist of an input layer, hidden layers, and an output layer. The hidden layers between the input and output layers generally consist of convolution layers, pooling layers, and a fully connected layer. Each hidden layer incorporates several feature maps, and each feature map contains one or more neurons. Fig.1 shows the architecture of the CNN designed for this study, the model parameters of which are listed in Table 1.

The CNN workflow can be divided into forward and backward propagation. The model output is obtained after an original image is successively processed by convolution, pooling, and weighted summation during forward propagation. Then, the error between the model output and the desired output propagates backward from the output layer to the first hidden layer. Meanwhile, the convolution kernels, weights, and biases are updated during backward propagation. These updates continue until the error tolerance is reached.

    Fig.1 Architecture of the convolutional neural network, wherein the squares represent maps of neurons. The number on the left of @ denotes the number of maps, and the two numbers on the right of @ denote the number of rows and columns in the map, respectively.

    Table 1 Model parameters of the convolutional neural network
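For readers who wish to experiment, the following is a minimal sketch of a network with the general shape described above (convolution, pooling, convolution, fully connected, output), written with TensorFlow/Keras. The filter counts and kernel sizes are hypothetical placeholders rather than the values of Table 1; only the 5×3 input image size (Section 3.1) and the ReLU/sigmoid activations (Section 2.1) follow the paper.

```python
import tensorflow as tf

# Sketch of a small CNN of the shape shown in Fig.1: conv -> pooling -> conv
# -> fully connected -> output. Filter counts and kernel sizes are hypothetical.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(4, (2, 2), activation='relu', padding='valid',
                           input_shape=(5, 3, 1)),       # first convolution layer
    tf.keras.layers.AveragePooling2D(pool_size=(2, 1)),   # pooling (down-sampling)
    tf.keras.layers.Conv2D(8, (2, 2), activation='relu', padding='valid'),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(16, activation='relu'),         # fully connected layer
    tf.keras.layers.Dense(1, activation='sigmoid'),       # normalized oceanic parameter
])
model.summary()
```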

    2.1 Forward Propagation

For this process, let the current layer be layer $l$ and the layer preceding it be layer $l-1$.

The convolution layer can be obtained as follows:

$$x^{(l)} = \mathrm{conv2}\left(y^{(l-1)},\ \mathrm{rot90}\left(K^{(l)}\right),\ \mathrm{'valid'}\right) + b^{(l)}, \qquad y^{(l)} = f\left(x^{(l)}\right),$$

where $y^{(l-1)}$ represents the original image of the input layer or the output of the pooling layer; $K^{(l)}$ and $b^{(l)}$ are the convolution kernel and the bias of the convolution layer, respectively; $x^{(l)}$ and $y^{(l)}$ are the input and output of the convolution layer, respectively; rot90(·) performs the 180-degree counterclockwise rotation of $K^{(l)}$; conv2(·) performs the two-dimensional convolution of $y^{(l-1)}$ and $K^{(l)}$, where 'valid' returns only those parts of the convolution that are computed without zero-padded edges; and $f(\cdot)$ denotes the activation function.

The pooling layer can be obtained by the following:

$$y^{(l)} = x^{(l)} = \mathrm{down}\left(y^{(l-1)}\right),$$

where $y^{(l-1)}$ represents the output of the first convolution layer; $x^{(l)}$ and $y^{(l)}$ are the input and output of the pooling layer, respectively; and down(·) performs the down-sampling operation on $y^{(l-1)}$.

The fully connected layer and the output layer can be obtained as follows:

$$x^{(l)} = W^{(l)} * y^{(l-1)} + b^{(l)}, \qquad y^{(l)} = f\left(x^{(l)}\right),$$

where $y^{(l-1)}$ represents the output of the second convolution layer or the fully connected layer; $W^{(l)}$ and $b^{(l)}$ are the weight and bias of the fully connected layer or the output layer, respectively; $x^{(l)}$ and $y^{(l)}$ are the input and output of the fully connected layer or the output layer, respectively; and the symbol * denotes multiplication between the two matrices. In the setting $y = y^{(L)}$, $L$ is the total number of layers.

    To prevent vanishing gradients, we use a rectified linear unit (ReLU) function (Nair and Hinton, 2010) as the activation function of the hidden layer. Because the desired output is normalized during data preprocessing, we use the sigmoid function as the activation function of the output layer.
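As an illustration of the forward pass described above, the sketch below implements the convolution, pooling, and fully connected operations with NumPy and SciPy. The mean-pooling block shape and all array sizes are assumptions, since the paper specifies only the operators themselves.

```python
import numpy as np
from scipy.signal import convolve2d

def relu(x):
    # ReLU activation used in the hidden layers (Section 2.1).
    return np.maximum(x, 0.0)

def sigmoid(x):
    # Sigmoid activation used in the output layer.
    return 1.0 / (1.0 + np.exp(-x))

def conv_layer(y_prev, kernel, bias):
    # Convolution layer: 'valid' 2-D convolution of the previous output with the
    # 180-degree-rotated kernel, plus a bias, followed by ReLU.
    x = convolve2d(y_prev, np.rot90(kernel, 2), mode='valid') + bias
    return relu(x)

def mean_pool(y_prev, size=2):
    # Down-sampling by averaging non-overlapping blocks along the row axis
    # (the block shape is an assumption; the paper only states a down() operator).
    rows = y_prev.shape[0] // size
    return y_prev[: rows * size].reshape(rows, size, -1).mean(axis=1)

def dense_layer(y_prev, W, b, activation):
    # Fully connected / output layer: weighted sum of the flattened previous
    # output, plus a bias, followed by an activation function.
    return activation(W @ y_prev.ravel() + b)

# Example forward pass on a random 5x3 input with hypothetical parameter shapes.
rng = np.random.default_rng(0)
image = rng.standard_normal((5, 3))
kernel, bias = rng.standard_normal((2, 2)), 0.1
hidden = mean_pool(conv_layer(image, kernel, bias))
W_out, b_out = rng.standard_normal((1, hidden.size)), np.zeros(1)
output = dense_layer(hidden, W_out, b_out, sigmoid)
```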

    2.2 Backward Propagation

Learning rules are driven by the cost function, which is defined as follows:

$$J = \frac{1}{2}\sum_{k=1}^{N}\left(d_k - y_k\right)^2,$$

where $y_k$ and $d_k$ are the model output and the desired output, respectively, $N$ is the number of data points, and $J$ is the cost function.

According to the gradient descent method, the convolution kernels, weights, and biases are updated as follows:

$$\theta \leftarrow \theta - \alpha\,\frac{\partial J}{\partial \theta},$$

where $\alpha$ is the learning rate and $\theta$ denotes the convolution kernel, the weight, or the bias.

The partial derivatives of $J$ with respect to $W^{(l)}$ and $b^{(l)}$ of the output layer and the fully connected layer can be calculated as follows:

$$\frac{\partial J}{\partial W^{(l)}} = \delta^{(l)}\left(y^{(l-1)}\right)^{\mathrm{T}}, \qquad \frac{\partial J}{\partial b^{(l)}} = \delta^{(l)},$$

where the superscript T represents the transpose of $y^{(l-1)}$ and $\delta^{(l)}$ denotes the delta of the output layer or the fully connected layer. The delta is defined as the element-by-element multiplication of the error and the derivative of the weighted sum:

$$\delta^{(l)} = e^{(l)} \circ f'\left(x^{(l)}\right),$$

where the symbol $\circ$ denotes element-by-element multiplication and $e^{(l)}$ is the error of the output layer:

$$e^{(L)} = d - y,$$

or the error of the fully connected layer:

$$e^{(l)} = \left(W^{(l+1)}\right)^{\mathrm{T}}\delta^{(l+1)}.$$

The partial derivatives of $J$ with respect to $K^{(l)}$ and $b^{(l)}$ of the convolution layer can be calculated as follows:

$$\frac{\partial J}{\partial K^{(l)}} = \mathrm{conv2}\left(y^{(l-1)},\ \mathrm{rot90}\left(\delta^{(l)}\right),\ \mathrm{'valid'}\right), \qquad \frac{\partial J}{\partial b^{(l)}} = \mathrm{sum}\left(\delta^{(l)}\right),$$

where sum(·) performs summation of the elements of $\delta^{(l)}$. The delta of the convolution layer can be obtained using Eq. (11). The error of the second convolution layer can be obtained using Eq. (13). Because down-sampling is performed from the first convolution layer to the pooling layer during forward propagation, up-sampling should be performed from the pooling layer to the first convolution layer during backward propagation. The error of the first convolution layer can be obtained by the following:

$$e^{(l)} = \mathrm{up}\left(\delta^{(l+1)}\right), \tag{16}$$

where up(·) performs the up-sampling operation on $\delta^{(l+1)}$.

The error of the pooling layer can be calculated as follows:

$$e^{(l)} = \mathrm{conv2}\left(\delta^{(l+1)},\ \mathrm{rot90}\left(K^{(l+1)}\right),\ \mathrm{'full'}\right),$$

where 'full' returns the full two-dimensional convolution.
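To make the backward-propagation relations concrete, the following sketch computes the gradients and back-propagated errors for a fully connected layer and a convolution layer with NumPy/SciPy. It follows the relations given above under the assumption of mean pooling along the row axis, and all shapes and names are illustrative rather than the authors' code.

```python
import numpy as np
from scipy.signal import convolve2d

def backprop_dense(delta, y_prev, W):
    # Gradients of a fully connected / output layer and the error passed back to
    # the previous layer (delta is already error * f'(x), as in Section 2.2).
    grad_W = np.outer(delta, y_prev)      # dJ/dW = delta * y_prev^T
    grad_b = delta                        # dJ/db = delta
    error_prev = W.T @ delta              # error propagated to layer l-1
    return grad_W, grad_b, error_prev

def backprop_conv(delta, y_prev, kernel):
    # Gradients of a convolution layer: 'valid' convolution of the previous
    # output with the rotated delta, and a summed bias gradient.
    grad_K = convolve2d(y_prev, np.rot90(delta, 2), mode='valid')
    grad_b = delta.sum()
    # Error passed back through the convolution ('full' mode, as above).
    error_prev = convolve2d(delta, np.rot90(kernel, 2), mode='full')
    return grad_K, grad_b, error_prev

def upsample(delta_pool, size=2):
    # Up-sampling from the pooling layer back to the first convolution layer;
    # each pooled value is spread evenly over the block it averaged (assumption).
    return np.kron(delta_pool, np.ones((size, 1))) / size
```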

    3 Methodology Used to Perform Inversion

The CTD data and the seismic data near the CTD are different responses at the same seawater location, so they have an inherent relationship that can be approximated by the CNN. First, the CTD curves are taken as the desired output data and the seismic traces near the CTDs are taken as the input data. The CNN automatically approximates the inverse mapping operator from the input data to the desired output data. Then, the seismic traces acquired far away from the CTDs are input into the trained network model, which serves as the inverse mapping operator. Finally, the inversion results are output from the network model.
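Continuing the hypothetical Keras sketch from Section 2, this procedure can be expressed in a few lines: train on the attribute images near the CTDs, then apply the trained model to the traces far from the CTDs. The array names, batch size, and loss are illustrative; the learning rate of 0.0001 and the 10000 epochs follow Sections 3.2 and 4.2.

```python
# 'near_ctd_images', 'ctd_values', and 'far_ctd_images' are hypothetical arrays
# of normalized 5x3x1 attribute images and normalized oceanic-parameter values.
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
              loss='mse')                          # quadratic cost, as in Section 2.2
model.fit(near_ctd_images, ctd_values, epochs=10000, batch_size=8, verbose=0)
predicted = model.predict(far_ctd_images)          # inversion far away from the CTDs
```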

    3.1 Input and Output of Convolutional Neural Network

The CTD curves and the seismic traces near the CTDs are taken as the desired output data and the input data, respectively. If only seismic records are taken as input data, the problem of multiple solutions inevitably arises. To solve this problem, stepwise regression is performed to select the optimal attributes and their combination from candidates including the instantaneous frequency, instantaneous amplitude, instantaneous phase, average frequency, dominant frequency, apparent polarity, and the derivative, integral, coordinate, and time attributes. Then, we take the two-dimensional images of the selected attributes as input data instead of the one-dimensional seismic records. The selected attributes are the derivative, time, and x-coordinate attributes, which are located in the left-hand, middle, and right-hand columns of the images, respectively.

Because the frequency of CTD data differs from that of seismic data, their sampling points do not have a one-to-one correspondence but a one-to-many correspondence. We assume that one sampling point in the CTD data corresponds to five adjacent sampling points near the CTD in the seismic data. As shown in Fig.2, we took images with 5×3 pixels as input data and the corresponding target point as the desired output data.
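The sketch below shows how a 5×3 input image could be assembled from the three selected attribute traces, with five adjacent seismic samples as rows and the derivative, time, and x-coordinate attributes as columns; the function and variable names are illustrative.

```python
import numpy as np

def build_input_image(derivative, time_attr, x_coord, center, half_window=2):
    # Rows: 2*half_window + 1 = 5 adjacent seismic samples around the sample that
    # corresponds to one CTD point; columns: derivative, time, x-coordinate.
    rows = slice(center - half_window, center + half_window + 1)
    return np.column_stack([derivative[rows], time_attr[rows], x_coord[rows]])
```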

    3.2 Optimization of Training for Network Model

We divided the samples into training and validation datasets and used cross validation to determine whether the network model was overfitting the data. The root-mean-square error (RMSE) between the model and desired outputs is defined as follows:

$$e = \sqrt{\frac{1}{N}\sum_{k=1}^{N}\left(y_k - d_k\right)^2},$$

where $e$ is the RMSE, $y_k$ and $d_k$ are the model and desired outputs, respectively, and $N$ is the number of data points. If the RMSE of the validation dataset is consistently too large, this indicates that the network model is overfitting.
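A direct NumPy implementation of the RMSE used for cross validation (the function name is ours):

```python
import numpy as np

def rmse(y, d):
    # Root-mean-square error between model outputs y and desired outputs d.
    y, d = np.asarray(y, dtype=float), np.asarray(d, dtype=float)
    return np.sqrt(np.mean((y - d) ** 2))
```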

The training dataset was evenly divided into several groups, and the convolution kernels, weights, and biases were not updated until all the samples in one group had been trained. This optimization method, known as the mini-batch algorithm, has the high efficiency of the stochastic gradient descent (SGD) algorithm and the stability of the batch algorithm. If the training dataset is kept as just one group, the mini-batch algorithm becomes the batch algorithm. If each group has only one sample, the mini-batch algorithm becomes the SGD algorithm. The increments of the convolution kernels, weights, and biases calculated by the SGD and batch algorithms are shown in Eqs. (19) and (20), respectively:

$$\Delta\theta_k = -\alpha\,\frac{\partial J_k}{\partial \theta}, \tag{19}$$

$$\Delta\theta = \frac{1}{N}\sum_{k=1}^{N}\Delta\theta_k, \tag{20}$$

where $\theta$ is the convolution kernel, the weight, or the bias, $\Delta\theta$ is the increment of the convolution kernel, weight, or bias, and $J_k$ is the cost computed for the $k$-th sample.
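The grouping described above can be sketched as follows; setting batch_size to the full training-set size recovers the batch algorithm, and setting it to 1 recovers SGD. The generator below is an illustrative implementation, not the authors' code.

```python
import numpy as np

def minibatches(inputs, targets, batch_size, seed=0):
    # Shuffle the training set (NumPy arrays), then yield it group by group; the
    # convolution kernels, weights, and biases are updated once per yielded group.
    order = np.random.default_rng(seed).permutation(len(inputs))
    for start in range(0, len(inputs), batch_size):
        idx = order[start:start + batch_size]
        yield inputs[idx], targets[idx]
```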

With the mini-batch algorithm, we combined another optimization method, known as Adam (Kingma and Ba, 2015), the name of which is derived from adaptive moment estimation. Adam combines the advantages of the AdaGrad and RMSProp algorithms and works well with sparse gradients as well as in non-stationary settings. In the Adam algorithm, the convolution kernels, weights, and biases are updated as follows:

$$\theta_t = \theta_{t-1} - \alpha\,\frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \varepsilon},$$

where $\theta$ is the convolution kernel, the weight, or the bias; the subscript $t$ is the timestep, incremented as $t = t + 1$, with $t_0 = 0$ as the initialized timestep; $\alpha$ is the learning rate; $\varepsilon$ is a very small value; and $\hat{m}_t$ and $\hat{v}_t$ are the bias-corrected first moment estimate and the bias-corrected second raw-moment estimate, respectively:

$$\hat{m}_t = \frac{m_t}{1 - \beta_1^{\,t}}, \qquad \hat{v}_t = \frac{v_t}{1 - \beta_2^{\,t}},$$

where $\beta_1$ and $\beta_2$ are the exponential decay rates for the respective moment estimates, and $m_t$ and $v_t$ are the biased first moment estimate and the biased second raw-moment estimate, respectively:

$$m_t = \beta_1 m_{t-1} + \left(1 - \beta_1\right) g_t, \qquad v_t = \beta_2 v_{t-1} + \left(1 - \beta_2\right) g_t^{2},$$

where $m_0 = 0$ and $v_0 = 0$ are the initialized first moment vector and the initialized second raw-moment vector, respectively, and $g_t$ is the gradient with respect to the stochastic objective function, namely the vector of partial derivatives of $f(\theta)$ with respect to $\theta$ evaluated at timestep $t$:

$$g_t = \nabla_{\theta} f_t\left(\theta_{t-1}\right),$$

where $f(\theta)$ denotes the stochastic objective function with parameters $\theta$. Good default settings for the tested machine learning problems are $\alpha = 0.001$, $\beta_1 = 0.9$, $\beta_2 = 0.999$, and $\varepsilon = 10^{-8}$. In this study, however, $\alpha = 0.0001$, because the model output will fluctuate around the true solution when $\alpha = 0.001$. All operations on vectors are element-wise.
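The Adam update written out in NumPy, using the paper's learning rate of 0.0001 and the standard default decay rates; theta and grad stand for any convolution kernel, weight, or bias and its gradient, and the function name is ours.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, alpha=1e-4, beta1=0.9, beta2=0.999, eps=1e-8):
    # One Adam update: biased moment estimates, bias correction, then the step
    # (t starts at 1 for the first update).
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    m_hat = m / (1.0 - beta1 ** t)        # bias-corrected first moment estimate
    v_hat = v / (1.0 - beta2 ** t)        # bias-corrected second raw-moment estimate
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```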

    4 Field Data

A post-stack seismic profile (Fig.3) with 1000 traces was acquired from the East China Sea, and three CTDs were dropped near the seismic traces numbered 85, 471, and 854. Because of its low vertical resolution, Fig.3 shows only the strong reflection interfaces, and it is difficult to distinguish vertical variations in the ocean water from it. Fig.4 shows the temperature, salinity, density, and velocity curves plotted using data from CTD-1, CTD-2, and CTD-3, in which we can see that temperature and velocity decrease with depth, whereas salinity and density increase with depth.

Figs.5, 6, and 7 show the selected derivative, time, and x-coordinate attributes obtained using stepwise regression. The derivative attribute is the nonlinear transformation of the seismic data, which records the differences in amplitude values between adjacent sampling points. The frequency of seismic data is much lower than that of the CTD data, but the frequency difference between them decreases significantly because derivative processing enhances the high-frequency components, which improves the correlation between the seismic and CTD data. Use of the coordinate attribute can reduce the multiplicity of solutions and improve generalizability. Because the CTD data are converted from the depth domain to the time domain, time takes the place of the depth coordinate.

CNNs tend to overfit when the number of samples is small. To avoid overfitting, we obtained six virtual CTDs by linear interpolation between the three actual CTDs to supply the CNN with more samples. For testing, we used CTD-2 and the seismic attributes extracted from the seismic trace numbered 471, and we used the other eight pairs of inputs and outputs for training.
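A sketch of how virtual CTD profiles could be generated by linear interpolation between two measured profiles follows; how the six virtual CTDs were distributed between the three real ones is not stated, so the function simply produces n evenly spaced profiles between a pair of profiles sampled on the same vertical grid.

```python
import numpy as np

def interpolate_ctds(ctd_left, ctd_right, n_virtual):
    # Linearly interpolate n_virtual profiles between two CTD parameter profiles.
    weights = np.linspace(0.0, 1.0, n_virtual + 2)[1:-1]
    return [(1.0 - w) * np.asarray(ctd_left) + w * np.asarray(ctd_right)
            for w in weights]
```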

    4.1 Data Preprocessing

CTD and seismic data are in different domains, so we converted the CTD data to the time domain using the following depth-time conversion equation:

$$t_i = t_0 + \sum_{j=2}^{i} \frac{2\left(d_j - d_{j-1}\right)}{v_j},$$

where $v_j$ is the interval velocity, $d_j$ is the depth for each interval velocity, $t_0$ is the two-way vertical time to $d_1$, and $t_i$ is the two-way vertical time corresponding to $d_i$.

Because of the differences in the dimensions and units of the seismic attributes and the oceanic parameters, it is difficult for the CNN to converge if these attributes and parameters are used directly for training. Therefore, we normalized the seismic attributes and the oceanic parameters using Eqs. (28) and (29), respectively:

$$x' = \frac{x}{\max\left(\mathrm{abs}\left(x\right)\right)}, \tag{28}$$

$$x' = \frac{x - \min\left(x\right)}{\max\left(x\right) - \min\left(x\right)}, \tag{29}$$

    Fig.4 Oceanic parameters measured by CTD-1 ((a) and (d)), CTD-2 ((b) and (e)), and CTD-3 ((c) and (f)), which are located near the seismic traces numbered 85, 471, and 854, respectively. The blue, green, red, and cyan lines denote the temperature, salinity, density, and velocity curves, respectively.

    Fig.5 Derivative attribute profile.

    Fig.6 Time attribute profile.

    Fig.7 x-coordinate attribute profile.

where abs(·) represents the absolute value of the elements of $x$, and max(·) and min(·) return the largest and smallest elements of $x$, respectively.
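Under the reading of Eqs. (28) and (29) given above (maximum-absolute-value scaling for the signed seismic attributes, min-max scaling for the oceanic parameters), the preprocessing and the inverse transformation applied later in Section 4.2 can be sketched as:

```python
import numpy as np

def normalize_attribute(x):
    # Scale a seismic attribute by its largest absolute value, preserving sign.
    return x / np.max(np.abs(x))

def normalize_parameter(x):
    # Min-max scale an oceanic parameter into [0, 1], matching the sigmoid output.
    return (x - np.min(x)) / (np.max(x) - np.min(x))

def denormalize_parameter(x_norm, x_min, x_max):
    # Inverse transformation applied to the model output after prediction.
    return x_norm * (x_max - x_min) + x_min
```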

    4.2 Inversion of Oceanic Parameters Represented by CTD

Four network models were established for inverting temperature, salinity, density, and velocity. After 10000 epochs, the four network models finished training because the error tolerance had been reached, as shown in Fig.8.

A series of images containing 5×3 pixels was then generated from the three attribute profiles shown in Figs.5, 6, and 7. These images were then input into the four trained network models in turn. Finally, the predicted temperature, salinity, density, and velocity profiles were quickly output, as shown in Figs.9–12, respectively.

Figs.3 and 9–12 show different responses to the same sea water: Fig.3 reflects the variation of impedance, whereas Figs.9–12 reflect the respective variations of temperature, salinity, density, and velocity. The vertical resolutions of Figs.9–12 are improved compared with that of Fig.3.

    Fig.8 Error variation with epoch. The blue, green, red, and cyan lines represent the error curves of temperature, salinity, density, and velocity, respectively.

Since the desired outputs were normalized in data preprocessing, the model outputs were then post-processed using the following inverse transformation equation:

$$x = x'\left[\max\left(x\right) - \min\left(x\right)\right] + \min\left(x\right).$$

Fig.9 Temperature profile predicted by convolutional neural network.

Fig.10 Salinity profile predicted by convolutional neural network.

    Fig.11 Density profile predicted by convolutional neural network.

    Fig.12 Velocity profile predicted by convolutional neural network.

The accuracy of the inversion results can be determined by comparing the model outputs with the desired outputs at the location of CTD-2. Fig.13 shows fold plots of the model and desired outputs, in which we can see that the predicted and actual oceanic parameters match well and that the Pearson correlation coefficients of the four oceanic parameters exceed 90%. Therefore, we conclude that the proposed method generalizes well and accurately predicts temperature, salinity, density, and velocity.
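The agreement at CTD-2 can be quantified with the Pearson correlation coefficient, for example:

```python
import numpy as np

def pearson_r(predicted, actual):
    # Pearson correlation coefficient between predicted and measured profiles.
    return np.corrcoef(np.asarray(predicted), np.asarray(actual))[0, 1]
```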

    Fig.13 Fold plots of the predicted and actual oceanic parameters in the location of CTD-2. The blue solid and red dotted lines denote the desired and model outputs, respectively.

    5 Conclusions

It is difficult for traditional seismic inversion methods to establish mapping relationships between oceanic parameters and seismic responses by mathematical modeling. In this paper, we presented a joint CTD-seismic inversion method based on a CNN, which is purely data-driven. Using deep learning, the CNN can automatically approximate the inverse mapping operator from the input data to the desired output data. To reduce the multiplicity of solutions, we utilized stepwise regression to select the best attributes and their combination and took the two-dimensional images of the selected attributes as input data. The derivative attribute was shown to improve the correlation between the seismic and CTD data, and the coordinate attribute to reduce the multiplicity of solutions and improve generalizability. The inversion results of our field data demonstrate that the proposed method can accurately predict oceanic parameters.

Joint CTD-seismic inversion based on a CNN is data-driven, which gives it bright prospects for the future, although a number of problems remain to be solved. The most prominent among them is the multiplicity of solutions. Existing approaches to reducing the multiplicity of solutions basically involve increasing the number of samples; however, the resulting trained network model is still only applicable to local areas. Further reducing the multiplicity of solutions and improving generalizability therefore remains an urgent problem to be addressed.

    Acknowledgements

This research was jointly funded by the National Key Research and Development Program of China (No. 2017YFC0307401), the National Natural Science Foundation of China (No. 41230318), the Fundamental Research Funds for the Central Universities (No. 201964017), and the National Science and Technology Major Project of China (No. 2016ZX05024-001-002).

Acharya, U. R., Fujita, H., Oh, S. L., Raghavendra, U., Tan, J. H., Adam, M., Gertych, A., and Hagiwara, Y., 2018. Automated identification of shockable and non-shockable life-threatening ventricular arrhythmias using convolutional neural network, 79: 952-959.

Das, V., Pollack, A., Wollner, U., and Mukerji, T., 2018. Convolutional neural network for seismic impedance inversion. Anaheim, 2071-2075.

Dramsch, J. S., and Lüthje, M., 2018. Deep learning seismic facies on state-of-the-art CNN architectures. Anaheim, 2036-2040.

Duan, X. D., Zhang, J., Liu, Z. Y., Liu, S., Chen, Z. B., and Li, W. P., 2018. Integrating seismic first-break picking methods with a machine learning approach. Anaheim, 2186-2190.

Guo, B. W., Liu, L., and Luo, Y., 2018. A new method for automatic seismic fault detection using convolutional neural network. Anaheim, 1951-1955.

Hollander, Y., Merouane, A., and Yilmaz, O., 2018. Using a deep convolutional neural network to enhance the accuracy of first break picking. Anaheim, 4628-4632.

Hornik, K., Stinchcombe, M., and White, H., 1990. Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks, 3 (5): 551-560.

Kamilaris, A., and Prenafeta-Boldú, F. X., 2018. Deep learning in agriculture: A survey, 147: 70-90.

Kingma, D. P., and Ba, J. L., 2015. Adam: A method for stochastic optimization. San Diego, 1-15.

Li, B., and Hu, X., 2018. Effective vehicle logo recognition in real-world application using mapreduce based convolutional neural networks with a pre-training strategy, 34 (3): 1985-1994.

Li, Y., Huang, Y., and Zhang, M., 2018. Short-term load forecasting for electric vehicle charging station based on niche immunity lion algorithm and convolutional neural network, 11 (5): 1253.

Liu, F., Zhou, Z., Jang, H., Samsonov, A., Zhao, G., and Kijowski, R., 2017. Deep convolutional neural network and 3D deformable approach for tissue segmentation in musculoskeletal magnetic resonance imaging, 79 (4): 2379-2391.

Ma, Y., Ji, X., BenHassan, N. M., and Luo, Y., 2018. A deep learning method for automatic fault detection. Anaheim, 1941-1945.

Nair, V., and Hinton, G. E., 2010. Rectified linear units improve restricted Boltzmann machines. Omnipress, 807-814.

Qiu, Z., Chen, J., Zhao, Y., Zhu, S., He, Y., and Zhang, C., 2018. Variety identification of single rice seed using hyperspectral imaging combined with convolutional neural network, 8 (2): 212.

Shen, Y., Sun, M. Y., Zhang, J., Liu, S., Chen, Z. B., and Li, W. P., 2018. Seismic trace editing by applying machine learning. Anaheim, 2256-2260.

Teimouri, N., Dyrmann, M., Nielsen, P., Mathiassen, S., Somerville, G., and Jørgensen, R., 2018. Weed growth stage estimator using deep convolutional neural networks, 18 (5): 1580.

Venkatesan, R., and Li, B. X., 2018. CRC Press, Boca Raton, 1-183.

Wachinger, C., Reuter, M., and Klein, T., 2018. DeepNAT: Deep convolutional neural network for segmenting neuroanatomy, 170: 434-445.

Wang, Q., Gao, J., and Yuan, Y., 2018a. A joint convolutional neural networks and context transfer for street scenes labeling, 19 (5): 1457-1470.

Wang, W. L., Yang, F. S., and Ma, J. W., 2018b. Velocity model building with a modified fully convolutional network. Anaheim, 2086-2090.

Wu, X. M., Shi, Y. Z., Fomel, S., and Liang, L. M., 2018a. Convolutional neural networks for fault interpretation in seismic images. Anaheim, 1946-1950.

Wu, Y., Lin, Y. Z., and Zhou, Z., 2018b. InversionNet: Accurate and efficient seismic waveform inversion with convolutional neural networks. Anaheim, 2096-2100.

Yuan, S. Y., Liu, J. W., Wang, S. X., Wang, T. Y., and Shi, P. D., 2018. Seismic waveform classification and first-break picking using convolution neural networks, 15 (2): 272-276.

Zhao, T., 2018. Seismic facies classification using different deep convolutional neural networks. Anaheim, 2046-2050.

* Corresponding author. E-mail: zhjmeteor@163.com

(Received January 30, 2019; revised May 10, 2019; accepted May 6, 2020)

    (Edited by Chen Wenwen)
