
    Physics-constrained indirect supervised learning


Yuntian Chen, Dongxiao Zhang*

    a Intelligent Energy Lab, Frontier Research Center, Peng Cheng Laboratory, Shenzhen 518000, China

    b School of Environmental Science and Engineering, Southern University of Science and Technology, Shenzhen 518055, China

Keywords: Supervised learning; Indirect label; Physics constrained; Physics informed; Well logs

ABSTRACT This study proposes a supervised learning method that does not rely on labels. We use variables associated with the label as indirect labels, and construct an indirect physics-constrained loss based on the physical mechanism to train the model. In the training process, the model prediction is mapped through a projection matrix to the space of values that conforms to the physical mechanism, and the model is then trained based on the indirect labels. The final prediction result of the model conforms to the physical mechanism between the indirect labels and the label, and also meets the constraints of the indirect labels. The present study also develops projection matrix normalization and prediction covariance analysis to ensure that the model can be fully trained. Finally, the effectiveness of physics-constrained indirect supervised learning is verified on a well log generation problem.

In machine learning, supervised learning usually requires the specific output value (label) of each data point [1,2]. However, labels are sometimes difficult to obtain in practice, and this dependence on labeling has restricted the application of supervised learning in practical engineering problems.

The essence of algorithms and models is to extract information from the training data, and supervised learning indicates that the information is extracted from data with labels. When training a model, a specific label is not necessarily required as the output of the model [3]. In fact, only information describing the distribution of the output is needed. This is similar to the fact that humans are able to learn without direct labels, and only need a description of what an output should be [3]. Conventional supervised learning is a completely data-driven method, in which the labels simply and directly describe the distribution of the output. However, the information used to train the model can come not only from the data itself, but also from a priori information, such as physical mechanisms [4–7]. In this study, the motivation of indirect supervised learning is to combine the information of physical mechanisms and data. We show, in the following sections, that it is possible to perform supervised learning based on indirect variables and prior information without relying on specific labels.

    Specifically, it can be assumed that the input variables of the model are represented by X, the output variables to be predicted are Y, and a mapping relationship exists between X and Y defined by function f(·). In conventional supervised learning, the model is directly trained with the data distribution of both X and Y. Nevertheless, if we can find variables H, and there is a mapping g(·) between H and Y, then we can also attempt to train a model that predicts Y based on the data distribution of X and H combined with the information of the mapping function g(·) that holds over the output space. In other words, H is the indirect label used by the model. The information of g(·) is utilized to ensure that the output of the neural network conforms to the mapping relationship between H and Y, and then Y is substituted by H to train the model. The relationship between X, Y, and H is shown in Fig. 1.

In order to introduce indirect supervised learning more intuitively, the present study takes the problem of synthetic well log generation as an example. Previous studies have indicated that neural networks can generate well logs [8–11]. Zhang et al. [12] utilized a long short-term memory (LSTM) network, which is capable of processing sequential data, and its performance is better than that of conventional neural networks. Here, we predict the uniaxial compressive strength (UCS) directly from conventional well logs, including depth, density, resistivity, and gamma ray, via an LSTM. Because UCS does not exist in the dataset, we cannot directly use it as a label to construct a training dataset for supervised learning. However, if sonic logs are available, we can employ them as indirect labels, since a mapping relationship exists between the sonic logs and the UCS. Therefore, the sonic logs can be used as indirect labels and the conventional logs can be taken as inputs to directly train a model to predict the UCS based on physical mechanisms. It should be mentioned that, although the aforementioned method belongs to supervised learning, it does not utilize the label (UCS) directly, but is based on indirect labels, and thus this method is called physics-constrained indirect supervised learning.

Indirect labels cannot be used directly to train the model. The mapping relationship between the sonic logs and the UCS can be split into non-linear and linear parts according to the physical mechanism. Firstly, the dynamic Young's modulus (E_dyn) can be obtained by Eq. (1) based on the sonic logs (indirect labels) and the density log (input), where c represents a unit-conversion constant, ρ is the density, and Δt_s and Δt_p are the shear and compressional sonic transit times, respectively [13]:
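A standard form of this elasticity relation, consistent with the symbols defined above (the exact variant and constant used in Ref. [13] may differ), is

$$E_{\mathrm{dyn}} = \frac{c\,\rho}{\Delta t_{s}^{2}} \cdot \frac{3\,\Delta t_{s}^{2} - 4\,\Delta t_{p}^{2}}{\Delta t_{s}^{2} - \Delta t_{p}^{2}} \qquad (1)$$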

When calculating the UCS, the static Young's modulus E_stat is usually used instead of E_dyn. E_stat is normally obtained through experiments, but it can also be estimated from E_dyn, as shown in Eq. (2) [13]. Finally, the UCS can be calculated by Eq. (3) [14]:
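Both Eqs. (2) and (3) are empirical linear correlations; schematically (the specific coefficient values given in Refs. [13,14] are not reproduced here),

$$E_{\mathrm{stat}} = \alpha\,E_{\mathrm{dyn}} + \beta \qquad (2)$$

$$\mathrm{UCS} = \gamma\,E_{\mathrm{stat}} + \delta \qquad (3)$$

where α, β, γ, and δ are empirical coefficients, so that the UCS is an affine function of E_dyn.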

    Fig. 1. Relationship between inputs, labels, and indirect labels.

In this study, the model is trained through physical mechanisms based on indirect labels. Therefore, based on Eqs. (1)–(3), the problem can be expressed in matrix form, as shown in Eq. (4), where A ∈ R^{n×2} is constructed based on the dynamic Young's modulus, Y represents the UCS, X here denotes the vector of unknown linear coefficients (distinct from the model inputs in Fig. 1), and n is the number of training data (the product of the batch size and the training data length):
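Because the composition of Eqs. (2) and (3) is affine in E_dyn, one natural construction (our reading of the text) places the E_dyn value of each training point and a constant 1 in the rows of A:

$$Y = AX, \qquad A = \begin{bmatrix} E_{\mathrm{dyn},1} & 1 \\ \vdots & \vdots \\ E_{\mathrm{dyn},n} & 1 \end{bmatrix} \in \mathbb{R}^{n \times 2} \qquad (4)$$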

Because n is usually much greater than 2 in practice, the matrix A is strictly skinny and Eq. (4) is overdetermined, which means that it is generally impossible to find a solution that fits all of the data points exactly. In other words, there is no guarantee that the predictions are all accurate, and there will be nonzero residuals. Therefore, we define the residual r = AX − Y and attempt to approximately solve Y = AX by finding an X* that minimizes the residual. To minimize it, we calculate the gradient of the squared residual with respect to X, as shown in Eq. (5):
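In standard least-squares form, writing the squared residual as ‖AX − Y‖², the gradient is

$$\nabla_{X}\,\lVert AX - Y \rVert^{2} = 2A^{\mathrm{T}}(AX - Y) \qquad (5)$$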

It is known that when the residual is at its minimum, the gradient must be zero. Since the matrix A^T A is usually invertible in practice, X* can be obtained as X* = (A^T A)^{-1} A^T Y. Based on X* and Eq. (4), the relationship between a given prediction value Y and its projection Y* on range(A) can be constructed, as shown in Eq. (6). The matrix P = A(A^T A)^{-1} A^T is called the projection matrix [15], which is the key to indirect label supervised learning. Through Eq. (6), the predicted value can be projected to the numerical space (range(A)) that conforms to the given physical mechanism:
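Explicitly, from the definitions above,

$$Y^{*} = PY = A(A^{\mathrm{T}}A)^{-1}A^{\mathrm{T}}\,Y \qquad (6)$$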

Equation (6) and the projection matrix P have clear physical meanings. Minimizing the residual is equivalent to finding the projection of Y on range(A), i.e., the point in range(A) closest to Y, as shown in Fig. 2. Range(A) is the solution space determined by Eqs. (1)–(3): the set of points that conform to the physical mechanism, shown as a red line. If the prediction result f(x) of the neural network is taken as Y and substituted into Eq. (6), then Y* is the point in range(A) closest to the prediction of the neural network. Therefore, in essence, the physical meaning of Eq. (6) is to find, among all points that meet the given physical constraints, the point closest to the predicted value of the neural network. The physical constraints used in this process are determined by prior knowledge and the indirect labels.

In order to make the prediction result of the neural network comply with the physical constraints, a new loss function is constructed based on Eq. (6). The proposed loss function trains the neural network through physical constraints based on known indirect labels, as shown in Eq. (7). This loss is called the indirect physics-constrained loss, and it measures the difference between the predicted value Y of the neural network and its projection Y* in the physics-constrained space. By iteratively optimizing the neural network with this loss, the final prediction of the network is guaranteed to meet the given physical constraints (i.e., the difference between the projected value Y* and the predicted value Y is minimized). In addition, the indirect physics-constrained loss is a function of the predicted value of the neural network and does not require the value of the UCS, which reflects the meaning of indirect label supervision.
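In squared-error form, Eq. (7) presumably reads L = (1/n)‖Y* − Y‖² with Y* = PY (our reconstruction). As a concrete illustration, the following is a minimal PyTorch sketch under the assumed construction A = [E_dyn, 1]; the function and variable names are illustrative, not code from the original work:

```python
import torch

def indirect_physics_loss(y_pred, e_dyn):
    """Sketch of the indirect physics-constrained loss (Eq. (7)).

    y_pred: network predictions of the UCS, flattened to shape (n,)
    e_dyn:  dynamic Young's modulus computed from the indirect labels
            (sonic logs) and the density log via Eq. (1), shape (n,)
    """
    # Assumed construction: range(A) = all affine transforms of E_dyn.
    A = torch.stack([e_dyn, torch.ones_like(e_dyn)], dim=1)  # (n, 2)
    P = A @ torch.inverse(A.T @ A) @ A.T                     # projection matrix, Eq. (6)
    y_star = P @ y_pred                                      # projection Y* of Y onto range(A)
    # Squared distance between the prediction and its projection.
    return torch.mean((y_star - y_pred) ** 2)
```

Note that the loss depends only on the network prediction and the indirect labels; the UCS itself never appears.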

    In practical application of the indirect physics-constrained loss, two challenges exist that may cause neural network training to fail. The first challenge is that the projection matrix will change the mean and variance of the distribution of the predictions, and the second challenge is the symmetry of the indirect physics-constrained loss. The above two challenges can be solved by projection matrix normalization and prediction covariance analysis, respectively.

Regarding the first challenge, although the distribution of the projected value Y* will meet the physical constraints (range(A)), the projection matrix cannot guarantee that the projected value Y* and the model prediction Y are at the same scale. In other words, the Y* obtained by the projection matrix conforms to the physical constraints in its trend, but its mean and variance differ from those of the neural network prediction Y. In practice, the Y* calculated from the projection matrix is often smaller than Y, which means that the loss function tends to reduce the scale of the model output. This causes the predicted value of the neural network to gradually decrease as the iterations progress, eventually approaching 0. The small prediction value in the loss function, in turn, makes the loss value extremely small, so that the loss function cannot effectively provide an optimization direction, resulting in training failure.

The idea to solve this problem is similar to that of the batch normalization layer, i.e., to perform a normalization after each iteration. This operation ensures that both the projected value and the predicted value are within a reasonable range, which is conducive to accurately calculating the loss of the model prediction. Specifically, in each iteration, the projected Y* that conforms to the physical mechanism is first normalized, and then its difference from the predicted value of the neural network is calculated and used as the indirect physics-constrained loss, as sketched below. Normalizing the projection result is therefore conducive to model convergence, and solves the first challenge.
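Under the same assumptions and naming as the previous snippet, the normalization amounts to standardizing Y* before the difference is taken:

```python
import torch

def normalized_indirect_loss(y_pred, e_dyn, eps=1e-8):
    """Indirect physics-constrained loss with projection matrix normalization.

    Standardizing the projected value Y* keeps it on the same scale as the
    network prediction Y, preventing the outputs from collapsing toward 0.
    """
    A = torch.stack([e_dyn, torch.ones_like(e_dyn)], dim=1)
    P = A @ torch.inverse(A.T @ A) @ A.T
    y_star = P @ y_pred
    y_star = (y_star - y_star.mean()) / (y_star.std() + eps)  # normalization step
    return torch.mean((y_star - y_pred) ** 2)
```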

Fig. 2. Geometric interpretation of the projection on range(A).

The second challenge is the symmetry of the indirect physics-constrained loss. This symmetry comprises two parts. On the one hand, the projection matrix based on the indirect label (E_dyn) is symmetric in the indirect label, i.e., P(E_dyn) ≡ P(−E_dyn). This means that when the indirect label is flipped along the x-axis, the projection result of the neural network prediction does not change. In other words, the model obtained based on the indirect physics-constrained loss has symmetric equivalent solutions. On the other hand, the indirect physics-constrained loss is symmetric in the model prediction, i.e., g(f(x)) ≡ g(−f(x)). This is because all terms in the indirect physics-constrained loss are calculated from f(x); when f(x) is negated, the projection result is also negated, so the residual between them changes sign while the loss itself remains unchanged. Because the indirect physics-constrained loss exhibits these two symmetries, if the neural network is trained directly with the indirect physics-constrained loss (Eq. (7)), there is a 50% probability of obtaining a solution whose trend is axisymmetric to the real result, as shown in Fig. 3. To assess the effect of randomness, 50 independent repeated experiments were performed. The black curve is the real result, and the red and blue curves are the final results obtained in the independent repeated experiments. Each of the two solutions occurred exactly 25 times, which indicates that the probability of each solution is approximately 50%. The experimental results are consistent with the theoretical analysis.
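Both symmetries follow from the linearity of P. For the second one, assuming the squared-residual form of the loss reconstructed above,

$$\lVert P(-f(x)) - (-f(x)) \rVert^{2} = \lVert -\left(Pf(x) - f(x)\right) \rVert^{2} = \lVert Pf(x) - f(x) \rVert^{2}$$

and the standardization of Y* preserves this symmetry, since negating Y* negates its mean but leaves its standard deviation unchanged.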

This challenge can be solved through covariance analysis. The motivation is that, although the symmetry of the indirect physics-constrained loss yields two symmetric equivalent solutions, the correlation between the true solution and the indirect labels should match the correlation between the variable to be predicted and the indirect labels. Specifically, since we know the values of the indirect labels (sonic logs) and the mapping relationship g(·) between the indirect labels and the parameter to be predicted (UCS) from the physical mechanism, we know whether the indirect labels are positively or negatively related to the variable to be predicted. At the same time, the covariance between the model prediction and the indirect labels can be directly calculated. Therefore, according to the sign of this covariance, the two equivalent solutions can be divided into a positive and a negative solution: if the variable to be predicted is positively correlated with the indirect label, the positive solution is the true solution; otherwise, the negative solution is the true solution. Through covariance analysis, the true solution among the equivalent solutions of the model can be quickly determined, and the multiple-solution problem caused by the indirect physics-constrained loss is solved.
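A minimal sketch of this selection rule, with illustrative names (the a priori sign of the correlation is assumed to be known from g(·)):

```python
import torch

def covariance_sign_correction(y_pred, h, positive_corr=True):
    """Prediction covariance analysis (sketch): pick the true solution among
    the two symmetric equivalents by the sign of cov(prediction, indirect label).

    positive_corr: True if the physical mechanism g implies that the label
    increases with the indirect label h, False otherwise.
    """
    cov = torch.mean((y_pred - y_pred.mean()) * (h - h.mean()))
    if bool(cov > 0) != positive_corr:
        y_pred = -y_pred  # flip to the physically consistent branch
    return y_pred
```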

In order to verify the feasibility of physics-constrained indirect supervised learning, a synthetic well log generation experiment was performed in this study. The experimental data come from 39 wells in North Dakota in the United States, including the true vertical depth log, density log, resistivity log, gamma ray log, and sonic logs [16]. The UCS, a kind of geomechanical log, is predicted, but it does not exist in the data set. The depth, density, resistivity, and gamma ray logs are used as model inputs. The indirect labels are the sonic logs (Delta-T shear, Δt_s, and Delta-T compressional, Δt_p). The entire model does not require UCS values. Instead, it employs the mapping relationship between the sonic logs (indirect labels) and the UCS (label) as prior information, and uses the indirect physics-constrained loss to train the neural network.

Fig. 3. Symmetric equivalent solutions obtained with the indirect physics-constrained loss in 50 independent repeated experiments.

In physics-constrained indirect supervised learning, projection matrix normalization and prediction covariance analysis are used to ensure the convergence of the model. In this study, an LSTM is utilized as the prediction model to generate synthetic well logs, since it possesses the advantage of processing long-term dependent sequential data. The data set contains a total of 39 wells. Each experiment randomly selects 27 wells (70%) as the training data set and 12 wells (30%) as the test data set. Each experiment was repeated 10 times to avoid the impact of randomness. The hyperparameters involved in the experiment include the batch size, the training data length, and whether or not to add a batch normalization layer.

In general, a larger batch size helps each batch better represent the distribution of the whole data set, but a batch size that is too large also reduces the randomness of the training process, making it easier to fall into a local optimum. The training data length is mainly related to the stratum: the longer the training data length, the greater the stratum thickness corresponding to each training sample, but the higher the computational complexity. The projection matrix in indirect label supervised learning involves inverting the matrix A^T A, where the number of rows of A is n, the product of the batch size and the training data length, and the projection matrix P itself is n × n. Therefore, an excessively large batch size and training data length may cause insufficient graphics processing unit (GPU) memory.
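For instance (our arithmetic, for scale only): with the default batch size of 128 and training data length of 200 used below, n = 128 × 200 = 25,600, so the projection matrix P ∈ R^{n×n} has about 6.6 × 10^8 entries, i.e., roughly 2.6 GB in single precision, before counting any intermediate buffers.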

We optimized the hyperparameters of the model separately. The default setting is a batch size of 128 and a training data length of 200, with the batch normalization layer. Firstly, batch sizes of 32, 64, and 128 are evaluated, and the results are shown in Fig. 4a. The height of the bars in the histogram corresponds to the average mean squared error (MSE) predicted by the indirect supervised learning model, and the error bar corresponds to the standard deviation of the MSE over 10 repeated experiments. It can be seen that, as the batch size increases, the model prediction accuracy increases. Secondly, the model performance for training data lengths of 1, 20, 50, 100, 150, and 200 is evaluated, as shown in Fig. 4b. The final result reveals that the best training data length for this problem is 150. It is worth mentioning that the model performs poorly when the training data length is 1, mainly because an LSTM with a sequence length of 1 is equivalent to a fully connected neural network. This result also reflects the advantages of the LSTM over conventional neural networks, which is consistent with Ref. [12]. Finally, the effect of the batch normalization layer is examined, as shown in Fig. 4c. Accordingly, in the subsequent experiments, the batch size and the training data length are set to 128 and 150, respectively, and the model uses a batch normalization layer.

    In order to show the performance of the indirect supervised learning more intuitively, the prediction result of one randomly extracted well is plotted, as shown in Fig. 5. The blue curve is the ground truth of the UCS, the orange curve represents the model prediction, and the red boxes show the enlarged details. One can see that the model based on the physics-constrained indirect label and the physical mechanism can accurately predict the trend of the UCS log, and the prediction result is close to the ground truth. In well log interpretation, experts mainly refer to the trend of well logs, and thus the prediction result of this model meets the needs in practice.

This study uses the physical mechanism between variables as a priori information for supervised learning. Conventional supervised learning requires information about the variable to be predicted. However, we propose that this information can come not only directly from the variable itself (the label), but also from the physical mechanism and other related variables (indirect labels). Therefore, we describe the variable to be predicted, and perform supervised learning, through other variables together with the mapping relationship between those variables and the variable to be predicted.

In order to achieve this goal, we split the mapping relationship between the indirect labels and the label into linear and non-linear parts, and construct a projection matrix to project the prediction result of the model into the range of values that conforms to the physical mechanism. This study then proposes an indirect physics-constrained loss, which does not require the variable to be predicted. In addition, projection matrix normalization and prediction covariance analysis are utilized to ensure that the model can be fully trained. Finally, through iterative optimization of the indirect physics-constrained loss, the model ensures that the prediction and the indirect labels conform to the given mapping relationship (physical mechanism), and accurately predicts the unknown label.

    Fig. 4. Histogram of MSE with different hyperparameters.

    Fig. 5. Model prediction and ground truth of the UCS of the first test well in the last set of the experiments.

In the experiment section, the feasibility of physics-constrained indirect supervised learning is verified based on the well logs of 39 wells. We successfully predicted the UCS, which is not included in the training data, by using sonic logs as indirect labels and introducing the physical mechanism as a priori information into the loss function. This study shows that a loss function can be constructed based on the physical mechanism between the indirect labels and the variable to be predicted, and that it is possible to train a model by supervised learning based on indirect labels when the training data do not contain the variable to be predicted.

It should be mentioned that a naïve method could also be used: first learn and predict the indirect labels (sonic logs) with a neural network, and then calculate the desired variable (UCS) from the physical mechanism. This naïve method is similar to physics-constrained indirect supervised learning and likewise does not require the UCS as training data, but differences still exist between the two methods.

Firstly, physics-constrained indirect supervised learning provides a new modeling idea. When training the neural network, prior information is also used as a form of supervision in the loss function, and thus the supervised variable of the model is not the same as the desired variable. In the naïve method, however, a label is still required to train the model, which means that the supervised variable coincides with the desired variable. Therefore, the two methods differ at the methodological level.

Secondly, physics-constrained indirect supervised learning has the potential to further reduce the need for model constraint information: (1) when some of the parameters in the physical mechanism (e.g., the empirical coefficients in Eqs. (2) and (3)) are unknown, the desired variable cannot be calculated by the naïve method; however, since the projection matrix in physics-constrained indirect supervised learning does not depend on these parameters, it is still theoretically possible to predict the desired variable; (2) because the physical mechanism directly constrains the model's value space, the proposed model may converge faster and demand less training data; and (3) because the physical mechanism is integrated in the loss function, the same mechanism can be applied to different data sets, and thus the proposed model may offer superior generality. These potential advantages and applications of physics-constrained indirect supervised learning will be further studied in the future.

    Acknowledgements

    This work is partially funded by the National Natural Science Foundation of China (Grants 51520105005 and U1663208). The authors are grateful to Mr. Yuanqi Cheng for his assistance with data preparation, figure generation, and constructive discussions during the course of this work.
