
    Predicted Oil Recovery Scaling-Law Using Stochastic Gradient Boosting Regression Model

Computers, Materials & Continua, 2021, No. 8

Mohamed F. El-Amin, Abdulhamit Subasi, Mahmoud M. Selim and Awad Mousa

1 Energy Research Lab., College of Engineering, Effat University, Jeddah, 21478, KSA

2 Artificial Intelligence and Cyber Security Research Lab., College of Engineering, Effat University, Jeddah, 21478, KSA

3 Department of Mathematics, College of Sciences & Humanities in Al-Aflaj, Prince Sattam bin Abdulaziz University, Al-Aflaj, 11912, KSA

4 Department of Physics, College of Science & Humanities in Al-Aflaj, Prince Sattam bin Abdulaziz University, Al-Aflaj, 11912, KSA

5 Mathematics Department, Faculty of Science, Aswan University, Aswan, 81528, Egypt

Abstract: In the oil recovery process, experiments are usually carried out on core samples to evaluate the recovery of oil, and the resulting numerical data are fitted into a non-dimensional equation called a scaling-law. This is essential for determining the behavior of actual reservoirs. The global non-dimensional time-scale is a parameter for predicting realistic oil-field behavior from laboratory data. This non-dimensional universal time parameter depends on a set of primary parameters that inherit the properties of the reservoir fluids and rocks and on the injection velocity, which governs the dynamics of the process. One of the practical machine learning (ML) techniques for regression/classification problems is gradient boosting (GB) regression. GB produces a prediction model as an ensemble of weak prediction models, built at each iteration by fitting a least-squares base learner to the current pseudo-residuals. Introducing a randomization step increases both the execution speed and the accuracy of GB. Hence, in this study, we developed a stochastic gradient boosting (SGB) regression model to forecast oil recovery. Different non-dimensional time-scales have been used to generate the data fed to the machine learning techniques. The SGB method was found to be the best machine learning technique for predicting the non-dimensional time-scale, which depends on oil/rock properties.

Keywords: Machine learning; stochastic gradient boosting; linear regression; time-scale; oil recovery

    1 Introduction

    1.1 Background

Hydrocarbon reservoirs are naturally fractured, so they consist of two main sections, fractures and matrix blocks (Fig. 1). Fracture permeability is higher than matrix permeability, while the opposite is true for porosity, i.e., the volume of hydrocarbons in the matrix blocks is much larger than in the fractures. In the oil recovery process, water is pumped in, and oil is collected from the matrix blocks into the fractures and then into the production wells. Imbibition is a common mechanism of oil extraction, whereby water pushes oil from the matrix into the adjacent fracture. There are two types of imbibition, namely, counter-current imbibition and co-current imbibition. In the co-current mechanism, water is injected into the matrix from one side to displace oil to the opposite side. In the counter-current type, water is injected into the matrix and oil is collected from the same side. Counter-current imbibition is the more common mechanism because the matrix block (filled with oil) is surrounded by fractures (filled with water) [1-8]. Usually, experiments are performed on small rock samples to estimate the oil recovery. The obtained data are then fitted to a non-dimensional equation, called a scaling-law, which helps in understanding real reservoir behavior. This kind of scaling-law is a non-dimensional universal variable that includes the primary parameters of the rock and fluid properties. Essentially, when the data collapse onto one curve with a small number of primary parameters, we can conclude that the scaling works well.

Figure 1: Schematic diagram of the counter-current oil-water imbibition in a fractured rock

    1.2 Research Motivation

In this study, we utilize the SGB machine learning technique to forecast oil recovery and its scaling-law. In general, ML techniques such as Artificial Neural Networks (ANNs), k-nearest neighbor (k-NN), and Support Vector Machines (SVM) can also be utilized to predict oil recovery. However, most of the well-established machine learning techniques are complex, and their training processes are time-consuming. Rule-based Decision Trees (DT) and tree-based ensemble (TBE) methods such as Random Forest (RF), Extremely Randomized Trees (ERT), and SGB are powerful and robust forecasting algorithms. ML tools have recently been used widely in the oil/gas industry due to their precise assessment of future recovery [9], and several oil/gas industry problems can be solved with them [10-17]. For example, the ANN can be utilized to predict the permeability of carbonate reservoirs based on well-log data [18-20]; Elkatatny et al. [12] used an ANN model to predict heterogeneous reservoir permeability from well-logs. Although the ANN method estimates permeability with high precision from minimal log data, it has some problems, such as uncertainty. The support vector machine (SVM) method has also been employed to predict reservoir permeability [11]. The SVM scheme has some limitations, including the inability to eliminate uncertainties and the need to confirm the stability/consistency of the scheme. Fortunately, the SGB and RF schemes can overcome the above-mentioned limitations. Moreover, they have not previously been applied to oil/gas reservoir predictions such as the oil recovery prediction addressed in the current work.

    1.3 Contribution

A machine learning SGB regression algorithm has been developed to predict oil recovery from oil reservoirs based on laboratory measurements and analytical models. To the best of the authors' knowledge, machine learning tools have not been used before for this oil recovery prediction problem. The oil recovery prediction performance of the SGB model is assessed, and other machine learning techniques, including k-NN, ANN, SVM, and RF, have been used to predict oil recovery alongside the SGB model. Another significant quantity in oil recovery prediction, namely the universal dimensionless time of a generalized scaling-law [8], has also been predicted along with the oil recovery.

    1.4 Organization

In this paper, a novel ensemble prediction scheme is utilized to accomplish oil recovery prediction. The proposed model utilizes the SGB model for oil recovery. The rest of the paper is arranged as follows: In Section 2, scaling laws are discussed. In Section 3, machine learning techniques are explained. Section 4 provides the results and discussion, and finally, conclusions are presented in Section 5.

    2 Traditional Models of Oil-Recovery and Time-Scale

Aronofsky et al. [21] introduced an analytical formula for the oil recovery as a function of the dimensionless time, i.e.,

R = R_im (1 - exp(-αT)),

where α = 0.05 [-] is a rate coefficient [22], T [-] is the dimensionless time, R [m3] is the recovery, and R_im [m3] is the ultimate recovery. Tab. 1 presents the most common analytical time-scales that can be found in the literature.
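As a quick numerical illustration (not part of the original paper), the sketch below evaluates this Aronofsky-type recovery curve; the function name and the unit ultimate recovery are hypothetical choices.

```python
import numpy as np

# Aronofsky-type recovery model R = R_im * (1 - exp(-alpha * T)),
# with the rate coefficient alpha = 0.05 quoted in the text.
def oil_recovery(T, R_im=1.0, alpha=0.05):
    """Recovered oil volume at dimensionless time(s) T."""
    return R_im * (1.0 - np.exp(-alpha * np.asarray(T, dtype=float)))

# Normalized recovery R/R_im at a few dimensionless times.
print(oil_recovery([1.0, 10.0, 100.0]))  # approx. [0.0488, 0.3935, 0.9933]
```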

Table 1: Scaling-laws of the dimensionless time


The generalized formula for the dimensionless time by El-Amin et al. [8] covers both spontaneous and forced imbibition cases. For example, for counter-current imbibition (m = 1), the scaling-law collapses to the other formulas listed in Tab. 1. The velocity can be represented in terms of the given rock characteristics, as shown in Tab. 2, for different wettability cases.

Table 2: Wettability against characteristic velocity [8]

3 Machine Learning Techniques

In the following subsections, the selected machine learning methods are presented, including k-NN, ANN, SVM, RF, and SGB.

    3.1 The k-NN Method

The k-NN is a nonparametric technique that can be utilized for both classification and regression [24]. In k-NN regression, the input is represented by the nearest examples in the feature space of the training set, while the output is the average of the property values of the object's nearest neighbors. The k-NN is an instance-based, lazy learning method: it approximates the function locally and delays all computation until classification. The method can also weight the contributions of the neighbors so that nearer neighbors contribute more to the average than distant ones; for example, a typical weight for a neighbor at distance d is 1/d [24]. The neighborhood consists of a set of objects with known property values, which represents the training set of the algorithm.
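A minimal scikit-learn sketch of such a distance-weighted k-NN regressor is shown below; the feature matrix and target are random placeholders, not the paper's dataset.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical stand-in data: rows are samples of rock/fluid properties, y is oil recovery.
rng = np.random.default_rng(0)
X, y = rng.random((200, 5)), rng.random(200)

# k-NN regression with inverse-distance weighting (weight proportional to 1/d).
knn = KNeighborsRegressor(n_neighbors=5, weights="distance")
knn.fit(X, y)
print(knn.predict(X[:3]))
```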

    3.2 The ANNs Method

The idea of ANNs is to mimic the human brain's ability to learn from experience and then identify predictive patterns [25]. The neural network architecture consists of nodes representing neurons, with links between the nodes playing the role of axons, dendrites, and synapses. An ANN includes an input layer, an output layer, and one or more hidden layers. The input layer transforms the input variables, while the output layer transforms the target variables. The prediction is obtained from the input data supplied at the input nodes, which are weighted by the link values of the network. A hidden layer applies a particular transfer function, and the forecast is computed at the output nodes. With additional recursive effort, the ANN accumulates its predictive power: the network learns as the training examples are delivered to it one by one. Nevertheless, the algorithms of the predictive model are close to other statistical methods; in a more straightforward sense, neural networks can be described as a mixture of regression and general multivariate techniques. The architecture used here is a multi-layer network, and one may distinguish between forward and backward propagation during model development; the backward propagation method is the one commonly employed. In theoretical terms, ANNs can be described as a mixture of various multivariate forecast models, but the results are usually very complex and not intuitively simple, lacking the simple intermediate stages and intuitively consistent tests of regression and decision-tree modeling. In this study, the ANN was trained with 50 neurons in the hidden layer, a learning rate of 0.3, a momentum of 0.2, and the backpropagation algorithm.
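One way to reproduce the reported configuration (50 hidden neurons, learning rate 0.3, momentum 0.2, backpropagation) in scikit-learn is sketched below; the original study may have used a different toolkit, so treat this as an approximation on placeholder data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Hypothetical stand-in data, not the paper's dataset.
rng = np.random.default_rng(0)
X, y = rng.random((300, 5)), rng.random(300)

# Backpropagation-trained network: one hidden layer of 50 neurons,
# learning rate 0.3 and momentum 0.2 (solver="sgd" enables the momentum term).
ann = MLPRegressor(hidden_layer_sizes=(50,), solver="sgd",
                   learning_rate_init=0.3, momentum=0.2,
                   max_iter=2000, random_state=0)
ann.fit(X, y)
print(ann.predict(X[:3]))
```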

    3.3 The SVM Method

The SVM (Support Vector Machine) is a linear/nonlinear data classification algorithm [26]. The SVM algorithm utilizes a nonlinear mapping to transform the original training data into a higher dimension. In general, the data are separated into two groups by a hyperplane in this high-dimensional space, and the SVM finds this hyperplane using margins and support vectors. SVMs are reliable since they can model complex nonlinear decision boundaries; however, their training times are long. SVMs are also less prone to overfitting than other algorithms. Linear SVMs cannot be utilized for linearly inseparable data, but the linear SVM can be extended to a nonlinear SVM for such data. Nonlinear SVMs map the input space to nonlinear decision boundaries (i.e., nonlinear hypersurfaces). Two main steps are introduced to obtain a nonlinear SVM, and these steps generate a quadratic optimization problem that the linear SVM formulation can solve. In this study, the SVM was trained with the PUK kernel and C = 100.
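The PUK (Pearson VII universal) kernel with C = 100 reported above is a Weka kernel that is not available in scikit-learn; the sketch below substitutes an RBF kernel as an approximation, again on placeholder data.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical stand-in data, not the paper's dataset.
rng = np.random.default_rng(0)
X, y = rng.random((200, 5)), rng.random(200)

# Support vector regression with the regularization constant C = 100 from the text;
# the RBF kernel replaces the PUK kernel used in the original study.
svr = SVR(kernel="rbf", C=100.0)
svr.fit(X, y)
print(svr.predict(X[:3]))
```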

    3.4 The RF Method

A group of tree classifiers built from random vectors is called a random forest (RF) [27]. Assuming that the random vectors of a given training dataset, {V_i : i = 1, ..., s}, are independent and identically distributed, a decision tree can be built for each of them, and the resulting classification model is the majority-vote classifier over the DT collection {M_i : i = 1, ..., s}. The RF framework contains various ensemble schemes over decision trees (e.g., bagging), in which the random vectors index elements of the training dataset directly, i.e., they point to the successive bootstrap samples. The randomization of the trees that make up the forest can come from several sources, such as the sampling of the training data or different configurations of the decision-tree inducer [28].
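A minimal random forest regression sketch in scikit-learn follows; the hyperparameters and data are illustrative placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical stand-in data, not the paper's dataset.
rng = np.random.default_rng(0)
X, y = rng.random((200, 5)), rng.random(200)

# Each tree is grown on a bootstrap sample of the training data (bagging),
# with additional randomness from feature sub-sampling at each split.
rf = RandomForestRegressor(n_estimators=100, random_state=0)
rf.fit(X, y)
print(rf.predict(X[:3]))
```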

    3.5 The SGB Method

Grabczewski [29] implemented bagging techniques with the idea that introducing a form of randomness into the estimation procedure could improve its performance. Breiman [30] also used random sampling in boosting implementations; however, it was intended only to approximate stochastic weighting when the implementation of the base learner did not retain the observation weights. Freund et al. [31] proposed a hybrid adaptive bagging technique that replaces the base learner with a bagged base learner and replaces the residuals with "out-of-bag" residuals at each boosting step; this prompted a small modification of gradient boosting that incorporates randomness as an essential part of the procedure. Moreover, the SGB scheme has been developed as a combination of bagging and boosting [32-34]. The SGB technique has been applied in different fields, such as remote sensing problems [35]; for example, it was used by Moisen et al. [36] to predict the presence of species and the basal area for 13 tree species.
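The following scikit-learn sketch shows the stochastic variant of gradient boosting in its generic form: setting subsample below 1.0 fits each least-squares tree base learner to the pseudo-residuals on a random fraction of the training data. The hyperparameters and data are illustrative placeholders, not the paper's settings.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical stand-in data, not the paper's dataset.
rng = np.random.default_rng(0)
X, y = rng.random((300, 5)), rng.random(300)

# Stochastic gradient boosting: at every iteration a regression-tree base learner is fitted
# to the current pseudo-residuals on a random 80% subsample of the data
# (subsample < 1.0 is what makes the boosting "stochastic").
sgb = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1,
                                subsample=0.8, random_state=0)
sgb.fit(X, y)
print(sgb.predict(X[:3]))
```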

    4 Results and Discussion

    4.1 Dataset

The dataset and input variables used in this study were mainly experimental data extracted from many published papers [8,22,23,37-42]. The predicted values are typically the oil recovery and the dimensionless time.

    4.2 Prediction Performance Metrics

It is well known that performance on the training set is not a good indicator of performance on an independent test set; an estimate based on each instance's classification within the training set is optimistic. In order to predict a classifier's performance on new data, the error must be estimated on a dataset (called the test set) that played no role in forming the classifier. Both the training data and the test data are assumed to be representative samples of the underlying problem. In some particular cases, we have to differentiate between the test data and the training data; it is worth mentioning that the test data cannot be utilized to generate the classifier. In general, there are three types of datasets, namely, training, validation, and test data. One or more learning schemes use the training data to create classifiers. The validation data are often employed to tune specific classifier parameters or to select among classifiers. The test data are then used to estimate the error of the final optimized technique. In order to obtain a fair estimate, the training and testing sets should be chosen independently, so the test set must be different from the training set. In many cases, the test data are classified manually, which reduces the amount of usable data. In the holdout procedure, a subset of the data is used for testing and the rest is employed for training, and sometimes a part is reserved for validation [43]. In this study, 2/3 of the data is utilized for training, and 1/3 is used for testing.
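A minimal sketch of this 2/3 : 1/3 holdout split with scikit-learn, using placeholder data:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data, not the paper's dataset.
rng = np.random.default_rng(0)
X, y = rng.random((300, 5)), rng.random(300)

# Holdout split: 2/3 of the samples for training, 1/3 for testing.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=1/3, random_state=0)
print(len(X_train), len(X_test))  # -> 200 100
```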

In order to evaluate the quality of each ML technique, a number of statistical measures are listed in Tab. 3. These measures include the correlation coefficient (R), mean absolute error (MAE), relative absolute error (RAE), root relative squared error (RRSE), and root mean squared error (RMSE) [44-46]. For n test cases, with actual values a_i and predicted values p_i for test case i, the measures are defined by the expressions given in Tab. 3.

Table 3: Prediction performance metrics
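Since the entries of Tab. 3 did not survive extraction, the sketch below restates the standard definitions usually attached to these measure names (e.g., in Weka); it is an assumption that the paper uses exactly these expressions.

```python
import numpy as np

def prediction_metrics(a, p):
    """Compute R, MAE, RMSE, RAE, and RRSE for actual values a and predictions p."""
    a, p = np.asarray(a, dtype=float), np.asarray(p, dtype=float)
    err = p - a
    r = np.corrcoef(a, p)[0, 1]                                       # correlation coefficient
    mae = np.mean(np.abs(err))                                        # mean absolute error
    rmse = np.sqrt(np.mean(err ** 2))                                 # root mean squared error
    rae = np.sum(np.abs(err)) / np.sum(np.abs(a - a.mean()))          # relative absolute error
    rrse = np.sqrt(np.sum(err ** 2) / np.sum((a - a.mean()) ** 2))    # root relative squared error
    return {"R": r, "MAE": mae, "RMSE": rmse, "RAE": rae, "RRSE": rrse}

print(prediction_metrics([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8]))
```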

    4.3 Results

In this study, several machine learning algorithms are utilized to forecast the dimensionless time and oil recovery in terms of the primary physical parameters of rocks and fluids. In this regard, diverse prediction models are developed, and the predictors' performance is examined. As shown in Tab. 4, the SGB learner has achieved better efficiency than the ANN, k-NN, SVM, and RF.

Figs. 2-4 demonstrate the dimensionless time and oil recovery predicted by the SGB model against the scaling-law (actual) ones [8] for the strong, weak, and intermediate water-wet cases. The findings show that the performance is enhanced with respect to the correlation coefficient and MAE, with smaller improvements in RMSE, RAE, and RRSE. It is clearly observed that the SGB method has the highest correlation coefficient. Consequently, one may conclude that the SGB method is reliable for predicting both the oil recovery and the dimensionless time because of its ability to achieve better performance while ensuring better generalization.

Table 4: Performance of different ML techniques to predict oil recovery and non-dimensional time for different wettability conditions

Figure 2: SGB predicted oil recovery and dimensionless time against the actual scaling-law ones for the intermediate water-wet case

    4.4 Discussion

Figure 3: SGB predicted oil recovery and dimensionless time against the actual scaling-law ones for the strong water-wet case

In this work, selected ML techniques have been used to predict the dimensionless time and oil recovery. Several machine learning models are developed to predict the dimensionless time and oil recovery against the scaling-law (actual) ones for the strong, weak, and intermediate water-wet cases. When the SGB algorithm is compared with the other techniques (Tab. 4), it achieves better performance than the ANN, k-NN, SVM, and RF in terms of the correlation coefficient R and the RMS error; it is clear from the table that the RMS error of the SGB technique is smaller than those of the other methods. Regarding the computational aspect, the SGB model requires a complexity comparable to that of the ANN, SVM, and RF models. Unbiased tests can be used to evaluate the prediction performance of the ML methods used. To test the accuracy of the predictions, we quantify standard metrics such as the correlation coefficient (R) and the error between the real and predicted values. The experimental results have shown that the SGB technique can accurately evaluate the dimensionless time and oil recovery, since it achieved a high value of R and a low error. The low error and high R of these models indicate a strong positive relationship between the predicted and real oil recovery values. The results indicate improved performance in terms of the correlation coefficient and MAE, with smaller improvements in RMSE, RAE, and RRSE.

Figure 4: SGB predicted oil recovery and dimensionless time against the actual scaling-law ones for the weak water-wet case

If one compares the performance of the machine learning techniques for the predicted dimensionless time and oil recovery against the scaling-law (actual) ones for the strong, weak, and intermediate water-wet cases, the proposed SGB method achieved the best results in all cases, although ANN, k-NN, SVM, and RF also accomplished good performance in some cases. For instance, SGB achieved R = 0.9869, MAE = 3.4592, and RMSE = 5.6901, while k-NN achieved similar results with R = 0.9799, MAE = 3.8663, and RMSE = 6.6369 for oil recovery in the intermediate water-wet case. SGB achieved R = 0.9906, MAE = 10.3747, and RMSE = 55.1585, while SVM achieved similar results with R = 0.9887, MAE = 13.4459, and RMSE = 60.6137 for the dimensionless time in the intermediate water-wet case. SGB achieved R = 0.9958, MAE = 1.8362, and RMSE = 2.9497, while k-NN achieved similar results with R = 0.9936, MAE = 2.3153, and RMSE = 3.9147 for oil recovery in the strong water-wet case. SGB achieved R = 0.9956, MAE = 1.2462, and RMSE = 2.0136, while Random Forest achieved similar results with R = 0.9939, MAE = 1.6865, and RMSE = 2.5251 for the dimensionless time in the strong water-wet case. SGB achieved R = 0.9875, MAE = 3.4529, and RMSE = 6.0695, while k-NN achieved the same results with R = 0.9875, MAE = 3.4529, and RMSE = 6.0695 for oil recovery in the weak water-wet case. SGB achieved R = 0.9989, MAE = 8.0802, and RMSE = 35.1254, while SVM achieved similar results with R = 0.9986, MAE = 10.4821, and RMSE = 40.0731 for the dimensionless time in the weak water-wet case. In all cases, the SGB model outperformed the other models. This reveals that SGB is a robust model that handles noisy conditions well. The overall results illustrate that the SGB technique can effectively handle the oil recovery prediction because of its ability to produce better performance while ensuring better generalization.

    5 Conclusions

As the dimensionless scaled-time law is fundamental to predicting oil recovery from laboratory data, we examined several ML techniques (k-NN, ANN, SVM, RF, and SGB) in this paper to predict the dimensionless scaling-law based on oil and rock physical properties. SGB regression was found to be the best ML method for predicting the dimensionless scaling-time. The performance of the machine learning techniques has been compared using R, MAE, RMSE, RAE, and RRSE. Assessment of the experimental results among the machine learning techniques has shown that the SGB algorithm has the best prediction performance. Moreover, the SGB model achieved higher prediction accuracy and lower MAE, RMSE, RAE, and RRSE compared to the k-NN, ANN, SVM, and RF regression models.

Funding Statement: The authors received no specific funding for this study.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
