
A novel heterogeneous ensemble of extreme learning machines and its soft sensing application

2020-05-10 09:11:52

Ma Ning, Dong Ze

    (Hebei Technology Innovation Center of Simulation & Optimized Control for Power Generation, North China Electric Power University, Baoding 071003, China)(School of Control and Computer Engineering, North China Electric Power University, Beijing 102206, China)

Abstract: To obtain an accurate and robust soft sensor model for dealing with increasingly complex industrial modeling data, an effective heterogeneous ensemble of extreme learning machines (HEELM) is proposed. Specifically, the kernel extreme learning machine (KELM) and four common extreme learning machine (ELM) models with different internal activation functions are contained in the HEELM to enrich the diversity of sub-models. The number of hidden layer nodes of each extreme learning machine is determined by the trial-and-error method, and the optimal parameters of the kernel extreme learning machine model are determined by cross-validation. Moreover, to obtain the best output of the ensemble model, least squares regression is applied to aggregate the outputs of all individual models. Two complex data sets from practical industrial processes are used to test the HEELM performance. The simulation results show that the HEELM has a high prediction accuracy. Compared with the individual ELM models, the bagging ELM ensemble model, and the BP and SVM models, the prediction accuracy of the HEELM model is improved by 4.5% to 8.7%, and the HEELM model obtains better generalization capability.

Key words: soft sensor; extreme learning machine; least squares; ensemble

In many industrial processes, some key process parameters are of great importance to the implementation of control strategies and production plans[1]. However, in some situations, due to technical problems, high investment costs or measurement delays, it is difficult to obtain these variables using hardware sensors[2]. To solve this issue, soft sensing technology has been studied and applied by many scholars in the past decades[3-5]. Soft sensor modeling methods can be divided into mechanism modeling methods and data-driven modeling methods[6-7]. The mechanism modeling method has the advantages of strong explanatory power and easy interpretation, but it also has the disadvantages of model complexity and poor portability, especially for complicated thermal and chemical processes. The other soft sensor modeling method, namely the data-driven method, can be developed by learning from historical data. Various methods have been applied to set up data-driven models, for instance, support vector machines[8], Gaussian process regression[9], artificial neural networks (ANNs)[10], and so on. Compared with other methods, ANNs show prominent advantages due to their good non-linear mapping and generalization ability. Hence, ANNs have been used in a wide variety of industrial process modeling tasks[11-12].

Actually, the accuracy and stability of soft sensor models are the most important criteria for evaluating the quality of the models established. In spite of having strong fitting and generalization capability, ANNs are essentially unstable methods based on statistical theory. The output of an ANN highly depends on the initial weights and training samples. Previous studies have also shown that the performance of a single neural network model is unstable, and that it depends heavily on the model structure, especially the number of nodes and hidden layers. With the increase in industrial complexity, the dimensionality and coupling of process data tend to grow, which undoubtedly increases the difficulty of data-driven modeling. Hence, scholars have made efforts to improve generalization and stability through various techniques, for instance, ensemble methods, regularization approaches, and so on. Among these techniques, the ensemble approach seems to be particularly effective. Hansen et al.[13] firstly proposed the ANN ensemble in 1990. Many previous studies have confirmed that a neural network ensemble can show better performance on the same issue by aggregating the outputs of several individual neural networks[14]. The reason why an ensemble learning model exhibits high prediction accuracy is that the ensemble method balances the outputs of multiple individual subnets, weakening the influence of imperfect models.

Although ANN ensemble approaches have wide application in practice, one important issue should be considered. The neural network models commonly used in ANN ensembles rely on very time-consuming training methods, such as the back propagation (BP) method, which suffers from insuperable disadvantages, such as plenty of adjustable parameters and the danger of over-fitting[15]. To deal with this difficult problem, an effective kind of ANN model called the extreme learning machine (ELM) is selected. Different from other neural network methods, the ELM transforms the learning problem into solving the least squares norm problem of the output weight matrix, which gives it the advantages of avoiding falling into local extrema and having a powerful generalization capability[16]. Moreover, many kinds of activation functions can serve as the ELM inner function, regardless of whether the function is continuous or discontinuous. Due to these advantages, the ELM is used to construct an ANN ensemble model in this work. However, the common ELM model uses a single type of activation function, which can restrict the performance and robustness of the ELM.

To eliminate such restrictions, a heterogeneous ensemble model based on the kernel extreme learning machine (KELM) and ELMs with multiple inner functions (HEELM) is developed. In the proposed HEELM ensemble model, five kinds of ELMs (a sigmoid activation function ELM, a sin activation function ELM, a radbas activation function ELM, a tribas activation function ELM and one KELM) are selected as individual models. Meanwhile, to further improve the performance of the ensemble model, least squares regression is used to aggregate the outputs of each single model. To validate its performance, the HEELM is used to establish soft sensor models for two real-world complex datasets and is compared with several single models. The test results prove that the proposed HEELM has both a good generalization capability and strong robustness.

    1 Theory and Algorithm

    1.1 ELM

ELM was firstly proposed by Huang et al.[17], and it has been widely applied in various fields in recent years. The structure of the ELM is given in Fig.1, from which we can see that the ELM is a three-layer neural network. Compared with the traditional BP or RBF network, the ELM model has a relatively fast learning speed, for two reasons: One is that the biases and input weights of the ELM are randomly assigned, and the other is that the least squares approach is applied to calculate the output weights of the ELM. The procedure of the ELM algorithm is exhibited below.

    Fig.1 The structure of ELM

Suppose that there are N training samples (xi, ti), in which xi = [xi1, xi2, …, xin]T ∈ Rn, i = 1, 2, …, N, are the input data and ti = [ti1, ti2, …, tim]T ∈ Rm are the output data. n and m are equal to the number of input layer nodes and output nodes of the ELM, respectively. The following form is the computational expression of the ELM,

∑(i=1 to l) βi g(wi·xj + bi) = oj,  j = 1, 2, …, N    (1)

where βi is the output weight connecting the i-th hidden node with the output nodes. Simultaneously, wi represents the input weights connecting the i-th hidden node with the input nodes; bi is the bias of the i-th hidden node; l is the number of hidden layer nodes of the model and g(·) is the activation function. Previous studies show that the output value of the ELM model can be fitted to the samples with zero error. Therefore, a derivation equation can be obtained as

∑(j=1 to N) ‖oj − tj‖ = 0    (2)

    Eq.(1) can be written as

∑(i=1 to l) βi g(wi·xj + bi) = tj,  j = 1, 2, …, N    (3)

    Eq.(3) can be simply written as

    Hβ=T

    (4)

    where

H = [g(wi·xj + bi)]N×l,  j = 1, 2, …, N; i = 1, 2, …, l    (5)

β = [β1, β2, …, βl]T,  T = [t1, t2, …, tN]T

    (6)

where H is the hidden layer output matrix. In the training process, once wi and bi of the ELM are generated, the output matrix H can be obtained, so the ELM learning problem is transformed into the least squares norm problem of solving for the output weights, that is

β̂ = H+T    (7)

where H+ is the Moore-Penrose generalized inverse of H.

    Hence, the establishment of the ELM model can be achieved by the following four steps:

Step 1  Divide the data sets into two parts: the training data set and the testing data set.

Step 2  Randomly assign the input weights and biases and initialize the number of hidden layer nodes.

Step 3  Obtain the output matrix H, and calculate β via the training data set.

Step 4  Use the calculated output weights β to calculate the output value of the model on the testing data set.
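Under the assumption of a standard single-output ELM, the four steps above can be sketched in NumPy. The function names, the toy data, and the uniform weight initialization below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Candidate activation functions for the hidden layer (sigmoid, sin,
# radbas, tribas -- the four inner functions used by the paper's ELMs).
ACTIVATIONS = {
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "sin": np.sin,
    "radbas": lambda z: np.exp(-z ** 2),                    # radial basis
    "tribas": lambda z: np.maximum(0.0, 1.0 - np.abs(z)),   # triangular basis
}

def elm_train(X, T, l, activation="sigmoid", seed=None):
    """Steps 2-3: random input weights/biases, then beta = H^+ T."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], l))  # input weights
    b = rng.uniform(-1.0, 1.0, size=l)                # hidden-node biases
    H = ACTIVATIONS[activation](X @ W + b)            # hidden layer output matrix (N x l)
    beta = np.linalg.pinv(H) @ T                      # Moore-Penrose least-squares solution
    return W, b, beta

def elm_predict(X, W, b, beta, activation="sigmoid"):
    """Step 4: propagate new data through the fixed random hidden layer."""
    return ACTIVATIONS[activation](X @ W + b) @ beta

# Toy usage: fit y = sin(x) with 40 hidden nodes.
X = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
T = np.sin(X).ravel()
W, b, beta = elm_train(X, T, l=40, seed=0)
pred = elm_predict(X, W, b, beta)
```

Because only β is solved for, training reduces to a single pseudo-inverse computation, which is why the ELM trains much faster than BP.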

Obviously, the learning procedure of the ELM is very fast and easy to implement. Nevertheless, there are still some deficiencies when the ELM is used in practice, shown by the following three factors: 1) The performance of a single ELM model tends to be affected by the randomly assigned input weights and biases. 2) An ELM model is assigned only one activation function, limiting the robustness of the model to a certain extent. 3) When dealing with very complex large-scale data with high collinearity, a standard ELM often shows poor generalization performance.

    1.2 Kernel ELM

The kernel ELM (KELM) was proposed by Huang et al.[18] based on the analysis of support vector machine theory, and it is an extension of the extreme learning machine method. The KELM uses Mercer's conditions to define the kernel matrix Ω and replaces the random matrix HHT in the ELM with the kernel matrix Ω,

ΩKELM = HHT,  Ωi,j = h(xi)·h(xj) = K(xi, xj)

    (8)

According to the above formula, the output of the KELM model is given as

f(x) = [K(x, x1), K(x, x2), …, K(x, xN)](ΩKELM + I/C)−1T    (9)

where I is the identity matrix and C is the regularization coefficient.

The KELM method does not need to assign the initial input weights and biases, or the number of hidden layer nodes. The specific form of the kernel function K(xi, xj) is the unique parameter that needs to be adjusted. In this paper, the radial basis function is selected as the kernel function,

K(xi, xj) = exp(−‖xi − xj‖2/γ)    (10)
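A minimal KELM sketch under the same assumptions. The RBF form exp(−‖xi − xj‖²/γ) and the regularized solve (Ω + I/C)⁻¹T follow the standard KELM formulation; the function names and toy data are illustrative:

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """K(xi, xj) = exp(-||xi - xj||^2 / gamma) (one common RBF convention)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / gamma)

def kelm_train(X, T, C, gamma):
    """Solve (Omega + I/C) alpha = T; no random weights, no node count needed."""
    omega = rbf_kernel(X, X, gamma)
    return np.linalg.solve(omega + np.eye(len(X)) / C, T)

def kelm_predict(Xnew, X, alpha, gamma):
    return rbf_kernel(Xnew, X, gamma) @ alpha

# Toy usage with the C and gamma values reported later for the debutanizer case.
X = np.linspace(0.0, 1.0, 50).reshape(-1, 1)
T = np.cos(4.0 * X).ravel()
alpha = kelm_train(X, T, C=50.0, gamma=0.06)
pred = kelm_predict(X, X, alpha, gamma=0.06)
```

Since T = (Ω + I/C)α and the training prediction is Ωα, the training residual equals α/C exactly: larger C trades regularization for a tighter fit.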

    1.3 Least square regression

Least squares regression (LSR) is an effective linear statistical regression modeling method. Assume that the data set consists of an input (independent) variable X ∈ Rn×m and an output (dependent) variable Y ∈ Rn×1, and both variables are mean-centered and scaled by the standard deviation. The linear relationship between the input and output variables is expressed in matrix form as

Y = XW + E

    (11)

where W is the regression coefficient vector, and E is the residual error matrix.

The optimal linear regression relationship between the input and output variables can be estimated by the least squares algorithm; the estimated relationship and its coefficients are

Ŷ = XŴ    (12)

Ŵ = (XTX)−1XTY    (13)
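Eqs.(11) through (13) amount to solving the normal equations; a sketch (the noise-free toy data is illustrative):

```python
import numpy as np

def lsr_fit(X, Y):
    """Least-squares coefficients via the normal equations: (X^T X)^{-1} X^T Y."""
    return np.linalg.solve(X.T @ X, X.T @ Y)

# Noise-free toy data: the estimate recovers the true coefficients exactly.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
Y = X @ true_w
W_hat = lsr_fit(X, Y)
```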

    2 Proposed Heterogeneous HEELM Model

To establish a more accurate and stable model for soft sensor modeling, a novel heterogeneous ELM ensemble model called HEELM is developed in this work. The structure diagram of the HEELM is presented in Fig.2. The proposed HEELM model uses five kinds of ELM to enhance the diversity of the individual models, which also helps to tackle the problem of noise in the training data. As shown in Fig.2, ELMs with sigmoid, sin, radbas and tribas functions, as well as the KELM, are applied as the individual models of the HEELM, and the least squares regression method is used as the aggregation strategy to obtain better ensemble outputs. The detailed steps of the HEELM modeling method are described as follows.

    Fig.2 The structure of the HEELM model

Suppose that the data set is D = {(Xi, Yi) | i = 1, 2, …, N}, where Xi = [xi1, xi2, …, xim] ∈ Rm represents the input data with m variables in Xi, and Yi ∈ R represents the output data. Before building the model, the data is divided into three groups: training set Dtr = {(Xt, Yt) | t = 1, 2, …, Ntr}, validation set Dva = {(Xv, Yv) | v = 1, 2, …, Nva}, and testing set Dte = {(Xt′, Yt′) | t′ = 1, 2, …, Nte}, with N = Nte + Nva + Ntr. The validation set is used to validate the number of hidden layer nodes of the ELM models and the C, γ values of the KELM.

Step 1  Preprocess the input and output data to the same order of magnitude by the following equations:

x′ = (x − xmin)/(xmax − xmin)    (14)

y′ = (y − ymin)/(ymax − ymin)    (15)
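Assuming min-max scaling is the normalization intended in Step 1 (an assumption; the paper's exact scaling formula is not recoverable here), a per-column sketch:

```python
import numpy as np

def minmax_scale(A):
    """Scale each column to [0, 1]; returns the scaled data and the column ranges."""
    lo, hi = A.min(axis=0), A.max(axis=0)
    return (A - lo) / (hi - lo), lo, hi

# Toy data: two input variables with very different magnitudes.
X = np.array([[1.0, 10.0],
              [2.0, 30.0],
              [3.0, 20.0]])
Xs, lo, hi = minmax_scale(X)
```

The stored `lo`/`hi` ranges from the training set would be reused to scale validation and testing data consistently.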

Step 2  Set the input weights and biases of the ELM models with sigmoid, sin, radbas and tribas activation functions and build the individual models using the training set. The KELM model does not need input weights or other such parameters; γ in the kernel function and the regularization coefficient C are the two parameters in the KELM that need to be optimized. In the present study, the two parameters are determined by k-fold cross-validation. Specifically, the training samples are divided into k equal groups. Then, k−1 groups are used to train the KELM model, and the remaining group is applied to test the model. After k repeated experiments, each group of data has been used as test data in turn. The average of the total test errors is taken as the assessment criterion to evaluate the parameters of the KELM model. Moreover, the most suitable number of hidden nodes of the ELM models with sigmoid, sin, radbas and tribas functions is determined using the trial-and-error approach.
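The k-fold selection of (C, γ) described in Step 2 can be sketched as a small grid search. The candidate grids, fold count and toy data below are illustrative:

```python
import numpy as np

def rbf(A, B, gamma):
    return np.exp(-((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1) / gamma)

def cv_error(X, T, C, gamma, k=5):
    """Average test RMSE over k folds -- the assessment criterion of Step 2."""
    idx = np.arange(len(X))
    errs = []
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)                       # k-1 groups for training
        alpha = np.linalg.solve(
            rbf(X[tr], X[tr], gamma) + np.eye(len(tr)) / C, T[tr])
        pred = rbf(X[fold], X[tr], gamma) @ alpha          # remaining group for testing
        errs.append(np.sqrt(np.mean((pred - T[fold]) ** 2)))
    return float(np.mean(errs))

def grid_search(X, T, Cs, gammas, k=5):
    scores = {(C, g): cv_error(X, T, C, g, k) for C in Cs for g in gammas}
    return min(scores, key=scores.get)                     # (C, gamma) with lowest CV error

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(60, 1))
T = np.sin(6.0 * X).ravel()
best = grid_search(X, T, Cs=[1.0, 50.0], gammas=[0.06, 1.0])
```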

Step 4  Calculate the output of the proposed HEELM model by establishing a regression model between the outputs of each individual model and the expected outputs via the least squares regression technique.

Ŷ = OŴ,  O = [ŷ1, ŷ2, …, ŷ5]    (16)

Ŵ = (OTO)−1OTY    (17)

where ŷk is the output vector of the k-th individual model.
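The least-squares aggregation of Step 4 can be sketched by stacking the sub-model outputs column-wise and solving for combination weights. The five simulated sub-models below are illustrative stand-ins for the trained ELM/KELM models:

```python
import numpy as np

rng = np.random.default_rng(2)
y_true = rng.normal(size=200)
# Simulated outputs of five imperfect sub-models (true signal plus noise of
# increasing variance), standing in for the five trained individual models.
outputs = np.column_stack(
    [y_true + rng.normal(0.0, s, size=200) for s in (0.1, 0.2, 0.3, 0.4, 0.5)])

# Least-squares combination weights fitted against the expected outputs.
W = np.linalg.solve(outputs.T @ outputs, outputs.T @ y_true)
y_ens = outputs @ W

mse_ens = float(np.mean((y_ens - y_true) ** 2))
mse_best_single = min(float(np.mean((outputs[:, j] - y_true) ** 2)) for j in range(5))
```

On the training data the least-squares combination can never do worse than the best single sub-model, since selecting one model alone is itself a feasible weight vector.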

Step 6  To accurately evaluate the performance of the proposed HEELM, the root mean square error (RMSE) is used as the evaluation criterion. The RMSE can be calculated as

RMSE = [(1/N) ∑(i=1 to N) (yi − ŷi)2]1/2    (18)
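Eq.(18) in code, as a direct transcription of the RMSE definition:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error over N samples."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# e.g. rmse([1, 2, 3], [1, 2, 5]) -> sqrt(4/3) ≈ 1.1547
```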

    3 Case Studies

The ensemble model's capability is validated using two practical industrial processes: One is the debutanizer column and the other is the selective catalytic reduction (SCR) flue gas denitration process of a power plant boiler.

    3.1 Debutanizer column

The debutanizer column is part of a desulfurization and naphtha splitting plant. Its task is to reduce the concentration of butane at the bottom of the tower as much as possible[19]. The flowchart of the debutanizer column process is shown in Fig.3. Usually, the concentration of bottom butane is measured on-line by a gas chromatography analyzer installed on the top of the tower. Since it takes time for the vapor of the bottom butane to reach the top of the tower and for the gas chromatography analyzer to complete its analysis, there is a lag in the on-line measurement of the bottom butane concentration. Therefore, it is necessary to establish a soft sensor model to estimate the concentration of bottom butane on-line and in real time. In total, seven variables are selected as input variables of the soft sensing model. The only output variable is the concentration of butane in the bottom of the debutanizer. Tab.1 lists the detailed description of the input variables. There is a total of 2 393 data samples in the debutanizer column process, of which about half are used as the training set, about one-third as the test set and the rest as the validation set. All the data can be downloaded from Ref.[20].

    Tab.1 Input variables of soft sensor for the debutanizer column

    Fig.3 The flowchart of the debutanizer column

In this study, several single models, including ELMs with sigmoid, sin, radbas and tribas activation functions and the KELM model, are built for comparison with the HEELM model. To ensure a fair comparison, the parameters of the five single models, such as the number of hidden layer nodes, C and γ, are firstly selected by the trial-and-error method. These parameters are determined when the errors on the validation data are the smallest.

Fig.4 shows the variation of the relative errors of the validation set with the number of hidden layer nodes of the ELM models. It can be seen that, for the ELM with the sigmoid function, the relative error is smallest when the number of nodes is 135. Hence, the number of hidden layer nodes of the individual ELM with the sigmoid function is assigned as 135. Similarly, the numbers of hidden layer nodes of the single ELM models with sin, radbas and tribas inner functions are determined as 115, 130 and 130, respectively. In addition, parameters C and γ in the KELM model are finally optimized to be C = 50 and γ = 0.06. After determining the optimal parameters of each sub-model, the proposed HEELM can be developed by aggregating the outputs of the five individual models using the least squares regression strategy. Bagging is a common ensemble technique, and in this study a Bagging ELM ensemble model that uses five different ELMs as sub-models is established for comparison with the proposed HEELM. To enhance the reliability of the simulation experiment, the experiment is repeated 30 times, and the max, min, mean and standard deviation (SD) of the RMSE values for the testing dataset are shown in Tab.2.
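The trial-and-error node selection used here can be sketched as a sweep over candidate node counts, keeping the count with the smallest validation RMSE. The candidate values, toy data and fixed seed below are illustrative:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def validation_rmse(Xtr, Ttr, Xva, Tva, l, seed=0):
    """Train one sigmoid ELM with l hidden nodes; return validation RMSE."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(Xtr.shape[1], l))
    b = rng.uniform(-1.0, 1.0, size=l)
    beta = np.linalg.pinv(sigmoid(Xtr @ W + b)) @ Ttr
    pred = sigmoid(Xva @ W + b) @ beta
    return float(np.sqrt(np.mean((pred - Tva) ** 2)))

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(300, 2))
T = np.sin(X[:, 0]) + X[:, 1] ** 2
Xtr, Ttr, Xva, Tva = X[:200], T[:200], X[200:], T[200:]

# Sweep candidate node counts; keep the count with the lowest validation error.
candidates = [5, 20, 60, 120]
errors = {l: validation_rmse(Xtr, Ttr, Xva, Tva, l) for l in candidates}
best_l = min(errors, key=errors.get)
```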

As seen from Tab.2, the proposed HEELM model achieves smaller max, min and mean RMSE values for the testing dataset than the other five individual models and the Bagging ELM model. Fig.5 displays the variation of the RMSE values obtained by the seven models in 30 runs for the testing dataset of the debutanizer column. It is clear that the RMSE value of each ELM model with sigmoid, sin, radbas and tribas activation functions varies from 0.086 9 to 0.110 7 with a large fluctuation. The reason for such a result is that, although the optimal number of nodes for each ELM model has been determined, the input weights and bias values of the four ELM models are randomly determined in each simulation experiment, which leads to the unstable prediction performance of the four models. Once the optimum parameters (C, γ) are determined, the KELM model has no other parameters that can be adjusted, so the error results of the KELM model over the 30 runs are invariable. The RMSE values of the HEELM are low and stable around 0.086 0 without fluctuation.

    Tab.2 Simulation results of RMSE values for debutanizer column testing dataset

Method          Max RMSE   Min RMSE   Mean RMSE   SD
ELM (sigmoid)   0.106 7    0.092 7    0.098 6     0.003 7
ELM (sin)       0.103 1    0.089 1    0.097 5     0.003 2
ELM (radbas)    0.110 7    0.092 5    0.099 8     0.004 1
ELM (tribas)    0.104 8    0.086 9    0.095 9     0.003 7
KELM            0.092 6    0.092 6    0.092 6     0
Bagging ELM     0.095 8    0.088 8    0.091 8     0.001 4
HEELM           0.087 7    0.084 3    0.086 1     8.16×10−4

Fig.4 Variation of relative errors of validation set with the number of nodes of ELMs for the debutanizer column. (a) ELM (sigmoid); (b) ELM (sin); (c) ELM (radbas); (d) ELM (tribas)

Apparently, the HEELM model achieves much better stability than a single ELM. In addition, the predictive performance of the KELM model is better than that of the other four single ELM models, but not as good as that of the HEELM model. The simulation results of the debutanizer column demonstrate that the proposed HEELM model achieves better prediction accuracy and model stability.

Fig.5 RMSE values for debutanizer column testing dataset of seven models

    3.2 SCR flue gas denitration process

SCR flue gas denitrification is a necessary technique in coal-fired power plants for reducing nitrogen oxides (NOx). The SCR denitrification technique has some salient features, such as high denitrification efficiency and a simple device structure, so it has attracted much attention and found wide application in almost all power plants. The flowchart of SCR flue gas denitration is shown in Fig.6. The working principle of SCR is that liquid ammonia reacts with the NOx and converts the NOx to N2 and H2O.

    Fig.6 Schematic diagram of reactor structure in SCR flue gas denitrification system

In this work, 1 000 measurements of the operation of a 1 000 MW ultra-supercritical boiler SCR denitrification system are obtained from the distributed control system (DCS) database. The sampling interval is 1 min. Based on basic knowledge of boilers and the engineers' experience[21], six variables are employed as inputs of the SCR model and the only output is the export NOx of the SCR denitrification system. The detailed description of the input variables is listed in Tab.3. To construct the soft sensor model, the 1 000 samples are divided into three parts: 500 samples are used as the training set, 200 samples as the validation set and the remaining 300 samples as the test set.

    Tab.3 Input variables of the soft sensor for the SCR flue gas denitration process

Input variable   Variable description
x1               Entrance NOx concentration
x2               Inlet gas flow value
x3               Inlet flue gas temperature
x4               Ammonia injection
x5               Unit load
x6               Entrance O2 concentration

According to the steps of the proposed HEELM approach mentioned above, the numbers of hidden layer nodes of the four common ELMs with different activation functions and the two parameters (C, γ) of the KELM are firstly determined. Similar to the determination of the number of hidden layer nodes in the debutanizer column simulation in Section 3.1, Fig.7 presents the relative errors of the validation set versus the number of nodes of the ELM models. It can be seen from Fig.7 that, for the SCR flue gas denitration dataset, the most suitable numbers of hidden layer nodes for the four ELM models with sigmoid, sin, radbas and tribas functions are 85, 90, 100 and 105, respectively. Moreover, according to the cross-validation method, C and γ in the KELM model are assigned to be 50 and 0.1, respectively.

After 30 repeated experiments, the results of the soft sensor model for the SCR flue gas denitration case are listed in Tab.4. Compared with the five single ELM models, the BP model and the SVM model, the HEELM method clearly obtains smaller RMSE values. The SD values of the RMSE of the four common ELM models with sigmoid, sin, radbas and tribas activation functions and of the BP model are obviously higher than that of the HEELM model, which reveals that a common single model is unstable. Meanwhile, the HEELM model combines the outputs of five ELM models to handle the complex data: the five different kinds of ELM models complement one another through the least squares technique when establishing the soft sensor model. Therefore, the proposed HEELM ensemble model shows the highest accuracy among all the presented models.

To further show the capability of the HEELM method, a comparison between the predicted results and the real data of the 300 testing cases is presented in Fig.8. The red line is the perfect line on which predicted values equal real values, and the points are the results predicted by the HEELM method. It is easy to see that all of the points are distributed closely around the perfect line, which means that the export NOx of the SCR can be predicted with good accuracy by the proposed HEELM for the testing dataset. Moreover, in order to clearly show the generalization performance, Fig.9 presents the variation of the RMSE values obtained by the eight models in 30 runs for the testing dataset of the SCR flue gas denitration process. From Fig.9, it can be seen that the RMSE values of the HEELM are the smallest in all 30 experiments. Hence, all the simulation results of the SCR flue gas denitration case indicate that the HEELM ensemble model achieves a high accuracy and good stability.

Fig.7 Variation of relative errors of validation set with the number of nodes of ELMs for the SCR flue gas denitration process. (a) ELM (sigmoid); (b) ELM (sin); (c) ELM (radbas); (d) ELM (tribas)

    Tab.4 Simulation results of RMSE values for SCR flue gas denitration testing dataset

Method          Max RMSE   Min RMSE   Mean RMSE   SD
ELM (sigmoid)   0.136 5    0.126 7    0.131 8     0.002 4
ELM (sin)       0.141 5    0.126 9    0.132 6     0.003 3
ELM (radbas)    0.139 7    0.127 2    0.132 1     0.003 4
ELM (tribas)    0.139 2    0.125 6    0.131 9     0.002 8
KELM            0.129 2    0.129 2    0.129 2     0
BP              0.141 3    0.121 3    0.131 1     0.006 6
SVM             0.130 2    0.130 2    0.130 2     0
HEELM           0.127 2    0.123 0    0.125 3     9.18×10−4

    Fig.8 Fitting performance of the HEELM model for SCR flue gas denitration testing dataset

    Fig.9 RMSE values for SCR flue gas denitration testing dataset of eight models

    4 Conclusions

1) An advanced approach for soft sensor modeling using a heterogeneous ensemble, namely HEELM, is proposed. Five kinds of ELM algorithms are used to obtain diversity within the HEELM model when handling complex modeling data. The least squares method is used as an effective ensemble technique to enhance the generalization ability by ensuring that the worst individual model has the least impact on the final output.

2) The generalization performance of the proposed HEELM ensemble model is verified on two real datasets from the debutanizer column and the SCR flue gas denitration processes. The simulation results show that the HEELM model achieves good generalization accuracy and stability.

3) The modeling performance of the HEELM is also compared with the individual ELM models, the bagging ELM ensemble model, and the BP and SVM models, and the results demonstrate that the performance of the HEELM is better than that of the other models in terms of predictive accuracy.

4) In future work, other kinds of aggregating techniques and different neural network ensemble models will be studied and utilized.
