
    Study and Application of Fault Prediction Methods with Improved Reservoir Neural Networks☆


Qunxiong Zhu, Yiwen Jia, Di Peng, Yuan Xu*

College of Information Science and Technology, Beijing University of Chemical Technology, Beijing 100029, China


ARTICLE INFO

    Article history:

Received 15 May 2013

Received in revised form 12 October 2013

Accepted 24 December 2013

Available online 23 June 2014

Keywords:

Fault prediction

Time series

Reservoir neural networks

Tennessee Eastman process

Time-series prediction is one of the major methodologies used for fault prediction. Methods based on recurrent neural networks have been widely used in time-series prediction for their remarkable non-linear mapping ability. As a new kind of recurrent neural network, the reservoir neural network can effectively perform time-series prediction. However, the ill-posedness problem of reservoir neural networks has seriously restricted their generalization performance. In this paper, a fault prediction algorithm based on time series is proposed using improved reservoir neural networks. The basic idea is to take the structural risk into consideration, that is, the cost function involves not only the empirical risk factor but also the structural risk factor. Thus a regulating coefficient is introduced to calculate the output weight of the reservoir neural network. As a result, the amplitude of the output weight is effectively controlled and the ill-posedness problem is solved. Because the training speed of ordinary reservoir networks is naturally fast, the improved reservoir networks for time-series prediction are good in both speed and generalization ability. Experiments on Mackey-Glass and sunspot time-series prediction prove the effectiveness of the algorithm. The proposed algorithm is applied to TE process fault prediction. We first forecast some time series obtained from TE and then predict the fault type using static reservoirs on the predicted data. The final prediction correct rate reaches 81%.

© 2014 Chemical Industry and Engineering Society of China, and Chemical Industry Press. All rights reserved.

1. Introduction

Fault detection and diagnosis have been studied for almost four decades and have become a significant part of control theory. As the requirements for system reliability and safety increase, it is crucial to know the failure information before a fault occurs. As a result, fault prediction has attracted much attention.

The key of fault prediction is to forecast the future state of a system, so fault prediction can be transformed into time-series prediction. The existing methods of time-series prediction can be classified into three categories. The first is based on classical time-series analysis, consisting of the ARMA model and the ARIMA model [1]. The second is based on the gray model [2,3], and the last is based on neural networks [4-7]. Among these methods, the neural network method has been studied deeply and applied to time series extensively for its remarkable non-linear mapping ability. In the neural network and machine learning communities, several types of neural network models are applied to time-series prediction, such as standard multilayer perceptrons [8], radial basis function neural networks [9-11], and generalized regression neural networks [12]. In addition, recurrent neural networks [13], including the nonlinear autoregressive network [14], extreme learning machine networks [15], and recurrent predictor neural networks [16], are also studied for nonlinear time-series prediction.

There are some limitations when applying neural networks in practice. For example, the performance is not good when a feed-forward neural network is applied to time-series prediction. Although recurrent neural networks can solve problems related to time series, they have many disadvantages, such as a large amount of calculation, slow convergence rate and difficulty in determining the number of hidden neurons. Moreover, there is the fading-memory problem, which may cause the error gradient to vanish or become distorted.

To solve these problems, Jaeger and Maass proposed echo state networks (ESNs) [17] and liquid state machines [18], respectively. Although the two methods take different angles, their essence is an improvement of traditional recurrent neural networks. Verstraeten et al. have demonstrated that the two methods are the same in essence and named the approach reservoir computing [19]. Since the report of reservoir computing in Science in 2004 [20], it has drawn the attention of many researchers around the world. Besides time-series prediction [21,22], reservoir computing has been extended to pattern classification [23], voice recognition [24], image processing [25] and so on.

However, there are some problems in reservoir networks. In many situations, the coefficient matrix for calculating the output weight is ill-conditioned. To be specific, its singular values distribute continuously with no obvious jump, and the maximum and minimum singular values differ significantly. Consequently, the output weight is extraordinarily large, especially in high-dimension reservoir networks. On the other hand, the conventional way to control the output weight is to choose reservoirs with as low a dimension as possible, but low-dimension reservoir networks cannot bring good generalization ability.

In this paper, we first study the traditional structure of reservoir networks and analyze its ill-posedness problem. According to the analysis, the structural risk is taken into consideration, and a formula is obtained to calculate the output weight by minimizing the loss function. This method involves a regulating coefficient that can control the amplitude of the output weight; the ill-posedness problem is solved in this way. Experiments on two benchmark problems are used to verify the effectiveness of the improved method. A fault prediction algorithm based on the improved reservoir neural networks is proposed and applied to the TE process. Six time series consisting of 2 variables from 3 faults are predicted. In the classification stage, we take advantage of static reservoir networks to predict the type of the faults.

2. Reservoir Computing

2.1. Structure of reservoir network

The architecture of traditional reservoirs [17] is shown in Fig. 1. Some terminology must be fixed first. We consider discrete-time neural networks with K input units, N internal network units and L output units. Activations of the input units at time step n are u(n) = (u1(n), …, uK(n)), those of the internal units are χ(n) = (χ1(n), …, χN(n)), and those of the output units are y(n) = (y1(n), …, yL(n)). Real-valued connection weights are collected in an N × K matrix Win = (w_ij^in) for the input weights, in an N × N matrix W = (w_ij) for the internal connections, in an L × (K + N + L) matrix Wout = (w_ij^out) for the connections to the output units, and in an N × L matrix Wback = (w_ij^back) for the connections that project back from the output to the internal units.

2.2. Mathematical model

In most cases the output has little effect on the internal units, so we will not study the parameter Wback. The equations of the reservoir networks [17] can be written as

χ(n + 1) = f(Win u(n + 1) + W χ(n) + bχ)
y(n + 1) = Wout [u(n + 1); χ(n + 1)] + b        (1)

where [u; χ] denotes the concatenation of the input vector and the internal state vector.

Fig. 1. The basic structure of reservoir networks.

Considering Eq. (1), we assume that the internal state variables χ have N dimensions, input variables u have K dimensions, and output variables y have L dimensions. To simplify the expressions, we treat the bias variables as connection weights of a fixed value of 1, so bχ and b can be merged into the matrices Win and Wout. The activation function f can consist of spiking neurons, threshold logic neurons, sigmoid neurons, linear neurons and so on; in this paper, the sigmoid function is taken. We first initialize the networks: W and Win are generated randomly and remain unchanged during the calculation, and the original state of the internal units is zero, that is, χ(0) = 0. The input and output of the training samples are u(k) and y(k), respectively. Thus we can calculate Wout with Eq. (1).

    Some important points on reservoir networks are presented as follows.

First, the dimension of the internal state units χ is very high, up to hundreds or even thousands, while it is relatively low in traditional recurrent neural networks.

Second, the weight matrices Win and W are randomly generated and remain unchanged during the whole training process.

Last, as one of the measures maintaining the dynamic characteristics of reservoirs, the connection weight matrix of the internal state is sparse, with only 2%-5% of the connections non-zero, different from most traditional recurrent neural networks, which always keep dense connections.
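These three points can be made concrete with a short sketch. The following Python snippet (our own illustration, not the paper's code; the function name and parameter defaults are assumptions) generates a random input matrix and a sparse internal matrix rescaled to a chosen spectral radius:

```python
import numpy as np

def init_reservoir(K, N, sparsity=0.03, spectral_radius=0.9, seed=0):
    """Randomly generate W_in and a sparse internal matrix W, then rescale
    W so its spectral radius (largest |eigenvalue|) matches the target."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, size=(N, K))
    # Keep only ~`sparsity` of the internal connections non-zero.
    W = rng.uniform(-0.5, 0.5, size=(N, N))
    W *= rng.random((N, N)) < sparsity
    rho = np.max(np.abs(np.linalg.eigvals(W)))
    W *= spectral_radius / rho
    return W_in, W

W_in, W = init_reservoir(K=2, N=100)
print(W_in.shape, W.shape)  # (100, 2) (100, 100)
```

Both matrices stay fixed after this initialization; only Wout is trained.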

3. Improved Method

In this section, the ill-posedness problem of reservoir networks is discussed and the improved method for solving it is given.

3.1. Training

In order to facilitate the study, we redraw the network structure in Fig. 2, which is essentially the same as the structure in Fig. 1, showing the same reservoir neurons at different moments. Win is the connection weight between the input layer and the reservoir, W is the internal connection weight, and Wout is the connection weight between the output layer and the reservoir.

Training the reservoir networks can be summarized as determining the connection weight matrix Wout between the output layer and the dynamic reservoir layer. The following are the detailed steps of building and training a reservoir network.

Step 1 Set the parameters of the reservoir networks: the number of internal units of the reservoir (N), the sparsity, and the spectral radius of the connection weight matrix of the internal state. Initialize the reservoir network. The spectral radius of the connection weight matrix of the internal state is usually between 0 and 1, but this is not a necessary condition; sometimes a spectral radius greater than 1 can give better prediction performance.

Step 2 Calculate the internal state. Normalize the input sample and stimulate the internal state of the reservoir using the normalized input sample. The variables of the internal state at every moment should be recorded.

Fig. 2. The simplified model of reservoir.

Step 3 Calculate the connection weight matrix of the output layer (Wout). According to the linear regression relationship between the reservoir state variables and the output variables, the connection weight matrix of the output layer (Wout) can be obtained.

There are many methods for calculating Wout; the pseudo-inverse method is taken in this paper. The reservoir state matrix A and the corresponding target vector yd are defined as

A = [χ(Ω + 1), χ(Ω + 2), …, χ(K)]^T,  yd = [yd(Ω + 1), yd(Ω + 2), …, yd(K)]^T

where Ω is the length of the initial transient process and K is the number of samples. To obtain better accuracy, the initial transient process is always abandoned. Thus A and yd satisfy the following relation

A Wout ≈ yd.        (2)

As only Wout needs to be adjusted, the target function is

E(Wout) = ‖A Wout − yd‖²        (3)

while the conventional training algorithm is given by

Wout = A⁺ yd        (4)

where A⁺ is the pseudo-inverse matrix of A, which is also called the generalized inverse matrix. There are orthogonal matrices U ∈ R^(K×K) and V ∈ R^(N×N) such that the singular value decomposition of matrix A can be expressed as

A = U Σ V^T        (5)

where Σ is a K × N diagonal matrix

Σ = [Σr 0; 0 0],  Σr = diag(σ1, σ2, …, σr).        (6)

In this matrix,

σ1 ≥ σ2 ≥ … ≥ σr > 0        (7)

where σi is a singular value of A. Using the singular value decomposition, the pseudo-inverse of A can be written as

A⁺ = V Σ⁺ U^T.        (8)

The matrix Σ⁺ is obtained by inverting all non-zero singular values; zero is put in the corresponding place where the singular value is zero.
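As a quick numerical check of this construction (with toy random data of our own, not reservoir states from the paper), A⁺ built explicitly from the SVD agrees with NumPy's built-in pseudo-inverse:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))    # stand-in for the K x N state matrix
y_d = rng.standard_normal((8, 1))  # stand-in for the target outputs

# A = U S V^T; invert the non-zero singular values (a random 8x3 matrix
# has full rank, so all three are non-zero here).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
A_plus = Vt.T @ np.diag(1.0 / s) @ U.T

W_out = A_plus @ y_d               # conventional solution W_out = A+ y_d
print(np.allclose(A_plus, np.linalg.pinv(A)))  # True
```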

3.2. The ill-posedness of reservoir networks

Here we discuss the solution of the linear system A Wout = yd with a disturbance in the system:

A (Wout + δWout) = yd + δyd.        (9)

Notice that

‖δWout‖ / ‖Wout‖ ≤ ‖A‖ ‖A⁻¹‖ · ‖δyd‖ / ‖yd‖.        (10)

Calculate ‖A‖ and ‖A⁻¹‖ using the singular value decomposition:

‖A‖₂ = σmax.

In the same way, we have

‖A⁻¹‖₂ = 1 / σmin.

Therefore, the condition number is

cond(A) = ‖A‖₂ ‖A⁻¹‖₂ = σmax / σmin.        (11)

It is also the ratio of the maximum singular value and the minimum singular value.
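This ratio is exactly what NumPy's 2-norm condition number computes; a short check on illustrative data:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((20, 5))        # illustrative state matrix

s = np.linalg.svd(A, compute_uv=False)  # singular values, descending
cond = s.max() / s.min()                # sigma_max / sigma_min

print(np.isclose(cond, np.linalg.cond(A)))  # True
```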

The auto-correlation matrix of the state signal can be approximately expressed as

R ≈ (1/K) A^T A.

In existing methods of reservoir networks, the eigenvalues of the auto-correlation matrix have a large dispersion degree, which may cause a large dispersion of the singular values and a large condition number of matrix A, resulting in the ill-posedness of reservoir networks. With Eqs. (4) and (8), the method of calculating Wout is given by

Wout = V Σ⁺ U^T yd.        (12)

Using the singular value decomposition, the part of matrix A with zero singular values is removed, and then the solution of the output weight matrix is obtained. However, in applications, it is almost impossible for an exactly zero singular value to appear; it just approaches zero, and then a serious ill-posedness problem occurs when seeking the pseudo-inverse. The solution expressed by Eq. (12) for singular value decomposition can be rewritten as

Wout = Σᵢ₌₁ʳ (uᵢ^T yd / σᵢ) vᵢ        (13)

where uᵢ and vᵢ are the left singular vectors and right singular vectors, respectively, and r = rank(A). If the minimum singular value is too small, the output weight Wout will have an extraordinarily large amplitude.

In the reservoir network method, the amplitude distribution of the singular values of matrix A is continuous, and the smallest singular value is close to zero. On the other hand, the pseudo-inverse solution of the reservoirs heavily relies on the amplitude of the minimum singular value in Eq. (12). When the minimum singular value approaches the representation accuracy of computer floating-point numbers, the pseudo-inverse of matrix A becomes unreliable. Consequently, the solution of the reservoirs is influenced significantly by the computer accuracy.

3.3. Calculation

We introduce the structural risk minimization theory to reservoir networks to overcome the extraordinarily large amplitude of Wout mentioned in the last section. According to statistical learning theory, the real risk consists of the empirical risk and the structural risk; a model that balances the two risks well is considered a good model. Thus we introduce a regulating parameter C to adjust the empirical risk and the structural risk. The cost function given by Eq. (3) can be rewritten as

E(Wout) = (1/2) ‖Wout‖² + (C/2) ‖e‖²,  subject to  A Wout + e = yd        (14)

where ‖Wout‖² represents the structural risk and ‖e‖² stands for the empirical risk, K is the number of samples, and yd is the desirable output. This function originates from statistical learning theory. In order to obtain the proper output weight Wout, we should minimize the cost function E(Wout).

As the formula is a conditional extremum problem, we transfer it to an unconditional extremum problem through the Lagrange function:

L(Wout, e, λ) = (1/2) ‖Wout‖² + (C/2) ‖e‖² − λ (A Wout + e − yd)        (15)

where λ = [λ1, λ2, …, λK], λj ∈ R (j = 1, 2, …, K), is the Lagrange multiplier.

Calculate the gradients of the Lagrange function and set them to zero:

∂L/∂Wout = 0 → Wout = A^T λ^T
∂L/∂e = 0 → λ = C e^T
∂L/∂λ = 0 → A Wout + e − yd = 0.        (16)

With Eq. (16), we can obtain λ = −C (A Wout − yd)^T, and then

Wout = (A^T A + I/C)⁻¹ A^T yd.        (17)

As Eq. (17) only needs some simple linear calculations and a matrix inversion operation, the speed of calculating Wout is very fast.
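A minimal numerical sketch (toy data of our own, not from the paper) shows why this regularized solution helps: on a nearly singular state matrix, the plain pseudo-inverse solution has a huge amplitude, while the solution of Eq. (17) stays small:

```python
import numpy as np

def ridge_w_out(A, y_d, C):
    """W_out = (A^T A + I/C)^(-1) A^T y_d; smaller C shrinks |W_out|."""
    return np.linalg.solve(A.T @ A + np.eye(A.shape[1]) / C, A.T @ y_d)

# Two almost collinear state columns -> tiny minimum singular value.
A = np.array([[1.0, 1.0],
              [1.0, 1.000001],
              [2.0, 2.000001]])
y_d = np.array([[1.0], [2.0], [3.0]])

w_pinv = np.linalg.pinv(A) @ y_d      # amplitude on the order of 1e6
w_ridge = ridge_w_out(A, y_d, C=1e3)  # amplitude on the order of 1
print(np.linalg.norm(w_pinv) > 1e3 * np.linalg.norm(w_ridge))  # True
```

Both weight vectors fit the data, but only the regularized one keeps a controlled amplitude, which is the point of the structural risk term.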

3.4. Experiments

In this section, we give two examples to show the performance of the improved method. The first example is the Mackey-Glass benchmark test, and the second is the prediction of monthly sunspot numbers.

Table 1 Relevant parameters of reservoirs for Mackey-Glass time series prediction

Table 2 The performance (NRMSE) of the proposed algorithm and other methods

3.4.1. Prediction of Mackey-Glass time series

The Mackey-Glass system is a time-delay differential system of the form

dχ(t)/dt = α χ(t − δ) / (1 + χ(t − δ)¹⁰) − β χ(t)

where χ(t) is the value of the time series at time t. The system is chaotic for δ > 16.8. The parameter values are chosen as β = 0.1, α = 0.2, and δ = 17. The dataset is constructed using the second-order Runge-Kutta method with a step size of 0.1, from which the time-series data are obtained. We use 10 continuous data points as the input of the network and the 95th data point as the desired output of the network. In this way, a sample is formed, and many samples can be obtained from the time-series data. We choose 1200 samples as the training samples and another 660 samples as the testing samples. The internal state matrix A can be calculated from the training samples and the first formula of Eq. (1). Then, according to Eq. (17), Wout is obtained.
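The sample construction described above (length-10 input windows with a target point further along the series) can be sketched as follows; the function and its indexing convention are our own illustration:

```python
import numpy as np

def make_samples(series, window=10, ahead=85):
    """Each input is `window` consecutive points; the target lies `ahead`
    steps past the last input point (e.g. inputs 1-10, target point 95)."""
    X, y = [], []
    for i in range(len(series) - (window - 1 + ahead)):
        X.append(series[i:i + window])
        y.append(series[i + window - 1 + ahead])
    return np.array(X), np.array(y)

X, y = make_samples(np.arange(200.0))
print(X.shape, y.shape)  # (106, 10) (106,)
```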

In this example, the prediction performance is evaluated by the root mean-squared error on the test sequence pairs normalized by the variance of the original time series (NRMSE):

NRMSE = sqrt( (1/(n σ²)) Σᵢ₌₁ⁿ (yd(i) − y(i))² )

where yd(i) denotes the target value, y(i) is the corresponding prediction output, n is the number of test examples, and σ² is the variance of the original time series. The relevant parameters are listed in Table 1.
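The NRMSE above translates directly into a few lines of NumPy (our own sketch). By construction, a prediction offset from the target by one standard deviation of the target series gives NRMSE = 1:

```python
import numpy as np

def nrmse(y_d, y):
    """RMSE normalized by the variance of the target series."""
    return np.sqrt(np.mean((np.asarray(y_d) - np.asarray(y)) ** 2)
                   / np.var(y_d))

y_d = np.array([1.0, 2.0, 3.0, 4.0])
print(nrmse(y_d, y_d))  # 0.0
```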

Table 2 compares the improved method with other methods. The normalized root mean-squared error of the proposed algorithm is smaller than that of the other algorithms, among which CESN and D&S ESN are two other improved reservoir neural network methods. Fig. 3 shows the prediction curve and prediction error, with the absolute error distributed evenly around zero.

3.4.2. Prediction of monthly sunspots

The sunspot numbers used in this paper are 1327 monthly mean sunspot numbers from January 1901 to July 2011, from the Solar Influences Data Analysis Center (http://sidc.oma.be). A time series is constructed from these numbers. We use 10 continuous data points as the input of the network and the next point, the 11th, as the desired output. In this way, 1317 samples are formed. The first 878 samples are used as the training data while the rest 439 samples are used as the testing data. The internal weight matrix of the reservoir is set as 600 × 600 with the sparsity maintained at 2%, while the spectral radius is around 0.91. Table 3 lists the details. Table 4 presents the normalized root mean-squared errors of the proposed method and others; the performance of the proposed method is the best. Fig. 4 shows the prediction performance for the sunspot monthly mean numbers with the improved method, in which the absolute errors distribute evenly around zero.

Table 3 Relevant parameters of reservoirs for prediction of sunspot time series

Fig. 3. The performance of the Mackey-Glass time series prediction.

4. Simulation

4.1. TE process

On the basis of a large number of engineering practices, the U.S. Eastman Chemical Company developed a simulation model of a process industry, which is called the TE chemical process model. As a realistic simulation of a production plant of the Tennessee Eastman Chemical Company, the TE process is a typical complex multi-variable nonlinear process [29], with four starting reactants A, C, D and E, products G and H, and by-product F. The four main reactions are as follows:

A(g) + C(g) + D(g) → G(liq)
A(g) + C(g) + E(g) → H(liq)
A(g) + E(g) → F(liq)
3D(g) → 2F(liq)

which are all irreversible exothermic reactions following the Arrhenius equation. Since product G is sensitive to temperature, the reaction temperature of the system must be controlled precisely. Part of the liquid products G and H will leave the reactor as vapor while the other part stays in the reactor as liquid. The whole TE process consists of five processing units: reactor, condenser, recycle compressor, gas-liquid separator and stripper. The process flow chart is shown in Fig. 5.

Table 4 The performance (NRMSE) of the proposed algorithm and other methods

4.2. Experiment and analysis

Among the 20 typical faults in the TE process, we take into account three of them, as listed in Table 5. As the ultimate purpose of the prediction is to predict the specific fault type, the step after time-series prediction is fault prediction. There are 52 variables in the TE process, consisting of 11 control variables and 41 measured variables. These variables can be considered as characteristics describing the faults.

However, different characteristics make different contributions to the prediction of a certain fault. Some characteristics not only fail to provide useful information for fault prediction, but also bring noise that increases the prediction error. Therefore, it is necessary to pick out the characteristics that provide a large amount of information for the prediction of certain faults.

According to literature [30], characteristics 51 and 9 have very high mutual information with the output fault type (faults 4, 9, 11) and can provide more useful information for fault prediction. In the TE process, characteristics 9 and 51 are the reactor temperature and the reactor cooling water valve opening, respectively. As faults 4 and 11 happen to be changes in the inlet temperature of the reactor cooling water, they are directly related to characteristics 9 and 51. Although fault 9 (feeding temperature of material D) does not have an obviously direct relation with these two characteristics, according to literature [30], they can also distinguish fault 9.

Fig. 4. The prediction performance for sunspot monthly mean numbers.

Fig. 5. TE process flow chart.

For time-series prediction, the training and testing data are generated from 3 simulations corresponding to the 3 faults listed in Table 5. Each simulation run is set to 72 h and the sampling interval is 3 min, so each simulation run contains 1440 samples and each sample has 52 variables. In this paper, only variables 9 and 51 are needed. By taking out variables 9 and 51 of these 3 simulations, 6 time series containing 1440 data points are obtained, with the first 1002 data points being used. For each time series, we use 10 continuous data points as the input of the network and the 13th data point as the desired output of the network, so the prediction is 3 steps ahead.

In this case, 990 samples are formed. The first 660 samples are used as the training data while the rest 330 samples are used as the testing data. The parameters of the reservoir networks are listed in Table 6. Fig. 6 shows the prediction performance of the 6 time series; the fitting curves and error curves indicate that the prediction accuracy is acceptable. Table 7 shows the prediction mean square error of the six time series. Then we use these predicted data for fault prediction with a static reservoir, in which the internal connection weight is zero. In this situation, the reservoir becomes an extreme learning machine, so it is unnecessary to construct a large-scale reservoir. Experimentally, when the number of hidden layer nodes equals 9, the correct rate of prediction always reaches 81%, which is relatively high in TE fault prediction. Theoretically, fault prediction is more difficult than fault diagnosis; in TE fault diagnosis, a method is considered effective when the rate of correct diagnosis is greater than 80%. The correct rate shows that our method is a competitive one.
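The classification stage — a static reservoir with zero internal weights, which reduces to an extreme learning machine with one random hidden layer and a regularized linear readout — can be sketched on toy two-class data. Everything here (data, seed, C) is our own illustration rather than the TE variables; only the choice of 9 hidden nodes follows the text:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((60, 2))     # toy stand-in for two predicted variables
labels = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)

n_hidden = 9                                   # as in the paper's classifier
W_in = rng.uniform(-1.0, 1.0, size=(n_hidden, 2))
H = 1.0 / (1.0 + np.exp(-(X @ W_in.T)))        # sigmoid hidden activations

C = 1e3                                        # regulating coefficient
W_out = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ labels)

pred = (H @ W_out > 0.5).astype(float)         # threshold the linear readout
accuracy = float((pred == labels).mean())
print(accuracy)
```

With the internal matrix W removed there is no temporal dynamics, so the readout is solved in one shot exactly as in Eq. (17).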

Table 5 Three certain faults of TE process

5. Conclusions

In this paper, we analyze the ill-posedness problem of reservoir neural networks and propose a fault prediction method. Two benchmark problems, Mackey-Glass time series prediction and monthly mean sunspot time series prediction, prove that the proposed algorithm improves the performance and quality of reservoirs, especially the generalization ability. In the simulation section, we apply the proposed algorithm to predict certain faults of the TE process. The correct rate of classification is up to 81.31% with an appropriate static reservoir neural network structure. The fault prediction of the TE process is three steps ahead with a 3-min sampling interval, which gives enough time to adjust for the possible fault. In future study, we will focus on the optimization of the parameters of reservoir neural networks.

Table 6 The relevant parameters of reservoirs for the time series of variables 9 and 51

Fig. 6. The performance of the six time-series prediction.

Table 7 The prediction mean square error of the six time series

References

[1] S.L. Ho, M. Xie, The use of ARIMA models for reliability forecasting and analysis, Comput. Ind. Eng. 35 (1998) 213-216.

[2] C.X. Sun, W.M. Bi, Q. Zhou, R.J. Liao, W.G. Chen, New gray prediction parameter model and its application in electrical insulation fault prediction, Control Theory Appl. 20 (5) (2003) 797-901.

[3] W. Gao, Y.G. Zheng, The discussion on prediction model of nonlinear time series, J. Tsinghua Univ. (Sci. Technol.) 40 (s2) (2000) 6-10.

[4] B. Liu, D.P. Hu, Studies on applying artificial neural networks to some forecasting problems, J. Syst. Eng. 14 (4) (1999) 338-344.

[5] J. Zhang, A.J. Morris, E.B. Martin, Long-term prediction models based on mixed order locally recurrent neural networks, Comput. Chem. Eng. 22 (7) (1998) 1051-1063.

[6] H.D. Xue, Q.X. Zhu, Time series prediction algorithm based on structured analogy, Comput. Eng. 36 (1) (2010) 231-235.

[7] X.P. Lai, H.X. Zhou, C.Q. Yun, Application of hybrid-model neural networks to short-term electric load forecasting, Control Theory Appl. 17 (1) (2000) 69-72.

[8] A. Lapedes, R. Farber, How neural nets work, Proc. Adv. Neural Inf. Process. Syst. (1987) 442-456.

[9] D.Q. Zhang, Y.X. Ning, X.N. Liu, On-line prediction of nonlinear time series using RBF neural networks, Control Theory Appl. 26 (2) (2009) 153-157.

[10] S. Haykin, J. Principe, Making sense of a complex world, IEEE Signal Process. Mag. 15 (3) (1998) 66-68.

[11] M.R. Cowper, B. Mulgrew, C.P. Unsworth, Nonlinear prediction of chaotic signals using a normalized radial basis function network, Signal Process. 82 (5) (2002) 775-789.

[12] Z.P. Feng, X.G. Song, D.X. Xue, A.P. Zheng, Y.M. Sun, Time series prediction based on general regression neural network, J. Vib. Meas. Diagn. 23 (2) (2003) 105-109.

[13] F. Sun, Q.X. Zhu, Study and application on recurrent neural networks controller, J. Beijing Univ. Chem. Technol. 27 (3) (2000) 88-90.

[14] J.C. Principe, J.M. Kuo, Dynamic modeling of chaotic time series with neural networks, Advances in Neural Information Processing Systems, 7, MIT Press, Cambridge, MA, 1995, pp. 311-318.

[15] J. Zhang, K.S. Tang, K.F. Man, Recurrent NN model for chaotic time series prediction, Proc. 23rd Annu. Int. Conf. Ind. Electron., Control, Instrum. (IECON), 3, 1997, pp. 9-14.

[16] M. Han, J.H. Xi, S.G. Xu, F.L. Yin, Prediction of chaotic time series based on the recurrent predictor neural network, IEEE Trans. Signal Process. 52 (2) (2004) 3409-3416.

[17] Herbert Jaeger, The Echo State Approach to Analyzing and Training Recurrent Networks, GMD Report 148, GMD—German National Research Institute for Computer Science, Bremen, 2001.

[18] Wolfgang Maass, Thomas Natschläger, Henry Markram, Real-time computing without stable states: a new framework for neural computation based on perturbations, Neural Comput. 14 (2002) 2531-2560.

[19] D. Verstraeten, B. Schrauwen, M. D'Haene, D. Stroobandt, An experimental unification of reservoir computing methods, Neural Netw. 20 (3) (2007) 391-403.

[20] Herbert Jaeger, Harald Haas, Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication, Science 304 (5667) (2004) 78-80.

[21] Y. Peng, J.M. Wang, X.Y. Peng, Researches on time series prediction with echo state networks, Acta Electron. Sin. 38 (2A) (2010) 148-154.

[22] Herbert Jaeger, Mantas Lukoševičius, Dan Popovici, Udo Siewert, Optimization and applications of echo state networks with leaky-integrator neurons, Neural Netw. 20 (3) (2007) 335-352.

[23] M.D. Skowronski, J.G. Harris, Automatic speech recognition using a predictive echo state network classifier, Neural Netw. 20 (3) (2007) 414-423.

[24] D. Verstraeten, B. Schrauwen, D. Stroobandt, J. Van Campenhout, Isolated word recognition with the liquid state machine: a case study, Inf. Process. Lett. 95 (6) (2005) 521-528.

[25] C.D. Pei, Echo state networks and its applications on image edge detection, Comput. Eng. Appl. 44 (19) (2008) 172-174.

[26] Q.S. Song, Z.R. Feng, A new method to construct complex echo state networks, J. Xi'an Jiaotong Univ. 43 (4) (2009) 1-4.

[27] Georg Holzmann, Helmut Hauser, Echo state networks with filter neurons and a delay&sum readout, Neural Netw. 23 (2) (2010) 244-256.

[28] H.W. Zhang, Sunspot Number Prediction Based on Wavelet Analysis and BP Neural Network, 2009.

[29] Luis T. Antelo, Julio R. Banga, Antonio A. Alonso, Hierarchical design of decentralized control structures for the Tennessee Eastman Process, Comput. Chem. Eng. 32 (2008) 1995-2015.

[30] N. Lv, X.Y. Yu, Fault diagnosis of TE process based on second-order mutual information feature selection, J. Chem. Ind. Eng. 60 (9) (2009) 2252-2258.

☆ Supported by the National Natural Science Foundation of China (61074153).

* Corresponding author.

E-mail address: xuyuan@mail.buct.edu.cn (Y. Xu).
