
    Estimating Weibull Parameters Using Least Squares and Multilayer Perceptron vs. Bayes Estimation

    Computers Materials & Continua, 2022, Issue 5

    Walid Aydi and Fuad S. Alduais

    1 Department of Computer Science, College of Humanities and Science in Al Aflaj, Prince Sattam Bin Abdulaziz University, Al-Aflaj, Saudi Arabia

    2 Department of Mathematics, College of Humanities and Science in Al Aflaj, Prince Sattam Bin Abdulaziz University, Al-Aflaj, Saudi Arabia

    3 Laboratory of Electronics & Information Technologies, Sfax University, Sfax, Tunisia

    4 Department of Administration, Administrative Science College, Thamar University, Yemen

    Abstract: The Weibull distribution is regarded as among the finest in the family of failure distributions. One of the most commonly used techniques for estimating the parameters of the Weibull distribution (WD) is the ordinary least squares (OLS) method, which is useful in reliability and lifetime modeling. In this study, we propose an approach based on the ordinary least squares and the multilayer perceptron (MLP) neural network, called the OLSMLP, that builds on the resilience of the OLS method. The MLP solves the problem of heteroscedasticity that distorts the estimation of the parameters of the WD due to the presence of outliers, and eases the difficulty of determining weights in the case of the weighted least squares (WLS). Another method is proposed by incorporating a weight into the general entropy (GE) loss function to estimate the parameters of the WD, yielding a modified loss function (WGE). Furthermore, a Monte Carlo simulation is performed to examine the performance of the proposed OLSMLP method in comparison with approximate Bayesian estimation (BLWGE) using a weighted GE loss function. The simulation results showed that the two proposed methods produce good estimates even for small sample sizes. In addition, the proposed techniques are typically preferable to other available methods in terms of mean squared error and time requirements.

    Keywords: Weibull distribution; maximum likelihood; ordinary least squares; MLP neural network; weighted general entropy loss function

    1 Introduction

    The parameters of the Weibull distribution are widely used in reliability studies and many engineering applications, such as the lifetime analysis of material strength [1], estimation of rainfall [2], hydrology [3], predictions of material and structural failure [4], renewable and alternative energies [5–8], power electronic systems [9], and many other fields [10–12].

    The probability density function (PDF) of the two-parameter WD is given by:

    The cumulative distribution function (CDF) and the survival function S of the WD can be expressed as

    where the parameters φ and λ represent the scale and the shape of the distribution, respectively.
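    The displayed equations did not survive extraction; the standard two-parameter Weibull forms, consistent with the definitions above (φ the scale, λ the shape), are:

    ```latex
    f(x;\varphi,\lambda)=\frac{\lambda}{\varphi}\left(\frac{x}{\varphi}\right)^{\lambda-1}e^{-(x/\varphi)^{\lambda}},\qquad x>0 \tag{1}
    ```
    ```latex
    F(x;\varphi,\lambda)=1-e^{-(x/\varphi)^{\lambda}},\qquad
    S(x;\varphi,\lambda)=e^{-(x/\varphi)^{\lambda}} \tag{2,\,3}
    ```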

    Several approaches to estimating the parameters of the WD have been proposed [13]. They can generally be classified as manual or numerical [14].

    Manual approaches include the ordinary least squares [15,16], unbiased good linear estimators [17], and weighted least squares [18]. Computational methods include maximum likelihood estimation [19], the moments estimation method [20], the Bayesian approach [21], and least-squares estimation with particle swarm optimization [22].

    In addition to computational methods, many studies in the literature have attempted to use neural networks (NNs) to predict the parameters of the WD in many areas, such as the method developed by Jesus that applies Weibull and ANN analysis to anticipate the shelf life and acidity of vacuum-packed fresh cheese [23]. In survival analysis, Achraf constructed a deep neural network model called DeepWeiSurv, which assumes that the distribution of survival times follows a finite mixture of two-parameter WDs [24]. In another work, in the field of electric power generation, an artificial NN (ANN) and the q-Weibull were applied to the survival function of brushes in hydroelectric generators [25].

    Recently, a few methods have attempted to combine the robustness of the ANN with some of the above statistical methods. Maria modeled the distribution of tree diameters using the OLS and the ANN [26]. In the same spirit, we propose combining the OLS and a neural network to predict the two-parameter WD: the OLS, in its simplest form, assumes a linear relationship between the predictor and the unreliability function, while single-hidden-layer networks handle linear functions more robustly and rapidly than multiple-hidden-layer networks [27].

    In the proposed method, we solve the problem whereby the reliability of the OLS method is compromised by outliers through the introduction of a pre-trained neural network after the linearization of the CDF. The remaining sections of this paper are organized as follows: Section 2 reviews numerical and graphical methods for estimating the parameters of the WD, such as the MLE, OLS, WLS, and BLGE. In Section 3 we present the proposed methods. To evaluate their appropriateness in comparison with competing methods, the relevant performance metrics are covered in Section 4. The results are discussed in Section 5. Finally, the conclusions of this study are provided in Section 6.

    2 Review of Numerical and Graphical Methods for Estimating Parameters of WD

    The most commonly used approaches to estimating the parameters λ and φ of the WD are described below.

    2.1 Maximum Likelihood Estimator (MLE)

    Let (x1, x2, x3, ..., xn) be a set of n random lifetimes from the WD defined by Eq. (1). Then, the likelihood function L and its logarithm ℓ for the given sample observations are shown in Eqs. (4) and (5), respectively [28]:

    The partial derivatives of ℓ with respect to the variables φ and λ are given by:

    The parameter λ can be obtained by using any numerical method, such as Newton–Raphson.
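    As an illustration of this numerical step, the following is a minimal sketch (not the authors' code) of the Newton–Raphson iteration for the shape, followed by the closed-form scale, using the standard two-parameter Weibull likelihood equations:

    ```python
    import numpy as np

    def weibull_mle(x, tol=1e-8, max_iter=100):
        """Newton-Raphson solution of the Weibull shape equation, then the
        closed-form scale. A sketch of the standard MLE iteration."""
        x = np.asarray(x, dtype=float)
        lnx = np.log(x)
        lam = 1.0                                   # initial guess for the shape
        for _ in range(max_iter):
            xl = x ** lam
            num = (xl * lnx).sum()
            den = xl.sum()
            g = 1.0 / lam + lnx.mean() - num / den  # score equation g(lam) = 0
            dg = -1.0 / lam**2 - ((xl * lnx**2).sum() * den - num**2) / den**2
            step = g / dg
            lam -= step
            if lam <= 0:                            # guard against overshoot
                lam = tol
            if abs(step) < tol:
                break
        phi = ((x ** lam).mean()) ** (1.0 / lam)    # scale from the shape
        return lam, phi
    ```

    With a large simulated sample, both estimates converge to the true parameters.
    
    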

    2.2 Ordinary Least Squares Method (OLS)

    To estimate the parameters of the WD, the OLS method is extensively used in mathematics and engineering problems [16]. We can obtain a linear relationship between the parameters by taking the logarithm of Eq. (2) as follows:

    Let Yi = ln[−ln(1 − F(xi; φ, λ))], Xi = ln x(i), α0 = −λ ln φ, and β = λ. Then, Eq. (9) can be written as Yi = α0 + βXi + εi.

    Let X(1), X(2), X(3), ..., X(n) be the order statistics of X1, X2, X3, ..., Xn, and let x(1) < x(2) < x(3) < ... < x(n) be the ordered observations in a random sample of size n. To estimate the values of the cumulative distribution function F(x(i); φ, λ), we use the mean rank method as follows:
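    A minimal sketch of this linearization (illustrative, not the authors' code), using the mean rank i/(n + 1) as the plotting position:

    ```python
    import numpy as np

    def weibull_ols(x):
        """Estimate (shape, scale) by OLS on the linearized Weibull CDF:
        Yi = ln[-ln(1 - F(x_(i)))] regressed on Xi = ln x_(i)."""
        x = np.sort(np.asarray(x, dtype=float))
        n = len(x)
        F = np.arange(1, n + 1) / (n + 1)       # mean rank estimate of F(x_(i))
        X = np.log(x)
        Y = np.log(-np.log(1.0 - F))
        beta, alpha0 = np.polyfit(X, Y, 1)      # slope and intercept
        lam = beta                              # shape: beta = lambda
        phi = np.exp(-alpha0 / lam)             # scale: alpha0 = -lambda*ln(phi)
        return lam, phi
    ```

    The slope recovers the shape directly, and the scale follows from the intercept.
    
    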

    2.3 Weighted Least Squares Method (WLS)

    In the WLS estimate, the parameters λ and φ are the values that minimize the function:

    The biggest challenge in the application of the WLS is finding the weights Wi in Eq. (15). We use the delta method [29] to find them:

    Hence,the weights can be written as follows:

    Minimizing Q*W(λ, φ), we obtain the WLS estimates of λ and φ as

    2.4 Approximate Bayes Estimator

    In this section, the approximate Bayesian estimator of the parameters λ and φ of the WD under a GE loss function is discussed. We assume a non-informative (vague) prior, following [30], as

    The parameters λ and φ are estimated using Lindley's approximation technique. The posterior expectation E is given by Eq. (22) [31]:

    Moreover,it can be asymptotically estimated by:

    where i, j, k, l = 1, 2, ..., m; φ = (φ1, φ2, ..., φm); π(φ) represents the prior distribution of φ; u = u(φ); L = L(φ) is the likelihood function; ρ ≡ ρ(φ) = ln(π(φ)); and σij is element (i, j) of the covariance matrix of the parameter estimators.

    For the two-parameter case φ = (λ, φ), Eq. (22) reduces to:

    The functions in Eq. (24) are computed using the MLEs with respect to λ and φ.

    To apply the Lindley model of Eq. (24) to estimate the parameters of the WD, the following are obtained from Eq. (23):

    The elements σij of the covariance matrix are expressed by

    2.4.1 Estimates Based on General Entropy Loss Function

    The general entropy loss function L for φ, shown in Eq. (24), is expressed in the following form [32]:

    In the same way, the BLGE estimator of φ is found from the following expressions:

    3 Proposed Methods

    In the following sections, we describe the proposed BLWGE and OLSMLP methods.

    3.1 Weighted General Entropy Loss Function

    The proposed WGE loss function augments the GE loss function with a weight, as follows:

    where φ represents the estimated parameters that minimize the expectation of the loss function (Eq. (27)), and w(φ) represents the proposed weight function as expressed by Eq. (28):

    Based on the posterior distribution of the parameter φ, and by using the WGE function given in Eq. (28), we obtain the estimated BLWGE of the parameter φ as follows:

    Thus, we can find that

    Consequently, the BLWGE of the parameter φ, obtained by using the WGE loss function, is as presented in Eq. (29):

    provided that Eφ(φ^(−z)) and Eφ(φ^(−(z+q))) exist and are finite, where Eφ represents the posterior expected value.

    We note that the GE is a special case of the WGE when z = 0 in Eq. (29).
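    In explicit form, and as a reconstruction consistent with the statements above (the weight w(φ) = φ^(−z) is inferred from the expectations Eφ(φ^(−z)) and Eφ(φ^(−(z+q))); the GE part is the standard form of [32]):

    ```latex
    L_{GE}(\hat{\varphi},\varphi) \propto \left(\frac{\hat{\varphi}}{\varphi}\right)^{q} - q\,\ln\frac{\hat{\varphi}}{\varphi} - 1,
    \qquad
    L_{WGE}(\hat{\varphi},\varphi) = \varphi^{-z}\, L_{GE}(\hat{\varphi},\varphi),
    ```
    and minimizing the posterior expectation of L_WGE gives
    ```latex
    \hat{\varphi}_{BLWGE} = \left[\frac{E_{\varphi}\!\left(\varphi^{-(z+q)}\right)}{E_{\varphi}\!\left(\varphi^{-z}\right)}\right]^{-1/q},
    ```
    which reduces to the GE estimator [Eφ(φ^(−q))]^(−1/q) when z = 0.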

    3.1.1 Estimates of Parameters of WD Based on Weighted General Entropy Loss Function

    Based on the WGE and by using Eq. (29), the approximate Bayes estimator for λ is:

    Thus, the weighted Bayes estimator for the scale parameter φ is

    3.2 Ordinary Least Squares and the Multilayer Perceptron Neural Network (OLSMLP)

    As previous studies have shown [14,33], manual calculations yield the smallest standard deviation (STD) in the parameter λ, and are consequently more accurate than computational methods. Moreover, manual estimation methods are more accurate for small sample sizes [14]. However, these methods, especially the OLS, are sensitive to outliers and to specific residual behavior [34]. To solve these problems, many studies have proposed different methods, such as the iterative weighting method based on the modified OLS [34], the WLS, and many other methods based on the WLS [35]. A major challenge in these methods is determining the weights.

    3.2.1 Proposed Method to Estimate Parameters of WD

    We now describe the proposed method, which is divided into two main parts: the linearization of the CDF, and the application of a feedforward network with backpropagation to estimate the values of λ and φ of the WD.

    The OLS method takes the CDF defined in Eq. (2) and linearizes it as described in Eq. (10). It then determines the coefficients α0 and β via linear regression by using the slope and the intercept. The principle used by the OLS to compute α0 and β can be violated even by a few outliers.

    Therefore, instead of using the slope and the intercept, we propose applying Algorithm 1 as described below.

    ·Application of Proposed Model to Estimate Parameters of WD

    The steps used to evaluate the parameters of the WD from the input CSV files are described by Algorithm 1.

    Input: Three comma-separated value (CSV) files containing the matrices Xi and Yi, and their corresponding parameters (shape and scale) SCi.
    Output: The predicted shape λ̂OLSMLP and scale φ̂OLSMLP for the test set.
    1: Normalize the input matrices Xi, Yi, and SCi separately to unit norm using the RobustScaler followed by the MinMaxScaler norm.
    2: Split the normalized Xi, Yi, and SCi into random training and test subsets.
    3: Create the neural network model (define the input layer, hidden layer, and output layer).
    4: Compile the model and fit it to the data.
    5: Predict λ̂OLSMLP and φ̂OLSMLP for the test set.
    6: Evaluate the performance of the proposed model.
    Steps 2, 3, 4, and 5 are explained in more detail in the following subsections:

    ·Data Normalization

    Normalization is an essential preprocessing tool for a neural network [36,37]. Before training a neural network model, the input data are scaled using the RobustScaler norm in a preliminary phase, where each sample with at least one non-zero component is rescaled using the median and quartile range as described by Eq. (38). The RobustScaler norm is used to remove the influence of outliers. Following this, the MinMaxScaler, defined by Eq. (39), is applied to the output of the RobustScaler. The MinMaxScaler scales all the data features to the range [0, 1]:

    where X is a feature vector, Xi is an element of X, and the rescaled elements are obtained by using the MinMaxScaler and the RobustScaler, respectively.
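    The two normalization formulas can be sketched directly from their definitions (a toy illustration, not the training code; the median/IQR step follows Eq. (38) and the min-max step Eq. (39)):

    ```python
    import numpy as np

    def robust_scale(x):
        """(x - median) / IQR, the RobustScaler formula of Eq. (38)."""
        q1, med, q3 = np.percentile(x, [25, 50, 75])
        return (x - med) / (q3 - q1)

    def minmax_scale(x):
        """Map to [0, 1], the MinMaxScaler formula of Eq. (39)."""
        return (x - x.min()) / (x.max() - x.min())

    x = np.array([1.0, 2.0, 3.0, 100.0])   # toy feature with one outlier
    xs = minmax_scale(robust_scale(x))     # chained as in Step 1 of Algorithm 1
    ```

    Both maps are monotone, so the ordering of the samples is preserved while the result lands in [0, 1].
    
    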

    ·Structure of the Proposed Neural Network

    To estimate the parameters of the WD, we propose using a multilayer perceptron (MLP), which is a feedforward network with backpropagation [38]. Following the structure of the MLP, the proposed network, as shown in Fig. 1, consists of an input layer (with n neurons), a hidden layer (with k neurons), and an output layer (with m neurons that yield the Weibull parameters as the output of the network).

    Figure 1: Topology of the proposed MLP

    Various criteria have been proposed in the literature to fix the number of hidden neurons [39]. In our architecture, we use the rule whereby "the number of hidden neurons k should be 2/3 times the size of the input layer, plus the size of the output layer" [38–40].
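    This rule of thumb is a one-liner; for example, with a hypothetical input layer of 30 neurons and the 2 Weibull parameters as outputs, it gives k = 22:

    ```python
    def hidden_neurons(n_inputs, n_outputs):
        """Rule of thumb cited in the text: k = (2/3) * inputs + outputs."""
        return round((2.0 / 3.0) * n_inputs + n_outputs)

    k = hidden_neurons(30, 2)   # hypothetical layer sizes; gives 22
    ```
    
    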

    The hyperbolic tangent activation function (tanh) is used in the hidden layer, and the sigmoid function in the output layer. Both are used frequently in feedforward networks, and are suitable for shallow networks as well as for prediction and mapping applications [38,41].

    The objective of our neural network is a model that performs well on both the training and the test datasets. For this reason, we add a well-known regularization layer, as described in the next section.

    ·Regularization

    Regularization is a technique that can prevent overfitting [37,38]. A number of regularization techniques have been developed in the literature, such as L1 and L2 regularization, bagging, and dropout. In the proposed structure, we use dropout, a well-known technique that randomly "drops out" (omits) hidden neurons of the network, making them unavailable during part of the training [38,42]. This reduces co-adaptation between neurons, which results in less overfitting [38].

    ·Optimization Algorithm

    The optimization of deep networks is an active area of research [43]. The most popular gradient-based optimization algorithms are Adagrad, Momentum, RMSProp, Adam, AdaDelta, AdaMax, Nadam, and AMSGrad [38,43,44]. We chose Nadam due to its superiority in supervised machine learning over the other techniques, especially for deep networks [43]. Moreover, it combines the strengths of the Nesterov accelerated gradient (NAG) and the adaptive moment estimation (Adam) algorithms, as described in [44]:

    t: time step

    αnad: learning rate

    vt: the exponential moving average of squared gradients

    mt: momentum vector

    wt: the weight to update

    ε: smoothing term

    gt: the gradient of L, the loss function to minimize

    β1, β2: momentum decay and scaling decay, respectively
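    The update equations themselves were lost in extraction; writing gt for the gradient of the loss at step t, the standard Nadam formulation of [44] reads (a reconstruction):

    ```latex
    m_t = \beta_1 m_{t-1} + (1-\beta_1)\, g_t, \qquad
    v_t = \beta_2 v_{t-1} + (1-\beta_2)\, g_t^{2},
    ```
    ```latex
    \hat{m}_t = \frac{m_t}{1-\beta_1^{t}}, \qquad
    \hat{v}_t = \frac{v_t}{1-\beta_2^{t}},
    ```
    ```latex
    w_{t+1} = w_t - \frac{\alpha_{nad}}{\sqrt{\hat{v}_t}+\varepsilon}
              \left(\beta_1 \hat{m}_t + \frac{(1-\beta_1)\, g_t}{1-\beta_1^{t}}\right).
    ```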

    4 Performance Metrics

    To evaluate the proposed methods against other methods, we used two statistical tools, the mean squared error (MSE) and the mean absolute percentage error (MAPE) [5], in addition to the computation time.
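    The two indicators are straightforward to compute; a minimal sketch (illustrative helper functions, not the paper's evaluation code):

    ```python
    import numpy as np

    def mse(y_true, y_pred):
        """Mean squared error."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        return float(np.mean((y_true - y_pred) ** 2))

    def mape(y_true, y_pred):
        """Mean absolute percentage error, in percent."""
        y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
        return float(100.0 * np.mean(np.abs((y_true - y_pred) / y_true)))
    ```
    
    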

    5 Results and Discussion

    5.1 Dataset Description

    We generated 250,000 random data points from the WD for different parameters, with values of φ ranging from 1 to 299 and values of λ ranging from 0.5 to 100. For each shape/scale pair, we generated 10,000 samples of different sizes n = 10, 20, 30, 40, and 50.

    We used the same dataset for the neural network in the training phase, but applied one sample to each shape/scale pair. This was unlike the other methods (MLE, OLS, WLS, BLGE, and BLWGE), which used 10,000 samples to estimate the parameters of the WD. The dataset was divided into two subsets. The first subset, referred to as the training dataset, was used to fit the model; it was characterized by known inputs and outputs. The second subset, referred to as the test dataset, was used to evaluate the fitted machine learning model and make predictions on new data for which the expected output was not available. We chose the train–test procedure for our experiments because the dataset was sufficiently large.

    5.2 Experimental Setting

    5.2.1 Parameter Selection for OLSMLP

    In all experiments, we trained the model with Google Colaboratory (GPU) for 25 epochs. We used the Nadam optimizer with a learning rate of αnad = 0.001; the terms representing the momentum decay, scaling decay, and smoothing were kept at their default values: β1 = 0.9, β2 = 0.999, and ε = 10^−7. Dropout with a ratio of 0.6 was applied to the hidden layer. As described in Section 3, the hidden and output layers used the tanh and sigmoid activation functions, respectively. The mean squared error was used as the loss function to estimate the loss of the model.

    5.2.2 Parameter Selection of BLGE and BLWGE

    In all experiments, the parameters of the BLWGE and BLGE were determined empirically. The values of the weights q and z of the BLWGE were −3 and 6, respectively. For the BLGE, the parameter q = 1.5.

    5.3 Estimating Parameters of Weibull Distribution

    5.3.1 Effect of Sample Size on Estimation of WD Parameters Using Prevalent Methods

    Fig. 2 shows the evolution of the average MSE as a function of the sample size n. The MSE decreased quasi-linearly from n = 10 to n = 40 for all methods. Fig. 2 shows that the BLWGE, WLS, BLGE, and MLE had lower MSE values than the OLS across the different sample sizes. We can also deduce that the WLS, GE, and MLE gave similar results, with a slightly better start for the MLE at n = 10.

    5.3.2 Effect of Sample Size on Estimation of WD Parameters Using Proposed Method

    To illustrate how the sample size affects the MSE, Fig. 3 shows its evolution as a function of the sample size n from 10 to 50.

    From Fig. 3, we can deduce that as the sample size increased, the MSE estimated by the proposed method decreased with fluctuations. This fluctuation was due to the random nature of the information used and the limited number of samples (one sample) for each shape/scale pair.

    Tabs. 1 and 2 show the results of the simulation of the proposed method and the other methods considered above. The results show the following:

    Figure 2: The evolution of the MSE using the parameters φ = 2.5 and λ = 1.685 as a function of n = [10–40] for the MLE, OLS, WLS, BLGE, and BLWGE

    Figure 3: The evolution of the MSE using the parameters φ = 0.75 and λ = 1.75 as a function of n = [10–50]

    1. The MLE and WLS behaved similarly, as shown in Tab. 1: their MSE values decreased gradually when the shape value increased at a fixed scale. Conversely, when the scale value increased at a fixed shape, the MSE increased.

    2. The behavior of the OLS and GE was the opposite of that of the MLE and WLS. As depicted in Tab. 1, the MSE increased when the shape increased (at a fixed scale), and decreased when the scale increased (at a fixed shape).

    3. The BLWGE and the OLSMLP behaved similarly in terms of scale estimation, as shown in Tab. 1.

    4. All methods had the same global variation function, as shown in Fig. 4 and Tab. 2.

    5. Globally, the MLE was slightly superior to the other methods in terms of scale estimation, but had the worst shape estimation, as shown in Tab. 2.

    6. The proposed MLP neural network estimated the scale acceptably, better than some methods. Moreover, it outperformed all the other methods in terms of shape estimation most of the time.

    Table 1: MSEs of the estimated φ with varying values of the parameters λ and φ for different WD parameter estimation methods

    Table 2: MSEs of the estimated λ with varying values of the parameters λ and φ using different methods to estimate the parameters of the WD

    Figure 4: MSEs with varying values of the parameters φ = [1 1 1 2.5 3.25 4] and λ = [1.5 1.75 2 4 4 4] for the MLE, OLS, BLGE, WLS, and the proposed methods

    From Tab. 3, we see that the two statistical indicators, MSE and MAPE, yielded different values. The global rank was calculated to determine the best method. The results in the table indicate that the proposed method offered the best compromise between shape and scale estimation, as indicated by the global rank. Moreover, it retained the speed of the OLS while improving the accuracy of estimation of the parameters of the WD compared with the MLE, BLGE, and BLWGE.

    Table 3: Performance evaluation of the MLE, OLS, WLS, BLGE, BLWGE, and OLSMLP methods using different statistical indicators

    6 Conclusion

    This study proposed a method to estimate the parameters of the WD, based on the OLS graphical method and the MLP neural network. The MLP solves the problems caused by the presence of outliers and eases the difficulty of determining the weights in the WLS method. It yielded acceptable results in simulations, especially in terms of shape estimation. It is also faster than the MLE, BLGE, and BLWGE.

    We also proposed a second method (BLWGE), in which we introduced a weight into the GE loss function. The simulation results showed that the BLWGE yields good results, especially in terms of shape estimation, compared with the other methods.

    Acknowledgement: This project was supported by the Deanship of Scientific Research at Prince Sattam bin Abdulaziz University under Research Project No. 2020/01/16725.

    Funding Statement: The authors are grateful to the Deanship of Scientific Research at Prince Sattam bin Abdulaziz University, Supporting Project Number (2020/01/16725), Prince Sattam bin Abdulaziz University, Saudi Arabia.

    Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding this study.
