
Estimating Weibull Parameters Using Least Squares and Multilayer Perceptron vs. Bayes Estimation

2022-08-24 03:31:44 Walid Aydi and Fuad Alduais
Computers, Materials & Continua, 2022, Issue 5

Walid Aydi and Fuad S. Alduais

1Department of Computer Science, College of Humanities and Science in Al Aflaj, Prince Sattam Bin Abdulaziz University, Al-Aflaj, Saudi Arabia

2Department of Mathematics, College of Humanities and Science in Al Aflaj, Prince Sattam Bin Abdulaziz University, Al-Aflaj, Saudi Arabia

3Laboratory of Electronics & Information Technologies, Sfax University, Sfax, Tunisia

4Department of Administration, Administrative Science College, Thamar University, Yemen

Abstract: The Weibull distribution is regarded as among the finest in the family of failure distributions. One of the most commonly used techniques for estimating the parameters of the Weibull distribution (WD) is the ordinary least squares (OLS) method, which is useful in reliability and lifetime modeling. In this study, we propose an approach based on the ordinary least squares and the multilayer perceptron (MLP) neural network, called the OLSMLP, that builds on the resilience of the OLS method. The MLP solves the problem of heteroscedasticity that distorts the estimation of the parameters of the WD due to the presence of outliers, and eases the difficulty of determining weights in the case of the weighted least squares (WLS). Another method is proposed by incorporating a weight into the general entropy (GE) loss function used to estimate the parameters of the WD, yielding a modified loss function (WGE). Furthermore, a Monte Carlo simulation is performed to examine the performance of the proposed OLSMLP method in comparison with approximate Bayesian estimation (BLWGE) using the weighted GE loss function. The results of the simulation show that the two proposed methods produce good estimates even for small sample sizes. In addition, the techniques proposed here are typically the preferred options when estimating parameters compared with other available methods, in terms of the mean squared error and time requirements.

Keywords: Weibull distribution; maximum likelihood; ordinary least squares; MLP neural network; weighted general entropy loss function

    1 Introduction

The parameters of the Weibull distribution are widely used in reliability studies and many engineering applications, such as the lifetime analysis of material strength [1], estimation of rainfall [2], hydrology [3], predictions of material and structural failure [4], renewable and alternative energies [5–8], power electronic systems [9], and many other fields [10–12].

The probability density function (PDF) of the two-parameter WD is given by:

f(x; ?, λ) = (λ/?)(x/?)^(λ−1) exp[−(x/?)^λ], x > 0

The cumulative distribution function (CDF) and the survival function S of the WD can be expressed as

F(x; ?, λ) = 1 − exp[−(x/?)^λ], S(x; ?, λ) = exp[−(x/?)^λ]

where the parameters ? and λ represent the scale and the shape of the distribution, respectively.

Several approaches to estimating the parameters of the WD have been proposed [13]. They can generally be classified as manual or numerical [14].

Manual approaches include the ordinary least squares [15,16], unbiased good linear estimators [17], and weighted least squares [18]. Computational methods include maximum likelihood estimation [19], the moments estimation method [20], the Bayesian approach [21], and least-squares estimation with particle swarm optimization [22].

In addition to computational methods, many studies in the literature have attempted to use the neural network (NN) to anticipate the parameters of the WD in many areas, such as the method developed by Jesus, which applies Weibull and ANN analysis to anticipate the shelf life and acidity of vacuum-packed fresh cheese [23]. In survival analysis, Achraf constructed a deep neural network model called DeepWeiSurv, in which the distribution of survival times is assumed to follow a finite mixture of two-parameter WDs [24]. In another work, in the field of electric power generation, an artificial NN (ANN) and the q-Weibull distribution were applied to the survival function of brushes in hydroelectric generators [25].

Recently, a few methods have attempted to combine the robustness of the ANN with some of the above statistical methods. Maria modeled the distribution of tree diameters using the OLS and the ANN [26]. In the same spirit, we propose combining the OLS and a neural network to predict the two-parameter WD. This choice rests on two observations: the OLS, in its simplest form, assumes a linear relationship between the predictor and the unreliability function; and single-hidden-layer networks handle such linear relationships more robustly and rapidly than multiple-hidden-layer networks [27].

In the proposed method, we solve the problem whereby the reliability of the OLS method is compromised by outliers through the introduction of a pre-trained neural network after the linearization of the CDF. The remaining sections of this paper are organized as follows: Section 2 provides a review of different numerical and graphical methods for estimating the parameters of the WD, such as the MLE, OLS, WLS, and BLGE. In Section 3 we present the proposed methods. To evaluate their appropriateness in comparison with competing methods, the relevant performance metrics are covered in Section 4. The results are discussed in Section 5. Finally, the conclusions of this study are provided in Section 6.

    2 Review of Numerical and Graphical Methods for Estimating Parameters of WD

The most commonly used approaches to estimating the parameters λ and ? of the WD are described below.

2.1 Maximum Likelihood Estimator (MLE)

Let (x1, x2, x3, ..., xn) be a set of n random lifetimes from the WD defined by Eq. (1). Then, the likelihood function Lf and its corresponding logarithm ℓ for the given sample observations are shown in Eqs. (4) and (5), respectively [28]:

The partial derivatives of ℓ with respect to the variables ? and λ are given by:

The parameter λ can be obtained by using any numerical method, such as Newton–Raphson.
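Since Eqs. (4)–(8) did not survive extraction, the sketch below assumes the standard two-parameter Weibull profile-likelihood equations: the shape λ solves a single nonlinear equation, after which the scale follows in closed form. It is a minimal NumPy illustration of the Newton–Raphson step mentioned above, not the paper's exact code:

```python
import numpy as np

def weibull_mle(x, lam0=1.0, tol=1e-10, max_iter=100):
    """Estimate Weibull shape (lambda) and scale by Newton-Raphson.

    Solves g(lam) = sum(x^lam ln x)/sum(x^lam) - 1/lam - mean(ln x) = 0,
    the profile-likelihood equation for the shape, then recovers the
    scale as (mean(x^lam))**(1/lam).
    """
    x = np.asarray(x, dtype=float)
    logx = np.log(x)
    lam = lam0
    for _ in range(max_iter):
        xl = x ** lam
        g = (xl * logx).sum() / xl.sum() - 1.0 / lam - logx.mean()
        # numerical derivative of g for the Newton step
        h = 1e-6 * max(lam, 1.0)
        xlh = x ** (lam + h)
        gh = (xlh * logx).sum() / xlh.sum() - 1.0 / (lam + h) - logx.mean()
        step = g / ((gh - g) / h)
        new = lam - step
        if new <= 0:            # safeguard against overshooting
            new = lam / 2.0
        if abs(new - lam) < tol:
            lam = new
            break
        lam = new
    scale = (x ** lam).mean() ** (1.0 / lam)
    return lam, scale
```

On a large sample drawn with shape 2 and scale 1.5, for example, the returned pair lands close to (2, 1.5).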

2.2 Ordinary Least Squares Method (OLS)

To estimate the parameters of the WD, the OLS method is extensively used in mathematics and engineering problems [16]. We can obtain a linear relationship between the parameters by taking the logarithm of Eq. (2) as follows:

Let Yi = ln[−ln(1 − F(xi; ?, λ))], Xi = ln x(i), α0 = −λ ln ?, and β = λ. Then, Eq. (9) can be written as Yi = α0 + βXi + εi.

Let X(1), X(2), X(3), ..., X(n) be the order statistics of X1, X2, X3, ..., Xn, and let x(1) < x(2) < x(3) < ... < x(n) be the ordered observations in a random sample of size n. To estimate the values of the cumulative distribution function F(x(i); ?, λ), we use the mean rank method, F̂(x(i)) = i/(n + 1).
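Combining the linearization and the mean-rank plotting position, the OLS estimate reduces to a straight-line fit on the transformed points; a short NumPy sketch (the transformed variables follow the definitions above) is:

```python
import numpy as np

def weibull_ols(x):
    """OLS (Weibull probability plot) estimate of shape and scale.

    Y_i = ln(-ln(1 - F(x_(i)))) is regressed on X_i = ln x_(i), with
    the mean-rank estimate F(x_(i)) = i / (n + 1). The slope is the
    shape lambda; the intercept is -lambda * ln(scale).
    """
    xs = np.sort(np.asarray(x, dtype=float))
    n = xs.size
    F = np.arange(1, n + 1) / (n + 1.0)      # mean rank
    X = np.log(xs)
    Y = np.log(-np.log(1.0 - F))
    beta, alpha0 = np.polyfit(X, Y, 1)       # slope, intercept
    return beta, np.exp(-alpha0 / beta)
```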

2.3 Weighted Least Squares Method (WLS)

In the WLS estimate, the parameters λ and ? are the values that minimize the function:

The biggest challenge in the application of the WLS is finding the weights Wi in Eq. (15). We use the delta method [29] to find them:

Hence, the weights can be written as follows:

Minimizing Q*W(λ, ?), we obtain the WLS estimates of λ and ? as
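As Eqs. (15)–(18) are not reproduced here, the following sketch uses a commonly cited delta-method weight, W_i = [(1 − F_i) ln(1 − F_i)]², which may differ in detail from the paper's Eq. (16); it illustrates how the weights plug into the linearized fit:

```python
import numpy as np

def weibull_wls(x):
    """WLS estimate of Weibull shape and scale on the linearized CDF.

    Uses delta-method weights W_i = ((1 - F_i) * ln(1 - F_i))**2, which
    down-weight the extreme order statistics, where the variance of
    Y_i = ln(-ln(1 - F_i)) is largest.
    """
    xs = np.sort(np.asarray(x, dtype=float))
    n = xs.size
    F = np.arange(1, n + 1) / (n + 1.0)
    X = np.log(xs)
    Y = np.log(-np.log(1.0 - F))
    W = ((1.0 - F) * np.log(1.0 - F)) ** 2
    # np.polyfit minimizes sum((w_i * r_i)**2), so pass sqrt(W)
    beta, alpha0 = np.polyfit(X, Y, 1, w=np.sqrt(W))
    return beta, np.exp(-alpha0 / beta)
```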

    2.4 Approximate Bayes Estimator

In this section, the approximate Bayesian estimator of the parameters λ and ? of the WD under a GE loss function is discussed. We assume a non-informative (vague) prior according to [30] as

The parameters λ and φ are estimated using Lindley's approximation technique. The posterior expectation E is given by Eq. (22) [31]:

    Moreover,it can be asymptotically estimated by:

where i, j, k, l = 1, 2, ..., m; φ = (φ1, φ2, ..., φm); π(φ) represents the prior distribution of φ; u = u(φ); L = L(φ) is the likelihood function; ρ ≡ ρ(φ) = ln(π(φ)); and σij is element (i, j) of the covariance matrix of the parameter estimators.

For the two-parameter case φ = (λ, ?), Eq. (22) reduces to:

The functions in Eq. (24) are computed using the MLEs with respect to λ and ?.

To apply the Lindley model of Eq. (24) to estimate the parameters of the WD, the following are obtained from Eq. (23):

The elements σij of the covariance matrix are expressed by

    2.4.1 Estimates Based on General Entropy Loss Function

The general entropy loss function L for φ, shown in Eq. (24), is expressed in the following form [32]:

In the same way, the BLGE estimator of ? is found by the following expressions:

    3 Proposed Methods

In the following sections, we describe the proposed BLWGE and OLSMLP methods.

    3.1 Weighted General Entropy Loss Function

The WGE loss function is proposed as a weighted version of the GE loss function, as follows:

where φ̂ represents the estimate that minimizes the expectation of the loss function (Eq. (27)), and w(φ) represents the proposed weight function as expressed by Eq. (28):

Based on the posterior distribution of the parameter φ, and by using the WGE function given in Eq. (28), we obtain the estimated BLWGE of the parameter ? as follows:

    Thus,we can find that

Consequently, the BLWGE of parameter φ, obtained by using the WGE loss function, is as presented in Eq. (29):

provided that Eφ(φ^−z) and Eφ(φ^−(z+q)) exist and are finite, where Eφ represents the posterior expected value.

We note that the GE is a special case of the WGE when z = 0 in Eq. (29).
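Because the displayed equations were lost, the following LaTeX restates, as a reconstruction consistent with the surrounding text, the GE loss, the proposed weight, and the resulting WGE estimator of Eq. (29):

```latex
% General entropy (GE) loss for a parameter \varphi
L_{GE}(\hat{\varphi},\varphi) \;\propto\;
  \Bigl(\tfrac{\hat{\varphi}}{\varphi}\Bigr)^{q}
  - q \ln\!\Bigl(\tfrac{\hat{\varphi}}{\varphi}\Bigr) - 1

% Proposed weight function
w(\varphi) = \varphi^{-z}

% Minimizing E_\varphi\!\left[w(\varphi)\,L_{GE}(\hat{\varphi},\varphi)\right]
% over \hat{\varphi} yields the BLWGE estimator
\hat{\varphi}_{WGE} =
  \left[\frac{E_{\varphi}\!\left(\varphi^{-(z+q)}\right)}
             {E_{\varphi}\!\left(\varphi^{-z}\right)}\right]^{-1/q}
```

Setting z = 0 collapses the ratio to Eφ(φ^−q) and recovers the usual GE estimator [Eφ(φ^−q)]^−1/q, matching the remark above.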

    3.1.1 Estimates of Parameters of WD Based on Weighted General Entropy Loss Function

Based on the WGE and by using Eq. (29), the approximate Bayes estimator for λ is:

Thus, the weighted Bayes estimator for the scale parameter ? is

3.2 Ordinary Least Squares and the Multilayer Perceptron Neural Network (OLSMLP)

As previous studies have shown [14,33], manual calculations yield the smallest standard deviation (STD) in the parameter λ, and are consequently more accurate than computational methods. Moreover, methods of manual estimation are more accurate for small sample sizes [14]. However, these methods, especially the OLS, are sensitive to outliers and to specific residual behavior [34]. To solve these problems, many studies have proposed different methods, such as the iterative weighting method based on the modified OLS [34], the WLS, and many other methods based on the WLS [35]. A major challenge in these methods is determining the weights.

    3.2.1 Proposed Method to Estimate Parameters of WD

We now describe the proposed method, which is divided into two main parts: the linearization of the CDF, and the application of a feedforward network with backpropagation to estimate the values of λ and ? of the WD.

The OLS method takes the CDF defined in Eq. (2) and linearizes it as described in Eq. (10). It then determines the coefficients α0 and β via linear regression by using the slope and the intercept. The principle used by the OLS to compute α0 and β can be compromised by even a few outliers.

    Therefore, instead of using the slope and the intercept, we propose applying Algorithm 1 as described below.

    ·Application of Proposed Model to Estimate Parameters of WD

The steps used to evaluate the parameters of the WD from the input CSV file are described by Algorithm 1.

Input: Three comma-separated value (CSV) files containing the matrices Xi and Yi, and their corresponding parameters (shape and scale) SCi.
Output: The predicted shape λ̂OLSMLP and scale ?̂OLSMLP for the test set.
1: Normalize the input matrices Xi, Yi, and SCi separately to unit norm using the RobustScaler followed by the MinMaxScaler norm.
2: Split the normalized Xi, Yi, and SCi into random training and test subsets.
3: Create the neural network model (define the input layer, hidden layer, and output layer).
4: Compile the model and fit it to the data.
5: Predict λ̂OLSMLP and ?̂OLSMLP for the test set.
6: Evaluate the performance of the proposed model.
Steps 2, 3, 4, and 5 are explained in more detail in the following subsections.

    ·Data Normalization

Normalization is an essential preprocessing tool for a neural network [36,37]. Before training a neural network model, the input data are scaled using the RobustScaler norm in a preliminary phase, where each sample with at least one non-zero component is rescaled using the median and quartile range as described by Eq. (38). The RobustScaler norm is used to remove the influence of outliers. Following this, the MinMaxScaler, defined by Eq. (39), is applied to the output of the RobustScaler. The MinMaxScaler scales all the data features to the range [0, 1]:

where X is a feature vector and Xi is an element of X; Eqs. (38) and (39) give the element rescaled by the RobustScaler and the MinMaxScaler, respectively.
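A self-contained NumPy sketch of the two-stage scaling (mirroring scikit-learn's RobustScaler followed by MinMaxScaler, applied per feature column; the function name is ours) is:

```python
import numpy as np

def robust_then_minmax(X):
    """Two-stage, per-column normalization.

    Stage 1 (RobustScaler): (x - median) / IQR, insensitive to outliers.
    Stage 2 (MinMaxScaler): rescale each column to [0, 1].
    """
    X = np.asarray(X, dtype=float)
    med = np.median(X, axis=0)
    q1, q3 = np.percentile(X, [25, 75], axis=0)
    R = (X - med) / (q3 - q1)                 # robust scaling
    return (R - R.min(axis=0)) / (R.max(axis=0) - R.min(axis=0))
```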

    ·Structure of the Proposed Neural Network

To estimate the parameters of the WD, we propose using a multilayer perceptron (MLP), which is a feedforward network with backpropagation [38]. Following the structure of the MLP, the proposed network, as shown in Fig. 1, consists of an input layer (with n neurons), a hidden layer (with k neurons), and an output layer (with m neurons that yield the Weibull parameters as the output of the network).

    Figure 1:Topology of the proposed MLP

Various criteria have been proposed in the literature to fix the number of hidden neurons [39]. In our architecture, we use the rule whereby "the number of hidden neurons k should be 2/3 times the size of the input layer, plus the size of the output layer" [38–40].

The hyperbolic tangent activation function (tanh) is used in the hidden layer, and the sigmoid function in the output layer. Both are used frequently in feedforward networks, and are suitable for shallow networks as well as for prediction and mapping applications [38,41].
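To make the topology concrete, the following NumPy sketch wires together the hidden-layer sizing rule, the tanh hidden layer, and the sigmoid output layer; the weights are random placeholders rather than trained values:

```python
import numpy as np

def hidden_size(n_in, n_out):
    """Rule of thumb: 2/3 of the input size plus the output size."""
    return round(2 * n_in / 3) + n_out

def mlp_forward(x, W1, b1, W2, b2):
    """One forward pass: tanh hidden layer, sigmoid output layer."""
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid

rng = np.random.default_rng(0)
n_in, n_out = 12, 2                 # 2 outputs: shape and scale
k = hidden_size(n_in, n_out)        # hidden-layer width
W1, b1 = rng.normal(size=(n_in, k)), np.zeros(k)
W2, b2 = rng.normal(size=(k, n_out)), np.zeros(n_out)
y = mlp_forward(rng.normal(size=(5, n_in)), W1, b1, W2, b2)
```

The sigmoid output keeps both predictions in (0, 1), which matches the [0, 1]-normalized targets produced in the preprocessing step.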

The objective of our neural network is a model that performs well on both the training and the test datasets. For this reason, we add a well-known regularization technique, as described in the next section.

    ·Regularization

Regularization is a technique that can prevent overfitting [37,38]. A number of regularization techniques have been developed in the literature, such as L1 and L2 regularization, bagging, and dropout. In the proposed structure, we use dropout, a well-known technique that randomly "drops out," or omits, hidden neurons of the neural network, making them unavailable during part of the training [38,42]. This reduces co-adaptation between neurons, which results in less overfitting [38].

    ·Optimization Algorithm

The optimization of deep networks is an active area of research [43]. The most popular gradient-based optimization algorithms are Adagrad, Momentum, RMSProp, Adam, AdaDelta, AdaMax, Nadam, and AMSGrad [38,43,44]. We chose Nadam due to its superiority in supervised machine learning over the other techniques, especially for deep networks [43]. Moreover, it combines the strengths of the Nesterov accelerated gradient (NAG) and adaptive moment estimation (Adam) algorithms, as described in [44]:

t: time step

αnad: learning rate

vt: exponential moving average of squared gradients

mt: momentum vector

wt: the weight to be updated

ε: smoothing term

∇wtL: gradient of the loss function L to be minimized

β1, β2: momentum decay and scaling decay, respectively
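The symbols above can be assembled into the update rule itself; the sketch below follows the common Nadam formulation (bias-corrected Adam with a Nesterov look-ahead on the momentum term) and demonstrates it on a toy quadratic loss:

```python
import numpy as np

def nadam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-7):
    """One Nadam update: Adam with a Nesterov look-ahead on the momentum."""
    m = beta1 * m + (1 - beta1) * grad           # momentum vector m_t
    v = beta2 * v + (1 - beta2) * grad ** 2      # avg of squared gradients v_t
    m_hat = m / (1 - beta1 ** t)                 # bias corrections
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: mix corrected momentum with the raw gradient
    m_bar = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    w = w - lr * m_bar / (np.sqrt(v_hat) + eps)
    return w, m, v

# toy example: minimize L(w) = (w - 3)^2, whose gradient is 2(w - 3)
w, m, v = 0.0, 0.0, 0.0
for t in range(1, 2001):
    w, m, v = nadam_step(w, 2 * (w - 3.0), m, v, t, lr=0.1)
```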

    4 Performance Metrics

To evaluate the proposed methods with respect to other methods, we used two statistical tools, the mean squared error (MSE) and the mean absolute percentage error (MAPE) [5], in addition to the computation time.
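For reference, the two indicators can be computed as follows (MAPE expressed as a percentage; the helper names are ours):

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean squared error."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean((y_true - y_pred) ** 2)

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))
```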

    5 Results and Discussion

    5.1 Dataset Description

We generated 250,000 random data points from the WD for different parameters, with values of ? ranging from 1 to 299 and values of λ ranging from 0.5 to 100. For each shape/scale pair, we generated 10,000 samples of different sizes n = 10, 20, 30, 40, and 50.
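This sampling step can be reproduced with NumPy, whose weibull() generator draws from the unit-scale distribution, so the draw is multiplied by the scale ?; the pair of values below is illustrative, not the paper's full grid:

```python
import numpy as np

def weibull_samples(shape, scale, n, rng=None):
    """Draw n points from a two-parameter Weibull(shape, scale).

    NumPy's weibull() uses scale 1, so the draw is multiplied by scale.
    """
    if rng is None:
        rng = np.random.default_rng()
    return scale * rng.weibull(shape, size=n)

rng = np.random.default_rng(42)
sample = weibull_samples(2.0, 1.5, n=50, rng=rng)   # one sample of size 50
```

A quick sanity check is that the empirical mean of a large sample approaches scale * Γ(1 + 1/shape).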

We used the same dataset for the neural network in the training phase, but applied one sample to each shape/scale pair. This was unlike the other methods (MLE, OLS, WLS, BLGE, and BLWGE), which used 10,000 samples to estimate the parameters of the WD. This dataset was divided into two subsets. The first subset was used to fit the model, and is referred to as the training dataset; it was characterized by known inputs and outputs. The second subset is referred to as the test dataset, and was used to evaluate the fitted machine learning model and make predictions on new data for which the expected output was withheld. We chose the train–test procedure for our experiments because the available dataset was sufficiently large.

    5.2 Experimental Setting

    5.2.1 Parameter Selection for OLSMLP

In all experiments, we trained the model on Google Colaboratory (GPU) for 25 epochs. We used the Nadam optimizer with a learning rate of αnad = 0.001; the terms representing the momentum decay, scaling decay, and smoothing were kept at their default values: β1 = 0.9, β2 = 0.999, and ε = 10−7. Dropout with a ratio of 0.6 was applied to the hidden layer. As described in Section 3, the hidden and output layers used the tanh and sigmoid activation functions, respectively. The loss function, used to estimate the loss of the model, was the mean squared error.

    5.2.2 Parameter Selection of BLGE and BLWGE

In all experiments, the parameters of the BLWGE and BLGE were determined empirically. The values of the weights q and z of the BLWGE were −3 and 6, respectively. For the BLGE, the parameter q = 1.5.

    5.3 Estimating Parameters of Weibull Distribution

    5.3.1 Effect of Sample Size on Estimation of WD Parameters Using Prevalent Methods

Fig. 2 shows the evolution of the average MSE as a function of the sample size n. The MSE decreased quasi-linearly from n = 10 to n = 40 for all methods. Fig. 2 shows that the BLWGE, WLS, BLGE, and MLE had lower MSE values than the OLS for the different sample sizes. We can also deduce that the WLS, GE, and MLE gave similar results, with a slightly better start for the MLE at n = 10.

    5.3.2 Effect of Sample Size on Estimation of WD Parameters Using Proposed Method

To illustrate how the sample size affects the calculation of the MSE, Fig. 3 shows the evolution of the latter as a function of the sample size n from 10 to 50.

From Fig. 3, we can deduce that as the sample size increased, the estimate of the MSE by the proposed method decreased while fluctuating. This fluctuation was due to the random nature of the information used and the limited number of samples (one sample) for each shape/scale pair.

Tabs. 1 and 2 show the results of the simulation of the proposed method and the other methods considered above. The results show the following:

Figure 2: The evolution of the MSE using the parameters ? = 2.5 and λ = 1.685 as a function of n = [10–40] for the MLE, OLS, WLS, BLGE, and BLWGE

Figure 3: The evolution of the MSE using the parameters ? = 0.75 and λ = 1.75 as a function of n = [10–50]

1. The MLE and WLS behaved similarly, as shown in Tab. 1: Their MSE values decreased gradually when the shape value increased at a fixed scale. Conversely, when the scale value increased at a fixed shape, the MSE increased.

2. The behavior of the OLS and GE was the opposite of that of the MLE and WLS. As depicted in Tab. 1, the MSE increased when the shape increased (at a fixed scale), and decreased when the scale increased (at a fixed shape).

3. The BLWGE and the OLSMLP behaved similarly in terms of scale estimation, as shown in Tab. 1.

4. All methods had the same global variation function, as shown in Fig. 4 and Tab. 2.

5. The MLE was globally slightly superior to the other methods in terms of scale estimation, but had the worst shape estimation, as shown in Tab. 2.

6. The proposed MLP neural network estimated the scale acceptably, better than some methods. Moreover, it outperformed all other methods in terms of shape estimation most of the time.

Table 1: MSEs of the estimated ? with varying values of the parameters λ and ? for different WD parameter estimation methods

Table 2: MSEs of the estimated λ with varying values of the parameters λ and ? using different methods to estimate the parameters of the WD

Figure 4: MSEs of the estimates with varying values of the parameters ? = [1 1 1 2.5 3.25 4] and λ = [1.5 1.75 2 4 4 4] for the MLE, OLS, BLGE, WLS, and the proposed methods

From Tab. 3, we see that the two statistical indicators, MSE and MAPE, yielded different values. The global rank was calculated to identify the best method. The results in the table indicate that the proposed method offered the best compromise between shape and scale estimation, as indicated by the global rank. Moreover, it retained the speed of the OLS and enhanced the accuracy of estimation of the parameters of the WD compared with the MLE, BLGE, and BLWGE.

Table 3: Performance evaluation of the MLE, OLS, WLS, BLGE, BLWGE, and OLSMLP methods using different statistical indicators

    6 Conclusion

This study proposed a method to estimate the parameters of the WD. The method is based on the OLS graphical method and the MLP neural network. The MLP solves the problems caused by the presence of outliers and eases the difficulty of determining the weights in the WLS method. It yielded acceptable results in simulations, especially in terms of shape estimation. It is also faster than the MLE, BLGE, and BLWGE.

We also proposed a second method (BLWGE), in which we introduced a weight into the GE loss function. The results of simulations showed that the BLWGE yields good results, especially in terms of shape estimation, compared with the other methods.

Acknowledgement: This project was supported by the Deanship of Scientific Research at Prince Sattam bin Abdulaziz University under Research Project No. 2020/01/16725.

Funding Statement: The authors are grateful to the Deanship of Scientific Research at Prince Sattam bin Abdulaziz University, Supporting Project Number (2020/01/16725), Prince Sattam bin Abdulaziz University, Saudi Arabia.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding this study.
