• <tr id="yyy80"></tr>
  • <sup id="yyy80"></sup>
  • <tfoot id="yyy80"><noscript id="yyy80"></noscript></tfoot>
  • 99热精品在线国产_美女午夜性视频免费_国产精品国产高清国产av_av欧美777_自拍偷自拍亚洲精品老妇_亚洲熟女精品中文字幕_www日本黄色视频网_国产精品野战在线观看 ?

    LEARNING RATES OF KERNEL-BASED ROBUST CLASSIFICATION*

    2022-06-25 02:13:18ShuhuaWANG王淑華

    Shuhua WANG (王淑華)

    School of Information Engineering,Jingdezhen Ceramic University,Jingdezhen 333403,China

    E-mail:w614sh@126.com

    Baohuai SHENG (盛寶懷)?

    Department of Finance,Zhejiang Yuexiu University,Shaoxing 312030,China Department of Applied Statistics,Shaoxing University,Shaoxing 312000,China

    E-mail:shengbaohuai@163.com

    Abstract This paper considers a robust kernel regularized classification algorithm with a non-convex loss function which is proposed to alleviate the performance deterioration caused by the outliers.A comparison relationship between the excess misclassification error and the excess generalization error is provided;from this,along with the convex analysis theory,a kind of learning rate is derived.The results show that the performance of the classifier is effected by the outliers,and the extent of impact can be controlled by choosing the homotopy parameters properly.

    Key words Support vector machine;robust classification;quasiconvex loss function;learning rate;right-sided directional derivative

    1 Introduction

    The problem of classification is one of the most considered in learning theory.The goal of classification is to construct a classifier with a small misclassification error.Of the many classical learning methods that have been provided,the support vector machine (SVM) has been one of the most successful learning algorithms (see for example[1–6]).

    LetXbe a given compact set in thed-dimensional Euclidean space Rd(input space),and letY={-1,1}(representing the two classes).Letρbe a fixed but unknown probability distribution onZ=X×Ywhich yields its marginal distributionρXonXand its conditional distributionρ(·|x)=P (·|x) atx∈X.Then we denote bythe samples i.i.d.(independently and identically drawn) according to the distributionρ.

    As we know,the goal of classification is to produce a binary classifier C:X→Y.The misclassification error of C,which is used to measure the prediction power of C,is defined by

    The best classifier,called the Bayes rule,is given by

    The classifiers considered in this paper are induced by real-valued functionsf:X→R as Cf=sgn (f),which is defined by sgn (f)=1 iff(x)≥0 and sgn (f)=-1 otherwise.Recall that the regression ofρis

    It is easy to see thatfc(x)=sgn (fρ)(x).

    The Tikhonov regularization technique is an effective method for solving ill-posed problems ([7–10]).In recent years,many classification algorithms generated from Tikhonov regularization schemes have been developed and the convergencerates have been investigated (see,for example,[11–17]and references therein).We now briefly introduce the general framework of the kernel regularization classification algorithm.

    Given a classifying loss functionV(r):R→R+,the generalization error is defined as

    where F is the set of all of the measurable functions onX.

    We call HKa reproducing kernel Hilbert space (RKHS) associated with a Mercer kernelK(x,y)(see for example[17–23]) if,for allf∈HKandx∈X,it holds that

    The general kernel regularized classification associated with a given reproducing kernel Hilbert space HKtakes the form

    whereλ>0 is the regularization parameter.

    Based on model (1.1),a large number of scholars have studied the kernel regularization classification algorithms with some convex loss functions (see,for example,[11–14,16,17,19]).However,most of these research results do not consider the impact of outliers on the performance of the algorithms.It is known that,in many practical applications,outliers often occur in real data sets.Outliers can be described as data points that are far away from the pattern set by the majority of the data (see[24–26]).Outliers usually have a large impact on non-robust methods and can even make the whole statistical analysis useless if a non-robust loss is used (see,for example,[27,28]).

    In the literature of learning theory,a great deal of effort has been made to improve the robustness of the SVM and other similar learning algorithms.Robust loss functions for SV classification have been introduced and have been intensively studied (see,for example,[27–30]).Unfortunately,robustness comes with a cost:when minimizing a convex loss,even a single data point can dominate the result;that is,any convex loss function exhibits necessarily unbounded sensitivity to even a single outlier.The classical literature achieves bounded outlier sensitivity by considering redescending loss functions or bounded loss functions[27],both of which are inherently non-convex.As observed,non-convex loss functions are necessary to alleviate the effect of outliers.Along these lines,S.Suzumura,K.Ogawa,et al.[31]introduced a novel homotopy approach for the non-convex robust SVM learning.The basic idea is to introduce parameters,which bridge the standard SVM and fully robust SVM to express the influence of outliers.The loss functionVθ,s(r):R→[0,∞) that they defined is

    where[1-r]+:=max{0,1-r},θ∈[0,1] ands≤0.

    It can be seen thatVθ,s(r) is characterized by a pair of parameters.We refer toθandsas the tuning parameters which govern the influence of outliers.Figure 1 shows the loss functions for severalθands.

    Figure 1 The robust classification loss function with different values of θ and s

    The robust SV classification algorithm defined in[31]is

    wheref(x)=w?φ(x)(withφ(x) being the feature map implicitly defined by a kernelK),wis a vector in the feature space,and?denotes the transpose of vectors and matrices.C>0 is the penalty parameter.

    Algorithms and experiments in[31]illustrate the significance of this homotopy approach.The purpose of this paper is to analyze the convergence rate of the excess misclassification error associated with the non-convex loss functionVθ,s(r) defined as in (1.2),and to understand how the choice of the homotopy parameters affect the error.The main difference between our work and reference[31]is that the error bound of the algorithm is estimated theoretically,while reference[31]mainly studies the solution method of the model from the perspective of numerical calculation.

    The convex analysis method has become an effective error analysis method in the theoretical research field of machine learning (see,for example,[32–39]).Since the loss functionVθ,sis a non-convex function,the usual convex method cannot be used.In[39],the convergence rate of kernel regularized robust SV regression is provided by using an improved convex analysis method.Differently to the regression problem,the performance of classification algorithm is usually reflected by the convergence rate of the excess misclassification error rather than the excess generalization error considered in[39].Thus,the convergence analysis of classification algorithms is more difficult than that of regression algorithms.In order to overcome this difficulty,we improve the method in reference[39]by the technique of a Fenchel-Legendre conjugate,and an important comparison inequality is derived.

    The rest of this paper is organized as follows:in Section 2,the kernel regularized robust SV classification model considered in this paper is given.Section 3 presents the main results of the paper,where a comparison inequality and the learning rate are provided.Some proofs for the main results are given in Section 4.

    2 The Kernel Regularized Robust SV Classification

    In this section,a regularized robust SV classification model is provided.

    With the loss functionVθ,s(r) given by (1.2),we define the generalization error off:X→R as

    Also,the general kernel regularized classification associated with the reproducing kernel Hilbert space HKtakes the form

    In this paper,we consider a partition of the samplesinto two disjoint sets I and O.The samples in I and O are defined as Inliers and Outliers,respectively.We impose the restriction that the marginyif(xi) of an Inlier should be larger than or equal tos,while that of an Outlier should be smaller thans.In other words,

    where M:={1,...,m}.Let P={I,O}∈2mdenote the partition,where 2mis the power set of

    With these notions in hand,the kernel regularized homotopy algorithm is defined as (see[31])

    where|I|sand|O|sare the cardinality of I and O,respectively,and|I|s+|O|s=m.

    Take

    We know,from the analysis process of[31]and[31,Proposition 2.1],that(f) is a convex function on HK.Then,(2.2) can be rewritten as

    Note that whenθ=1 ors=-∞,algorithm (2.3) is equivalent to the standard formulation of the SV classification with the hinge lossV(r)=[1-r]+(see[11]).

    Our goal is to investigate the excess misclassification error for algorithm (2.3).In the present work,the sample error estimate for algorithm (2.3) is investigated with an improved convex approach and a capacity independent error is provided.The excess misclassification error will be shown by a K-functional associated with model (2.1).

    3 Error Analysis

    3.1 The comparison inequality

    We need to bound the excess misclassification errorto measure the performance of algorithm (2.3).The algorithm is designed by minimizing a penalized empirical error associated with the loss functionVθ,s,so relations between the excess misclassification error and the excess generalization errorbecome crucial.A systematic study of this problem is done in[37].

    In[11],the following important results are derived:

    Lemma 3.1([11]) LetVbe a classifying loss satisfyingV′′(0)>0.Then there exists a constantcV>0 such that for any measurable functionf:X→R,it holds that

    LetV(r)=[1-r]+.Then for any measurable functionf:X→R,it holds that

    In this subsection we shall give a comparison inequality forVθ,s(r).

    Denoteη=η(x)=P (Y=1|X=x) and Φη(r)=ηVθ,s(r)+(1-η)Vθ,s(-r).Then it holds that

    Forη∈[0,1],we define the optimal conditionalVθ,s-error as

    Then the optimalVθ,s-error satisfies

    We definer*:[0,1]→R as

    It is easy to see that,for any nonnegative loss functionV,H-(η)≥H(η) holds.

    Definition 3.2([37]) We say that the loss functionVis classification-calibrated if,for any,it holds thatH-(η)>H(η).

    Definition 3.3([37]) Given a loss functionV:R→[0,∞),we define the Ψ-transform function Ψ:[-1,1]→[0,∞) by,where

    andg**:[-1,1]→R is the Fenchel-Legendre biconjugate ofg:[-1,1]→R,which is characterized by epig**=epig.HereSis the closure of the convex hull of the setS,and epigis the epigraph of the functiong,that is,the set{(x,t):x∈[0,1],g(x)≤t}.

    Lemma 3.4([37]) For any nonnegative loss functionV,classification-calibration implies that Ψ is invertible on[0,1],and that the following statements are equivalent:

    (a)Vis classification-calibrated;

    (b)Ψ(τ)>0 for allτ∈(0,1].

    For the loss functionVθ,s(r) considered in this paper,we have the following result:

    Lemma 3.5LetVθ,s(r) be the loss function defined as (1.2),τ∈[-1,1].Fors≤-1,we have

    and for-1<s≤0,we have

    wheredθ,s=2θ+(1-θ)(1-s).

    Proof(1) Fors≤-1,we discussH(η) andH-(η).We consider the form of Φη(r) which is a convex combination ofVθ,s(r) andVθ,s(-r).

    Forη=0,Φη(r)=Vθ,s(-r),eachr≤-1 makes Φη(r)=0.Similarly,forη=1,Φη(r)=Vθ,s(r),eachr≥1 makes Φη(r)=0.For 0<η<,Φη(r) decreases strictly on (-∞,-1]and increases on[-1,+∞).For<η<1,Φη(r) decreases strictly on (-∞,1]and increases on[1,+∞).Forη=,Φη(r) decreases strictly on (-∞,-1]and increases on[1,+∞),and Φη(r)≡1 on[-1,1].

    Therefore,by the definition ofr*(η),we have

    This implies thatr*(η)=sgn (η-) for allη∈[0,1]and

    By the definition ofH(η),we have

    On the other hand,we need to derive the concrete expression ofH-(η).According to (3.2),we combine the monotonicity of Φηdiscussed above with the constraint conditionr(2η-1)≤0.The detailed derivation of this is given below.

    Forη=0,Φη(r)=Vθ,s(-r).Since 2η-1=-1<0,the constraint condition implies thatr≥0.Φη(r) attains the minimum value atr=0,and the minimum value is Φη(0)=Vθ,s(0)=1.Similarly,forη=1,Φη(r)=Vθ,s(r).Since 2η-1=1>0,the constraint condition implies thatr≤0.Φη(r) attains the minimum value atr=0,and the minimum value is Φη(0)=Vθ,s(0)=1.,2η-1≤0,the constraint condition implies thatr≥0.Since Φη(r) increases on[0,+∞),Φη(r) attains the minimum value atr=0,and the minimum value is Φη(0)=ηVθ,s(0)+(1-η)Vθ,s(0)=1.,the constraint condition implies thatr≤0.Φη(r) decreases strictly on (-∞,0],Φη(r) attains the minimum value atr=0,and the minimum value is Φη(0)=ηVθ,s(0)+(1-η)Vθ,s(0)=1.

    Thus,for any 0≤η≤1,we have

    (2) For-1<s≤0,we discussH(η) andH-(η).

    Forη=0,Φη(r)=Vθ,s(-r),it is easy to see that Φη(r)=0 for eachr≤-1.Similarly,forη=1,Φη(r)=Vθ,s(r),eachr≥1 makes Φη(r)=0.For,Φη(r) decreases on (-∞,-1]∪[-s,1]and increases on[-1,-s]∪[1,+∞).

    By a simple calculation,we get that

    Then it follows immediately from 0<η<that Φη(-1)<Φη(1).Therefore the minimum value must be attained at-1.

    According to the above analysis,we have that

    Hence,for any 0≤η≤1,

    Letdθ,s=2θ+(1-θ)(1-s).Then

    Now we are in a position to discussH-(η).

    Forη=0 orη=1,in a fashion similar to the case ofs≤-1,we have that Φη(r) attains the minimum value Φη(0)=Vθ,s(0)=1.For,2η-1≤0,the constraint condition implies thatr≥0.Φη(r) increases on[0,-s]∪[1,+∞),and decreases on[-s,1],so Φη(r) has two local minimum values:Φη(0)=1 and Φη(1)=(1-η)Vθ,s(-1)=dθ,s(1-η).Ifthendθ,s(1-η)≥1,otherwisedθ,s(1-η)<1.It can be concluded that

    Combining (3.10) and (3.11),for anyη∈[0,1],we have that

    By (3.9) and (3.12),we get that

    and by using the facts that we have,a,b∈R,it follows that

    This completes the proof. □

    For the loss functionVθ,s(r) defined as (1.2),F(xiàn)igures 2 and 3 show the graphs of Φη(r) with some values of homotopy parameters,while the graphs of Ψ(τ) andH(η) are shown in Figure 4.

    Figure 2 Φη(r) with θ=0.25.s=-1.5 in the left sub figure,s=-1 in the middle sub figure,and s=-0.5 in the right sub figure

    Figure 3 Φη(r) with θ=1.s=-1.5 in the left sub figure,s=-1 in the middle sub figure,and s=-0.5 in the right sub figure

    Figure 4 Ψ(τ) and H (η) for the loss function Vθ,s with θ=0.25 and s≤-1 or-1<s≤0.Ψ(τ) with θ=0.25 and s≤-1 or s=-0.5 in the left sub figure,H (η) with θ=0.25 and s≤-1 or s=-0.5 in the right sub figure

    Letting Ψ be the function defined in Definition 3.3,according to the analysis of[37],the following lemma holds:

    Lemma 3.6([37])(1) For any measurable functionf:X→R,any nonnegative loss functionV,and any probability distribution onX×{-1,+1},it holds that

    Proposition 3.7LetVθ,s(r) be the loss function defined as (1.2).Then for any measurable functionf:X→R,it holds that

    ProofFor any measurable functionf:X→R,it is immediately apparent from (3.13) that

    According to the results given in Lemma 3.5,for allτ∈(0,1],we have that Ψ(τ)>0.By Lemma 3.4,it follows that the loss functionVθ,s(r) is classification-calibrated,and that Ψ is invertible on[0,1].We now analyze the comparison inequality between R (sgn (f))-R (fc) andfrom two cases according the value of the parameters.

    Case 1:s≤-1.Since~Ψ(τ)=|τ|is a convex function,by Lemma 3.6,we have that

    Takingτ=R (sgn (f))-R (fc),it follows that,fors≤-1,we have

    It is easy to see that forθ=1 ors→-∞,the comparison relation,given by Proposition 3.7,is in accordance with the comparison relation (3.1) associated with the hinge loss.

    3.2 The learning rates

    In this subsection,we give results about the excess misclassification error of the algorithm (2.3).Let us begin with some concepts regarding convex analysis (see[40]).

    Let (H,‖·‖H) be a Hilbert space,and letF(f) be a function defined on a convex setS?H.We say thatFis a quasiconvex function onSif

    holds for allf1,f2∈Sand anyα∈[0,1].

    If,for allf1,f2∈Sand anyα∈[0,1],it holds that

    then we callFa convex function onS.

    Letting B (f,ε)={g∈H:‖g-f‖H<ε}be theε-ball of Hilbert space H,we callV:H→R the local Lipschitz nearf∈H if,for someε>0,it holds that

    whereLis called the Lipschitz constant,which depends uponfandε.

    A nonnegative continuous functionV:R→[0,∞) is called locally Lipschitz loss if,for alla≥0,there exists a constantca>0 such that

    Moreover,fora≥0,the smallestcais denoted byLa.Finally,if we havethen we say thatVis Lipschitz continuous.Ifa=+∞,then we say thatV(t) is a Lipschitz loss on R.

    For the loss functionVθ,s(r) defined as (1.2),it is easy to show thatVθ,s(r) is a quasiconvex function on R and thatVθ,s(r) is a Lipschitz continuous function with the Lipschitz constantLθ,s=1 forr>sandLθ,s=θf(wàn)orr≤s,i.e.,

    Definition 3.8Let (H,‖·‖H) be a Hilbert space,and letF(f) be a quasiconvex function from H to R∪±∞.We callD+F(f,h) the right-sided Gateaux directional derivative ofFatf∈H with respect tohif there existsD+F(f)∈H such that

    Moreover,D+F(f,h) can be written as

    It is easy to see that ifFis Lipschitz continuous on H,then,for?h∈H,we have that

    whereLis the Lipschitz constant,i.e.,

    By the definition of D+F (f),we know that ifF(f) attains its minimal value atf0∈H,then

    We now state the main result of this paper.

    Theorem 3.9Let (HK,‖·‖K) be the reproducing Hilbert space associated with the kernelKx(·)=K(x,·),Vθ,s(r) be the loss function defined as (1.2),andand R (f) be defined as above.Then,for anyδ∈(0,1),with 1-δ,it holds that

    Remark 3.10The K-functionaldenotes the approximation error.By the Lipschitzian ofVθ,s,we know that there exists a constantC>0 such thatsatisfies

    The decay rate for the right-hand side of (3.23) may be described by a modulus of smoothness ([33]),and the convergence ofis determined by the capacity of HK(see[41,42]).

    Remark 3.11If the parametersθandλare chosen such that form→+∞we have that,then,for a givenδ∈(0,1),with 1-δ,we have that

    and the convergence rate can be controlled by choosing the homotopy parameters properly.

    Remark 3.12Whens→-∞,we have that|I|s→mand|O|s→0.Therefore,the learning rates given have the homotopy continuous property.

    4 Proofs

    Now we are in a position to prove Theorem 3.9.We need to prove the following lemmas first:

    Lemma 4.1LetF(f):HK→R∪±∞be a convex function.Then,for anyf,g∈HK,it holds thatF(f)-F(g)≥D+F(g,f-g)=〈D+F(g),f-g〉HK.

    ProofSinceF(f) is a convex function,for anyf,g∈HKandα∈[0,1],it holds thatF(αf+(1-α)g)≤αF(f)+(1-α)F(g).This implies thatF(g+α(f-g))≤F(g)+α(F(f)-F(g)),so we have that

    Lettingα→0+in the left of inequality (4.1),our conclusion follows. □

    Lemma 4.2Let (HK,‖·‖K) be the reproducing Hilbert space associated with the kernelKx(·)=K(x,·),and letVθ,s(r) be the loss function defined as in (1.2).Then,for anyh∈HK,it holds that

    ProofBy the definition of,we have that

    The reproducing property of HKyields that

    Taking the above formula into (4.4),we have that

    This proves (4.2).Using the same argument as above,we obtain equation (4.3).The proof is complete. □

    By the same token,we have that

    Since Ω(f) and Ωz(f) are also strict convex functions,we have that

    Lemma 4.3Letbe defined as in (4.7) and (4.8),respectively.Then,

    ProofCombining the definitions ofwith Lemma 4.1,we have that

    Combining (4.11) with (4.2) and (4.3),we have that

    We have thus proven (4.9). □

    Lemma 4.4([43]) Letξbe a random variable taking values in a real separable Hilbert space H on a probability space (Ω,F(xiàn),P).Assume that there is a positive constantLsuch that

    Then,for alln≥1 and 0<δ<1,it holds that

    The next Lemma gives a bound for the sample error.

    Lemma 4.5Letbe defined as in (4.7) and (4.8),respectively.Then,for anyδ∈(0,1),it holds,with 1-δ,that

    ProofAccording to Lemma 4.3,it follows that

    Then (4.12) implies that

    where we have used the fact that|D+[1-·]+|≤1.

    According to Lemma 4.4,for anyδ∈(0,1),with,it holds that

    By the same token,for anyδ∈(0,1),with,we have that

    (4.13) and (4.15),together with (4.16),yield our conclusion. □

    A bound for the generalization error is provided in the following Lemma:

    Lemma 4.6LetVθ,s(r) be the loss function defined as in (1.2),and letandbe defined as above.Then,for anyδ∈(0,1),with 1-δ,it holds that

    ProofApplying the Lipschitz continuity of the loss functionVθ,swith the Lipschitz constantsatisfyingLθ,s≤1,and according to Lemma 4.5,we obtain that

    It is now obvious that the lemma holds. □

    For anyδ∈(0,1),by Lemma 4.6 and Proposition 3.7,with 1-δ,it holds that

    Thus,we have completed the proof of Theorem 3.9. □

    videosex国产| 国产成人影院久久av| 亚洲五月婷婷丁香| 最新在线观看一区二区三区| 午夜免费鲁丝| 水蜜桃什么品种好| 国产又色又爽无遮挡免费看| 91av网站免费观看| 精品国产一区二区久久| 亚洲在线自拍视频| 一区二区三区激情视频| 成年动漫av网址| 岛国毛片在线播放| 国产色视频综合| 美女扒开内裤让男人捅视频| 欧美性长视频在线观看| 好男人电影高清在线观看| 一进一出好大好爽视频| 老鸭窝网址在线观看| 操出白浆在线播放| 精品国产乱子伦一区二区三区| 激情在线观看视频在线高清 | 夜夜爽天天搞| 在线观看免费午夜福利视频| 国内久久婷婷六月综合欲色啪| 怎么达到女性高潮| 久久香蕉精品热| 日韩精品免费视频一区二区三区| 欧美日韩乱码在线| 免费在线观看亚洲国产| 一区二区三区国产精品乱码| 好男人电影高清在线观看| 午夜福利,免费看| 亚洲aⅴ乱码一区二区在线播放 | av电影中文网址| 国产欧美日韩一区二区三区在线| 国产欧美亚洲国产| 欧美日韩视频精品一区| av国产精品久久久久影院| 欧美成人午夜精品| 亚洲,欧美精品.| 99国产精品一区二区蜜桃av | 精品国产美女av久久久久小说| 免费女性裸体啪啪无遮挡网站| 美女 人体艺术 gogo| 日日夜夜操网爽| 婷婷成人精品国产| 国产欧美亚洲国产| 精品无人区乱码1区二区| 脱女人内裤的视频| 三上悠亚av全集在线观看| 99精品欧美一区二区三区四区| 1024视频免费在线观看| 最近最新中文字幕大全电影3 | 一个人免费在线观看的高清视频| 中文字幕人妻熟女乱码| 亚洲中文日韩欧美视频| 91字幕亚洲| 精品人妻熟女毛片av久久网站| 亚洲五月婷婷丁香| 亚洲精品久久午夜乱码| 成人国产一区最新在线观看| 成人免费观看视频高清| 99热国产这里只有精品6| 久久人妻av系列| 亚洲专区字幕在线| 亚洲片人在线观看| 亚洲全国av大片| 超碰成人久久| 亚洲国产看品久久| 色精品久久人妻99蜜桃| 久久久久精品国产欧美久久久| 亚洲国产精品合色在线| 欧美黑人精品巨大| 亚洲精华国产精华精| 国产精品久久电影中文字幕 | 久久香蕉国产精品| 免费在线观看黄色视频的| 国产成人av激情在线播放| 国产亚洲精品久久久久久毛片 | xxxhd国产人妻xxx| 黑人巨大精品欧美一区二区蜜桃| 91成年电影在线观看| 99香蕉大伊视频| 亚洲欧美日韩高清在线视频| 丝瓜视频免费看黄片| 亚洲精品美女久久av网站| 如日韩欧美国产精品一区二区三区| 制服人妻中文乱码| 少妇粗大呻吟视频| 99国产综合亚洲精品| 精品一区二区三区四区五区乱码| 无限看片的www在线观看| 天天躁狠狠躁夜夜躁狠狠躁| 亚洲精品自拍成人| 人妻丰满熟妇av一区二区三区 | 久久久久久久久久久久大奶| 国产精品永久免费网站| 午夜两性在线视频| 国产精品.久久久| 国产成人av教育| 男男h啪啪无遮挡| 老熟女久久久| 男女下面插进去视频免费观看| 久久久久视频综合| 中文欧美无线码| svipshipincom国产片| 又紧又爽又黄一区二区| 亚洲熟妇中文字幕五十中出 | 精品国产乱子伦一区二区三区| 国产精品久久久久久精品古装| 欧美老熟妇乱子伦牲交| 久久亚洲真实| 亚洲 欧美一区二区三区| 成人av一区二区三区在线看| 日韩视频一区二区在线观看| 老司机午夜十八禁免费视频| 国产精品99久久99久久久不卡| 久久精品熟女亚洲av麻豆精品| 91av网站免费观看| 一级片'在线观看视频| 欧美日韩精品网址| 日韩三级视频一区二区三区| 老司机午夜十八禁免费视频| 国产乱人伦免费视频| 午夜福利一区二区在线看| 亚洲五月天丁香| 一级片'在线观看视频| 水蜜桃什么品种好| 黄片小视频在线播放| 成年人黄色毛片网站| 日日爽夜夜爽网站| tube8黄色片| av网站免费在线观看视频| 日韩欧美在线二视频 | 精品熟女少妇八av免费久了| 亚洲午夜理论影院| 久久亚洲精品不卡| 久99久视频精品免费| 亚洲午夜理论影院| 热99久久久久精品小说推荐| 99香蕉大伊视频| 9191精品国产免费久久| 80岁老熟妇乱子伦牲交| 国产欧美日韩综合在线一区二区| 欧美乱色亚洲激情| 18禁国产床啪视频网站| 免费在线观看亚洲国产| 久热爱精品视频在线9| 99热只有精品国产| 两人在一起打扑克的视频| 热99国产精品久久久久久7| 国产主播在线观看一区二区| 男人操女人黄网站| 99国产精品99久久久久| 亚洲片人在线观看| bbb黄色大片| 香蕉久久夜色| 18禁国产床啪视频网站| 国产区一区二久久| 亚洲国产精品合色在线| a级片在线免费高清观看视频| 亚洲熟女精品中文字幕| 两性午夜刺激爽爽歪歪视频在线观看 | 亚洲国产欧美网| 久久久国产欧美日韩av| 亚洲人成电影观看| 欧美日韩国产mv在线观看视频| 精品国产一区二区三区四区第35| 一区在线观看完整版| 男男h啪啪无遮挡| 天天影视国产精品| 别揉我奶头~嗯~啊~动态视频| 王馨瑶露胸无遮挡在线观看| 亚洲欧美色中文字幕在线| 麻豆av在线久日| 亚洲 欧美一区二区三区| 国产野战对白在线观看| 男人的好看免费观看在线视频 | 极品少妇高潮喷水抽搐| 亚洲五月天丁香| 日日爽夜夜爽网站| 国产又色又爽无遮挡免费看| 亚洲人成伊人成综合网2020| 亚洲三区欧美一区| 亚洲精品中文字幕一二三四区| 成人影院久久| 午夜福利视频在线观看免费| 成人黄色视频免费在线看| 高清欧美精品videossex| 他把我摸到了高潮在线观看| 在线av久久热| 欧美日韩亚洲高清精品| 视频在线观看一区二区三区| 好看av亚洲va欧美ⅴa在| 在线永久观看黄色视频| 可以免费在线观看a视频的电影网站| 变态另类成人亚洲欧美熟女 | 欧美激情 高清一区二区三区| 精品无人区乱码1区二区| 国产精品99久久99久久久不卡| 大香蕉久久网| 成人影院久久| 啦啦啦免费观看视频1| 亚洲熟女精品中文字幕| 一二三四在线观看免费中文在| 成人免费观看视频高清| 丰满人妻熟妇乱又伦精品不卡| 日韩免费av在线播放| 69av精品久久久久久| 精品久久久久久久久久免费视频 | 啦啦啦免费观看视频1| 后天国语完整版免费观看| 欧美 亚洲 国产 日韩一| 美女高潮喷水抽搐中文字幕| √禁漫天堂资源中文www| 午夜视频精品福利| 18禁裸乳无遮挡免费网站照片 | 日本黄色日本黄色录像| 高清在线国产一区| 亚洲国产毛片av蜜桃av| 国产国语露脸激情在线看| 欧美不卡视频在线免费观看 | 99热国产这里只有精品6| 成年版毛片免费区| 久久人妻福利社区极品人妻图片| 亚洲 欧美一区二区三区| 午夜福利乱码中文字幕| 夜夜夜夜夜久久久久| 精品国产亚洲在线| 久久亚洲真实| 欧美性长视频在线观看| 日韩视频一区二区在线观看| 亚洲精品av麻豆狂野| 日韩欧美一区视频在线观看| 久久久久久亚洲精品国产蜜桃av| 一进一出抽搐gif免费好疼 | 久久久久久久久免费视频了| 午夜福利欧美成人| 日本撒尿小便嘘嘘汇集6| 亚洲成人免费电影在线观看| 亚洲欧美一区二区三区黑人| 色播在线永久视频| 啦啦啦 在线观看视频| 激情视频va一区二区三区| 国产亚洲欧美在线一区二区| 黑人欧美特级aaaaaa片| 午夜福利一区二区在线看| 99久久精品国产亚洲精品| 国产精品免费视频内射| 在线天堂中文资源库| svipshipincom国产片| 丝袜美腿诱惑在线| 老鸭窝网址在线观看| 亚洲人成77777在线视频| 免费在线观看完整版高清| 人妻久久中文字幕网| 午夜精品在线福利| a级毛片在线看网站| 亚洲精品国产精品久久久不卡| 丰满的人妻完整版| 美女高潮到喷水免费观看| 一二三四社区在线视频社区8| 俄罗斯特黄特色一大片| 国产高清videossex| 欧美日韩精品网址| 亚洲精品国产精品久久久不卡| 免费观看a级毛片全部| 99国产精品99久久久久| 国产一区在线观看成人免费| 热99re8久久精品国产| 中文字幕最新亚洲高清| 啦啦啦免费观看视频1| 中亚洲国语对白在线视频| 亚洲成人手机| 日韩欧美一区视频在线观看| 美女 人体艺术 gogo| 欧美激情久久久久久爽电影 | 亚洲av第一区精品v没综合| 国产色视频综合| 日韩欧美在线二视频 | 午夜福利,免费看| 大香蕉久久网| 国产高清国产精品国产三级| av线在线观看网站| 亚洲熟女精品中文字幕| 中文字幕精品免费在线观看视频| 精品人妻熟女毛片av久久网站| 国产亚洲欧美98| 国产无遮挡羞羞视频在线观看| 成人亚洲精品一区在线观看| 久久国产精品人妻蜜桃| 涩涩av久久男人的天堂| 在线观看免费午夜福利视频| 亚洲一区中文字幕在线| 国产男女超爽视频在线观看| 日韩欧美在线二视频 | 日本wwww免费看| 久久国产精品影院| 亚洲性夜色夜夜综合| 多毛熟女@视频| 免费久久久久久久精品成人欧美视频| 久久久精品区二区三区| 午夜福利在线观看吧| 午夜福利影视在线免费观看| 亚洲久久久国产精品| 黄色片一级片一级黄色片| 国产视频一区二区在线看| 国产亚洲精品久久久久5区| 久久久久久亚洲精品国产蜜桃av| 久久人人97超碰香蕉20202| 99riav亚洲国产免费| av天堂久久9| 婷婷精品国产亚洲av在线 | 国产精品美女特级片免费视频播放器 | 老熟女久久久| 免费看十八禁软件| 色综合婷婷激情| 亚洲成av片中文字幕在线观看| 男女之事视频高清在线观看| 成人精品一区二区免费| av在线播放免费不卡| 亚洲国产欧美一区二区综合| 香蕉久久夜色| 国产精品久久久av美女十八| 免费不卡黄色视频| 下体分泌物呈黄色| 亚洲一码二码三码区别大吗| 国内久久婷婷六月综合欲色啪| av线在线观看网站| 日本一区二区免费在线视频| 下体分泌物呈黄色| 久久99一区二区三区| 久久影院123| 亚洲av成人不卡在线观看播放网| 在线观看www视频免费| av片东京热男人的天堂| 老司机影院毛片| 国产精品国产av在线观看| 久久久久久免费高清国产稀缺| 黄色丝袜av网址大全| xxx96com| av国产精品久久久久影院| 日日摸夜夜添夜夜添小说| 国产乱人伦免费视频| 岛国在线观看网站| 脱女人内裤的视频| 啦啦啦视频在线资源免费观看| 国产精品乱码一区二三区的特点 | 大陆偷拍与自拍| 成人亚洲精品一区在线观看| 午夜精品在线福利| 女警被强在线播放| 国产成人av教育| 极品人妻少妇av视频| 18禁黄网站禁片午夜丰满| 国产成人啪精品午夜网站| 国产一区二区三区视频了| 色老头精品视频在线观看| 女人爽到高潮嗷嗷叫在线视频| 久久久久国产一级毛片高清牌| 18禁裸乳无遮挡动漫免费视频| 国产亚洲欧美精品永久| 夜夜夜夜夜久久久久| 欧美日韩视频精品一区| 男女之事视频高清在线观看| 黄色成人免费大全| 精品国产乱子伦一区二区三区| 在线观看免费视频日本深夜| 涩涩av久久男人的天堂| 国产91精品成人一区二区三区| 美女高潮喷水抽搐中文字幕| 欧美乱妇无乱码| 国产国语露脸激情在线看| 欧美精品av麻豆av| 久久久久久久久免费视频了| 韩国av一区二区三区四区| 丝袜美足系列| 国产区一区二久久| 丰满人妻熟妇乱又伦精品不卡| 黑人猛操日本美女一级片| 如日韩欧美国产精品一区二区三区| www.熟女人妻精品国产| 波多野结衣av一区二区av| 老司机影院毛片| 亚洲人成电影观看| 国产成+人综合+亚洲专区| 午夜福利一区二区在线看| 亚洲人成电影观看| 搡老乐熟女国产| 91在线观看av| 久久香蕉精品热| 五月开心婷婷网| 日韩大码丰满熟妇| 一级片'在线观看视频| 侵犯人妻中文字幕一二三四区| 一本一本久久a久久精品综合妖精| 十分钟在线观看高清视频www| 一级毛片高清免费大全| 12—13女人毛片做爰片一| 亚洲七黄色美女视频| 精品国产一区二区久久| 无限看片的www在线观看| 久久久精品免费免费高清| a级毛片在线看网站| 女人精品久久久久毛片| 国产精品国产高清国产av | 一级a爱片免费观看的视频| 日韩人妻精品一区2区三区| 亚洲精品国产色婷婷电影| 亚洲欧美激情在线| 亚洲性夜色夜夜综合| 黄色片一级片一级黄色片| 丝袜美足系列| 久久狼人影院| 久久久国产成人免费| 好看av亚洲va欧美ⅴa在| 午夜福利视频在线观看免费| 久久国产精品大桥未久av| 日本撒尿小便嘘嘘汇集6| 天堂俺去俺来也www色官网| 日韩欧美免费精品| 亚洲精品久久午夜乱码| 一区二区日韩欧美中文字幕| 一进一出抽搐动态| 一二三四在线观看免费中文在| 叶爱在线成人免费视频播放| 亚洲 欧美一区二区三区| 久久国产乱子伦精品免费另类| 波多野结衣一区麻豆| 757午夜福利合集在线观看| 女性生殖器流出的白浆| 精品一区二区三区av网在线观看| 村上凉子中文字幕在线| 成年人黄色毛片网站| 欧美人与性动交α欧美软件| 捣出白浆h1v1| 老司机福利观看| 天天躁夜夜躁狠狠躁躁| 日韩大码丰满熟妇| 亚洲一区中文字幕在线| 黄色视频,在线免费观看| 在线播放国产精品三级| 久久精品熟女亚洲av麻豆精品| 一级毛片高清免费大全| 国产精品综合久久久久久久免费 | 精品久久久久久,| 高清黄色对白视频在线免费看| 亚洲中文日韩欧美视频| 日本五十路高清| 午夜福利免费观看在线| 欧美一级毛片孕妇| 老熟妇乱子伦视频在线观看| a级片在线免费高清观看视频| √禁漫天堂资源中文www| 国产视频一区二区在线看| 国产三级黄色录像| 脱女人内裤的视频| 久久天堂一区二区三区四区| 高清在线国产一区| 国产视频一区二区在线看| 亚洲五月天丁香| 少妇猛男粗大的猛烈进出视频| 十八禁网站免费在线| 久久亚洲真实| 欧美久久黑人一区二区| 欧美日韩黄片免| 俄罗斯特黄特色一大片| www.精华液| 国产一区在线观看成人免费| 麻豆av在线久日| 国产精品国产高清国产av | 亚洲精品美女久久av网站| 国产精品免费大片| 中文亚洲av片在线观看爽 | 老司机午夜十八禁免费视频| 久久精品国产综合久久久| 国产淫语在线视频| 精品一区二区三区四区五区乱码| av免费在线观看网站| 最近最新中文字幕大全免费视频| 国产成人啪精品午夜网站| 在线视频色国产色| 黄片大片在线免费观看| 高清在线国产一区| 亚洲片人在线观看| 波多野结衣一区麻豆| 日本黄色视频三级网站网址 | 男人操女人黄网站| 日韩中文字幕欧美一区二区| 欧美在线一区亚洲| 免费在线观看影片大全网站| 高清毛片免费观看视频网站 | 成人国语在线视频| 99国产极品粉嫩在线观看| 大片电影免费在线观看免费| 人妻 亚洲 视频| 久久久国产一区二区| 亚洲成人国产一区在线观看| 人人澡人人妻人| 一个人免费在线观看的高清视频| 欧美日韩亚洲国产一区二区在线观看 | 天天躁狠狠躁夜夜躁狠狠躁| 亚洲精品自拍成人| 每晚都被弄得嗷嗷叫到高潮| 久久这里只有精品19| 色在线成人网| 免费av中文字幕在线| 日韩 欧美 亚洲 中文字幕| 两个人看的免费小视频| 18禁国产床啪视频网站| 亚洲中文日韩欧美视频| 久久香蕉国产精品| 女性被躁到高潮视频| 如日韩欧美国产精品一区二区三区| 身体一侧抽搐| 三级毛片av免费| 国产高清视频在线播放一区| 久久久国产成人精品二区 | 亚洲精品国产区一区二| 国产熟女午夜一区二区三区| 美女午夜性视频免费| 大码成人一级视频| 亚洲av美国av| 久久ye,这里只有精品| 黄色视频不卡| 曰老女人黄片| 中文字幕人妻熟女乱码| 亚洲片人在线观看| 一区二区三区国产精品乱码| 免费人成视频x8x8入口观看| 午夜视频精品福利| 国产精品久久电影中文字幕 | 又大又爽又粗| 伊人久久大香线蕉亚洲五| 欧美亚洲 丝袜 人妻 在线| 女人被躁到高潮嗷嗷叫费观| 亚洲成国产人片在线观看| 午夜福利乱码中文字幕| 国产片内射在线| 久久久久精品人妻al黑| 国产精品电影一区二区三区 | 免费观看a级毛片全部| 欧美在线黄色| 日韩一卡2卡3卡4卡2021年| 女人高潮潮喷娇喘18禁视频| 亚洲欧洲精品一区二区精品久久久| 亚洲成人国产一区在线观看| 欧美乱色亚洲激情| 高清毛片免费观看视频网站 | 国产单亲对白刺激| 免费在线观看完整版高清| 亚洲专区字幕在线| 欧美亚洲 丝袜 人妻 在线| 侵犯人妻中文字幕一二三四区| aaaaa片日本免费| 精品国产美女av久久久久小说| 成人三级做爰电影| 人妻一区二区av| 日韩免费高清中文字幕av| 黄片大片在线免费观看| 国产一区有黄有色的免费视频| av网站在线播放免费| 成年人免费黄色播放视频| 亚洲一区中文字幕在线| 一级,二级,三级黄色视频| 女人高潮潮喷娇喘18禁视频| 国产在线精品亚洲第一网站| 国产欧美日韩精品亚洲av| 曰老女人黄片| 视频区图区小说| 巨乳人妻的诱惑在线观看| 看免费av毛片| 男女下面插进去视频免费观看| 欧美精品啪啪一区二区三区| 免费观看a级毛片全部| 91字幕亚洲| e午夜精品久久久久久久| 久久狼人影院| 男女高潮啪啪啪动态图| 精品国产乱码久久久久久男人| 久久婷婷成人综合色麻豆| 狂野欧美激情性xxxx| 另类亚洲欧美激情| 日韩免费高清中文字幕av| 成在线人永久免费视频| 少妇 在线观看| 18禁裸乳无遮挡动漫免费视频| 国产淫语在线视频| 色播在线永久视频| 1024视频免费在线观看| 又紧又爽又黄一区二区| 三级毛片av免费| 午夜影院日韩av| 国产亚洲欧美精品永久| 天堂√8在线中文| 久久人人爽av亚洲精品天堂| 亚洲精品国产精品久久久不卡| 麻豆成人av在线观看| 麻豆av在线久日| av天堂在线播放| 精品久久蜜臀av无| 每晚都被弄得嗷嗷叫到高潮| 国产精品美女特级片免费视频播放器 | 亚洲精品国产区一区二| 人人妻人人澡人人爽人人夜夜| 免费在线观看视频国产中文字幕亚洲| 在线看a的网站| 亚洲国产精品合色在线| 日本a在线网址| 丰满饥渴人妻一区二区三| 午夜老司机福利片| 超碰97精品在线观看| 国产国语露脸激情在线看| 国产深夜福利视频在线观看|