
    Model Selection Consistency of Lasso for Empirical Data


    Yuehan YANG Hu YANG

    Abstract  Large-scale empirical data, in which both the sample size and the dimension are high, often exhibit various characteristics. For example, the noise term may follow an unknown distribution, or the model may be so sparse that the number of critical variables is fixed while the dimensionality grows with n. The authors consider the model selection problem of the lasso for this kind of data. The authors investigate both theoretical guarantees and simulations, and show that the lasso is robust for various kinds of data.

    Keywords  Lasso, Model selection, Empirical data

    1 Introduction

    Tibshirani [14] proposed the lasso (least absolute shrinkage and selection operator) for simultaneous model selection and estimation of the regression parameters. It is very popular for high-dimensional estimation due to its statistical accuracy for prediction and model selection, coupled with its computational feasibility. Moreover, under suitable sufficient conditions the lasso solution is unique, and the number of non-zero elements of the lasso solution is always smaller than n (see [15–16]). In recent years, high-dimensional data of this kind have become more and more common in many fields. Similar properties can also be seen in other penalized least squares methods, since they share a similar solution framework.

    Consider the problem of model selection in the sparse linear regression model

    $$y_n = X_n\beta_n + \epsilon_n,$$

    where the detailed setting of the data can be found in the next section. The lasso estimator $\widehat{\beta}_n$ is then defined as the minimizer of the residual sum of squares plus the $\ell_1$ penalty $\lambda_n\|\beta\|_1$, where $\lambda_n$ is the tuning parameter which controls the amount of regularization. Set $\widehat{S}_n\equiv\{j\in\{1,2,\cdots,p_n\}:\widehat{\beta}_{j,n}\neq 0\}$, the set of predictors selected by the lasso estimator $\widehat{\beta}_n$. Consequently, $\widehat{\beta}_n$ and $\widehat{S}_n$ both depend on $\lambda_n$, and the model selection criterion asks for correct recovery of the set $S_n\equiv\{j\in\{1,2,\cdots,p_n\}:\beta_{j,n}\neq 0\}$.
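    To make the notation concrete, the following minimal sketch (not the authors' code) fits the lasso on synthetic data with scikit-learn, whose penalty parameter alpha corresponds to $\lambda_n$ only up to scaling, and reads off the selected set $\widehat{S}_n$ and the true set $S_n$:

        import numpy as np
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n, p, q = 100, 200, 3                      # sample size, dimension, number of nonzeros
        X = rng.standard_normal((n, p))
        beta = np.zeros(p)
        beta[:q] = [3.0, -2.0, 1.5]                # only the first q entries are nonvanishing
        y = X @ beta + rng.standard_normal(n)

        fit = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)   # alpha plays the role of the tuning parameter
        S_hat = set(np.flatnonzero(fit.coef_))                   # selected set
        S_true = set(np.flatnonzero(beta))                       # true set S_n
        print("selected:", sorted(S_hat), "  correct recovery:", S_hat == S_true)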

    On the model selection front of the lasso estimator, Zhao and Yu [22] established the irrepresentable condition on the covariance matrices of the predictors for the lasso's model selection consistency. This condition was also discovered in [11, 20, 23]. In the language of [22], the irrepresentable condition requires $|C_{21}C_{11}^{-1}\mathrm{sign}(\beta^{(1)})|\le 1-\eta$ elementwise, where $\mathrm{sign}(\cdot)$ maps a positive entry to 1, a negative entry to $-1$ and zero to zero. The definitions of $C_{21}$ and $C_{11}$ can be seen in Section 2. When the signs of the true coefficients are unknown, the $\ell_1$ norms of the corresponding regression coefficients are required to be smaller than 1. Beyond the lasso, regularization methods have also been widely used for high-dimensional model selection, e.g., [2, 4, 7–8, 10, 12, 17–19, 21, 24–25]. There has been a considerable amount of recent work dedicated to the lasso and to regularization methods more broadly.
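    As an illustration (not from the paper), the quantity in the irrepresentable condition can be computed directly from a design matrix and the true support; a positive gap below means the condition holds with some $\eta>0$:

        import numpy as np

        def irrepresentable_gap(X, support, sign_beta1):
            """1 - max_j |(C21 C11^{-1} sign(beta^(1)))_j|; positive means the condition holds."""
            n, p = X.shape
            C = X.T @ X / n
            S = np.asarray(support)
            Sc = np.setdiff1d(np.arange(p), S)
            C11 = C[np.ix_(S, S)]
            C21 = C[np.ix_(Sc, S)]
            v = C21 @ np.linalg.solve(C11, np.asarray(sign_beta1, dtype=float))
            return 1.0 - np.max(np.abs(v))

        rng = np.random.default_rng(1)
        X = rng.standard_normal((200, 10))                       # a well-behaved Gaussian design
        print(irrepresentable_gap(X, support=[0, 1, 2], sign_beta1=[1, -1, 1]))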

    Yet the model selection problem for empirical data still needs study. For stock data, for instance, the Gaussian assumption on the noise term is often violated, and the critical variables are extremely few compared with the collected dimensionality. In this paper, we consider this kind of data: the sample size and the dimension are both high, the information about the critical variables is incomplete (the signs of the true $\beta_n$ and the distribution of the noise terms are unknown), and the model is extremely sparse in the sense that the number of nonzero parameters is fixed. This kind of data is common in empirical analysis, hence we call it empirical data.

    We consider the model selection consistency of the lasso and investigate regularity conditions that fit this data setting. Under these conditions, the probability of the lasso selecting the true model is controlled via the probability of the event

    $$\{\|W_n\|_\infty \le G_n\},$$

    where $W_n$ is defined in Section 2 and $G_n$ is a function of $\lambda_n$, $n$ and $q$. The above bound is simple, and the probability of this event is easy to calculate. Based on the line of argument of the proof, we analyze the model selection consistency of the lasso under conditions that are easier to satisfy than the irrepresentable condition for empirical data. In the simulation part, we discuss the effectiveness of the lasso. Four examples are given, in which the irrepresentable condition fails in all settings, but the lasso can still select variables correctly in two of them, where our conditions hold.

    We discuss different assumptions on the noise terms $\epsilon_{i,n}$ for model selection consistency. Gaussian errors, or subgaussian errors satisfying $P(|\epsilon_{i,n}|>t)\le Ce^{-ct^2}$ for all $t>0$, would be standard, but these are strong assumptions on the tails. A basic assumption in this paper is that the errors are independently and identically distributed with zero mean and finite variance.

    The rest of the paper is organized as follows. In Section 2, we present the data setting, notations, and conditions. We introduce a lower bound on the probability that the lasso selects the true model when suitable conditions hold. Then, to demonstrate the advantages of this bound, we study different settings and different assumptions on the noise terms in Section 3. We show that the lasso has model selection consistency for empirical data under mild conditions. Section 4 presents the results of the simulation studies. Finally, in Section 5, we present the proof of the main theorem.

    2 Data Setting, Notations and Conditions

    Consider the problem of model selection for the specific data

    $$y_n = X_n\beta_n + \epsilon_n,$$

    where $\epsilon_n=(\epsilon_{1,n},\epsilon_{2,n},\cdots,\epsilon_{n,n})'$ is a vector of i.i.d. random variables with mean 0 and variance $\sigma^2$, $X_n$ is an $n\times p_n$ design matrix of predictor variables, and $\beta_n\in\mathbb{R}^{p_n}$ is a vector of true regression coefficients, commonly imposed to be sparse with only a small proportion of nonzeros. Without loss of generality, write $\beta_n=(\beta_{1,n},\cdots,\beta_{q,n},\beta_{q+1,n},\cdots,\beta_{p_n,n})'$, where $\beta_{j,n}\neq 0$ for $j=1,\cdots,q$ and $\beta_{j,n}=0$ for $j=q+1,\cdots,p_n$. Then write $\beta_n^{(1)}=(\beta_{1,n},\cdots,\beta_{q,n})'$ and $\beta_n^{(2)}=(\beta_{q+1,n},\cdots,\beta_{p_n,n})'$, that is, only the first $q$ entries are nonvanishing. Besides, for any vector $\alpha=(\alpha_1,\cdots,\alpha_m)'$, we denote

    For deriving the theoretical results, we write $X_n(1)$ and $X_n(2)$ for the first $q$ and the last $p_n-q$ columns of $X_n$, respectively. Let $C_n=\frac{1}{n}X_n'X_n$ and partition $C_n$ as

    where $C_{11,n}$ is a $q\times q$ matrix assumed to be invertible. Set $W_n=\frac{1}{n}X_n'\epsilon_n$. Similarly, $W_n(1)$ and $W_n(2)$ indicate the first $q$ and the last $p_n-q$ elements of $W_n$. Let $\Lambda_{\min}(C_{11,n})>0$ denote the smallest eigenvalue of $C_{11,n}$, and suppose that $q$ does not grow with $n$. We introduce the following conditions:

    (C1) For $j=q+1,\cdots,p_n$, let $e_j$ be the unit vector in the direction of the $j$-th coordinate. There exists a positive constant $0<\eta<1$ such that

    (C2) There exists $\delta\in(0,1)$ such that for all $n>\delta^{-1}$ and $x\in\mathbb{R}^q$, $y\in\mathbb{R}^{p_n-q}$,

    (C1) and (C2) play a central role in our theoretical analysis. Both conditions are easy to satisfy. (C1), for instance, requires an upper bound on an $\ell_2$-norm, which is much weaker than requiring an upper bound on an $\ell_1$-norm, i.e., the irrepresentable condition and its variants [6, 9, 11, 22, 25]. Another advantage of (C1) is that we do not need the signs of the true coefficients. (C2) requires that the multiple correlation between the relevant variables and the irrelevant variables is strictly less than one. It is weaker than assuming orthogonality of the two sets of variables. This condition has also appeared regularly in the literature, for example in [13].
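    The displays of (C1) and (C2) are not reproduced in this text. Assuming, as the discussion indicates, that (C1) bounds an $\ell_2$-norm of the rows $e_j'C_{21,n}C_{11,n}^{-1}$ while the irrepresentable condition bounds an $\ell_1$-type quantity, the following small check illustrates why the $\ell_2$ bound is the easier one to satisfy (the $\ell_2$-norm of a row never exceeds its $\ell_1$-norm):

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.standard_normal((200, 10))
        n, p, q = X.shape[0], X.shape[1], 3
        S, Sc = np.arange(q), np.arange(q, p)
        C = X.T @ X / n
        rows = C[np.ix_(Sc, S)] @ np.linalg.inv(C[np.ix_(S, S)])     # rows e_j' C21 C11^{-1}, j in S^c
        l2 = np.linalg.norm(rows, axis=1)                            # l2-norm of each row
        l1 = np.abs(rows).sum(axis=1)                                # l1-norm of each row
        print(np.all(l2 <= l1 + 1e-12))                              # always True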

    We then have the following theorem, which describes the relationship between the probability of the lasso choosing the true model and the probability of $\{\|W_n\|_\infty\le G_n\}$. Namely, it gives a lower bound on the probability of the lasso picking the true model.

    Theorem 2.1  Assume that (C1)–(C2) hold. Set $\rho\in(0,1)$. We have

    Remark 2.1  Theorem 2.1 is a key technical tool for the theoretical results. It puts a lower bound on the probability of the lasso selecting the true model, and this bound is intuitive to calculate. Besides, considering $G_n$, it is easy to see that an implied lower bound on the non-zero coefficients exists. This bound can be controlled by the regularization parameter $\lambda_n$. It is also a regular assumption in the literature that the non-zero coefficients cannot be too small.

    Remark 2.2  From the proof of Theorem 2.1, it is also direct to obtain the sign consistency of the lasso (see the latter part of the proof). Besides, Theorem 2.1 can be applied in a wide range of dimensional settings. We discuss the behavior of the lasso on model selection consistency under different settings in the next section.

    3 Model Selection Consistency

    Now we consider the decay rate of the probability of $\{\|W_n\|_\infty>G_n\}$. Different dimensions and different assumptions on the noise terms are discussed in this section.

    First, we consider the general dimensional setting, i.e., $p_n=O(n^{c_1})$ where $0<c_1$.

    Before discussing the detailed rate of the probability in the lower bound, we give the following regularity condition:

    (C3) $\max_{1\le j\le p_n}\frac{1}{n}\sum_{i=1}^{n}x_{ij,n}^2\le 1$.

    This is a typical assumption in the sparse linear regression literature. It can be achieved by normalizing the covariates (see [9, 22]).

    3.1 General dimensional setting $p_n=O(n^{c_1})$

    In this part, we consider the general dimensional setting where $p_n$ is allowed to grow with $n$, and show the model selection consistency of the lasso as follows.

    Theorem 3.1  Assume that the $\epsilon_i$ are i.i.d. random variables with mean 0 and variance $\sigma^2$. Suppose that (C1)–(C3) hold. For $p_n=O(n^{c_1})$ where $0<c_1$,

    Proof  Following the result of Theorem 2.1, we have

    where

    Applying the setting of Theorem 3.1, we have, for $n\to\infty$,

    Then there exists a positive constant $K_n$ such that

    If (C3) holds, by Markov's inequality we easily get

    The proof is completed.
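    For concreteness, the union-bound-plus-Markov step can be written out as follows; this is a sketch under the working definitions used above ($W_n=\frac{1}{n}X_n'\epsilon_n$ and the column normalization stated for (C3)), so the constants may differ from those in the paper's own displays:

        $$P\bigl(\|W_n\|_\infty>G_n\bigr)\le\sum_{j=1}^{p_n}P\bigl(|W_{j,n}|>G_n\bigr)\le\sum_{j=1}^{p_n}\frac{\mathbb{E}\,W_{j,n}^2}{G_n^2}=\sum_{j=1}^{p_n}\frac{\sigma^2\sum_{i=1}^{n}x_{ij,n}^2}{n^2G_n^2}\le\frac{p_n\sigma^2}{nG_n^2},$$

    which tends to zero whenever $p_n=O(n^{c_1})$ and $nG_n^2/p_n\to\infty$.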

    The proof of Theorem 3.1 shows that in this setting the lasso is robust and selects the true model under mild regularity restraints. Similarly, if we consider the classical setting where $p$, $q$ and $\beta$ are fixed as $n\to\infty$, then we have the following result.

    Corollary 3.1  For fixed $p$, $q$ and $\beta$, under the regularity assumptions (C1)–(C3), assume that the $\epsilon_i$ are i.i.d. random variables with mean 0 and variance $\sigma^2$. If $\lambda_n$ satisfies the required rate conditions as $n\to\infty$, then

    Similar to the argument of Theorem 3.1, Corollary 3.1 can be proved directly by Markov's inequality, hence the proof is omitted here.

    Besides, if we assume that the noise terms are Gaussian, then under the same setting as Theorem 3.1 we have

    where the last inequality holds because of the Gaussian distribution's tail probability bound $P(|\epsilon_i|>t)\le 2e^{-t^2/(2\sigma^2)}$ for all $t>0$. It can be relaxed to the subgaussian assumption, i.e., $P(|\epsilon_i|>t)\le Ce^{-ct^2}$ for all $t>0$.

    3.2 Ultra-high dimensional setting $p_n=O(e^{n^{c_2}})$

    In this part, we consider the ultra-high dimensional setting $p_n=O(e^{n^{c_2}})$ where $0<c_2$.

    We shall make use of the following condition:

    (C4) Assume that $\epsilon_{1,n},\cdots,\epsilon_{n,n}$ are independent random variables with mean 0 and that the following inequality is satisfied for $j=1,\cdots,p_n$,

    (C4) is the precondition for the non-Gaussian case (the model selection consistency of the lasso under the Gaussian assumption does not need this condition). It is applied here for Bernstein's inequality. According to (C4), we have

    where $L_0>L$. This bound leads to Bernstein's inequality as given in [1]. We then have the following result.

    Theorem 3.2  Assume that the $\epsilon_i$ are i.i.d. random variables with mean 0 and variance $\sigma^2$. Suppose that (C1)–(C2) and (C4) hold. We then have

    Proof  By Bernstein's inequality, for arbitrary $t>0$ we have

    Applying the result of Lemma 14.13 from [3], when (C3) holds we have

    Following the setting of $p_n$,

    Let $J\in(0,\infty)$ be such that the following inequalities hold for all $t>0$:

    Then we have

    which completes the proof.
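    For reference, the proof above invokes a Bernstein-type inequality. One classical form, stated here under a moment condition of the type $\mathbb{E}|\epsilon_{i,n}|^m\le\frac{m!}{2}L^{m-2}\sigma^2$ for all integers $m\ge 2$ (an assumed restatement, since the display of (C4) is not reproduced above, and the constants in [1] may differ), reads:

        $$P\Bigl(\Bigl|\sum_{i=1}^{n}a_i\epsilon_{i,n}\Bigr|>t\Bigr)\le 2\exp\Bigl(-\frac{t^2}{2\bigl(\sigma^2\sum_{i=1}^{n}a_i^2+L_0\,t\,\max_i|a_i|\bigr)}\Bigr),\qquad t>0,$$

    which can then be applied columnwise (with weights $a_i$ built from the entries of $X_n$) and combined with a union bound over $j=1,\cdots,p_n$.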

    Similarly to the general high-dimensional setting, we have the following result under the Gaussian assumption. Since the proof of Corollary 3.2 is direct, we just state the result here without proof.

    Corollary 3.2  Assume that the $\epsilon_i$ are i.i.d. Gaussian random variables. Let $p_n=O(e^{n^{c_2}})$ where $0<c_2$.

    4 Simulation Studies

    In this section, we evaluate the finite sample properties of the lasso estimator with synthetic data. We start with the behavior of the lasso under different settings, then consider the relationship between $n$, $p$ and $q$, and finally consider different noise terms.

    4.1 Model selection

    This first part illustrates two simple cases (low dimensional vs. high dimensional) to show the efficiency of the lasso. The following cases describe two different settings that lead to the lasso's model selection consistency and inconsistency, according to whether (C1) and (C2) hold or fail. As a contrast, we also examine the irrepresentable condition in this part; it fails in all the settings.

    Example 4.1  In the low dimensional case, assume that there are $n=100$ observations and the values of the parameters are chosen as $p=3$, $q=2$, that is,

    We generate the response y by

    where $X_1$, $X_2$ and $\epsilon$ are i.i.d. random variables from the Gaussian distribution with mean 0 and variance 1. The third predictor $X_3$ is generated to be correlated with the other predictors in the following two cases:

    and

    where $e$ is an i.i.d. random variable with the same setting as $\epsilon$.

    We find that the lasso fails in the first case, where (C1) and (C2) fail, and selects the right model in the second case, where (C1) and (C2) hold. The different solutions are illustrated in Figure 1. Since the irrepresentable condition fails in both cases, this shows that the lasso suits more kinds of data than the irrepresentable condition would suggest.
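    The two constructions of $X_3$ are given by displays that are not reproduced in this text. As a stand-in, the sketch below runs a low dimensional experiment of the same shape, using the classical correlated design $X_3=\tfrac{2}{3}X_1+\tfrac{2}{3}X_2+\tfrac{1}{3}e$ (under which the population irrepresentable condition fails when both true coefficients are positive) and illustrative coefficient values, and prints the variables selected along the lasso path computed by the LARS algorithm [5]:

        import numpy as np
        from sklearn.linear_model import lars_path

        rng = np.random.default_rng(3)
        n = 100
        X1, X2, e, eps = (rng.standard_normal(n) for _ in range(4))
        X3 = 2/3 * X1 + 2/3 * X2 + 1/3 * e       # stand-in correlated construction (variance 1)
        X = np.column_stack([X1, X2, X3])
        y = 2.0 * X1 + 3.0 * X2 + eps            # q = 2 relevant variables; coefficients are illustrative

        alphas, _, coefs = lars_path(X, y, method="lasso")       # lasso path via LARS
        for a, c in zip(alphas, coefs.T):
            print(f"lambda = {a:7.3f}   selected = {np.flatnonzero(c).tolist()}")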

    Figure 1  An example illustrating the lasso's (in)consistency in model selection. The top two graphs are constructed in a low dimensional setting, and the bottom two in a high dimensional setting. The left graphs correspond to settings where (C1) and (C2) fail, and the right graphs to settings where (C1) and (C2) hold.

    Example 4.2  We construct a high dimensional case with $p=400$, $q=4$ and $n=100$. The true parameters are set as

    and the response y is generated by

    where

    is a $100\times 400$ matrix, and the elements of $X$ are i.i.d. random variables from the Gaussian distribution with mean 0 and variance 1, except for $X_{400}$. The last predictor $X_{400}$ is generated in the following two settings, respectively:

    and

    where $e$ follows the same setting as in Example 4.1. Hence $X_{400}$ is also constructed from a Gaussian distribution with mean 0 and variance 1. We find that our conditions also fail in the first high dimensional case but hold in the second. Besides, the irrepresentable condition fails in both situations.

    We obtain different lasso solutions for the above four cases in Figure 1 (the lasso path is computed by the LARS algorithm of [5]). As shown in Figure 1, both graphs on the left satisfy neither the irrepresentable condition nor (C1)–(C2), and the lasso cannot select variables correctly (both graphs select other irrelevant variables, e.g., $X_4$ in the first graph and $X_{400}$ in the second). In contrast, both graphs on the right select the right model in the settings where (C1)–(C2) hold and the irrepresentable condition fails.
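    Similarly, a high dimensional experiment of the same shape as Example 4.2 can be run as below; since neither the construction of $X_{400}$ nor the true coefficient values are reproduced in this text, both are illustrative placeholders, and the check simply asks whether the true model appears at some point on the lasso path:

        import numpy as np
        from sklearn.linear_model import lars_path

        rng = np.random.default_rng(4)
        n, p, q = 100, 400, 4
        X = rng.standard_normal((n, p))
        X[:, -1] = 2/3 * X[:, 0] + 2/3 * X[:, 1] + 1/3 * rng.standard_normal(n)  # placeholder correlated X400 (variance 1)
        beta = np.zeros(p)
        beta[:q] = [3.0, 2.0, -2.0, 1.5]                                          # placeholder nonzero values
        y = X @ beta + rng.standard_normal(n)

        _, _, coefs = lars_path(X, y, method="lasso")
        on_path = any(set(np.flatnonzero(c)) == set(range(q)) for c in coefs.T)
        print("true model appears on the lasso path:", on_path)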

    Besides, the above examples are all constructed from synthetic data, in which the unknown parameter is actually known. In empirical analysis, the true model cannot be known in advance, so we need to recognize situations in which the lasso can be used without such preconditions.

    4.2 Relationship between p, q and n

    In this part, we give a direct view of the relationship between $n$, $p$ and $q$, or in other words, of how the sparsity and the sample size affect the model selection of the lasso.

    The nonzero elements $\beta^{(1)}$ are set as

    If the number of nonzero elements is less than 14, we select the values from this sequence in order; the remaining elements are set to zero. The numbers of observations and parameters are chosen as in Table 1. The predictors are generated as Gaussian random variables. In this table, the lasso selects the right variables in the first six items of the list and selects wrong variables in the remaining items.

    Table 1 Example settings

    High dimensional settings are considered. The results indicate that $q$ always needs to be small enough for the lasso to be effective. When the number of critical factors increases, the sample size also needs to increase to ensure that the lasso chooses the right model. In contrast, the number of zero elements has less influence on the lasso's (in)consistency in model selection.
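    The settings of Table 1 are not reproduced in this text; the sketch below shows the kind of grid experiment it summarizes, recording for each $(n,p,q)$ whether the true model appears somewhere on the lasso path (the particular values and the recovery criterion are illustrative):

        import numpy as np
        from sklearn.linear_model import lars_path

        def recovers_true_model(n, p, q, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.standard_normal((n, p))
            beta = np.zeros(p)
            beta[:q] = 3.0                                        # illustrative nonzero value
            y = X @ beta + rng.standard_normal(n)
            _, _, coefs = lars_path(X, y, method="lasso")
            return any(set(np.flatnonzero(c)) == set(range(q)) for c in coefs.T)

        for n, p, q in [(100, 400, 2), (100, 400, 8), (400, 400, 8)]:
            print(f"n={n:4d}  p={p:4d}  q={q:2d}  recovered={recovers_true_model(n, p, q)}")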

    4.3 Different noise terms

    In this part, we consider a high dimensional example with different noise terms. Data from the high-dimensional linear regression model are generated as

    where the data have $n=100$ observations and the dimension is chosen as $p=1000$. The true regression coefficient vector is fixed as

    Figure 2  An example illustrating the lasso's behavior in the high dimensional setting under different assumptions on the noise terms. It shows that with standard data and strong sparsity, the lasso always chooses the right model regardless of the distribution of the noise terms.

    For the distribution of the noise $\epsilon$, we consider four distributions: the Gaussian distribution with mean 0 and variance 1; the exponential distribution with rate 1; the uniform distribution with minimum 0 and maximum 1; and Student's t distribution with 100 degrees of freedom.

    The results are depicted in Figure 2. They show that with standard data and strong sparsity, the lasso always chooses the right model regardless of the distribution of the noise terms.
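    A sketch of this experiment is given below (the true coefficient values and the recovery criterion are illustrative, and the non-Gaussian noises are centered here so that all four have mean zero):

        import numpy as np
        from sklearn.linear_model import lars_path

        rng = np.random.default_rng(5)
        n, p, q = 100, 1000, 3
        X = rng.standard_normal((n, p))
        beta = np.zeros(p)
        beta[:q] = [4.0, 3.0, -2.0]                               # illustrative nonzero values

        noises = {
            "gaussian(0,1)":  rng.standard_normal(n),
            "exponential(1)": rng.exponential(1.0, n) - 1.0,      # centered
            "uniform(0,1)":   rng.uniform(0.0, 1.0, n) - 0.5,     # centered
            "student t(100)": rng.standard_t(100, n),
        }
        for name, eps in noises.items():
            _, _, coefs = lars_path(X, X @ beta + eps, method="lasso")
            ok = any(set(np.flatnonzero(c)) == set(range(q)) for c in coefs.T)
            print(f"{name:15s} true model on path: {ok}")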

    5 Proof of Theorem 2.1

    Recall the lasso estimator

    Define

    Then $V_n>0$ depends on

    Since $V_n(0)=0$, the minimum of $V_n(\cdot)$ is at most zero. Assume that (C1) holds, and set $e_j$ to be the unit vector in the direction of the $j$-th coordinate. Then the following inequality holds uniformly:

    After discussing the model selection consistency of the first block of coefficients, we now consider that of the second. According to the definitions above and the solution of the lasso, if we want correct selection, the following must hold:

    Combining the above two restraints, the existence of such a solution is implied by

    we have

    where $\Lambda_{\min}=\Lambda_{\min}(C_{11,n})$. Besides, we also have

    By Bonferroni's inequality, we know that in order to prove

    it suffices to show that for every $j\in S_n$,

    Hence, we have

    which completes the proof.
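    Several displays of the proof above are not reproduced in this text. For orientation, the sign-consistency argument rests on the standard KKT characterization of a lasso solution, stated here in the normalization $\|y_n-X_n\beta\|_2^2+\lambda_n\|\beta\|_1$ (the paper's own displays may carry different constants). Writing $X_{j,n}$ for the $j$-th column of $X_n$,

        $$2X_{j,n}'\bigl(y_n-X_n\widehat{\beta}_n\bigr)=\lambda_n\,\mathrm{sign}\bigl(\widehat{\beta}_{j,n}\bigr)\quad\text{if }\widehat{\beta}_{j,n}\neq 0,\qquad\bigl|2X_{j,n}'\bigl(y_n-X_n\widehat{\beta}_n\bigr)\bigr|\le\lambda_n\quad\text{if }\widehat{\beta}_{j,n}=0.$$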
