
WEIGHTED LASSO ESTIMATES FOR SPARSE LOGISTIC REGRESSION: NON-ASYMPTOTIC PROPERTIES WITH MEASUREMENT ERRORS


    Huamei HUANG(黃華妹)

    Department of Statistics and Finance,University of Science and Technology of China,Hefei 230026,China E-mail:huanghm@mail.ustc.edu.cn

    Yujing GAO(高鈺婧)

    Guanghua School of Management,Peking University,Beijing 100871,China E-mail:jane.g1996@pku.edu.cn

    Huiming ZHANG(張慧銘)

    School of Mathematical Sciences,Peking University,Beijing 100871,China E-mail:zhanghuiming@pku.edu.cn

Bo LI(李波)

    School of Mathematics and Statistics,Central China Normal University,Wuhan 430079,China E-mail:haoyoulibo@163.com

Abstract For high-dimensional models with a focus on classification performance, ℓ1-penalized logistic regression is becoming important and popular. However, the Lasso estimates can be problematic when the penalties on the different coefficients are all identical and unrelated to the data. We propose two types of weighted Lasso estimates, with weights depending upon the covariates and determined by the McDiarmid inequality. Given the sample size n and the dimension of covariates p, the finite-sample behavior of our proposed method with a diverging number of predictors is illustrated by non-asymptotic oracle inequalities, such as bounds on the ℓ1-estimation error and the squared prediction error of the unknown parameters. We compare the performance of our method with that of previously proposed weighted estimates on simulated data, and then apply it to real data analysis.

Key words logistic regression; weighted Lasso; oracle inequalities; high-dimensional statistics; measurement error

    1 Introduction

In recent years, with the advancement of modern science and technology, high-throughput and non-parametric complex data have been frequently collected in gene-biology, chemometrics, neuroscience and other scientific fields. With massive data in regression problems, we encounter situations in which both the number of covariates p and the sample size n are increasing, and p is a function of n, i.e., p =: p(n). One common assumption in the literature is that p is allowed to grow with n but p ≤ n; this setting has been extensively studied (see [9, 19, 27, 28] and the references therein). When we consider variable selection in a linear or generalized linear model, massive data sets bring researchers unprecedented computational challenges, such as the "large p, small n" paradigm; see [16]. Therefore, another potential characterization of large-scale data is that we only have a few significant predictors among the p covariates and p ≫ n. The main challenge is that directly applying low-dimensional (classical and traditional) statistical inference and computing methods to such increasing-dimension data is prohibitive. Fortunately, regularized (or penalized) methods can perform parameter estimation and variable selection simultaneously, enhancing the prediction accuracy and interpretability of the resulting regression model. One famous method is the Lasso (least absolute shrinkage and selection operator), which was introduced in [21] as a modification of the least squares method in the case of linear models.
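For reference, the Lasso estimator of [21] for the linear model can be written in the following standard form:

β̂^lasso = argmin_{β∈R^p} { (1/n) Σ_{i=1}^n (Y_i − X_i^⊤β)² + λ Σ_{j=1}^p |β_j| },

where λ > 0 is a tuning parameter; the ℓ1 penalty shrinks the coefficients and sets some of them exactly to zero, which is what performs the variable selection.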

Consider the logistic regression model

P(Y_i = 1 | X_i) = exp(X_i^⊤β)/{1 + exp(X_i^⊤β)},  i = 1, ···, n,  (1.1)

where Y_i ∈ {0, 1} is the response variable of individual i, and β is a p×1 vector of unknown regression coefficients belonging to a compact subset of R^p. The unknown parameter β is often estimated by the maximum likelihood estimator (MLE), obtained by maximizing the log-likelihood function with respect to β, namely

β̂_MLE = argmax_{β} Σ_{i=1}^n [ Y_i X_i^⊤β − log{1 + exp(X_i^⊤β)} ].

    For more discussions of binary response regression,we refer readers to [20]for a comprehensive introduction,and to [8]for a refreshing view of modern statistical inference in today’s computer age.

In the high-dimensional case, we often encounter situations where the number of predictors p is larger than the sample size n. When p ≫ n, the least squares method leads to overparameterization; the Lasso and many other regularized estimates are required to obtain a stable and satisfactory fit. Although the Lasso method performs well, various generalized penalties have been proposed, since researchers want to compensate for certain shortcomings of the Lasso and to make the penalized method more useful for particular data sets; see [12]. Since the Lasso assigns the same penalty to each β_j, an important extension is to use different levels of penalties to shrink each covariate's coefficient, i.e., to replace the penalty λΣ_{j=1}^p |β_j| by λΣ_{j=1}^p w_j|β_j| with covariate-specific weights w_j. The weighted Lasso estimation method is an improvement of the Lasso in which the penalty levels are based on different data-related weights, but one challenge is that this weighted method depends on both the covariates and the responses, so it is difficult to find the optimal weights.

• This paper proposes a concentration-based weighted Lasso for inference in high-dimensional sparse logistic regressions; the proposed weights are better than those of [15] in terms of applications.

• This paper derives non-asymptotic oracle inequalities with measurement error for weighted Lasso estimates in sparse logistic regressions, and the obtained oracle inequalities are sharper than those of [5] in the case of Lasso estimates of logistic regressions.

• Corollary 2.7 provides the theoretical guarantee for our simulations: the smallest signal should be larger than a threshold value.

This paper is organized as follows: in Section 2, the problem of estimating the coefficients of a logistic regression model by the weighted Lasso is discussed, and we give non-asymptotic oracle inequalities for it under the Stabil Condition. In Section 3, we introduce novel data-dependent weights for Lasso-penalized logistic regression and compare this method with other proposed weights; our weights are constructed so that the KKT conditions hold with high probability. In Section 4, we use simulations to show how our proposed methods rival existing weighted estimators, and apply our methods to a real genetic data analysis.

    2 Weighted Lasso Estimates and Oracle Inequalities

2.1 ℓ1-penalized sparse logistic regression

We consider the following estimator for the ℓ1-penalized (weighted Lasso) logistic regression:

β̂_w = argmin_{β∈R^p} { ℓ(β) + λ_w Σ_{j=1}^p w_j|β_j| },  (2.1)

where ℓ(β) := (1/n) Σ_{i=1}^n [ −Y_i X_i^⊤β + log{1 + exp(X_i^⊤β)} ] is the average logistic loss and w_1, ···, w_p are data-dependent weights.

The gradient of ℓ(β) is

∇ℓ(β) = (1/n) Σ_{i=1}^n X_i [ exp(X_i^⊤β)/{1 + exp(X_i^⊤β)} − Y_i ].

Theoretically, the data-dependent tuning parameter λ_w is required to ensure that the KKT conditions, evaluated at the true parameter, hold with high probability. In Section 3, we will apply McDiarmid's inequality to the weighted sum of random variables to obtain λ_w. In addition, McDiarmid's inequality is an important ingredient that helps us establish the oracle inequality for the weighted Lasso estimate in the next section.
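To make the role of λ_w concrete, recall the standard KKT conditions for the convex problem (2.1) (a generic display; the precise constants used later in the paper may differ):

∇_j ℓ(β̂_w) = −λ_w w_j sign(β̂_{w,j})   if β̂_{w,j} ≠ 0,
|∇_j ℓ(β̂_w)| ≤ λ_w w_j                 if β̂_{w,j} = 0,   j = 1, ···, p.

Requiring the second inequality to hold at β = β* simultaneously for all j with high probability is exactly the event that the concentration-based choice of λ_w is designed to control.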

    2.2 Oracle inequalities

Oracle inequalities are a powerful mathematical tool that provides deep insight into the non-asymptotic fluctuation of an estimator in comparison with the ideal unknown estimate, which is called the oracle. A comprehensive theory of high-dimensional regression has been developed for the Lasso and its generalizations. Chapter 6 of [3] and Chapter 11 of [12] outline the theory of the Lasso, including works on oracle inequalities. In this section, non-asymptotic oracle inequalities for the weighted Lasso estimate of logistic regression are established under the assumptions of the Stabil Condition.

Before we get to our arguments, we first introduce the definition of the Stabil Condition (a type of restricted eigenvalue condition originally proposed in [2]), which provides tuning-parameter- and sparsity-dependent bounds for the ℓ1-estimation error and the squared prediction error. Most importantly, we consider the following two assumptions:

Although there are applications in which unbounded covariates are of interest, for convenience we do not discuss the case of unbounded covariates. We limit our analysis to bounded predictors, since real data are often bounded once collected. If the data are not bounded, we can take a log-transformation of the original data, making the transformed data almost bounded; we can also apply the transformation f(X) = exp(X)/{1 + exp(X)}, so that the transformed predictors are undoubtedly bounded.
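A minimal R illustration of the two bounding transformations just described (the exponential sample is only a hypothetical stand-in for an unbounded covariate):

    x <- rexp(1000, rate = 0.1)  # stand-in for an unbounded positive covariate
    x_log <- log(x)              # log-transform: range compressed, "almost bounded"
    x_sig <- plogis(x)           # plogis(x) = exp(x)/(1 + exp(x)), numerically stable
    range(x_sig)                 # always inside (0, 1)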

Let β* be the true coefficient vector, which is defined by minimizing the unknown risk function

β* = argmin_{β} E l(Y, X; β),  (2.3)

where l(Y, X; β) = −Y X^⊤β + log{1 + exp(X^⊤β)} is the logistic loss function.

It can be shown that the minimizer in (2.3) coincides with the true parameter in (1.1). The first-order condition for the convex optimization (2.3) is

E[ X { exp(X^⊤β*)/(1 + exp(X^⊤β*)) − Y } ] = 0.

To establish the desired oracle inequalities, on the weighted cone set WC(k, ε) we assume that the p×p matrix Σ = E(XX^⊤) satisfies at least one of the following conditions: the Stabil Condition (see [5]) or the Weighted Stabil Condition (our proposed condition).

Definition 2.1

(Stabil Condition) For the measurement error ε > 0 and cone parameter k > 0, the covariance matrix Σ = E(XX^⊤) satisfies the Stabil condition S(c, ε, k) if there exists a constant 0 < c ≤ 1 such that

b^⊤Σb ≥ c Σ_{j∈H*} b_j² − ε

for any b ∈ WC(k, ε), where H* := {j : β*_j ≠ 0} denotes the support of the true coefficient vector.

Definition 2.2

(Weighted Stabil Condition) For the measurement error ε > 0 and cone parameter k > 0, the covariance matrix Σ = E(XX^⊤) satisfies the Weighted Stabil condition WS(c, ε, k) if there exists a constant 0 < c_w ≤ 1 such that

b^⊤Σb ≥ c_w Σ_{j∈H*} w_j² b_j² − ε

for any b ∈ WC(k, ε).

The constants c and c_w in the above two conditions are essentially lower bounds on the restricted eigenvalues of the covariance matrix. For convenience, we use the same ε in the two conditions and in the weighted cone set. Under the above-mentioned assumptions, we have the following oracle inequalities:

    Remark 2.5

If (2.1) is replaced by a robust penalized logistic regression (see [17, 24]) in which the log-likelihood terms are weighted by R_i with nR_i ≤ C (here C is a constant), then we still have oracle results similar to Theorem 2.3.

The proofs of Theorem 2.3 and of the ensuing corollary are both given in Section 5.

    Corollary 2.7

Let δ ∈ (0, 1) be a fixed number. Suppose that the assumptions of Theorem 2.3 are satisfied, and that the weakest signal and the strongest signal meet the following condition:

This corollary provides the theoretical guarantee, when we run simulations, that the smallest signal in β* must exceed a threshold value; this requirement is also called the Beta-min Condition (see [3]).

    3 Data-dependent Weights

    Lemma 3.1

Suppose that X_1, ···, X_n are independent random variables, all taking values in the set A, and assume that f : A^n → R is a function satisfying the bounded difference condition

sup_{x_1,···,x_n, x_i′∈A} | f(x_1, ···, x_i, ···, x_n) − f(x_1, ···, x_i′, ···, x_n) | ≤ c_i,  1 ≤ i ≤ n.

Then, for all t > 0,

P{ f(X_1, ···, X_n) − E f(X_1, ···, X_n) ≥ t } ≤ exp{ −2t² / Σ_{i=1}^n c_i² }.

More specific ways of defining weights can be found in the references. In high-dimensional settings, unlike the previously proposed weighted Lasso estimates, our weights are based on conditions which hold with high probability. The various weights that we compare in the next section are the Type I, Type II, Type III and Type IV weights referred to in Section 4.

    4 Simulation and Real Data Results

    4.1 Simulation Results

In this section, we compare the performance of the ordinary Lasso estimate and the weighted Lasso estimates of logistic regression on simulated data sets. We use the R package glmnet to fit the ordinary Lasso estimate of logistic regression. For the weighted Lasso estimates, we first apply the function cv.glmnet() with 10-fold cross-validation to obtain the optimal tuning parameter λ. The actual weights we use are the standardized versions of the weights introduced in Section 3.

Then we transform our weighted problem into an unweighted problem and apply the function lbfgs() in the R package lbfgs to solve the resulting unweighted Lasso optimization problem. The original weighted optimization problem is

min_{β∈R^p} { ℓ(β) + λ Σ_{j=1}^p w_j|β_j| };

substituting γ_j = w_jβ_j (equivalently, dividing the j-th column of the design matrix by w_j) turns it into an unweighted Lasso problem in γ, whose solution maps back via β̂_j = γ̂_j / w_j.
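A minimal R sketch of this reduction, under assumed weights w (the data are hypothetical, and glmnet's coordinate descent stands in for lbfgs() as the unweighted solver):

    library(glmnet)

    set.seed(1)
    n <- 100; p <- 50
    X <- matrix(rnorm(n * p), n, p)
    beta_true <- c(2, -2, 1.5, rep(0, p - 3))
    y <- rbinom(n, 1, plogis(drop(X %*% beta_true)))
    w <- runif(p, 0.5, 2)                        # hypothetical data-dependent weights

    # w_j*|beta_j| = |gamma_j| with gamma_j = w_j*beta_j, and X beta = Z gamma
    # for Z_j = X_j / w_j, so rescaling the columns removes the weights.
    Z <- sweep(X, 2, w, "/")
    cv <- cv.glmnet(Z, y, family = "binomial")   # 10-fold CV for the tuning parameter
    gamma_hat <- as.numeric(coef(cv, s = "lambda.min"))[-1]
    beta_hat  <- gamma_hat / w                   # map back to the original scale

Note that glmnet also accepts a penalty.factor argument that implements per-coefficient weights directly; the rescaling above makes the reduction to an unweighted problem explicit.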

    A Data generation

For each simulation, we set n = 100 and n = 200. We set the dimension as p = 50, 100, 150 and 200, and adopt the following two simulation patterns (an R sketch of the data-generating mechanism is given after the list):

1. The predictor variables X are randomly drawn from the multivariate normal distribution N(0, Σ), where Σ has elements Σ_{kl} = ρ^{|k−l|} (k, l = 1, 2, ···, p). The correlation among the predictor variables is controlled by ρ, with ρ = 0.3, 0.5 and 0.8. We assign the true coefficient parameter of the logistic regression as

2. Similar to case 1, we generate the predictor variables X from the multivariate normal distribution N(0, Σ), where Σ has elements Σ_{kl} = ρ^{|k−l|} (k, l = 1, 2, ···, p) with ρ = 0.3, 0.5 and 0.8, and we set the true coefficient parameter as
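A minimal sketch of this data-generating mechanism in R, assuming the AR(1)-type covariance above (the sparse beta_true shown is a hypothetical stand-in, since the two true coefficient patterns are not reproduced here):

    library(MASS)

    gen_data <- function(n, p, rho, beta_true) {
      Sigma <- rho^abs(outer(1:p, 1:p, "-"))          # Sigma[k, l] = rho^|k - l|
      X <- mvrnorm(n, mu = rep(0, p), Sigma = Sigma)  # multivariate normal design
      y <- rbinom(n, 1, plogis(drop(X %*% beta_true)))  # logistic responses
      list(X = X, y = y)
    }

    dat <- gen_data(n = 100, p = 50, rho = 0.3,
                    beta_true = c(3, 1.5, 2, rep(0, 47)))  # hypothetical signal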

    B Simulation results

    These simulation results are listed in Tables 1 to 3.

From these simulation results, our proposed estimate with the Type II weight is better than the other methods in most cases, in terms of both ℓ1-error and prediction error. The adaptive Lasso estimate performs worst among these five ℓ1-penalized estimates.

    4.2 Real Data Results

    In this section,we apply our proposed estimates to analyze biological data.We consider the following two complete data sets:

(a) The first data set is the gene expression data from a leukemia microarray study (see [10]). The data comprise n = 72 patients, of whom 25 have acute myeloid leukemia (AML) and 47 have acute lymphoblastic leukemia (ALL). The binary response in this data set is therefore categorized as AML (label 0) and ALL (label 1). The predictors are the expression levels of p = 7129 genes. The data can be found in the R package golubEsets.

(b) The second data set is the above gene expression data after preprocessing and filtering (see [7]). This process reduces the number of genes to p = 3571. The data set can be found in the R package cancerclass.

Table 1 Means of ℓ1-error and prediction error for simulations 1 and 2 (ρ = 0.3)

Table 2 Means of ℓ1-error and prediction error for simulations 1 and 2 (ρ = 0.5)

Table 3 Means of ℓ1-error and prediction error for simulations 1 and 2 (ρ = 0.8)

Our target is to select useful genes for distinguishing AML and ALL. Note that no ground-truth information about the model parameters is available, so we cannot directly compare selection accuracy and prediction accuracy. Therefore, we report the model size and the prediction error of the estimated model under the Leave-One-Out Cross-Validation (LOOCV) framework: the model size shows the coverage of the current estimated model, and the prediction error shows the prediction accuracy. We apply the ordinary Lasso method and the four different weighted Lasso methods described in the previous section to analyze these data sets. Since many coefficients estimated by the weighted Lasso methods are small but not exactly zero, we choose small cut-off values of the form 10^{−k} as the limits for these two data sets separately, and set the coefficients whose magnitudes fall below the limits to zero. The results are summarized in Table 4.
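A minimal sketch of this LOOCV evaluation in R (fit_model() is a hypothetical wrapper standing in for any of the five estimators, assumed to return a coefficient vector and an intercept; cutoff is the coefficient limit discussed above):

    loocv_error <- function(X, y, fit_model, cutoff = 1e-5) {
      errs <- vapply(seq_len(nrow(X)), function(i) {
        fit  <- fit_model(X[-i, , drop = FALSE], y[-i])       # fit without case i
        beta <- ifelse(abs(fit$beta) < cutoff, 0, fit$beta)   # zero out tiny coefs
        p_i  <- plogis(fit$intercept + drop(X[i, ] %*% beta)) # held-out probability
        as.numeric((p_i > 0.5) != y[i])                       # misclassified?
      }, numeric(1))
      mean(errs)                                              # LOOCV error rate
    }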

    Table 4 Mean and standard deviation of model size and misclassification rate under Leave-One-Out Cross-Validation framework

In the first data set, under one cut-off, the weighted Lasso estimate with the Type II weight has the best prediction performance, while under the other cut-off, the weighted Lasso estimate with the Type III weight performs best. In the second data set, under one cut-off, the weighted Lasso estimates with the Type I, Type II and Type IV weights have similar prediction performance, and those with the Type II and Type IV weights use fewer predictor variables than the other methods; under the other cut-off, the weighted Lasso estimate with the Type III weight has the best prediction performance and also uses the fewest variables. Therefore, the weighted Lasso methods with the Type II, Type III and Type IV weights estimate more accurately and with fewer predictors than the other methods.

After comparing these five methods, we fit a model to the complete observations and report the genes selected by the ordinary Lasso regression and by the weighted Lasso methods with small LOOCV errors. The results are listed in Tables 5 and 6. We observe that the weighted Lasso estimate with the Type II weight selects more variables than the weighted Lasso estimates with the Type III and Type IV weights, and has a smaller refitting prediction error. Summarizing the above results, our proposed weighted Lasso methods pick more meaningful variables for explanation and prediction.

Table 5 Genes associated with ALL and AML selected by four methods (p = 7129)

Table 6 Genes associated with ALL and AML selected by four methods (p = 3571)

    5 Proofs

    5.1 The Proof of Theorem 2.3

The non-asymptotic analysis of the Lasso and its generalizations often relies on several steps.

The first step is to impose a restricted eigenvalue condition, or an analogous condition on the design matrix, which guarantees local orthogonality over a restricted set of coefficient vectors.

The second step is to determine the size of the tuning parameter from the KKT optimality conditions (or other KKT-like conditions, such as those of the Dantzig selector).

The language of our proof is heavily influenced by the theory of empirical processes. For simplicity, we denote the theoretical risk by Pl(β) := E{l(Y, X; β)} and the empirical risk by

P_n l(β) := (1/n) Σ_{i=1}^n l(Y_i, X_i; β).

5.2 Step 1: Choosing the order of the tuning parameter

Define the following stochastic Lipschitz constant in terms of the supremum of a centered empirical process:

To obtain (5.4), we need the next two lemmas. The proofs of the ensuing symmetrization and contraction theorems can be found in Section 14.7 of [3].

Let X_1, ···, X_n be independent random variables taking values in some space 𝒳, and let F be a class of real-valued functions on 𝒳.

    Lemma 5.1

(symmetrization theorem) Let ε_1, ···, ε_n be a Rademacher sequence, uniformly distributed on {−1, 1} and independent of X_1, ···, X_n, and let f ∈ F. Then we have

E[ sup_{f∈F} | Σ_{i=1}^n {f(X_i) − E f(X_i)} | ] ≤ 2 E[ E_ε{ sup_{f∈F} | Σ_{i=1}^n ε_i f(X_i) | } ],

where E[·] refers to the expectation w.r.t. X_1, ···, X_n and E_ε{·} to the expectation w.r.t. ε_1, ···, ε_n.

    Lemma 5.2

(contraction theorem) Let x_1, ···, x_n be non-random elements of 𝒳 and let ε_1, ···, ε_n be a Rademacher sequence. Consider c-Lipschitz functions g_i, i.e.,

|g_i(s) − g_i(t)| ≤ c|s − t|  for all s, t ∈ R.

Thus the function g_i here is 2-Lipschitz (in the sense of Lemma 5.2).

Applying the symmetrization theorem and then the contraction theorem yields

5.3 Step 2: Check that β̂_w − β* ∈ WC(3, ε_n)

5.4 Step 3: Derive error bounds from the Stabil Condition

    5.4.1 Case of the Weighted Stabil Condition

    5.4.2 Case of the Stabil Condition

    5.5 Proof of Corollary 2.7

    6 Summary
