
    Sparse Reduced-Rank Regression with Outlier Detection


(School of Data Science, University of Science and Technology of China, Hefei 230026, China)

Abstract: Based on the multivariate mean-shift regression model, we propose a new sparse reduced-rank regression approach that achieves low-rank sparse estimation and outlier detection simultaneously. A sparse mean-shift matrix is introduced into the model to indicate outliers. The rank constraint and a group-lasso type penalty on the coefficient matrix encourage a low-rank, row-sparse structure and help achieve dimension reduction and variable selection. An algorithm is developed for solving the resulting problem. In our simulation and real-data application, the new method shows competitive performance compared to other methods.

    Keywords: Reduced-rank regression; Sparsity; Outlier detection; Group-lasso type penalty

    §1.Introduction

The multivariate linear regression model is widely applied in scenarios where several response variables are affected by common predictors. Many commonly used models can be transformed into a multivariate linear regression model and solved within that framework.

Consider the multivariate regression model with n observations of p predictors and q response variables,

Y = XB + E,    (1.1)

where Y is the n×q response matrix and X is the n×p predictor matrix, written Y = (y_1,...,y_n)^T and X = (x_1,...,x_n)^T with x_i ∈ R^p and y_i ∈ R^q for i = 1,2,...,n; E = (e_1,...,e_n)^T ∈ R^{n×q} is a random error matrix with e_i ∈ R^q for i = 1,2,...,n; and B ∈ R^{p×q} is the unknown coefficient matrix we are interested in.

Ordinary least squares ignores the relationships among the responses and is equivalent to performing a separate regression on each response [6], which may not work well in real-data applications. Thus reduced-rank regression [1,8,10] has been developed. Given a constraint on the rank of the coefficient matrix B, reduced-rank regression solves the problem shown below [8,10].
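Based on the standard formulation in [8,10], the criterion is presumably the rank-constrained least-squares problem

\[
\hat{B} \;=\; \arg\min_{B \in \mathbb{R}^{p \times q}} \ \|Y - XB\|_F^2 \qquad \text{subject to} \quad r(B) \le r .
\]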

The constraint on the rank of the coefficient matrix B implies that B can be written as a product of two lower-dimensional matrices of full rank [10]. If we rewrite B = MN^T, where M ∈ R^{p×r} and N ∈ R^{q×r}, we can regard XM as r latent variables, which achieves dimension reduction and reduces the number of free parameters. Furthermore, the responses can then be seen as driven by r common latent variables, so the relationships among the responses are taken into account. Similarly, many methods that place other penalties, such as the nuclear norm, on the coefficient matrix B can also achieve dimension reduction [5,9,17].

Many sparse estimation methods have been developed for linear regression, such as the lasso [15] and the group lasso [18]. Building on these ideas, various penalties have been introduced into reduced-rank regression to achieve sparse estimation. Chen and Huang used a group-lasso type penalty to encourage row sparsity of the coefficient matrix and thus select important predictors, proposing the sparse reduced-rank regression (SRRR) method [6]. A similar idea can be seen in the RCGL method [3]. Sparse singular value decomposition and sparse eigendecomposition are also popular tools for sparse coefficient matrix estimation [4,16,19]. However, these methods are sensitive to outliers, which are almost inevitable in real data. She and Chen proposed the robust reduced-rank regression method to achieve low-rank coefficient estimation and outlier detection jointly, and demonstrated that joint rank reduction and outlier detection can improve prediction accuracy [13].

We propose a new approach for simultaneous low-rank sparse coefficient matrix estimation and outlier detection, called sparse reduced-rank regression with outlier detection (SROD). A mean-shift matrix is introduced into model (1.1) to capture the effect of outliers and indicate them. We assume the coefficient matrix is of low rank and place a group-lasso type penalty on it to encourage row sparsity. A penalty constructed from a threshold function is added on the mean-shift matrix to ensure its sparsity, since outliers account for only a small fraction of the samples. We develop an iterative algorithm for solving the SROD problem in Section 2. We compare the new method with other reduced-rank regression methods in Section 3, where it performs well in our simulations. In Section 4 we apply the SROD method to real-data analysis. Discussion and conclusions can be found in Section 5.

Notation 1.1. For any vector a = (a_1,...,a_n)^T ∈ R^n, we use ‖a‖_p to denote the l_p norm of a, defined by ‖a‖_p = (Σ_{i=1}^n |a_i|^p)^{1/p} for 1 ≤ p < +∞. Given any m×n matrix A, let r(A) and tr(A) denote the rank and trace of A. ‖A‖_F denotes the Frobenius norm of A, and ‖A‖_p (1 ≤ p < +∞) denotes the order-p operator norm of A. We use A_i and A_j to denote the i-th row and the j-th column of A respectively, and A_{i,j} denotes the element in the i-th row and j-th column of A. Let |·| denote the cardinality of the enclosed set, and define ‖A‖_0 = |{(i,j): A_{i,j} ≠ 0}| and ‖A‖_{2,1} = Σ_i ‖A_i‖_2.

We also define threshold functions as follows [13].

Definition 1.1 (Threshold function). A real-valued function Θ(t;λ) with parameter λ, defined for t ∈ R and 0 ≤ λ < +∞, is called a threshold function if it satisfies the conditions listed below.
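As recalled from She [12,13] (a reconstruction, not necessarily the authors' exact wording), a threshold function is usually required to satisfy: (i) Θ(−t;λ) = −Θ(t;λ); (ii) Θ(t;λ) ≤ Θ(t′;λ) whenever t ≤ t′; (iii) Θ(t;λ) → +∞ as t → +∞; and (iv) 0 ≤ Θ(t;λ) ≤ t for all 0 ≤ t < +∞.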

    §2.Model

    2.1.Multivariate mean-shift regression model

Our method is based on the multivariate mean-shift regression model introduced by She and Chen [13,14]. Taking outliers into account, they add a mean-shift component to model (1.1), leading to the multivariate mean-shift regression model [13]

Y = XB + C + E,    (2.1)

where B is the unknown coefficient matrix and C is the n×q mean-shift matrix describing the effect of outliers on the responses. E has i.i.d. rows from N(0,Σ).

Since outliers account for only a small part of the samples in practice, C is usually assumed to be row-wise sparse or element-wise sparse, with only a few nonzero entries. We consider the case where C is row sparse; that is, if the i-th row of C is nonzero, then the sample (x_i, y_i) is regarded as an outlier.

Based on (2.1), She and Chen proposed the robust reduced-rank regression problem [13], in which P(C;λ) is a penalty function for C with penalty parameter λ ≥ 0 ensuring sparsity and Γ is a positive definite weighting matrix. A sketch of the criterion is given below.
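Up to the exact placement of the weighting matrix Γ, the criterion (2.2) of [13] is presumably of the form

\[
\min_{B,\,C}\ \tfrac{1}{2}\,\big\|(Y - XB - C)\,\Gamma\big\|_F^2 + P(C;\lambda) \qquad \text{subject to} \quad r(B) \le r . \qquad (2.2)
\]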

Although robust reduced-rank regression takes outliers into account, it cannot achieve variable selection. Motivated by the robust reduced-rank regression method [13] and the SRRR method [6], we propose our new method.

    2.2.Sparse reduced-rank regression with outlier detection (SROD)

Taking outliers into account, we propose the sparse reduced-rank regression with outlier detection (SROD) problem for low-rank sparse coefficient matrix estimation, in which λ_i, 1 ≤ i ≤ p, are penalty parameters for the coefficient matrix B, μ is the penalty parameter for the matrix C, and P(·;μ) is the penalty function for the mean-shift matrix C. A sketch of the criterion (2.3) is given below.
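Given the description above, the SROD criterion is presumably of the form

\[
\min_{B,\,C}\ \tfrac{1}{2}\,\|Y - XB - C\|_F^2 + \sum_{i=1}^{p} \lambda_i \,\|B_i\|_2 + P(C;\mu) \qquad \text{subject to} \quad r(B) \le r , \qquad (2.3)
\]

where B_i denotes the i-th row of B.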

We assume the coefficient matrix B is of low rank and introduce a row-sparse penalty on it to encourage row-wise sparsity. Setting B_i = 0, which is equivalent to setting ‖B_i‖_2 = 0, implies that the i-th predictor can be removed from the model [6]. Thus we focus on the row-wise sparse estimation of the coefficient matrix B to achieve variable selection by removing unimportant predictors. If we set the λ_i's all equal to zero, then problem (2.3) reduces to the robust reduced-rank regression (2.2).

Different forms of P(C;μ) can handle different types of outliers. We mainly consider row-wise outliers, penalizing the rows C_i of C as in (2.4); element-wise outliers can be handled instead by penalizing each element of C. Plausible forms of both penalties are given below.
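Consistent with the descriptions above, the row-wise penalty (2.4) is presumably

\[
P(C;\mu) = \sum_{i=1}^{n} P\big(\|C_i\|_2;\,\mu\big), \qquad (2.4)
\]

and the element-wise version replaces it with

\[
P(C;\mu) = \sum_{i=1}^{n}\sum_{j=1}^{q} P\big(C_{i,j};\,\mu\big).
\]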

Similar to robust reduced-rank regression, P(·;μ) is constructed from any threshold function Θ(·;μ) with parameter μ [13], up to an additive term q(·;μ) satisfying q(s;μ) ≥ 0 and q{Θ(s;μ);μ} = 0 for all s ∈ R. A sketch of this construction (2.5) is given below.
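Following She [12,13], the construction (2.5) presumably reads

\[
P(t;\mu) = \int_{0}^{|t|} \Big( \sup\{s : \Theta(s;\mu) \le u\} - u \Big)\, du \;+\; q(t;\mu). \qquad (2.5)
\]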

The association (2.5) between the threshold rule Θ(·;μ) and the penalty function P(·;μ) covers many popular penalties [12].

For instance, when Θ is the soft-thresholding rule Θ_S(t;μ) = sgn(t)(|t|−μ)1(|t|≥μ), the penalty constructed from (2.5) is P(t;μ) = μ|t|, so the row-wise form (2.4) becomes the group l_1 penalty μΣ_i‖C_i‖_2. The construction also covers l_p and other penalties [12].
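As a concrete illustration (our own sketch, not code from the paper), the soft-thresholding rule above, together with the hard-thresholding rule that corresponds to the l_0-type penalty used later in Section 3, can be implemented as:

```python
import numpy as np

def soft_threshold(t, mu):
    """Theta_S(t; mu) = sgn(t) * (|t| - mu)_+, which induces the l1 penalty mu*|t|."""
    return np.sign(t) * np.maximum(np.abs(t) - mu, 0.0)

def hard_threshold(t, mu):
    """Theta_H(t; mu) = t * 1(|t| > mu), which corresponds to an l0-type penalty."""
    return np.where(np.abs(t) > mu, t, 0.0)
```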

    2.3.Algorithm for SROD

An iterative algorithm is developed to solve the SROD problem by updating the coefficient matrix B and the mean-shift matrix C in turn.

Note that for fixed C, problem (2.3) reduces to a subproblem (2.7) in B alone, which can be written in terms of the factorization B = MN^T with M ∈ R^{p×r} and N ∈ O^{q×r}, where O^{q×r} = {N ∈ R^{q×r}: N^T N = I_r}. Since we minimize (2.7) over all p×q matrices satisfying r(B) ≤ r, we cannot directly give an explicit solution to (2.7). We use an iterative algorithm similar to the RCGL method [3] to solve (2.7), with the matrices M and N updated in turn, as sketched below.
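The following is a minimal numerical sketch (ours, not the authors' implementation) of this alternation. It assumes the subproblem (2.7) has the form min over M ∈ R^{p×r}, N ∈ O^{q×r} of (1/2)‖(Y−C) − XMN^T‖_F^2 + Σ_i λ_i‖M_i‖_2 with a common λ_i ≡ λ, uses row-wise soft-thresholding for the C-step (justified by Theorem 2.1 below for that choice of penalty), and uses proximal-gradient steps for the M-update; all function and variable names are ours.

```python
import numpy as np

def row_soft_threshold(R, mu):
    """Group soft-thresholding: shrink each row of R toward zero by mu in l2 norm."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - mu / np.maximum(norms, 1e-12))
    return scale * R

def b_step(Y_adj, X, r, lam, n_iter=50):
    """Approximate B-update for fixed C (subproblem (2.7)): B = M N^T with
    N^T N = I_r, alternating an orthogonal-Procrustes update for N and a
    proximal-gradient (row-wise soft-thresholding) update for M."""
    p = X.shape[1]
    rng = np.random.default_rng(0)
    M = 0.01 * rng.standard_normal((p, r))
    step = 1.0 / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the smooth part
    for _ in range(n_iter):
        # N-step: maximize tr(N^T (Y - C)^T X M) over N^T N = I_r, i.e. N = U V^T
        U, _, Vt = np.linalg.svd(Y_adj.T @ X @ M, full_matrices=False)
        N = U @ Vt
        # M-step: one proximal-gradient pass on 0.5*||Y_adj N - X M||_F^2 + lam * sum_i ||M_i||_2
        grad = X.T @ (X @ M - Y_adj @ N)
        M = row_soft_threshold(M - step * grad, step * lam)
    return M @ N.T

def srod(Y, X, r, lam, mu, n_outer=30):
    """Outer SROD loop: alternate the C-step (thresholding the current residual
    to flag outlying rows) and the B-step (sparse reduced-rank fit on Y - C)."""
    B = np.zeros((X.shape[1], Y.shape[1]))
    for _ in range(n_outer):
        C = row_soft_threshold(Y - X @ B, mu)   # rows with small residual norm are set to zero
        B = b_step(Y - C, X, r, lam)
    return B, C
```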

Theorem 2.1. Let Θ be any threshold function satisfying Definition 1.1, let G(B,C) be defined by (2.8), and let the penalty function P(·;μ) be constructed from Θ through (2.5). Then (1) for fixed B, thresholding the residual Y − XB yields a globally optimal solution of (2.8) in C; and (2) the iterates produced by Algorithms 1 and 2 give non-increasing objective values.

Proof. Part (1): The proof is based on Lemma 1 of She [12]; the details can be found there and are thus omitted. Using this lemma, for fixed B we obtain the globally optimal solution to problem (2.8) by the thresholding rule sketched below.
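For the row-wise penalty, with R = Y − XB this update is presumably the multivariate thresholding of the residual rows,

\[
\hat{C}_i \;=\; \frac{R_i}{\|R_i\|_2}\,\Theta\big(\|R_i\|_2;\,\mu\big), \qquad i = 1,\dots,n ,
\]

with the convention Ĉ_i = 0 when R_i = 0; in the element-wise case one simply takes Ĉ_{i,j} = Θ(R_{i,j};μ).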

Part (2): First we prove that in Algorithm 1 we have F(B^{(t)}) ≥ F(B^{(t+1)}), where F(B) is defined by (2.7); the argument is similar to that for the RCGL method [3]. Since N^T N = I_r, for fixed N the objective of (2.7) can be rewritten, up to a term not depending on M, as (1/2)‖(Y−C)N − XM‖_F^2 + Σ_i λ_i‖M_i‖_2, which is convex in M, so its global minimum is attained at some M. For fixed M, minimizing (2.7) over N is equivalent to maximizing tr(N^T(Y−C)^T XM) over N ∈ O^{q×r}.

This global maximum can also be achieved, and the maximizer is exactly the N-update used in Algorithm 1. Hence each M-update and each N-update does not increase the objective of (2.7); that is, F(B^{(t)}) ≥ F(B^{(t+1)}), which is equivalent to G(B^{(t)}, C^{(t)}) ≥ G(B^{(t+1)}, C^{(t)}).

For Algorithm 2, combining this with part (1) gives G(B^{(t)}, C^{(t)}) ≥ G(B^{(t+1)}, C^{(t)}) ≥ G(B^{(t+1)}, C^{(t+1)}), so the objective values are non-increasing. Consequently, the proof is completed.

    §3.Simulation

    3.1.Simulation setups

We consider three model setups to compare our proposed method with several reduced-rank regression methods. Denote r_x = r(X) and let r★ be the true rank of the coefficient matrix B★. In all models, the design matrix X is generated as X = X1X2Δ^{1/2}, where X1 and X2 have i.i.d. elements from N(0,1) and Δ has diagonal elements 1 and off-diagonal elements ρ_x.

We construct a sparse coefficient matrix B★ whose first J_B rows are nonzero and whose remaining rows are zero, with the nonzero block built from matrices B1 and B2 having i.i.d. entries from N(0,1); J_B is the number of nonzero rows of the true coefficient matrix. The error matrix E has i.i.d. rows from N(0,σ²Σ), where Σ has diagonal elements 1 and off-diagonal elements ρ_e. Similar to She and Chen [13], σ² is set through the ratio between the r★-th singular value of XB★ and ‖E‖_F, which controls the signal-to-noise ratio.

Outliers are added by setting the first 10% of the rows of C★ to be nonzero. Specifically, the elements of these rows are sampled uniformly from [−15,−10]∪[10,15]. Finally, the response matrix Y is constructed as Y = XB★ + C★ + E.
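A hedged sketch of this data-generating process, using the Model 1 settings listed in the next paragraph, is given below; the shapes of X1, X2, B1, B2 and the handling of σ² are our reading of the text rather than the authors' code.

```python
import numpy as np

def equi_corr(dim, rho):
    """Matrix with unit diagonal and constant off-diagonal entries rho."""
    S = np.full((dim, dim), rho)
    np.fill_diagonal(S, 1.0)
    return S

def sym_sqrt(S):
    """Symmetric square root of a positive semidefinite matrix."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ V.T

def simulate(n=100, p=30, q=10, r_star=3, r_x=20, J_B=10,
             rho_x=0.5, rho_e=0.3, sigma=1.0, outlier_frac=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # Low-rank, correlated design: X = X1 X2 Delta^{1/2} with r(X) = r_x
    X1 = rng.standard_normal((n, r_x))
    X2 = rng.standard_normal((r_x, p))
    X = X1 @ X2 @ sym_sqrt(equi_corr(p, rho_x))
    # Row-sparse, rank-r_star coefficient matrix: only the first J_B rows are nonzero
    B1 = rng.standard_normal((J_B, r_star))
    B2 = rng.standard_normal((q, r_star))
    B_star = np.vstack([B1 @ B2.T, np.zeros((p - J_B, q))])
    # Errors with i.i.d. rows from N(0, sigma^2 * Sigma); sigma is a placeholder here,
    # whereas the paper sets it through the signal-to-noise ratio
    E = sigma * rng.standard_normal((n, q)) @ sym_sqrt(equi_corr(q, rho_e))
    # Outliers: first 10% of the rows of C_star, entries drawn from [-15,-10] U [10,15]
    n_out = int(outlier_frac * n)
    C_star = np.zeros((n, q))
    C_star[:n_out] = rng.choice([-1.0, 1.0], (n_out, q)) * rng.uniform(10, 15, (n_out, q))
    Y = X @ B_star + C_star + E
    return Y, X, B_star, C_star
```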

The three models have different sample sizes. Models 1 and 2 consider the situation where n > p. In Model 1 we set n=100, p=30, q=10, r★=3, ρ_e=0.3, ρ_x=0.5, J_B=10 and r_x=20. In Model 2 we set n=200, p=50, q=10, r★=5, ρ_e=0.3, ρ_x=0.5, J_B=10 and r_x=20. In Model 3 we consider the situation where p > n and set n=50, p=100, q=10, r★=5, ρ_e=0.3, ρ_x=0.5 and J_B=10.

3.2.Methods and evaluation metrics

We compare our proposed SROD method with several reduced-rank regression methods: plain reduced-rank regression (RRR) [2], sparse reduced-rank regression (SRRR) [6], SOFAR [16], RSSVD [4] and robust reduced-rank regression (R4) [13]. SRRR, SOFAR and RSSVD can achieve sparse coefficient estimation but are sensitive to outliers. Robust reduced-rank regression can achieve outlier detection but cannot select important variables.

In the simulation we use the l_0 penalized form for the mean-shift matrix C. To reduce the number of tuning parameters in our SROD method, we use the idea of the adaptive lasso [6,20]: given a pilot estimator B̃ of the coefficient matrix, we set λ_i = λ‖B̃_i‖_2^γ and simply choose γ = −2 in the simulation [20]. Then we only need to tune three parameters, r, λ and μ, which are tuned by BIC. The parameters of RRR and SRRR are tuned by five-fold cross-validation, the parameters of SOFAR are tuned by GIC [7,16], and the parameters of RSSVD are tuned by BIC. In particular, we preset the desired rank of the coefficient matrix for SRRR, SOFAR and RSSVD, which means these methods estimate the coefficient matrix on the premise that the optimal rank is already known.

    3.3.Simulation results

Tables 1-3 show the simulation results for the three models respectively. In Models 1 and 2, where n > p, our proposed SROD method can correctly detect the outliers. Except for robust reduced-rank regression (R4), the other four methods cannot achieve outlier detection. The outlier-detection results show that our proposed method is competitive with robust reduced-rank regression. Our proposed method also has the lowest mean prediction error among all methods, which implies that outlier detection helps to recover the low-rank sparse structure of the coefficient matrix B. The high mean squared error of the last four methods is mainly caused by the outliers: RRR, SRRR, RSSVD and SOFAR cannot explicitly detect outliers, which affects their estimation of the coefficient matrix B and shows that reduced-rank regression methods are sensitive to outliers. Reduced-rank regression and robust reduced-rank regression do not encourage sparsity of the coefficient matrix and thus cannot achieve variable selection, so these two methods cannot recover the sparse structure of the coefficient matrix correctly.

Table 1: Simulation results for Model 1 with signal-to-noise ratio 0.75. Reported are the means and standard deviations (in parentheses) of our evaluation metrics over all runs.

Table 2: Simulation results for Model 2 with signal-to-noise ratio 0.75. Reported are the means and standard deviations (in parentheses) of our evaluation metrics over all runs.

Table 3: Simulation results for Model 3 with signal-to-noise ratio 0.75. Reported are the means and standard deviations (in parentheses) of our evaluation metrics over all runs.

From Tables 1 and 2, our SROD method has the lowest FPR and selects almost all the important variables correctly. The other sparse reduced-rank regression methods tend to over-select, i.e., they select more variables than necessary, producing high FPR and low FNR.

Table 3 shows the results for Model 3, where the number of predictors exceeds the number of observations. The estimate from our proposed method has a low mean squared error and shows competitive prediction error compared with robust reduced-rank regression. The SROD method correctly detects the outliers and has the lowest FPR among all the methods. However, our SROD method tuned by BIC has a high bias in the rank estimate when p > n. In fact, for SROD, if we plot the value of BIC against the rank over all runs of Model 3, the curves all drop dramatically at rank 5 and then decrease slowly. To choose the rank of the coefficient matrix accurately when n < p, better information criteria for parameter tuning need to be studied further.

Table 4: Mean and standard deviation (in parentheses) of predicted squared error (×1000).

    §4.Real data analysis

The multivariate linear model has been widely used in finance, where an important task is to predict future returns on the basis of historical information. The vector autoregression (VAR) model is a useful method for multivariate time series analysis. We choose a dataset of weekly stock log-returns for nine of the largest American companies in 2004; the data are available in the R package MRCE [11], and this dataset has been analyzed many times in the literature [11,13,17]. We do not consider Chevron because its stock price dropped dramatically that year [17]. A vector autoregression model of order 1 can be used to analyze the data [17]. Letting y_t be the vector of returns at time t, the VAR(1) model can be written as a multivariate linear regression model Y = XB + E, where X has rows y_1,...,y_{T−1} and Y has rows y_2,...,y_T.

For the weekly log-return data we have y_t ∈ R^9, t = 1,...,52. We use the data from the first half of the year for training and the rest for testing. Specifically, for the training set we set T = 26, using the log-returns at weeks 1-25 as predictors and the log-returns at weeks 2-26 as responses. For the testing set, we use the log-returns at weeks 27-51 as predictors and the log-returns at weeks 28-52 as responses to evaluate the model built on the training set.
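As a small illustration (our own, not from the paper), the lagged design can be built from a 52×9 array of weekly log-returns; the array name logret and the helper below are hypothetical.

```python
def var1_design(logret, start, end):
    """VAR(1) regression data: predictors are the returns at rows start..end-1
    (0-based weeks) and responses are the returns one week later."""
    X = logret[start:end]
    Y = logret[start + 1:end + 1]
    return X, Y

# Training: weeks 1-25 predict weeks 2-26; testing: weeks 27-51 predict weeks 28-52.
# X_train, Y_train = var1_design(logret, 0, 25)
# X_test, Y_test = var1_design(logret, 26, 51)
```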

We consider entry-wise outliers and use the l_0 penalized form for outlier detection. The parameters are tuned by the PIC criterion proposed by She and Chen [13]. Prediction accuracy is measured by the mean squared error for each stock. When estimating the coefficient matrix, the inputs to our procedure are multiplied by 100 for numerical convenience; the predicted mean squared error for each stock in the testing set is then calculated on the original scale. Other methods, namely ordinary least squares, plain reduced-rank regression (RRR) [2], sparse reduced-rank regression (SRRR) [6] and robust reduced-rank regression (R4), are also used for comparison. For robust reduced-rank regression we use the entrywise penalized form.

Fig. 1 Weekly log-returns (×100) of nine companies in 2004. Outliers detected only by the SROD method are labeled with blue circles; green circles mark the outliers detected by both the SROD and R4 methods.

The weekly log-returns of these nine companies are shown in Figure 1, and Table 4 reports the averaged predicted squared error for each method. We run the SROD method 30 times; it detects four outliers, namely the log-returns of Ford and General Motors at weeks 5 and 17. Robust reduced-rank regression with the entrywise penalized form detects three outliers, which are among the four outliers detected by SROD. According to She and Chen, these outliers are associated with two real market disturbances [13]. In Figure 1, green circles mark the outliers detected by both the R4 and SROD methods, and blue circles mark the outliers detected only by SROD.

Our proposed SROD method performs competitively compared with the other methods. All the reduced-rank regression methods result in a unit-rank model. Over the 30 runs, the log-returns of Ford, ConocoPhillips and Citigroup are selected as important variables more than 20 times, while the log-returns of Exxon, GE and IBM are selected fewer than 10 times. We therefore consider Ford, ConocoPhillips and Citigroup important variables for future price prediction, which is similar to the result of the MRCE method [11].

    §5.Discussion

Based on the multivariate mean-shift regression model, we take outliers into account and propose a new sparse reduced-rank method that achieves low-rank sparse estimation as well as outlier detection. The simulation results and the real-data application show that our SROD method performs well. However, we only consider row sparsity of the coefficient matrix; many reduced-rank regression methods encourage element-wise sparsity via sparse singular value decomposition, and how to achieve element-wise sparsity and outlier detection jointly needs further research. A disadvantage of our method is that it involves many tuning parameters, and for very large-scale data the tuning procedure and the proposed algorithm may be expensive, since we cannot give an explicit form of the solution. How to choose a penalty that encourages sparsity of the coefficient matrix B and can be solved easily and efficiently should therefore be considered.
