
    An Efficient Algorithm for Low Rank Matrix Restoration Problem with Unknown Noise Level

2022-01-11 09:22:02

    WANG Duo, SHANG You-lin, LV Jin-man

(1. School of Mathematics and Statistics, Henan University of Science and Technology, Luoyang 471023, China; 2. LMIB of the Ministry of Education, School of Mathematical Sciences, Beihang University, Beijing 100191, China; 3. State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing 100044, China; 4. School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China)

Abstract: Recovering an unknown high-dimensional low-rank matrix from a small set of entries arises widely in machine learning, system identification, image restoration, etc. In many practical applications, the few observations are corrupted by noise, and the noise level is also unknown. A novel model combining the nuclear norm with a square-root-type estimator has been proposed, which relies neither on knowledge nor on an estimate of the standard deviation of the noise. In this paper, we first reformulate the problem into an equivalent variable-separated form by introducing an auxiliary variable. We then propose an efficient alternating direction method of multipliers (ADMM) for solving it. Both resulting subproblems admit explicit solutions, which makes each iteration of our algorithm computationally cheap. Finally, numerical results show the benefits of the model and the efficiency of the proposed method.

Keywords: Matrix restoration; Alternating direction method of multipliers; Square root least squares; Matrix completion

§1. Introduction

The problem of recovering a high dimensional matrix from noisy observations with unknown noise variance arises in many applications such as system identification, machine learning and image processing [2,20,22]. In the high dimensional setting, the unknown matrix is often exactly or approximately of low rank. Thus high dimensional low rank matrix estimation from noisy observations has attracted much attention, in particular in its two special settings of matrix completion (MC) and multivariate linear regression [17]. In this paper, we focus on solving the low rank matrix completion problem and its general form with unknown noise level, which are widely applied in collaborative filtering and machine learning [4,18,19,23]. The mathematical model of noiseless MC can be formulated as

        min_X rank(X)   s.t.   X_{ij} = M_{ij},  (i,j) ∈ Ω,        (1.1)

where M is the real unknown matrix with some available sampled entries and Ω is a given set of index pairs (i,j). Its general form is the following matrix rank minimization problem

        min_X rank(X)   s.t.   A(X) = b,        (1.2)

where A: R^{m×n} → R^p is a linear map and b ∈ R^p is a given measurement vector. The rank minimization problems (1.1) and (1.2) are NP-hard because of the combinatorial nature of the rank function [21]. To solve these problems more efficiently, a popular approach is to replace the rank of the matrix by its nuclear norm, which is the best convex relaxation of the rank function over the unit ball of matrices with operator norm at most one [21]. Problem (1.2) can then be transformed into

        min_X ||X||_*   s.t.   A(X) = b.        (1.3)

The nuclear norm of X is defined as the sum of its singular values, i.e., ||X||_* = σ_1 + σ_2 + ··· + σ_r, where σ_1 ≥ σ_2 ≥ ··· ≥ σ_r > 0 are the r positive singular values of the matrix X. In practice, if the measurement b contains a small amount of noise, problem (1.3) can be relaxed to the following inequality constrained nuclear norm minimization problem

        min_X ||X||_*   s.t.   ||A(X) − b||_2 ≤ δ,        (1.4)

or its equivalent nuclear norm regularized least squares problem

        min_X  μ||X||_* + (1/2)||A(X) − b||_2^2,        (1.5)

where δ ≥ 0 reflects the noise level and μ > 0 is a regularization parameter.
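    As a quick illustration of the nuclear norm used throughout (our own sketch, not code from the paper), it can be computed directly from the singular values returned by a standard SVD routine:

```python
import numpy as np

def nuclear_norm(X):
    """||X||_*: the sum of the singular values of X."""
    return np.linalg.svd(X, compute_uv=False).sum()

# Rank-1 sanity check: the only nonzero singular value of u v^T
# is ||u||_2 * ||v||_2.
u = np.array([[3.0], [4.0]])   # ||u||_2 = 5
v = np.array([[1.0, 2.0]])     # ||v||_2 = sqrt(5)
X = u @ v
```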

The above nuclear norm minimization problems have been widely studied in recent years. Since the problems are convex, they can be solved by many efficient algorithms, such as SeDuMi [27] and SDPT3 [31], the singular value thresholding (SVT) algorithm [3], the fixed point continuation with approximate SVD (FPCA) method [21], the accelerated proximal gradient (APG) method [30], ADMM type algorithms [6,14,16,26,33], etc.

It is clear that most of the noisy models above fit the data fidelity term with the squared l_2 norm, which deals effectively with the noiseless and Gaussian noise settings. Moreover, in the noisy case the recovery of X relies on knowing the standard deviation of the noise. In practical applications, however, the noise level is often unknown, or is non-trivial to estimate when the problem scale is large. Therefore, there is a certain gap between theory and practice. Many researchers have previously studied multivariate linear regression with unknown noise variance [12,32]. The scaled Lasso [24] and the penalized Gaussian log-likelihood [25] have been proposed for dealing with the unknown noise level in high dimensional sparse regression. Fonseca and Mark [9] first proposed a square-root Lasso method for estimating high-dimensional sparse linear regression models. Bellec et al [1] and Derumigny [5] showed that the square root Lasso can achieve the minimax optimal rate of convergence under suitable conditions, even though the noise level is unknown. The square root Lasso model is essentially equivalent to the scaled Lasso proposed by Sun and Zhang [28]. Recently, Tang et al [29] considered high-dimensional nonconvex square root loss regression problems and introduced a proximal majorization-minimization algorithm for solving them. From the above research, we know that the choice of the regularization parameter is usually related to the noise level, and the advantage of the square root norm is that choosing the regularization parameter does not rely on the knowledge or on an estimation of the standard deviation of the noise. Multivariate linear regression (matrix regression) with unknown noise variance was previously considered in [2,11], which studied rank penalized estimators; these works need assumptions on the dimensions of the problem and on the rank of the unknown matrix, respectively. Thus a new method using the Frobenius norm instead of the squared Frobenius norm was proposed in [17], which needs weaker conditions than those obtained in [2,11]. They considered the matrix completion problem and the matrix regression problem, and reformulated the low rank matrix completion problem as

        min_X  μ||X||_* + ||A(X) − b||_2,        (1.7)

where the observation b ∈ R^p may contain some noise, μ > 0 is a given regularization parameter, and ||·||_2 is the Euclidean norm of a vector. When A is the linear operator that picks the entries with index pairs in Ω from a matrix, problem (1.7) reduces to the MC problem. Clearly, problem (1.7) is not easy to solve directly because of the two nonsmooth terms in the objective function. Thus we first add a new variable and transfer the model (1.7) to its augmented Lagrangian form. Then we split the task of minimizing the corresponding augmented Lagrangian function into two subproblems. The resulting y-subproblem has a closed-form solution, and the closed-form solution of the X-subproblem can also be easily derived by linearizing the quadratic term. The proposed algorithm is therefore easy to implement, and each subproblem is easy to solve.

The main purpose of this paper is to introduce an ADMM for solving the model (1.7), which can deal with the high dimensional matrix restoration problem with unknown noise level. Moreover, convergence results are given. The advantage of the model and the efficiency of the proposed algorithm are demonstrated by numerical experiments.

The rest of this paper is organized as follows. In Section 2, we construct the ADMM for problem (1.7) and give the convergence of the proposed algorithm. The numerical results are reported in Section 3. Finally, we conclude the paper in Section 4.

§2. Algorithm

In this section, we first briefly review some typical ADMM schemes, and then show how to employ the ADMM to solve problem (1.7). Finally, we give the convergence result of the proposed method.

2.1. Review of ADMM

Let X, Y, Z be finite dimensional real Euclidean spaces. Consider the following convex optimization problem with a two-block separable structure

        min_{y,z}  f(y) + g(z)   s.t.   A*y + B*z = c,        (2.1)

where f: Y → (−∞,+∞] and g: Z → (−∞,+∞] are closed proper convex functions, A: X → Y and B: X → Z are given linear maps with adjoints A* and B* respectively, and c ∈ X is given data. The augmented Lagrangian function associated with (2.1) is

        L_β(y,z;x) = f(y) + g(z) + ⟨x, A*y + B*z − c⟩ + (β/2)||A*y + B*z − c||²,        (2.2)

where x ∈ X is a multiplier and β > 0 is a given penalty parameter. Starting from (x^0, y^0, z^0) ∈ X × dom f × dom g, the general iterative scheme of ADMM (with self-adjoint linear operators T_f and T_g in the proximal terms) for solving (2.1) can be expressed as

        y^{k+1} = argmin_y  L_β(y, z^k; x^k) + (1/2)||y − y^k||²_{T_f},
        z^{k+1} = argmin_z  L_β(y^{k+1}, z; x^k) + (1/2)||z − z^k||²_{T_g},        (2.3)
        x^{k+1} = x^k + ξβ(A*y^{k+1} + B*z^{k+1} − c),

where ξ is a step-length chosen in the interval (0, (1+√5)/2). It is well known that the iterative scheme (2.3) is exactly the classical ADMM introduced by Glowinski & Marroco [13] and Gabay & Mercier [10] when T_f = 0 and T_g = 0. When T_f and T_g are positive definite and ξ = 1, it becomes the proximal ADMM of Eckstein [7]. When both T_f and T_g are self-adjoint positive semi-definite linear operators, it turns into the semi-proximal ADMM of Fazel et al [8].

2.2. Algorithm

In this subsection, we employ the ADMM to solve problem (1.7). Introducing an auxiliary variable y = A(X) − b, problem (1.7) can be transformed equivalently into

        min_{X,y}  μ||X||_* + ||y||_2   s.t.   A(X) − y = b.        (2.4)

The corresponding augmented Lagrangian function of (2.4) is

        L_β(X, y; z) = μ||X||_* + ||y||_2 + ⟨z, A(X) − y − b⟩ + (β/2)||A(X) − y − b||²_2,        (2.5)

where z ∈ R^p is the Lagrangian multiplier and β > 0 is a penalty parameter. Given (X^k, y^k, z^k), we obtain (X^{k+1}, y^{k+1}) by alternately minimizing (2.5), i.e.,

        X^{k+1} = argmin_X  L_β(X, y^k; z^k),
        y^{k+1} = argmin_y  L_β(X^{k+1}, y; z^k).        (2.6)

First, for the given (X^k, y^k, z^k), we can obtain X^{k+1} by minimizing (2.5) with respect to X as follows:

        X^{k+1} = argmin_X  μ||X||_* + (β/2)||A(X) − y^k − b + z^k/β||²_2.        (2.7)

However, its analytic solution is not available in general because of the linear mapping A. Let g^k = A*(A(X^k) − y^k − b + z^k/β) be the gradient of (1/2)||A(X) − y^k − b + z^k/β||²_2 at X^k. Instead of solving (2.7), the X-subproblem can be solved via the approximated model

        X^{k+1} = argmin_X  μ||X||_* + (β/2τ)||X − (X^k − τg^k)||²_F.        (2.8)

To get the exact solution of (2.8), let Y = X^k − τg^k; then the solution of the X-subproblem can be reformulated as

        X^{k+1} = argmin_X  μ||X||_* + (β/2τ)||X − Y||²_F.        (2.9)

According to the theorem in [3,21], let Y = UΣV^T, Σ = diag({σ_i}_{1≤i≤r}) be the singular value decomposition (SVD) of the matrix Y ∈ R^{m×n} of rank r. For each τμ/β > 0, we let D_{τμ/β}(Y) = UΣ_{τμ/β}V^T, Σ_{τμ/β} = diag({σ_i − τμ/β}_+), where (t)_+ := max{t,0}. Then X^{k+1} can be written directly as

        X^{k+1} = D_{τμ/β}(X^k − τg^k).        (2.10)
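    The shrinkage operator D_{τμ/β} above is the singular value soft-thresholding operator of [3]; a minimal NumPy sketch (the function name `svt` is ours):

```python
import numpy as np

def svt(Y, tau):
    """D_tau(Y): soft-threshold the singular values of Y by tau.
    The singular vectors are kept; every singular value is reduced
    by tau, and those that fall below zero are dropped, which is
    what makes the iterates low rank."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

# On a diagonal matrix the effect is visible directly:
X_new = svt(np.diag([3.0, 1.0]), 2.0)   # singular values 3, 1 -> 1, 0
```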

Secondly, for the given z^k and the latest X^{k+1}, it is easy to deduce that

        y^{k+1} = argmin_y  ||y||_2 + (β/2)||y − w^k||²_2 = max{1 − 1/(β||w^k||_2), 0} · w^k,        (2.11)

    where w^k = A(X^{k+1}) − b + z^k/β.
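    In other words, the y-update in (2.11) is a block soft-thresholding: the whole vector is shrunk toward zero by 1/β, and collapses to zero when its norm is at most 1/β. A sketch (function name ours):

```python
import numpy as np

def prox_l2(w, beta):
    """argmin_y ||y||_2 + (beta/2)*||y - w||_2^2.
    The minimizer is collinear with w; its norm is reduced by 1/beta,
    and w is mapped to zero whenever ||w||_2 <= 1/beta."""
    nw = np.linalg.norm(w)
    if nw <= 1.0 / beta:
        return np.zeros_like(w)
    return (1.0 - 1.0 / (beta * nw)) * w
```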

Finally, for the given (X^{k+1}, y^{k+1}), the Lagrangian multiplier is updated by

        z^{k+1} = z^k + ξβ(A(X^{k+1}) − y^{k+1} − b).        (2.12)

In light of the above analysis, we give the following framework of the ADMM for solving the nuclear norm minimization with square root regularization problem (1.7). We denote it by ADMM_NNSR.

Initialization: Input X^0, y^0 and z^0. Given constants β > 0, τ > 0 and ξ ∈ (0, (1+√5)/2). Set k = 0.

Step 1. Compute X^{k+1} via (2.10).

    Step 2. Compute y^{k+1} via (2.11).

    Step 3. Compute z^{k+1} via (2.12).

    Step 4. If the termination criterion is not met, let k = k + 1 and go to Step 1.
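    For the matrix completion case, where A(X) simply reads out the sampled entries, Steps 1-4 can be sketched end to end as below. This is our own illustrative implementation: the sign convention for the multiplier term and the parameter values are assumptions, not the settings of the paper's experiments.

```python
import numpy as np

def svt(Y, tau):
    """Singular value soft-thresholding, the operator D_tau of (2.10)."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def admm_nnsr_mc(b, mask, shape, mu=0.1, beta=0.5, tau=1.1, xi=1.0, iters=300):
    """ADMM_NNSR sketch for matrix completion (A = sampling on mask).
    b: observed values; mask: boolean m-by-n array of observed positions."""
    m, n = shape
    X = np.zeros((m, n))
    y = np.zeros(b.size)            # auxiliary variable, y ~ A(X) - b
    z = np.zeros(b.size)            # Lagrange multiplier
    for _ in range(iters):
        # Step 1: linearized X-update -> one singular value thresholding.
        g = np.zeros((m, n))
        g[mask] = X[mask] - y - b + z / beta   # g = A*(A(X) - y - b + z/beta)
        X = svt(X - tau * g, tau * mu / beta)
        # Step 2: y-update -> block soft-thresholding of the residual.
        w = X[mask] - b + z / beta
        nw = np.linalg.norm(w)
        y = np.zeros_like(w) if nw <= 1.0 / beta else (1.0 - 1.0 / (beta * nw)) * w
        # Step 3: multiplier update with step length xi.
        z = z + xi * beta * (X[mask] - y - b)
    return X

# Toy run: complete a 20x20 rank-2 matrix from ~60% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 2)) @ rng.standard_normal((2, 20))
mask = rng.random((20, 20)) < 0.6
b = M[mask]
X_hat = admm_nnsr_mc(b, mask, (20, 20))
```

    The paper's experiments instead use a PDCT map for A, β = 0.2/min(m,n), and PROPACK for partial SVDs; a full dense SVD is used here only to keep the sketch short.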

This subsection is concluded with the following convergence theorem; we state the convergence of ADMM_NNSR without proof. One may refer to Theorem B.1 in [8] for more details.

Theorem 2.1. Let the sequence {(X^k, y^k, z^k)} be generated by the proposed ADMM_NNSR with ξ ∈ (0, (1+√5)/2). Then the sequence {(X^k, y^k, z^k)} converges to (X*, y*, z*), where (X*, y*) is an optimal solution of problem (2.4) and z* is its corresponding Lagrangian multiplier.

§3. Numerical experiments

In this section, we report some numerical experiments to illustrate that the model (1.7) makes the optimal value of the parameter μ independent of the noise level. Furthermore, it is more stable than the nuclear norm regularized least squares problem with respect to the noise level. First, we test the nuclear norm regularized least squares problem against (1.7) on randomly generated matrix completion problems. Then, we apply the ADMM_NNSR method to solve the general model (1.7) under different experimental settings.

    All experiments are performed under Windows 7 premium and MATLAB 2016a running on a Lenovo laptop with an Intel core CPU at 2.5 GHz and 4 GB memory.

In all experiments, we produce a real low rank matrix M via the Matlab script "randn(m,r)×randn(r,n)". Obviously, the rank of M is r. We use r, sr, p, Ω to denote, respectively, the rank of M, the sampling ratio, the number of measurements, and the index set of known entries for the matrix completion problem. Additionally, the number of degrees of freedom of a rank-r matrix is defined by d_r = r(m+n−r). Since the DCT matrix-vector multiplication is implemented implicitly via the FFT, which enables us to test problems more efficiently, the partial discrete cosine transform (PDCT) matrix is chosen as the linear map A. In all tests, the penalty parameter is chosen as β = 0.2/min(m,n), and we set the approximation parameter τ = 1.1, because values of τ slightly greater than 1/ρ(A*A) were observed experimentally to accelerate the convergence rate; see [34]. In the noisy cases, the measurement b is produced by b = A(X) + ω, where ω is additive Gaussian noise of zero mean and standard deviation σ, generated by ω = σ×randn(p,1). For each test, we use X* to denote the solution produced by our algorithm, and we measure the quality of X* against the original matrix M by the relative error RelErr = ||X* − M||_F / ||M||_F. We say that M is recovered successfully by X* if the corresponding RelErr is less than 10^{-3}, as in [3,15,21].

It is clear that at each iteration all singular values and singular vectors of Y need to be computed, which is very expensive. So in our implementation we use the PROPACK package for partial SVD. However, PROPACK is not able to automatically compute only those singular values larger than a threshold. Therefore, we need to determine the number of singular values to be computed at the k-th iteration, denoted by sv_k. We initialize sv_0 = min(m,n)/20 and update it by the same strategy as [30], that is

        sv_{k+1} = svp_k + 1  if  svp_k < sv_k,   and   sv_{k+1} = svp_k + 5  otherwise,

where svp_k denotes the number of positive singular values of the matrix.
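    The update strategy above can be sketched as follows (our sketch; the +1/+5 constants follow the rule commonly used with this strategy in the APG literature [30]):

```python
def next_sv(sv_k, svp_k, sv_max):
    """Choose how many singular values to request from the partial SVD
    at iteration k+1.  If fewer positive values appeared than were
    requested (svp_k < sv_k), the rank has settled, so ask for only one
    more than observed; otherwise the rank may still be growing, so
    look further ahead.  sv_max caps the request at min(m, n)."""
    sv_next = svp_k + 1 if svp_k < sv_k else svp_k + 5
    return min(sv_next, sv_max)
```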

3.1. Matrix completion problems

In this subsection, we present some numerical results to illustrate the feasibility and efficiency of the proposed algorithm. For this purpose, we test the following two matrix completion problems. One is the nuclear norm matrix completion problem with square root fidelity term:

        min_X  μ||X||_* + ||A(X) − b||_2,        (3.1)

    where A is the sampling operator that picks the entries with index pairs in Ω.

The other one is the nuclear norm regularized least squares problem:

        min_X  μ||X||_* + (1/2)||A(X) − b||_2^2.        (3.2)

In the following tests, the models (3.1) and (3.2) are both solved by an ADMM type method. First, we test problems (3.1) and (3.2) with different noise levels: 0.01, 0.03, 0.1 and 0.3. The results are reported in Table 1. From Table 1 we observe that model (3.1) is more efficient than model (3.2) in terms of accuracy and running time. Moreover, in this table "rX" denotes the rank recovered by model (3.2); we note that model (3.2) failed to attain the correct rank in some cases. Hence, model (3.1) is more effective than model (3.2) for solving the low rank matrix restoration problem.

Table 1  Numerical results of ADMM for models (3.1) and (3.2), m = n, sr = 0.5, r = 5.

Secondly, we test problems (3.1) and (3.2) with increasing noise levels: 0.0001, 0.001 and 0.01. The results are given in Table 2. Observing Table 2, it is clear that the minimum of the relative error "RelErr" is attained at a parameter μ that increases with σ for model (3.2), while it remains constant for model (3.1). This numerically illustrates that the optimal value of μ is almost independent of the noise level when least squares is replaced by square root least squares.

Table 2  Numerical results of ADMM for models (3.1) and (3.2), m = n, sr = 0.5, r = 5.

To further illustrate the advantage of model (3.1), we plot the relative errors of models (3.1) and (3.2) for varying parameter μ in Figures 1 and 2. The horizontal and vertical axes denote the value of μ and the relative error "RelErr", respectively. To avoid repetition, the relative errors of model (3.1) are plotted in three graphs, shown in Figure 1. From Figure 1 we see that the optimal value of μ for model (3.1) is almost constant (see the position of the minimum in each row). Observing each curve in Figure 2, it is clear that the optimal value of μ increases with the noise level σ. Based on this test, we conclude that model (3.1) is more stable than model (3.2) with respect to the noise level.

Fig. 1  Numerical results of model (3.1) with three noisy cases (m = n = 100, r = 5, sr = 0.5, σ = 0.01, 0.001, 0.0001).

Fig. 2  Numerical results of model (3.2) with three noisy cases (m = n = 100, r = 5, sr = 0.5, σ = 0.01, 0.001, 0.0001).

3.2. Nuclear norm and square root least squares problems

In this subsection, we report the efficiency of the ADMM_NNSR method for solving problem (1.7) under different experimental settings. First, we test it on problem (1.7) with Gaussian noise. In this test, we choose m = n from 100 to 1000 and set sr = 0.5, r = 5, σ = 0.01. The results are given in Table 3. From Table 3 we can clearly see that the model (3.1) remains efficient for high dimensional problems, and a higher accuracy is obtained than for the lower dimensional problems.

Table 3  Numerical results of ADMM_NNSR for problem (1.7), m = n, sr = 0.5, σ = 0.01.

Secondly, we test the ADMM_NNSR method as r increases from 5 to 30. The results are shown in Table 4. Observing Table 4, the accuracy of the solution is on the order of 10^{-4}; in other words, the problem is solved successfully. Moreover, the table shows that the ADMM_NNSR method needs more time as r increases, except for the case r = 15.

Table 4  Numerical results of ADMM_NNSR for problem (1.7) with different r, m = n, sr = 0.5.

Finally, we test the performance of the ADMM_NNSR method for solving problem (1.7) with different sr. The numerical results are shown in Figure 3. From Figure 3 we see that the higher the sampling ratio, the higher the accuracy obtained.

Fig. 3  ADMM_NNSR for problem (1.7) with Gaussian noise (m = n = 500, r = 5, sr = 0.3, 0.5, 0.7, 0.9, σ = 0.01).

§4. Conclusions

In this paper, we applied the ADMM_NNSR to problem (1.7), which is efficient for solving the high dimensional matrix restoration problem with unknown noise level. The convergence result of the proposed method was given as well. We then presented numerical experiments to illustrate the efficiency of the model and of the proposed algorithm. The numerical results show that the optimal parameter μ does not depend on the noise level for model (3.1), while it must increase with σ for model (3.2). Furthermore, model (3.1) is more efficient than model (3.2) in terms of accuracy and running time. Here we have focused on the performance of ADMM_NNSR for the matrix completion problem; in the near future, we will use the proposed method to solve more high-dimensional matrix estimation problems.
