
    Multiple Kernel Clustering Based on Self-Weighted Local Kernel Alignment

    Chuanli Wang, En Zhu, Xinwang Liu, Jiaohua Qin, Jianping Yin and Kaikai Zhao

    Computers, Materials & Continua, 2019, Issue 10 (published 2019-11-07)

    Abstract: Multiple kernel clustering based on local kernel alignment has achieved outstanding clustering performance by applying local kernel alignment to each sample. However, we observe that most existing works assume that every local kernel alignment contributes equally to clustering performance, whereas the local kernel alignment on different samples actually contributes differently. This assumption can therefore harm clustering performance. To address this issue, we design a multiple kernel clustering algorithm based on self-weighted local kernel alignment, which learns a proper weight for each local kernel alignment. Specifically, we introduce a new optimization variable, the weight, to denote the contribution of each local kernel alignment to clustering performance; the weights, kernel combination coefficients and cluster membership are then alternately optimized under the kernel alignment framework. In addition, we develop a three-step alternating iterative algorithm to solve the resulting optimization problem. Extensive experiments on five benchmark data sets have been conducted to evaluate the clustering performance of the proposed algorithm. The experimental results clearly demonstrate that the proposed algorithm outperforms typical multiple kernel clustering algorithms, which illustrates its effectiveness.

    Keywords: Multiple kernel clustering, kernel alignment, local kernel alignment, self-weighted.

    1 Introduction

    On the one hand, kernel-based clustering algorithms are simple and effective [Filippone, Camastra, Masulli et al. (2008); Tzortzis and Likas (2009)]; on the other hand, many typical clustering algorithms, such as spectral clustering and non-negative matrix factorization clustering, can be interpreted from the perspective of kernels [Dhillon, Guan and Kulis (2007); Ding, He, Simon et al. (2005)]. Therefore, kernel-based clustering has become a research hotspot in various applications [Gönen and Margolin (2014); Li, Qin, Xiang et al. (2015)]. Compared with a single kernel, multiple kernels can provide more useful and complementary information for clustering [Cai, Nie and Huang (2013); Cai, Jiao, Zhuge et al. (2018); Hou, Nie, Tao et al. (2017)]. Multiple kernel clustering (MKC) has therefore attracted more and more attention, and many MKC algorithms and their variants have been proposed recently [Han, Yang, Yang et al. (2018); Du, Zhou, Shi et al. (2015)].

    MKC algorithms aim to improve clustering performance by jointly optimizing a group of kernel combination coefficients and the cluster membership [Liu, Dou, Yin et al. (2016)]. According to the optimization framework used, existing MKC algorithms can be roughly grouped into two categories. The spirit of the first category is that the single kernel in the clustering objective is replaced with a combined kernel, and the optimal kernel combination coefficients and cluster membership are solved under the clustering framework. Algorithms in this category mainly include multiple kernel K-means [Huang, Chuang, Chen et al. (2012)], multiple kernel fuzzy C-means [Chen, Chen, Lu et al. (2011)], robust multiple kernel K-means [Du, Zhou, Shi et al. (2015)] and optimal neighborhood clustering [Liu, Zhou, Wang et al. (2017)]. In contrast, the idea of the second category is that the cluster membership is viewed as a pseudo label and put into the objective of kernel alignment [Wang, Zhao and Tian (2015)], a widely used learning criterion in supervised learning; the optimal kernel combination coefficients and pseudo label are then optimized under the multiple kernel learning framework. Along this line, Lu et al. [Lu, Wang, Lu et al. (2014)] proposed centered kernel alignment for multiple kernel clustering, Liu et al. [Liu, Dou, Yin et al. (2016)] proposed kernel alignment maximization for clustering, and Li et al. [Li, Liu, Wang et al. (2016)] proposed multiple kernel clustering based on local kernel alignment. Our work in this paper focuses on clustering algorithms belonging to the second category.

    Among the algorithms in the second category, multiple kernel clustering based on local kernel alignment (LKAMKC) obtains prominent clustering performance by using local kernels to exploit the local structure information of the data. Concretely, the sum of the objectives of all local kernel alignments is taken as the optimization objective of LKAMKC; that is, it conducts local kernel alignment on each sample.

    Although LKAMKC has achieved significant clustering performance, we observe that most existing works assume each local kernel alignment contributes equally to clustering performance; that is, each local kernel alignment is treated identically throughout the clustering process. Obviously, this assumption does not take the differences between local kernel alignments into account, which could hinder improvement of clustering performance. To address this issue, we propose a multiple kernel clustering algorithm based on self-weighted local kernel alignment. In detail, we introduce a weight variable to denote the contribution of each local kernel alignment to clustering performance, and then the weights, kernel combination coefficients and cluster membership are jointly optimized. The proposed algorithm improves clustering performance by imposing the learned weight on each local kernel alignment. We then develop a three-step alternating iterative optimization algorithm to solve the new optimization problem. Extensive experiments on five benchmark data sets have been conducted to evaluate the clustering performance of the proposed algorithm. The experimental results clearly show that it outperforms the compared methods, which illustrates its effectiveness.

    2 Related work

    In this section, we review related work on kernel alignment and local kernel alignment for multiple kernel clustering.

    2.1 Kernel alignment for multiple kernel clustering

    Suppose a data set with n samples has m kernel matrices {K_p}_{p=1,…,m} and needs to be divided into k clusters. Let H and K_μ denote the relaxed cluster membership matrix and the combined kernel, respectively. K_μ can be calculated as:

    where μ_p ≥ 0 denotes the combination coefficient of kernel matrix K_p in K_μ.
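    As a minimal numerical sketch, assuming the linear parameterization K_μ = Σ_p μ_p K_p (some MKC formulations square the coefficients instead), the combined kernel can be computed as:

```python
import numpy as np

def combined_kernel(kernels, mu):
    """Combine m base kernel matrices into K_mu = sum_p mu_p * K_p.

    kernels : list of (n, n) ndarrays, the base kernel matrices {K_p}
    mu      : (m,) ndarray of nonnegative coefficients summing to one
    """
    assert np.all(mu >= 0) and np.isclose(mu.sum(), 1.0)
    return sum(m_p * K_p for m_p, K_p in zip(mu, kernels))
```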

    According to Lu et al. [Lu, Wang, Lu et al. (2014)], HH^T can be regarded as a pseudo ideal kernel matrix. By substituting HH^T for the true ideal kernel, the objective of kernel alignment for multiple kernel clustering (KAMKC) can be expressed as:

    where 〈·,·〉_F denotes the Frobenius inner product of two matrices and μ = [μ_1, …, μ_m]. H^T H = I_k means H satisfies the orthogonality constraint, and μ^T 1_m = 1 means μ satisfies the one-norm constraint.
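    For intuition, the alignment criterion itself can be sketched as the normalized Frobenius inner product of two kernel matrices; this classical kernel alignment score is a simplified stand-in for the exact objective of Eq. (2):

```python
import numpy as np

def kernel_alignment(K1, K2):
    """Normalized Frobenius alignment <K1, K2>_F / (||K1||_F ||K2||_F).
    Equals 1 when the two kernels are proportional, 0 when orthogonal."""
    inner = np.sum(K1 * K2)                     # Frobenius inner product
    return inner / (np.linalg.norm(K1) * np.linalg.norm(K2))
```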

    Because Eq. (2) is too complicated to optimize directly, Liu et al. [Liu, Dou, Yin et al. (2016)] not only theoretically discuss the connection between KAMKC and multiple kernel K-means (MKKM) but also derive a simple and equivalent optimization objective of KAMKC based on MKKM. The new optimization formula of KAMKC can be written as:

    2.2 Local kernel alignment for multiple kernel clustering

    As seen from Eq. (2) or Eq. (3), KAMKC only utilizes the global structure information of the kernel and ignores its local structure information. Local kernel alignment for multiple kernel clustering (LKAMKC) enhances clustering performance by exploiting the local structure of each sample with a local kernel.

    Replacing the global K_μ, H and M with their local counterparts K_μ^(i), H^(i) and M^(i), respectively, the objective of local kernel alignment (LKA) on the i-th sample can be written as:

    By accumulating the objectives of all LKAs, the objective of LKAMKC can be written as:

    3 Multiple kernel clustering based on self-weighted local kernel alignment

    3.1 The proposed formulation

    As shown in Eq. (8), LKAMKC considers every local kernel alignment equally, inappropriately ignoring the differences between them. Thus, the contribution of each LKA to clustering performance is not properly exploited, which could hinder improvement of clustering performance. To address this issue, we introduce a new weight variable to denote the contribution of each local kernel alignment to clustering performance. The new optimization variable and the existing variables in Eq. (8) are jointly optimized. By imposing a weight on each local kernel alignment in Eq. (8), the formulation of the proposed multiple kernel clustering algorithm can be written as:

    where w = [w_1, w_2, …, w_n] is the weight vector of the local kernel alignments, and w^T 1_n = 1 means w must satisfy the one-norm constraint.

    3.2 Optimization

    Although the proposed algorithm introduces a new variable, the optimization problem in Eq. (9) can still be solved. Specifically, we propose a three-step alternating iterative method to optimize Eq. (9).

    (i)Optimizing H when μ and w are given

    Suppose the other two optimization variables are given beforehand; then Eq. (9) can be translated into the following optimization problem.

    Eq. (10) is a standard kernel k-means problem, and the optimal H consists of the k eigenvectors corresponding to the k largest eigenvalues of V.
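    A minimal sketch of step (i), assuming V is the symmetric matrix whose leading eigenvectors give the relaxed membership (as in the standard kernel k-means relaxation):

```python
import numpy as np

def optimal_H(V, k):
    """Relaxed cluster membership: the k eigenvectors of the symmetric
    matrix V belonging to its k largest eigenvalues."""
    eigvals, eigvecs = np.linalg.eigh(V)   # eigenvalues in ascending order
    return eigvecs[:, -k:]                 # columns for the k largest
```

    By construction H^T H = I_k, so the orthogonality constraint of the relaxation is satisfied automatically.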

    (ii)Optimizing μ when H and w are given

    If H and w are fixed, Eq. (9) is equivalent to a quadratic programming problem in μ.

    Eq. (11) can be solved efficiently by existing off-the-shelf packages.
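    As an illustration, a simplex-constrained QP of this shape can be handled by a generic solver; here Z is a hypothetical matrix collecting the quadratic terms of Eq. (11), not a quantity defined in the paper:

```python
import numpy as np
from scipy.optimize import minimize

def solve_mu(Z):
    """Solve  min_mu 0.5 * mu' Z mu  s.t. mu >= 0, sum(mu) = 1
    with SciPy's SLSQP solver (Z is an assumed placeholder for the
    data-dependent quadratic coefficients)."""
    m = Z.shape[0]
    res = minimize(
        fun=lambda mu: 0.5 * mu @ Z @ mu,
        x0=np.full(m, 1.0 / m),                 # start from uniform weights
        jac=lambda mu: Z @ mu,
        bounds=[(0.0, None)] * m,               # nonnegativity
        constraints=[{"type": "eq", "fun": lambda mu: mu.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x
```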

    (iii)Optimizing w when H and μ are given

    If H and μ are fixed, Eq. (9) is equivalent to the following optimization problem.

    Clearly, if a_i is greater than zero, Eq. (12) is a convex quadratic programming problem and has an analytic solution. By applying the KKT conditions to Eq. (12), the globally optimal w_i can be computed as follows.

    To prove that a_i is greater than zero, we only need to prove that the remaining term is nonnegative, because μ^T M^(i) μ must be greater than zero since M^(i) is a positive definite matrix.

    Proof: H^T H = I_k and HH^T H = H because H has orthonormal columns. Let h_i denote the i-th column of H, where 1 ≤ i ≤ k. Clearly HH^T h_i = h_i, which shows that HH^T has k eigenvalues equal to one and n − k eigenvalues equal to zero. Likewise, I_n − HH^T has n − k eigenvalues equal to one and k eigenvalues equal to zero, so I_n − HH^T is a positive semi-definite matrix. In addition, K_μ is a positive definite kernel matrix; therefore the corresponding term is nonnegative.

    Proof: We have A^(i) = A^(i) A^(i) and A^(i) = A^(i)^T since A^(i) = S^(i) S^(i)^T, so A^(i) − A^(i) HH^T A^(i) = A^(i)(I_n − HH^T) A^(i). Let y be an arbitrary vector. Clearly, y^T A^(i)(I_n − HH^T) A^(i) y ≥ 0 because (y^T A^(i))^T = A^(i) y and I_n − HH^T is positive semi-definite, as justified by Theorem 1. Therefore A^(i) − A^(i) HH^T A^(i) is a positive semi-definite matrix, and correspondingly the associated term is nonnegative.
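    The eigenvalue claim of the first proof can be checked numerically: for any H with orthonormal columns, I_n − HH^T has exactly n − k unit eigenvalues and k zero eigenvalues.

```python
import numpy as np

# Numerical check: draw a random H with H'H = I_k via QR, then inspect
# the spectrum of the projector complement I_n - HH'.
rng = np.random.default_rng(0)
n, k = 6, 2
Q, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal columns
P = np.eye(n) - Q @ Q.T                           # I_n - HH'
eig = np.sort(np.linalg.eigvalsh(P))              # k zeros, n-k ones
```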

    3.3 Analysis of convergence

    In the proposed algorithm, the neighborhood of samples is crucial, yet it is difficult to define exactly during clustering. To simplify the optimization problem, we keep the neighborhood of samples fixed throughout the optimization. As a result, Eq. (10) is a standard kernel k-means problem, and Eq. (11) and Eq. (12) are convex quadratic programming problems; each sub-problem is convergent. Besides, the objective of the proposed algorithm has a lower bound. Therefore, the proposed clustering algorithm is convergent. The following experimental results also illustrate its convergence.

    We use Algorithm 1 to describe the implementation of the proposed algorithm, where t is the number of iterations. The input of Algorithm 1 includes the kernel matrices, the number k of clusters, the regularization parameter λ and the convergence threshold θ. The output includes the relaxed cluster membership H, the kernel combination coefficients μ and the weight w of each local kernel alignment. The convergence condition of Algorithm 1 is that the difference between the last two objective values is less than θ.

    Algorithm 1: Multiple Kernel Clustering based on Self-weighted Local Kernel Alignment

    Input: kernel matrices {K_p}_{p=1,…,m}, the number k of clusters, regularization parameter λ and convergence threshold θ.

    Output: H, μ and w.

    Initialize A^(i) for all samples according to a τ-nearest-neighbor criterion.
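    A skeleton of the three-step alternation in Algorithm 1 might look as follows; solve_H, solve_mu, solve_w and objective are hypothetical sub-solvers standing in for Eqs. (10)-(12) and Eq. (9), passed in as functions rather than taken from the paper's code:

```python
import numpy as np

def swlka_mkc(kernels, k, lam, solve_H, solve_mu, solve_w, objective,
              theta=1e-4, max_iter=50):
    """Three-step alternating optimization skeleton for Algorithm 1.
    The four callables are assumed helpers implementing the paper's
    sub-problems; only the control flow is sketched here."""
    n, m = kernels[0].shape[0], len(kernels)
    mu = np.full(m, 1.0 / m)            # uniform kernel weights
    w = np.full(n, 1.0 / n)             # uniform local-alignment weights
    H = None
    obj_prev = np.inf
    for t in range(max_iter):
        H = solve_H(kernels, mu, w, k)        # step (i): eigen-decomposition
        mu = solve_mu(kernels, H, w, lam)     # step (ii): QP over the simplex
        w = solve_w(kernels, H, mu)           # step (iii): closed form (KKT)
        obj = objective(kernels, H, mu, w, lam)
        if abs(obj_prev - obj) < theta:       # stop when objectives agree
            break
        obj_prev = obj
    return H, mu, w
```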

    4 Experiment

    In this section, we conduct extensive experiments to evaluate the clustering performance of the proposed algorithm, and we compare it with many state-of-the-art MKC algorithms proposed recently.

    4.1 Data sets

    To conveniently and convincingly evaluate the clustering performance of the proposed algorithm, five benchmark data sets from multiple kernel learning are adopted in our experiments: Yale (https://vismod.media.mit.edu/vismod/classes/mas62200/datasets/), Digital (https://ss.sysu.edu.cn/py/), ProteinFold (https://mkl.ucsd.edu/dataset/protein-fold-prediction), Movement (https://archive.ics.uci.edu/ml/datasets/Libras+Movement) and Caltech102 (https://mkl.ucsd.edu/dataset/). The numbers of samples, kernels and classes of these data sets are listed in Tab. 1.

    Table 1:The details of data sets in our experiments

    4.2 Compared algorithms

    Local kernel alignment for multiple kernel clustering (LKAMKC) [Li, Liu, Wang et al. (2016)] is a strong baseline since the proposed clustering algorithm directly extends it. In addition, the compared algorithms include many related and state-of-the-art multiple kernel clustering algorithms: Multiple kernel K-means (MKKM) [Huang, Chuang, Chen et al. (2012)], Localized multiple kernel K-means (LMKKM) [Gönen and Margolin (2014)], Robust multiple kernel K-means (RMKKM) [Du, Zhou, Shi et al. (2015)], Co-regularized spectral clustering (CRSC) [Kumar and Daumé (2011)], Robust multi-view spectral clustering (RMSC) [Xia, Pan, Du et al. (2014)], Robust multiple kernel clustering (RMKC) [Zhou, Du, Shi et al. (2015)], Kernel alignment for multiple kernel clustering (KAMKC) [Liu, Dou, Yin et al. (2016)] and Optimal kernel clustering with multiple kernels (OKMKC) [Liu, Zhou, Wang et al. (2017)].

    4.3 Experiment setup

    For the Movement data set, 12 kernel matrices are computed according to Zhou et al. [Zhou, Du, Shi et al. (2015)], and the kernel matrices of the other data sets are downloaded from the respective websites. To eliminate differences between kernels, we make the diagonal elements of all kernel matrices equal to one by applying centering and scaling to the kernels [Cortes, Mohri and Rostamizadeh (2013)].
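    A sketch of this preprocessing, assuming standard feature-space centering followed by diagonal scaling so that every K_ii = 1:

```python
import numpy as np

def center_and_scale(K):
    """Center a kernel matrix in feature space, then scale it so that
    every diagonal entry equals one (K_ij / sqrt(K_ii * K_jj))."""
    n = K.shape[0]
    C = np.eye(n) - np.ones((n, n)) / n    # centering matrix
    Kc = C @ K @ C                         # centered kernel
    d = np.sqrt(np.clip(np.diag(Kc), 1e-12, None))  # guard tiny diagonals
    return Kc / np.outer(d, d)
```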

    The LKAMKC algorithm and the proposed algorithm share the same two parameters: the number of neighbors τ and the regularization parameter λ. For the number of neighbors, we respectively select the first kernel, the second kernel, the third kernel and the average kernel to measure the neighborhood of samples, and the optimal τ is obtained by grid search over [0.05, 0.1, …, 0.95] × n, where n is the number of samples. For the regularization parameter λ, the optimal value is chosen by grid search over [2^{-10}, …, 2^{10}]. For the other compared algorithms, the parameters are set according to the methods in the corresponding references.

    To objectively evaluate the performance of the clustering algorithms, in all experiments we use the true number of classes as the number of clusters, and we adopt clustering accuracy (ACC), normalized mutual information (NMI) and purity as indicators of clustering performance. All simulations of the proposed and compared algorithms are carried out in MATLAB 2013b on the Windows 8 operating system. To reduce the effect of the randomness caused by K-means as much as possible, we repeat each experiment 30 times and report the best result.
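    For reference, purity and ACC (with the Hungarian matching of clusters to classes) can be computed as follows; this is a generic sketch, not the paper's evaluation code:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def purity(y_true, y_pred):
    """Purity: each predicted cluster is credited with its majority class."""
    total = 0
    for c in np.unique(y_pred):
        _, counts = np.unique(y_true[y_pred == c], return_counts=True)
        total += counts.max()
    return total / len(y_true)

def clustering_accuracy(y_true, y_pred):
    """ACC: best one-to-one matching of clusters to classes (Hungarian)."""
    k = max(y_true.max(), y_pred.max()) + 1
    cost = np.zeros((k, k), dtype=int)
    for t, p in zip(y_true, y_pred):
        cost[p, t] += 1                        # contingency counts
    row, col = linear_sum_assignment(-cost)    # maximize matched counts
    return cost[row, col].sum() / len(y_true)
```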

    Table 2:Clustering performance of all algorithms on all data sets

    4.4 Experimental results

    Tab. 2 reports the best experimental results of the proposed algorithm and all compared algorithms, and Tab. 3 reports more detailed comparison results between the proposed algorithm and LKAMKC. In all experiments, the neighborhood of samples is fixed, but the criterion used to measure it is adjustable. In Tab. 2, LKAMKC and the proposed algorithm use the average combination kernel to measure the neighborhood of samples. In Tab. 3, LKAMKC-K1, LKAMKC-K2, LKAMKC-K3 and LKAMKC-A denote LKAMKC respectively adopting the first kernel, the second kernel, the third kernel and the average combination kernel to measure the neighborhood of samples; likewise, Proposed-K1, Proposed-K2, Proposed-K3 and Proposed-A denote the proposed algorithm with the same four criteria. From Tab. 2, we have the following observations.

    Table 3:The detailed comparison between the proposed algorithm and LKAMKC

    The clustering algorithms that utilize local kernels, including LKAMKC and the proposed algorithm, significantly outperform the compared ones that do not, among which OKMKC demonstrates the best performance. Taking ACC as an example, the proposed algorithm exceeds OKMKC by 4.02%, 10.92%, 3.53%, 3.47% and 6.29% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively. Similar conclusions can be drawn in terms of NMI and purity. This clearly indicates the importance of the local geometrical structure of data for clustering.

    In terms of the performance indicators ACC, NMI and purity, the proposed algorithm obtains the best clustering performance on all data sets. Taking ACC as an example, it exceeds LKAMKC, a strong baseline since the proposed algorithm directly extends it, by 0.99%, 3.23%, 3.53%, 3.01% and 4.13% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively. The excellent performance of the proposed algorithm in terms of NMI and purity can also be seen from Tab. 2, where similar observations can be made. This clearly shows the superiority of suitably weighting the local kernel alignments.

    From Tab.3,we can draw the following points:

    Both LKAMKC and the proposed algorithm are sensitive to the neighborhood of samples. Taking Digital as an example, for both LKAMKC and the proposed algorithm, using the third kernel to measure the neighborhood of samples achieves better performance than using the first kernel.

    Using the average kernel to measure the neighborhood of samples achieves better performance than using a single kernel. Taking ACC as an example, Proposed-A and LKAMKC-A exceed Proposed-K1 and LKAMKC-K1 by 0.06% and 0.08%, 3.45% and 3.20%, 5.75% and 2.65%, 4.51% and 2.85%, and 0.73% and 0.78% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively, which also shows that the combined kernel contains more information about the neighborhood of samples than a single kernel does.

    No matter which criterion is chosen to measure the neighborhood of samples, the proposed algorithm is always better than LKAMKC. Taking ACC as an example, Proposed-K1 exceeds LKAMKC-K1 by 1.82%, 1.35%, 2.62%, 0.97% and 2.19% on Yale, Digital, ProteinFold, Movement and Caltech102, respectively, which again confirms the superiority and effectiveness of the proposed algorithm.

    4.5 Parameter selection and convergence

    When applying the proposed algorithm to cluster data, two parameters, the number τ of nearest neighbors and the regularization parameter λ, need to be set manually. Tab. 3 has analyzed the effect of the neighborhood of samples on clustering performance. To evaluate the stability of the parameter λ, we select the average kernel to measure the neighborhood of samples, fix τ, and carry out a series of experiments on all data sets. The experimental results of the proposed algorithm and a baseline, the best result of LKAMKC under the same setting, are plotted in Fig. 1, from which the following observations can be made.

    1) The clustering performance of the proposed algorithm on all data sets is stable when λ varies over a wide range. 2) For Yale, λ = 2^{-1} is a watershed: if λ is below the watershed, the ACC of the proposed algorithm is higher than the baseline; otherwise it is lower. 3) For Digital and Caltech102, λ also has a watershed; in contrast, if λ is below the watershed the ACC of the proposed algorithm is lower than the baseline, and above it, higher. 4) For ProteinFold and Movement, the ACC of the proposed algorithm beats the baseline when λ lies in a bounded range; for instance, when 2^{-4.5} ≤ λ ≤ 2^{8.5}, the curve of the proposed algorithm is above that of the baseline.

    To validate the convergence of the proposed algorithm, we record its objective value at each iteration with the parameters τ and λ fixed. Fig. 2 plots the objective value against the iteration number. As seen from Fig. 2, the objective value decreases monotonically with the iterations, and the proposed algorithm converges quickly within eleven iterations, which confirms its convergence experimentally.

    Figure 1:The performance of the proposed algorithm with regard to parameter λ

    Figure 2:The convergence of the proposed algorithm

    5 Conclusions and future work

    In this paper, we propose a multiple kernel clustering algorithm based on self-weighted local kernel alignment, which improves clustering performance by more rationally exploiting the contribution of each local kernel alignment. A convergent three-step alternating optimization algorithm is developed to solve the resulting optimization problem. Extensive experiments on five benchmark data sets validate the effectiveness and superiority of the proposed algorithm.

    As shown in Eq. (8) and Eq. (9), both LKAMKC and the proposed algorithm utilize all local kernels for clustering. However, when the number of samples is large, clustering algorithms based on local kernel alignment are very time-consuming. Therefore, a fast version of the proposed algorithm suitable for big data sets [Xiao, Wang, Liu et al. (2018)] is worth studying in the future.

    Acknowledgement: This work was supported by the National Key R&D Program of China (No. 2018YFB1003203), the National Natural Science Foundation of China (Nos. 61672528, 61773392, 61772561), the Educational Commission of Hunan Province, China (No. 14B193) and the Key Research & Development Plan of Hunan Province (No. 2018NK2012).
