
    Trade-Off between Efficiency and Effectiveness: A Late Fusion Multi-View Clustering Algorithm

    2021-12-16 06:39:34 Yunping Zhao, Weixuan Liang, Jianzhuang Lu, Xiaowen Chen and Nijiwa Kong
    Computers Materials & Continua, 2021, Issue 3

    Yunping Zhao, Weixuan Liang, Jianzhuang Lu*, Xiaowen Chen and Nijiwa Kong

    1National University of Defence Technology, Changsha, China

    2Department of Mathematics, Faculty of Science, University of Sargodha, Sargodha, Pakistan

    Abstract: Late fusion multi-view clustering (LFMVC) algorithms aim to integrate the base partition of each single view into a consensus partition. Base partitions can be obtained by performing kernel k-means clustering on all views. This type of method is not only computationally efficient, but also more accurate than multiple kernel k-means, and is thus widely used in the multi-view clustering context. LFMVC improves computational efficiency to the extent that the computational complexity of each iteration is reduced from O(n^3) to O(n) (where n is the number of samples). However, LFMVC also limits the search space of the optimal solution, meaning that the clustering results obtained are not ideal. Accordingly, in order to obtain more information from each base partition and thus improve the clustering performance, we propose a new late fusion multi-view clustering algorithm with a computational complexity of O(n^2). Experiments on several commonly used datasets demonstrate that the proposed algorithm converges quickly. Moreover, compared with other late fusion algorithms with computational complexity of O(n), the actual time consumption of the proposed algorithm does not increase significantly. At the same time, comparisons with several other state-of-the-art algorithms reveal that the proposed algorithm also obtains the best clustering performance.

    Keywords: Late fusion; kernel k-means; similarity matrix

    1 Introduction

    Many real-world datasets are naturally made up of different representations or views. Given that these multiple representations often provide compatible and complementary information, it is clearly preferable to integrate them in order to obtain better performance rather than relying on a single view. However, because data represented by different views can be heterogeneous and biased, the question of how to fully utilize the multi-view information in order to obtain a consensus representation for clustering purposes remains a challenging problem in the field of multi-view clustering (MVC). In recent years, various kinds of algorithms have been developed to provide better clustering performance or higher efficiency; these include multi-view subspace clustering [1-3], multi-view spectral clustering [4-6], and multiple kernel k-means (MKKM) [7-11]. Among these, MKKM has been widely utilized due to its excellent clustering performance and strong interpretability.

    Depending on the stage of multi-view data fusion involved, the existing literature on MVC can be roughly divided into two categories. The first category, which can be described as 'early fusion', fuses the information of all views before clustering. Huang et al. [7] optimize an optimal kernel matrix as a linear combination of a set of pre-specified kernels for clustering purposes. To reduce the redundancy of the pre-defined kernels, Liu et al. [10] propose an MKKM algorithm with matrix-induced regularization. To further enhance the clustering performance, a local kernel alignment criterion is adopted for MKKM in [10]; moreover, the algorithm proposed in [11] allows the optimal kernel to be found in the neighborhood of the combinational kernels, which increases the search space of the optimal kernel.

    Although the above-mentioned early fusion algorithms have greatly improved the clustering accuracy of MKKM in different respects, they also have two drawbacks that should not be overlooked. The first is the higher computational complexity, i.e., usually O(n^3) per iteration (where n is the number of samples). The second relates to the over-complicated optimization processes of these methods, which increase the risk of the algorithm becoming trapped in poor local minima and thus lead to unsatisfactory clustering performance.

    The second category, termed 'late fusion', maximizes the alignment between the consensus clustering indicator matrix and the weighted base clustering indicator matrices under orthogonal transformations, in which each clustering indicator matrix is generated by performing clustering on each single view [12,13]. This type of algorithm improves efficiency and often clustering performance; however, these methods also simplify the search space of the optimal clustering indicator matrix, which ultimately limits the clustering performance.

    Accordingly, in this paper, we propose a new late fusion framework designed to extract more information from the base clustering indicator matrices. We first assume that each base clustering indicator matrix is a feature representation obtained after the samples are mapped from the original space to the feature space. Compared with the original features, this representation captures nonlinear relations between data [14]; moreover, this approach can also filter out some noise arising from the kernel matrix. Therefore, the similarity matrix constructed from the base clustering indicator matrices is superior to the original kernel matrix for clustering purposes.

    We refer to the proposed method as 'late fusion multi-view clustering with learned consensus similarity matrix' (LF-MVCS). LF-MVCS jointly optimizes the combination coefficients of the base clustering indicator matrices and the consensus similarity matrix. This consensus similarity matrix is then used as the input of spectral clustering [15] to obtain the final result. Furthermore, we design an effective algorithm for solving the resulting optimization problem, as well as analyzing its computational complexity and convergence. Extensive experiments on 10 multi-view benchmark datasets are then conducted to evaluate the effectiveness and efficiency of our proposed method.

    The contributions of this paper can be summarized as follows:

    ●The proposed LF-MVCS integrates the information of all views in a late fusion manner. It further jointly optimizes the combination coefficients of the base clustering indicator matrices and the consensus similarity matrix. To the best of our knowledge, this is the first time that a unified form has been obtained that can better reflect the similarity between samples via late fusion.

    ●The proposed optimization formulation has just one parameter to tune and avoids excessive time consumption. We therefore design an alternating optimization algorithm with proven convergence that can efficiently tackle the resultant problem. Our algorithm avoids complex computation owing to the smaller number of optimization variables, meaning that the objective function converges within fewer iterations.

    2 Related Work

    2.1 Multiple Kernel K-means

    In order to optimize Eq. (1), we can transform it as follows:

    Here, I_k is the identity matrix of size k×k. Because directly optimizing the discrete clustering indicator matrix is difficult, we relax it and impose an orthogonality constraint on H, i.e., H^T H = I_k. Finally, the optimal H for Eq. (3) can be obtained by taking the k eigenvectors corresponding to the k largest eigenvalues of K. Lloyd's algorithm is usually executed on H to obtain the clustering labels.

    Assume that the optimal kernel matrix can be represented as K_α = Σ_{p=1}^m α_p^2 K_p, where α = [α_1, …, α_m]^T denotes the kernel coefficients; we then obtain the optimization objective of MKKM as follows:

    A two-step iterative algorithm can be used to optimize Eq. (5).

    i) Optimize α with fixed H. Eq. (5) is equivalent to:

    where δ_p = Tr(K_p(I_n − HH^T)). According to the Cauchy-Schwarz inequality, we can obtain the closed-form solution to Eq. (6) as α_p = (1/δ_p) / Σ_{q=1}^m (1/δ_q).

    ii) Optimize H with fixed α. With the kernel coefficients α fixed, H can be optimized by taking the k eigenvectors corresponding to the k largest eigenvalues of K_α.
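The closed-form coefficient update in step i) can be sketched as follows. This is our own illustrative code, assuming the standard form δ_p = Tr(K_p(I − HH^T)) and α_p proportional to 1/δ_p as stated above:

```python
import numpy as np

def update_alpha(kernels, H):
    """Closed-form MKKM coefficient update: alpha_p is proportional to
    1/delta_p, where delta_p = Tr(K_p (I - H H^T)) is kernel p's loss."""
    delta = np.array([np.trace(K) - np.trace(H.T @ K @ H) for K in kernels])
    inv = 1.0 / delta
    return inv / inv.sum()          # normalize so the coefficients sum to 1

# Toy check: with H spanning the first two coordinates (n = 4, k = 2),
# delta is 4 - 2 = 2 for I_4 and 8 - 4 = 4 for 2*I_4.
H = np.eye(4)[:, :2]
alpha = update_alpha([np.eye(4), 2 * np.eye(4)], H)
```

Note that the kernel with the smaller clustering loss receives the larger weight, here alpha = [2/3, 1/3].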

    2.2 Multi-View Clustering via Late Fusion Alignment Maximization

    Multi-view clustering via late fusion alignment maximization [12] (MVC-LFA) assumes that the clustering indicator matrix obtained from each single view is H_p (p ∈ [m]), while a set of rotation matrices {W_p}_{p=1}^m are used to maximize the alignment of H_p (p ∈ [m]) and the optimal clustering indicator matrix H*. The objective function is as follows:

    where F denotes the average clustering indicator matrix, while λ is a trade-off parameter. Moreover, Tr(H*^T F) is a regularization term on the optimal clustering indicator matrix that prevents H* from drifting too far from the prior average partition. This formula assumes that the optimal clustering indicator matrix H* is a linear combination of the base clustering indicator matrices H_p (p ∈ [m]). Readers are encouraged to refer to [12] for more details regarding the optimization of Eq. (7).

    The drawback of MKKM is its excessive computational complexity, which is O(n^3). Although MVC-LFA does reduce the computational complexity from O(n^3) to O(n), it also over-simplifies the search space of the optimal solution, which in turn limits the clustering performance. We therefore propose a novel late fusion multi-view clustering algorithm in Section 3, which offers a trade-off between clustering efficiency and effectiveness.

    3 Multi-View Clustering Via Late Fusion Consensus Similarity Matrix

    3.1 Proposed Method

    Based on the assumption that the optimal consensus similarity matrix resides in the neighborhood of a linear combination of all similarity matrices, our model can be expressed as follows:

    The solution to the above formula is clearly equal to a certain S_p, such that multi-view information is not fully utilized. Accordingly, inspired by [9], we introduce a regularization term capable of improving the diversity of all views for learning purposes. We let M(p, q) = Tr(H_p H_p^T H_q H_q^T). A larger M(p, q) is associated with a higher degree of correlation between the p-th and q-th views. Let M ∈ R^{m×m} be the matrix that stores the diversity information of views, with M_pq = M(p, q). Thus, when M_pq is large, to avoid the information redundancy caused by similar views, we can set the view coefficients such that α_p α_q M_pq is small. Based on the above, we add the following regularization term to the formula:
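The correlation measure M can be sketched as follows. We assume the reading M(p, q) = Tr(H_p H_p^T H_q H_q^T), in the spirit of the matrix-induced regularization of [9] with each kernel replaced by its partition matrix H_p H_p^T; the helper name is ours:

```python
import numpy as np

def diversity_matrix(partitions):
    """M[p, q] = Tr(H_p H_p^T H_q H_q^T) = ||H_p^T H_q||_F^2.
    Large entries flag highly correlated (redundant) pairs of views."""
    m = len(partitions)
    M = np.zeros((m, m))
    for p in range(m):
        for q in range(m):
            M[p, q] = np.linalg.norm(partitions[p].T @ partitions[q], "fro") ** 2
    return M

# Two identical partitions and one orthogonal to both (n = 4, k = 2).
H1 = np.eye(4)[:, :2]
H2 = np.eye(4)[:, :2]
H3 = np.eye(4)[:, 2:]
M = diversity_matrix([H1, H2, H3])
```

For orthonormal H_p, identical views attain the maximum M_pq = k (here 2), while fully complementary views give M_pq = 0, matching the intent of penalizing α_p α_q M_pq.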

    At the same time, we aim to have S satisfy the basic properties of a similarity matrix. Hence, we impose certain constraints on S, namely: i) each entry of S is non-negative, i.e., S_ij ≥ 0; ii) the degree of each sample in S is equal to 1, i.e., S^T 1 = 1. We can accordingly obtain the following formula:

    The input of our model can be not only the clustering indicator matrices obtained by KKM, but also base partitions constructed by any other clustering algorithm. The model developed thus far represents a new multi-view clustering framework based on late fusion. Compared to other late fusion algorithms, such as [12,13], our objective formula has three key advantages, namely:

    Fewer variables need to be optimized, meaning that the total number of iterations is reduced and the algorithm converges quickly.

    Other late fusion algorithms hold that the optimal clustering indicator matrix is a neighbor of the average clustering indicator matrix; however, this is a heuristic assumption that may not hold true in some cases.By contrast, our assumptions in Eq.(9) are more reasonable and will be effective in any multi-view clustering task.

    We further introduce a regularization item to measure the correlation of views in order to increase both the diversity and the utilization rate of multi-view information.

    3.2 Alternate Optimization

    In the following, we construct an efficient algorithm designed to solve the optimization problem in Eq. (9). More specifically, we design a two-step algorithm to alternately solve the following subproblems:

    Optimization of α with fixed S.

    When S is fixed, the optimization problem in Eq.(9) is equivalent to the following optimization problem:
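As noted in Section 3.3, the α subproblem is a standard quadratic program with linear constraints. We do not reproduce Eq. (10) here; the sketch below solves a generic QP of that shape over the simplex {α : α ≥ 0, 1^T α = 1}, with all names our own:

```python
import numpy as np
from scipy.optimize import minimize

def solve_alpha_qp(Q, c):
    """Solve min 0.5 a^T Q a + c^T a  s.t.  a >= 0, sum(a) = 1, via SLSQP."""
    m = len(c)
    res = minimize(
        fun=lambda a: 0.5 * a @ Q @ a + c @ a,
        x0=np.full(m, 1.0 / m),                 # start from uniform weights
        jac=lambda a: Q @ a + c,
        bounds=[(0.0, None)] * m,               # non-negativity
        constraints=[{"type": "eq", "fun": lambda a: a.sum() - 1.0}],
        method="SLSQP",
    )
    return res.x

# With Q = I and c = 0 the minimizer over the simplex is the uniform vector.
alpha = solve_alpha_qp(np.eye(3), np.zeros(3))
```

For problems of this size (m views, typically m ≤ 50), a general-purpose solver is entirely adequate; this is consistent with the O(m^3) per-iteration cost stated later for this step.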

    Optimization of S with fixed α.

    When α is fixed, the optimization problem in Eq. (9) w.r.t. S can be rewritten as follows:

    Note that the subproblems in Eq. (11) for different columns j are independent of one another, meaning that we can optimize each column of S separately as follows:

    where γ and τ are the Lagrangian multipliers.

    According to its Karush-Kuhn-Tucker (KKT) conditions [17], we can obtain the optimal solution S* in the form of Eq. (15):

    Here, (x)_+ is a thresholding function that sets the elements of x that are less than 0 to 0, while γ* ∈ R is chosen such that 1^T s_j* = 1.
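The closed-form solution above is exactly the Euclidean projection of a column onto the probability simplex; a standard sort-based sketch (our own naming), in which the threshold plays the role of γ*:

```python
import numpy as np

def project_to_simplex(v):
    """Return s = (v - gamma)_+ with gamma chosen so that s >= 0, 1^T s = 1."""
    u = np.sort(v)[::-1]                         # sort entries descending
    css = np.cumsum(u) - 1.0
    idx = np.arange(1, len(v) + 1)
    rho = np.nonzero(u - css / idx > 0)[0][-1]   # last entry kept positive
    gamma = css[rho] / (rho + 1.0)               # the Lagrange threshold
    return np.maximum(v - gamma, 0.0)

s = project_to_simplex(np.array([0.9, 0.6, -0.5]))
# s = [0.65, 0.35, 0.0]: non-negative and summing to 1
```

Each projection costs O(n log n) for the sort (O(n) variants exist), consistent with the per-column cost discussed in Section 3.3.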

    The algorithm is summarized in detail in Algorithm 1.

    Algorithm 1: Late fusion multi-view clustering with learned consensus similarity matrix

    Input: base clustering indicator matrices {H_p}_{p=1}^m, number of clusters k, regularization parameter λ, allowable error ε.
    Output: consensus similarity matrix S.
    1: Initialize α = [1/m, …, 1/m].
    2: repeat
    3:   Update α with S fixed by solving Eq. (10).
    4:   Update each column of S with α fixed via Eq. (13).
    5: until the decrease of the objective in Eq. (9) is below ε.
    6: Perform spectral clustering on S to obtain the final clustering result.

    3.3 Complexity and Convergence

    Computational Complexity. Obtaining the base clustering indicator matrices requires calculating the eigenvectors corresponding to the k largest eigenvalues of the m kernel matrices, leading to a complexity of O(kmn^2). The proposed iterative algorithm further requires the optimization of the two variables α and S. In each iteration, the optimization of α via Eq. (10) requires solving a standard quadratic programming problem with linear constraints, the complexity of which is O(m^3). Updating each s_j by Eq. (13) requires O(n) time; since n columns must be updated, obtaining S costs O(n^2) per iteration. Overall, the total complexity of Algorithm 1 is O((m^3 + n^2)t), where t is the number of iterations. Finally, executing the standard spectral clustering algorithm on S takes O(n^3) time.

    Convergence. In each optimization iteration of the proposed algorithm, the updates of α and S both monotonically decrease the value of the objective in Eq. (9). Moreover, this objective is lower-bounded by zero. Consequently, our algorithm is guaranteed to converge to a local optimum of Eq. (9).
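The alternating scheme with its monotone-descent stopping rule can be sketched generically as follows. The toy objective and its closed-form block updates are entirely our own, chosen only to illustrate the descent-and-tolerance pattern:

```python
import numpy as np

def alternate_minimize(update_a, update_s, objective, a0, s0,
                       tol=1e-4, max_iter=100):
    """Alternate two block updates until the decrease of the objective
    falls below tol; each update must not increase the objective."""
    a, s = a0, s0
    history = [objective(a, s)]
    for _ in range(max_iter):
        a = update_a(s)
        s = update_s(a)
        history.append(objective(a, s))
        if abs(history[-2] - history[-1]) <= tol * max(1.0, abs(history[-2])):
            break
    return a, s, history

# Toy biconvex objective f(a, s) = (a - 1)^2 + (s - a)^2.
f = lambda a, s: (a - 1.0) ** 2 + (s - a) ** 2
a, s, hist = alternate_minimize(
    update_a=lambda s: (1.0 + s) / 2.0,   # argmin over a with s fixed
    update_s=lambda a: a,                 # argmin over s with a fixed
    objective=f, a0=0.0, s0=0.0)
```

On this toy problem the objective shrinks by a factor of four per sweep, so the tolerance test fires after a handful of iterations, mirroring the fast convergence reported in Section 4.5.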

    4 Experiments

    4.1 Datasets and Experimental Settings

    Several benchmark datasets are utilized to demonstrate the effectiveness of the proposed method: namely, Flower17 (www.robots.ox.ac.uk/~vgg/data/flowers/17/), Flower102 (www.robots.ox.ac.uk/~vgg/data/flowers/102/), PFold (mkl.ucsd.edu/dataset/protein-fold-prediction), Digit (http://ss.sysu.edu.cn/py/), and Cal (www.vision.caltech.edu/Image_Datasets/Caltech101/). Six sub-datasets, constructed by selecting the first 5, 10, 15, 20, 25 and 30 classes respectively from the Cal data, are also used in our experiments. More detailed information regarding these datasets is presented in Tab. 1. From this table, we can observe that the number of samples, views and categories of these datasets range from 510 to 8189, 3 to 48, and 5 to 102, respectively.

    4.2 Comparison with State-of-the-Art Algorithms

    In our experiments,LF-MVCS is compared with several state-of-the-art multi-view clustering methods,as outlined below.

    Average multiple kernel k-means (A-MKKM): The kernel matrix, which is a linear combination of the base kernels with uniform coefficients, is taken as the input of the standard kernel k-means algorithm.

    Single best kernel k-means (SB-KKM): Standard kernel k-means is performed on each single kernel, after which the best result is reported.

    Multiple kernel k-means (MKKM)[7]: The algorithm jointly optimizes the consensus clustering indicator matrix and the kernel coefficients.

    Robust multiple kernel k-means (RMKKM) [18]: RMKKM learns a robust low-rank kernel for clustering by capturing the structure of noise in multiple kernels.

    Multiple kernel k-means with matrix-induced regularization (MKKM-MR)[9]: This algorithm introduces a matrix-induced regularization designed to reduce the redundancy of the kernels and enhance their diversity.

    Optimal neighborhood kernel clustering with multiple kernels (ONKC) [11]: ONKC allows the optimal kernel to reside in the neighborhood of a linear combination of the base kernels, thereby effectively enlarging the search space of the optimal kernel.

    Multi-view clustering via late fusion alignment maximization (MVC-LFA)[12]: MVC-LFA proposes maximally aligning the consensus clustering indicator matrix with the weighted base clustering indicator matrix.

    Simple multiple kernel k-means (SMKKM)[19]: SimpleMKKM, or SMKKM, re-formulates the MKKM problem as a minimization-maximization problem in the kernel coefficients and the consensus clustering indicator matrix.

    4.3 Experimental Settings

    In our experiments, all base kernels are first centered and then normalized; therefore, for all samples x_i and the p-th kernel, we have K_p(x_i, x_i) = 1. For all datasets, the true number of clusters is known, and we set this to be the true number of classes. In addition, the parameters of RMKKM, MKKM-MR and MVC-LFA are selected by grid search according to the methods suggested in their respective papers. For the proposed algorithm, the regularization parameter λ is chosen from a sufficiently large range [2^-15, 2^-12, …, 2^15] by means of grid search. We set the allowable error ε to 1×10^-4, which is the termination condition of Algorithm 1.
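This kernel preprocessing step can be sketched as follows (our own helper name): centering in feature space, then scaling so that every diagonal entry, i.e., K_p(x_i, x_i), equals 1:

```python
import numpy as np

def center_and_normalize_kernel(K):
    """Center the kernel in feature space, then scale so the diagonal is 1."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # feature-space centering
    d = np.sqrt(np.diag(Kc))
    return Kc / np.outer(d, d)                   # K[i,j] / sqrt(K[i,i] K[j,j])

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
Kn = center_and_normalize_kernel(X @ X.T)
```

Because the centered kernel remains positive semi-definite, Cauchy-Schwarz guarantees every normalized entry lies in [-1, 1].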

    The widely used metrics of clustering accuracy (ACC), normalized mutual information (NMI) and purity are applied to evaluate the clustering performance of each algorithm.For all algorithms, we repeat each experiment 50 times with random initialization (to reduce randomness caused by k-means in the final spectral clustering) and report the average result.All experiments are performed on a desktop with Intel(R) Core(TM)-i7-7820X CPU and 64 GB RAM.
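Of these metrics, purity has the simplest definition: each predicted cluster is credited with its majority ground-truth class. A minimal sketch with our own helper name:

```python
import numpy as np

def purity(y_true, y_pred):
    """Fraction of samples assigned to the majority true class of their cluster."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    correct = 0
    for c in np.unique(y_pred):
        members = y_true[y_pred == c]
        correct += np.bincount(members).max()   # size of the majority class
    return correct / len(y_true)

score = purity([0, 0, 1, 1], [0, 0, 0, 1])
# score = 0.75: cluster 0 holds {0, 0, 1} (majority 2), cluster 1 holds {1}
```

Note that purity is invariant to how cluster indices are named, which is why a relabeling such as y_pred = [1, 1, 0, 0] against y_true = [0, 0, 1, 1] still scores 1.0.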

    4.4 Experimental Results

    Tab.2 reports the clustering performance of the above-mentioned algorithms on all datasets.From these results, we can make the following observations:

    First, the proposed algorithm consistently demonstrates the best NMI on all datasets. For example, it exceeds the second-best method (MVC-LFA) by 8.56% on Flower17. The superiority of our algorithm is also confirmed by the ACC and purity results reported in Tab. 2.

    Table 2: Empirical evaluation and comparison of the proposed algorithm with eight baseline methods on 11 datasets in terms of clustering accuracy (ACC), normalized mutual information (NMI) and purity

    As a strong baseline, the recently proposed SMKKM [19] outperforms the other early-fusion multiple kernel clustering methods in the comparison. However, the proposed algorithm significantly and consistently outperforms SMKKM by 9.80%, 3.11%, 2.27%, 10.21%, 3.02%, 6.52%, 4.66%, 4.72%, 7.11% and 5.05% respectively in terms of NMI on all datasets in the order listed in Tab. 1.

    The late fusion algorithm MVC-LFA also outperforms most other algorithms; however, the proposed algorithm outperforms MVC-LFA by 8.56%, 4.78%, 4.63%, 12.74%, 2.36%, 5.70%, 3.68%, 4.04%, 6.55% and 4.55% respectively in terms of NMI on all datasets in the order listed in Tab. 1.

    We further study the clustering performance of each algorithm on Cal as the number of classes varies, as shown in Fig. 1. As can be observed from the figure, our algorithm consistently lies at the top of all sub-figures as the number of classes varies, indicating that it achieves the best performance.

    Figure 1: ACC,NMI and purity comparison with variations in number of classes on Cal.(left)ACC;(mid)NMI;(right) purity

    In summary, our proposed LF-MVCS demonstrates superior clustering performance over all SOTA multi-view clustering algorithms.From the above experiments, we can therefore conclude that the proposed algorithm effectively learns the information from all base clustering indicator matrices, bringing significant improvements to the clustering performance.

    4.5 Runtime,Convergence and Parameter Sensitivity

    Runtime. To investigate the computational efficiency of the proposed algorithm, we record the running times of the various algorithms on the benchmark datasets and report them in Fig. 2. The computational complexity of the proposed LF-MVCS is O(n^2); as can be observed, LF-MVCS has slightly higher computational complexity than the existing MVC-LFA (with a computational complexity of O(n)). At the same time, however, the runtime of LF-MVCS is lower than that of the other kernel-based multi-view clustering algorithms.

    Figure 2: Runtime of different algorithms on eight benchmark datasets(in seconds).The experiments are conducted on a PC with Intel(R) Core(TM)-i7-7820X CPU and 64 GB RAM in the MATLAB environment.LF-MVCS is comparably fast relative to the alternatives while also providing superior performance when compared with most other multiple kernel k-means based algorithms.Results for other datasets are omitted due to space limitations

    Algorithm Convergence. We next empirically examine the convergence of our algorithm. Fig. 3 shows the variation of the objective in Eq. (9) with the number of iterations on four datasets: namely, Flower17, ProteinFold, Digit, and Cal-30. It can be seen that the objective function converges within very few iterations, demonstrating that having fewer optimization variables allows the algorithm to converge faster, which makes it more efficient than the alternatives.

    Figure 3: The objective value of our algorithm on four datasets at each iteration

    Parameter Sensitivity. The proposed algorithm introduces one hyper-parameter, i.e., the balancing coefficient λ of the diversity regularization term. The performance variation against λ is illustrated in Fig. 4. As the figure shows: i) λ is effective in improving the algorithm's performance; ii) the algorithm achieves good performance over a wide range of parameter settings.

    Figure 4: Illustration of parameter sensitivity against λ.The red curves represent the results of the proposed algorithm,while the blue lines indicate the second-best performance on the corresponding dataset

    5 Conclusion

    This paper proposes a simple but effective algorithm, LF-MVCS, which learns a consensus similarity matrix from all base clustering indicator matrices. We find that each base clustering indicator matrix is a better feature representation than the original features and kernels. Based on this finding, we derive a simple novel optimization objective for multi-view clustering, which learns a consensus similarity matrix to facilitate better clustering performance. An efficient algorithm is developed to solve the resultant optimization problem. LF-MVCS significantly outperforms all SOTA methods in terms of clustering performance. Moreover, our algorithm also reduces the computational cost on benchmark datasets when compared with other kernel-based multi-view clustering methods. In future work, we plan to extend our algorithm to a general framework, then use this framework as a platform to revisit existing multi-view algorithms in order to further improve the fusion method.

    Funding Statement: This paper and its research results were supported by the Hunan Provincial Science and Technology Plan Project under Grant No. 2018XK2102. Y. P. Zhao, W. X. Liang, J. Z. Lu and X. W. Chen all received this grant.

    Conflicts of Interest:The authors declare that they have no conflicts of interest to report regarding the present study.
