
    Kernel clustering-based K-means clustering

    2017-07-14 07:48:12

    ZHU Changming, ZHANG Mo

    Abstract: In order to process data sets that are linearly inseparable or have complicated structure, the Kernel Clustering-based K-means Clustering (KCKC) method is proposed. The method clusters patterns in the original space and then maps them into the kernel space by Radial Basis Function (RBF) kernels so that the relationships between patterns are largely preserved. The method is applied to the RBF-based Neural Network (RBFNN), the RBF-based Support Vector Machine (RBFSVM), and the Kernel Nearest Neighbor Classifier (KNNC). The results validate that the proposed method generates more effective kernels, saves kernel generation time in the kernel space, avoids sensitivity to the setting of the number of kernels, and improves classification performance.

    Key words: kernel clustering; K-means clustering; Radial Basis Function (RBF); Support Vector Machine (SVM)


    0 Introduction

    Clusters (i.e., kernels) play an important role in data analysis. Clustering is an unsupervised learning method that classifies patterns by their feature vectors. K-means Clustering (KC)[1], agglomerative hierarchical clustering[2] and fuzzy-based clustering[3] are classical clustering methods, but they have known disadvantages[4]. First, they are limited to specific domains. Second, they often suffer from high complexity. Furthermore, patterns may be covered by wrong clusters, and these methods cannot process nonlinear problems well.

    In order to overcome these disadvantages, some scholars proposed the Kernel-based KC (KKC)[5]. In this method, the patterns to be classified in the original space are mapped into a high-dimensional kernel space so that they become linearly separable (or nearly so), and KC is then carried out in the kernel space. If the nonlinear mapping is continuous and smooth, the topological structures and orders of the patterns are maintained in the kernel space, so patterns that belong together in one cluster are also grouped together in the kernel space. The nonlinear mapping highlights the differences between pattern characteristics, and linearly inseparable patterns in the original space become linearly separable. During clustering, a similarity metric (the Euclidean distance between patterns in the kernel space) and an objective function (the least square error) are defined in the kernel space. Thus, for linearly inseparable patterns as well as asymmetrically distributed patterns, kernel-based clustering can be adopted to reduce clustering error and obtain a better clustering effect. To solve the problem of unsupervised learning methods, GAO et al.[6] proposed a kernel clustering method that generates clusters for each class and covers patterns with the right class label.
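    As a concrete illustration of how such a mapping works, consider the XOR pattern, which is linearly inseparable in the plane; mapping it through two RBF kernels makes the two classes linearly separable. A minimal sketch (the centers and width are illustrative choices, not values from the paper):

```python
import numpy as np

def rbf_features(X, centers, sigma=1.0):
    # Map each pattern to its RBF responses to the given centers:
    # phi_c(x) = exp(-||x - c||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# XOR: class A = {(0,0), (1,1)} and class B = {(0,1), (1,0)} cannot be
# separated by any line in the original 2D space.
X = np.array([[0., 0.], [1., 1.], [0., 1.], [1., 0.]])
centers = np.array([[0., 0.], [1., 1.]])   # illustrative kernel centers
phi = rbf_features(X, centers, sigma=1.0)
# In the mapped space, the sum of the two features already separates
# the classes: class A scores higher than class B.
scores = phi.sum(axis=1)
```

    Here the separating hyperplane in the kernel space is simply a threshold on the sum of the two RBF responses, which has no linear counterpart in the original space.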

    KCKC combines the merits of the kernel clustering of Ref. [6] and KKC, updating the clusters by iterating between the original space and the kernel space. Namely, the patterns are first clustered in the original space, which gives the centers and radii of the clusters. These centers and radii are then used as the parameters of Radial Basis Functions (RBFs). The patterns in the original space are mapped into the kernel space by these RBFs, KKC is carried out in the kernel space, and the new membership of every cluster is obtained. The clusters are updated with the new memberships in the original space, yielding new centers and radii. The mapping and KKC are then carried out again until the centers and radii of the clusters in the kernel space no longer change. The proposed method is called Kernel Clustering-based KC (KCKC).

    The rest of this paper is organized as follows. First, the original KKC and the kernel clustering method are reviewed. Then, the framework of KCKC and its application to some classical methods are given. Finally, conclusions and future work are presented.

    1 KKC and kernel clustering

    1.1 KKC

    The original KC[7] aims to divide the patterns into K parts by unsupervised learning; its steps are shown in Ref. [7]. The objective is to minimize the objective function $J=\sum_{k=1}^{K}\sum_{i=1}^{N_k}\|x_i-m_k\|^2$, where K means that there are K classes to cover these patterns, $N_k$ is the number of patterns in the kth class, $x_i$ is the ith pattern in the kth class, and $m_k$ is the center of the kth class. When the patterns are linearly inseparable in the original space, the kernel trick is useful. A nonlinear mapping $\Phi:\mathbb{R}^n\to F,\;x\mapsto\Phi(x)$ is used to map the patterns in the original space into the kernel space, and the objective becomes $J=\sum_{k=1}^{K}\sum_{i=1}^{N_k}\|\Phi(x_i)-m_k^{\Phi}\|^2$, where $\Phi(x_i)$ is the mapped representation of pattern $x_i$ and $m_k^{\Phi}$ is the center of the kth class in the kernel space. In the kernel space, $\|\Phi(x)-m_k^{\Phi}\|^2=k(x,x)-\frac{2}{N_k}\sum_{i=1}^{N_k}k(x,x_i)+\frac{1}{N_k^2}\sum_{i,j=1}^{N_k}k(x_i,x_j)$, where $k(\cdot,\cdot)$ is a kernel function. This K-means clustering is also called KKC[7].
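    The kernel-space distance above depends only on kernel evaluations, so it can be computed from the Gram matrix without forming $\Phi(x)$ explicitly. A minimal sketch (function names and the RBF kernel choice are illustrative):

```python
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel values k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def kernel_distance_to_center(K, members, i):
    # ||phi(x_i) - m_k^phi||^2 from kernel evaluations only:
    # k(x_i,x_i) - (2/N_k) sum_j k(x_i,x_j) + (1/N_k^2) sum_{j,l} k(x_j,x_l),
    # where `members` indexes the N_k patterns of cluster k.
    Nk = len(members)
    return (K[i, i]
            - 2.0 / Nk * K[i, members].sum()
            + K[np.ix_(members, members)].sum() / Nk ** 2)
```

    KKC assigns each pattern to the cluster minimizing this quantity and iterates, exactly as ordinary K-means does with Euclidean distances.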

    1.2 Kernel clustering

    Kernel clustering always performs well on nonlinear data sets. GAO et al.[6] proposed the Kernel Nearest Neighbor Classifier (KNNC) and introduced an original clustering method. By KNNC, an n-class problem can be transformed into n two-class problems. In each, one class is clustered as the object class, and the patterns in this class are called the object patterns. The other classes are called non-object classes, and the patterns in non-object classes are called outliers or non-object patterns. For the object class $C_A$ in each two-class problem, $K_A$ kernels (also called clusters) are used to cover the object patterns. After all n two-class problems are finished, n groups of kernels come into being. Then, for a test pattern, the minimum quadratic surface distance $D_q^{(A)}$ from the existing $K_A$ kernels in $C_A$ is computed. If there are n classes, there are n minimum quadratic surface distances, and the minimum one is chosen as the distance from the test pattern to the existing kernels. Finally, the pattern is labeled according to the class label of the nearest kernel.

    2 KCKC

    KCKC consists of 5 steps. (1) Use the kernel clustering to cover the patterns from n classes with some clusters. (2) Use the centers and radii of the clusters as the parameters of RBFs: for the C clusters, $\mu_c$ and $\sigma_c$ denote the center and radius of cluster $P_c$, c = 1, 2, …, C. (3) Map the patterns by these clusters: $k(x,P_c)=\exp(-\|x-\mu_c\|/\sigma_c)$ is used to compute the closeness between each pattern and the center of cluster $P_c$, and the pattern is put into the nearest cluster. Here, a cluster with such an RBF-like expression is also called a kernel. (4) Return to the original space and use the new memberships of the clusters to update the centers and radii. (5) In the kernel space, repeat the previous two steps until the centers of the clusters no longer change.
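    The five steps above can be sketched as follows. This is a simplified sketch, assuming a random initial clustering instead of the kernel clustering of Ref. [6], and taking a cluster's radius as its maximum member-to-center distance; all names are illustrative:

```python
import numpy as np

def kckc(X, C, n_iter=20, seed=0):
    """Sketch of KCKC: alternate between cluster statistics in the original
    space and assignment by RBF response k(x, P_c) = exp(-||x - mu_c|| / sigma_c)."""
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, C, size=len(X))         # step 1: initial clusters
    for _ in range(n_iter):
        mus, sigmas = [], []
        for c in range(C):                           # steps 2/4: centers and radii
            members = X[labels == c]
            if len(members) == 0:                    # re-seed an empty cluster
                members = X[rng.integers(0, len(X), 1)]
            mu = members.mean(axis=0)
            sigma = max(np.linalg.norm(members - mu, axis=1).max(), 1e-6)
            mus.append(mu); sigmas.append(sigma)
        mus, sigmas = np.array(mus), np.array(sigmas)
        # step 3: RBF response to each cluster; nearest = largest response
        resp = np.exp(-np.linalg.norm(X[:, None, :] - mus[None, :, :], axis=2) / sigmas)
        new_labels = resp.argmax(axis=1)
        if np.array_equal(new_labels, labels):       # step 5: stop when stable
            break
        labels = new_labels
    return labels, mus, sigmas
```

    Because the assignment uses the RBF responses rather than raw Euclidean distances, a wide cluster can claim patterns that a narrow but slightly closer cluster would otherwise capture.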

    3 Application of KCKC to other methods

    In order to validate the effectiveness of KCKC, the RBF-based Neural Network (RBFNN)[8-9], the RBF-based Support Vector Machine (RBFSVM)[10-11] and KNNC[6] with kernels from KCKC, KKC and KC are compared. For the kernels from these clustering methods, the RBF kernel $k(x_p,x_q)=\exp\!\left(-\frac{\|x_p-x_q\|^2}{2\sigma^2}\right)$ is adopted. The difference between them is the setting of the parameter σ (the kernel width). For KKC, σ is set to the average of all the pairwise l2-norm distances $\|x_i-x_j\|_2$, i, j = 1, 2, …, N, between patterns, as used in Refs. [12-15]. Here, N is the number of patterns and $x_i$ is the ith pattern in the training set. For the others, σ equals the width of a cluster.
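    The KKC width setting described above can be computed directly; a small sketch, assuming the average is taken over distinct pairs i < j:

```python
import numpy as np

def average_pairwise_sigma(X):
    # sigma = mean of ||x_i - x_j||_2 over all distinct pairs i < j.
    diffs = X[:, None, :] - X[None, :, :]
    d = np.sqrt((diffs ** 2).sum(-1))          # full N x N distance matrix
    return d[np.triu_indices(len(X), k=1)].mean()  # strict upper triangle
```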

    3.1 Kernel clustering with RBFNN

    The principle of RBFNN is shown in Refs. [8-9]. For an input vector $x=\{x_1, x_2, …, x_d\}$, RBFNN uses $\sum_{r=1}^{d}w_{rj}x_r=f(h_j)$ to compute the input of the jth kernel and $\sum_{j=1}^{L}w_{jk}f(h_j)=g(o_k)$ to decide the input of the kth output unit; the pattern is then labeled with the class whose output is the largest. In RBFNN, an RBF is used as the activation function of the hidden units. By RBFs, a d-dimensional vector can be mapped into an L-dimensional kernel space; if L is larger than d, a nonlinearly separable problem in the d-dimensional space may become linearly separable in the L-dimensional space.
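    A simplified sketch of this forward pass (RBF hidden layer, linear output layer) is shown below; the kernel centers and widths (e.g., produced by KCKC) and the output weight matrix W are assumed given, and all names are illustrative:

```python
import numpy as np

def rbfnn_forward(x, mus, sigmas, W):
    """Forward pass of an RBF network: hidden activations are the RBF
    responses to the L kernels; outputs are their weighted sums.
    W has shape (L, n_classes); returns the index of the largest output."""
    h = np.exp(-np.linalg.norm(x - mus, axis=1) ** 2 / (2 * sigmas ** 2))
    o = h @ W
    return int(o.argmax())
```

    In this sketch the hidden layer applies the RBFs to the input directly; only the output weights W need to be trained, e.g., by least squares.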

    3.2 KCKC with RBFSVM

    In the original RBFSVM kernel, the width and the center are fixed for each kernel expression, which is not flexible. In Ref. [16], a clustering method is used to obtain kernels that cover all patterns, each kernel covering patterns of only one class, where the kernels have different widths σ and centers μ. The generated kernels are used as newly produced patterns $N_1, N_2, …, N_{N_{CLU}}$, where $N_{CLU}$ is the number of kernels. The label of each new pattern is the original class label of the patterns covered by the corresponding kernel. The $N_{CLU}$ new patterns, one per kernel, are used to train the SVM. In our proposed method, the kernels generated by KCKC replace those in Ref. [16], and experimental results show that this method outperforms SVM with the original RBF kernels.
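    The construction of the reduced training set can be sketched as follows; the tuple layout and function name are assumptions for illustration:

```python
import numpy as np

def clusters_to_training_set(clusters):
    """Build the reduced SVM training set described above: each generated
    kernel (center mu, width sigma, class label) becomes one new pattern.
    `clusters` is a list of (mu, sigma, label) tuples."""
    X_new = np.array([mu for mu, _, _ in clusters])
    y_new = np.array([lab for _, _, lab in clusters])
    widths = np.array([sig for _, sig, _ in clusters])  # per-kernel RBF widths
    return X_new, y_new, widths

# An SVM with an RBF kernel (using the per-kernel widths instead of one
# global sigma) would then be trained on (X_new, y_new) rather than on
# the full original training set.
```

    Since the number of kernels is usually far smaller than the number of patterns, the SVM training problem shrinks accordingly.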

    3.3 KCKC with KNNC

    KNNC labels a pattern according to the minimum distance between the pattern and the surfaces of the existing kernels, where a kernel is a cluster that covers some patterns with the same class label. After kernel clustering, KNNC is used on the test set. As in NNC, where a test pattern takes the class of its nearest pattern, in KNNC the distance from the test pattern to each kernel is computed with the chosen distance measure, and the test pattern is classified into the class of the nearest kernel. Here, the distance from a pattern to a kernel is defined as $D=d_{p\to c}-K_r$, where $d_{p\to c}$ is the distance from the pattern to the center of the kernel and $K_r$ is the radius of the kernel. Furthermore, KNNC is carried out in the kernel space with kernels from KCKC; the difference is that the center and radius of the new kernel and the distance must be computed in the kernel space.
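    The decision rule above can be sketched in a few lines (names are illustrative); the surface distance is the distance to the kernel center minus the kernel radius:

```python
import numpy as np

def knnc_predict(x, centers, radii, labels):
    """Label x by the nearest kernel surface: D = d_{p->c} - K_r, i.e.,
    distance to each kernel center minus that kernel's radius."""
    d = np.linalg.norm(centers - x, axis=1) - radii
    return labels[int(d.argmin())]
```

    Note that subtracting the radius lets a large kernel win over a slightly closer small one, which is the point of measuring distance to the surface rather than to the center.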

    3.4 Results from different kernels applied to RBFNN, RBFSVM and KNNC

    Table 1 gives the description of the adopted data sets.

    Table 2 shows the results of RBFNN, RBFSVM and KNNC with KCKC, KKC, KC and without kernel clustering. KKC is applied in RBFNN, RBFSVM and KNNC in the same way as KCKC. For the cases without kernel clustering, plain KNNC, RBFSVM and RBFNN are used. In Table 2, A to M denote Data set, KNNC+KCKC, RBFSVM+KCKC, RBFNN+KCKC, KNNC+KKC, RBFSVM+KKC, RBFNN+KKC, KNNC+KC, RBFSVM+KC, RBFNN+KC, KNNC, RBFSVM and RBFNN, respectively.

    From Table 2, it can be found that KCKC outperforms KKC and KKC outperforms KC. This suggests that KCKC resolves the problem of the unsupervised learning method KC and produces better clusters in the kernel space. Since the structure of KCKC is more complicated than that of KC and KKC, their computational complexity and efficiency also differ. Table 3 and Table 4 show the differences in computational complexity and efficiency, respectively. For convenience, the computational complexity (efficiency) of KNNC+KCKC is defined as 1; in both tables, a larger value means a larger computational complexity (efficiency). Compared with KC and with no kernel clustering, KCKC has a larger computational complexity and a lower efficiency on average. But comparing KCKC with KKC, their computational complexities are similar while KCKC has a better efficiency. From these tables, we conclude that KCKC achieves better classification performance, although it sometimes has a larger computational complexity and a lower efficiency.

    3.5 Distribution of kernels and the mapping results under two kernels with KCKC, KKC and KC

    The “Two Spiral” data set is taken as an example, and the distribution of kernels and the mapping results under two kernels are given when KCKC, KKC and KC are carried out. For KC and KKC, k = 6 is predefined. Fig. 1 and Fig. 2 give the kernel distributions and the corresponding mapping results under KC, KKC and KCKC, respectively. Under KC, 2 kernels are chosen to map the patterns into a 2D space; in fact, the 6 kernels map the patterns into a 6D kernel space, and only the 2D view is shown here. Following the basic steps of KKC, circle-shaped kernels are generated in the kernel space, so in the original space the kernels are not circles like the ones in Fig. 1a). It can be seen that the new kernels cover the patterns well and only a small part of the patterns are covered by wrong kernels. Comparing Fig. 2a) with Fig. 2b), it can be found that KC and KKC cannot make the patterns linearly separable in the kernel space, but the mapping results under KKC are better than those under KC. For KCKC, the kernels cover the patterns better in the original space, as shown in Fig. 1c), and from Fig. 2c) we find that KCKC makes the patterns nearly linearly separable in the kernel space.

    4 Conclusions and future work

    A new kernel-based K-means clustering, namely KCKC, is proposed to replace KKC and K-means Clustering (KC). KCKC avoids the problem of the unsupervised learning method KC and improves the performance of KKC. By generating kernels through an alternating cycle between the original input space and a high-dimensional kernel space, the kernels become more flexible. Furthermore, the original KKC spends much time on deciding the centers of the kernels in the kernel space. As we know, if a nonlinear mapping is continuous and smooth, the topological structures and orders of the patterns are maintained in the kernel space; so, during the processing of KCKC, the centers and widths of the kernels can be estimated in the kernel space. This saves kernel generation time in the kernel space and avoids sensitivity to the setting of the K value. Furthermore, KCKC works well in other algorithms including the RBF-based Neural Network (RBFNN), the RBF-based Support Vector Machine (RBFSVM) and the Kernel Nearest Neighbor Classifier (KNNC). In future research, more attention will be paid to the influence of different kernel clustering methods on KCKC using some image data sets.

    References:

    [1] HARTIGAN J A, WONG M A. Algorithm AS 136: a K-means clustering algorithm[J]. Journal of the Royal Statistical Society: Series C (Applied Statistics), 1979, 28(1): 100-108. DOI: 10.2307/2346830.

    [2] JAIN A K, DUBES R C. Algorithms for clustering data[M]. London: Prentice-Hall, Inc, 1988: 227-229. DOI: 10.1126/science.311.5762.765.

    [3] ZADEH L A. Fuzzy sets[J]. Information and Control, 1965, 8(3): 338-353.

    [4] YU S, TRANCHEVENT L C, LIU X H, et al. Optimized data fusion for kernel K-means clustering[J]. IEEE Transactions on Software Engineering, 2012, 34(5): 1031-1039. DOI: 10.1007/978-3-642-19406-1_4.

    [5] KONG Rui, ZHANG Guoxuan, SHI Zesheng, et al. Kernel-based K-means clustering[J]. Computer Engineering, 2004, 30(11): 12-14.

    [6] GAO Daqi, LI Jie. Kernel fisher discriminants and kernel nearest neighbor classifiers: a comparative study for large-scale learning problems[C]//International Joint Conference on Neural Networks. Vancouver, BC, Canada, July 16-21, 2006: 1333-1338. DOI: 10.1109/IJCNN.2006.1716258.

    [7] BALOUCHESTANI M, KRISHNAN S. Advanced K-means clustering algorithm for large ECG data sets based on a collaboration of compressed sensing theory and K-SVD approach[J]. Signal, Image and Video Processing, 2016, 10(1): 113-120. DOI: 10.1007/s11760-014-0709-5.

    [8] YEH Icheng, ZHANG Xinying, WU Chong, et al. Radial basis function networks with adjustable kernel shape parameters[C]//2010 International Conference on Machine Learning and Cybernetics (ICMLC). Qingdao, 11-14 July 2010. IEEE, 2010: 1482-1485. DOI: 10.1109/ICMLC.2010.5580841.

    [9] SILVER D, HUANG A, MADDISON C J, et al. Mastering the game of Go with deep neural networks and tree search[J]. Nature, 2016, 529: 484-489. DOI: 10.1038/nature16961.

    [10] XU Guibiao, CAO Zheng, HU Baogang, et al. Robust support vector machines based on the rescaled hinge loss function[J]. Pattern Recognition, 2017, 63: 139-148. DOI: 10.1016/j.patcog.2016.09.045.

    [11] WADE B S C, JOSHI S H, GUTMAN B A, et al. Machine learning on high dimensional shape data from subcortical brain surfaces: a comparison of feature selection and classification methods[J]. Pattern Recognition, 2017, 63: 731-739. DOI: 10.1016/j.patcog.2016.09.034.

    [12] CUI Yiqian, SHI Junyou, WANG Zili. Lazy quantum clustering induced radial basis function networks (LQC-RBFN) with effective centers selection and radii determination[J]. Neurocomputing, 2016, 175: 797-807. DOI: 10.1016/j.neucom.2015.10.091.

    [13] ZHU Changming. Double-fold localized multiple matrix learning machine with Universum[J]. Pattern Analysis and Applications, 2016: 1-28. DOI: 10.1007/s10044-016-0548-9.

    [14] ZHU Changming, WANG Zhe, GAO Daqi. New design goal of a classifier: global and local structural risk minimization[J]. Knowledge-Based Systems, 2016, 100: 25-49. DOI: 10.1016/j.knosys.2016.02.002.

    [15] ZHU Changming. Improved multi-kernel classification machine with Nyström approximation technique and Universum data[J]. Neurocomputing, 2016, 175: 610-634. DOI: 10.1016/j.neucom.2015.10.102.

    [16] GAO Daqi, ZHANG Tao. Support vector machine classifiers using RBF kernels with clustering-based centers and widths[C]//Proceedings of International Joint Conference on Neural Networks. Florida, USA, August 12-17, 2007. IEEE, 2007. DOI: 10.1109/IJCNN.2007.4371433.

    (Editor ZHAO Mian)
