
    A nearest neighbor search algorithm of high-dimensional data based on sequential NPsim matrix①

High Technology Letters, 2016, No. 3
    關(guān)鍵詞:文法


Li Wenfa (李文法)*, Wang Gongming**, Ma Nan*, Liu Hongzhe*

(*To whom correspondence should be addressed. E-mail: liwenfa@buu.edu.cn. Received on Dec. 16, 2015)

(*Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing 100101, P.R. China)
(**National Laboratory of Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, P.R. China)

Problems exist in similarity measurement and index tree construction, which affect the performance of nearest neighbor search of high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Combining these innovations, a nearest neighbor search algorithm of high-dimensional data based on the sequential NPsim matrix is proposed and compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms and its searching speed is thousands of times faster. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.

Keywords: nearest neighbor search, high-dimensional data, similarity, indexing tree, NPsim, KD-tree, SR-tree, Munsell

    0 Introduction

The nearest neighbor search looks for the several points that are nearest to a given point[1] and is widely used in text clustering, recommendation systems, multimedia retrieval, sequence analysis, etc. Generally speaking, data whose dimensionality is more than 20 is regarded as high-dimensional data[2]. Traditional nearest neighbor search algorithms may fail on high-dimensional data because of the curse of dimensionality[3]. Thus, nearest neighbor search of high-dimensional data has become a challenging but useful issue in data mining. Currently, this issue has been researched to a certain extent. With the locality-sensitive hashing algorithm[4], a high-dimensional vector is mapped onto an address space, and points that were close to each other remain close with high probability, which overcomes the equidistance of high-dimensional data. But its applicability is limited because of identical hash functions and neglect of data differences. To solve this problem, self-taught hashing (STH)[5] was proposed by Zhang et al.: the similarity matrix is built first, then matrix decomposition and eigenvalue solution are carried out, but this has large time and space complexity. The iDistance[6] and vector approximation file (VA-File)[7] provide suitable indexing structures, but their query cost is very high.

In essence, similarity measurement and index tree construction determine the performance of nearest neighbor search of high-dimensional data, so solving the problems that exist in these two aspects is very important. At present, most similarity measurement methods for high-dimensional data ignore the relative difference of properties, the noise distribution, weights and other factors, and are valid only for a small number of data types[8]. The Psim(X,Y) function considers the above factors[8] and is applicable to all kinds of data types, but it cannot compare similarities under different dimensionalities because its range depends on the spatial dimensionality. Thus, the NPsim(X,Y) function is proposed to solve this problem by restricting the range to [0,1]. The defects of index trees in construction and query are made up for with the sequential NPsim matrix, and this method is easy to parallelize. Assuming that the number of points is M and the dimensionality is n, the time complexity of building the sequential NPsim matrix is O(M²·n), which parallelization reduces to O(M·n); the time complexity of nearest neighbor search is O(1). The algorithm is compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. The experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms. The construction of the sequential NPsim matrix is time consuming, but its searching speed is thousands of times faster than that of the others, and the construction time can be reduced dramatically by parallelization. Thus, its overall performance is better.

    1 Related work

In recent years, similarity measurement and index tree construction have been researched to a certain extent, but insufficiencies still exist.

To solve the problems in similarity measurement, the Hsim(X,Y) function[9] was proposed by Yang; it is better than traditional methods but neglects relative difference and noise distribution. The Gsim(X,Y) function[10] was proposed according to the relative difference of properties in different dimensions, but weight discrepancy is ignored. The Close(X,Y) function[11], based on the monotone decrease of e^(-x), can overcome the influence of components in dimensions whose variances are larger, but relative difference is not considered, so it is affected by noise. The Esim(X,Y) function[12] was proposed by improving the Hsim(X,Y) and Close(X,Y) functions. In each dimension, the Esim(X,Y) component is positively correlated with the value in that dimension, and all dimensions are divided into two parts: normal dimensions and noisy dimensions. In a noisy dimension, noise occupies the majority; when the noise is similar to or larger than the values in normal dimensions, this method becomes invalid. The secondary measurement method[13] calculates similarity by virtue of property distribution, space distance, etc., but neglects noise distribution and weights, and is time-consuming. Projection nearest neighbor was proposed by Hinneburg[14] to solve the problem in higher-dimensional space by dimension reduction, but it is hard to find the right quality criterion function. In high-dimensional space, Yi found[8] that the difference in noisy dimensions is large no matter how similar the data are. This difference occupies a large share of the similarity calculation, which causes the distances between all points to become similar. Therefore, the Psim(X,Y) function[8] was proposed to eliminate the influence of noise by analyzing the differences among all dimensions. Experimental results indicate that this method is suitable for all kinds of data types, but its range is [0,n], where n is the dimensionality, so similarities in different dimensionalities cannot be compared.

There are two kinds of index trees used in high-dimensional space: index trees based on vector space and index trees based on metric space. The typical example of the former is the R-tree[15]. It is a natural extension of the B-tree to high-dimensional space and can solve the data searching problem. However, the R-tree suffers from sibling node overlap, multiple queries, low utilization, etc. Therefore, extensions of the R-tree have been proposed, such as the R+-tree, R*-tree and cR-tree. A common structure of the latter kind is the VP-tree[16], a binary search tree suitable for large-scale data search. But it is a static tree into which no insertion or deletion is possible, and its distance calculation is time-consuming. The MVP-tree[17] improves the VP-tree and decreases the cost of distance calculation, but its time complexity in the creation and query stages is higher than that of the VP-tree. The M-tree[18] is a hash index represented by a B-tree and has high searching efficiency; however, it can only carry out single-value searching, not range searching. The SA-tree[19] is created according to the distance between leaf nodes and the root node, but it is a completely static tree into which no insertion or deletion is possible.

    2 Key technology

    2.1 Similarity measurement

In n-dimensional space, the set S={S1, S2, …, SM} is composed of M points Si={Si1, Si2, …, Sij, …, Sin}, where i=1,2,…,M, j=1,2,…,n, and Sij is the jth property of Si. Assuming that X and Y are any two points in set S, Sim(X,Y) is the similarity between X and Y.

Sim(X,Y) is usually measured with a distance function; typical methods include the Manhattan distance, Euclidean distance, etc. However, with the increase of dimensionality, the nearest and farthest distances become nearly the same[2], so these methods are invalid in high-dimensional space. To solve this problem, several functions have been proposed, such as Hsim(X,Y), Gsim(X,Y), Close(X,Y) and Esim(X,Y), yet they are valid only for limited types of data[2]. Psim(X,Y) is suitable for a variety of data types, but its range depends on the spatial dimensionality, making similarities under different dimensionalities incomparable. While maintaining its effect, Psim(X,Y) is normalized as

NPsim(X,Y) = (1/E(X,Y)) · Σ_{i=1}^{n} δ(Xi,Yi) · (1 − |Xi − Yi|/(mi − ni))    (1)

where Xi and Yi are the components in the ith dimension, and δ(Xi,Yi) is a discriminant function: if Xi and Yi are in the same interval [ni, mi], then δ(Xi,Yi)=1; otherwise δ(Xi,Yi)=0. E(X,Y) is the number of dimensions in which the components of X and Y fall into the same interval. It can be seen that the range of NPsim(X,Y) is [0,1]. The above is an outline of NPsim; a detailed introduction can be found in Ref.[8].
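As a concrete illustration, this normalization can be sketched as follows. The function and the fixed interval endpoints are assumptions made for illustration only, not the paper's reference implementation (which derives the intervals from the data, as described in Section 3.2):

```python
# A minimal sketch of an NPsim-style normalized similarity. Names and the
# fixed interval bounds are illustrative assumptions, not the authors' code.
def npsim(x, y, bounds):
    """x, y: equal-length sequences of floats.
    bounds: per-dimension list of (n_i, m_i) interval endpoints.
    Returns a similarity in [0, 1]."""
    total, shared = 0.0, 0
    for xi, yi, intervals in zip(x, y, bounds):
        for lo, hi in intervals:
            if lo <= xi <= hi and lo <= yi <= hi:  # delta(X_i, Y_i) = 1
                shared += 1
                total += 1.0 - abs(xi - yi) / (hi - lo)
                break
    # Dividing by E(X, Y), the number of dimensions sharing an interval,
    # keeps the result in [0, 1] regardless of dimensionality.
    return total / shared if shared else 0.0

x, y = [0.1, 0.5, 0.9], [0.2, 0.45, 0.1]
bounds = [[(0.0, 0.5), (0.5, 1.0)]] * 3  # two intervals per dimension
print(npsim(x, y, bounds))
```

In the third dimension the two components fall into different intervals, so that dimension contributes nothing; the remaining contributions are averaged, which is what bounds the result by 1.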

To validate this method, records of dimensionality 10, 30, 50, 100, 150, 200, 250, 300, 350 and 400 are generated with the normrnd() function of Matlab, 1000 records for each dimensionality. After that, the relative difference between the farthest and nearest neighbors is calculated with

Relative difference = (Dmaxn − Dminn) / Davgn    (2)

where Dmaxn, Dminn and Davgn are the maximal, minimal and average similarities in n-dimensional space respectively[20].

According to the characteristics of the results, the similarity measurement methods are divided into two kinds. The first kind includes the Manhattan distance, Euclidean distance, Hsim(X,Y), Gsim(X,Y), Close(X,Y) and Esim(X,Y); the second includes Psim(X,Y) and NPsim(X,Y). The result is shown in Fig.1. It can be seen that the relative difference of the second kind of methods is two to three orders of magnitude larger than that of the first. Therefore, the performance advantage of the second kind of methods is obvious.

    Fig.1 Relative difference of various similarity measurement methods

The numbers of cases with Psim(X,Y)≥1 in different dimensionalities are shown in Table 1. The number of Psim(X,Y) values computed for each dimensionality is 1000×1000=1000000. Thus, 6%~15% of the results are greater than or equal to 1, which makes similarities in different dimensionalities incomparable. This problem does not exist in the NPsim(X,Y) function.

Table 1 Number of Psim(X,Y)≥1 in different dimensions

2.2 K nearest neighbor search

For any point St in set S, 1≤t≤M, search for the set Rt⊆S composed of K points that meets the following requirement:

∀r∈Rt, NPsim(St, r) ≥ max{NPsim(St, s) | s∈S ∧ s∉Rt}

Rt is the K nearest neighbor (KNN) set of St. The process of generating Rt is called KNN search.

    3 Nearest neighbor search algorithm

    3.1 Whole framework

The whole framework is shown in Fig.2. First of all, the NPsim matrix is generated. After that, the sequential NPsim matrix is produced by sorting the elements in each row of the NPsim matrix in descending order.

    Fig.2 Whole framework of the proposed algorithm

    3.2 Execution flow

    1) Construction of NPsim matrix

    The NPsim matrix is generated with the following steps.

Step 1 The M points Si, i=1,2,…,M, are stored in the M×n matrix DataSet.

Step 2 The elements in every column of DataSet are sorted in ascending order to generate the matrix SortDataSet.

Step 3 The elements in each column of SortDataSet are divided into G=⌈θ·n⌉ intervals. Thus, the number of elements in every interval is T=⌈M/G⌉. Meanwhile, the endpoints of every interval are saved into the matrix FirCutBound.

Step 4 The interval number of each element of DataSet is determined according to the matrix FirCutBound and saved into the interval number matrix FirNumSet.

Step 5 The M×M matrix SameDimNum is generated. For any two points Sp and Sq, the number of dimensions in which the components of Sp and Sq fall into the same interval is calculated and saved into the matrix element SameDimNum[p][q].

Step 6 The matrix SortDataSet is divided along the columns again. The number of intervals is G′=G−1 and there are T′=⌈M/G′⌉ elements in each interval. After that, the endpoints of every interval are saved into the matrix SecCutBound.

Step 7 The interval number matrix SecNumSet is produced in the same way as in Step 4.

Step 8 The matrix SameDimNum is updated. For any points Sp and Sq, if the interval numbers of their components in one dimension differ in Step 3 but are the same in Step 7, then SameDimNum[p][q]=SameDimNum[p][q]+1.

Step 9 The M×M matrix NPsimMat is built according to the results of Steps 3 to 8. The NPsim information of Sp and Sq (1≤p,q≤M) is stored into NPsimMat[p][q], which includes three parts: the subscripts p and q, and NPsim(Sp, Sq).
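The rank-based interval division at the heart of these steps can be sketched as follows. The variable names mirror the step descriptions, but the code is an illustrative assumption (with toy sizes), not the authors' implementation:

```python
import numpy as np

# Illustrative sketch of Steps 2-5 with toy sizes: DataSet is M x n, and each
# column is split into G rank-based intervals holding roughly T elements each.
M, n, theta = 8, 3, 0.5
rng = np.random.default_rng(0)
DataSet = rng.random((M, n))

G = int(np.ceil(theta * n))   # number of intervals, G = ceil(theta * n)
T = int(np.ceil(M / G))       # elements per interval, T = ceil(M / G)

# Step 2 is implicit here: the rank of each element within its column
# (a double argsort) is equivalent to sorting the column.
ranks = DataSet.argsort(axis=0).argsort(axis=0)
# Steps 3-4: the interval number of every element is its rank divided by T.
FirNumSet = ranks // T

# Step 5: SameDimNum[p][q] counts the dimensions in which the components
# of S_p and S_q fall into the same interval.
SameDimNum = (FirNumSet[:, None, :] == FirNumSet[None, :, :]).sum(axis=2)
```

The second division (Steps 6-8) would repeat the same computation with G′=G−1 intervals, crediting pairs that were split by a boundary in the first division.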

    2) Sorting for NPsim matrix

    The sequential NPsim matrix is produced with the following steps.

Step 1 The M×M matrix SortNPsimMat is produced as a copy of NPsimMat.

Step 2 The elements in each row of SortNPsimMat are sorted in descending order of NPsim.

With the increase of the column number, the elements in the pth row become smaller, meaning the distance between Sp and the corresponding point becomes farther.

    3) Nearest neighbor search

The KNN of Si is found as follows.

Step 1 The first K elements in the ith row are selected.

Step 2 The points different from Si itself are the expected result.
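The sorting and searching stages above can be sketched together as follows. A random symmetric matrix stands in for a real NPsimMat; the names mirror the text, but the code is an illustrative assumption:

```python
import numpy as np

# Sketch of the sorting and searching stages with a placeholder matrix.
M, K = 6, 3
rng = np.random.default_rng(1)
NPsimMat = rng.random((M, M))
NPsimMat = (NPsimMat + NPsimMat.T) / 2   # similarities are symmetric
np.fill_diagonal(NPsimMat, 1.0)          # every point is most similar to itself

# Sorting stage: for each row, store the column indices ordered by
# descending NPsim; this plays the role of the sequential NPsim matrix.
SortNPsimMat = np.argsort(-NPsimMat, axis=1)

def knn(i, k):
    # Searching stage: read the first k entries of row i, skipping S_i itself.
    return [j for j in SortNPsimMat[i] if j != i][:k]

print(knn(0, K))
```

Because the sort is done once up front, each query is just a read of K entries from one row, which is why the search cost does not grow with M.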

    3.3 Time complexity analysis

This algorithm is separated into two stages: construction of the sequential NPsim matrix and KNN search. There are two steps in the first stage; their time complexity is analyzed as follows.

    (1) Construction of NPsim matrix

In this step, four kinds of matrices are produced. The first is SortDataSet, produced by sorting the elements in every column; its time complexity is O(M·logM·n). The second kind comprises FirCutBound and SecCutBound, generated by visiting all elements of DataSet; the time complexity is O(M·n). The third kind comprises FirNumSet and SecNumSet, obtained by locating the interval number of each element; the corresponding time complexity is O(M²·n). The fourth is SameDimNum, produced by comparing the elements per column; the time complexity of this operation is O(M²·n). Finally, the NPsim components are calculated and summed up into the whole NPsim value.

    (2) Sorting for NPsim matrix

The elements in every row of NPsimMat are sorted in descending order of NPsim. Each of the M rows contains M elements, so this takes O(M²·logM).

Summing up the above analysis, the time complexity of the construction stage is O(M²·n)+O(M²·logM)=O(M²·n), since logM is dominated by n here.

In the course of searching, only the first K elements in the ith row are visited. Thus, the corresponding time complexity is O(1) for fixed K.

    4 Experiment

The proposed algorithm includes two stages (construction and searching), which must also be present in the algorithms selected for comparison. For the nearest neighbor search algorithm based on KD-tree, the KD-tree is built first and searching is carried out subsequently; the nearest neighbor search algorithm based on SR-tree is similar. Thus, these two algorithms are selected for the following experiment.

    4.1 Data introduction

The Munsell Color-Glossy set was proposed by the American colorist Munsell and has been revised repeatedly by the American National Standards Institute (ANSI) and the Optical Society. It is one of the standard color sets and includes 1600 colors, each represented as HV/C, where H, V and C are abbreviations of hue, value (brightness) and chroma (saturation) respectively.

The spectral reflectance of all colors in the Munsell Color-Glossy set was downloaded from the Spectral Color Research Group (http://www.uef.fi/fi/spectral/home). Each color is a vector containing 401 spectral reflectance values at different wavelengths, which is regarded as high-dimensional data.

    4.2 Overview

First of all, the running times of constructing the sequential NPsim matrix, the KD-tree and the SR-tree are measured. After that, KNN search for given points in the Munsell color cube is carried out; the locations of a given point and its neighbors should be close or continuous.

Assuming that HV/C and HKVK/CK are a given point and a corresponding neighbor respectively, the Munsell distance between them is

Distance = |HK − H| + |VK − V| + |CK − C|    (3)
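As a sketch of Eq.(3), assuming the hue, value and chroma have already been mapped to numeric scales (that mapping is an assumption of this example, since Munsell hue is notated categorically, e.g. 5BG):

```python
# Eq.(3) as code. The Munsell hue H is categorical (e.g. 5BG), so this sketch
# assumes H, V and C have been mapped to numbers beforehand; that mapping is
# an assumption of this example, not part of the paper.
def munsell_distance(hvc, hvc_k):
    h, v, c = hvc
    hk, vk, ck = hvc_k
    return abs(hk - h) + abs(vk - v) + abs(ck - c)

# Two colors on the same hue page and value, differing only in chroma (2 vs 4):
print(munsell_distance((5, 3, 2), (5, 3, 4)))  # -> 2
```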

On one hand, the neighbor colors found by the three algorithms are compared. On the other hand, with the increase of K, the construction and searching times of the different algorithms are measured and analyzed.

    4.3 Result

The proposed algorithm and the traditional algorithms based on KD-tree and SR-tree are implemented in the experiment, and the results are compared in terms of accuracy and speed. No parallel strategy is used in the following experiment. The hardware includes an AMD Athlon(tm) II X2-250 processor and 4GB Kingston memory; the software is the Windows XP operating system and Microsoft Visual Studio 2008.

    1) Accuracy analysis

The Munsell color 5BG3/2, shown in Fig.3, is selected for KNN search. The search results for K=6 are shown in Tables 2, 3 and 4. In Table 2, the KNN distance is the NPsim value, while in Tables 3 and 4 it is the Euclidean distance. It can be seen that the colors found by the proposed method are closer to the given color, whereas some results of the other methods differ obviously from 5BG3/2, such as 5B3/4 in Table 3 and 7.5BG4/6 in Table 4. In addition, the Munsell distance of the proposed method is less than that of the others. In some cases, the Munsell distances of consecutive neighbors (predecessors, nearest neighbors, and successors) are not in ascending order, which is called the reverse phenomenon; 10BG3/2 in Table 2, 5B3/4 in Table 3, and 7.5BG4/6 in Table 4 are typical examples. The numbers of nearest neighbors with the reverse phenomenon in Tables 2, 3 and 4 are 2, 4 and 4 respectively, which indicates that the stability of the proposed method is better than that of the others.

    Fig.3 Color of 5BG3/2

    Table 3 KNN result of KD-tree algorithm

    Table 4 KNN result of SR-tree algorithm

    2) Speed analysis

The construction times of the indexing structures are shown in Table 5. It can be seen that that of the sequential NPsim matrix is about ten times that of the KD-tree or SR-tree.

    Table 5 Construction time of different indexing structures

With the increase of the K value, the average searching times for the KNN of 1000 selected Munsell colors are shown in Fig.4. The experimental results indicate that the searching time of the proposed method is on the order of 10⁻⁶ s, while that of the other methods is on the order of 10⁻² s. That is to say, the searching speed of the proposed method is thousands of times faster than that of the others.

Although the construction of the sequential NPsim matrix is slow, the searching speed is fast, and the sequential NPsim matrix can be stored on disk and loaded quickly at any time. So the performance of the sequential NPsim matrix is better than that of the KD-tree and SR-tree for nearest neighbor search of high-dimensional data.

    Fig.4 Average searching time of KNN with different algorithms

    5 Conclusion

Nearest neighbor search of high-dimensional data is the foundation of high-dimensional data processing. The problems existing in similarity measurement and index tree construction affect the performance of traditional nearest neighbor search algorithms. The NPsim function and the sequential NPsim matrix are designed to solve these problems and are combined into a new nearest neighbor search algorithm. To validate the proposed algorithm, it is compared with the traditional algorithms based on KD-tree and SR-tree in experiments on the Munsell Color-Glossy set. The results show that the accuracy and searching speed of the proposed method are better than those of the two other methods.

However, the construction of the sequential NPsim matrix is slower than that of the KD-tree and SR-tree. The reason is that the time complexity of constructing the sequential NPsim matrix is O(M²·n), while those of constructing the KD-tree and SR-tree are both O(M·log2M·n). As Section 3.2 shows, the operations generating different rows of the sequential NPsim matrix are independent of each other and easy to parallelize, while parallelizing the construction of an index tree is hard. The time complexity can thus be reduced from O(M²·n) to O(M·n) by virtue of parallelization, and parallelizing the construction of the sequential NPsim matrix is future work.

[1] Jegou H, France L R R, Douze M, et al. Product quantization for nearest neighbor search. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 33(1): 117-128

[2] Ericson K, Pallickara S. On the performance of high dimensional data clustering and classification algorithms. Future Generation Computer Systems, 2013, 29(4): 1024-1034

[3] Bellman R. Dynamic Programming. Princeton, New Jersey: Dover Publications Inc, 2010. 152-153

[4] Andoni A, Indyk P. Near-optimal hashing algorithms for approximate nearest neighbor in high dimensions. Communications of the ACM, 2008, 51(1): 117-122

[5] Zhang D, Wang J, Cai D, et al. Self-taught hashing for fast similarity search. In: Proceedings of the ACM SIGIR 2010, New York: ACM Press, 2010. 18-25

[6] Jagadish H V, Ooi B C, Tan K L, et al. iDistance: an adaptive B+-tree based indexing method for nearest neighbor search. ACM Transactions on Database Systems, 2005, 30(2): 364-397

[7] Heisterkamp D R, Peng J. Kernel vector approximation files for relevance feedback retrieval in large image databases. Multimedia Tools and Applications, 2005, 26(2): 175-189

[8] Yi L H. Research on Clustering Algorithm for High Dimensional Data: [Ph.D dissertation]. Qinhuangdao: Institute of Information Science and Engineering, Yanshan University, 2011. 28-30

[9] Yang F Z, Zhu Y Y. An efficient method for similarity search on quantitative transaction data. Journal of Computer Research and Development, 2004, 41(2): 361-368

[10] Huang S D, Chen Q M. On clustering algorithm of high dimensional data based on similarity measurement. Computer Applications and Software, 2009, 26(9): 102-105

[11] Shao C S, Lou W, Yan L M. Optimization of algorithm of similarity measurement in high-dimensional data. Computer Technology and Development, 2011, 21(2): 1-4

[12] Wang X Y, Zhang H Y, Shen L Z, et al. Research on high dimensional clustering algorithm based on similarity measurement. Computer Technology and Development, 2013, 23(5): 30-33

[13] Jia X Y. A high dimensional data clustering algorithm based on twice similarity. Journal of Computer Applications, 2005, 25(B12): 176-177

[14] Alexander H, Charu A C, Keim D A. What is the nearest neighbor in high dimensional spaces. In: Proceedings of the VLDB 2000, Birmingham: IEEE Computer Society, 2000. 506-515

[15] Berkhin P. A survey of clustering data mining techniques. In: Grouping Multidimensional Data: Recent Advances in Clustering. Berlin: Springer-Verlag, 2006. 25-71

[16] Nielsen F, Piro P, Barlaud M. Bregman vantage point trees for efficient nearest neighbor queries. In: Proceedings of the IEEE International Conference on Multimedia and Expo 2009, Birmingham: IEEE Computer Society, 2009. 878-881

[17] Hetland M L. The basic principles of metric indexing. Studies in Computational Intelligence, 2009, 242: 199-232

[18] Kunze M, Weske M. Metric trees for efficient similarity search in large process model repositories. Lecture Notes in Business Information Processing, 2011, 66: 535-546

[19] Navarro G. Searching in metric spaces by spatial approximation. The VLDB Journal, 2002, 11(1): 28-46

[20] Aggarwal C C, Yu P S. The IGrid index: Reversing the dimensionality curse for similarity indexing in high dimensional space. In: Proceedings of ACM SIGKDD 2000, New York: ACM Press, 2000. 119-129

Li Wenfa, born in 1974. He received his Ph.D. degree from the Graduate University of Chinese Academy of Sciences in 2009, and his B.S. and M.S. degrees from PLA Information Engineering University in 1998 and 2003 respectively. His research interests include information security, data analysis and mining, etc.

    10.3772/j.issn.1006-6748.2016.03.002

①Supported by the National Natural Science Foundation of China (No. 61300078), the Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (No. CIT&TCD201504039), the Funding Project for Academic Human Resources Development in Beijing Union University (No. BPHR2014A03, Rk100201510), and the "New Start" Academic Research Projects of Beijing Union University (No. Hzk10201501).
