
    A nearest neighbor search algorithm of high-dimensional data based on sequential NPsim matrix①

    High Technology Letters, 2016, No. 3



    Li Wenfa (李文法)*, Wang Gongming**, Ma Nan*, Liu Hongzhe*

    (*To whom correspondence should be addressed. E-mail: liwenfa@buu.edu.cn. Received on Dec. 16, 2015)

    (*Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing 100101, P.R. China) (**National Laboratory of Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, P.R. China)

    Problems exist in similarity measurement and index tree construction, which affect the performance of nearest neighbor search of high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Combining these innovations, a nearest neighbor search algorithm of high-dimensional data based on the sequential NPsim matrix is proposed and compared with nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms and its searching speed is thousands of times faster. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.

    nearest neighbor search, high-dimensional data, similarity, indexing tree, NPsim, KD-tree, SR-tree, Munsell

    0 Introduction

    Nearest neighbor search looks for the several points that are nearest to a given point[1], and is widely used in text clustering, recommendation systems, multimedia retrieval, sequence analysis, etc. Generally speaking, data whose dimensionality is more than 20 is regarded as high-dimensional data[2]. Traditional nearest neighbor search algorithms may fail on high-dimensional data because of the curse of dimensionality[3]. Thus, nearest neighbor search of high-dimensional data has become a challenging but useful issue in data mining. Currently, this issue has been researched to a certain extent. With the locality sensitive hashing algorithm[4], a high-dimensional vector is mapped onto an address space in which previously similar points remain close to each other with high probability, which overcomes the equidistance of high-dimensional data. But its applicability is limited because it uses the same hash functions everywhere and neglects differences in the data. To solve this problem, self-taught hashing (STH)[5] was proposed by Zhang et al: a similarity matrix is built first, and matrix decomposition and eigenvalue solution are carried out subsequently, but it has large time and space complexity. The iDistance[6] and vector approximation file (VA-File)[7] provide suitable indexing structures, but their query cost is very large.

    In essence, similarity measurement and index tree construction determine the performance of nearest neighbor search of high-dimensional data, so solving the problems in these two aspects is very important. At present, most similarity measurement methods for high-dimensional data ignore the relative difference of properties, noise distribution, weight and other factors, and are valid only for a small number of data types[8]. The Psim(X,Y) function considers the above factors[8] and is applicable to all kinds of data types, but it cannot compare similarities under different dimensionalities because its range depends on the spatial dimensionality. Thus, the NPsim(X,Y) function is proposed to solve this problem by normalizing the range to [0,1]. The defect of index trees in construction and query is made up for with the sequential NPsim matrix, and this method is easy to parallelize. Assuming that the number of points is M and the dimensionality is n, the time complexity of building the sequential NPsim matrix is O(M2·n), and after parallelization it is reduced to O(M·n). The time complexity of nearest neighbor search is O(1). This algorithm is compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. The experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms. The construction of the sequential NPsim matrix is time consuming, but its searching speed is thousands of times faster than that of the others. In addition, the construction time can be reduced dramatically by parallelization, so its overall performance is better than that of the others.

    1 Related work

    In recent years, similarity measurement and index tree construction have been researched to a certain extent, but shortcomings still exist.

    To solve the problem in similarity measurement, the Hsim(X,Y) function[9] was proposed by Yang; it is better than traditional methods but neglects relative difference and noise distribution. The Gsim(X,Y) function[10] was proposed according to the relative difference of properties in different dimensions, but weight discrepancy is ignored. The Close(X,Y) function[11], based on the monotone decrease of e^(-x), can overcome the influence of components in dimensions whose variance is larger, but relative difference is not considered, so it is affected by noise. The Esim(X,Y) function[12] was proposed by improving the Hsim(X,Y) and Close(X,Y) functions. In each dimension, the Esim(X,Y) component is positively correlated with the value in that dimension, and all dimensions are divided into two parts: normal dimensions and noisy dimensions. In a noisy dimension, noise occupies the majority; when the noise is comparable to or larger than the values in normal dimensions, this method becomes invalid. The secondary measurement method[13] calculates similarity by virtue of property distribution, space distance, etc., but the noise distribution and weight are neglected, and it is time consuming. Projection nearest neighbor was proposed by Hinneburg[14] to solve the problem in higher dimensional space by dimension reduction, but it is hard to find the right quality criterion function. In high-dimensional space, Yi found[8] that the difference in a noisy dimension is large no matter how similar the data are. This difference occupies a large proportion of the similarity calculation, which causes the distances between all points to be similar. Therefore, the Psim(X,Y) function[8] was proposed to eliminate the influence of noise by analyzing the differences among all dimensions. Experimental results indicate that this method is suitable for all kinds of data types, but its range is [0,n], where n is the dimensionality, so similarities in different dimensionalities cannot be compared.

    There are two kinds of index trees used in high-dimensional space: index trees based on vector space and index trees based on metric space. The typical example of the former is the R-tree[15]. It is a natural extension of the B-tree to high-dimensional space and can solve the data searching problem. However, the R-tree has the problems of sibling node overlap, multiple queries, low utilization, etc. Therefore, extensions of the R-tree have been proposed, such as the R+-tree, R*-tree and cR-tree. A common structure of the latter is the VP-tree[16], which is a binary search tree suitable for large-scale data search. But it is a static tree into which nodes cannot be inserted or deleted, and its distance calculation is time consuming. The MVP-tree[17] is an improvement of the VP-tree that decreases the cost of distance calculation, but its time complexity in the creation and query stages is higher than that of the VP-tree. The M-tree[18] is a hash index represented by a B-tree and has high searching efficiency; however, it can only carry out single value searching, not range searching. The SA-tree[19] is created according to the distance between leaf nodes and the root node, but it is a completely static tree into which nodes cannot be inserted or deleted.

    2 Key technology

    2.1 Similarity measurement

    In n-dimensional space, set S={S1, S2, …, SM} is composed of M points Si={Si1, Si2, …, Sij, …, Sin}, where i=1,2,…,M, j=1,2,…,n, and Sij is the jth property of Si. Assuming that X and Y are any two points in set S, Sim(X,Y) is the similarity between X and Y.

    Sim(X,Y) is usually measured with a distance function; typical methods include Manhattan distance, Euclidean distance, etc. However, with the increase of dimensionality, the nearest and farthest distances become nearly the same[2], so these methods are invalid in high-dimensional space. To solve this problem, several functions have been proposed, such as Hsim(X,Y), Gsim(X,Y), Close(X,Y) and Esim(X,Y), yet they are valid only for limited types of data[2]. Psim(X,Y) is suitable for a variety of data types, but its range depends on the spatial dimensionality, so it cannot compare similarities under different dimensionalities. While maintaining its effect, Psim(X,Y) is updated as

    (1)

    where Xi and Yi are the components in the ith dimension, and δ(Xi,Yi) is a discriminant function: if Xi and Yi are in the same interval [ni, mi], δ(Xi,Yi)=1 holds; otherwise δ(Xi,Yi)=0 holds. E(X,Y) is the number of intervals in which the components of X and Y are the same. It can be seen that the range of NPsim(X,Y) is [0,1]. The above is an outline of NPsim; a detailed introduction can be found in Ref.[8].
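As a concrete illustration, the discriminant δ and the count E(X,Y) can be sketched in Python. This is a hypothetical fragment: the per-dimension interval bounds [ni, mi] would come from the partitioning procedure described in Section 3.2.

```python
def delta(x_i, y_i, n_i, m_i):
    """Discriminant δ: 1 if both components fall in the same interval [n_i, m_i]."""
    return 1 if (n_i <= x_i <= m_i and n_i <= y_i <= m_i) else 0

def count_shared_intervals(X, Y, bounds):
    """E(X, Y): number of dimensions whose components share an interval.
    `bounds` holds one (n_i, m_i) pair per dimension (illustrative input)."""
    return sum(delta(x, y, n, m) for (x, y, (n, m)) in zip(X, Y, bounds))
```

For example, with bounds [(0.0, 0.3), (0.3, 0.6), (0.6, 1.0)], the points [0.1, 0.5, 0.9] and [0.2, 0.4, 0.1] share intervals in the first two dimensions only, so E = 2.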

    To validate this method, several sets of records with dimensionality 10, 30, 50, 100, 150, 200, 250, 300, 350 and 400 are generated with the normrnd() function of Matlab. The number of records for every dimensionality is 1000. After that, the relative difference between the farthest and the nearest neighbors is calculated with

    RD = (Dmax_n - Dmin_n)/Davg_n        (2)

    where Dmax_n, Dmin_n and Davg_n are the maximal, minimal and average similarities in n-dimensional space respectively[20].
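The shrinking contrast that Eq.(2) measures can be reproduced with a small NumPy sketch. This is an illustrative stand-in for the Matlab experiment, using Euclidean distance and NumPy's normal generator in place of normrnd():

```python
import numpy as np

def relative_difference(points, query):
    """Eq.(2): (Dmax - Dmin) / Davg over distances from `query` to `points`."""
    d = np.linalg.norm(points - query, axis=1)
    return (d.max() - d.min()) / d.mean()

rng = np.random.default_rng(0)
for n in (10, 100, 400):                          # dimensionality
    pts = rng.normal(size=(1000, n))
    rd = relative_difference(pts, rng.normal(size=n))
    print(n, round(rd, 3))                        # tends to shrink as n grows
```

As dimensionality rises, the printed relative difference drops toward zero, which is the equidistance effect the first kind of measures suffers from.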

    According to the characteristics of the results, the similarity measurement methods are divided into two kinds. The first kind includes Manhattan distance, Euclidean distance, Hsim(X,Y), Gsim(X,Y), Close(X,Y) and Esim(X,Y); the second includes Psim(X,Y) and NPsim(X,Y). The result is shown in Fig.1. It can be seen that the relative difference of the second kind of methods is two to three orders of magnitude larger than that of the first, so the performance advantage of the second kind is obvious.

    Fig.1 Relative difference of various similarity measurement methods

    The numbers of Psim(X,Y)≥1 in different dimensionalities are shown in Table 1. The number of Psim(X,Y) values computed for every dimensionality is 1000×1000=1000000. Thus, 6%~15% of the results are more than 1, which makes it impossible to compare similarities across dimensionalities. This problem does not exist in the NPsim(X,Y) function.

    Table 1 Number of Psim(X,Y)≥1 in different dimensions

    2.2 K nearest neighbor search

    For any point St in set S (1≤t≤M), search for a set Rt ⊆ S composed of k points that meets the following requirement:

    ∀r∈Rt: NPsim(St, r) ≥ max{NPsim(St, s) | s∈S ∧ s∉Rt}

    Rt is the K nearest neighbor (KNN) set of St, and the process of generating Rt is called KNN search.

    3 Nearest neighbor search algorithm

    3.1 Whole framework

    The whole framework is shown in Fig.2. First of all, the NPsim matrix is generated. After that, the sequential NPsim matrix is produced by sorting the elements in each row of the NPsim matrix in descending order.

    Fig.2 Whole framework of the proposed algorithm

    3.2 Execution flow

    1) Construction of NPsim matrix

    The NPsim matrix is generated with the following steps.

    Step 1 The M points Si, i=1,2,…,M, are stored in the M×n matrix DataSet.

    Step 2 The elements in every column of DataSet are sorted in ascending order in order to generate the matrix SortDataSet.

    Step 3 The elements in each column of SortDataSet are divided into G=⌈θ·n⌉ intervals. Thus, the number of elements in every interval is T=⌈M/G⌉. Meanwhile, the endpoints of every interval are saved into the matrix FirCutBound.

    Step 4 The interval number of each element of DataSet is determined according to the matrix FirCutBound and saved into the interval number matrix FirNumSet.

    Step 5 The M×M matrix SameDimNum is generated. For any two points Sp and Sq, the number of intervals in which the components of Sp and Sq are the same is calculated and saved into the matrix element SameDimNum[p][q].

    Step 6 The matrix SortDataSet is divided along the columns again. The number of intervals is G′=G−1 and there are T′=⌈M/G′⌉ elements in each interval. After that, the endpoints of every interval are saved into the matrix SecCutBound.

    Step 7 The interval number matrix SecNumSet is produced according to Step 4.

    Step 8 The matrix SameDimNum is updated. For any points Sp and Sq, if the interval numbers of the components in one dimension are different in Step 3 but the same in Step 7, then SameDimNum[p][q]=SameDimNum[p][q]+1.

    Step 9 The M×M matrix NPsimMat is built according to the results from Step 3 to Step 8. The NPsim information of Sp and Sq (1≤p,q≤M) is stored in NPsimMat[p][q], which includes three parts: the subscripts p and q, and NPsim(Sp, Sq).
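The interval-partition steps above can be sketched as follows. This is a simplified illustration, not the paper's exact procedure: equal-frequency intervals are derived directly from column ranks rather than from the FirCutBound/SecCutBound endpoint matrices, and the interval count g is passed in directly instead of being computed as G=⌈θ·n⌉.

```python
import numpy as np

def interval_numbers(data, g):
    """Assign every element its equal-frequency interval index (0..g-1) within
    its column, mirroring Steps 2-4: sort each column and cut it into g pieces."""
    m = data.shape[0]
    ranks = data.argsort(axis=0).argsort(axis=0)  # rank of each element in its column
    return ranks * g // m

def same_dim_count(data, g):
    """SameDimNum[p][q]: number of dimensions where points p and q share an
    interval under either the g-way or the (g-1)-way partition (Steps 3-8)."""
    m, _ = data.shape
    a, b = interval_numbers(data, g), interval_numbers(data, g - 1)
    same = np.zeros((m, m), dtype=int)
    for p in range(m):
        same[p] = ((a == a[p]) | (b == b[p])).sum(axis=1)
    return same
```

The second, (G−1)-way partition shifts the interval boundaries, so components that straddle a cut point in the first partition get a second chance to be counted as "same", which is the purpose of Steps 6-8.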

    2) Sorting for NPsim matrix

    The sequential NPsim matrix is produced with the following steps.

    Step 1 The M×M matrix SortNPsimMat is produced as a copy of NPsimMat.

    Step 2 The elements in each row of SortNPsimMat are sorted in descending order of NPsim.

    With the increase of the column number, the elements in the pth row become smaller and smaller, which means the distance between Sp and the corresponding point becomes farther and farther.

    3) Nearest neighbor search

    The KNN of Siis found out as follows.

    Step 1 The first K elements in the ith row are selected.

    Step 2 The points different from Si itself are the expected result.
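The sorting and lookup stages together can be sketched as follows; this is a minimal illustration in which NPsimMat is reduced to a plain array of similarity values (the paper also stores the subscripts p and q in each cell):

```python
import numpy as np

def build_sorted_index(npsim):
    """Sort each row's column indices by descending NPsim (sorting stage)."""
    return np.argsort(-npsim, axis=1)

def knn(sorted_index, i, k):
    """First k entries of row i, skipping the point itself (whose NPsim is maximal)."""
    row = sorted_index[i]
    return [j for j in row if j != i][:k]
```

Because the rows are pre-sorted, the query touches only the first K entries of one row, which is the O(1) lookup claimed in Section 3.3.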

    3.3 Time complexity analysis

    This algorithm is separated into two stages: construction of the sequential NPsim matrix and KNN search. There are two steps in the first stage; their time complexity is analyzed as follows.

    (1) Construction of NPsim matrix

    In this step, four kinds of matrices are produced. The first is SortDataSet, produced by sorting the elements in every column; its time complexity is O(MlogM·n). The second is FirCutBound and SecCutBound, generated by visiting all elements of DataSet; thus the time complexity is O(M·n). The third is FirNumSet and SecNumSet, obtained by locating the interval number of each element; the corresponding time complexity is O(M2·n). The fourth is SameDimNum, produced by comparing the elements per column; the time complexity of this operation is O(M2·n). Finally, the NPsim components are calculated and summed up to the whole NPsim value.

    (2) Sorting for NPsim matrix

    The elements in every row of NPsimMat are sorted in descending order of NPsim. Since each row has M elements, the time complexity is O(M2·logM).

    To sum up the above analysis, the time complexity of the construction stage is O(M2·n)+O(M2·logM)=O(M2·n).

    In the course of searching, only the first K elements in the ith row are visited; thus the corresponding time complexity is O(1).

    4 Experiment

    The proposed algorithm includes two stages (construction and searching), which must also be present in any algorithm selected for comparison. For the nearest neighbor search algorithm based on the KD-tree, the KD-tree is built first and searching is carried out subsequently; the nearest neighbor search algorithm based on the SR-tree is similar. Thus, these two algorithms are selected in the following experiment.

    4.1 Data introduction

    The Munsell Color-Glossy set was proposed by the American colorist Munsell and has been revised repeatedly by the American National Standards Institute (ANSI) and the Optical Society. It is one of the standard color sets and includes 1600 colors, each of which is represented as HV/C, where H, V and C are abbreviations of hue, brightness and saturation respectively.

    The spectral reflectances of all colors in the Munsell Color-Glossy set are downloaded from the Spectral Color Research Group (http://www.uef.fi/fi/spectral/home). Each is a vector containing 401 spectral reflectance values at different wavelengths, which is regarded as high-dimensional data.

    4.2 Overview

    First of all, the running times of constructing the sequential NPsim matrix, the KD-tree and the SR-tree are measured. After that, KNN search for a given point in the Munsell color cubic is carried out; the locations of the given point and its neighbors should be close or continuous.

    Assuming that HV/C and HKVK/CK are the given point and a corresponding neighbor respectively, the Munsell distance between them is

    Distance = |HK−H| + |VK−V| + |CK−C|        (3)
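A minimal sketch of Eq.(3), assuming H, V and C have already been mapped to numeric scales (in the data set, hue labels such as 5BG are alphanumeric, so a hue-to-number mapping would be needed first):

```python
def munsell_distance(p, q):
    """Eq.(3): L1 distance between two colors given as (H, V, C) tuples.
    Assumes hue has already been converted to a numeric scale."""
    (h1, v1, c1), (h2, v2, c2) = p, q
    return abs(h2 - h1) + abs(v2 - v1) + abs(c2 - c1)
```

For example, with numeric hues, the distance between (5, 3, 2) and (7.5, 4, 6) is 2.5 + 1 + 4 = 7.5.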

    On the one hand, the neighbor colors from the three algorithms are compared. On the other hand, with the increase of K, the construction and searching times of the different algorithms are measured and analyzed.

    4.3 Result

    The proposed algorithm and the traditional algorithms based on KD-tree and SR-tree are implemented in the experiment, and the results are compared in terms of accuracy and speed. No parallel strategy is used in the following experiment. The hardware includes an AMD Athlon(tm) II X2-250 processor and 4GB of Kingston memory; the software is the Windows XP operating system and Microsoft Visual Studio 2008.

    1) Accuracy analysis

    The Munsell color 5BG3/2, shown in Fig.3, is selected for KNN search. The searching results under K=6 are shown in Tables 2, 3 and 4. In Table 2, the KNN distance is the NPsim value; in Tables 3 and 4 it is the Euclidean distance. It can be seen that the colors found by the proposed method are closer to the given color, while some colors from the other methods differ obviously from 5BG3/2, such as 5B3/4 in Table 3 and 7.5BG4/6 in Table 4. In addition, the Munsell distance of the proposed method is less than that of the others. In some cases, the Munsell distances of a neighbor's predecessors, the neighbor itself, and its successors are not in ascending order, which is called the reverse phenomenon; 10BG3/2 in Table 2, 5B3/4 in Table 3 and 7.5BG4/6 in Table 4 are typical examples. The numbers of nearest neighbors with the reverse phenomenon in Tables 2, 3 and 4 are 2, 4 and 4 respectively, which indicates that the stability of the proposed method is better than that of the others.

    Fig.3 Color of 5BG3/2

    Table 3 KNN result of KD-tree algorithm

    Table 4 KNN result of SR-tree algorithm

    2) Speed analysis

    The construction times of the indexing structures are shown in Table 5. It can be seen that the construction time of the sequential NPsim matrix is about ten times that of the KD-tree or SR-tree.

    Table 5 Construction time of different indexing structures

    With the increase of K, the average searching times for the KNN of 1000 selected Munsell colors are shown in Fig.4. The experimental results indicate that the searching time of the proposed method is on the order of 10-6 s, while that of the other methods is on the order of 10-2 s. That is to say, the searching speed of the proposed method is thousands of times faster than that of the others.

    Although the construction speed of the sequential NPsim matrix is slow, its searching speed is fast, and the matrix can be stored on disk and reloaded quickly at any time. So the overall performance of the sequential NPsim matrix is better than that of the KD-tree and SR-tree for nearest neighbor search of high-dimensional data.

    Fig.4 Average searching time of KNN with different algorithms

    5 Conclusion

    Nearest neighbor search of high-dimensional data is the foundation of high-dimensional data processing. Problems in similarity measurement and index tree construction have affected the performance of traditional nearest neighbor search algorithms. The NPsim function and the sequential NPsim matrix are designed to solve these problems and are combined into a new nearest neighbor search algorithm. To validate the proposed algorithm, it is compared with traditional algorithms based on KD-tree and SR-tree in an experiment on the Munsell Color-Glossy set. The results show that the accuracy and searching speed of the proposed method are better than those of the other two methods.

    However, the construction speed of the sequential NPsim matrix is slower than that of the KD-tree and SR-tree. The reason is that the time complexity of constructing the sequential NPsim matrix is O(M2·n), while that of constructing the KD-tree or SR-tree is O(M·log2M·n). As shown in Section 3.2, the operations generating different rows of the sequential NPsim matrix are independent of each other, so they are easy to parallelize, whereas parallelizing the construction of an index tree is hard. The time complexity could thus be reduced from O(M2·n) to O(M·n) by virtue of parallelization. Therefore, parallelizing the construction of the sequential NPsim matrix is future work.
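The row-wise independence can be sketched as follows. This is a schematic with a hypothetical stand-in similarity (negative Manhattan distance in place of NPsim); a process pool or MPI would be used for real CPU parallelism, while a thread pool keeps the sketch portable:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def npsim_row(data, p):
    """One row of the similarity matrix; each row depends only on `data` and `p`,
    so rows can be computed independently of each other."""
    # hypothetical stand-in for NPsim: negative Manhattan distance
    return -np.abs(data - data[p]).sum(axis=1)

def build_matrix_parallel(data, workers=4):
    """Distribute the M row computations over a worker pool."""
    m = data.shape[0]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        rows = list(pool.map(lambda p: npsim_row(data, p), range(m)))
    return np.vstack(rows)
```

With M workers, each computes one row in O(M·n) time, which is the source of the O(M2·n) to O(M·n) reduction claimed above.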

    [1] Jegou H, Douze M, Schmid C. Product quantization for nearest neighbor search. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 33(1):117-128

    [2] Ericson K, Pallickara S. On the performance of high dimensional data clustering and classification algorithms. Future Generation Computer Systems, 2013, 29(4):1024-1034

    [3] Bellman R. Dynamic Programming. Princeton, New Jersey: Dover Publications Inc, 2010. 152-153

    [4] Andoni A, Indyk P. Near-optimal hashing algorithms for approximate nearest neighbor in high dimensions. Communications of the ACM, 2008, 51(1):117-122

    [5] Zhang D, Wang J, Cai D, et al. Self-taught hashing for fast similarity search. In: Proceedings of the ACM SIGIR 2010, New York: ACM Press, 2010. 18-25

    [6] Jagadish H V, Ooi B C, Tan K L, et al. iDistance: an adaptive B+-tree based indexing method for nearest neighbor search. ACM Transactions on Database Systems, 2005, 30(2):364-397

    [7] Heisterkamp D R, Peng J. Kernel vector approximation files for relevance feedback retrieval in large image databases. Multimedia Tools and Applications, 2005, 26(2):175-189

    [8] Yi L H. Research on Clustering Algorithm for High Dimensional Data: [Ph.D dissertation]. Qinhuangdao: Institute of Information Science and Engineering, Yanshan University, 2011. 28-30

    [9] Yang F Z, Zhu Y Y. An efficient method for similarity search on quantitative transaction data. Journal of Computer Research and Development, 2004, 41(2):361-368

    [10] Huang S D, Chen Q M. On clustering algorithm of high dimensional data based on similarity measurement. Computer Applications and Software, 2009, 26(9):102-105

    [11] Shao C S, Lou W, Yan L M. Optimization of algorithm of similarity measurement in high-dimensional data. Computer Technology and Development, 2011, 21(2):1-4

    [12] Wang X Y, Zhang H Y, Shen L Z, et al. Research on high dimensional clustering algorithm based on similarity measurement. Computer Technology and Development, 2013, 23(5):30-33

    [13] Jia X Y. A high dimensional data clustering algorithm based on twice similarity. Journal of Computer Applications, 2005, 25(B12):176-177

    [14] Hinneburg A, Aggarwal C C, Keim D A. What is the nearest neighbor in high dimensional spaces? In: Proceedings of the VLDB 2000, Birmingham: IEEE Computer Society, 2000. 506-515

    [15] Berkhin P. A survey of clustering data mining techniques. In: Grouping Multidimensional Data: Recent Advances in Clustering. Berlin: Springer-Verlag, 2006. 25-71

    [16] Nielsen F, Piro P, Barlaud M. Bregman vantage point trees for efficient nearest neighbor queries. In: Proceedings of the IEEE International Conference on Multimedia and Expo 2009. Birmingham: IEEE Computer Society, 2009. 878-881

    [17] Hetland M L. The basic principles of metric indexing. Studies in Computational Intelligence, 2009, 242:199-232

    [18] Kunze M, Weske M. Metric trees for efficient similarity search in large process model repositories. Lecture Notes in Business Information Processing, 2011, 66:535-546

    [19] Navarro G. Searching in metric spaces by spatial approximation. The VLDB Journal, 2002, 11(1):28-46

    [20] Aggarwal C C, Yu P S. The IGrid index: reversing the dimensionality curse for similarity indexing in high dimensional space. In: Proceedings of ACM SIGKDD 2000. New York: ACM Press, 2000. 119-129

    Li Wenfa, born in 1974. He received his Ph.D. degree from the Graduate University of Chinese Academy of Sciences in 2009, and his B.S. and M.S. degrees from PLA Information Engineering University in 1998 and 2003 respectively. His research interests include information security, data analysis and mining, etc.

    10.3772/j.issn.1006-6748.2016.03.002

    ①Supported by the National Natural Science Foundation of China (No. 61300078), the Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (No. CIT&TCD201504039), the Funding Project for Academic Human Resources Development in Beijing Union University (No. BPHR2014A03, Rk100201510), and the "New Start" Academic Research Projects of Beijing Union University (No. Hzk10201501).
