
High Technology Letters, 2016, No. 3

    A nearest neighbor search algorithm of high-dimensional data based on sequential NPsim matrix①

Li Wenfa (李文法)*, Wang Gongming**, Ma Nan*, Liu Hongzhe*

*To whom correspondence should be addressed. E-mail: liwenfa@buu.edu.cn
Received on Dec. 16, 2015

(*Beijing Key Laboratory of Information Service Engineering, Beijing Union University, Beijing 100101, P.R. China)
(**National Laboratory of Biomacromolecules, Institute of Biophysics, Chinese Academy of Sciences, Beijing 100101, P.R. China)

Problems exist in similarity measurement and index tree construction, which affect the performance of nearest neighbor search of high-dimensional data. The equidistance problem is solved by using the NPsim function to calculate similarity, and a sequential NPsim matrix is built to improve indexing performance. Summing up the above innovations, a nearest neighbor search algorithm of high-dimensional data based on the sequential NPsim matrix is proposed and compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. Experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms and its searching speed is more than a thousand times that of the others. In addition, the slow construction of the sequential NPsim matrix can be accelerated by parallel computing.

nearest neighbor search, high-dimensional data, similarity, indexing tree, NPsim, KD-tree, SR-tree, Munsell

    0 Introduction

The nearest neighbor search looks for the several points that are nearest to a given point[1], and is widely used in text clustering, recommendation systems, multimedia retrieval, sequence analysis, etc. Generally speaking, data whose dimensionality is more than 20 belongs to high-dimensional data[2]. Traditional nearest neighbor search algorithms may fail on high-dimensional data because of the curse of dimensionality[3]. Thus, the nearest neighbor search of high-dimensional data has become a challenging but useful issue in data mining. Currently, this issue has been researched to a certain extent. With the locality-sensitive hashing algorithm[4], a high-dimensional vector is mapped onto an address space such that previously similar points remain close to each other with high probability, which overcomes the equidistance of high-dimensional data. But its applicability is limited because it uses the same hash functions for all data and neglects data differences. To solve this problem, self-taught hashing (STH)[5] was proposed by Zhang, et al.: the similarity matrix is built first, and matrix decomposition and eigenvalue solution are carried out subsequently. But it has large time and space complexity. The iDistance[6] and vector approximation file (VA-File)[7] are suitable indexing structures; however, their query cost is very high.

In essence, similarity measurement and index tree construction affect the performance of nearest neighbor search of high-dimensional data, so solving the problems existing in these two aspects is very important. At present, most similarity measurement methods for high-dimensional data ignore the relative difference of properties, noise distribution, weight, and other factors, and are valid only for a small number of data types[8]. The Psim(X,Y) function considers the above factors[8] and is applicable to all kinds of data types. But it is unable to compare similarities under different dimensionalities because its range depends on the spatial dimensionality. Thus, the NPsim(X,Y) function is proposed to solve this problem by normalizing the range to [0,1]. The defects of index trees in construction and query are made up for by the sequential NPsim matrix, and this method is easy to parallelize. Assuming that the number of points is M and the dimensionality is n, the time complexity of building the sequential NPsim matrix is O(M²·n), which is reduced to O(M·n) after parallelization. The time complexity of nearest neighbor search is O(1). The algorithm is compared with the nearest neighbor search algorithms based on KD-tree and SR-tree on the Munsell spectral data set. The experimental results show that the similarity of the proposed algorithm is better than that of the other algorithms. The construction of the sequential NPsim matrix is time consuming, but its searching speed is more than a thousand times that of the others. In addition, the construction time of the sequential NPsim matrix can be reduced dramatically by parallelization. Thus, its overall performance is better than that of the others.

    1 Related work

In recent years, similarity measurement and index tree construction have been researched to a certain extent, but insufficiencies still exist.

To solve the problem in similarity measurement, the Hsim(X,Y) function[9] was proposed by Yang, which is better than the traditional methods but neglects relative difference and noise distribution. The Gsim(X,Y) function[10] was proposed according to the relative difference of properties in different dimensions, but weight discrepancy is ignored. The Close(X,Y) function[11], based on the monotone decrease of e^(-x), can overcome the influence of components in dimensions whose variances are larger, but relative difference is not considered, so it is affected by noise. The Esim(X,Y) function[12] was proposed by improving the Hsim(X,Y) and Close(X,Y) functions. In each dimension, the Esim(X,Y) component is positively correlated with the value in that dimension. All dimensions are divided into two parts, normal dimensions and noisy dimensions; in a noisy dimension, noise occupies the majority. When the noise is similar to or larger than the values in the normal dimensions, this method becomes invalid. The secondary measurement method[13] calculates similarity by virtue of property distribution, space distance, etc., but neglects noise distribution and weight; in addition, it is time-consuming. The projection nearest neighbor was proposed by Hinneburg[14] to solve the problem in higher-dimensional space by dimension reduction, but it is hard to find the right quality criterion function. In high-dimensional space, Yi found[8] that the difference in noisy dimensions is large no matter how similar the data are. This difference occupies a large share of the similarity calculation, which makes the distances between all points similar. Therefore, the Psim(X,Y) function[8] was proposed to eliminate the noisy influence by analyzing differences among all dimensions. Experimental results indicate that this method is suitable for all kinds of data types, but its range is [0, n], where n is the dimensionality, so similarities in different dimensionalities cannot be compared.

There are two kinds of index trees used in high-dimensional space: those based on vector space and those based on metric space. A typical example of the former is the R-tree[15]. It is a natural extension of the B-tree to high-dimensional space and can solve the data searching problem. However, the R-tree has the problems of sibling node overlap, multiple queries, low utilization, etc. Therefore, extensions of the R-tree have been proposed, such as the R+tree, R*tree, and cR-tree. A common structure of the latter kind is the VP-tree[16], which is a binary search tree suitable for large-scale data search. But it is a static tree that supports neither insertion nor deletion; in addition, its distance calculation is time-consuming. The MVP-tree[17] is an improvement of the VP-tree that decreases the cost of distance calculation, but its time complexity in the creation and query stages is higher than that of the VP-tree. The M-tree[18] is a B-tree-like metric index with high searching efficiency. However, it can only carry out single-value searching instead of range searching. The SA-tree[19] is created according to the distances between leaf nodes and the root node, but it is a completely static tree that supports neither insertion nor deletion.

    2 Key technology

    2.1 Similarity measurement

In n-dimensional space, the set S = {S1, S2, …, SM} is composed of M points Si = {Si1, Si2, …, Sij, …, Sin}, where i = 1, 2, …, M, j = 1, 2, …, n, and Sij is the jth property of Si. Assuming that X and Y are any two points in set S, Sim(X, Y) is the similarity between X and Y.

Sim(X, Y) is usually measured with a distance function; typical methods include the Manhattan distance, Euclidean distance, etc. However, with the increase of dimensionality, the nearest and farthest distances become nearly the same[2], so these methods are invalid in high-dimensional space. To solve this problem, several methods have been proposed, such as Hsim(X,Y), Gsim(X,Y), Close(X,Y), and Esim(X,Y), yet they are valid only for limited types of data[2]. Psim(X,Y) is suitable for a variety of data types, but its range depends on the spatial dimensionality, so it is unable to compare similarities under different dimensionalities. While maintaining its effect, Psim(X,Y) is updated as

    (1)

where Xi and Yi are the components in the ith dimension, and δ(Xi, Yi) is a discriminant function: if Xi and Yi are in the same interval [ni, mi], then δ(Xi, Yi) = 1; otherwise δ(Xi, Yi) = 0. E(X, Y) is the number of dimensions in which the components of X and Y fall into the same interval. It can be seen that the range of NPsim(X, Y) is [0, 1]. The above is an outline of NPsim; a detailed introduction can be found in Ref.[8].
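One form of NPsim consistent with the description above (this is a hedged reconstruction from the stated properties — range [0, 1], discriminant δ, matching-dimension count E — and not necessarily the paper's exact Eq.(1)) is:

```latex
\mathrm{NPsim}(X,Y)=\frac{1}{E(X,Y)}\sum_{i=1}^{n}\delta(X_i,Y_i)\left(1-\frac{|X_i-Y_i|}{m_i-n_i}\right)
```

Each summand lies in [0, 1] and at most E(X, Y) of them are nonzero, so the value stays in [0, 1].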

To validate this method, records of dimensionality 10, 30, 50, 100, 150, 200, 250, 300, 350, and 400 are generated with the normrnd() function of Matlab, with 1000 records per dimensionality. After that, the relative difference between the farthest and nearest neighbors is calculated with

RelativeDifference_n = (Dmax_n - Dmin_n) / Davg_n    (2)

where Dmax_n, Dmin_n, and Davg_n are the maximal, minimal, and average similarities in n-dimensional space, respectively[20].
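The experiment behind Eq.(2) can be sketched as follows (a Python/NumPy illustration with a smaller sample than the paper's 1000 records per dimensionality; the paper itself used Matlab's normrnd()):

```python
import numpy as np

def relative_difference(points):
    """Eq.(2)-style relative difference between the farthest and nearest
    neighbors, averaged over all query points: (Dmax - Dmin) / Davg."""
    # pairwise Euclidean distance matrix
    diff = points[:, None, :] - points[None, :, :]
    d = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(d, np.nan)          # exclude each point's self-distance
    dmax = np.nanmax(d, axis=1)
    dmin = np.nanmin(d, axis=1)
    davg = np.nanmean(d, axis=1)
    return float(np.mean((dmax - dmin) / davg))

rng = np.random.default_rng(0)
rd_10d = relative_difference(rng.normal(size=(100, 10)))    # 10 dimensions
rd_400d = relative_difference(rng.normal(size=(100, 400)))  # 400 dimensions
# The relative difference shrinks as dimensionality grows, which is the
# equidistance phenomenon the NPsim function is designed to avoid.
```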

According to the characteristics of the results, the similarity measurement methods are divided into two kinds. The first kind includes the Manhattan distance, Euclidean distance, Hsim(X,Y), Gsim(X,Y), Close(X,Y), and Esim(X,Y); the second includes Psim(X,Y) and NPsim(X,Y). The result is shown in Fig.1. It can be seen that the relative difference of the second kind of methods is two or three orders of magnitude larger than that of the first. Therefore, the performance advantage of the second kind of methods is obvious.

    Fig.1 Relative difference of various similarity measurement methods

The numbers of Psim(X, Y) ≥ 1 in different dimensionalities are shown in Table 1. The number of Psim(X, Y) values in every dimensionality is 1000×1000 = 1000000. Thus, 6%~15% of the results are greater than or equal to 1, which makes similarities in different dimensionalities incomparable. This problem does not exist in the NPsim(X, Y) function.

Table 1 Number of Psim(X, Y) ≥ 1 in different dimensions

2.2 K nearest neighbor search

For any point St in set S, 1 ≤ t ≤ M, search for the set Rt ⊆ S composed of k points, which meets the following requirement:

∀r ∈ Rt, NPsim(St, r) ≥ max{NPsim(St, s) | s ∈ S ∧ s ∉ Rt}

Rt is the K nearest neighbor (KNN) set of St. The process of generating Rt is called KNN search.
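The definition above amounts to selecting the k points with the largest NPsim values. A minimal sketch of this selection (a hypothetical Python helper, not the paper's implementation), given one row of a precomputed similarity matrix:

```python
import numpy as np

def knn_by_similarity(sim_row, t, k):
    """Return the indices of the k points most similar to point t,
    per the definition of Rt; the point itself is excluded."""
    order = np.argsort(-sim_row)                 # descending similarity
    return [int(j) for j in order if j != t][:k]

sim = np.array([[1.0, 0.8, 0.2, 0.5],
                [0.8, 1.0, 0.4, 0.3],
                [0.2, 0.4, 1.0, 0.9],
                [0.5, 0.3, 0.9, 1.0]])
print(knn_by_similarity(sim[0], t=0, k=2))  # -> [1, 3]
```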

    3 Nearest neighbor search algorithm

    3.1 Whole framework

The whole framework is shown in Fig.2. First of all, the NPsim matrix is generated. After that, the sequential NPsim matrix is produced by sorting the elements in each row of the NPsim matrix in descending order.

    Fig.2 Whole framework of the proposed algorithm

    3.2 Execution flow

    1) Construction of NPsim matrix

    The NPsim matrix is generated with the following steps.

Step 1 The M points Si, i = 1, 2, …, M, are stored in the M×n matrix DataSet.

Step 2 The elements in every column of DataSet are sorted in ascending order to generate the matrix SortDataSet.

Step 3 The elements in each column of SortDataSet are divided into G = ⌈θ·n⌉ intervals. Thus, the number of elements in every interval is T = ⌈M/G⌉. Meanwhile, the endpoints of every interval are saved into the matrix FirCutBound.

Step 4 The interval number of each element of DataSet is determined according to the matrix FirCutBound and saved into the interval number matrix FirNumSet.

Step 5 The M×M matrix SameDimNum is generated. For any two points Sp and Sq, the number of dimensions in which the components of Sp and Sq fall into the same interval is calculated and saved into the matrix element SameDimNum[p][q].

Step 6 The matrix SortDataSet is divided along the columns again. The number of intervals is G′ = G − 1 and there are T′ = ⌈M/G′⌉ elements in each interval. After that, the endpoints of every interval are saved into the matrix SecCutBound.

Step 7 The interval number matrix SecNumSet is produced in the same way as in Step 4.

Step 8 The matrix SameDimNum is updated. For any two points Sp and Sq, if the interval numbers of their components in one dimension are different in Step 3 but the same in Step 7, then SameDimNum[p][q] = SameDimNum[p][q] + 1.

Step 9 The M×M matrix NPsimMat is built according to the results from Step 3 to Step 8. The NPsim information of Sp and Sq (1 ≤ p, q ≤ M) is stored in NPsimMat[p][q], which includes three parts: the subscripts p and q, and the value NPsim(Sp, Sq).
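Steps 2–5 above can be sketched as follows (a simplified Python illustration under the assumption of rank-based, equal-population intervals; the variable names mirror the matrices above but the code is not the paper's implementation):

```python
import numpy as np

def interval_numbers(DataSet, G):
    """Steps 2-4: per column, split the sorted values into G intervals of
    (roughly) equal population and record each element's interval number."""
    M = DataSet.shape[0]
    T = int(np.ceil(M / G))                           # elements per interval
    ranks = np.argsort(np.argsort(DataSet, axis=0), axis=0)  # column ranks
    return ranks // T                                 # interval number matrix

def same_dim_num(DataSet, G):
    """Step 5: SameDimNum[p][q] counts the dimensions in which the
    components of Sp and Sq fall into the same interval."""
    FirNumSet = interval_numbers(DataSet, G)
    eq = FirNumSet[:, None, :] == FirNumSet[None, :, :]
    return eq.sum(axis=-1)

rng = np.random.default_rng(1)
DataSet = rng.normal(size=(8, 5))                     # M = 8 points, n = 5
SameDimNum = same_dim_num(DataSet, G=2)
```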

    2) Sorting for NPsim matrix

    The sequential NPsim matrix is produced with the following steps.

Step 1 The M×M matrix SortNPsimMat is produced as a copy of NPsimMat.

Step 2 The elements in each row of SortNPsimMat are sorted in descending order of NPsim value.

With the increase of the column number, the elements in the pth row become smaller and smaller, which means that the distance between Sp and the corresponding point becomes farther and farther.

    3) Nearest neighbor search

The KNN of Si is found as follows.

Step 1 The first K elements in the ith row are selected.

Step 2 The points among them different from Si are the expected result.
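The sorting and lookup steps can be sketched together (illustrative Python, not the paper's code; a cheap argsort stands in for the per-row sort of Section 3.2):

```python
import numpy as np

def build_sequential_matrix(NPsimMat):
    """Sort the column indices of each row by descending NPsim value."""
    return np.argsort(-NPsimMat, axis=1)

def knn_search(SortNPsimMat, i, k):
    """Take the first K entries of row i, dropping the point itself,
    so the lookup touches at most K+1 precomputed entries."""
    head = SortNPsimMat[i, :k + 1]
    return [int(j) for j in head if j != i][:k]

NPsimMat = np.array([[1.0, 0.7, 0.9],
                     [0.7, 1.0, 0.2],
                     [0.9, 0.2, 1.0]])
seq = build_sequential_matrix(NPsimMat)
print(knn_search(seq, i=0, k=2))  # -> [2, 1]
```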

    3.3 Time complexity analysis

This algorithm is separated into two stages: construction of the sequential NPsim matrix and KNN searching. There are two steps in the first stage; their time complexity is analyzed as follows.

    (1) Construction of NPsim matrix

In this step, four kinds of matrices are produced. The first is SortDataSet, which is produced by sorting the elements in every column; its time complexity is O(MlogM·n). The second kind includes FirCutBound and SecCutBound, which are generated by visiting all elements of DataSet; thus the time complexity is O(M·n). The third kind includes FirNumSet and SecNumSet, which are obtained by locating the interval number of each element; the corresponding time complexity is O(M²·n). The fourth is SameDimNum, which is produced by comparing the elements dimension by dimension; the time complexity of this operation is O(M²·n). Finally, the NPsim components are calculated and summed up into the whole NPsim value.

    (2) Sorting for NPsim matrix

The elements in every row of NPsimMat are sorted in descending order of NPsim. Its time complexity is O(M·nlogn).

Summing up the above analysis, the time complexity of the construction stage is O(M²·n) + O(M·nlogn) = O(M²·n).

In the course of searching, only the first K elements in the ith row are visited. Thus, the corresponding time complexity is O(1).

    4 Experiment

The proposed algorithm includes two stages (construction and searching), which must also be present in the algorithms selected for comparison. For the nearest neighbor search algorithm based on KD-tree, the KD-tree is built first, and searching is carried out subsequently; the nearest neighbor search algorithm based on SR-tree is similar. Thus, the above two algorithms are selected for the following experiment.

    4.1 Data introduction

The Munsell Color-Glossy set was proposed by the American colorist Munsell and has been revised repeatedly by the American National Standards Institute (ANSI) and the Optical Society. It is one of the standard color sets and includes 1600 colors, each of which is represented as HV/C, where H, V, and C are abbreviations of hue, brightness, and saturation respectively.

The spectral reflectances of all colors in the Munsell Color-Glossy set are downloaded from the spectral color research group (http://www.uef.fi/fi/spectral/home). Each color is a vector containing 401 spectral reflectance values at different wavelengths, which is regarded as high-dimensional data.

    4.2 Overview

First of all, the running times of constructing the sequential NPsim matrix, the KD-tree, and the SR-tree are measured. After that, KNN search for a given point in the Munsell color cube is carried out. The locations of the given point and its neighbors must be close or contiguous.

Assuming that HV/C and HKVK/CK are the given point and a corresponding neighbor respectively, the Munsell distance between them is

Distance = |HK − H| + |VK − V| + |CK − C|    (3)
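Eq.(3) as a small helper (a Python sketch; it assumes hue has already been mapped to a numeric scale, which the paper does not spell out):

```python
def munsell_distance(given, neighbor):
    """Eq.(3): |Hk - H| + |Vk - V| + |Ck - C| for colors expressed
    as numeric (H, V, C) triples."""
    h, v, c = given
    hk, vk, ck = neighbor
    return abs(hk - h) + abs(vk - v) + abs(ck - c)

# e.g. 5BG3/2 against 10BG3/2, using the hue digits as numbers
# (a simplifying assumption of this sketch)
print(munsell_distance((5, 3, 2), (10, 3, 2)))  # -> 5
```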

On one hand, the neighbor colors found by the three algorithms are compared. On the other hand, with the increase of K, the construction and searching times of the different algorithms are measured and analyzed.

    4.3 Result

The proposed algorithm and the traditional algorithms based on KD-tree and SR-tree are implemented in the experiment, and the results are compared in terms of accuracy and speed. No parallel strategy is used in the following experiments. The hardware includes an AMD Athlon(tm) II X2-250 processor and 4GB of Kingston memory; the software environment is the Windows XP operating system with Microsoft Visual Studio 2008.

    1) Accuracy analysis

The Munsell color 5BG3/2, shown in Fig.3, is selected for KNN search. The search results for K = 6 are shown in Tables 2, 3, and 4. In Table 2, the KNN distance is the NPsim value, while in Tables 3 and 4 it is the Euclidean distance. It can be seen that the colors found by the proposed method are closer to the given color, while some found by the other methods differ obviously from 5BG3/2, such as 5B3/4 in Table 3 and 7.5BG4/6 in Table 4. In addition, the Munsell distance of the proposed method is less than that of the others. In some cases, the Munsell distances of predecessors, nearest neighbors, and successors are not in ascending order, which is called the reverse phenomenon. The 10BG3/2 in Table 2, 5B3/4 in Table 3, and 7.5BG4/6 in Table 4 are typical examples. The numbers of nearest neighbors with the reverse phenomenon in Tables 2, 3, and 4 are 2, 4, and 4 respectively, which indicates that the stability of the proposed method is better than that of the others.

    Fig.3 Color of 5BG3/2

    Table 3 KNN result of KD-tree algorithm

    Table 4 KNN result of SR-tree algorithm

    2) Speed analysis

The construction times of the indexing structures are shown in Table 5. It can be seen that the construction time of the sequential NPsim matrix is about ten times that of the KD-tree or SR-tree.

    Table 5 Construction time of different indexing structures

With the increase of the K value, the average searching times for the KNN of 1000 selected Munsell colors are shown in Fig.4. The experimental results indicate that the searching time of the proposed method is on the order of 10^-6, while that of the other methods is on the order of 10^-2. That is to say, the searching speed of the proposed method is more than a thousand times that of the others.

Although the construction speed of the sequential NPsim matrix is slow, its searching speed is fast. Moreover, the sequential NPsim matrix can be stored on disk and loaded at high speed at any time. Therefore, the performance of the sequential NPsim matrix is better than that of the KD-tree and SR-tree for nearest neighbor search of high-dimensional data.

    Fig.4 Average searching time of KNN with different algorithms

    5 Conclusion

The nearest neighbor search of high-dimensional data is the foundation of high-dimensional data processing. The problems existing in similarity measurement and index tree construction have affected the performance of traditional nearest neighbor search algorithms. The NPsim function and the sequential NPsim matrix are designed to solve these problems, and they are combined into a new nearest neighbor search algorithm. To validate the proposed algorithm, the traditional algorithms based on KD-tree and SR-tree are compared with it in experiments on the Munsell Color-Glossy set. The results show that the accuracy and searching speed of the proposed method are better than those of the two other methods.

However, the construction speed of the sequential NPsim matrix is slower than that of the KD-tree and SR-tree. The reason is that the time complexity of constructing the sequential NPsim matrix is O(M²·n), while those of constructing the KD-tree and SR-tree are both O(M·log₂M·n). As described in Section 3.2, the operations generating different rows of the sequential NPsim matrix are independent of each other, so they are easy to parallelize, whereas parallelizing the construction of an index tree is hard. Therefore, the time complexity could be reduced from O(M²·n) to O(M·n) by virtue of parallelization. Applying parallelization to the construction of the sequential NPsim matrix is thus future work.
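Because each row of the NPsim matrix depends only on the shared, preprocessed data, the M row computations can be distributed across workers. A sketch of that row-parallel scheme (Python threads with NumPy; a cosine similarity stands in for NPsim itself, and the pool size is an arbitrary choice):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def similarity_row(data, i):
    """One row of a similarity matrix. Rows are mutually independent,
    so they can be computed concurrently; cosine similarity is used
    here as a stand-in for NPsim."""
    x = data[i]
    norms = np.linalg.norm(data, axis=1) * np.linalg.norm(x)
    return data @ x / np.maximum(norms, 1e-12)

def parallel_similarity_matrix(data, workers=4):
    """Compute all rows with a worker pool and stack them."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        rows = pool.map(lambda i: similarity_row(data, i),
                        range(len(data)))
    return np.vstack(list(rows))

rng = np.random.default_rng(2)
X = rng.normal(size=(16, 8))
S = parallel_similarity_matrix(X)
```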

References

[1] Jegou H, Douze M, et al. Product quantization for nearest neighbor search. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2010, 33(1): 117-128

[2] Ericson K, Pallickara S. On the performance of high dimensional data clustering and classification algorithms. Future Generation Computer Systems, 2013, 29(4): 1024-1034

[3] Bellman R. Dynamic Programming. Princeton, New Jersey: Dover Publications Inc, 2010. 152-153

[4] Andoni A, Indyk P. Near-optimal hashing algorithms for approximate nearest neighbor in high dimensions. Communications of the ACM, 2008, 51(1): 117-122

[5] Zhang D, Wang J, Cai D, et al. Self-taught hashing for fast similarity search. In: Proceedings of the ACM SIGIR 2010. New York: ACM Press, 2010. 18-25

[6] Jagadish H V, Ooi B C, Tan K L, et al. iDistance: an adaptive B+-tree based indexing method for nearest neighbor search. ACM Transactions on Database Systems, 2005, 30(2): 364-397

[7] Heisterkamp D R, Peng J. Kernel vector approximation files for relevance feedback retrieval in large image databases. Multimedia Tools and Applications, 2005, 26(2): 175-189

[8] Yi L H. Research on Clustering Algorithm for High Dimensional Data: [Ph.D dissertation]. Qinhuangdao: Institute of Information Science and Engineering, Yanshan University, 2011. 28-30

[9] Yang F Z, Zhu Y Y. An efficient method for similarity search on quantitative transaction data. Journal of Computer Research and Development, 2004, 41(2): 361-368

[10] Huang S D, Chen Q M. On clustering algorithm of high dimensional data based on similarity measurement. Computer Applications and Software, 2009, 26(9): 102-105

[11] Shao C S, Lou W, Yan L M. Optimization of algorithm of similarity measurement in high-dimensional data. Computer Technology and Development, 2011, 21(2): 1-4

[12] Wang X Y, Zhang H Y, Shen L Z, et al. Research on high dimensional clustering algorithm based on similarity measurement. Computer Technology and Development, 2013, 23(5): 30-33

[13] Jia X Y. A high dimensional data clustering algorithm based on twice similarity. Journal of Computer Applications, 2005, 25(B12): 176-177

[14] Alexander H, Charu A C, Keim D A. What is the nearest neighbor in high dimensional spaces. In: Proceedings of the VLDB 2000. Birmingham: IEEE Computer Society, 2000. 506-515

[15] Berkhin P. A survey of clustering data mining techniques. In: Grouping Multidimensional Data: Recent Advances in Clustering. Berlin: Springer-Verlag, 2006. 25-71

[16] Nielsen F, Piro P, Barlaud M. Bregman vantage point trees for efficient nearest neighbor queries. In: Proceedings of the IEEE International Conference on Multimedia and Expo 2009. Birmingham: IEEE Computer Society, 2009. 878-881

[17] Hetland M L. The basic principles of metric indexing. Studies in Computational Intelligence, 2009, 242: 199-232

[18] Kunze M, Weske M. Metric trees for efficient similarity search in large process model repositories. Lecture Notes in Business Information Processing, 2011, 66: 535-546

[19] Navarro G. Searching in metric spaces by spatial approximation. The VLDB Journal, 2002, 11(1): 28-46

[20] Aggarwal C C, Yu P S. The IGrid index: Reversing the dimensionality curse for similarity indexing in high dimensional space. In: Proceedings of ACM SIGKDD 2000. New York: ACM Press, 2000. 119-129

Li Wenfa, born in 1974. He received his Ph.D. degree from the Graduate University of the Chinese Academy of Sciences in 2009. He also received his B.S. and M.S. degrees from PLA Information Engineering University in 1998 and 2003 respectively. His research interests include information security, data analysis and mining, etc.

    10.3772/j.issn.1006-6748.2016.03.002

① Supported by the National Natural Science Foundation of China (No. 61300078), the Importation and Development of High-Caliber Talents Project of Beijing Municipal Institutions (No. CIT&TCD201504039), the Funding Project for Academic Human Resources Development in Beijing Union University (No. BPHR2014A03, Rk100201510), and the "New Start" Academic Research Projects of Beijing Union University (No. Hzk10201501).
