
    Multi-Label Learning Based on Transfer Learning and Label Correlation

    2019-11-07 03:12:24
    Computers, Materials & Continua, 2019, Issue 10

    Kehua Yang, Chaowei She, Wei Zhang, Jiqing Yao and Shaosong Long

    Abstract: In recent years, multi-label learning has received a lot of attention. However, most existing methods consider only global label correlation or only local label correlation. In fact, global and local label correlations can both appear in real-world situations at the same time. Moreover, we should not restrict ourselves to pairwise labels while ignoring high-order label correlation. In this paper, we propose a novel and effective method called GLLCBN for multi-label learning. Firstly, we obtain the global label correlation by exploiting label semantic similarity. Then, we analyze the pairwise labels in the label space of the data set to acquire the local correlation. Next, we build an initial label dependency model from the global and local label correlations. After that, we use graph theory, probability theory and Bayesian networks to eliminate redundant dependency structure in the initial model, so as to obtain the optimal label dependency model. Finally, we obtain a feature extraction model by adjusting the Inception V3 convolutional neural network and combine it with the GLLCBN model to achieve multi-label learning. The experimental results show that our proposed model performs better than other multi-label learning methods on the evaluation metrics.

    Keywords: Bayesian networks, multi-label learning, global and local label correlations, transfer learning.

    1 Introduction

    Nowadays, we live in an information age. An instance often cannot be described by a single label, so it is frequently associated with more than one class label. For example, an image can be annotated with several labels [Su, Chou, Lin et al. (2011)], a piece of music can belong to many genres [Turnbull, Barrington, Torres et al. (2008)], and a text can reflect different themes [Ueda and Saito (2002)]. Therefore, multi-label classification has attracted more and more researchers.

    Multi-label learning algorithms fall into two categories [Zhang and Zhou (2007)]: problem transformation and algorithm adaptation. Problem transformation is a straightforward approach whose main idea is to convert a multi-label problem into one or more traditional single-label problems. Representative algorithms include Binary Relevance (BR) [Boutell, Luo, Shen et al. (2004)], Pruned Problem Transformation (PPT) [Read, Pfahringer and Holmes (2009)] and so on. Algorithm adaptation is an adaptive approach whose main idea is to adapt a single-label classification algorithm to multi-label classification. Classic algorithms include the C4.5 decision tree [Quinlan (1992)], Multi-label Dimensionality reduction via Dependence Maximization (MDDM) [Zhang and Zhou (2010)], Multi-label Informed Latent Semantic Indexing (MLSI) [Yu, Yu and Tresp (2005)] and so on.

    Label correlation can provide important information for multi-label classification. For example, “blue sky” and “white cloud” are highly co-occurring labels, while “sunny” and “black clouds” are highly exclusive labels. If “ocean” and “sailboat” appear at the same time, it is highly likely that the “fish” label will also be present, while the “desert” label will not. However, most existing methods mainly focus on globally shared label characteristics and ignore the label correlations within local subsets of the data. For example, “Jack Ma” is associated with “Alibaba” in an IT company data set [Liu, Peng and Wang (2018)], but this association is weak at the global level. Therefore, according to the above analysis, it is more practical and comprehensive to consider both global and local label correlations in multi-label classification.

    Each instance in multi-label learning has multi-dimensional label characteristics. If instances are annotated purely by manual labeling, humans may sometimes omit labels they do not know or have little interest in, or follow the guidance of some algorithm to reduce labeling costs [Huang, Chen and Zhou (2015)]. Some labels may therefore be missing from the training set, which is a form of weakly supervised learning, and subjectivity in the labels is unavoidable. As a result, missing labels in the data set lead to label imbalance, which makes it more difficult to estimate label correlations and can negatively impact performance.

    In this paper, we propose a novel and effective method called “Bayesian Networks with Global and Local Label Correlation” (GLLCBN). The main idea of GLLCBN is to use the global label semantic correlation and the local label correlation of the data set to balance label correlation and reduce the impact of label noise. First of all, the probability of each individual label is obtained by analyzing the data set. Similarly, we obtain the conditional probabilities between pairwise labels from the data set. Then, the global label correlation matrix is constructed from label semantic similarity. After that, according to the probability information from the first three steps, an initial Bayesian network topology is constructed to capture high-order label correlation. In addition, redundant edges (label correlations) in the network structure are removed using graph theory and probability theory. Subsequently, the GLLCBN model is constructed. Finally, an initial label prediction is obtained by adjusting and training the Inception V3 model via transfer learning, and the prediction result is then combined with GLLCBN to achieve multi-label classification.

    The remainder of this paper is organized as follows. Section 2 introduces related work on multi-label learning. Section 3 presents our proposed algorithm in detail. Section 4 reports experiments that verify the performance of the proposed method. Finally, conclusions and future work are given in Section 5.

    2 Related work

    In recent years, multi-label learning has been extensively studied and many methods have been proposed [Zhang and Zhou (2007)]. In addition, the role of label correlation has gradually become a focus for researchers. Methods can be divided into three categories according to the order of label correlation they exploit [Zhang and Zhang (2010)].

    First-order methods convert a multi-label classification problem into multiple independent binary classifiers. For example, the classic BR [Boutell, Luo, Shen et al. (2004)] trains a separate classifier for each label independently. The advantage of this approach is its simplicity, but it ignores label correlation. Second-order methods exploit the correlation between pairs of labels. For example, CLR [Brinker (2008)] converts the multi-label classification problem by analyzing pairwise label correlations and establishing a label ranking. The advantage of this approach is that it considers internal pairwise label correlations, which brings some improvement in effectiveness. However, multi-label problems generally have high label dimensionality, and we should not restrict ourselves to pairwise labels only; therefore, high-order methods were proposed. High-order methods analyze correlations among higher-order combinations of labels and are not limited to pairwise labels. For example, MLLRC [Huang and Zhou (2012)] solves the multi-label classification problem by exploiting properties of the matrix rank. The advantage of high-order methods is that they extract the intrinsic connections among labels and strengthen label dependencies, but label correlation analysis is more difficult and the label correlation structure is more complicated.

    Labeling an instance may lead to label imbalance due to subjectivity. For example, the actual labels for the image in Fig. 1 should include “bull”, “mountain” and “road”. With manual labeling, on the one hand, the picture on the left may be annotated in the order “cattle”, “mountain”, “road”; on the other hand, the picture on the right may be annotated as “mountain”, “road”, “bull”. Sometimes the “bull” label is even lost because of visual effects. GLOCAL [Zhu, Kwok and Zhou (2017)] indicated that missing labels and label order are influential factors for multi-label classification.

    Figure 1:Image annotation

    In summary, the study of multi-label classification should consider not only global label correlation but also local label correlation, so that a more balanced and comprehensive label correlation can be obtained.

    3 The proposed approach

    In this section, the proposed approach GLLCBN is presented in detail. Firstly, we give the model preliminaries and analyze global and local label correlations to obtain the GLLCBN model. Secondly, we combine the Inception V3 model [Szegedy, Vanhoucke, Ioffe et al. (2016)], fine-tuned by transfer learning, with GLLCBN to achieve multi-label classification.

    3.1 Preliminaries

    Since multi-label classification involves a high-dimensional label space, which is what distinguishes it from single-label classification, we introduce the following notation. Let D = R^n be the n-dimensional sample space and L = {l1, l2, ..., lm} be the label set, where m is the number of labels in the data set. On the one hand, the correspondence between data set instances and sample labels is defined as Q = {(Ni, Mi) | i = 1, 2, ..., n}, where n represents the total number of instances and Ni ∈ D is an n-dimensional feature vector representing a sample instance. On the other hand, we denote M = [M1, M2, ..., Mn]^T as the sample label matrix, where Mi = [mi1, mi2, ..., mim] is the label vector associated with Ni. Each element mij = 1 if the i-th instance has the j-th label, and mij = 0 otherwise.

    3.2 Label correlation

    Label correlation contains potentially important information for the multi-label classification problem, so analyzing label correlation is an essential part of our approach [Punera, Rajan and Ghosh (2005)]. However, this analysis presents certain difficulties, and how to address them has become a new research direction. In order to analyze label correlation more reasonably and comprehensively, we consider both the local correlation within the data set and the global correlation of the label semantics.

    3.2.1 Local label correlation

    We derive local label correlation from the data set. Since the data in the data set are random, the occurrence probabilities of different labels are not the same. Accordingly, we denote P = [p(l1), p(l2), ..., p(lm)] as the occurrence probabilities of the labels, where m is the total number of labels and p(li) is the probability of the i-th label in the data set. Since label correlation is at least second-order, we also need the conditional probabilities between pairwise labels. The local label correlation is defined as:

    p(li|lj) = T(Nli|lj) / T(Nlj)        (1)

    where li and lj are individual labels in the label set, i, j ∈ {1, ..., m}. We denote T(Nlj) as the number of sample instances carrying the label lj; to avoid an ill-defined expression, if T(Nlj) equals 0 then p(li|lj) is also set to 0. Similarly, T(Nli|lj) represents the number of sample instances that carry both labels li and lj. In addition, p(li|lj) and p(lj|li) are the probabilities of the pairwise label correlations. It is important to note that pairwise label correlation is not a symmetric relationship, which is expressed as:

    p(li|lj) ≠ p(lj|li)        (2)

    For example, consider the data set shown in Tab. 1:

    Table 1:Data set

    According to the above table, p(lA|lB) ≠ p(lB|lA), so Eq. (2) holds.
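    To make the counting definitions above concrete, the following Python sketch (an illustration, not code from the paper) estimates the asymmetric pairwise conditional probabilities p(li|lj) from a binary label matrix; the toy matrix and variable names are assumptions.

```python
import numpy as np

def pairwise_label_probs(M):
    """Estimate p(l_i | l_j) from a binary label matrix M of shape (n_samples, m_labels).

    Follows the counting definition above: p(l_i | l_j) = T(N_{li,lj}) / T(N_{lj}),
    with the convention that the probability is 0 when T(N_{lj}) = 0.
    """
    M = np.asarray(M, dtype=float)
    label_counts = M.sum(axis=0)            # T(N_{lj}) for each label j
    joint_counts = M.T @ M                  # joint_counts[i, j] = T(N_{li,lj})
    with np.errstate(divide="ignore", invalid="ignore"):
        cond = np.where(label_counts > 0, joint_counts / label_counts, 0.0)
    return cond                             # cond[i, j] approximates p(l_i | l_j)

# Toy example with labels A, B, C: note the asymmetry p(l_A | l_B) != p(l_B | l_A).
M = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 0],
              [1, 0, 1]])
cond = pairwise_label_probs(M)
print(cond[0, 1], cond[1, 0])   # p(l_A | l_B) = 2/3, p(l_B | l_A) = 1/2
```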

    3.2.2 Global label correlation

    We obtain global correlation by analyzing word similarity. At present, correlation between words is mainly judged from the contextual semantics of the words: word vectors are used to measure the correlation between two words, and Word2vec [Mikolov, Chen, Corrado et al. (2013)] is a classic algorithm for this. For example, the word pairs “man”/“woman” and “woman”/“beautiful” are highly relevant because they are used in similar contexts; the word “man” in a sentence can often be replaced by the word “woman”. Therefore, we define W = [W1, W2, ..., Wm]^T ∈ [0,1]^(m×m) as the word correlation matrix, where W1 = [w(l1|l1), w(l1|l2), ..., w(l1|lm)] is a vector of pairwise word correlations and w(li|lj) is the word correlation probability between labels li and lj. The process is defined as:

    As shown in Eq. (3), each label is perfectly correlated with itself, so the corresponding value is 1; a small value means the label correlation is low, and a large value means it is high.
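    As a rough illustration of this step (not the paper's exact procedure), the sketch below fills a global correlation matrix W in [0, 1] from cosine similarities between word vectors of the label names, with the diagonal fixed to 1 as described above; the embedding dictionary label_vectors is hypothetical and would normally come from a pre-trained Word2vec model.

```python
import numpy as np

def global_label_correlation(labels, label_vectors):
    """Build an m x m global correlation matrix from word embeddings of the label names.

    label_vectors: dict mapping label name -> 1-D embedding (e.g., from a Word2vec model).
    Cosine similarity is rescaled from [-1, 1] to [0, 1]; the diagonal is 1 by definition.
    """
    m = len(labels)
    W = np.eye(m)
    for i in range(m):
        for j in range(m):
            if i == j:
                continue
            vi, vj = label_vectors[labels[i]], label_vectors[labels[j]]
            cos = np.dot(vi, vj) / (np.linalg.norm(vi) * np.linalg.norm(vj))
            W[i, j] = (cos + 1.0) / 2.0   # map cosine similarity into [0, 1]
    return W

# Usage with toy 3-dimensional embeddings (real ones would come from a Word2vec model).
labels = ["ocean", "sunset", "desert"]
label_vectors = {"ocean":  np.array([0.9, 0.1, 0.0]),
                 "sunset": np.array([0.4, 0.8, 0.1]),
                 "desert": np.array([0.0, 0.2, 0.9])}
print(np.round(global_label_correlation(labels, label_vectors), 2))
```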

    3.3 GLLCBN model

    According to the analysis in Section 3.2, we have obtained the global and local label correlations. The relationship between them is defined in Eq. (4):

    where li and lj are pairwise labels, and λ1, λ2 ∈ [0,1] with λ1 + λ2 = 1 are trade-off parameters that control the weight between the global and local label correlations. E(li|lj) is then the comprehensive label correlation.

    Obtaining the pairwise correlations E(li|lj) alone is not enough, because the global and local label correlations yield a cyclic relationship between pairwise labels: the labels li and lj are connected by both E(li|lj) and E(lj|li). When such a symmetric relationship occurs, the dependency becomes ambiguous, since it is impossible to determine which of the two labels depends more strongly on the other. Therefore, in order to eliminate these ambiguous pairwise dependencies, we define Eq. (5):

    where li and lj are pairwise labels, and lx1|lx2 denotes the finalized label dependency (the direction retained between the pair).

    For example, consider the structure in Fig. 2, where a circle represents a label (e.g., A, B) and the edges between circles represent the probabilities of the pairwise label correlations (e.g., E(lB|lA), E(lA|lB)).

    Figure 2:Correlation between label A and B

    According to Eq. (5), since the correlation follows the principle of taking the maximum value, the structure in Fig. 2 should be optimized into the structure in Fig. 3, which shows that label A is more dependent on label B.

    Figure 3:Optimized label A and B correlation
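    To make the combination and direction-selection steps concrete, here is a small Python sketch written under two assumptions that the recovered equations do not confirm: that Eq. (4) is the weighted combination E(li|lj) = λ1·p(li|lj) + λ2·w(li|lj), and that Eq. (5) keeps, for each label pair, only the direction with the larger E value.

```python
import numpy as np

def combine_correlations(P_local, W_global, lam1=0.5, lam2=0.5):
    """Comprehensive correlation E[i, j] ~ E(l_i | l_j), assumed here to be a weighted
    combination of local conditional probabilities and global semantic correlations."""
    return lam1 * P_local + lam2 * W_global

def directed_edges(E):
    """For every unordered pair {i, j}, keep only the stronger direction.

    Returns (child, parent, strength) triples, i.e. an edge j -> i is kept when
    E(l_i | l_j) >= E(l_j | l_i), mirroring the 'maximum value' rule above."""
    m = E.shape[0]
    edges = []
    for i in range(m):
        for j in range(i + 1, m):
            if E[i, j] >= E[j, i]:
                edges.append((i, j, E[i, j]))   # l_i depends on l_j
            else:
                edges.append((j, i, E[j, i]))   # l_j depends on l_i
    return edges

# Usage with the matrices from the earlier sketches (cond from 3.2.1, W from 3.2.2):
# E = combine_correlations(cond, W, lam1=0.6, lam2=0.4)
# print(directed_edges(E))
```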

    As shown in Fig. 4, it is worth noting that one label may reach another through multiple paths. Eq. (6) is used to determine the label dependencies when multiple reachable paths exist.

    Figure 4:Multiple reachable paths between label

    Eq. (6) can be rewritten in the equivalent form shown in Eq. (7):

    where k1, k2, ... are the intermediate label nodes on the path from label li to lj. The specific proof of Eq. (7) is as follows:

    Proof. First, consider Fig. 4. On the one hand, N(A), N(B), N(C), N(A,B), N(A,C), N(B,C) and N(A,B,C) are numbers of sample instances. On the other hand, p(lC|lA), p(lB|lA) and p(lC|lB) are the probabilities of the pairwise local label correlations, and w(lB|lA), w(lC|lA) and w(lC|lB) are the probabilities of the label semantic correlations. All of these values are known from the analysis in Eq. (1) to Eq. (5).

    Second, according to graph theory and probability theory, we have the following definitions:

    To prove E(lC|lA) ≥ (E(lB|lA), E(lC|lB)), according to Eq. (8) and Eq. (9), it suffices to prove a simpler inequality on the sample counts.

    We observe that the data set guarantees N(A,C) ≥ N(A,B,C), because the latter set is contained in the former and the former has the larger scope. By the same logic, N(A) ≥ N(A,B). The values of λ1 and λ2 are the same on both sides of the equation, so they require no additional consideration. In addition, w(lABC|lAB) = w(lB|lA) × w(lC|lB) × w(lC), where w(lC) equals 1; thus, w(lABC|lAB) = w(lB|lA) × w(lC|lB).

    According to the above analysis, only if the sample data set satisfies the condition above can E(lC|lA) ≥ (E(lB|lA), E(lC|lB)) be obtained; otherwise, the opposite holds. Therefore, Eq. (7) is proved.

    According to Eq. (7), redundant reachable paths in the label dependency structure can be eliminated, but the following two cases require special handling.

    Case 1. E(li|lj) = E(li|lk1|lk2 ... |lj)

    According to the principle of maximum label correlation, Fig. 4 is optimized as shown in Fig. 5.

    Figure 5:Optimized GLLCBN model

    Case 2. E(li|lj) ≠ E(li|lk1|lk2 ... |lj)

    Fig. 4 does not need to be changed, because there may still be meaningful pairwise label correlations among the nodes: the intermediate nodes cannot be eliminated, and the correlation structure of each intermediate node should be retained.
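    Since Eqs. (6)-(7) did not survive extraction, the following sketch shows only one plausible reading of this pruning step: a direct dependency is dropped when a two-step path through an intermediate label is strictly stronger, where the path strength is assumed (not confirmed by the text) to be the product of the two edge strengths.

```python
import numpy as np

def prune_redundant_edges(E, edges):
    """Heuristic pruning of redundant dependencies, sketched under the assumption that a
    chained correlation through an intermediate node k is scored as E[i, k] * E[k, j].

    edges: list of (child, parent, strength) triples from directed_edges() above.
    An edge j -> i is dropped when some two-step path j -> k -> i is strictly stronger.
    """
    kept = []
    edge_set = {(c, p) for c, p, _ in edges}
    for child, parent, strength in edges:
        best_path = 0.0
        for k in range(E.shape[0]):
            if k in (child, parent):
                continue
            # A path parent -> k -> child exists if both directed edges were selected.
            if (k, parent) in edge_set and (child, k) in edge_set:
                best_path = max(best_path, E[k, parent] * E[child, k])
        if strength >= best_path:      # direct dependency is at least as strong: keep it
            kept.append((child, parent, strength))
    return kept
```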

    In summary, the directed graphical model of GLLCBN can be constructed by analyzing label correlations and Bayesian networks [Friedman, Linial, Nachman et al. (2000)]. The GLLCBN model optimizes the label correlation structure, facilitates the extraction of potential association information between labels, and reduces the impact of label imbalance in the sample data set.

    3.4 Adjustment of Inception V3 model

    Convolutional Neural Networks (CNNs) play a very important role in image classification research [Song, Hong, Mcloughlin et al. (2017)]. There are many excellent CNN models, such as AlexNet [Krizhevsky, Sutskever and Hinton (2012)], VGGNet [Russakovsky, Deng, Su et al. (2015)], ResNet [He, Zhang, Ren et al. (2015)] and GoogleNet [Szegedy, Liu, Jia et al. (2014)]. Among them, Inception V3 [Szegedy, Vanhoucke, Ioffe et al. (2016)], created by Google, is a very portable and highly reusable model. Therefore, we use a transfer learning approach and make the following adjustments to the Inception V3 model to improve its performance on the multi-label classification problem. First of all, since Inception V3 was originally trained for single-label classification but our images have multi-label attributes, the label storage of the input data must be treated as multi-dimensional rather than as a single label. Secondly, to adapt the network, the top-level structure is removed and new customized layers are added; we add a fully connected layer of 1024 nodes connected to the last pooling layer. Finally, since the softmax layer of Inception V3 outputs 1000 nodes (the ImageNet [Deng, Dong, Socher et al. (2009)] data set has 1000 categories), we modify the last layer of the network so that its number of output nodes equals the number of label types in the data set, thereby achieving label classification with our model.
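    A minimal sketch of this adjustment in Keras, assuming a TensorFlow/Keras setup (the paper does not name the framework): the ImageNet-pretrained Inception V3 is loaded without its 1000-way top, a 1024-node fully connected layer is attached to the pooled features, and the output layer is resized to the number of labels; sigmoid outputs and binary cross-entropy are used here because multi-label prediction needs independent per-label probabilities.

```python
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras.layers import Dense, GlobalAveragePooling2D
from tensorflow.keras.models import Model

NUM_LABELS = 5  # e.g., desert, ocean, sunset, mountains, trees

# Load Inception V3 pre-trained on ImageNet, without its 1000-way top classifier.
base = InceptionV3(weights="imagenet", include_top=False, input_shape=(299, 299, 3))

# Freeze the pre-trained convolutional layers for transfer learning.
for layer in base.layers:
    layer.trainable = False

# New top: global pooling, a 1024-node fully connected layer, and a multi-label output.
x = GlobalAveragePooling2D()(base.output)
x = Dense(1024, activation="relu")(x)
outputs = Dense(NUM_LABELS, activation="sigmoid")(x)

model = Model(inputs=base.input, outputs=outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()
```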

    4 Experiments

    In this section, to evaluate the performance of GLLCBN, we describe the multi-label data set used in the experiments, the performance evaluation metrics for multi-label classification, and the algorithms compared against the GLLCBN model. Finally, the experimental results and analysis are presented.

    4.1 Data sets

    In order to verify the performance of GLLCBN, we chose the open-source data set collected by Nanjing University. The download link for the data set is http://lamda.nju.edu.cn/files/miml-image-data.rar. The data set contains 2,000 landscape images and five labels (desert, ocean, sunset, mountains, trees), and each instance has two labels on average. It is worth noting that the original archive contains a file called miml_data with a .mat suffix (Matlab file format), which holds three parts: bags, targets and class_name. The first can be ignored, as it plays no special role here. The second is the label definition for each image: a 5×2000 matrix in which each column represents the labels of one instance image, a value of 1 indicates the presence of a label, -1 indicates its absence, and the label order matches class_name. In addition, we convert this label matrix into a .txt file to facilitate the follow-up training of the Inception V3 model. The last part lists all possible label names for the data set.
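    For illustration, one possible way to read the label matrix and dump it to a .txt file with SciPy is sketched below; the variable name "targets" inside the .mat archive and the file paths are assumptions based on the description above.

```python
import numpy as np
from scipy.io import loadmat

# Load the Matlab archive described above (path is an assumption).
data = loadmat("miml_data.mat")

# "targets" is assumed to hold the 5 x 2000 label matrix with values in {1, -1}.
targets = np.asarray(data["targets"])

# Convert to one row per image: column j of "targets" becomes row j of 0/1 indicators.
labels_01 = (targets.T == 1).astype(int)   # shape (2000, 5)

# Write a simple .txt file, one space-separated 0/1 label vector per image.
np.savetxt("labels.txt", labels_01, fmt="%d")
```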

    4.2 Performance evaluation

    Since an instance has multiple label attributes in multi-label classification, the predicted labels may form a subset of the actual labels. To evaluate the performance of the GLLCBN model, we select five widely-used evaluation metrics [Gibaja and Ventura (2015)].

    Hamming Loss expresses the degree of inconsistency between the predicted labels and the actual labels. Eq. (10) shows the expression of the Hamming Loss.

    Coverage evaluates how far down the ranked list of labels one needs to go, on average, in order to cover all ground-truth labels. Eq. (11) shows the expression of the Coverage.

    Ranking Loss evaluates the average fraction of mis-ordered label pairs. Ranking Loss is defined as follows:

    Average Precision represents the average accuracy of the predicted instance label set, as shown in Eq. (13).

    Average Predicted Time expresses the average time taken to predict each instance; its time unit is seconds. It is expressed as follows:

    It is worth noting that, among the five evaluation metrics above, a smaller value means better performance for Hamming Loss, Coverage, Ranking Loss and Average Predicted Time, whereas a larger value means better performance for Average Precision.
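    Since the metric formulas (Eqs. (10)-(14)) did not survive extraction, the sketch below computes standard scikit-learn counterparts of the first four metrics and times prediction; the data here are random placeholders, and in practice the scores would come from the trained model.

```python
import time
import numpy as np
from sklearn.metrics import (hamming_loss, coverage_error,
                             label_ranking_loss, label_ranking_average_precision_score)

def evaluate(y_true, y_scores, threshold=0.5):
    """y_true: binary matrix (n_samples, n_labels); y_scores: predicted scores in [0, 1]."""
    y_pred = (y_scores >= threshold).astype(int)
    return {
        "hamming_loss": hamming_loss(y_true, y_pred),
        "coverage": coverage_error(y_true, y_scores),
        "ranking_loss": label_ranking_loss(y_true, y_scores),
        "average_precision": label_ranking_average_precision_score(y_true, y_scores),
    }

# Toy usage with random scores; in practice y_scores would come from the trained model.
rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=(100, 5))
y_true[np.arange(100), rng.integers(0, 5, 100)] = 1   # ensure every row has a true label
y_scores = rng.random((100, 5))

start = time.time()
metrics = evaluate(y_true, y_scores)
metrics["avg_predicted_time"] = (time.time() - start) / len(y_true)  # seconds per instance
print(metrics)
```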

    4.3 Comparative algorithms

    In order to validate the effectiveness of GLLCBN, we compare it with the following state-of-the-art multi-label learning algorithms:

    1. Binary Relevance (BR) [Boutell, Luo, Shen et al. (2004)] is a first-order method. The main idea is to train a binary linear SVM classifier independently for each label.

    2. Calibrated Label Ranking (CLR) [Brinker (2008)] is a second-order method. The main idea is to establish a label ranking by analyzing pairwise labels.

    3. Multi-label Learning Using Local Correlation (ML-LOC) [Huang and Zhou (2012)] is a high-order method. The main idea is to exploit local label correlation by encoding instance features.

    4. Random K-Labelsets (RAKEL) [Tsoumakas, Katakis and Vlahavas (2011)] is a high-order method. The main idea is to transform the multi-label classification problem into several multi-class learning problems by exploiting high-order global label correlation.

    All compared algorithms are summarized in Tab. 2.

    Table 2:Compared methods

    4.4 Experimental results

    In our experiments, we randomly use 30%, 50% and 70% of the data set as the training set and the rest as the test set. Experimental results are shown in Tab. 3 to Tab. 5. All of the compared methods were run in Python or Matlab environments.
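    A minimal sketch of this random split protocol with scikit-learn, using placeholder feature and label matrices; the 30%, 50% and 70% ratios follow the description above.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder data: 2000 instances, 2048-d features (e.g., pooled CNN features), 5 labels.
rng = np.random.default_rng(42)
X = rng.random((2000, 2048))
Y = rng.integers(0, 2, size=(2000, 5))

for train_ratio in (0.3, 0.5, 0.7):
    X_train, X_test, Y_train, Y_test = train_test_split(
        X, Y, train_size=train_ratio, random_state=42)
    print(f"train ratio {train_ratio:.0%}: {len(X_train)} train / {len(X_test)} test instances")
    # ...train the classifier on (X_train, Y_train) and evaluate on (X_test, Y_test)...
```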

    Table 3: Performance evaluation of the compared algorithms when 30% of the data set is randomly used for training: mean ± std (rank)

    Table 4: Performance evaluation of the compared algorithms when 50% of the data set is randomly used for training: mean ± std (rank)

    Table 5: Performance evaluation of the compared algorithms when 70% of the data set is randomly used for training: mean ± std (rank)

    4.5 Experimental analysis

    According to the experimental results in Section 4.4, we draw the following conclusions:

    1. When the training set is 30% of the data set, the BR algorithm has some advantages in Hamming Loss, Coverage, Ranking Loss, Average Precision and Average Prediction Time, and the global label correlation algorithm is superior to the local label correlation algorithms.

    2. When the training set is 50% of the data set, the BR algorithm achieves a better Hamming Loss than the label correlation algorithms. In terms of Coverage, Ranking Loss, Average Precision and Average Prediction Time, the CLR, ML-LOC and GLLCBN algorithms, which consider local label correlation, have an advantage over the RAKEL algorithm, which considers global label correlation.

    3. When the training set is 70% of the data set, the advantages of BR are gradually replaced by the other algorithms. In terms of Hamming Loss, CLR is higher than the other algorithms. Among the label correlation algorithms, GLLCBN performs better than ML-LOC and RAKEL in terms of Coverage, Ranking Loss and Average Precision, but it takes longer in Average Prediction Time.

    4. Comparing the above three points, we can see that as the training set grows, the advantage of the label correlation algorithms in the performance evaluation gradually emerges, indicating that label correlation has a definite influence on the multi-label classification problem.

    5 Conclusion and future work

    How to mine potential label correlation information remains a worthwhile direction for the study of multi-label classification. In this paper, we proposed a novel and effective approach named GLLCBN for multi-label learning. In the GLLCBN model, nodes represent the label space and edges represent the comprehensive global and local label correlations. We first obtain an initial model by analyzing the labels, the global semantic relevance and the local label correlation of the data set (building the node association graph), and then use probability theory, Bayesian networks and graph theory to optimize the label dependency graph (eliminating redundant edges), thus constructing the label-dependent network called the GLLCBN model. Finally, multi-label classification is performed by combining the initial prediction results of the Inception V3 model with the GLLCBN model. Experimental results show that our proposed approach is effective under the chosen performance evaluation metrics.

    In the future, we will consider optimizing the performance of the proposed method on data sets with large-scale label spaces and applying the approach to more diverse multi-label data sets.

    Acknowledgement: The authors gratefully acknowledge support from the National Key R&D Program of China (No. 2018YFC0831800) and the Innovation Base Project for Graduates (Research of Security Embedded System).
