
    Topic-Aware Abstractive Summarization Based on Heterogeneous Graph Attention Networks for Chinese Complaint Reports

Computers, Materials & Continua, 2023, Issue 9

Yan Li, Xiaoguang Zhang*, Tianyu Gong, Qi Dong, Hailong Zhu, Tianqiang Zhang and Yanji Jiang

1 Engineering Technical Department, China FAW Group Co., Ltd., Changchun, 130012, China

2 Suzhou Automotive Research Institute, Tsinghua University, Suzhou, 215200, China

3 College of Software, Liaoning Technical University, Fuxin, 125105, China

ABSTRACT Automatic text summarization (ATS) plays a significant role in Natural Language Processing (NLP). Abstractive summarization produces summaries by identifying and compressing the most important information in a document. However, few comprehensively evaluated abstractive summarization models work well for specific types of reports, owing to their unstructured and colloquial text. In particular, Chinese complaint reports, generated by urban complainants and collected by government employees, describe problems that residents encounter in daily life, and the reported problems require a speedy response. Automatic summarization tasks for these reports have therefore been developed. However, as with traditional summarization models, the generated summaries still suffer from problems of informativeness and conciseness. To address these issues and generate suitably informative and less redundant summaries, a topic-based abstractive summarization method is proposed to obtain global and local features. Additionally, a heterogeneous graph of the original document is constructed using word-level and topic-level features. Experiments and analyses on public review datasets (Yelp and Amazon) and our constructed dataset (Chinese complaint reports) show that the proposed framework effectively improves the performance of the abstractive summarization model for Chinese complaint reports.

KEYWORDS Text summarization; topic; Chinese complaint report; heterogeneous graph attention network

    1 Introduction

The goal of text summarization is to produce a concise sequence of words that captures the most important information from the original document. Most research on text summarization focuses on news articles (short documents), chat logs (multi-party documents), reviews (unstructured documents) and scientific reports (long documents). However, complaint reports are collected daily without receiving much attention. Currently, only 40 government staff are responsible for addressing the problems raised in these reports [1]. Moreover, these reports often use negative language and oral expressions to describe the cause, process, result, and purpose of events [2]. As shown in Fig. 1, complaint reports contain numerous redundant and meaningless sentences, such as repetition and vague language. Therefore, an automatic summarization model is necessary to reduce redundancy and extract relevant information. This is essential for assisting the government in addressing social problems.

Figure 1: An example of a Chinese complaint report

Existing summarization models for complaint reports fall into two categories: extractive and abstractive. Extractive models aim to select several sentences as a summary, while abstractive models generate a sequence of words that represents the entire document. However, Chinese complaint reports consist of many elliptical sentences that are strongly related to context, and topics can switch frequently and vary from sentence to sentence. Due to these unique characteristics, extractive models cannot capture all the relevant information in the document while keeping the summary at a high compression ratio. To tackle this problem, we propose a framework that relies on abstractive summarization with fine-grained topic modeling, inspired by review summarization. The main contributions of this paper are summarized as follows:

• To our knowledge, we are the first to construct a heterogeneous graph for Chinese complaint reports. A heterogeneous graph with word and topic nodes is constructed to obtain fine-grained textual features, and a graph embedding over this heterogeneous graph is proposed to iteratively update the word nodes.

• A heterogeneous graph attention network with topic distribution is proposed for abstractive summarization.

• We further collect a large-scale Chinese complaint report dataset from the 12345 hotline and platform. Experiments on the review and Chinese complaint report datasets show that our proposed model achieves competitive results against state-of-the-art summarization models in different aspects.

    2 Related Work

    2.1 Topic Extractor for Text Summarization

Topic signatures have been established as a critical element for improving automatic text summarization and information retrieval, but their use in abstractive summarization remains limited. Encapsulating the topic, as a representation of the entire document, into word embeddings is a potential way to enhance a sequence network. For example, the word-topic distribution of the Latent Dirichlet Allocation (LDA) model can be combined with a sequence-to-sequence model to enhance abstractive sentence summarization. Several works have reported significant improvements by leveraging topic signatures for text summarization [3–9]. These techniques utilize topic models as an additional mechanism to improve text generation, such as obtaining semantic and sequential features of text generation using topic models [4], mining cross-document subtopics using a topic-based model [5], enriching word representations with topical information [6], and capturing sparse candidate topics using a neighborhood-preserving semantic measure [7]. Ailem et al. [8] developed a topic-augmented decoder that generates a summary conditioned on both the input document and its latent topics; they found that latent topics reveal global semantic information that can be used to bias the decoder's word generation. Gao et al. [9] incorporated a neural generative topic matrix as an abstractive level of topic information by mapping global semantics into a local generative language model. Inspired by the success of topic modeling, we propose a summarization framework that uses latent topics to capture long-range dependencies in documents.

    2.2 Text Summarization Based on Graph

Graph neural networks (GNNs) have become increasingly popular for modeling relationships between text spans for abstractive summarization. Graph Convolutional Networks (GCN) and Graph Attention Networks (GAT) are two representative GNN models that aggregate node embeddings to obtain compressed graph representations. While GNNs were originally designed to represent an entire document with a single node type, graphs with multiple node types have emerged as a novel application for text summarization, and recent work has explored how to model these structures. Wang et al. [10] constructed a heterogeneous graph using GAT to learn cross-sentence relations and applied it to multi-document extractive summarization. Cui et al. [11] integrated a joint Neural Topic Model (NTM) to discover latent topics, which provide document-level features for sentence selection. Jia et al. [12] combined heterogeneous and homogeneous graphs to represent multi-level features. Li et al. [13] proposed an extension of graph neural networks that incorporates graph-structured information into output sequences, achieving remarkable results on the bAbI tasks [14]. Ma et al. [15] proposed a graph-based semi-supervised learning model for document classification. Similarly, Yao et al. [16] used GCN for text classification, constructing a large graph containing both words and documents. Inspired by previous work on graph-based models, we propose a heterogeneous graph to represent Chinese complaint reports, in which the topic and word embeddings are updated by message passing in a heterogeneous graph attention network.

    3 Method

To generate a summary with less redundancy and suitable information, the proposed model consists of four components: topic extractor, word embedding, graph embedding, and sentence decoder, as shown in Fig. 2. Firstly, the topic extractor extracts the topic distribution of the entire document and of each word. Secondly, similar to other summarization models, word embedding learns the contextual representation of each word. Additionally, graph embedding uses a heterogeneous graph to combine global features (topic distribution) and local features (contextual representations). Finally, the decoder generates a summary based on the contextual representation. Further details of each component are provided in the following sections.

Figure 2: Schema of our proposed model architecture

    3.1 Problem Definition

Formally, a Chinese complaint report contains n sentences D = {s_1, s_2, ..., s_n}, where the i-th sentence is represented as s_i = {w_i1, w_i2, ..., w_im}. The abstractive summarization of complaint reports is formulated as a text-generation task, which aims at generating an abstractive summary S = {w_1, w_2, ..., w_r} using a heterogeneous graph. In detail, the sequence-to-graph transformation learns to construct a graph structure that captures the global and local features of the original document. The graph-based encoder generates contextual hidden states with graph features, denoted as H = {H_1, H_2, ..., H_n}. The decoder combines the updated word embeddings to form a summary S = F(H) = {w_1, w_2, ..., w_r}, where F(·) is an abstractor function. Meanwhile, the gold-standard summary contains l words, S_G = {a_1, a_2, ..., a_l}.

    3.2 Topic Extractor

A topic, as a high-level semantic unit, is a latent feature of the whole document. In this paper, a topic extractor based on a probabilistic distribution is employed to enrich the semantic representation of Chinese complaint reports. Each topic can be seen as a distribution over semantically coherent terms, and each document exhibits these topics with different probabilities or proportions. The main purpose of the topic extractor is to transform the whole document into latent topic features. Compared to other topic approaches (such as Latent Semantic Analysis and Probabilistic Latent Semantic Analysis), LDA [17] generates a well-formed topic distribution and reduces the risk of overfitting. Therefore, LDA is chosen as the topic model to generate a topic vector T_wj for each word w_j; the topic distribution T_D of the whole document is obtained in the same way. In this paper, we consider that the topic representation of the whole document conveys the topic of its words, so the topic vector of the word w_j can be represented as:
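The equation referenced here (presumably Eq. (1)) is not reproduced in this version of the text. As a minimal sketch of the topic extractor, assuming gensim's LDA implementation and one plausible reading in which the word-level topic vector is modulated by the document-level distribution (the combination rule is our assumption, not the paper's exact equation):

```python
# Minimal sketch of the topic extractor, assuming gensim's LDA.
from gensim import corpora, models
import numpy as np

docs = [["供暖", "温度", "小区", "问题"],   # tokenized complaint sentences (toy data)
        ["小区", "物业", "投诉"]]

dictionary = corpora.Dictionary(docs)
corpus = [dictionary.doc2bow(d) for d in docs]
lda = models.LdaModel(corpus, num_topics=4, id2word=dictionary,
                      minimum_probability=0.0, random_state=0)

def word_topic_vector(word):
    """Topic vector T_wj for a single word (dense, length = num_topics)."""
    vec = np.zeros(lda.num_topics)
    for topic_id, p in lda.get_term_topics(dictionary.token2id[word],
                                           minimum_probability=0.0):
        vec[topic_id] = p
    return vec

def doc_topic_vector(doc_bow):
    """Topic distribution T_D of the whole document."""
    vec = np.zeros(lda.num_topics)
    for topic_id, p in lda.get_document_topics(doc_bow, minimum_probability=0.0):
        vec[topic_id] = p
    return vec

T_w = word_topic_vector("小区")
T_D = doc_topic_vector(corpus[0])
# One plausible reading of the missing equation: the word topic vector is
# modulated by the document-level distribution (an assumption, not the
# paper's exact form).
T_wj = T_w * T_D
```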

    3.3 Word Embedding

A word embedding is computed for each word to capture contextual information. A pre-trained model is employed to represent the local features of each word as a structural content vector. The original document is first represented as word embeddings using the pre-trained language model Bidirectional Encoder Representations from Transformers (BERT), which largely preserves contextual semantics. BERT is pre-trained on a corpus of about 3.3 billion words via masked language modeling and next-sentence prediction. Formally, the input text is first preprocessed by inserting two special tokens: the token '[CLS]' is inserted at the beginning of each sentence, and the output calculated at this token is used to integrate the information in each sequence, while the token '[SEP]' is inserted at the end of each sentence as an indicator of sentence boundaries. The hidden states can be represented as follows:

where w_ij is the j-th word in the i-th sentence.
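As a minimal sketch of this step, assuming the Hugging Face transformers library and the bert-base-chinese checkpoint (neither is named at this point in the paper):

```python
# Sketch of contextual word embedding with BERT; the library and checkpoint
# names are assumptions, not specified here in the paper.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

sentences = ["小区供暖温度过低。", "多次反映无人处理。"]
# The tokenizer inserts [CLS] at the start and [SEP] at the end of each sentence.
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# hidden: (batch, seq_len, 768) -- one contextual vector h_ij per token w_ij
hidden = outputs.last_hidden_state
```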

    3.4 Graph Embedding

Heterogeneous graph construction. Given a complaint report D, we start by modeling D as a semantic graph. The constructed graph is defined as G = {V, E}. Topic words T_D and tokens (words in D) w_j are the graph nodes V. Meanwhile, two edge types are created: 1) inner-connection edges e_w-w, which represent word dependencies within a sentence; the edge weight for a word pair is the cosine similarity of their embeddings, for which we choose GloVe as the static word embedding in this paper. 2) topic-word edges e_t-w, which connect topic nodes and semantic word nodes; these edge weights are set to 1.0 to preserve the hierarchical structure. The constructed heterogeneous graph therefore captures global and local features simultaneously, since topics represent the substance of the whole document and words capture the dependency relationships.
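A minimal sketch of this construction under the definitions above, assuming GloVe-style static vectors are already loaded into a Python dict (the helper names are illustrative):

```python
# Sketch of heterogeneous graph construction: word-word edges weighted by
# cosine similarity of static (GloVe-style) embeddings, topic-word edges
# fixed at 1.0. The vector source is an assumption.
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def build_graph(words, glove, topics):
    """Return node list and weighted edge dict for one document."""
    nodes = [("word", w) for w in words] + [("topic", t) for t in topics]
    edges = {}
    # 1) inner-connection edges e_{w-w}: cosine similarity of word vectors
    for i, wi in enumerate(words):
        for j, wj in enumerate(words):
            if i < j and wi in glove and wj in glove:
                edges[(("word", wi), ("word", wj))] = cosine(glove[wi], glove[wj])
    # 2) topic-word edges e_{t-w}: weight fixed to 1.0
    for t in topics:
        for w in words:
            edges[(("topic", t), ("word", w))] = 1.0
    return nodes, edges
```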

Graph encoder. Node embeddings for the heterogeneous graph are initialized with the contextual embeddings and topic distributions. As previous graph neural networks cannot handle a heterogeneous graph through message passing, a heterogeneous graph attention network is proposed, inspired by the graph attention network. The network aggregates embeddings over the two edge types (word-word and word-topic edges). The hidden state of a word W_i in the (l+1)-th layer is given as follows:

where d_W and d_T are the dimensions of the word embedding and the topic representation, respectively.

Then, the representations of the semantic nodes in the proposed heterogeneous graph are updated with the graph attention network. The updating process of the graph attention layer with multi-head attention is designed as follows:

where W_a, W_q, W_k and W_c are trainable parameters, K is the number of attention heads, N_i denotes the neighbors of the i-th node, and α_i,j is the attention weight between h_i and h_j.
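The update equations themselves are missing from this version of the text. A plausible reconstruction from the named parameters (W_a, W_q, W_k, W_c and K heads), following the standard multi-head graph attention formulation, is:

$$z_{ij} = \mathrm{LeakyReLU}\big(W_a [W_q h_i \,\Vert\, W_k h_j]\big), \qquad \alpha_{ij} = \frac{\exp(z_{ij})}{\sum_{l \in \mathcal{N}_i} \exp(z_{il})},$$

$$h_i^{(l+1)} = \big\Vert_{k=1}^{K} \, \sigma\Big(\sum_{j \in \mathcal{N}_i} \alpha_{ij}^{k} W_c^{k} h_j^{(l)}\Big),$$

where ∥ denotes concatenation. This is a reconstruction, not necessarily the paper's exact Eq. (3).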

Topic and word vectors are updated through the proposed graph propagation, shown in Fig. 3. On the one hand, topic vectors, as intermediaries that build inter-word relationships, act as the global feature. On the other hand, the updated word embeddings, as the local feature, not only capture contextual information but also distill high-level semantic units from the topic vectors. The graph attention layer connects the semantic relationship between topics and words so that salient features receive more attention.

Figure 3: The process of graph propagation

Moreover, the graph attention layer is further modified to infuse the edge-information weights (e_w-w and e_t-w), which can be represented as matrices in a multi-dimensional embedding space (e_w-w ∈ R^{n_n×d_w}, and e_t-w analogously). Therefore, Eq. (3) is updated as follows:
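The updated equation is likewise missing here. Following the common practice of concatenating the edge feature into the attention score, a plausible form of the edge-infused update is:

$$z_{ij} = \mathrm{LeakyReLU}\big(W_a [W_q h_i \,\Vert\, W_k h_j \,\Vert\, e_{ij}]\big),$$

where e_ij is the embedding of the corresponding e_w-w or e_t-w edge weight. Again, this is a reconstruction rather than the paper's exact equation.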

    3.5 Decoder

The decoder of the proposed summarization model employs a Transformer architecture and uses the topic-aware word representations and contextual information to predict the probability of each word in the summary sequence. During decoding, the beam search algorithm is utilized to keep a group of candidate tokens at each search step and construct the final summary. In this way, the model can better capture the theme and relevant information of the original text, thus generating a more accurate and comprehensive summary Y = {Y_1, Y_2, ..., Y_r}.
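As a minimal sketch of the decoding procedure, here is a generic beam search over a step(prefix) function that returns next-token log-probabilities; the step interface is an assumption standing in for the paper's Transformer decoder:

```python
# Minimal beam search sketch; `step(prefix)` is an assumed interface that
# returns a list of (token, log_prob) continuations from the decoder.
import heapq

def beam_search(step, bos, eos, beam_size=4, max_len=50):
    beams = [(0.0, [bos])]                      # (cumulative log-prob, tokens)
    for _ in range(max_len):
        candidates = []
        for score, seq in beams:
            if seq[-1] == eos:                  # finished hypotheses carry over
                candidates.append((score, seq))
                continue
            for token, logp in step(seq):
                candidates.append((score + logp, seq + [token]))
        # keep the `beam_size` highest-scoring hypotheses
        beams = heapq.nlargest(beam_size, candidates, key=lambda x: x[0])
        if all(seq[-1] == eos for _, seq in beams):
            break
    return max(beams, key=lambda x: x[0])[1]
```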

    4 Experiments and Analysis

In this section, we first present the Chinese complaint report and public review (Yelp and Amazon) datasets and the hyperparameter settings. Then, experimental results on these datasets are introduced to verify the performance of our proposed model. Lastly, ablation and case studies are given for further analysis.

    4.1 Dataset

The Chinese complaint reports are collected from the government service hotline and website, where complaints are exchanged between government departments and complainants in Chinese. The dataset contains 220,855 complaint reports from the most recent two years. For the raw data, we run a preprocessing step to remove invalid records (such as consultations or wrong-number calls). Additionally, all complaint-summary pairs are manually annotated by three government department staff. To demonstrate the complexity of Chinese complaint reports, the problem types are shown in Fig. 4. Moreover, we select 100–500 Chinese characters as the proper length of a Chinese complaint report; the length distribution is shown in Fig. 5. The results are reported as F1 ROUGE scores, comprising ROUGE-1 (unigram), ROUGE-2 (bigram) and ROUGE-L (Longest Common Subsequence, LCS).
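As a sketch of this evaluation, assuming the rouge-score package; since that package splits on whitespace, Chinese text is space-joined at the character level first (a common workaround, assumed here rather than described in the paper):

```python
# Sketch of F1 ROUGE evaluation, assuming the `rouge-score` package.
from rouge_score import rouge_scorer

def to_char_tokens(text):
    # rouge-score is word-boundary based, so join Chinese characters with
    # spaces before scoring (an assumed preprocessing step).
    return " ".join(list(text.replace(" ", "")))

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"])
ref = to_char_tokens("小区供暖温度过低")
hyp = to_char_tokens("小区供暖温度低")
scores = scorer.score(ref, hyp)
print({k: v.fmeasure for k, v in scores.items()})  # F1 ROUGE-1/2/L
```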

Figure 4: The problem types of Chinese complaint reports

Figure 5: The length distribution of Chinese complaint reports

    4.2 Hyperparameter Setting

The dimensions of the word embedding and topic vector are initialized to 768. The Transformer decoder has 2 layers, with 768 hidden units and 8 attention heads. We utilize the Adam optimizer [18] to optimize the parameters of the proposed model, with a learning rate of 0.001. The word embedding is initialized from the pre-trained Chinese BERT model released by [19]. We train our proposed model for 5,000 epochs. Except for the pre-trained language model, all other parameters are randomly initialized.
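A minimal sketch of this configuration in PyTorch; only the stated hyperparameters come from the paper, while the module wiring is illustrative:

```python
# Sketch of the training configuration described above; the decoder module
# here is a stand-in, only the hyperparameter values are from the paper.
import torch

config = dict(
    embed_dim=768,        # word embedding / topic vector dimension
    decoder_layers=2,     # Transformer decoder layers
    hidden_units=768,
    num_heads=8,
    lr=1e-3,
    epochs=5000,
)

decoder = torch.nn.TransformerDecoder(
    torch.nn.TransformerDecoderLayer(d_model=config["hidden_units"],
                                     nhead=config["num_heads"]),
    num_layers=config["decoder_layers"],
)
optimizer = torch.optim.Adam(decoder.parameters(), lr=config["lr"])
```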

    4.3 Comparison Methods

Several state-of-the-art models are used as comparisons to evaluate our proposed model:

• Lead [20]: Lead selects the first several sentences of a document as the summary.

• Oracle [20]: It uses a greedy algorithm to select the sentences that best match the gold summary.

• TextRank [21]: It transforms the original document into a graph and selects several sentences through a graph-based ranking algorithm.

• PacSum [22]: It improves on TextRank with an edge-weighted calculation in the graph.

• MeanSum [23]: Through an autoencoder model, it averages the input representations and decodes a summary from the mean.

• BERTSUM [24]: It inserts multiple segmentation tokens into the original document to obtain a representation of each sentence. It is the first BERT-based extractive summarization model.

• HiBERT [25]: It modifies BERT into a hierarchical structure and designs an unsupervised method to pre-train it.

• DISCOBERT [26]: It is a state-of-the-art BERT-based extractive model that encodes documents with BERT and then updates sentence representations with a graph encoder. DISCOBERT builds a document graph with only sentence units based on discourse analysis.

• ConsistSum [27]: An unsupervised opinion summarization system that captures the consistency of aspects and sentiment between reviews and summaries. It first constructs high-quality synthetic datasets and then uses them to train the summarization model.

• TranSum [28]: TranSum is an abstractive summarization model that learns review embeddings by exploiting intra-group and inter-group invariances.

    4.4 Automatic Evaluation

The compared results on the Chinese complaint report dataset are shown in Table 1. The comparison methods fall into three groups: baselines, extractive models and abstractive models.

Table 1: The compared results on the Chinese complaint report dataset

Table 1 presents the Oracle and Lead baselines in the first part, followed by state-of-the-art extractive summarization models in the second part; abstractive methods are presented in the third part. Notably, our proposed model achieves highly competitive results. Compared to the extractive models, our graph-based model with topic features demonstrates superior performance, which shows that the proposed graph structure with topic features effectively improves summarization performance. While most extractive models select several sentences containing salient information, our model with a heterogeneous graph outperforms them while ensuring both the informativeness and conciseness of the generated summary. Moreover, our model achieves competitive performance against the other abstractive models on all metrics, affirming the effectiveness of the high-level topic features and the heterogeneous graph construction.

Furthermore, to understand how much the topic number k affects the generated summary, we compare different settings of the topic extractor in Table 2. Considering the characteristics of Chinese complaint reports, we set the topic number to 2, 4, 6, 8 and 10. The results show that the quality of the generated summary decreases as the topic number increases; we infer that topic features become less distinct as the number of topics grows. The results also confirm that the topic setting influences the performance of the summarization model. We therefore set the topic number to 4.

Table 2: The influence of different topic numbers on complaint summarization

To verify the generalization of our proposed framework, the Yelp and Amazon datasets are used for review summarization. The results on the two review datasets are shown in Table 3. On the Yelp dataset, our framework performs on par with the SOTA abstractive summarization model (ConsistSum). On the Amazon dataset, our proposed model outperforms all compared benchmark summarization models, demonstrating that the topic features and heterogeneous graph embedding improve the generated summaries. This shows that our model is also suitable for review datasets.

    4.5 Human Evaluation

Intuitively, our proposed model can paraphrase the source text instead of extracting several sentences, improving the conciseness of the summary, and its high-level semantic features can enhance readability and informativeness. To evaluate this hypothesis, a human evaluation is conducted by three government experts on 100 random samples from the test set. The evaluation scores cover the conciseness, readability and informativeness of the summaries, rated from 1 (worst) to 5 (best) and averaged.

The results of the human evaluation are shown in Table 4. We compare our model against TextRank, BERTSum, HiBERT and DISCOBERT. The proposed model produces high-quality summaries accepted by the evaluators, meaning that its summaries are more relevant to the original reports than those of the other models. Meanwhile, the conciseness scores indicate that the proposed abstractive model obtains a suitable length through the Transformer decoder, generating relevant summaries that cover different topics while effectively reducing redundancy. Finally, in terms of readability, our model only slightly outperforms the other models, since they are extractive models that select complete sentences from the original document and thereby preserve readability.

Table 4: The human evaluation results of the comparison models for complaint summarization

    4.6 Ablation Study

We also perform an ablation study to evaluate each module of the proposed model; Fig. 6 shows the results. To show the effectiveness of the topic features and the graph attention layer, we calculate the F1 ROUGE scores of the proposed model without topic features and without the heterogeneous graph.

Figure 6: The ablation results of our proposed model

The model without topic features corresponds to a homogeneous graph over words only, and the model without GAT represents word embeddings as a simple combination of contextual and topic information. In the ablation study, the model without topic features performs better than the model without GAT. This demonstrates that graph propagation effectively combines contextual information into the word embeddings for the abstractive summarization model, and that the GAT contributes more to summary generation.

    4.7 Case Study

Table 5 provides an example illustrating the role of topic features in our proposed summarization model. The table includes the original complaint report, its gold summary, the best extractive summary, the summaries generated by SOTA models and the summary generated by our proposed model. The report covers four topics (community, heating, temperature and problem), highlighted in blue. While the extractive summary contains informative content, it is longer than both the gold summary and the summary generated by our proposed model. Additionally, our model generates a summary that captures the important information related to each topic. Compared to state-of-the-art models such as MeanSum and ConsistSum, our proposed model generates the shortest summary, indicating that it strikes the right balance between conciseness and informativeness. This suggests that the topic-aware approach effectively preserves salient information while keeping the summary short.

Table 5: A case study of the proposed model

    5 Conclusion

In this paper, a novel approach for generating topic-aware abstractive summaries of Chinese complaint reports is presented, based on a proposed heterogeneous graph attention network. To capture the high-level semantic information that is essential for generating an accurate summary, a topic-aware heterogeneous graph is constructed. Additionally, given the unique characteristics of Chinese complaint reports, topic features (the main content of the original document) are designated as extra nodes in the heterogeneous graph. Our experimental results demonstrate the effectiveness of our model in generating informative and non-redundant summaries on the Chinese complaint report and public review datasets.

Acknowledgement: We thank the anonymous reviewers for their useful comments.

Funding Statement: This work is partially supported by the National Natural Science Foundation of China (52274205) and the Project of the Education Department of Liaoning Province (LJKZ0338).

Author Contributions: The authors confirm their contributions to the paper as follows: study conception and design: Yan Li, Xiaoguang Zhang; data collection: Tianyu Gong, Qi Dong, Hailong Zhu; analysis and interpretation of results: Yan Li, Xiaoguang Zhang, Tianyu Gong; draft manuscript preparation: Hailong Zhu, Tianqiang Zhang and Yanji Jiang. All authors reviewed the results and approved the final version of the manuscript.

Availability of Data and Materials: Readers can email the corresponding author to obtain the Chinese complaint report data.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
