
    Hybrid Neural Network for Automatic Recovery of Elliptical Chinese Quantity Noun Phrases

Computers, Materials & Continua, December 2021

Hanyu Shi, Weiguang Qu*, Tingxin Wei, Junsheng Zhou, Yunfei Long, Yanhui Gu and Bin Li

1 School of Computer Science and Technology, Nanjing Normal University, Nanjing, 210023, China

2 School of Chinese Language and Literature, Nanjing Normal University, Nanjing, 210097, China

3 International College for Chinese Studies, Nanjing Normal University, Nanjing, 210097, China

4 School of Computer Science and Electronic Engineering, University of Essex, Colchester, UK

Abstract: In Mandarin Chinese, when the noun head appears in the context, a quantity noun phrase can be reduced to a quantity phrase with the noun head omitted. This structure is called an elliptical quantity noun phrase. The automatic recovery of elliptical quantity noun phrases is crucial for syntactic parsing, semantic representation and other downstream tasks. In this paper, we propose a hybrid neural network model that identifies the semantic category of an elliptical quantity noun phrase and recovers the omitted semantics by supplementing the concept category. Firstly, we use BERT to generate character-level vectors. Secondly, Bi-LSTM is applied to capture the context information of each character and compress the input into the context memory history. Then a CNN is utilized to capture the local semantics of n-grams at various granularities. Based on the Chinese Abstract Meaning Representation (CAMR) corpus and the Xinhua News Agency corpus, we construct a hand-labeled dataset of elliptical quantity noun phrases and carry out semantic recovery of elliptical quantity noun phrases on this dataset. The experimental results show that our hybrid neural network model can effectively improve the performance of semantic complementation for elliptical quantity noun phrases.

    Keywords: Elliptical quantity noun phrase; semantic complement; neural network

    1 Introduction

Ellipsis is a widespread linguistic phenomenon in which elements of a sentence are omitted and must be recovered from the surrounding discourse. Humans can quickly restore the omitted semantics from context; for machines, however, this remains a major challenge. The gap in syntax and the omission in semantics create obstacles for many NLP tasks such as syntactic parsing, machine translation and information extraction. Therefore, ellipsis recovery has become one of the fundamental tasks in natural language processing.

The quantity noun phrase is one of the most common structures in Chinese, in which a quantity phrase is used to modify and specify the noun head. However, in some contexts the noun head is omitted, and the quantity phrase alone represents the meaning of the whole quantity noun phrase. According to Dai et al. [1], elliptical quantity phrases account for 38.7% of the omissions in the corpus of Chinese Abstract Meaning Representation (CAMR). As the omitted noun bears the core meaning of the quantity noun phrase, restoring it is necessary and critical for downstream natural language understanding tasks [2]. However, this research topic has been neglected by most of the research community. To fill this gap, we propose a hybrid neural network model to automatically complement elliptical quantity noun phrases in Chinese. Consider the following example:

The clerk had no choice but to pour spirits for Wu Song. Wu Song drank eighteen bowls [of spirits] altogether.

In CAMR, concept nodes for omitted arguments can be added, which makes the semantic representation of sentences more natural and comprehensive. As shown in Fig. 1, CAMR recovers the noun head of the elliptical quantity noun phrase "eighteen bowls" by adding the concept thing.

Figure 1: An example of concept addition for an elliptical quantity noun phrase

AMR is an abstract meaning representation method that represents the semantics of a sentence with a single-rooted directed acyclic graph [7]. What distinguishes it from other sentential semantic representation methods is that elliptical arguments are also represented with concept nodes in its semantic graph [8]. There are 109 concepts in its Named Entity List that can be used to represent elliptical arguments in AMR. Inspired by AMR (Abstract Meaning Representation) [9], we utilize concepts to complete the noun heads of elliptical quantity noun phrases. This paper uses the concepts from the Named Entity List of AMR to denote the omitted noun heads in elliptical quantity noun phrases [10]. In the example "Wu Song drank eighteen bowls [of spirits] altogether.", we use the concept "thing" to represent the head omitted from the quantity phrase "eighteen bowls". Moreover, in our work, we build an elliptical quantity noun phrase dataset based on the Chinese AMR corpus [11] and the Xinhua News Agency corpus [12].

Since both the characters and words in elliptical quantity noun phrases are crucial for recovering the omitted noun head, we focus on the information of n-grams at various granularities in the target sentence. We utilize a hybrid neural network to obtain sufficient semantic information from the elliptical quantity noun phrase and its context for omission recovery. We aim to recover the omitted head of an elliptical quantity noun phrase automatically with a hybrid neural network model combining Bidirectional Encoder Representations from Transformers (BERT) [13], the bidirectional long short-term memory network (Bi-LSTM) [14] and the convolutional neural network (CNN) [15].

To be specific, firstly, we utilize BERT to generate character-level vectors. Secondly, we utilize Bi-LSTM to capture both the preceding and following context information of each character and compress the input into the context memory history. Then a CNN is utilized to capture the local semantics of n-grams at various granularities. Finally, the semantic representations of the elliptical quantity noun phrase and its context are both used to predict the probabilities of the concepts that can represent the elliptical component.

To our knowledge, this is the first work to apply deep learning to a Chinese elliptical quantity noun phrase dataset, filling the gap in research on the semantic complementation of elliptical quantity noun phrases. The main contributions of this paper are as follows: (1) we build a hand-labeled dataset of elliptical quantity noun phrases; (2) we are the first to propose a hybrid neural network model to complement elliptical quantity noun phrases. We anticipate that our model can recover the omitted noun heads of elliptical quantity noun phrases with high accuracy and subsequently help to improve downstream tasks.

    2 Related Work

In this paper, we utilize a neural network model to predict the concept that complements an elliptical quantity noun phrase. Our work relates to previous research on quantity noun phrases, ellipsis recovery and AMR.

    2.1 Quantity Noun Phrase

For the semantic study of quantity noun phrases, existing work mainly focuses on boundary recognition, mostly adopting rule-based and database-based methods to identify the structural features of quantity noun phrases. Zhang et al. [16] built a database that provides vocabulary knowledge and phrase structure knowledge for numerical phrase recognition and implemented boundary recognition of "numerical word + quantifier" constructions. Xiong et al. [17] proposed a rule-based Chinese quantity phrase recognition method, based on Zhang's method, that requires no word segmentation. Fang et al. [18] proposed a backoff algorithm to obtain quantifier-noun collocations that are not collected in dictionaries, which greatly improved recall in quantity noun phrase recognition. However, no researchers have yet paid attention to the semantic recovery of elliptical quantity noun phrases. Therefore, this paper aims to recover the omitted noun heads of elliptical quantity noun phrases to provide accurate and comprehensive sentential semantic representation for downstream semantic analysis and applications.

    2.2 Ellipsis Recovery

In recent years, there has been much work on Chinese semantic omission recovery. Shi et al. [19] explored a method to automatically complement the omitted heads of elliptical DE phrases with a hybrid model combining densely connected Bi-LSTM (DC-Bi-LSTM) and CNN. Zhang et al. [20] proposed a neural framework to uniformly integrate two subtasks, VPE (Verb Phrase Ellipsis) detection and VPE resolution. For VPE detection, they chose an SVM model with a non-linear kernel function, a simple multilayer perceptron (MLP) and the Transformer; for VPE resolution, they applied an MLP and the Transformer model, and finally proposed a novel neural framework to integrate the two subtasks uniformly. Wu et al. [21] presented CorefQA (Coreference Resolution as Query-based Span Prediction), an accurate and extensible approach to the coreference resolution task. They formulated the problem as a span prediction task, as in question answering: a query is generated for each candidate mention using its surrounding context, and a span prediction module is employed to extract the spans of the coreferences within the document using the generated query. Existing approaches to omission recovery are based on the specific phrase structures of their tasks and might not be suitable for all omission recovery tasks. We draw on their methods and research ideas and carry out omission recovery for elliptical quantity noun phrases in this paper.

2.3 Abstract Meaning Representation (AMR)

AMR [9] represents the semantics of a sentence with a single-rooted directed acyclic graph and allows shared arguments. As it is more specific and accurate than other semantic representation methods, it has been applied to, and proved effective in, many NLP tasks such as event extraction [22], natural language modeling [23], text summarization [24], machine translation [25] and question answering [26]. The Chinese AMR (CAMR) corpus was released in 2019, containing 10,419 Chinese sentences annotated with 109 concepts. We create our own dataset by extracting the elliptical quantity noun phrases from the CAMR corpus and the Xinhua News Agency corpus [12].

Based on the study and analysis of existing work on omission recovery, we use neural network methods to implement semantic completion for elliptical quantity noun phrases in this paper. We regard the head complementation of elliptical quantity noun phrases as a classification task and propose a hybrid model to determine the corresponding concept for the omitted noun head.

    3 Methodology

    3.1 Model Overview

An elliptical quantity noun phrase in a Chinese sentence can be denoted as S = {s_1, s_2, ..., s_n}, where s_i represents the i-th character in the sentence, 1 ≤ i ≤ n, 1 ≤ y ≤ n, and m is the length of the elliptical quantity noun phrase. {s_1, ..., s_{y-1}} represents the preceding context of the quantity noun phrase, {s_y, ..., s_{y+m}} represents the (elliptical) quantity noun phrase, and {s_{y+m+1}, ..., s_n} represents the following context. This paper aims to select the correct concept to complement the omitted head of the elliptical quantity noun phrase. We use the powerful BERT pre-trained model, which generates character-level embedding vectors that incorporate contextual information. Subsequently, Bi-LSTM is used to further generate a textual representation based on contextual semantics. Then, the CNN model is used to extract features from the sentence. Thus, after Bi-LSTM semantic encoding, we use the feature extraction module to obtain features at various granularities.
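As a concrete illustration of this notation, the following minimal Python sketch splits a sentence into the three spans; the sentence and indices are invented for illustration, and Python's 0-based slicing stands in for the paper's 1-based subscripts.

# Toy illustration of the span notation above (made-up example).
S = list("武松一连喝了十八碗")   # character sequence s1..sn
y, m = 6, 3                      # phrase "十八碗" starts at index 6, length 3
preceding = S[:y]                # preceding context {s1, ..., s_{y-1}}
phrase = S[y:y + m]              # the (elliptical) quantity noun phrase
following = S[y + m:]            # following context
print("".join(phrase))           # -> 十八碗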

The overview of the model framework proposed in this paper is shown in Fig. 2. Overall, the model has four principal modules:

Figure 2: The illustration of the model architecture

• Input Module: Input the raw corpus and generate character-level vectors through BERT pre-training.

• Semantic Encoding Module: A Bi-LSTM model is utilized to obtain the semantic information of each character in the elliptical quantity noun phrase and its context.

• Feature Extraction Module: A CNN model and max pooling are used to capture the semantics of n-grams at various granularities.

• Output Module: Generate the predicted probability distribution over all concepts.

We will introduce the details of the above four modules in the rest of this section.
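To make the pipeline concrete before the module-by-module walkthrough, the following is a minimal PyTorch sketch of the four modules. The class name (HybridQNPModel), the hyperparameter values, and the bert-base-chinese checkpoint are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
from transformers import BertModel

class HybridQNPModel(nn.Module):
    def __init__(self, num_concepts, lstm_hidden=256, conv_channels=128,
                 kernel_sizes=(2, 3, 4)):
        super().__init__()
        # Input module: BERT produces character-level vectors.
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        d = self.bert.config.hidden_size
        # Semantic encoding module: two stacked Bi-LSTM layers.
        self.bilstm = nn.LSTM(d, lstm_hidden, num_layers=2,
                              bidirectional=True, batch_first=True)
        # Feature extraction module: parallel convolutions over the
        # concatenated BERT and Bi-LSTM states, max-pooled over time.
        feat_dim = d + 2 * lstm_hidden
        self.convs = nn.ModuleList(
            [nn.Conv1d(feat_dim, conv_channels, k) for k in kernel_sizes])
        # Output module: linear layer over pooled features; the softmax
        # over concepts is applied by the loss (e.g., nn.CrossEntropyLoss).
        self.classifier = nn.Linear(conv_channels * len(kernel_sizes),
                                    num_concepts)

    def forward(self, input_ids, attention_mask):
        x = self.bert(input_ids, attention_mask=attention_mask).last_hidden_state
        h, _ = self.bilstm(x)                          # (batch, seq, 2*hidden)
        z = torch.cat([x, h], dim=-1).transpose(1, 2)  # (batch, channels, seq)
        pooled = [torch.relu(conv(z)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=-1))   # concept logits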

    3.2 Input Module

In the input module, we feed the dataset of elliptical quantity noun phrases into the BERT pre-training module character by character to obtain character vectors that incorporate contextual semantics. We take the character-level vector output of BERT as the input of the semantic encoding module.

BERT is a pre-trained language model proposed in 2018. Due to its powerful text representation capability, the model provides strong support for many different natural language processing tasks [27], especially semantic analysis and semantic similarity computing. BERT pre-trains a general "language understanding" model in an unsupervised manner on a very large-scale corpus. When it is applied to other downstream tasks, only the parameters of the network need to be fine-tuned according to the specific task requirements [28].

The sub-structure of BERT is the encoder block of the Transformer [29]. BERT abandons the recurrent network structure of RNNs, takes the Transformer encoder as the main structure of the model, and uses the self-attention mechanism to model sentences. Its model structure is shown in Fig. 3.

Figure 3: BERT model structure

In the model, multi-head self-attention is calculated in the same way as the conventional attention mechanism, Layer-Norm applies layer normalization to the residual connection from the layer below, and Feed Forward denotes a two-layer linear transformation. The entire BERT model consists of 12 layers of Transformer encoders.
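For reference, these two sublayer computations are the standard Transformer encoder operations:

SubLayerOutput = LayerNorm(x + MultiHeadSelfAttention(x))
FFN(x) = max(0, x W_1 + b_1) W_2 + b_2

where W_1, b_1, W_2 and b_2 are the parameters of the two linear transformations of the feed-forward sublayer.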

In order to obtain character-level embedding vectors pre-trained on the corpus, the original corpus is processed into token embeddings, segment embeddings and position embeddings, which are summed as the input to the BERT model, as shown in Eq. (1):

X_input = E_token + E_segment + E_position    (1)

In Eq. (1), two special tokens need to be embedded: [CLS] is the tag at the beginning of each sentence and [SEP] is the tag at the end of the sentence. Since an elliptical quantity noun phrase always appears within a single sentence, there is no second sentence to be tagged, so we fill the segment embedding with 0s. Position embeddings are mainly used to encode the sequence order. The BERT model jointly adjusts its internal bidirectional Transformer encoders and uses the self-attention mechanism to learn the contribution of the remaining characters in the context to the current character, thereby enhancing the acquisition of contextual semantic information. Finally, the BERT model generates character-level embedding vectors X based on the current context.
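A minimal sketch of this input processing with the HuggingFace transformers library, assuming the bert-base-chinese checkpoint (the paper does not name a specific one):

from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

sentence = "武松一连喝了十八碗"          # hypothetical example sentence
enc = tokenizer(sentence, return_tensors="pt")
# enc["input_ids"] begins with [CLS] and ends with [SEP];
# enc["token_type_ids"] (the segment indices) is all zeros for a
# single-sentence input, matching the description above.
X = model(**enc).last_hidden_state       # character-level vectors X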

    3.3 Semantic Encoding Module

The semantic encoding module takes the character-level vectors X from the previous module as input. This module uses a two-layer Bi-LSTM model to perform semantic encoding and passes the Bi-LSTM hidden states h as the output to the local feature extraction module.

The long short-term memory neural network (LSTM) is a specific type of recurrent neural network (RNN) [30] that introduces memory and forgetting mechanisms for neurons in hidden layers [31]. It effectively addresses the issue of long-distance dependency in RNNs and avoids gradient explosion or vanishing during error backpropagation caused by long input sequences.

As shown in Fig. 4, each memory element of LSTM contains one or more memory blocks and three adaptive multiplicative gates, i.e., the input gate, output gate and forget gate, through which information can be saved and controlled. LSTM computes the hidden states h = {h_1, h_2, ..., h_t} by iterating the following equations. At time step t, the memory c_t and the hidden state h_t are updated as:

i_t = σ(W_i X_t + U_i h_{t-1} + b_i)
f_t = σ(W_f X_t + U_f h_{t-1} + b_f)
o_t = σ(W_o X_t + U_o h_{t-1} + b_o)
g_t = tanh(W_g X_t + U_g h_{t-1} + b_g)
c_t = f_t ⊙ c_{t-1} + i_t ⊙ g_t
h_t = o_t ⊙ tanh(c_t)

where X_t is the input at the current time step; i, f, o and g are respectively the input gate, forget gate, output gate and new candidate memory state; σ and tanh are respectively the sigmoid and hyperbolic tangent activation functions; and ⊙ denotes element-wise multiplication.

Figure 4: The Bi-LSTM architecture

However, the LSTM model, as a unidirectional neural network, is unable to learn the context information sufficiently, so Bi-LSTM is adopted to improve the semantic representation. In our work, we feed the character vectors X pre-trained by BERT into a two-layer Bi-LSTM. The output of the first Bi-LSTM layer is fed to the second layer as input, and the output of the second layer is the concatenation of the outputs of the forward and backward passes. The final output h is obtained through the stacked Bi-LSTM layers:

h_t = [→h_t ; ←h_t]

where →h_t and ←h_t are the hidden states of the forward and backward LSTMs at time step t.
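A sketch of this stacked Bi-LSTM in PyTorch, with illustrative dimensions (the hidden size is an assumption, not the paper's setting):

import torch
import torch.nn as nn

# Two stacked Bi-LSTM layers over BERT character vectors; each output
# position is the concatenation of forward and backward hidden states.
bilstm = nn.LSTM(input_size=768, hidden_size=256, num_layers=2,
                 bidirectional=True, batch_first=True)
X = torch.randn(1, 20, 768)   # (batch, sequence length, BERT hidden size)
h, _ = bilstm(X)              # h: (1, 20, 512), i.e., forward ⊕ backward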

The architecture of the proposed "BERT + Bi-LSTM" model is shown in Fig. 5.

    3.4 Local Feature Extraction Module

The local feature extraction module combines the output of the semantic encoding module and the output of the BERT pre-training layer as its input. This module feeds the input vectors through the convolutional neural network and max-pooling feature extraction to the output module.

CNN is an effective model for extracting semantic features and capturing salient features in a flat structure [32], because it is capable of capturing the local semantics of n-grams at various granularities. In a sentence containing an elliptical quantity phrase, the omitted noun head can usually be inferred from the context, as the words in the context are informative. Therefore, for the feature extraction module, we use a CNN and a max-pooling layer to extract features. Meanwhile, in order to obtain more effective information and avoid the loss of semantic information during semantic encoding, we concatenate the character vectors obtained by BERT pre-training with the semantic vectors generated by Bi-LSTM and feed them into the feature extraction module.

We use a convolution filter F ∈ R^{ω×d} to obtain the feature map Y ∈ R^{n−ω+1}, whose j-th element y_j is given by:

y_j = f(W · X_{j:j+ω−1} + b)

where f is the ReLU activation function, W is the weight matrix of the convolution filter, b is a bias, ω is the length of the filter, and d is the dimension of the character vector. The convolutional layer uses multiple filters in parallel to obtain feature maps. We can also use convolutional filters of various lengths to extract different feature information.

The feature maps are fed to the max-pooling layer to obtain the most salient information, forming feature vectors that have proved effective in improving model performance.
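The convolution-and-pooling step can be sketched as follows; the filter lengths and channel counts are illustrative assumptions rather than the paper's settings:

import torch
import torch.nn as nn

feat = torch.randn(1, 20, 1280)   # BERT (768) ⊕ Bi-LSTM (512) per character
z = feat.transpose(1, 2)          # Conv1d expects (batch, channels, length)
convs = nn.ModuleList([nn.Conv1d(1280, 128, k) for k in (2, 3, 4)])
# Each filter length k captures k-gram features; max-pooling over time
# keeps the most salient response of each filter.
pooled = [torch.relu(conv(z)).max(dim=2).values for conv in convs]
features = torch.cat(pooled, dim=-1)   # (1, 384) feature vector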

Figure 5: The illustration of the "BERT + Bi-LSTM" architecture

    3.5 Output Module

The output module uses the output of the feature extraction module to obtain the probability distribution over the concept categories through the classification function.

With the semantic encoding module and the feature extraction module, we obtain the feature representation of the elliptical quantity noun phrase and its context. Then we feed the output of the local feature extraction module into a softmax function to classify the omitted semantic category of the elliptical quantity noun phrase.
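A sketch of this classification step; the number of concept categories is a placeholder, since the paper's Tab. 1 defines the actual label set:

import torch
import torch.nn as nn

num_concepts = 7                        # placeholder for the concept label set
features = torch.randn(1, 384)          # output of the feature extraction module
classifier = nn.Linear(384, num_concepts)
probs = torch.softmax(classifier(features), dim=-1)  # distribution over concepts
prediction = probs.argmax(dim=-1)       # most probable concept category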

    4 Experiments

The experiments in this paper utilize a hybrid neural network model to recover elliptical quantity noun phrases, and multiple sets of experiments are compared to show the improvement of our model. In this section, we first compare and analyze the experimental results, then discuss some error cases and propose feasible future improvements.

    4.1 Dataset

As there is no such dataset ready for use, we build our own elliptical quantity noun phrase dataset by extracting all the head-omitted quantity noun phrases from the CAMR corpus and the Xinhua News Agency corpus: 838 elliptical quantity noun phrases from the CAMR corpus and 676 from the Xinhua News Agency corpus. We observe that in the corpus the omitted head concepts of the elliptical quantity noun phrases are distributed over only a few categories, such as thing and person, which account for 85% of all elliptical quantity noun phrases. Tab. 1 shows the distribution of the elliptical quantity noun phrases and their concept categories in the training and test sets.

Table 1: Concept distribution of the experimental dataset

    4.2 Experiment Settings

The hyperparameter settings of our hybrid neural network are shown in Tab. 2.

Table 2: Settings of hyperparameters

    4.3 Evaluation Metrics

We use accuracy and the Macro-F1 score to measure the performance of our proposed method. Macro-F1 gives equal weight to each class label, which makes it suitable for classification tasks with unbalanced categories [33].
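Both metrics can be computed with scikit-learn, for example (the label values below are made up, and the paper does not state its evaluation tooling):

from sklearn.metrics import accuracy_score, f1_score

y_true = ["thing", "person", "thing", "location"]
y_pred = ["thing", "thing", "thing", "location"]
acc = accuracy_score(y_true, y_pred)                  # fraction of correct labels
macro_f1 = f1_score(y_true, y_pred, average="macro")  # unweighted mean of per-class F1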

    4.4 Results

We conduct comparative experiments to prove the effectiveness of our proposed method; the results are shown in Tab. 3. CharVec means that we adopt 300-dimensional character embedding vectors obtained from the Baidu Encyclopedia corpus trained with the Skip-Gram with Negative Sampling (SGNS) [34] method. Bi-LSTM means a bidirectional LSTM network is used. ATT refers to an attention mechanism for extracting features. CNN + Max pooling means using a convolutional neural network and max pooling to extract features.

Table 3: Performance of different methods

As shown in line 1 of Tab. 3, with the conventional character vector CharVec and the Bi-LSTM model, we obtain an accuracy of 70.98%, which illustrates the effectiveness of the baseline model in complementing elliptical quantity noun phrases. When extra context features are incorporated, the performance improves accordingly. Comparing method 4 with method 5, we find that the syntactic information provided through the BERT pre-trained character vectors plays a vital role in improving our proposed model: it helps locate important characters in the elliptical quantity noun phrase, enabling the following modules to capture better contextual information. Comparing method 2 with method 4, we can see that the CNN and max-pooling feature extraction performs better than the attention-based mechanism on this task. That is because, in a sentence containing an elliptical quantity noun phrase, the words in the context are crucial for inferring what is omitted. Based on this observation, we use convolutional neural networks to obtain contextual semantics at various granularities for feature extraction. As shown in lines 2 and 3, the CNN with max pooling obtains better results than Bi-LSTM+ATT, which also proves the effectiveness of CNN in extracting features.

Despite this achievement in complementation, we notice that there exists an imbalance among the categories of the omitted concepts. Tab. 4 shows the recognition results of the models for each type of concept. The numbering in Tab. 4 is the same as in Tab. 3.

One can notice that the poor performance on location and animal does not obviously affect the overall performance, due to the small amount of data for these concepts. In addition to continuously improving the model to advance the experimental results, we can also expand our experimental dataset; more training corpus would allow the model to learn these characteristics further.

Table 4: Experimental results on different concepts

    4.5 Case Study

Most of the concept complementation errors occur in cross-sentence cases. We use the following sentence as an example for analysis.

In this example, the elliptical quantity noun phrase "213" should be recognized as the location concept to complete its omitted noun head. Since our work uses one sentence as the unit for ellipsis recovery, cross-sentence omission cannot be recovered well. This caused our model to fail in obtaining the information of "small towns" in the example, so the elliptical quantity noun phrase "213" was classified into the thing concept category. Therefore, in later work, we will introduce cross-sentence semantic information into the semantic complement task for elliptical quantity noun phrases, while expanding the dataset to balance the elliptical quantity noun phrases of each concept category.

    4.6 Ablation Experiment

Ablation experiments are conducted to evaluate the contribution of each module. As shown in Tab. 5, we carry out four sets of experiments to compare the effects of the modules.

Table 5: Ablation experiment

Comparing experiment I with experiment IV, the local feature extraction module improves the results by 4.55%. Comparing experiment II with experiment IV, the semantic encoding module increases the overall recovery accuracy by 2.63%. In the input module, the BERT pre-training brings the largest improvement: comparing experiment III with experiment IV, pre-training increases the accuracy of the model by 8.7%. This also suggests that training on a larger-scale corpus yields better deep learning models, and enlarging the corpus is an effective way to improve the experimental results.

    5 Conclusion

In this paper, we aim to recover elliptical quantity noun phrases by automatically identifying their corresponding semantic categories. We propose a hybrid BERT-Bi-LSTM-CNN model, which utilizes BERT to obtain good representations, the Bi-LSTM model to encode long sentences, and CNN to extract features. Experiments show that our model can effectively complement the omission in elliptical quantity noun phrases. Recovery of elliptical Chinese quantity noun phrases helps subsequent Chinese Abstract Meaning Representation (CAMR) parsing by improving the accuracy of the parser; hence, it is beneficial to downstream tasks based on CAMR.

In the future, we will explore the following directions. First, we will develop better models to recognize elliptical quantity noun phrases with more concrete concept categories. Second, we will incorporate comprehensive linguistic knowledge of elliptical quantity noun phrases into the model construction to improve performance.

Acknowledgement: The authors would like to thank all anonymous reviewers for their suggestions and feedback.

Funding Statement: This research is supported by the National Natural Science Foundation of China (Grant Number: 61772278, author: Qu, W.; Grant Number: 61472191, author: Zhou, J.; http://www.nsfc.gov.cn/), the National Social Science Foundation of China (Grant Number: 18BYY127, author: Li, B.; http://www.cssn.cn), the Philosophy and Social Science Foundation of Jiangsu Higher Institutions (Grant Number: 2019SJA0220, author: Wei, T.; https://jyt.jiangsu.gov.cn) and the Jiangsu Higher Institutions' Excellent Innovative Team for Philosophy and Social Science (Grant Number: 2017STD006, author: Qu, W.; https://jyt.jiangsu.gov.cn).

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
