
    Word Sense Disambiguation Model with a Cache-Like Memory Module


    LIN Qian(林 倩), LIU Xin(劉 鑫), XIN Chunlei(辛春蕾), ZHANG Haiying(張海英), ZENG Hualin(曾華琳), ZHANG Tonghui(張同輝), SU Jinsong(蘇勁松)

    School of Informatics, Xiamen University, Xiamen 361005, China

Abstract: Word sense disambiguation (WSD), identifying the specific sense of a target word given its context, is a fundamental task in natural language processing. Recently, researchers have shown promising results using long short-term memory (LSTM), which is able to better capture the sequential and syntactic features of text. However, this method neglects the dependencies among instances, such as their context semantic similarities. To solve this problem, we propose a novel WSD model that introduces a cache-like memory module to capture the semantic dependencies among instances. Extensive evaluations on standard datasets demonstrate the superiority of the proposed model over various baselines.

    Key words: word sense disambiguation (WSD); memory module; semantic dependencies

    Introduction

Word sense disambiguation (WSD) aims to accurately identify the specific meaning of an ambiguous word according to its particular context. As a fundamental task in natural language processing (NLP), it benefits many other NLP tasks, such as neural machine translation (NMT), question answering (QA), and sentiment analysis. Therefore, how to construct a high-quality WSD model has attracted much attention in academia and industry.

To achieve this goal, previous studies usually resorted to artificial features containing linguistic and other information. Generally, these models can be grouped into four categories: unsupervised[1-3], supervised[4-5], semi-supervised[6-8], and knowledge-based[9-10] approaches. Recently, with the rapid development of deep learning, the studies of WSD have evolved from conventional feature-engineering-based models into neural network architectures. From this point of view, the common practice is to use word embeddings. For example, word embeddings have been leveraged as WSD features in different ways[11]. In addition, recurrent neural networks (RNN), which effectively exploit word order, have been proven to be effective. Some researchers[12-13] mainly focused on long short-term memory (LSTM) based WSD models, which can capture the sequential and syntactic patterns of the given sentence and thus achieve competitive performance in this task. Despite their success, previous studies conducted WSD in isolation, neglecting the semantic dependencies among instances: target words with similar contexts should have the same sense, an assumption that has been adopted in many NLP tasks, such as entity linking[14-15]. As shown in Fig. 1, for the target word dykes, the same word sense appears in similar contexts.

Instance 1: Assuming that magnetizations in the South Mountains granodiorite, Telegraph Pass granite and felsic dykes were acquired before and during ductile extensional deformation, we interpret these data as demonstrating that the South Mountains footwall has not been significantly tilted after mylonitic deformation. Sense_id: dykeXl:06:00::

Instance 2: Similarly, assuming that magnetizations in the microdiorite dykes were acquired during initial stages of brittle deformation, we interpret these data as demonstrating that the South Mountains footwall has not been significantly tilted after the brittle stages of deformation. Sense_id: dykeXHiOO::

Fig. 1 Two instances of the target word dykes with similar contexts and the same sense

In this paper, we propose a novel WSD model with a cache-like memory module. It is a significant extension of the conventional LSTM-based WSD model[13]. The introduced cache-like memory module memorizes the sense disambiguation results of other instances of the same target word, and thus provides helpful information for disambiguating the current instance. We design this module based on the observation that instances with similar contexts tend to receive the same sense disambiguation result. Besides, since the memory can be traced back to training examples, it may help explain the decisions the model makes and thus improve the interpretability of the model, while the memorized values can help to improve the model accuracy, as verified in studies on other NLP tasks[16-17]. It is worth mentioning that the introduced cache-like memory is composed of key-value pairs: the keys denote the semantic representations of instances, and the values are the corresponding sense disambiguation results. We compute the dot product similarities between the current hidden state and the stored keys in memory. Then, according to these similarities, we summarize the memorized sense disambiguation results as the weighted sum of the values. This summarized vector can be incorporated into the conventional decoder to refine the sense disambiguation result of the current instance. In this way, our proposed model is able to fully exploit the semantic similarities among instances to refine the conventional WSD model. To investigate the effectiveness of our proposed WSD model, we carry out multiple groups of experiments on benchmark datasets. Experimental results and in-depth analysis show that our model outperforms previous WSD models.

The related work mainly includes WSD and memory neural networks. WSD has been one of the hot research topics in the NLP community. More specifically, the previous studies on WSD can be roughly classified into the following categories: unsupervised WSD, supervised WSD, semi-supervised WSD, and knowledge-based WSD. Unsupervised WSD is based on the assumption that similar word senses appear in similar contexts. Therefore, studies on unsupervised WSD mainly focus on how to automatically learn the sense tags of target words from unlabeled data. The typical approaches treat sense disambiguation as a clustering problem that aims to group together examples with similar contexts[1-3, 18-19].

Different from unsupervised WSD, supervised WSD mainly uses manually sense-annotated corpora to train a classifier. Zhong and Ng[4] used a linear-kernel support vector machine as the classifier. Shen et al.[5] also trained a multiclass classifier to distinguish sense categories. Experimental results on many datasets demonstrate that these approaches can achieve satisfying performance on this task.

However, it is costly to obtain sense-annotated corpora, which makes it hard to extend supervised WSD to new domains. To deal with this problem, many researchers have paid attention to semi-supervised WSD, which can simultaneously exploit both labeled and unlabeled datasets[6-8, 20-25].

Unlike the above-mentioned approaches, knowledge-based WSD mainly resorts to external knowledge resources, such as knowledge bases, semantic networks, and dictionaries, to identify the senses of target words[9, 19, 26-33]. However, knowledge-based WSD cannot be widely used because external knowledge resources are rare for many languages and domains.

Recently, with the rapid development of deep learning, neural network based WSD has attracted increasing attention and become the dominant approach to this task[9, 11, 34-36]. Compared with traditional methods, neural network based models can automatically learn features that are beneficial to WSD. In particular, some researchers use LSTM networks to capture the relationship between the context and word meaning by modeling the sequence of words surrounding the target word[12, 13, 37]. However, all of the above work conducts WSD in isolation.

Recently, owing to the role of memory in storing previous results and capturing useful history, memory neural networks have been widely used in many NLP tasks, such as language modeling[38-40], QA[41], and NMT[16, 17, 42]. To the best of our knowledge, our work introduces a memory module into WSD, directly utilizing memorized information from similar examples and thus making better use of the semantic dependencies between instances.

The remainder of this paper is organized as follows. Section 1 describes our proposed model, including details on the model architecture and objective function. Experimental results are presented and analyzed in Section 2, followed by conclusions in Section 3.

    1 Proposed Model

In this section, we describe our proposed WSD model in detail. Our model is a significant extension of the conventional LSTM-based WSD model[13]. However, it is worth noting that the introduced cache-like memory is also applicable to other neural network based WSD models.

    Figure 2 illustrates the model architecture, which is composed of a conventional LSTM-based WSD model and a cache-like memory module.

    1.1 LSTM-based WSD model

    Fig. 2 Architecture of our WSD model

Given a target word x_i and its contextual hidden state h_i, we introduce a softmax layer to predict the probability distribution over its candidate senses. Formally, we produce the probability distribution of candidate senses as:

    (1)

    (2)
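The bodies of Eqs. (1) and (2) are not legible in this copy. As a hedged sketch only (the symbols W^(1), b^(1), and y_i below are illustrative and not necessarily the paper's notation), a standard formulation of such a softmax sense classifier over the hidden state h_i is:

```latex
% Illustrative reconstruction, not the paper's exact equations:
% a linear scoring layer over the hidden state h_i, followed by a
% softmax restricted to the candidate senses S(x_i) of the target word.
\begin{align}
  y_i &= W^{(1)} h_i + b^{(1)}, \\
  p(s_j \mid x_i) &= \frac{\exp(y_{ij})}{\sum_{s_k \in S(x_i)} \exp(y_{ik})}.
\end{align}
```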

    1.2 Cache-like memory module

In order to better identify the sense of a target word, we explicitly model the semantic dependencies among instances to refine the neural WSD model. To this end, we introduce a cache-like memory module that memorizes the sense disambiguation results of other instances as an array of key-value pairs. Our basic intuition is that the more similar the context of the current instance is to that of an instance in memory, the closer their word sense disambiguation results should be.

To exploit the cache-like memory information, we summarize the memorized sense disambiguation results as a memory vector m_i. Formally, m_i is defined as the sum over the values v_t weighted by the normalized similarities {s_t}:

m_i = ∑_t s_t v_t.    (3)

Then we incorporate the memory vector m_i into the final output as

    (4)

    (5)

where σ is the sigmoid function, the dynamic weight λ is used to control the effect of the cache-like memory module, and W^(3), W^(4), and W^(5) are learnable parameter matrices. The basic idea behind this strategy is that the same target word in different instances requires different amounts of context to be disambiguated. For one considered instance, if our model is able to retrieve another instance with a similar context from the cache-like memory, it is more reasonable for our model to exploit the disambiguation result of that instance, and vice versa.
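To make the computation in Eqs. (3)-(5) concrete, the following minimal sketch (in PyTorch) shows one plausible implementation of the memory read and the gated combination. The tensor names, the softmax over dot products, and the exact roles assigned to W^(3), W^(4), and W^(5) are our assumptions where the paper's equations are not recoverable.

```python
import torch
import torch.nn.functional as F

def memory_read_and_gate(h_i, keys, values, W3, W4, W5):
    """Sketch of the cache-like memory read (Eq. (3)) and an assumed
    form of the gated combination with the LSTM prediction (Eqs. (4)-(5)).

    h_i:    (d,)      hidden state of the current instance
    keys:   (T, d)    stored hidden states of other instances of the same word
    values: (T, c)    stored sense distributions (disambiguation results)
    W3:     (c, d+c)  gate matrix; W4: (c, c); W5: (c, d)  (assumed shapes)
    """
    # Normalized dot-product similarities s_t between h_i and the stored keys.
    s = F.softmax(keys @ h_i, dim=0)                      # (T,)
    # Eq. (3): memory vector as the similarity-weighted sum of the values.
    m_i = s @ values                                      # (c,)
    # Assumed Eq. (4): dynamic sigmoid gate controlling the memory's effect.
    lam = torch.sigmoid(W3 @ torch.cat([h_i, m_i]))       # (c,) gate
    # Assumed Eq. (5): combine memory-based and LSTM-based scores, then
    # normalize over the candidate senses.
    logits = lam * (W4 @ m_i) + (1 - lam) * (W5 @ h_i)    # (c,)
    return F.softmax(logits, dim=0)
```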

    1.3 Training objective function

Given a training corpus D, we train the model with parameters θ according to the following cross-entropy objective:

    (6)

where S(x_i) is the sense set of the target word x_i, and t_j(x_i) is the jth element of the sense distribution t(x_i) for x_i. We will describe the training details in Section 2.1.2.
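Equation (6) itself is not legible here; based on the definitions of S(x_i) and t_j(x_i) above, a standard cross-entropy objective of this kind (with p_j(x_i; θ) denoting the model's predicted probability of the jth candidate sense, an assumed symbol) would read:

```latex
% Assumed reconstruction of the training objective: cross-entropy between
% the gold sense distribution t(x_i) and the predicted distribution over
% the candidate senses S(x_i), summed over the training corpus D.
J(\theta) = - \sum_{x_i \in D} \; \sum_{j \in S(x_i)} t_j(x_i) \, \log p_j(x_i; \theta)
```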

    2 Experiments

    2.1 Setup

2.1.1 Datasets

To evaluate our proposed model, we carry out WSD experiments on the lexical sample task of SensEval2[43] and SensEval3[44].

Table 1 provides the details of the experimental data sets, including the training and testing sets. Our baseline is the BiLSTM-based WSD model proposed in Ref. [13].

    Table 1 Details of experimental data sets

We train the proposed WSD model in two steps: pre-training and fine-tuning. In the pre-training step, following Ref. [13], we train a BiLSTM-based WSD model with the hyper-parameter settings presented in Table 2. Please note that we also train the baseline model under the same hyper-parameters, ensuring a fair comparison.

Table 2 Hyper-parameter settings at the pre-training stage

2.1.2 Training Details

Since our training datasets are not large, we employ dropout to prevent the model from over-fitting. Specifically, we set the dropout rates of both the embeddings and the hidden states to 0.5. Besides, we add Gaussian noise ~N(0, 0.2σ_i) to the word embeddings of input sentences, where σ_i is the standard deviation of the ith dimension of the word embedding matrix. In addition, we randomly discard input words with a rate of 0.1 to further alleviate over-fitting, and we use GloVe vectors to initialize the word embeddings. For the out-of-vocabulary (OOV) words not appearing in the GloVe vocabulary, we directly initialize their embeddings from the uniform distribution U(-0.1, 0.1).
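For illustration, the sketch below shows one way to realize the embedding initialization and input perturbation described above. The function names are ours, and whether "discarding" a word means replacing it with an UNK token (as done here) and whether 0.2σ_i denotes a standard deviation are assumptions.

```python
import numpy as np

def build_embeddings(vocab, glove, dim=100, seed=0):
    """Initialize embeddings from GloVe; OOV words are drawn from U(-0.1, 0.1)."""
    rng = np.random.default_rng(seed)
    emb = np.zeros((len(vocab), dim), dtype=np.float32)
    for idx, word in enumerate(vocab):
        emb[idx] = glove[word] if word in glove else rng.uniform(-0.1, 0.1, dim)
    return emb

def perturb_inputs(word_ids, emb, unk_id, rng, drop_rate=0.1, noise_scale=0.2):
    """Randomly discard input words (rate 0.1) and add Gaussian noise
    N(0, 0.2 * sigma_i) per embedding dimension (sigma_i assumed to be a std)."""
    sigma = emb.std(axis=0)                                    # per-dimension std
    kept = [unk_id if rng.random() < drop_rate else w for w in word_ids]
    x = emb[kept]                                              # (seq_len, dim)
    return (x + rng.normal(0.0, noise_scale * sigma, size=x.shape)).astype(np.float32)
```

The dropout of 0.5 on the embedding and hidden layers would be applied inside the network itself and is not shown here.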

We apply the stochastic gradient descent (SGD) algorithm to optimize model training. To balance the performance and the training speed of the model, we first use a large learning rate at the early stage so that the model can descend quickly along the gradient direction, and then adopt a smaller learning rate at the later stage so that the parameters change slowly and approach the optimum. The learning rate is decayed by a factor of 0.96 every 75 steps.
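A minimal sketch of this schedule, assuming standard PyTorch SGD with a stepwise exponential decay (the initial learning rate below is a placeholder, not the paper's value):

```python
import torch

def make_optimizer(params, init_lr=1.0, decay=0.96, decay_every=75):
    """SGD whose learning rate follows lr(step) = init_lr * decay ** (step // decay_every)."""
    optimizer = torch.optim.SGD(params, lr=init_lr)
    scheduler = torch.optim.lr_scheduler.StepLR(optimizer,
                                                step_size=decay_every, gamma=decay)
    return optimizer, scheduler

# Usage in the training loop (the scheduler is stepped once per update so that
# the decay is applied every 75 optimization steps):
#   loss.backward(); optimizer.step(); scheduler.step(); optimizer.zero_grad()
```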

At the fine-tuning stage, we add the cache-like memory module to our WSD model. Note that before fine-tuning, we store the hidden states and sense disambiguation results of all training instances as key-value pairs in the cache-like memory, and these key-value pairs are fixed during fine-tuning. The hyper-parameters of the cache-like memory module are shown in Table 3. To avoid the slow training caused by an overly small learning rate, we set a lower bound on the learning rate. In addition, we clip the gradients to deal with the exploding gradient problem.
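The sketch below illustrates the two points above: populating the fixed key-value memory from the pre-trained model before fine-tuning, and a fine-tuning step with gradient clipping and a learning-rate floor. The helpers model.encode and model.predict_senses, as well as the numeric values of min_lr and max_norm, are hypothetical.

```python
import torch

@torch.no_grad()
def build_memory(model, train_instances):
    """Store each training instance's hidden state (key) and its sense
    disambiguation result (value); the pairs stay fixed during fine-tuning.
    In practice the memory would be grouped per target word."""
    keys, values = [], []
    for inst in train_instances:
        h = model.encode(inst)           # hidden state of the target word (hypothetical helper)
        p = model.predict_senses(h)      # sense distribution from the pre-trained model
        keys.append(h)
        values.append(p)
    return torch.stack(keys), torch.stack(values)

def clipped_update(model, optimizer, scheduler, loss, min_lr=1e-4, max_norm=5.0):
    """One fine-tuning step with gradient clipping and a lower bound on the learning rate."""
    optimizer.zero_grad()
    loss.backward()
    torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm)
    optimizer.step()
    scheduler.step()
    for group in optimizer.param_groups:     # enforce the learning-rate floor
        group["lr"] = max(group["lr"], min_lr)
```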

Table 3 Hyper-parameter settings of the cache-like memory module

2.1.3 Baselines

    We refer to our model as MEM-BiLSTM and compare it with the following baselines.

    (1) 100JHU(R)[45]. It exploits a rich set of features for WSD.

    (2) IMS+adapted CW[34]. It uses a feedforward neural network to incorporate word embeddings into WSD model.

    (3) BiLSTM[13]. It is a commonly-used WSD model, which is based on bi-directional LSTM.

    2.2 Experimental results

2.2.1 Performance

The results of different models measured in terms of F1 score are given in Table 4. Compared with the previous models, our reimplemented baseline achieves better or comparable performance on the two datasets, which demonstrates that it is competitive. Furthermore, when the baseline is equipped with our cache-like memory module, our WSD model achieves the best scores on SensEval2 and SensEval3 with varying degrees of improvement. Specifically, on the two datasets, our WSD model outperforms the reimplemented BiLSTM baseline by 0.4 and 0.3 in F1 score, respectively, which strongly indicates that adding the memory module helps the WSD model.

Table 4 Results for SensEval 2 and SensEval 3 on the English lexical sample task

2.2.2 Generality

To verify the generality of our proposed model, we also train different models using different proportions of the training corpora: 10%, 25%, 50%, 75%, and 100%, and report the performance in Table 5. We can observe that as the amount of training data increases, the performance gap between the baseline and our model becomes larger. The underlying reason is that with more training data, our model is able to exploit more similar instances to refine WSD.

    Table 5 Results for SensEval 2 and SensEval 3 on the English lexical sample task

    2.3 Case study

To analyze why our model outperforms the baseline, we compare the WSD results of different models. Figures 3-5 show three examples. We can observe that, in comparison to BiLSTM, our proposed model is able to make correct predictions with the help of semantically related instances from the memory. Moreover, we provide the three most similar instances for the target words "argument", "activate", and "hearth" in the last three rows of Figs. 3-5, respectively.

Instance: ... When it fell to Dukes to introduce the second stage of the Bill empowering the referendum, he was forced to address himself specifically to the bishops' arguments in their letter (text, Irish Times, 15 May 1986).
Reference: argument0Zol: 10:02::
BiLSTM: argumentyol: 10:03::
MEM-BiLSTM: argumentyol: 10:02::
The most similar instance 1: ... This has some affinity with the Marxist position. In a published argument between Scholes and Hirsch, the former made the following statement, on the assumption that the conservative Hirsch would disagree with it ...
The most similar instance 2: ... he accepted China's offer of a seat on the Basic Law Drafting Committee, helping to write Hong Kong SAR China's post-1997 mini-constitution, and was embroiled in more unsuccessful arguments for direct elections, opposed by mainland communists and Hong Kong conservatives ...
The most similar instance 3: ... Although the banks will begin to present their arguments today, Mr Scrivener said: this court is not concerned with private rights ...

    Fig. 3 The first example

    Fig. 4 The second example

    3 Conclusions

In this paper, we proposed a novel WSD model with a cache-like memory module. As an improvement over the conventional LSTM-based WSD model, our model incorporates a cache-like memory module composed of key-value pairs, where the keys denote the semantic representations of instances and the values are the corresponding sense disambiguation results. We first compute the dot product similarities between the current hidden state and the stored keys in memory. We then summarize the memory values as a memory vector according to these similarities, and the induced memory vector is exploited to refine the WSD result of the current instance. Extensive experiments validate the effectiveness of our proposed model.

In the future, we plan to design more effective architectures to better exploit semantic dependencies between instances for WSD. Besides, how to introduce graph neural networks into WSD is also one of our focuses in future research.
