
    Semantic Path Attention Network Based on Heterogeneous Graphs for Natural Language to SQL Task


    ZHOU Haoran(周浩冉), SONG Hui(宋 暉), GONG Mengmeng(龔蒙蒙)

    College of Computer Science and Technology, Donghua University, Shanghai 201620, China

Abstract: The natural language to SQL (NL2SQL) task is an emerging research area that aims to transform a natural language question with a given database into an SQL query. Earlier approaches processed the input into a heterogeneous graph. However, previous models failed to distinguish the types of multi-hop connections of the heterogeneous graph and tended to ignore crucial semantic path information. To this end, a two-layer attention network is presented to focus on essential neighbor nodes and mine enlightening semantic paths for feature encoding. Weighted edges are introduced for schema linking to connect nodes with semantic similarity. In the decoding phase, a rule-based pruning strategy is offered to refine the generated SQL queries. The experimental results show that the approach learns a good encoding representation and decodes the representation into results with practical meaning.

Key words: semantic path; two-layer attention network; semantic linking

    0 Introduction

The natural language to SQL (NL2SQL) task is an emerging research area that aims to transform a natural language question with a given database into an SQL query. NL2SQL techniques are gradually being applied to commercial relational database management systems to develop natural language interfaces to relational databases.

On the single-table query task WIKISQL[1], the X-SQL[2] model has achieved performance beyond the human level. Although researchers have improved the encoding and decoding methods[3-4], there is still a big gap on multi-table query tasks.

Existing encoding methods, such as relation-aware schema encoding and linking for text-to-SQL (RAT-SQL)[3], construct graph neural networks to characterize the relationships between question tokens and database tokens (tables, columns). Line graph enhanced text-to-SQL (LGESQL)[4] establishes extra line graphs over the edges to obtain structural relationships. These methods only establish edges between identical tokens, ignoring tokens with semantic similarity. Moreover, the lack of attention to node relations across multiple hops leads to a partial loss of semantic information. For example, the models do not perform well when the gold column names are not mentioned in the question; in fact, such columns can be found by linking the table names mentioned in the question to their columns.

The decoding methods are based on abstract syntax tree structures to generate SQL queries. RAT-SQL and LGESQL adopt top-down decoding with linear time complexity and a slow generation speed. The semi-autoregressive bottom-up parser (SMBOP)[5] proposed a bottom-up approach to generate syntax trees during decoding and improved the time complexity to log n. However, it generates some SQL queries that are syntactically incorrect, and fixing the number of tables and columns at the beginning leads to redundant predicted tables in the SQL query.

To solve the above-identified problems, a multi-table semantic parsing model named semantic path attention network based on heterogeneous graphs (SPAN) is proposed in this paper. To better characterize the multi-hop relationships among question tokens, table names and column names, the concept of the semantic path is introduced. Six types of semantic paths are constructed to connect the three types of token nodes. A two-layer heterogeneous graph attention network (HAN)[6] is employed to calculate the attention of semantic paths on top of node attention, which adds the information of various semantic paths to the current node in the graph. By considering synonymous tokens in question sentences and databases, the similarity between tokens is characterized as edge weights. Based on SMBOP, a novel pruning strategy is designed to improve the correctness of generated SQL queries at both the key word level and the syntax level. The model is tested on the SPIDER dataset[7].

    The main contributions of this paper are as follows.

1) A multi-table NL2SQL model SPAN is proposed, which fuses node information over the heterogeneous graph through a two-layer attention network (node-level attention and semantic-level attention) in the encoding phase.

    2) Weighted edges are built between nodes with semantic similarity in the encoded network graph, which enables the model to identify synonyms and homonyms in the text of question words, table names, and column names.

    3) A rule-based pruning strategy is proposed to correct invalid key words in the generated results. SQL clauses with logical errors and unnecessary multi-table join queries are discarded.

    1 Related Work

Research on semantic parsing models for NL2SQL has flourished since the proposal of the single-table dataset WIKISQL[1]. So far, schema-aware denoising (SeaD)[8] has achieved 93.0% accuracy on the WIKISQL dataset. However, on the multi-table SPIDER[7] dataset, existing models still cannot achieve this performance due to the complex relationships between tables. Most existing technologies use the encoder-decoder architecture to solve the multi-table problem, and encoding and decoding methods are the hotspots of current research.

    1.1 Encoding issues

Early NL2SQL semantic parsing models performed encoding with sequence-to-sequence (seq2seq) architectures, which have significant limitations in the face of complex datasets.

In RAT-SQL[3], a graph-based idea was used to handle the encoding problem by treating the question words and the database words as nodes in a graph. A schema linking strategy was proposed to establish relationships between nodes, based on string matching rules that detect full or partial matches among question words, table names and column names. This strategy is imperfect and fails on semantically similar words: for example, string matching cannot connect the words "year old" in the question with the column name "age". Another problem is that each node cannot effectively distinguish the importance of its surrounding nodes during encoding, so node features are disturbed by too much irrelevant information. LGESQL also encodes based on heterogeneous graphs. To discriminate important surrounding nodes and capture edge semantic information, two relational graph attention network (RGAT)[9] modules are used in LGESQL to train on nodes and edges, respectively. But it fails to focus on long-path nodes and ignores the information of semantic paths. Bridging textual and tabular data for cross-domain text-to-SQL semantic parsing (BRIDGE)[10] only uses the bidirectional encoder representation from transformers (BERT)[11] framework for encoding, taking question words, table names and column names as inputs. Since BRIDGE does not take full advantage of the relations between words, it does not achieve a good level of performance.

In general, encoding methods in NL2SQL tasks are primarily based on heterogeneous graph structures. Previously, information on long-distance paths had to be obtained through multiple hops, where feature learning is disturbed by noise and such information tends to be ignored. To cope with this problem, a semantic path approach combined with a two-layer attention mechanism is used to fuse this rich heterogeneous information.

    1.2 Decoding issues

In the decoding phase, RAT-SQL generates SQL queries in a depth-first traversal order over a syntax tree[12], outputting the node of each step by decoding the information of the previous step with a long short-term memory (LSTM)[13] network. LGESQL is similar to RAT-SQL in decoding, except that the syntax tree rules are optimized. The disadvantage of these top-down decoding methods is that they consume too much time and search space, which leads to low efficiency. BRIDGE uses a pointer network[14] approach to decode, which still consumes a lot of time on search. The decoding method of SMBOP differs from the previous top-down methods: it proposes a bottom-up method to generate the syntax tree, where each step only merges the subtrees of the previous step, and the time complexity reaches the log(n) level. The current mainstream decoding method is to build a syntax tree for result generation; in terms of efficiency, the bottom-up approach is the choice that saves time.

    2 SPAN Model

    2.1 Problem definition

The task of the SPAN model is described as follows. Given a database schema S = T ∪ C and a natural language question X = {x_1, x_2, …, x_{|X|}} with |X| tokens, where X and S are the inputs, T stands for the table names and C represents the column names, the model automatically parses the corresponding SQL query Y.

    2.2 Encoder

The semantic linking is calculated to build a heterogeneous graph and semantic paths. With the semantic linking graph and feature representations as input, a node-level graph attention network and a semantic-level graph attention network are trained to represent the features of question tokens, table names and column names. The encoding structure of the model is shown in Fig.1.

    Fig.1 SPAN model encoding structure

The input of the encoding part of the model consists of the representation of each feature node and the construction of the heterogeneous graph. The feature nodes are represented by a large-scale pre-trained language model, and the heterogeneous graph is constructed from the different types of nodes and edges. The semantic paths are found in the heterogeneous graph, by definition, before training.

2.2.1 Semantic linking graph

A heterogeneous graph[15] is denoted as G = (V, E), where the node set V consists of question tokens x_i, column names c_i and table names t_i, and E denotes the edges between the nodes in V. The edge types between a question token and a table name are partial match and full match. The edge types between a question token and a column name include not only partial match and full match but also value match, which means that the question token belongs to the values of that column. The edge types between a table name and a column name are primary key, foreign key and ownership relationships.

The linking strategy based on string matching rules ignores words with semantic similarity, such as the question words "year old" and the column name "age". Therefore, connections between such tokens are established based on cosine similarity:

$\mathrm{sim}(V_1,V_2)=\dfrac{\sum_{i=1}^{n}V_{1,i}V_{2,i}}{\sqrt{\sum_{i=1}^{n}V_{1,i}^{2}}\sqrt{\sum_{i=1}^{n}V_{2,i}^{2}}},$  (1)

where V_{1,i} and V_{2,i} are the i-th components of the vectors of the two nodes; i is the component index; n is the dimension of V.
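As a concrete illustration, the following minimal sketch shows how Eq. (1) could be used to build weighted edges between question tokens and schema tokens, assuming pre-computed token embeddings; the 0.2 threshold follows the setup in Section 3.1, and the function names are illustrative.

    import numpy as np

    def cosine_similarity(v1: np.ndarray, v2: np.ndarray) -> float:
        """Eq. (1): cosine similarity between two node vectors."""
        return float(v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2)))

    def semantic_edges(question_vecs, schema_vecs, threshold=0.2):
        """Connect question tokens and schema tokens with semantically
        similar embeddings; the similarity score is kept as the edge weight."""
        edges = []
        for qi, qv in enumerate(question_vecs):
            for si, sv in enumerate(schema_vecs):
                sim = cosine_similarity(qv, sv)
                if sim > threshold:               # 0.2, following Section 3.1
                    edges.append((qi, si, sim))   # weighted edge (i, j, w)
        return edges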

2.2.2 Semantic path

    Table 1 Six semantic paths

Denote N_{φ,i} as the semantic-path based neighbors of node i, defined as the set of nodes connected to node i via the semantic path φ. The neighbors of the same node on different paths may be different.
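To illustrate how the neighbor sets N_{φ,i} could be collected, the sketch below walks a node-type sequence over a heterogeneous graph stored as an adjacency map; the data layout and names are assumptions for illustration, not the paper's implementation.

    from collections import defaultdict

    def semantic_path_neighbors(adj, node_type, path):
        """For every node of type path[0], collect N_{phi,i}: the nodes
        reached by following the node-type sequence `path` hop by hop.
        `adj` maps each node id to its direct neighbors in the graph."""
        neighbors = defaultdict(set)
        for start in (n for n in adj if node_type[n] == path[0]):
            frontier = {start}
            for t in path[1:]:
                frontier = {m for n in frontier
                            for m in adj.get(n, ())
                            if node_type[m] == t}
            neighbors[start] = frontier
        return neighbors

    # e.g., path ("question", "table", "column") links a question token to
    # the columns of a table it mentions, even if no column is mentioned.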

2.2.3 Two-layer attention network

    The semantic path attention network consists of two parts: the node-level attention part learns the attention between the central node and the neighboring nodes obtained based on semantic paths; the semantic-level attention part learns the attention value of different semantic paths.

    2.2.3.1 Node-level attention

Neighbors of different relation types in a heterogeneous graph should have different influences, and node-level attention is used to discover useful neighbor nodes. Each node assigns different weights to its neighboring nodes and then aggregates the neighbor information. The feature of each node i is h_i. Self-attention is used to learn the weight w_{φ,ij} between node i and its neighbors N_{φ,i} under a semantic path φ. The feature of the central node i is passed through the node-level attention layer to obtain its new feature h_{φ,i} under each semantic path φ:

$a_{\varphi,ij}=\mathrm{attention}(h_i,h_j;\varphi),$  (2)

where attention(·) denotes the neural network that performs node-level attention, and a_{φ,ij} is the resulting attention score between node i and node j. The attention parameters are shared across all neighbors j of node i under a semantic path, but the weight of node i with respect to node j is not shared with that of node j with respect to node i.

$w_{\varphi,ij}=\mathrm{softmax}_{j}(a_{\varphi,ij})=\dfrac{\exp(a_{\varphi,ij})}{\sum_{k\in N_{\varphi,i}}\exp(a_{\varphi,ik})},$  (3)

$h_{\varphi,i}=\sigma\Big(\sum_{j\in N_{\varphi,i}}w_{\varphi,ij}h_{j}\Big),$  (4)

where h_{φ,i} denotes the representation of node i after aggregating neighbor information under the semantic path φ. This calculation captures semantic information, and the obtained representation carries the semantic information of the path.

$h_{\varphi,i}=\mathop{\Vert}_{k=1}^{K}\sigma\Big(\sum_{j\in N_{\varphi,i}}w_{\varphi,ij}^{k}h_{j}\Big),$  (5)

Using a single attention mechanism may overfit or pay too much attention to one type of neighbor node. Thus a multi-head attention mechanism with K heads is used, as in Eq. (5), and the model addresses these issues by integrating the results of multiple independent attention heads.
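A minimal PyTorch sketch of Eqs. (2)-(5) is given below. It assumes a GAT-style scoring function as in HAN[6] and a dense 0/1 neighbor mask per semantic path; it is an illustration of the mechanism, not the exact SPAN implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class NodeLevelAttention(nn.Module):
        """Eqs. (2)-(5): attention over semantic-path neighbors N_{phi,i},
        with K independent heads whose outputs are concatenated."""
        def __init__(self, dim: int, heads: int = 8):
            super().__init__()
            self.heads = heads
            self.proj = nn.Linear(dim, dim * heads, bias=False)
            self.attn = nn.Parameter(torch.empty(heads, 2 * dim))  # a_phi per head
            nn.init.xavier_uniform_(self.attn)

        def forward(self, h, neighbor_mask):
            # h: (N, dim); neighbor_mask: (N, N), 1 where j is in N_{phi,i}
            n, d = h.size()
            hp = self.proj(h).view(n, self.heads, d)               # (N, K, d)
            hi = hp.unsqueeze(1).expand(n, n, self.heads, d)       # center i
            hj = hp.unsqueeze(0).expand(n, n, self.heads, d)       # neighbor j
            e = torch.einsum('ijkd,kd->ijk',
                             torch.cat([hi, hj], dim=-1), self.attn)
            e = F.leaky_relu(e)                                    # Eq. (2) scores
            e = e.masked_fill(neighbor_mask.unsqueeze(-1) == 0, float('-inf'))
            w = torch.nan_to_num(torch.softmax(e, dim=1))          # Eq. (3) weights
            out = torch.einsum('ijk,jkd->ikd', w, hp)              # Eq. (4) aggregate
            return out.reshape(n, self.heads * d)                  # Eq. (5) concat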

2.2.3.2 Semantic-level attention

    Different semantic paths correspond to different semantic information. A semantic-level attention mechanism is used to aggregate semantic information from different semantic paths by taking the output of the node-level attention as the input.

$w_{\varphi_{p}}=\dfrac{1}{|V|}\sum_{i\in V}q^{\mathrm{T}}\tanh(Wh_{\varphi_{p},i}+b),$  (6)

where w_{φ_p} is the importance of the semantic path φ_p; W denotes the weight matrix; q^T is the semantic-level attention parameter vector; V is the set of nodes on this semantic path; b is the bias vector.

$\beta_{\varphi_{p}}=\dfrac{\exp(w_{\varphi_{p}})}{\sum_{p=1}^{P}\exp(w_{\varphi_{p}})},$  (7)

A softmax calculation is then performed to obtain the weight β_{φ_p} of each semantic path. The larger the weight value, the greater the influence of the corresponding semantic path.

$H_{i}=\sum_{p=1}^{P}\beta_{\varphi_{p}}h_{\varphi_{p},i},$  (8)

where P is the number of semantic paths involving node i, and H_i is the fused representation of all semantic path information.
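The semantic-level aggregation of Eqs. (6)-(8) could then be sketched as follows, using the 512-dimensional attention space from Section 3.1; the module is illustrative only.

    import torch
    import torch.nn as nn

    class SemanticLevelAttention(nn.Module):
        """Eqs. (6)-(8): weigh the P path-specific representations of each
        node and fuse them into a single representation H_i."""
        def __init__(self, dim: int, hidden: int = 512):
            super().__init__()
            self.W = nn.Linear(dim, hidden)             # W and b of Eq. (6)
            self.q = nn.Parameter(torch.randn(hidden))  # attention vector q

        def forward(self, z):
            # z: (P, N, dim), one representation per semantic path and node
            w = torch.tanh(self.W(z)) @ self.q       # (P, N) per-node scores
            w = w.mean(dim=1)                        # Eq. (6): average over V
            beta = torch.softmax(w, dim=0)           # Eq. (7): path weights
            return (beta.view(-1, 1, 1) * z).sum(0)  # Eq. (8): H, (N, dim)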

    2.3 Decoder and pruning

SQL queries are generated based on the decoding structure of SMBOP[5]. SMBOP generates a relational algebra syntax tree[16]. The terminal nodes of the tree are database column names or values, and the non-terminal nodes are SQL operators, which are either unary or binary operations. A new pruning strategy is added in the decoding stage. In the initial stage, the column names and table names are predicted to narrow the search space, and cross-entropy is used to calculate the loss between the ground truth and the prediction. Then, during the decoding process, the key words of the SQL queries are modified according to the rules. Finally, the predicted SQL query is obtained.

    Inspired by pruning methods reported in Refs. [4, 17-18], pruning is performed in both the initial and the decoding stages.

2.3.1 Initial stage of decoding

    The predicted column names and table names are directly used as the terminal nodes of the syntax tree. In this way, some invalid predictions can be removed in advance to improve accuracy.

$\gamma_{h,ji}=\mathrm{softmax}_{j}\Big(\dfrac{(s_{i}W_{h,sq})(X_{j}W_{h,sk})^{\mathrm{T}}}{\sqrt{d}}\Big),$  (9)

$s'_{i}=\Big(\sum_{j}\gamma_{h,ji}X_{j}W_{h,sv}\Big)W_{so},$  (10)

where s_i denotes the encoded schema constant (table name or column name); s'_i is the representation obtained after contextualizing s_i with the question X, and it is used to compute the probability of each database constant being selected; W_{h,sq}, W_{h,sk}, W_{h,sv} and W_{so} are weight matrices; X_j represents the vector of the j-th question token; d is the dimension of the word embedding vectors.
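Under the assumption that the weight matrices are given, a single-head sketch of this contextualization step might look like:

    import torch

    def contextualize_schema(s, X, Wq, Wk, Wv, Wo, d):
        """Eqs. (9)-(10): cross-attention from schema constants s to the
        question tokens X, yielding the contextualized s' (one head)."""
        scores = (s @ Wq) @ (X @ Wk).T / d ** 0.5  # (|S|, |X|) scaled scores
        gamma = torch.softmax(scores, dim=-1)      # Eq. (9): gamma_{h,ji}
        return (gamma @ (X @ Wv)) @ Wo             # Eq. (10): s'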

The biaffine[19] binary classifier is used to decide whether to select the node as a leaf node:

$P(s_{i}\mid X)=\sigma\big(s_{i}Us'^{\mathrm{T}}_{i}+[s_{i};s'_{i}]W_{s}+b\big),$  (11)

where W_s and U are the weight matrices; b is the bias vector; σ is the activation function. The loss function of this task is

$L=-\sum_{s_{i}}\big[S_{i,\mathrm{gold}}\log P+(1-S_{i,\mathrm{gold}})\log(1-P)\big],$  (12)

where S_{i,gold} indicates whether the column actually appears in the gold query.
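One possible reading of the biaffine selector and its loss, Eqs. (11)-(12), is sketched below; the exact module structure in SMBOP may differ.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LeafSelector(nn.Module):
        """Eq. (11): biaffine classifier deciding whether a schema constant
        becomes a terminal (leaf) node of the syntax tree."""
        def __init__(self, dim: int):
            super().__init__()
            self.U = nn.Parameter(torch.empty(dim, dim))
            self.W = nn.Linear(2 * dim, 1)   # plays the role of W_s and b
            nn.init.xavier_uniform_(self.U)

        def forward(self, s, s_ctx):
            # s, s_ctx: (|S|, dim) schema encodings and contextualized forms
            bilinear = ((s @ self.U) * s_ctx).sum(-1, keepdim=True)  # s_i U s'_i^T
            logits = bilinear + self.W(torch.cat([s, s_ctx], dim=-1))
            return torch.sigmoid(logits).squeeze(-1)                 # P(s_i | X)

    # Eq. (12) is the binary cross-entropy against the gold selections:
    # loss = F.binary_cross_entropy(selector(s, s_ctx), gold.float())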

2.3.2 Decoding process

Rules are used to constrain SQL queries and weed out invalid generations.

For example, consider the query "select tid.cid from tables".

Simple SQL syntax check: the table tid must contain the column cid.

Logic syntax check: tables must contain the table corresponding to tid.

An external knowledge base is constructed to check whether special columns are correct.

A knowledge base is defined in advance, containing a large number of key words together with the words or symbols that share their semantics. When the predicted SQL contains a key word from the knowledge base, the word bag corresponding to that key word is queried. If the question sentence does not contain any similar word from the word bag, the key word is corrected, and the predicted field is replaced or abandoned.

    The examples of knowledge base are shown in Table 2.

    Table 2 Examples of knowledge base
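The sketch below shows how such rule checks and a knowledge-base lookup could fit together; the schema layout and the contents of KNOWLEDGE_BASE are hypothetical examples, not the paper's actual knowledge base.

    def syntax_checks(tid, cid, from_tables, schema):
        """The two checks on 'select tid.cid from ...'; `schema` maps each
        table name to the set of its column names."""
        owns_column = cid in schema.get(tid, set())  # tid must contain cid
        table_joined = tid in from_tables            # tid must appear in FROM
        return owns_column and table_joined

    # Hypothetical knowledge base: a key word maps to a bag of trigger words.
    KNOWLEDGE_BASE = {
        "DESC": {"descending", "reversed", "largest first"},
        "ASC": {"ascending", "smallest first"},
    }

    def keyword_supported(keyword, question):
        """Keep a predicted key word only if some word from its bag occurs
        in the question; otherwise it is corrected, replaced or abandoned."""
        bag = KNOWLEDGE_BASE.get(keyword, set())
        return any(trigger in question.lower() for trigger in bag)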

    3 Experiments and Analyses

    3.1 Experiment setup

The AllenNLP[20] framework is used for the experiments in a Python 3.8 environment. The dimension of the token vectors of the question and the database is 1 024, and grammar-augmented pre-training for table semantic parsing (GRAPPA)[22] is used to encode the tokens. The threshold for the semantic similarity judgment is 0.2: when the semantic similarity is greater than 0.2, the two tokens are considered related. The number of attention heads K is 8. The semantic-level attention vector dimension is 512. The beam size during decoding is 30, and the number of decoding steps during inference is 9. The head number of the multi-head attention in the decoding step is 8, and the hidden dimension is 256. The batch size during training is 10. The model is trained for about 200 epochs with a learning rate of 0.000 186.

    3.2 Datasets

The SPIDER dataset[7] is composed of 10 181 questions and 5 693 unique queries involving more than 200 databases across 138 different domains. The dataset contains complex SQL queries, and its semantic parsing tasks are close to real-world scenarios. Many of the models mentioned above are evaluated on the SPIDER dataset, so the proposed model is evaluated on it as well.

    3.3 Evaluation index

Two indicators are used: exact set match without values (EM) and execution with values (EX). EM compares the corresponding components of the gold SQL and the predicted SQL: if all corresponding components are consistent, regardless of their order, the prediction is considered correct; otherwise it is false. EX executes the gold SQL and the predicted SQL in the database, respectively, and the prediction is considered correct when the two results are the same. The official verification script provided with the SPIDER dataset is used to evaluate the model with these two indicators.
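For illustration only, a minimal sketch of the EX comparison on an SQLite database could look like the following; the official script additionally handles details such as value normalization that this sketch ignores.

    import sqlite3

    def execution_match(db_path, gold_sql, pred_sql):
        """EX metric sketch: run both queries and compare the result sets,
        ignoring row order; assumes both queries execute successfully."""
        with sqlite3.connect(db_path) as conn:
            gold = conn.execute(gold_sql).fetchall()
            pred = conn.execute(pred_sql).fetchall()
        return sorted(gold, key=repr) == sorted(pred, key=repr)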

    3.4 Analysis of experimental results

To compare model performance, the models in Table 3 were trained and evaluated in the same experimental environment on the SPIDER Dev set. EX of the SMBOP model is 74.6% and EM is 74.1%, while EX of the SPAN model reaches 76.1% and EM achieves 75.8%.

    Table 3 Comparison of experimental results for each model

As shown in Table 3, five mainstream models are compared. The SPAN model is a great improvement over RAT-SQL (69.7%), while the improvement over the LGESQL model (75.1%) is smaller. By employing semantic linking information, SPAN further captures distant path representations and distinguishes different types of paths.

    3.5 Ablation studies

Ablation experiments were conducted with several model variants to investigate the contribution of various encoding and decoding combinations. The results are shown in Table 4.

    Table 4 Ablation experiment result

1) Comparing SPAN encoding with other encoding approaches, SPAN (76.1%) improves greatly in EX and EM over the model without graph encoding (GRAPPA's initial word vectors, 68.6%). This indicates that the two-layer attention network with semantic-path based encoding is more effective than the simple encoding of RAT-SQL and can obtain more information.

2) Adding the semantic linking on top of RAT encoding compensates for the unlinkable relations in the original method and enriches the heterogeneous graph.

3) The addition of the pruning method yields only a small improvement in EX and EM for RAT-SQL, and likewise for SPAN encoding. The pruning method is not significant for performance enhancement, but it is valuable in practical applications, improving the execution effect and avoiding some obvious misrepresentations.

    3.6 Case studies

Four examples generated with SPAN and SMBOP are given to illustrate the effect of the model, as shown in Figs.2 and 3.

As shown in Fig.2, different models have different perceptions of the same column. In example 1, the original model could not predict "country" and instead selected "ID" because of interfering information. SPAN effectively learns that the "country" column is also important, so it selects the correct column in the WHERE condition, and the word in the question is effectively associated with the ground truth.

    Fig.2 Effect of encoding

In example 2, the question words "mobile phone number" strongly correlate with the column "cell_mobile_number". SMBOP cannot recognize their connection, but the improved linking rules proposed in this paper can link the two. Even though the gold value and the question words differ somewhat, the SPAN model can still find them.

In example 3, redundancies occur because too many irrelevant tables are predicted, and applying the pruning rules reduces such generation errors. In example 4, there are obvious key word errors: "reversed lexicographical" means that a descending arrangement is needed. The previous model may have overfitted, resulting in redundancy, and it is overly rigid with such reversed semantics.

    Fig.3 Effect of pruning strategy

    4 Conclusions

In this work, a multi-table semantic parsing model SPAN for the NL2SQL task is proposed, which explores semantic paths with a two-layer attention network in the encoding phase. It enriches the heterogeneous graph structure by establishing weighted edges based on semantic similarity. The rule-based pruning in the decoding process corrects the predicted SQL. Experimental results illustrate that mining enlightening semantic information in complex NL2SQL tasks is beneficial to model performance. However, the SPAN model is unable to generate words outside the question and the database. In future work, the predictive generation of such external words to refine the representation of the model will be studied.
