
    Semantic Path Attention Network Based on Heterogeneous Graphs for Natural Language to SQL Task


    ZHOU Haoran(周浩冉), SONG Hui(宋 暉), GONG Mengmeng(龔蒙蒙)

    College of Computer Science and Technology, Donghua University, Shanghai 201620, China

Abstract: The natural language to SQL (NL2SQL) task is an emerging research area that aims to transform a natural language question, given a database, into an SQL query. Earlier approaches processed the input into a heterogeneous graph. However, previous models failed to distinguish the types of multi-hop connections in the heterogeneous graph and thus tended to ignore crucial semantic path information. To this end, a two-layer attention network is presented to focus on essential neighbor nodes and mine enlightening semantic paths for feature encoding. Weighted edges are introduced for schema linking to connect nodes with semantic similarity. In the decoding phase, a rule-based pruning strategy is offered to refine the generated SQL queries. The experimental results show that the approach learns a good encoding representation and decodes it into results with practical meaning.

Key words: semantic path; two-layer attention network; semantic linking

    0 Introduction

The natural language to SQL (NL2SQL) task is an emerging research area that aims to transform a natural language question, given a database, into an SQL query. NL2SQL techniques are gradually being applied to commercial relational database management systems to develop natural language interfaces to relational databases.

Since the introduction of the single-table query task WIKISQL[1], the X-SQL[2] model has achieved performance beyond the human level on this task. However, although researchers have improved the encoding and decoding methods[3-4], there is still a large gap for multi-table query tasks.

Existing encoding methods, such as relation-aware schema encoding and linking for text-to-SQL (RAT-SQL)[3], construct graph neural networks to characterize the relationship between question tokens and database tokens (tables, columns) as nodes. Line graph enhanced text-to-SQL (LGESQL)[4] establishes extra line graphs over the edges to obtain structural relationships. These methods only establish edges between identical tokens, ignoring tokens with semantic similarity. Moreover, the lack of attention to node relations across multiple hops leads to a partial loss of semantic information. For example, the models do not perform well when the gold column names are not mentioned in the question sentences, even though those columns can be found by linking the table names mentioned in the question to their columns.

The decoding methods are based on abstract syntax tree structures to generate SQL queries. RAT-SQL and LGESQL adopt top-down decoding and have linear time complexity with a slow generation speed. The semi-autoregressive bottom-up parser (SMBOP)[5] proposed a bottom-up approach to generate syntax trees in decoding and improves the time complexity to log n. However, it generates some SQL queries that are syntactically incorrect, and fixing the number of tables and columns at the beginning leads to redundant predicted tables in the SQL query.

To solve the above-identified problems, a multi-table semantic parsing model named semantic path attention network based on heterogeneous graphs (SPAN) is proposed in this paper. To better characterize the multi-hop relationships among question tokens, table names and column names, the concept of the semantic path is introduced. Six types of semantic paths are constructed to connect three types of token nodes. A two-layer heterogeneous graph attention network (HAN)[6] is employed to calculate the attention of semantic paths on top of node attention, which adds the information of various semantic paths to the current node in the graph. By considering synonymous tokens in question sentences and databases, the similarity between tokens is characterized as edge weights. Based on SMBOP, a novel pruning strategy is designed to improve the correctness of generated SQL queries at both the key word level and the syntax level. The model is tested on the SPIDER dataset[7].

    The main contributions of this paper are as follows.

1) A multi-table NL2SQL model SPAN is proposed; in the encoding phase, it fuses node information over a heterogeneous graph through node-level attention and semantic-level attention.

2) Weighted edges are built between nodes with semantic similarity in the encoded graph, which enables the model to identify synonyms and homonyms among question words, table names, and column names.

    3) A rule-based pruning strategy is proposed to correct invalid key words in the generated results. SQL clauses with logical errors and unnecessary multi-table join queries are discarded.

    1 Related Work

Research on semantic parsing models for NL2SQL has flourished since the proposal of the single-table dataset WIKISQL[1]. So far, schema-aware denoising (SeaD)[8] has achieved 93.0% accuracy on the WIKISQL dataset. However, on the SPIDER[7] dataset, which is a multi-table task, existing models still cannot achieve this performance due to the complex relationships between tables. Most existing technologies use the encoder-decoder architecture to solve the multi-table problem. Encoding and decoding methods are the hotspots of current research.

    1.1 Encoding issues

The encoding of early NL2SQL semantic parsing models was carried out by seq2seq architectures. This method has significant limitations in the face of complex datasets.

In RAT-SQL[3], a graph-based idea was used to handle the encoding problem by treating the question words and the database words as nodes in a graph. A schema linking strategy was proposed to establish relationships between the nodes, based on string matching rules that detect full or partial matches among question words, table names and column names. This strategy is imperfect and fails when semantically similar words are compared; for example, string matching cannot connect the words "year old" in the question with the column name "age". There is also the problem that each node in the encoding cannot effectively distinguish the importance of the surrounding nodes, so node features suffer from the interference of too much irrelevant information. LGESQL also encodes based on heterogeneous graphs. To discriminate important surrounding nodes and capture the semantic information of edges, two relational graph attention network (RGAT)[9] modules are used in LGESQL to train on nodes and edges, respectively. But it fails to focus on long-path nodes and ignores the information of semantic paths. Bridging textual and tabular data for cross-domain text-to-SQL semantic parsing (BRIDGE)[10] only uses the bidirectional encoder representation from transformers (BERT)[11] framework for encoding, taking question words, table names and column names as inputs for training. Since BRIDGE does not take full advantage of the relations between words, it does not achieve a good level of performance.

In general, encoding methods in NL2SQL tasks are primarily based on heterogeneous graph structures. Previously, information on long-distance paths had to be obtained through multiple hops, so feature learning was vulnerable to noise interference and this information was often ignored. To cope with this problem, a semantic path approach combined with a two-layer attention mechanism is used to fuse this rich heterogeneous information.

    1.2 Decoding issues

In the decoding phase, RAT-SQL generates SQL queries in the depth-first traversal order of a syntax tree[12]. It outputs the nodes of each step by decoding the information of the previous step with a long short-term memory (LSTM)[13] network. LGESQL is similar to RAT-SQL in decoding, except that the syntax tree rules are optimized. The disadvantage of these decoding methods is that they consume too much time and search space, which leads to low efficiency. BRIDGE uses a pointer network[14] approach to decode, which still needs a lot of time to complete the search. The decoding method of SMBOP differs from the previous top-down methods: it proposes a bottom-up method to generate the syntax tree, where each step only needs to merge the subtrees of the previous step, and the time complexity reaches the log(n) level. The current mainstream decoding method is to build a syntax tree for result generation; in terms of efficiency, the bottom-up approach is the choice to save time and improve efficiency.

    2 SPAN Model

    2.1 Problem definition

The task of the SPAN model is described as follows. Given a database schema S = T ∪ C and a natural language question X = {x_1, x_2, …, x_{|X|}} with |X| tokens, where X and S are the inputs, T stands for the table names and C represents the column names, the model automatically parses the corresponding SQL query Y.

    2.2 Encoder

Semantic linking is calculated to build a heterogeneous graph and semantic paths. With the semantic linking graph and feature representations as input, a node-level graph attention network and a semantic-level graph attention network are trained to represent the features of question tokens, table names and column names. The encoding structure of the model is shown in Fig. 1.

    Fig.1 SPAN model encoding structure

The input of the encoding part of the model consists of the representation of each feature node and the construction of the heterogeneous graph. The feature node representations are produced by a large-scale pre-trained language model, and the heterogeneous graph is constructed according to the different types of nodes and edges. The semantic paths are found in the heterogeneous graph, by definition, before training.

2.2.1 Semantic linking graph

A heterogeneous graph[15] is denoted as G = (V, E), where the node set V consists of question tokens x_i, column names c_i and table names t_i, and E denotes the edges between the nodes in V. The edge types between a question token and a table name are partial match and full match. The edge types between a question token and a column name include not only partial match and full match but also value match, which means the question token belongs to the values of that column. The edge types between a table name and a column name are primary key, foreign key, and ownership relationships.

The linking strategy based on string matching rules ignores words with semantic similarities, such as the question words "year old" and the column name "age". Therefore, connections between such tokens are established based on cosine similarity:

sim(V_1, V_2) = (∑_{i=1}^{n} V_{1,i} V_{2,i}) / (√(∑_{i=1}^{n} V_{1,i}²) √(∑_{i=1}^{n} V_{2,i}²)),   (1)

where V_{1,i} and V_{2,i} are the i-th components of the two node vectors, i is the component index, and n is the dimension of the vectors.
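As an illustration, the following sketch (PyTorch; the function and variable names are ours, not the paper's) computes Eq. (1) between every question token and every schema token and keeps the pairs above the similarity threshold of 0.2 used in the experiments as weighted edges.

import torch
import torch.nn.functional as F

def cosine_edge_weights(question_vecs, schema_vecs, threshold=0.2):
    # question_vecs: (m, n) question-token embeddings
    # schema_vecs:   (k, n) table/column-name embeddings
    # Returns (q_idx, s_idx, weight) triples for pairs whose cosine
    # similarity (Eq. (1)) exceeds the threshold.
    q = F.normalize(question_vecs, dim=-1)
    s = F.normalize(schema_vecs, dim=-1)
    sim = q @ s.T  # (m, k) cosine-similarity matrix
    pairs = (sim > threshold).nonzero(as_tuple=False).tolist()
    return [(i, j, sim[i, j].item()) for i, j in pairs]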

2.2.2 Semantic path

    Table 1 Six semantic paths

Denote N_{φ,i} as the semantic-path based neighbors of node i, defined as the set of nodes connected to node i via a semantic path φ. The neighbors of the same node on different paths may be different.
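One way to materialize N_{φ,i} is to compose the adjacency matrices of the edge types along the path; the sketch below (NumPy) follows this assumption, and the edge-type names in the comment are illustrative rather than the paper's labels.

import numpy as np

def metapath_neighbors(adjs, path):
    # adjs: dict mapping an edge type (e.g., 'question-table',
    #       'table-column') to a boolean adjacency matrix
    # path: sequence of edge types defining one semantic path phi
    # Returns a boolean matrix R with R[i, j] True iff node j is a
    # semantic-path based neighbor of node i along phi.
    reach = adjs[path[0]].astype(int)
    for etype in path[1:]:
        reach = reach @ adjs[etype].astype(int)
    return reach > 0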

2.2.3 Two-layer attention network

    The semantic path attention network consists of two parts: the node-level attention part learns the attention between the central node and the neighboring nodes obtained based on semantic paths; the semantic-level attention part learns the attention value of different semantic paths.

    2.2.3.1 Node-level attention

Neighbors of different relation types in a heterogeneous graph should have different influences, and node-level attention is used to discover useful neighbor nodes. Each node obtains different weights for its neighboring nodes and then aggregates the neighbor information. The feature of each node i is h_i. Self-attention is used to learn the weight w_{φ,ij} between node i and its neighbors N_{φ,i} under a semantic path φ. The feature of the central node i is passed through the node-level attention layer to obtain its new feature h_{φ,i} under each semantic path φ.

a_{φ,ij} = attention(h_i, h_j; φ),   (2)

where attention(·) denotes the neural network that performs node-level attention, and a_{φ,ij} is the attention score between node i and node j. The attention parameters are shared across all neighbors j of node i under a semantic path, but the weight of node i to node j is not shared with the weight of node j to node i, i.e., the attention is asymmetric.

w_{φ,ij} = softmax_j(a_{φ,ij}) = exp(a_{φ,ij}) / ∑_{k∈N_{φ,i}} exp(a_{φ,ik}),   (3)

h_{φ,i} = σ(∑_{j∈N_{φ,i}} w_{φ,ij} h_j),   (4)

where h_{φ,i} denotes the representation of node i after aggregating neighbor node information under the semantic path φ. The calculation captures semantic information, and the obtained representation carries the semantics of the path.

h_{φ,i} = ∥_{k=1}^{K} σ(∑_{j∈N_{φ,i}} w^{k}_{φ,ij} h_j),   (5)

Using a single attention mechanism may overfit or pay too much attention to one type of neighbor node. Thus, a multi-head attention mechanism is used: the results of K independent attention heads are integrated, as in Eq. (5).
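The following is a minimal sketch of the node-level layer (PyTorch), assuming a GAT-style scorer for the attention(·) network of Eq. (2); the class name, the LeakyReLU choice, and the omission of the σ activation of Eq. (4) are our assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NodeLevelAttention(nn.Module):
    # One semantic path phi: each node attends over its phi-neighbors
    # (Eqs. (2)-(4)) with K independent heads whose outputs are
    # concatenated (Eq. (5)).
    def __init__(self, dim, heads=8):
        super().__init__()
        assert dim % heads == 0
        self.heads, self.d = heads, dim // heads
        self.proj = nn.Linear(dim, dim, bias=False)
        # one scoring vector per head, applied to [h_i || h_j]
        self.att = nn.Parameter(torch.randn(heads, 2 * self.d))

    def forward(self, h, neighbors):
        # h: (N, dim) node features; neighbors: (N, N) bool mask N_phi
        n = h.size(0)
        z = self.proj(h).view(n, self.heads, self.d)          # (N, K, d)
        zi = z.unsqueeze(1).expand(n, n, self.heads, self.d)  # center i
        zj = z.unsqueeze(0).expand(n, n, self.heads, self.d)  # neighbor j
        e = F.leaky_relu((torch.cat([zi, zj], -1) * self.att).sum(-1))
        e = e.masked_fill(~neighbors.unsqueeze(-1), float('-inf'))
        w = torch.nan_to_num(torch.softmax(e, dim=1))         # Eq. (3)
        h_phi = torch.einsum('ijk,jkd->ikd', w, z)            # Eq. (4)
        return h_phi.reshape(n, -1)                           # Eq. (5)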

    2.2.3.2 Semantic level attention

    Different semantic paths correspond to different semantic information. A semantic-level attention mechanism is used to aggregate semantic information from different semantic paths by taking the output of the node-level attention as the input.

w_{φ_p} = (1/|V|) ∑_{i∈V} q^T tanh(W h_{φ_p,i} + b),   (6)

where w_{φ_p} is the importance of the semantic path φ_p; W denotes the weight matrix; q^T is the parameter vector; V is the set of nodes on this semantic path; b is the offset vector.

β_{φ_p} = exp(w_{φ_p}) / ∑_{p=1}^{P} exp(w_{φ_p}),   (7)

A softmax calculation is performed to obtain the weight β_{φ_p} of each semantic path. The larger the weight value, the greater the influence of the semantic path.

H_i = ∑_{p=1}^{P} β_{φ_p} h_{φ_p,i},   (8)

where P is the number of semantic paths involving node i, and H_i is the fused representation of the information from all semantic paths.
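A minimal sketch of the semantic-level layer (PyTorch), implementing Eqs. (6)-(8) with the 512-dimensional attention vector reported in Section 3.1; the class name is ours.

import torch
import torch.nn as nn

class SemanticLevelAttention(nn.Module):
    # Score each semantic path (Eq. (6)), softmax the scores into
    # beta (Eq. (7)), and fuse the per-path node features (Eq. (8)).
    def __init__(self, dim, hidden=512):
        super().__init__()
        self.W = nn.Linear(dim, hidden)             # W and offset b
        self.q = nn.Parameter(torch.randn(hidden))  # q^T

    def forward(self, h_paths):
        # h_paths: (P, N, dim) node features under each of P paths
        w = (torch.tanh(self.W(h_paths)) @ self.q).mean(dim=1)  # (P,)
        beta = torch.softmax(w, dim=0)                          # (P,)
        return (beta.view(-1, 1, 1) * h_paths).sum(dim=0)       # (N, dim)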

    2.3 Decoder and pruning

SQL queries are generated based on the decoding structure of SMBOP[5]. SMBOP generates a relational algebra syntax tree[16]. The terminal nodes of the tree are database column names or values, and the non-terminal nodes are SQL operators, which are either unary or binary operations. A new pruning strategy is added in the decoding stage. In the initial stage, the column names and the table names are predicted to narrow the search space, and cross-entropy is used to calculate the loss between the ground truth and the prediction. Then, during the decoding process, the key words of the SQL queries are modified according to the rules. Finally, the predicted SQL query is obtained.

    Inspired by pruning methods reported in Refs. [4, 17-18], pruning is performed in both the initial and the decoding stages.

2.3.1 Initial stage of decoding

    The predicted column names and table names are directly used as the terminal nodes of the syntax tree. In this way, some invalid predictions can be removed in advance to improve accuracy.

γ_{h,ji} = softmax_j((s_i W_{h,sq})(X_j W_{h,sk})^T / √d),   (9)

s′_i = (∑_j γ_{h,ji} X_j W_{h,sv}) W_{so},   (10)

where s_i denotes the encoded schema item (table name or column name); s′_i is the representation after contextualizing s_i with the question X and is used to compute the probability of each database constant being selected; W_{h,sq}, W_{h,sk}, W_{h,sv} and W_{so} are weight matrices; X_j represents the vector of the j-th question token; d is the dimension of the word embedding vector.

A biaffine[19] binary classifier is used to decide whether to select a node as a leaf node:

P(s_i | X) = σ(s_i U s′_i^T + [s_i; s′_i] W_s + b),   (11)

where W_s and U are weight matrices, b is the bias vector, and σ is the sigmoid activation function. The loss function of this task is

L = −∑_{s_i} [S_{i,gold} log P + (1 − S_{i,gold}) log(1 − P)],   (12)

where S_{i,gold} indicates whether the schema item actually appears in the gold query.
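A minimal sketch of this leaf-node selection (PyTorch), implementing the biaffine score of Eq. (11) and the binary cross-entropy loss of Eq. (12); the layer and function names are ours.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LeafSelector(nn.Module):
    # Biaffine score of Eq. (11): sigma(s_i U s'_i^T + [s_i; s'_i] W_s + b)
    def __init__(self, dim):
        super().__init__()
        self.U = nn.Parameter(torch.randn(dim, dim))
        self.w = nn.Linear(2 * dim, 1)  # holds W_s and the bias b

    def forward(self, s, s_ctx):
        # s, s_ctx: (n_schema, dim) encoded / question-contextualized schema
        bilinear = (s @ self.U * s_ctx).sum(-1, keepdim=True)
        logits = bilinear + self.w(torch.cat([s, s_ctx], -1))
        return torch.sigmoid(logits).squeeze(-1)  # P(s_i | X)

def leaf_loss(p, gold):
    # Eq. (12); gold is a 0/1 tensor marking schema items that
    # actually appear in the gold query.
    return F.binary_cross_entropy(p, gold.float())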

2.3.2 Decoding process

    Rules are used to constrain SQL queries to weed out invalid generation.

For example, consider the query "select tid.cid from tables":

Simple syntax check: the table tid must contain the column cid.

Logic syntax check: the FROM clause tables must contain the table corresponding to tid.

An external knowledge base is constructed to check whether the special columns are correct.

A knowledge base is defined in advance, containing a large number of key words together with words or symbols of similar semantics. When the predicted SQL contains a key word in the knowledge base, the word bag corresponding to that key word is queried. If the question sentence does not contain any similar word from the word bag, the key word is corrected: the predicted field is replaced or abandoned.

    The examples of knowledge base are shown in Table 2.

    Table 2 Examples of knowledge base
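The following sketch shows one plausible form of these checks (plain Python); the schema and knowledge-base layouts are our assumptions, since Table 2 only gives examples.

def check_column_reference(from_tables, schema, tid, cid):
    # Simple syntax check: table tid must own column cid.
    # Logic syntax check: tid must appear in the FROM clause.
    # schema: dict mapping a table name -> set of its column names.
    return cid in schema.get(tid, set()) and tid in from_tables

def keyword_is_supported(question_tokens, keyword, knowledge_base):
    # knowledge_base: dict mapping a key word to its word bag of
    # semantically similar words/symbols. If no question token falls
    # in the bag, the predicted key word should be replaced or abandoned.
    bag = knowledge_base.get(keyword, set())
    return any(tok in bag for tok in question_tokens)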

    3 Experiments and Analyses

    3.1 Experiment setup

The AllenNLP[20] framework is used for experiments in a Python 3.8 environment. The dimension of the token vectors of the question and the database is 1 024, and grammar-augmented pre-training for table semantic parsing (GRAPPA)[22] is used to encode the tokens. The threshold for the semantic similarity judgment is 0.2: when the semantic similarity between two tokens is greater than 0.2, they are considered related. The multi-head attention number K is 8. The semantic-level attention vector dimension is 512. The beam size during decoding is 30. The number of decoding steps during inference is 9. The head number of the multi-head attention in the decoding step is 8, and the hidden dimension is 256. The batch size during training is 10. About 200 epochs are trained, and the learning rate is 0.000 186.
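For reference, the hyperparameters listed above can be collected into a plain configuration dictionary (the key names are illustrative, not from the paper's code).

# Section 3.1 hyperparameters as a plain config dictionary
CONFIG = {
    "token_dim": 1024,             # GRAPPA token vectors
    "similarity_threshold": 0.2,   # semantic-linking cosine threshold
    "node_attention_heads": 8,     # multi-head number K
    "semantic_attention_dim": 512, # semantic-level attention vector
    "beam_size": 30,
    "decoding_steps": 9,
    "decoder_heads": 8,
    "decoder_hidden_dim": 256,
    "batch_size": 10,
    "epochs": 200,
    "learning_rate": 1.86e-4,
}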

    3.2 Datasets

The SPIDER dataset[7] is composed of 10 181 questions and 5 693 unique queries involving more than 200 tables in the databases and 138 different domains. The dataset has complex SQL queries, and its semantic parsing tasks are close to real-world scenarios. Many of the models mentioned above are evaluated on the SPIDER dataset, and our model is applied to the same dataset.

    3.3 Evaluation index

Two evaluation indicators are used: exact set match without values (EM) and execution with values (EX). EM measures the similarity between the corresponding components of the gold SQL and the predicted SQL: if all corresponding components are consistent, regardless of the order of the components, the prediction is counted as correct; otherwise it is false. EX executes the gold SQL and the predicted SQL in the database, respectively, and the prediction is considered correct when the two results are the same. The official verification script provided with the SPIDER dataset is used to evaluate the model.
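As an illustration of the EM idea only (the official SPIDER script performs the real clause parsing), a prediction matches when every clause agrees as an unordered set of components.

def exact_set_match(gold_clauses, pred_clauses):
    # gold_clauses / pred_clauses: dict mapping a clause name
    # (e.g., 'select', 'where') to its list of components, assumed to
    # be produced by an upstream SQL parser.
    if gold_clauses.keys() != pred_clauses.keys():
        return False
    return all(set(gold_clauses[k]) == set(pred_clauses[k])
               for k in gold_clauses)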

    3.4 Analysis of experimental results

To compare model performance, the models in Table 3 were trained and evaluated in the same experimental environment on the SPIDER development set. The EX of the SMBOP model is 74.6% and its EM is 74.1%. The EX of the SPAN model reaches 76.1%, and its EM achieves 75.8%.

    Table 3 Comparison of experimental results for each model

As shown in Table 3, five mainstream models are compared. The SPAN model is a great improvement over RAT-SQL (69.7%), while the improvement over the LGESQL model (75.1%) is smaller. With the employment of semantic linking information, SPAN further captures distant path representations and distinguishes different types of paths.

    3.5 Ablation studies

Ablation experiments were conducted with several model variants to investigate the contribution of various encoding and decoding combinations. The results are shown in Table 4.

    Table 4 Ablation experiment result

1) Comparing SPAN encoding with other encoding approaches, SPAN (76.1%) improves greatly in EX and in EM over the model without graph encoding (GRAPPA's initial word vectors, 68.6%). This indicates that the two-layer attention network with semantic-path based encoding is more effective than the simple encoding of RAT-SQL, and that the two-layer attention network can obtain more information.

2) Adding semantic linking on top of RAT encoding compensates for the unlinkable relations in the original method and enriches the heterogeneous graph.

3) The addition of the pruning method yields a small improvement in EX and EM for RAT-SQL, and the improvement is also small for SPAN encoding. The pruning method is not significant for performance enhancement, but it is valuable in practical applications to improve the execution effect and avoid some obvious mispredictions.

    3.6 Case studies

Four examples generated with SPAN and SMBOP are given to illustrate the model effect, as shown in Figs. 2 and 3.

As shown in Fig. 2, different models have different perceptions of the same column. In example 1, the original model could not predict "country" and instead selected "ID" based on other information. SPAN can effectively learn that the "country" column is also important, so it selects the correct column in the "where" condition: the word in the question is effectively associated with the ground truth.

    Fig.2 Effect of encoding

In example 2, the question words "mobile phone number" strongly correlate with the column "cell_mobile_number". SMBOP cannot recognize their connection, but the improved linking rules proposed in this paper can link the two. Even though the gold value and the question words are somewhat different, the SPAN model can still find them.

In example 3, redundancies occur because too many irrelevant tables are predicted; applying the pruning rules reduces such generation errors. In example 4, there are obvious key word errors: "reversed lexicographical" means that a descending order is needed. The previous model may have been overfitted, resulting in redundancy, and it is also overly rigid with such reversed semantics.

    Fig.3 Effect of pruning strategy

    4 Conclusions

In this work, a multi-table semantic parsing model SPAN for the NL2SQL task is proposed, which explores semantic paths with a two-layer attention network in the encoding phase. It enriches the heterogeneous graph structure by establishing weighted edges based on semantic similarity. The rule-based pruning of the decoding process corrects the predicted SQL. Experimental results illustrate that mining semantic path information in complex NL2SQL tasks is beneficial to model performance. However, the SPAN model is unable to generate words outside the question and the database. In future work, the predictive generation of such external words will be studied to refine the representation of the model.
