
    Using Vector Representation of Propositions and Actions for STRIPS Action Model Learning

2019-01-17 01:24:06

    Wei Gao and Dunbo Cai

    (1.Beijing Aerospace Control Center, Beijing 100094, China; 2.School of Computer Science and Engineering, Wuhan Institute of Technology, Wuhan 430205, China)

Abstract: Action model learning has become a hot topic in knowledge engineering for automated planning. A key problem in learning action models is to analyze state changes before and after action executions in observed “plan traces”. To support such an analysis, a new approach is proposed to partition the propositions of plan traces into states. First, vector representations of propositions and actions are obtained by training a neural network called Skip-Gram, borrowed from the area of natural language processing (NLP). Then, a type of semantic distance among propositions and actions is defined based on their similarity measures in the vector space. Finally, k-means and k-nearest neighbor (kNN) algorithms are exploited to map propositions to states. This approach, called state partition by word vector (SPWV), is implemented on top of a recent action model learning framework by Rao et al. Experimental results on the benchmark domains show that SPWV leads to a lower error rate of the learned action model than the probability-based approach to state partition developed by Rao et al.

    Key words: automated planning; action model learning; vector representation of propositions

Automated planning for intelligent agents is the problem of arranging a sequence of actions to achieve goals, given the current situation and the set of available actions. Actions are models of how agents can change the states of the world. In action representation languages such as STRIPS[1] and PDDL (planning domain definition language)[2], an action consists of a set of preconditions and a set of effects, which are both logical propositions. Preconditions specify under what conditions the action can execute, while effects specify how the values of propositions change after the execution. Usually, action models are designed by domain experts from scratch. However, for many application domains, preconditions and effects are complex and difficult for humans to figure out. Therefore, building action models is a challenging knowledge engineering task.

Wang[3] was the first to propose automatically building action models, with the aim of helping human experts. Along this direction, advanced techniques have been developed[4-11]. Yang et al.[6] proposed an algorithm for STRIPS-style action model learning, which utilized MAX-SAT techniques to reason about constraints on mapping propositions to action preconditions or effects. Zhuo et al.[7] developed a method using Markov Logic Networks to learn the logical quantifiers ∀ and ∃, and logical implications for actions. Lanchas et al.[11] used regression trees to predict the durations of actions. Rao et al.[9] extended the learning problem from deterministic action models to non-deterministic models in a partially observable environment.

In a partially observable environment, one important problem is the analysis of state changes before and after the execution of an action in a plan trace. Specifically, two neighboring propositions p and p′ observed from a plan trace cannot be assumed to be from the same state, because some action a may have executed between the times at which the two hold, without the execution being observed. In other words, we cannot determine for which action p is a precondition and for which action p′ is an effect. For the same reason, if an action is observed next to a proposition, we cannot assume that the proposition is one of the preconditions of that action. To handle the problem of partial observability, Amir and Chang[8] used a kind of logical formula, called a Transition Belief Formula, to track the possible transitions among world states. Rao et al.[9] extracted state change knowledge under two assumptions: actions make the least modification on states, and each action has an equal opportunity of affecting a changed literal. These assumptions are made purely from the perspective of probability change, where the relations among propositions are not considered.

This paper exploits a recently proposed distributed representation of words from the work on Word2vec[12]. Specifically, words are represented by vectors, and this representation is obtained by training a kind of neural network called Skip-Gram. The representation is useful in many natural language processing (NLP) tasks, e.g., named entity recognition[13], latent semantic analysis[14], and automated summarization of texts[15-16]. Notably, the vector of a word carries some of the word's meaning. This property is utilized here to partition propositions into states. The vectors also provide a way to define semantic distances among the propositions and actions of our problem. Further, the semantic distances are used to develop a novel state partition method that exploits the k-means and kNN[17] algorithm frameworks. Extensive experiments were conducted to evaluate our method on the International Planning Competition (IPC) domains. The results show that our method is superior to the state partition method designed by Rao et al.[9], both in terms of the error rate of state partition and the error rate in STRIPS action model learning.

1 Background

This section introduces important concepts and notation from predicate logic, STRIPS planning, action model learning, and the vector representation of words (a technique from NLP).

    1.1 Predicates, propositions and literals

In mathematical logic, each predicate has a name and a set of parameters. For example, the predicate At(x, y) has the name At and two parameters x, y, and can be used to represent that an agent x is at the location y. Propositions are instantiated predicates. For example, if a1 is an agent and Beijing a location, then At(a1, Beijing) is a proposition. Usually, for a proposition p, we use p to denote that p is true, and its negation ¬p to denote that p is false. Propositions and negations of propositions are called literals. In this paper, we use a set of literals {l1, …, lk} to denote their conjunction l1∧…∧lk.

    1.2 STRIPS action model

A STRIPS action model (aka action schema, or operator) o is of the form

    〈name(o),para(o),pre(o),add(o),del(o)〉

where name(o) is the name of the action model, para(o) is its parameters, and pre(o), add(o), and del(o) are the preconditions, add effects, and delete effects of o, respectively. name(o) is a string of characters. para(o) is a set of typed objects. A typed object consists of its name and its type (both strings). We refer to name(o) and para(o) as the head of o. pre(o), add(o), and del(o) are sets of predicates. Please refer to Fig. 1 for an example of an action represented in STRIPS from the Rover domain of IPC. For a detailed description of the syntax and semantics of STRIPS, please refer to Ref.[18].

(:action sample_soil

    :parameters (?x - rover ?s - store ?p - waypoint)

    :precondition (and

    (at ?x ?p)

    (at_soil_sample ?p)

    (equipped_for_soil_analysis ?x)

    (store_of ?s ?x)

    (empty ?s) )

    :effect (and (not (empty ?s))

    (full ?s)

    (have_soil_analysis ?x ?p)

    (not (at_soil_sample ?p))))

    Fig.1 Action model expressed in STRIPS from the Rover domain

An action is an instantiation of an action model. For example, if r is a rover, s a store, and w a waypoint, then sample_soil(r, s, w) is an action.
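As an illustration of the five-tuple structure above and of instantiation, the following Python sketch (the paper's own implementation is in Java and C++) models an action schema using the sample_soil operator of Fig.1, with the precondition and effect sets abbreviated:

```python
from dataclasses import dataclass, field

@dataclass
class ActionModel:
    """A STRIPS action model <name(o), para(o), pre(o), add(o), del(o)>."""
    name: str
    para: list                              # typed parameters, e.g. ("?x", "rover")
    pre: set = field(default_factory=set)   # preconditions
    add: set = field(default_factory=set)   # add effects
    dele: set = field(default_factory=set)  # delete effects ('del' is a Python keyword)

    def instantiate(self, binding):
        """Ground the head with a parameter binding to obtain an action."""
        args = [binding[p] for p, _ in self.para]
        return (self.name, *args)

# The sample_soil operator from Fig.1 (precondition/effect sets abbreviated)
sample_soil = ActionModel(
    name="sample_soil",
    para=[("?x", "rover"), ("?s", "store"), ("?p", "waypoint")],
    pre={("at", "?x", "?p"), ("empty", "?s")},
    add={("full", "?s"), ("have_soil_analysis", "?x", "?p")},
    dele={("empty", "?s"), ("at_soil_sample", "?p")},
)

action = sample_soil.instantiate({"?x": "r", "?s": "s", "?p": "w"})
print(action)  # ('sample_soil', 'r', 's', 'w')
```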

    1.3 Action model learning

    Action model learning is the problem of building the structures (preconditions and effects) of each action model in a domain, given a set of plan traces from the domain. A plan trace is composed of observations of facts and actions.

Definition 1 An observation of a fact is of the form (p, x1, …, xk), where p is the predicate name, the xi are objects, and k∈N is a constant.

Definition 2 An observation of an action is of the form (a, x1, …, xm), where a is the action name, the xi are objects, and m∈N is a constant.

Definition 3 A plan trace is a sequence of observations T = (v1, v2, …, vt), where vi is either an observation of a fact or of an action, and t∈N is the number of steps.

Definition 4 An action model learning (AML) problem is to construct the preconditions and effects of the action models in a planning domain, given:

A finite set of predicates P,

    A finite set of action model heads: {(name(o),para(o))},

    A finite set of plan traces.

In other words, for each action model, its preconditions and effects are unknown, and we develop a method to fill in these missing parts. This problem is called the learning of action models[6].

We adopt the learning process of Rao et al.[9], which consists of the following steps:

    ① State partition: partition propositions in the plan traces into a set of states.

② Precondition extraction: associate propositions as preconditions of an action based on changes among states.

    ③ Effects extraction: associate propositions as effects of an action based on changes among states.

    1.4 Vector representation of words

Words are symbols with meanings. However, these symbols are difficult to handle with mathematical approaches, so researchers in NLP design vector representations of words to support such approaches. A classical method is to represent words as vectors of 0s and 1s. The first such idea is the one-hot representation[12]. It assigns a total order to the words in a corpus. Every word's vector has the same length: the number of words in the corpus. The vector of a word has a 1 at the word's position and 0s elsewhere. For example, for the corpus {He, She, They}, the one-hot representations of the words “He”, “She”, and “They” are the vectors [1, 0, 0], [0, 1, 0], and [0, 0, 1], respectively. However, this representation loses the meanings of words[12].

A recent well-known vector representation is Word2vec, proposed by Mikolov et al.[12, 19]. They designed two kinds of neural networks, CBOW (continuous bag-of-words) and Skip-Gram, to learn word vectors that can carry the meanings of words. The motivation is that the meaning of a word depends on its context (i.e., the words around it), so the networks are trained on pairs of words that co-occur within a window of a certain length. It has been shown that the vectors obtained by CBOW support predicting a word from the words surrounding it, while those obtained by Skip-Gram support predicting the surrounding words given a word[19]. This paper uses Skip-Gram to analyze the meanings of propositions and actions.
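To make the windowed training setup concrete, the sketch below generates (center, context) word pairs within a fixed window, the form of sample Skip-Gram is trained on; the window size here is an arbitrary choice for the example, not a parameter taken from the paper:

```python
def skipgram_pairs(sentence, window=2):
    """Generate (center, context) training pairs within a fixed window."""
    pairs = []
    for i, center in enumerate(sentence):
        lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
        for j in range(lo, hi):
            if j != i:                      # a word is not its own context
                pairs.append((center, sentence[j]))
    return pairs

pairs = skipgram_pairs(["He", "She", "They"], window=1)
print(pairs)  # [('He', 'She'), ('She', 'He'), ('She', 'They'), ('They', 'She')]
```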

    2 Building the Embedding of Propositions and Actions

First, propositions and actions in the plan traces are converted into words. Second, the plan traces are viewed as sentences, from which the training samples (pairs of co-occurring words) are generated; a Skip-Gram network is then trained on those samples. Third, a semantic distance is defined for each pair of words via a similarity measure on their vectors.

    2.1 Mapping plan traces to sentences

As discussed in section 1.3, propositions and actions in a plan trace have a certain form, but do not correspond to words as we require. Specifically, the name and arguments of a proposition are separated, which makes them appear as separate words. For example, the proposition At(a1, Beijing) corresponds to three words: the first word is “At”, the second is “a1”, and the third is “Beijing”. To make all the parts of a proposition form a single word, we concatenate them with the hyphen (-) operator. For example, the proposition At(a1, Beijing) is transformed to At-a1-Beijing, which is a single word.

Definition 5 A sentence mapping of a plan trace T = (v1, v2, …, vt) is a sequence of words (an artificial sentence) φ = (w1, w2, …, wt), where each word wi (i = 1, …, t) is formed by concatenating the parts of vi with the hyphen (-) operator.
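A minimal Python sketch of this sentence mapping, assuming observations are given as tuples of strings:

```python
def to_word(observation):
    """Join all parts of a proposition or action observation with hyphens."""
    return "-".join(observation)

def sentence_of(trace):
    """Map a plan trace to an artificial sentence (Definition 5)."""
    return [to_word(v) for v in trace]

trace = [("At", "a1", "Beijing"), ("move", "a1", "Beijing", "Wuhan")]
print(sentence_of(trace))  # ['At-a1-Beijing', 'move-a1-Beijing-Wuhan']
```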

    2.2 Building vector representation of words

When a set of plan traces is mapped into a set of sentences, these sentences form a corpus (note that corpus is a term borrowed from NLP). Given a corpus C, we use C to train a Skip-Gram neural network to build the vector representation of the words in C.

We use the Skip-Gram neural network implemented in the open-source library Deeplearning4j (https://deeplearning4j.org/), configured with the parameters shown in Tab.1.

    Tab.1 Configuration for the parameters of the Skip-Gram neural network

    2.3 Semantic distances among propositions and actions

In this paper, the semantic distance between a pair of propositions is considered in terms of the frequency with which the two propositions occur in the same state. We then use the semantic distance to identify propositions that belong to the same state; in this way, a sequence of propositions can be partitioned into different states.

Given two words w and w′, the cosine distance between their vectors w and w′ is used to measure their similarity[12]. Specifically,

Similarity(w, w′) = (w·w′)/(‖w‖‖w′‖)   (1)

It can be seen that the value of Similarity(w, w′) lies in [-1, 1]. To rescale the values for the semantic analysis of words, we define

SemDistance(w, w′) = (1 − Similarity(w, w′))/2   (2)

In Eq.(2), the relationships between SemDistance(w, w′) and Similarity(w, w′) are:

① if Similarity(w, w′) = 1, then SemDistance(w, w′) = 0, which predicts that the two words have similar meanings.

② if Similarity(w, w′) = 0, then SemDistance(w, w′) = 0.5, which is not very useful for predicting the semantic relation of the two words.

③ if Similarity(w, w′) = -1, then SemDistance(w, w′) = 1, which predicts that the two words have different meanings.
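The three cases above are consistent with the linear rescaling SemDistance(w, w′) = (1 − Similarity(w, w′))/2; a small Python sketch under that assumption, using plain lists as vectors:

```python
import math

def similarity(w, wp):
    """Cosine similarity of two word vectors, as in Eq.(1)."""
    dot = sum(a * b for a, b in zip(w, wp))
    norm = math.sqrt(sum(a * a for a in w)) * math.sqrt(sum(b * b for b in wp))
    return dot / norm

def sem_distance(w, wp):
    """Rescale similarity in [-1, 1] to a distance in [0, 1], as in Eq.(2)."""
    return (1 - similarity(w, wp)) / 2

print(sem_distance([1, 0], [1, 0]))   # 0.0  (similarity  1)
print(sem_distance([1, 0], [0, 1]))   # 0.5  (similarity  0)
print(sem_distance([1, 0], [-1, 0]))  # 1.0  (similarity -1)
```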

To show the usefulness of SemDistance in characterizing the semantic distance between propositions, we present an example in Tab.2. The example is taken from the IPC Rovers domain, trained on the plan traces obtained from the first problem (named pfile01) of the domain. A representative case is the pair of propositions “at_soil_sample-waypoint” and “available-rover”. Their SemDistance is 0.045 (close to 0), based on which we predict that the two propositions may occur in the same state. In fact, “at_soil_sample-waypoint” and “available-rover” can be in the same state.

    Tab.2 Example distances among propositions on the first task in the rovers domain

    3 State Partition Using Word Vectors

    Using SemDistance as the semantic measure of propositions and actions, in this section we develop an algorithm for partitioning propositions on plan traces into states.

    3.1 Identifying co-occurrence of propositions

For a set of plan traces, we first obtain the semantic distances of the pairs of propositions in every plan trace. We then sort the distances in ascending order. For example, the set of distances {0.3, 0.1, 0.2} is ordered into the sequence 〈0.1, 0.2, 0.3〉. Given such a sequence, we want to predict how many hidden states there are. We develop a clustering algorithm based on k-means for this purpose (Fig.2).

The algorithm is shown in Fig.2. Our first note is that it uses a fixed number K of clusters. Here we set K = 5, for the following reason.

Observation 1 For two neighboring actions in a plan trace, the number of actions that occurred between them but were not observed should be less than 5. Suppose the event e, “an action occurred but was not observed”, has probability 0.5. Then the event e′, “e happens 5 times in a row”, has probability 0.5^5 = 0.031 25 < 5%, so e′ is a small-probability event. By this observation, there are at most 5 hidden states between two observed actions, and we therefore set K = 5.

Input: A sequence of semantic distances D = {x1, x2, …, xm};

The expected number of clusters K

Output: Clusters of D, of the form C = {C1, C2, …, CK}

Randomly select K values from D as initial means {μ1, μ2, …, μK}

Repeat

Ci ← ∅ (1 ≤ i ≤ K)

For 1 ≤ j ≤ m Do

dji ← ‖xj − μi‖2 for 1 ≤ i ≤ K

λj ← argmin1≤i≤K dji

Cλj ← Cλj ∪ {xj}

EndFor

For 1 ≤ i ≤ K Do

μ′i ← (1/|Ci|)Σx∈Ci x

If μ′i ≠ μi Then

μi ← μ′i

EndIf

EndFor

Until {μ1, μ2, …, μK} is unchanged

    Fig.2 Hidden states cluster algorithm

Our second note concerns the idea of the algorithm in Fig.2. The algorithm does not work directly on the propositions, but on the SemDistance values between them. It is used to predict what SemDistance values pairs of propositions in the same state (i.e., co-occurring propositions) may have. In the following subsection, we propose an algorithm that uses this information to map each proposition to a particular state.
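The clustering of Fig.2 is ordinary k-means applied to scalar SemDistance values. A self-contained 1-D sketch (with deterministic initialization, and K = 2 just to keep the example small; the paper uses K = 5):

```python
def kmeans_1d(values, k, iters=100):
    """Cluster scalar distance values with k-means; returns k clusters."""
    means = sorted(values)[:k]  # deterministic initialization for the sketch
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each value joins the cluster of its nearest mean
        clusters = [[] for _ in range(k)]
        for x in values:
            nearest = min(range(k), key=lambda i: (x - means[i]) ** 2)
            clusters[nearest].append(x)
        # Update step: recompute each mean; stop when means no longer change
        new = [sum(c) / len(c) if c else means[i] for i, c in enumerate(clusters)]
        if new == means:
            break
        means = new
    return clusters

print(kmeans_1d([0.05, 0.1, 0.08, 0.9, 0.95], k=2))
```

The small distances cluster together (propositions predicted to co-occur), separated from the large ones.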

    3.2 Mapping propositions into states based on their co-occurrence

Our mapping method is based on the co-occurrence of propositions. Specifically, for a proposition p, suppose the pairs of propositions 〈p, q〉, 〈p, l〉, and 〈p, s〉 are predicted to co-occur in states s1, s2, and s2, respectively. Since p co-occurs more often in s2 than in s1, we predict that p belongs to s2. Using this method together with the idea of kNN, our algorithm (Fig.3) maps propositions to states. In other words, the algorithm partitions a sequence of propositions into a set of states, each at a different time step. The difference between two states can be seen as the changes made by an action, and can therefore be used to model the preconditions and effects of actions. We call our method of partitioning propositions State Partition by Word Vector (SPWV).

Input: A set of propositions F, and their vectors;

The function SemDist( );

A set of clusters of distances C = {C1, C2, …, Ck}

Output: A set of mappings from propositions to states Par = {(p, s) | p is a proposition, s is a state}

Par ← {}

For p ∈ F Do

temp ← the k nearest values SemDist(p, q), for q ∈ F and q ≠ p

For c ∈ C Do

count(c) ← the number of elements in temp that belong to c

EndFor

s ← the state of the cluster c with the maximal count(c)

Par ← Par ∪ {(p, s)}

EndFor

    Fig.3 State partition algorithm
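The vote over clusters in Fig.3 is a kNN-style majority vote; an illustrative sketch, assuming each of a proposition's k nearest distances has already been labeled with its cluster's state:

```python
from collections import Counter

def assign_state(neighbor_labels):
    """Map a proposition to the state that appears most often among the
    cluster labels of its k nearest semantic distances (majority vote)."""
    return Counter(neighbor_labels).most_common(1)[0][0]

# p's 5 nearest distances fall into states s1, s2, s2, s2, s1 -> choose s2
print(assign_state(["s1", "s2", "s2", "s2", "s1"]))  # s2
```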

    4 Experiments

In this section, we evaluate SPWV on International Planning Competition (IPC) benchmarks. The benchmark set includes the following domains: Depots, DriverLog, Rovers, Zenotravel, Airport, Pipesworld, and Satellite, all of which are STRIPS domains. For comparison, we take the action model learning algorithm developed in Ref.[9], which is based on two assumptions about the probability distribution of propositions. We call its state partition method state partition via probability (SPP), and refer to the full action model learning method as SPP-AML. Replacing SPP in SPP-AML with SPWV yields a new action model learner, which we call SPWV-AML.

The aim of our experiments is to evaluate SPWV with respect to the error rate in state partition and the error rate in action model learning. For the first purpose we compare SPWV to SPP, and for the second we compare SPWV-AML to SPP-AML. For the rules that determine the two kinds of error rates, please refer to Ref.[9].

    4.1 Experimental setup

We use 5-fold cross-validation. The plan traces for sampling and training are generated as follows: the first 10 tasks in a domain are used for plan trace generation, 10 random plan traces are generated for each task, and the first 100 steps of each plan trace are kept. In this way, we obtain 100 plan traces. Further, we take prefixes of increasing length of every plan trace as the test plan traces. Therefore, we have 100×100 = 10 000 samples, of which 8 000 are used for training and the rest for validation. The randomness in a plan trace is controlled by the observation probability γ.

The test machine has a 4-core CPU running at 1.63 GHz and 8 GB of memory. Our SPWV method was implemented in Java using Deeplearning4j; the other modules of SPWV-AML were implemented in C++.

    4.2 Results and analysis

The first result concerns the error rate in state partition. We observed that the error rates of both SPP and SPWV decrease as γ (the observation probability) increases. This is reasonable, as a method gets more information to reason about. Moreover, the advantage of our SPWV method over SPP grows as γ increases; in particular, SPWV is superior to SPP when γ = 0.9 (shown in Tab.3).

The second result concerns the error rate in action model learning (Fig.3). The results show that both SPP-AML and our SPWV-AML become better as γ increases. Notably, SPWV-AML is consistently better than SPP-AML on all the benchmark domains. It is reasonable to conclude that the good performance of SPWV-AML is due to the low error rate achieved by SPWV in the state partition process.

    5 Conclusions

In this paper, a recent word embedding technique from the NLP community is utilized to predict the semantic distance between propositions. To our knowledge, this is the first work in automated planning to exploit this advancement from NLP. The properties of the word embedding we rely on are that it provides a convenient way to manipulate words formally, and that it carries the meanings of words. We show how these two properties can be utilized to define a reasonable distance measure between propositions. The resulting state partition method is shown to be superior to a representative method, and the resulting action model learner shows better performance on the IPC benchmarks.

Further improvements to the proposed state partition method are possible, such as using a more recent word embedding model, or partitioning propositions using both the semantic distances among them and the changes in their parameter structures[20].
