
    A physics-constrained deep residual network for solving the sine-Gordon equation

    2021-05-19
    Communications in Theoretical Physics, 2021, Issue 1

    Jun Li (李軍) and Yong Chen (陳勇)

    1 Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai 200062, China

    2 School of Mathematical Sciences, Shanghai Key Laboratory of PMMP, Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai 200062, China

    3 College of Mathematics and Systems Science, Shandong University of Science and Technology, Qingdao 266590, China

    4 Department of Physics, Zhejiang Normal University, Jinhua 321004, China

    Abstract Despite some empirical successes in solving nonlinear evolution equations using deep learning, several issues remain unresolved. First, existing models cannot uncover very well the dynamical behaviors of equations that include highly nonlinear source terms. Second, the gradient exploding and vanishing problems often occur in traditional feedforward neural networks. In this paper, we propose a new architecture that combines the deep residual neural network with some underlying physical laws. Using the sine-Gordon equation as an example, we show that the numerical result is in good agreement with the exact soliton solution. In addition, extensive numerical experiments show that the model is, to a certain extent, robust under small perturbations.

    Keywords: sine-Gordon equation, deep residual network, soliton, integrable system

    1.Introduction

    Solving nonlinear evolution equations computationally plays a very important role in physics and engineering. Approximating these equations using deep learning rather than traditional numerical methods has been studied [1–3]. Han et al [1] introduce a deep learning approach that approximates the gradient of the unknown solution, whereas Raissi et al [2, 3] approximate the latent solution directly using deep neural networks. Recently, we have also explored applications to other evolution equations, such as the Burgers equation (strictly speaking, we study only kink-type solitary wave solutions to this equation), the Korteweg–de Vries (KdV) equation, the modified KdV equation, and the Sharma–Tasso–Olver equation [4, 5], where these integrable equations have exact and explicit solutions against which the accuracy of the neural network solutions can be tested. These data-driven solutions are closed-form, differentiable and easy to use in subsequent calculations compared with traditional numerical approaches. Moreover, unlike traditional numerical approaches, the deep learning method does not require the discretization of the spatial and temporal domains.

    Some experiments show that the model in [2] cannot approximate very well the solutions to equations that include highly nonlinear source terms such as sin(u), and that the model in [3] usually cannot represent the solution dynamics when certain activation functions are used. It is therefore of great interest to improve the network framework to solve or alleviate these problems. In this paper, we propose a physics-constrained deep residual network that combines the deep residual neural network [6, 7] with some underlying laws of physics. To our knowledge, this is the first framework combining the residual network with underlying physical laws. This framework is easier to optimize than classical feedforward networks. Moreover, the skip connection increases gradient flow and thus alleviates the gradient vanishing/exploding problems. It can also be used to train very deep networks to improve performance. Note that here we use only a simple identity connection; more complex connections between the network layers are possible [8].

    Figure 1.This figure sketches how the residual block works.

    Specifically, in this paper, we consider the sine-Gordon equation [9–11], which includes a highly nonlinear source term:

    u_tt − u_xx + sin(u) = 0, (1)

    where the solution u is a function of the space variable x and the time variable t. Note that, through the coordinate transformation ξ = (x + t)/2 and η = (x − t)/2, we can obtain another expression of this equation, namely,

    u_ξη = sin(u). (2)

    The paper is organized as follows. In section 2, we propose the physics-constrained deep residual network and present a new objective function. In section 3, we focus on the antikink solution to the sine-Gordon equation and on the influence on the model of different settings, such as noise, the number of sampled data points, and the numbers of network layers and neurons. Finally, we conclude the paper in section 4.

    2.Method

    The residual block usually adds the output of a series of layers directly to the input of the block. Formally, in this work, we consider a residual block defined as

    x_{n+1} = x_n + W^{(n,4)}σ(W^{(n,3)}σ(W^{(n,2)}σ(W^{(n,1)}x_n + b^{(n,1)}) + b^{(n,2)}) + b^{(n,3)}) + b^{(n,4)}, (3)

    where W^{(n,i)}, b^{(n,i)}, i = 1, …, 4, are the weights and biases of the corresponding layers and σ is the activation function. Note that we use the activation function only in the hidden layers rather than in the output layer. The identity map skips a few layers and connects the input to the output directly; if any layer hurts the performance of the network, it can effectively be skipped, so this mechanism can be regarded as a kind of regularization. As a result, very deep neural networks can be trained without the problems caused by vanishing or exploding gradients.
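A minimal numpy sketch of such a residual update follows; the block width, the use of two dense layers instead of four, and the sin activation are illustrative assumptions rather than the paper's exact architecture.

```python
import numpy as np

def residual_block(x, layers, act=np.sin):
    """One residual block: out = x + F(x), where F is a small stack of
    dense layers with the activation applied only in the hidden layers."""
    h = x
    for i, (W, b) in enumerate(layers):
        h = h @ W + b
        if i < len(layers) - 1:   # no activation on the block's output layer
            h = act(h)
    return x + h                   # identity skip connection

# illustrative sizes: a width-20 block with two dense layers
rng = np.random.default_rng(0)
d = 20
layers = [(rng.normal(size=(d, d)) / np.sqrt(d), np.zeros(d)) for _ in range(2)]
x = rng.normal(size=(5, d))
y = residual_block(x, layers)
print(y.shape)  # (5, 20)
```

Note that when all weights and biases are zero, F(x) vanishes and the block reduces to the identity map, which is exactly the skipping behavior described above.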

    In addition, we know that this residual structure is just a special case of the forward Euler scheme [12]

    x_{k+1} = x_k + hF(x_k), (4)

    when h = 1. See [13, 14] for more detailed analyses from the viewpoints of dynamical systems and differential equations.
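The correspondence can be checked directly: a forward Euler step with step size h = 1 coincides with the residual update x + F(x). A toy check, with an arbitrary illustrative vector field F:

```python
import numpy as np

def euler_step(x, F, h):
    # one explicit (forward) Euler step: x_{k+1} = x_k + h * F(x_k)
    return x + h * F(x)

F = np.sin                              # arbitrary illustrative "layer" F
x = np.linspace(-1.0, 1.0, 5)
# with h = 1 the Euler step is exactly the residual update x + F(x)
assert np.allclose(euler_step(x, F, h=1.0), x + F(x))
```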

    In this work, we directly approximate the unknown solution u using a deep residual network composed of three residual blocks and accordingly define the residual network

    f := u_tt − u_xx + sin(u), (5)

    where these two networks share all parameters to be learned. The solution network is trained to satisfy the sine-Gordon equation (1) and the corresponding initial/boundary conditions. With the help of automatic differentiation techniques [15], equation (5) is embedded into a new loss function we propose here:

    L = (1/N_u) Σ_{i=1}^{N_u} logcosh(u(t_u^i, x_u^i) − u^i) + λ (1/N_f) Σ_{i=1}^{N_f} logcosh(f(t_f^i, x_f^i)), (6)

    in order to utilize the underlying laws of physics to discover patterns from experimental/simulated data, where {t_u^i, x_u^i, u^i} denote the initial/boundary training data for the network u and {t_f^i, x_f^i} specify the collocation points for the network f. The first term on the right-hand side of equation (6) measures the difference between the true underlying dynamics and the measured dynamics, which may differ due to noise from various sources. The second term learns to satisfy f, which prevents overfitting and denoises the data. In this paper, we simply set λ = 1 and choose the L-BFGS algorithm [16] to optimize the loss function (6). By the way, unlike numerical differentiation, automatic differentiation is only weakly affected by noise and has desirable stability, which will be observed in the next section. From equation (6), it can be seen that we use logcosh(x), which is also called the smooth L1 function in some contexts, as the objective function. Through simple calculus, we know that

    logcosh(x) ≈ x²/2 for |x| → 0 and logcosh(x) ≈ |x| − ln 2 for |x| → ∞. (7)

    Obviously, it is less affected by outliers and is twice differentiable everywhere. Moreover, a comparison of the absolute function, the square function and this function is given in figure 2. The logcosh function's relation to the absolute function is analogous to the swish function's relation to the rectified linear unit (ReLU). More discussion of the choice of the objective is given in the next section.
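The logcosh objective behaves like the square function near zero and like a shifted absolute function in the tails, which is what makes it outlier-resistant yet smooth. This can be checked numerically; the stable formula below is our own reimplementation, not the paper's code.

```python
import numpy as np

def logcosh(x):
    # numerically stable log(cosh(x)) = |x| + log(1 + exp(-2|x|)) - log(2)
    a = np.abs(x)
    return a + np.log1p(np.exp(-2.0 * a)) - np.log(2.0)

small = np.array([1e-3, 5e-3, 1e-2])
large = np.array([10.0, 15.0, 20.0])
# near zero it tracks the square function x**2 / 2 ...
print(np.max(np.abs(logcosh(small) - small**2 / 2)))
# ... and in the tails the shifted absolute function |x| - log(2)
print(np.max(np.abs(logcosh(large) - (large - np.log(2.0)))))
```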

    Figure 2.The comparison of some common loss functions.

    Some studies indicate that orthogonal initialization can maintain unit variance and thereby improve the training procedure and the performance of the architecture, but for most applications the Xavier initialization [17] used in this paper is enough. The activation function is what makes a neural network nonlinear, and this nonlinearity makes the network more expressive. Specifically, in this paper, the second-order derivatives of the solution with respect to the spatial variable x and the temporal variable t are needed, so ReLU (max{0, x}) evidently does not work. Moreover, the data are usually rescaled to [−1, 1], while the sigmoid function (σ(x) = 1/(1 + exp(−x))) restricts the output to [0, 1]; thus, this function also does not work well. In addition, most of the derivative values of S-shaped functions such as σ and tanh tend to 0, which loses too much information and leads to the vanishing gradient problem to some extent. Furthermore, ReLU cannot represent complicated and fine-grained details; some numerical experiments demonstrate that these functions indeed cannot recover the solution dynamics correctly. We think that other choices of weight initialization and data normalization may also affect the selection of the activation function. So, in this paper, we choose periodic functions such as the sinusoid as the activation function [18]. The function sin(x) is zero-centered, and its derivative cos(x) is just a shifted sine function.
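The second-derivative argument can be illustrated with a finite-difference check on the raw activation functions (this probes the functions themselves, not a trained network): away from its kink at 0, ReLU has zero curvature, so second-order PDE terms built on it vanish, while sin retains a nontrivial second derivative everywhere.

```python
import numpy as np

def second_derivative(f, x, h=1e-3):
    # central-difference estimate of f''(x)
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

relu = lambda x: np.maximum(0.0, x)
x = np.array([-2.0, -0.5, 0.5, 2.0])      # points away from the kink at 0
print(second_derivative(relu, x))          # zero: ReLU is piecewise linear
print(second_derivative(np.sin, x))        # close to -sin(x): curvature survives
```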

    All experiments in this work are conducted on a MacBook Pro with a 2.4 GHz dual-core Intel Core i5 processor.

    3.Numerical Results

    The soliton phenomena exist widely in physics, biology, communication and other scientific disciplines. The soliton solution (namely, a kink or antikink) of the sine-Gordon equation mentioned here, which is distinctly different from the bell-shaped KdV soliton, represents a topologically invariant quantity of the system. Specifically, the exact one-antikink solution given by the Bäcklund transformation is:

    u(t, x) = 4 arctan(exp(−(x − ct − x0)/√(1 − c²))), (8)

    Figure 3. A comparison of the loss curves (on a natural-logarithm scale) when the physics-constrained deep residual network is trained with different loss functions.

    where c is the translation speed and x0 is the initial position. For simplicity, we fix the speed c and set x0 = 0. Then the antikink solution reduces to

    Using the derivative formula (arctan(x))′ = 1/(1 + x²) and simplifying the result, we obtain the derivative of (9) with respect to t:

    For the antikink solution (9), the initial conditions are obtained by evaluating (9) and its time derivative (10) at t = 0, and the boundary conditions are u(t, x = −20) = 2π and u(t, x = 20) = 0 for t ∈ [0, 10]; that is, we consider the sine-Gordon equation with Dirichlet boundary conditions. To obtain the training and testing data, we sample the data on an evenly spaced grid every Δt = 0.02 from t = 0 to t = 10, obtaining 501 snapshots in total. From these data, we generate a smaller training subset by randomly sub-sampling Nu = 200 initial/boundary data points (this number is usually relatively small; more details are discussed below) and Nf = 20 000 collocation points generated using the Latin hypercube sampling method [19].
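As a sanity check, the antikink profile u(t, x) = 4 arctan(exp(−(x − ct)/√(1 − c²))) can be verified against the sine-Gordon equation u_tt − u_xx + sin(u) = 0 with central differences, and its far-field values match the boundary conditions above. The speed c = 0.5 below is illustrative, not necessarily the value used in the experiments.

```python
import numpy as np

c = 0.5                                  # illustrative translation speed
g = 1.0 / np.sqrt(1.0 - c**2)            # Lorentz factor

def antikink(t, x):
    # exact antikink of u_tt - u_xx + sin(u) = 0, with x0 = 0
    return 4.0 * np.arctan(np.exp(-g * (x - c * t)))

# central-difference residual of the PDE on an interior grid
h = 1e-3
t, x = np.meshgrid(np.linspace(1.0, 9.0, 9), np.linspace(-5.0, 5.0, 11))
u_tt = (antikink(t + h, x) - 2 * antikink(t, x) + antikink(t - h, x)) / h**2
u_xx = (antikink(t, x + h) - 2 * antikink(t, x) + antikink(t, x - h)) / h**2
residual = u_tt - u_xx + np.sin(antikink(t, x))
print(np.max(np.abs(residual)))                    # small: PDE satisfied
print(antikink(0.0, -20.0), antikink(0.0, 20.0))   # ~2*pi and ~0 at the ends
```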

    From figure 3, we can see that the logcosh function as the loss objective is significantly better than the square function. Moreover, the algorithm takes many fewer iterations with the former to achieve its optimal performance; that is, this choice accelerates the convergence of the optimization algorithm.

    Figure 4 graphically shows the antikink evolution of the sine-Gordon equation. The top panel of figure 4 compares the exact dynamics with the predicted spatiotemporal behavior. The model achieves a relative ℓ2 error of 6.09e-04 in a runtime of about 10 min, where the error is defined as ‖utrue − upred‖2/‖utrue‖2. More detailed assessments are presented in the bottom panel of figure 4. In particular, we present a comparison between the exact and predicted solutions at the three different instants t = 1.24, 3.74, 8.76. The result indicates that the model can accurately capture the antikink dynamics of the sine-Gordon equation. Moreover, we can observe the reconstructed single antikink motion better in figure 5.

    Figure 4. The antikink solution to the sine-Gordon equation. Top: an exact antikink is compared to the predicted solution of the learned model (right panel). The model correctly captures the dynamics and accurately reproduces the solution with a relative ℓ2 error of 6.09e-04. Bottom: a comparison of the predicted and exact solutions at the three temporal snapshots depicted by the white vertical lines in the top panel.

    Figure 5. The antikink solution to the sine-Gordon equation. (a) The spatiotemporal behavior of the reconstructed antikink; (b) the spatiotemporal dynamics of the corresponding potential, where the potential is given by v = −ux.

    Table 1. Relative ℓ2 errors under different random seeds.

    1.79e-03 2.02e-03 1.77e-03 7.87e-04 1.88e-03 3.20e-03 1.61e-03 3.45e-04

    Additionally, many numerical experiments show that our model is very robust to different random initializations (see table 1).
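The relative ℓ2 error reported in table 1 and throughout is ‖utrue − upred‖2/‖utrue‖2 over the flattened grid; a minimal sketch, with a constructed stand-in prediction:

```python
import numpy as np

def relative_l2_error(u_true, u_pred):
    # || u_true - u_pred ||_2 / || u_true ||_2 over the flattened grid
    return np.linalg.norm(u_true - u_pred) / np.linalg.norm(u_true)

u_true = np.full(501 * 256, 1.0)     # stand-in for an exact snapshot grid
u_pred = u_true + 1e-3               # a uniformly shifted "prediction"
print(relative_l2_error(u_true, u_pred))  # ~1e-3 by construction
```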

    Figure 6. A comparison of the relative ℓ2 errors of the prediction results under small perturbations with different noise levels.

    Next, we compare the performance of this framework in the presence of noise. We disturb the data by adding small amounts of noise,

    where δ denotes the amount of noise and s is the standard deviation of u(t, x). From figure 6, the numerical experiments reveal that the architecture is remarkably robust to small noise, and the model is able to perform long-term prediction even when trained with noisy data. That is to say, the model can reconstruct the solution behavior from noisy data. From these experiments, we believe that input noise, to a certain extent, can be regarded as a regularization mechanism that increases the robustness of the network; this is a kind of weak generalization. Additionally, this perturbation phenomenon can also be described, from the mathematical viewpoint, by results on the orbital stability and asymptotic stability of solitons [20, 21].
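The perturbation described above, with magnitude δ times the standard deviation s of the clean field, can be sketched as follows; the Gaussian draw is our assumption, since the perturbation equation itself is not reproduced here.

```python
import numpy as np

def perturb(u, delta, rng):
    # disturb the field: u + delta * std(u) * eps, with eps ~ N(0, 1) (assumed)
    return u + delta * np.std(u) * rng.standard_normal(u.shape)

rng = np.random.default_rng(0)
u = np.sin(np.linspace(0.0, 2.0 * np.pi, 10_000))   # stand-in clean field
u_noisy = perturb(u, 0.01, rng)                      # 1% noise level
print(np.std(u_noisy - u) / np.std(u))               # ~0.01
```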

    As the noise level increases, however, the accuracy of the predicted solution decreases and the training time increases remarkably. For noisy data, we can increase the number of training data points to decrease the relative error and thus improve the accuracy of the predicted solution. An experimental comparison of the influence of different numbers of sub-sampled data points on the prediction under different noise levels is given in table 2.

    Generally speaking, with more layers and more neurons, the model performs better [22]. We therefore carry out many numerical experiments to check this empirical result; see table 3 for a comparison with different numbers of hidden layers and neurons per hidden layer. More detailed theoretical analyses have also been conducted [23, 24].

    Last, we also train the model using the Adam optimizer [25] with default parameter settings to approximate the antikink solution to the sine-Gordon equation. The training procedure takes approximately 2.5 h with Nu = 100 and Nf = 10 000 over 30 000 epochs, and the relative ℓ2 error between the exact and predicted solutions is 1.47e-02. The experimental result shows that the L-BFGS algorithm is much faster than Adam in this case and obtains a more accurate solution. However, the former sometimes suffers from convergence issues.

    Table 2. Relative ℓ2 errors for different numbers of data points under the distortion of different noise levels.

    Noise 0%:
    Nu \ Nf    1000      5000      10 000    20 000
    10         3.67e-01  1.47e-01  1.02e-01  2.27e-01
    50         6.98e-03  2.12e-03  1.14e-03  1.68e-03
    100        2.91e-03  1.48e-03  1.37e-03  1.23e-03
    200        1.07e-03  1.21e-03  9.73e-04  6.09e-04

    Noise 1%:
    Nu \ Nf    1000      5000      10 000    20 000
    10         5.17e-01  1.41e-01  8.58e-02  5.34e-02
    50         1.35e-02  5.71e-03  5.81e-03  3.75e-03
    100        9.52e-03  5.95e-03  6.92e-03  3.55e-03
    200        3.60e-03  1.42e-03  2.59e-03  2.03e-03

    Table 3. Relative ℓ2 errors for different numbers of hidden layers and neurons per hidden layer under a fixed random seed.

    Layers \ Neurons    10        20        40        80
    4                   5.09e-01  7.01e-04  4.98e-04  1.21e-03
    8                   1.76e-03  1.47e-03  9.65e-04  8.43e-04
    12                  7.91e-04  7.87e-04  6.09e-04  7.05e-03
    16                  6.65e-04  3.38e-04  3.70e-04  2.16e-03

    4.Conclusions and discussion

    In this paper, we propose a new architecture that combines a deep residual network with underlying physical laws for extracting the soliton dynamics of the sine-Gordon equation from spatiotemporal data. This architecture can easily be used to train very deep networks and thus alleviates the gradient exploding and vanishing problems. Moreover, we use the logcosh function rather than the square function in the objective in order to accelerate training and improve the performance of the network. The numerical results show that the model reconstructs the solution behaviors of the equation very accurately. Moreover, the model is, to some extent, remarkably robust under small disturbances.

    Despite this progress, we are still at an early stage of understanding the capabilities and limitations of such deep learning models. In addition, other advanced frameworks, for example generative adversarial networks, recurrent neural networks and networks with numerical schemes embedded, will also be considered in future research.

    Acknowledgments

    The first author would like to express his sincere thanks to Dr Yuqi Li and Dr Xiaoen Zhang for their valuable comments and excellent suggestions on this work. The authors gratefully acknowledge the support of the National Natural Science Foundation of China (Grant No. 11675054), the Shanghai Collaborative Innovation Center of Trustworthy Software for Internet of Things (Grant No. ZF1213) and the Science and Technology Commission of Shanghai Municipality (Grant No. 18dz2271000).
