
    A physics-constrained deep residual network for solving the sine-Gordon equation

    Communications in Theoretical Physics, 2021, No. 1 (published online 2021-05-19)

    Jun Li (李軍) and Yong Chen (陳勇)

    1 Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai 200062, China

    2 School of Mathematical Sciences, Shanghai Key Laboratory of PMMP, Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai 200062, China

    3 College of Mathematics and Systems Science, Shandong University of Science and Technology, Qingdao 266590, China

    4 Department of Physics, Zhejiang Normal University, Jinhua 321004, China

    Abstract Despite some empirical successes in solving nonlinear evolution equations with deep learning, several issues remain unresolved. First, existing models cannot accurately uncover the dynamical behaviors of equations that contain highly nonlinear source terms. Second, the gradient exploding and vanishing problems often arise in traditional feedforward neural networks. In this paper, we propose a new architecture that combines a deep residual neural network with underlying physical laws. Using the sine-Gordon equation as an example, we show that the numerical result agrees well with the exact soliton solution. In addition, extensive numerical experiments show that the model is, to a certain extent, robust under small perturbations.

    Keywords: sine-Gordon equation, deep residual network, soliton, integrable system

    1.Introduction

    Solving nonlinear evolution equations computationally plays a very important role in physics and engineering. Approximating these equations using deep learning rather than traditional numerical methods has been studied [1–3]. Han et al [1] introduced a deep learning approach that approximates the gradient of the unknown solution, whereas Raissi et al [2, 3] approximate the latent solution directly with deep neural networks. Recently, we have also explored applications to other evolution equations such as the Burgers equation (strictly speaking, we only studied kink-type solitary wave solutions of this equation), the Korteweg–de Vries (KdV) equation, the modified KdV equation, and the Sharma–Tasso–Olver equation [4, 5], where these integrable equations possess exact and explicit solutions against which the accuracy of neural-network solutions can be tested. Compared with traditional numerical approaches, these data-driven solutions are closed-form, differentiable, and easy to use in subsequent calculations. Moreover, unlike traditional numerical methods, the deep learning approach does not require discretization of the spatial and temporal domains.

    Some experiments show that the model in [2] cannot approximate very well the solutions of equations containing highly nonlinear source terms such as sin(u), and the model in [3] often fails to represent the solution dynamics when certain activation functions are used. Thus, it is of great interest to improve the network architecture to solve or alleviate these problems. In this paper, we propose a physics-constrained deep residual network that combines the deep residual neural network [6, 7] with underlying laws of physics. To our knowledge, it is the first framework combining the residual network with underlying physical laws. This framework is easier to optimize than classical feedforward networks. Moreover, the skip connections increase gradient flow and thereby alleviate the vanishing/exploding gradient problems. They also make it possible to train very deep networks to improve performance. Here we use only a simple identity connection; more complex connections between network layers are also possible [8].

    Figure 1. A sketch of how the residual block works.

    Specifically, in this paper we consider the sine-Gordon equation [9–11], which includes a highly nonlinear source term:

    u_tt - u_xx + sin(u) = 0,  (1)

    where the solution u is a function of the space variable x and the time variable t. Note that, through the coordinate transformation X = (x + t)/2 and T = (x - t)/2, we can obtain another expression of this equation, namely

    u_XT = sin(u).  (2)

    The paper is organized as follows. In section 2, we propose the physics-constrained deep residual network and present a new objective function. In section 3, we focus on the antikink solution of the sine-Gordon equation and on the influence of different settings such as noise, the number of sampled data points, network layers and neurons. Finally, we conclude the paper in section 4.

    2.Method

    The residual block adds the output of a series of layers directly to the input of the block. Formally, in this work, we consider a residual block defined as

    x_{n+1} = x_n + W^{(n,4)} σ(W^{(n,3)} σ(W^{(n,2)} σ(W^{(n,1)} x_n + b^{(n,1)}) + b^{(n,2)}) + b^{(n,3)}) + b^{(n,4)},  (3)

    where W^{(n,i)}, b^{(n,i)}, i = 1, …, 4 are the weights and biases of the corresponding layers and σ is the activation function. Note that we use the activation function only in the hidden layers, not in the output layer. The identity map skips a few layers and connects the input directly to the output; if any layer hurts the performance of the network, it can effectively be skipped, so this mechanism can be regarded as a kind of regularization. As a result, very deep neural networks can be trained without the problems caused by vanishing or exploding gradients.
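As an illustrative sketch (not the authors' implementation), a four-layer residual block of this form can be written in NumPy with sin as the activation; the layer width and random initialization here are arbitrary choices for the example:

```python
import numpy as np

def residual_block(x, Ws, bs, act=np.sin):
    """One residual block: y = x + W4·act(W3·act(W2·act(W1·x + b1) + b2) + b3) + b4.
    The activation is applied only in the hidden layers, not at the block output."""
    h = x
    for W, b in zip(Ws[:-1], bs[:-1]):
        h = act(W @ h + b)          # hidden layers with sinusoidal activation
    return x + Ws[-1] @ h + bs[-1]  # identity skip connection, linear output layer

rng = np.random.default_rng(0)
width = 8  # illustrative layer width
Ws = [rng.standard_normal((width, width)) * 0.1 for _ in range(4)]
bs = [np.zeros(width) for _ in range(4)]
x = rng.standard_normal(width)
y = residual_block(x, Ws, bs)
assert y.shape == x.shape  # the identity skip requires matching dimensions
```

Note that with all weights set to zero the block reduces to the identity map, which is exactly why a harmful layer can be "skipped" during training.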

    In addition, we know that this residual structure is just a special case of the Euler forward scheme [12],

    x_{n+1} = x_n + h F(x_n),  (4)

    when h = 1. See [13, 14] for more detailed analyses from the viewpoints of dynamical systems and differential equations.
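A minimal numerical illustration of this correspondence: stacking residual updates x_{n+1} = x_n + F(x_n) is exactly forward-Euler integration of dx/ds = F(x) with step h = 1 (the map F below is a toy choice for the example, not from the paper):

```python
import numpy as np

def F(x):
    return -0.1 * x  # toy residual branch, i.e. the vector field of dx/ds = F(x)

def euler(x0, h, steps):
    """Forward-Euler integration; with h = 1 each step is one residual block."""
    x = x0
    for _ in range(steps):
        x = x + h * F(x)  # x_{n+1} = x_n + h F(x_n)
    return x

x0 = np.array([1.0, 2.0])
resnet_out = euler(x0, h=1.0, steps=5)  # five stacked residual blocks
```

Each iterate contracts by the factor 1 - 0.1 h, so five h = 1 steps scale the input by 0.9^5.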

    In this work, we approximate the unknown solution u directly using a deep residual network composed of three residual blocks, and accordingly define a residual network

    f := u_tt - u_xx + sin(u),  (5)

    where these two networks share all parameters to be learned. The solution network is trained to satisfy the sine-Gordon equation (1) and the corresponding initial/boundary conditions. With the help of automatic differentiation techniques [15], equation (5) is embedded into a new loss function we propose here:

    Loss = (1/N_u) Σ_{i=1}^{N_u} logcosh(u(t_u^i, x_u^i) - u^i) + λ (1/N_f) Σ_{i=1}^{N_f} logcosh(f(t_f^i, x_f^i)),  (6)

    in order to utilize the underlying laws of physics to discover patterns from experimental/simulated data, where {t_u^i, x_u^i, u^i}_{i=1}^{N_u} denote the initial/boundary training data for the network u and {t_f^i, x_f^i}_{i=1}^{N_f} specify the collocation points for the network f. The first term on the right-hand side of equation (6) measures the difference between the true underlying dynamics and the measured dynamics, which may be corrupted by noise from various sources. The second term learns to satisfy f, which prevents overfitting and denoises the data. In this paper, we simply set λ = 1 and choose the L-BFGS [16] algorithm to optimize the loss function (6). Note that, unlike numerical differentiation, automatic differentiation is only weakly affected by noise and has desirable stability, as will be observed in the next section. From equation (6), it can be seen that we use logcosh(x), which is also called the smooth L1 function in some contexts, as the objective function. Through a simple calculation, we find that

    logcosh(x) ≈ x²/2 for small |x|, and logcosh(x) ≈ |x| - log 2 for large |x|.  (7)

    Obviously, this function is less affected by outliers and is twice differentiable everywhere. Moreover, a comparison of the absolute-value function, the square function and this function is given in figure 2. The relation of the logcosh function to the absolute-value function is analogous to that of the swish function to the rectified linear unit (ReLU). More discussion of the choice of objective will be given in the next section.
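As a quick sketch (a hypothetical helper, not the authors' code), the logcosh objective and its two limiting regimes can be checked numerically:

```python
import numpy as np

def logcosh(x):
    # Numerically stable log(cosh(x)): cosh overflows for large |x|, so use
    # the identity log(cosh(x)) = |x| + log1p(exp(-2|x|)) - log(2).
    a = np.abs(x)
    return a + np.log1p(np.exp(-2.0 * a)) - np.log(2.0)

# Near zero it behaves like x^2/2 (smooth, quadratic),
# for large |x| like |x| - log 2 (linear growth, robust to outliers).
print(np.allclose(logcosh(0.01), 0.01**2 / 2, atol=1e-6))  # True
print(np.isclose(logcosh(20.0), 20.0 - np.log(2.0)))       # True
```

The stable form matters in practice: a naive `np.log(np.cosh(x))` overflows already around |x| ≈ 710 in double precision, while the version above stays finite.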

    Figure 2.The comparison of some common loss functions.

    Some studies indicate that orthogonal initializations can preserve unit variance and thereby improve training and network performance, but for most applications the Xavier initialization [17] used in this paper is sufficient. The activation function is what makes a neural network nonlinear, and this nonlinearity makes the network more expressive. Specifically, in this paper the second-order derivatives of the solution with respect to the spatial variable x and the temporal variable t are needed, so ReLU (max{0, x}) evidently does not work. Furthermore, the data are usually rescaled to [-1, 1], whereas the sigmoid function σ(x) = 1/(1 + exp(-x)) restricts the output to [0, 1], so this function also does not work well. Moreover, most of the derivative values of such S-shaped functions, including σ and tanh, tend to 0, which loses too much information and leads to the vanishing gradient problem to some extent. In addition, ReLU cannot represent complicated, fine-grained details. Numerical experiments demonstrate that these functions indeed cannot recover the solution dynamics correctly. We think that other choices of weight initialization and data normalization may also affect the selection of the activation function. Therefore, in this paper, we choose periodic functions such as the sinusoid as activation functions [18]. The function sin(x) is zero-centered, and its derivative cos(x) is just a shifted sine function.
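A small numerical check of why this matters (a toy single-neuron example, not from the paper): the second derivative of a ReLU unit vanishes almost everywhere, so a PDE residual built from u_tt and u_xx receives no signal from it, whereas a sinusoidal unit has a smooth, generically nonzero second derivative:

```python
import numpy as np

def second_derivative(f, x, h=1e-3):
    # Central finite-difference approximation of f''(x).
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

relu = lambda x: np.maximum(0.0, 1.3 * x + 0.2)  # one ReLU neuron (toy weights)
sine = lambda x: np.sin(1.3 * x + 0.2)           # one sinusoidal neuron

x = 0.7
print(second_derivative(relu, x))  # ~0: piecewise linear, no curvature
print(second_derivative(sine, x))  # nonzero: analytically -1.3^2 * sin(1.3*x + 0.2)
```

Away from its kink the ReLU neuron is exactly linear, so any second-order differential operator applied to it is identically zero.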

    All experiments in this work are conducted on a MacBook Pro computer with a 2.4 GHz dual-core Intel Core i5 processor.

    3.Numerical results

    Soliton phenomena exist widely in physics, biology, communication and other scientific disciplines. The soliton solution (namely, the kink or antikink) of the sine-Gordon equation considered here, which is distinctly different from the bell-shaped KdV soliton, represents a topologically invariant quantity of the system. Specifically, the exact one-antikink solution is given by the Bäcklund transformation:

    u(t, x) = 4 arctan(exp(-(x - ct - x0)/√(1 - c²))),  (8)

    Figure 3.The comparison of the loss(taking the natural logarithm)curves when the physics-constrained deep residual network is trained with different loss functions.

    where c is the translation speed and x0 is the initial position. For simplicity, we fix the translation speed c and set x0 = 0. Then the antikink solution reduces to

    Using the derivative formula d arctan(z)/dz = 1/(1 + z²) and simplifying the result, we obtain the derivative of (9) with respect to t:

    For the antikink solution (9), the initial conditions are obtained by evaluating (9) and its time derivative (10) at t = 0, and the boundary conditions are u(t, x = -20) = 2π, u(t, x = 20) = 0 for t ∈ [0, 10]. We consider the sine-Gordon equation together with these Dirichlet boundary conditions. To obtain the training and testing data, we sample the data on an evenly spaced grid with time step Δt = 0.02 from t = 0 to t = 10, giving 501 snapshots in total. From these data, we generate a smaller training subset by randomly sub-sampling N_u = 200 initial/boundary data points (this number is usually relatively small; more details are discussed below) and N_f = 20 000 collocation points generated using the Latin hypercube sampling method [19].
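As a sanity check (illustrative only; the speed c = 0.5 below is an arbitrary assumption for the example, not the value fixed in the paper), one can verify numerically that the antikink (8) with x0 = 0 satisfies u_tt - u_xx + sin(u) = 0 and the stated boundary values on x ∈ [-20, 20]:

```python
import numpy as np

c, x0 = 0.5, 0.0  # assumed values for illustration
g = 1.0 / np.sqrt(1.0 - c**2)

def u(t, x):
    # Antikink: 2*pi as x -> -inf, 0 as x -> +inf.
    return 4.0 * np.arctan(np.exp(-g * (x - c * t - x0)))

# PDE residual u_tt - u_xx + sin(u) via central finite differences
h = 1e-4
t, x = 3.0, 1.2
u_tt = (u(t + h, x) - 2 * u(t, x) + u(t - h, x)) / h**2
u_xx = (u(t, x + h) - 2 * u(t, x) + u(t, x - h)) / h**2
print(abs(u_tt - u_xx + np.sin(u(t, x))))  # ~0 up to finite-difference error

# Boundary behavior matching u(t, -20) = 2*pi and u(t, 20) = 0
print(round(u(0.0, -20.0) / np.pi, 4), round(u(0.0, 20.0), 4))  # 2.0 0.0
```

The same check at other (t, x) points confirms that the residual stays at the level of the finite-difference truncation error.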

    From figure 3, we clearly see that the logcosh function as the loss objective is significantly better than the square function. Moreover, the algorithm takes far fewer iterations with the former to achieve its optimal performance; that is to say, this function accelerates the convergence of the optimization algorithm.

    Figure 4 graphically shows the antikink evolution of the sine-Gordon equation. The top panel of figure 4 compares the exact dynamics with the predicted spatiotemporal behavior. The model achieves a relative L² error of 6.09e-04 in a runtime of about 10 min, where the error is defined as ‖u_true - u_pred‖₂/‖u_true‖₂. More detailed assessments are presented in the bottom panel of figure 4. In particular, we present a comparison between the exact solutions

    Figure 4. The antikink solution of the sine-Gordon equation. Top: an exact antikink is compared with the predicted solution of the learned model (right panel). The model correctly captures the dynamics and accurately reproduces the solution with a relative L² error of 6.09e-04. Bottom: a comparison of the predicted and exact solutions at the three temporal snapshots marked by the white vertical lines in the top panel.

    Figure 5. The antikink solution of the sine-Gordon equation. (a) The spatiotemporal behavior of the reconstructed antikink; (b) the spatiotemporal dynamics of the corresponding potential, where the potential is given by v = -u_x.

    Table 1. Relative L² errors under different random seeds.


    1.79e-03 2.02e-03 1.77e-03 7.87e-04 1.88e-03 3.20e-03 1.61e-03 3.45e-04

    and predicted solutions at the three instants t = 1.24, 3.74, 8.76. The results indicate that the model can accurately capture the antikink dynamics of the sine-Gordon equation. Moreover, the reconstructed single-antikink motion can be observed more clearly in figure 5.
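The relative L² error used throughout these comparisons is straightforward to compute; a minimal sketch with dummy arrays (the data here are illustrative only):

```python
import numpy as np

def relative_l2(u_true, u_pred):
    # ||u_true - u_pred||_2 / ||u_true||_2 over the flattened solution grid
    return np.linalg.norm(u_true - u_pred) / np.linalg.norm(u_true)

u_true = np.linspace(0.0, 2.0 * np.pi, 100)
u_pred = u_true + 1e-3 * np.sin(u_true)  # dummy prediction with a small error
print(relative_l2(u_true, u_pred))       # small, on the order of 1e-4
```

Because both norms are taken over the same grid, the metric is independent of the grid resolution up to quadrature effects.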

    Additionally, many numerical experiments show that our model is very robust to different random initializations (see table 1).

    Figure 6. Comparison of the relative L² errors of the prediction results under small perturbations with different noise levels.

    Next, we examine the performance of this framework in the presence of noise. We disturb the data by adding a small amount of noise,

    û(t, x) = u(t, x) + δ s ε,  (11)

    where δ denotes the noise level, s is the standard deviation of u(t, x), and ε is drawn from a standard normal distribution. From figure 6, the numerical experiments reveal that the architecture is remarkably robust to small noise, and the model is able to make long-term predictions even when trained with noisy data. That is to say, the model can reconstruct the solution behavior from noisy data. From these experiments, we believe that input noise, to a certain extent, can be regarded as a regularization mechanism that increases the robustness of the network; this is a kind of weak generalization. Additionally, this perturbation phenomenon can also be described mathematically by results on the orbital and asymptotic stability of solitons [20, 21].
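A minimal sketch of this corruption scheme (assuming, as is common in this setting, standard Gaussian ε):

```python
import numpy as np

def add_noise(u, delta, rng):
    """Perturb clean data u by delta * std(u) * eps, with eps ~ N(0, 1)."""
    s = np.std(u)
    return u + delta * s * rng.standard_normal(u.shape)

rng = np.random.default_rng(42)
u_clean = np.sin(np.linspace(0, 2 * np.pi, 1000))
u_noisy = add_noise(u_clean, delta=0.01, rng=rng)  # 1% noise level
# The perturbation magnitude scales with the noise level delta:
print(np.std(u_noisy - u_clean) / np.std(u_clean))  # close to 0.01
```

Scaling the noise by the standard deviation of the signal makes the level δ dimensionless, so "1% noise" means the same thing regardless of the amplitude of u.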

    As the noise level increases, however, the accuracy of the predicted solution decreases and the training time increases remarkably. For noisy data, we can increase the number of training data points to decrease the relative error and thereby improve the accuracy of the predicted solution. An experimental comparison of the influence of different numbers of sub-sampled data points on the prediction under different noise levels is given in table 2.

    Generally speaking, with more layers and more neurons the model performs better [22]. We therefore carried out many numerical experiments to check this empirical observation. See table 3 for a comparison with different numbers of hidden layers and neurons per hidden layer. More detailed theoretical analyses can be found in [23, 24].

    Finally, we also trained the model using the Adam [25] optimizer with its default parameter settings to approximate the antikink solution of the sine-Gordon equation. The training procedure takes approximately 2.5 h with N_u = 100 and N_f = 10 000 over 30 000 epochs. The relative L² error between the exact and predicted solutions is 1.47e-02. The experimental results show that, in this case, the L-BFGS algorithm is much faster than Adam and yields a more accurate solution. However, the former sometimes suffers from convergence issues.

    Table 2. Relative L² errors for different numbers of data points under different noise levels.

    Noise 0%:
    N_u \ N_f   1000       5000       10 000     20 000
    10          3.67e-01   1.47e-01   1.02e-01   2.27e-01
    50          6.98e-03   2.12e-03   1.14e-03   1.68e-03
    100         2.91e-03   1.48e-03   1.37e-03   1.23e-03
    200         1.07e-03   1.21e-03   9.73e-04   6.09e-04

    Noise 1%:
    N_u \ N_f   1000       5000       10 000     20 000
    10          5.17e-01   1.41e-01   8.58e-02   5.34e-02
    50          1.35e-02   5.71e-03   5.81e-03   3.75e-03
    100         9.52e-03   5.95e-03   6.92e-03   3.55e-03
    200         3.60e-03   1.42e-03   2.59e-03   2.03e-03

    Table 3. Relative L² errors for different numbers of hidden layers and neurons per hidden layer under a fixed random seed.

    Hidden layers \ Neurons   10         20         40         80
    4                         5.09e-01   7.01e-04   4.98e-04   1.21e-03
    8                         1.76e-03   1.47e-03   9.65e-04   8.43e-04
    12                        7.91e-04   7.87e-04   6.09e-04   7.05e-03
    16                        6.65e-04   3.38e-04   3.70e-04   2.16e-03

    4.Conclusions and discussion

    In this paper, we have proposed a new architecture that combines a deep residual network with underlying physical laws for extracting the soliton dynamics of the sine-Gordon equation from spatiotemporal data. This architecture makes it easy to train very deep networks and thereby alleviates the gradient exploding and vanishing problems. Moreover, we use the logcosh function rather than the square function in the objective in order to accelerate training and improve the performance of the network. The numerical results show that the model reconstructs the solution behavior of the equation very accurately. Moreover, the model is remarkably robust under small disturbances, to a certain extent.

    Despite this progress, we are still at an early stage in understanding the capabilities and limitations of such deep learning models. In future research we will also consider other advanced frameworks, for example generative adversarial networks, recurrent neural networks, and networks with numerical schemes embedded.

    Acknowledgments

    The first author would like to express his sincere thanks to Dr Yuqi Li and Dr Xiaoen Zhang for their valuable comments and excellent suggestions on this work. The authors gratefully acknowledge the support of the National Natural Science Foundation of China (Grant No. 11675054), the Shanghai Collaborative Innovation Center of Trustworthy Software for Internet of Things (Grant No. ZF1213) and the Science and Technology Commission of Shanghai Municipality (Grant No. 18dz2271000).
