
    A physics-constrained deep residual network for solving the sine-Gordon equation

Communications in Theoretical Physics, 2021, Issue 1

Jun Li (李軍) and Yong Chen (陳勇)

1 Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai, 200062, China

2 School of Mathematical Sciences, Shanghai Key Laboratory of PMMP, Shanghai Key Laboratory of Trustworthy Computing, East China Normal University, Shanghai, 200062, China

3 College of Mathematics and Systems Science, Shandong University of Science and Technology, Qingdao, 266590, China

4 Department of Physics, Zhejiang Normal University, Jinhua, 321004, China

Abstract Despite some empirical successes in solving nonlinear evolution equations with deep learning, several issues remain unresolved. First, existing models do not capture well the dynamical behavior of equations that include highly nonlinear source terms. Second, gradient exploding and vanishing problems often occur in traditional feedforward neural networks. In this paper, we propose a new architecture that combines a deep residual neural network with underlying physical laws. Using the sine-Gordon equation as an example, we show that the numerical result is in good agreement with the exact soliton solution. In addition, extensive numerical experiments show that the model is robust under small perturbations to a certain extent.

Keywords: sine-Gordon equation, deep residual network, soliton, integrable system

1. Introduction

Solving nonlinear evolution equations computationally plays a very important role in physics and engineering. Approximating these equations with deep learning rather than traditional numerical methods has been studied in [1–3]. Han et al [1] introduce a deep learning approach that approximates the gradient of the unknown solution, whereas Raissi et al [2, 3] approximate the latent solution directly with deep neural networks. Recently, we have also explored applications to other evolution equations such as the Burgers equation (strictly speaking, we only studied kink-type solitary wave solutions of this equation), the Korteweg–de Vries (KdV) equation, the modified KdV equation, and the Sharma–Tasso–Olver equation [4, 5]; these integrable equations have exact, explicit solutions against which the accuracy of neural-network solutions can be tested. Compared with traditional numerical approaches, these data-driven solutions are given in closed analytic form, are differentiable, and are easy to use in subsequent calculations. Moreover, unlike traditional numerical approaches, the deep learning method does not require discretization of the spatial and temporal domains.

Some experiments show that the model in [2] cannot approximate well the solutions of equations that include highly nonlinear source terms such as sin(u), and that the model in [3] often fails to represent the solution dynamics when certain activation functions are used. It is therefore of interest to improve the network architecture to solve, or at least alleviate, these problems. In this paper, we propose a physics-constrained deep residual network that combines the deep residual neural network [6, 7] with underlying laws of physics. To our knowledge, this is the first framework to combine a residual network with underlying physical laws. This framework is easier to optimize than classical feedforward networks. Moreover, the skip connections increase gradient flow and thereby alleviate the gradient vanishing/exploding problems, and they make it possible to train very deep networks and so improve performance. Note that we use only a simple identity connection here; more complex connections between network layers are possible [8].

    Figure 1.This figure sketches how the residual block works.

Specifically, in this paper we consider the sine-Gordon equation [9–11], which includes a highly nonlinear source term:

$$u_{tt} - u_{xx} + \sin(u) = 0, \qquad (1)$$

where the solution u is a function of the space variable x and the time variable t. Note that, through the coordinate transformation $\xi = (x + t)/2$ and $\eta = (x - t)/2$, we can obtain another expression of this equation, namely,

$$u_{\xi\eta} = \sin(u). \qquad (2)$$
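For the reader's convenience, the chain rule behind this change of variables reads

$$\partial_t = \tfrac{1}{2}\left(\partial_\xi - \partial_\eta\right), \qquad \partial_x = \tfrac{1}{2}\left(\partial_\xi + \partial_\eta\right), \qquad u_{tt} - u_{xx} = -\,u_{\xi\eta},$$

so that equation (1) is indeed equivalent to equation (2).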

The paper is organized as follows. In section 2, we propose the physics-constrained deep residual network and present a new objective function. In section 3, we focus on the antikink solution to the sine-Gordon equation and on how different settings, such as noise, the number of sampled data points, and the numbers of network layers and neurons, influence the model. Finally, we conclude the paper in section 4.

2. Method

The residual block adds the output of a series of layers directly to the input of the block. Formally, in this work we consider a residual block defined as

$$\mathbf{x}^{[n+1]} = \mathbf{x}^{[n]} + W^{(n,4)}\,\sigma\!\left(W^{(n,3)}\,\sigma\!\left(W^{(n,2)}\,\sigma\!\left(W^{(n,1)}\,\mathbf{x}^{[n]} + b^{(n,1)}\right) + b^{(n,2)}\right) + b^{(n,3)}\right) + b^{(n,4)}, \qquad (3)$$

where W^{(n,i)}, b^{(n,i)}, i = 1, …, 4 are the weights and biases of the corresponding layers and σ is the activation function. Note that we use the activation function only in the hidden layers, not in the output layer. The identity map skips a few layers and connects to the output directly; if any layer hurts the performance of the network, it can effectively be skipped, so this mechanism can be regarded as a kind of regularization. As a result, very deep neural networks can be trained without the problems caused by vanishing or exploding gradients.

In addition, we know that this residual structure is just a special case of the Euler forward scheme [12]

$$\mathbf{x}^{[n+1]} = \mathbf{x}^{[n]} + h\,\mathcal{F}\!\left(\mathbf{x}^{[n]}\right) \qquad (4)$$

when h = 1. See [13, 14] for more detailed analyses from the viewpoint of dynamical systems and differential equations.
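As a concrete illustration, the following PyTorch sketch implements one such residual block. This is a minimal sketch under the description above: the layer width, the placement of the sin activations between the four dense layers, and the class name are our assumptions for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """x -> x + F(x): four dense layers with sin activations between them,
    plus an identity skip connection, as in equation (3)."""
    def __init__(self, width: int):
        super().__init__()
        self.linears = nn.ModuleList(nn.Linear(width, width) for _ in range(4))
        for linear in self.linears:
            nn.init.xavier_normal_(linear.weight)   # Xavier initialization, as in the paper
            nn.init.zeros_(linear.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        y = x
        for linear in self.linears[:-1]:
            y = torch.sin(linear(y))                # periodic sin activation (see section 2)
        y = self.linears[-1](y)                     # no activation on the block's last layer
        return x + y                                # identity skip: Euler step (4) with h = 1
```

Stacking three such blocks (with input and output layers matched to (t, x) and u) gives the solution network used below.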

In this work, we approximate the unknown solution u directly by a deep residual network composed of three residual blocks, and accordingly define a network

$$f := u_{tt} - u_{xx} + \sin(u), \qquad (5)$$

where the two networks share all parameters to be learned. The solution network is trained to satisfy the sine-Gordon equation (1) and the corresponding initial/boundary conditions. With the help of automatic differentiation techniques [15], equation (5) is embedded into a new loss function we propose here:

$$\mathcal{L} = \frac{1}{N_u}\sum_{i=1}^{N_u} \operatorname{logcosh}\!\left(u(t_u^i, x_u^i) - u^i\right) + \frac{\lambda}{N_f}\sum_{j=1}^{N_f} \operatorname{logcosh}\!\left(f(t_f^j, x_f^j)\right), \qquad (6)$$

in order to exploit the underlying laws of physics to discover patterns from experimental/simulated data, where $\{t_u^i, x_u^i, u^i\}$ denote the initial/boundary training data for the network u and $\{t_f^j, x_f^j\}$ specify the collocation points for the network f. The first term on the right-hand side of equation (6) measures the difference between the true underlying dynamics and the measured dynamics, which may differ because of noise from various sources. The second term enforces the constraint f ≈ 0, which prevents overfitting and denoises the data. In this paper, we simply set λ = 1 and choose the L-BFGS algorithm [16] to optimize the loss function (6). Note that, unlike numerical differentiation, automatic differentiation is only weakly affected by noise and has desirable stability, as will be observed in the next section. As equation (6) shows, we use logcosh(x), which in some contexts is also called the smooth L1 function, as the objective function. A short calculation gives

$$\operatorname{logcosh}(x) = \log\!\left(\frac{e^{x} + e^{-x}}{2}\right) \approx \begin{cases} x^{2}/2, & |x| \ll 1, \\ |x| - \log 2, & |x| \gg 1. \end{cases} \qquad (7)$$

Obviously, it is less affected by outliers and is twice differentiable everywhere. Moreover, a comparison of the absolute function, the square function and this function is given in figure 2. The logcosh function relates to the absolute function much as the swish function relates to the rectified linear unit (ReLU). More discussion of the choice of the objective is given in the next section.
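To make the loss concrete, here is a minimal PyTorch sketch of the residual (5) computed by automatic differentiation together with the objective (6). The tensor shapes (column vectors of time and space coordinates), the helper names, and the mean reduction are our assumptions; the numerically stable form of logcosh is a standard identity.

```python
import math
import torch

def logcosh(x: torch.Tensor) -> torch.Tensor:
    # Stable log(cosh(x)) = |x| + softplus(-2|x|) - log 2; quadratic near 0,
    # essentially |x| - log 2 for large |x|, matching equation (7).
    return x.abs() + torch.nn.functional.softplus(-2.0 * x.abs()) - math.log(2.0)

def pde_residual(u_net, t: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
    """f = u_tt - u_xx + sin(u), via automatic differentiation (equation (5))."""
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    u = u_net(torch.cat([t, x], dim=1))
    ones = torch.ones_like(u)
    u_t = torch.autograd.grad(u, t, ones, create_graph=True)[0]
    u_x = torch.autograd.grad(u, x, ones, create_graph=True)[0]
    u_tt = torch.autograd.grad(u_t, t, torch.ones_like(u_t), create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x, x, torch.ones_like(u_x), create_graph=True)[0]
    return u_tt - u_xx + torch.sin(u)

def loss_fn(u_net, t_u, x_u, u_data, t_f, x_f, lam: float = 1.0) -> torch.Tensor:
    # Data term on initial/boundary points + physics term on collocation points (6).
    u_pred = u_net(torch.cat([t_u, x_u], dim=1))
    f_pred = pde_residual(u_net, t_f, x_f)
    return logcosh(u_pred - u_data).mean() + lam * logcosh(f_pred).mean()
```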

Figure 2. A comparison of some common loss functions.

Some studies indicate that orthogonal initialization can maintain unit variance and thereby improve training and the performance of the architecture, but for most applications the Xavier initialization [17] used in this paper is sufficient. The activation function is what makes a neural network nonlinear, and this nonlinearity makes the network more expressive. Specifically, in this paper the second-order derivatives of the solution with respect to the spatial variable x and the temporal variable t are needed, so ReLU (max{0, x}) evidently does not work. Moreover, the data are usually rescaled to [−1, 1], whereas the sigmoid function σ(x) = 1/(1 + exp(−x)) restricts its output to [0, 1], so this function also does not work well. In addition, most of the derivative values of S-shaped functions such as σ and tanh tend to 0, which loses too much information and leads to the vanishing gradient problem to some extent; ReLU also cannot represent complicated, fine-grained details. Numerical experiments confirm that these functions indeed cannot recover the solution dynamics correctly. We think that other choices of weight initialization and data normalization may also affect the selection of the activation function. In this paper, therefore, we choose periodic functions, in particular the sinusoid, as activation functions [18]. The function sin(x) is zero-centered, and its derivative cos(x) is just a shifted sine function.

All experiments in this work are conducted on a MacBook Pro with a 2.4 GHz dual-core Intel Core i5 processor.

3. Numerical Results

Soliton phenomena occur widely in physics, biology, communication and other scientific disciplines. The soliton solution (namely, a kink or antikink) of the sine-Gordon equation considered here, which is distinctly different from the bell-shaped KdV soliton, represents a topologically invariant quantity of the system. Specifically, the exact one-antikink solution is given by the Bäcklund transformation:

$$u(t, x) = 4\arctan\!\left(\exp\!\left(-\frac{x - ct - x_0}{\sqrt{1 - c^{2}}}\right)\right), \qquad (8)$$

Figure 3. Comparison of the loss curves (natural logarithm) when the physics-constrained deep residual network is trained with different loss functions.

where c is the translation speed and x0 is the initial position. For simplicity, we fix c and set x0 = 0. The antikink solution then reduces to

$$u(t, x) = 4\arctan\!\left(\exp\!\left(-\frac{x - ct}{\sqrt{1 - c^{2}}}\right)\right). \qquad (9)$$

Using the derivative formula (arctan z)' = 1/(1 + z²) and simplifying the result, we obtain the derivative of (9) with respect to t:

$$u_t(t, x) = \frac{2c}{\sqrt{1 - c^{2}}}\,\operatorname{sech}\!\left(\frac{x - ct}{\sqrt{1 - c^{2}}}\right). \qquad (10)$$

For the antikink solution (9), the initial conditions are given by (9) and (10) evaluated at t = 0, and the boundary conditions are u(t, x = −20) = 2π and u(t, x = 20) = 0 for t ∈ [0, 10]; that is, we consider the sine-Gordon equation with Dirichlet boundary conditions. To obtain the training and testing data, we sample the solution on an evenly spaced grid every Δt = 0.02 from t = 0 to t = 10, which yields 501 snapshots in total. From these data we generate a smaller training subset by randomly sub-sampling Nu = 200 initial/boundary data points (this number is usually relatively small; more details are discussed below) and Nf = 20 000 collocation points generated with the Latin hypercube sampling method [19].
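The data pipeline can be sketched as follows in NumPy/SciPy. The speed c below is a placeholder (the paper fixes its own value), and the spatial resolution and random seed are likewise illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc

c, x0 = 0.5, 0.0                                 # hypothetical speed; 0 < c < 1
gamma = np.sqrt(1.0 - c**2)

def antikink(t, x):
    # Exact antikink (9): u = 4 arctan(exp(-(x - c t - x0)/sqrt(1 - c^2)))
    return 4.0 * np.arctan(np.exp(-(x - c * t - x0) / gamma))

t = np.arange(0.0, 10.0 + 1e-9, 0.02)            # Δt = 0.02 gives 501 snapshots
x = np.linspace(-20.0, 20.0, 256)                # spatial grid (resolution assumed)
T, X = np.meshgrid(t, x)
U_true = antikink(T, X)

# Pool of initial/boundary points (t = 0 slice and the x = ±20 walls), N_u = 200 samples.
tb = np.concatenate([np.zeros_like(x), t, t])
xb = np.concatenate([x, np.full_like(t, -20.0), np.full_like(t, 20.0)])
ub = antikink(tb, xb)
idx = np.random.default_rng(0).choice(tb.size, size=200, replace=False)
t_u, x_u, u_data = tb[idx], xb[idx], ub[idx]

# N_f = 20 000 collocation points via Latin hypercube sampling on [0, 10] x [-20, 20].
pts = qmc.scale(qmc.LatinHypercube(d=2).random(20000), [0.0, -20.0], [10.0, 20.0])
t_f, x_f = pts[:, 0:1], pts[:, 1:2]

def relative_l2(u_true, u_pred):
    # Relative L2 error used throughout this section.
    return np.linalg.norm(u_true - u_pred) / np.linalg.norm(u_true)
```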

Figure 3 shows clearly that the logcosh function is significantly better as a loss objective than the square function. Moreover, the optimizer takes far fewer iterations to reach its optimal performance with the former; that is, this choice of objective accelerates the convergence of the optimization algorithm.

Figure 4 graphically shows the antikink evolution of the sine-Gordon equation. The top panel of figure 4 compares the exact dynamics with the predicted spatiotemporal behavior. The model achieves a relative L² error of 6.09e-04 in a runtime of about 10 min, where the error is defined as ‖u_true − u_pred‖₂/‖u_true‖₂. More detailed assessments are presented in the bottom panel of figure 4. In particular, we present a comparison between the exact solutions

Figure 4. The antikink solution to the sine-Gordon equation. Top: an exact antikink compared with the predicted solution of the learned model (right panel). The model correctly captures the dynamics and accurately reproduces the solution with a relative L² error of 6.09e-04. Bottom: a comparison of the predicted and exact solutions at the three temporal snapshots indicated by the white vertical lines in the top panel.

Figure 5. The antikink solution to the sine-Gordon equation. (a) The spatiotemporal behavior of the reconstructed antikink; (b) the spatiotemporal dynamics of the corresponding potential, where the potential is given by v = −u_x.

Table 1. Relative L² errors under different random seeds.

1.79e-03  2.02e-03  1.77e-03  7.87e-04  1.88e-03  3.20e-03  1.61e-03  3.45e-04

and the predicted solutions at three different instants, t = 1.24, 3.74 and 8.76. The result indicates that the model can accurately capture the antikink dynamics of the sine-Gordon equation. Moreover, the reconstructed single-antikink motion can be observed more clearly in figure 5.

Additionally, extensive numerical experiments show that our model is very robust to different random initializations (see table 1).

Figure 6. Comparison of the relative L² errors of the prediction results under small perturbations with different noise levels.

Next, we examine the performance of this framework in the presence of noise. We perturb the data by adding a small amount of noise,

$$\hat{u} = u + \delta\, s\, \epsilon, \qquad \epsilon \sim \mathcal{N}(0, 1), \qquad (11)$$

where δ denotes the noise level and s is the standard deviation of u(t, x). From figure 6, the numerical experiments reveal that the architecture is remarkably robust to small noise, and that the model can perform long-term prediction even when trained with noisy data; that is, the model can reconstruct the solution behavior from noisy data. From these experiments, we believe that input noise, to a certain extent, acts as a regularization mechanism that increases the robustness of the network; this is a kind of weak generalization. Additionally, from the mathematical viewpoint, this perturbation phenomenon can also be described by results on the orbital stability and asymptotic stability of solitons [20, 21].
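Concretely, the perturbation (11) amounts to one line on top of the data-generation sketch above (delta and the seed are illustrative):

```python
delta = 0.01                                   # 1% noise level
rng = np.random.default_rng(0)
u_noisy = u_data + delta * u_data.std() * rng.standard_normal(u_data.shape)
```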

As the noise level increases, however, the accuracy of the predicted solution decreases and the training time increases remarkably. For noisy data, we can increase the number of training data points to decrease the relative error and thus improve the accuracy of the predicted solution. An experimental comparison of the influence of different numbers of sub-sampled data points on the prediction, under different noise levels, is given in table 2.

Generally speaking, a model with more layers and more neurons performs better [22], so we carry out numerous numerical experiments to check this empirical claim. See table 3 for a comparison across different numbers of hidden layers and neurons per hidden layer. More detailed theoretical analyses have also been conducted [23, 24].

Last, we also train the model with the Adam optimizer [25], using the default parameter settings, to approximate the antikink solution to the sine-Gordon equation. The training takes approximately 2.5 h with Nu = 100 and Nf = 10 000 over 30 000 epochs, and the relative L² error between the exact and predicted solutions is 1.47e-02. This experiment shows that, in this case, L-BFGS is much faster than Adam and yields a more accurate solution, although the former sometimes suffers from convergence issues.
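For reference, a minimal PyTorch training loop with L-BFGS looks like the following, reusing u_net and loss_fn from the earlier sketches; the hyperparameters are illustrative assumptions, not the paper's exact settings.

```python
optimizer = torch.optim.LBFGS(u_net.parameters(), max_iter=50000,
                              tolerance_grad=1e-9, history_size=50,
                              line_search_fn="strong_wolfe")

def closure():
    optimizer.zero_grad()
    loss = loss_fn(u_net, t_u, x_u, u_data, t_f, x_f)
    loss.backward()
    return loss

optimizer.step(closure)   # one .step() drives the full L-BFGS run through the closure

# For the Adam comparison: opt = torch.optim.Adam(u_net.parameters()), then a loop of
# opt.zero_grad(); loss_fn(...).backward(); opt.step() over the epochs.
```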

Table 2. Relative L² errors for different numbers of data points under the distortion of different noise levels.

Noise 0%:

Nu \ Nf   1000      5000      10 000    20 000
10        3.67e-01  1.47e-01  1.02e-01  2.27e-01
50        6.98e-03  2.12e-03  1.14e-03  1.68e-03
100       2.91e-03  1.48e-03  1.37e-03  1.23e-03
200       1.07e-03  1.21e-03  9.73e-04  6.09e-04

Noise 1%:

Nu \ Nf   1000      5000      10 000    20 000
10        5.17e-01  1.41e-01  8.58e-02  5.34e-02
50        1.35e-02  5.71e-03  5.81e-03  3.75e-03
100       9.52e-03  5.95e-03  6.92e-03  3.55e-03
200       3.60e-03  1.42e-03  2.59e-03  2.03e-03

Table 3. Relative L² errors for different numbers of hidden layers and neurons per hidden layer under a fixed random seed.

Hidden layers \ Neurons   10        20        40        80
4                         5.09e-01  7.01e-04  4.98e-04  1.21e-03
8                         1.76e-03  1.47e-03  9.65e-04  8.43e-04
12                        7.91e-04  7.87e-04  6.09e-04  7.05e-03
16                        6.65e-04  3.38e-04  3.70e-04  2.16e-03

4. Conclusions and discussion

In this paper, we propose a new architecture that combines a deep residual network with underlying physical laws to extract the soliton dynamics of the sine-Gordon equation from spatiotemporal data. This architecture makes it easy to train very deep networks and thereby alleviates the gradient exploding and vanishing problems. Moreover, we use the logcosh function rather than the square function in the objective, which accelerates training and improves the performance of the network. The numerical results show that the model reconstructs the solution behavior of the equation very accurately, and that it is remarkably robust under small disturbances to a certain extent.

Despite this progress, we are still at an early stage in understanding the capabilities and limitations of such deep learning models. Other advanced frameworks, for example generative adversarial networks, recurrent neural networks, and networks with numerical schemes embedded, will also be considered in future research.

    Acknowledgments

The first author would like to express his sincere thanks to Dr Yuqi Li and Dr Xiaoen Zhang for their valuable comments and excellent suggestions on this work. The authors gratefully acknowledge the support of the National Natural Science Foundation of China (Grant No. 11675054), the Shanghai Collaborative Innovation Center of Trustworthy Software for Internet of Things (Grant No. ZF1213), and the Science and Technology Commission of Shanghai Municipality (Grant No. 18dz2271000).
