
LaNets: Hybrid Lagrange Neural Networks for Solving Partial Differential Equations

2023-01-22 09:01:32

Ying Li, Longxiang Xu, Fangjun Mei and Shihui Ying

1School of Computer Engineering and Science, Shanghai University, Shanghai, 200444, China

2School of Science, Shanghai University, Shanghai, 200444, China

ABSTRACT We propose new hybrid Lagrange neural networks, called LaNets, to predict the numerical solutions of partial differential equations. That is, we embed Lagrange interpolation and small sample learning into deep neural network frameworks. Concretely, we first perform Lagrange interpolation in front of the deep feedforward neural network. The Lagrange basis function has a neat structure and strong expressive ability, which makes it suitable as a preprocessing tool for pre-fitting and feature extraction. Second, we introduce small sample learning into training, which helps guide the model to correct itself quickly. Taking advantage of the theoretical support of traditional numerical methods and the efficient allocation of modern machine learning, LaNets achieve higher predictive accuracy than state-of-the-art work. The stability and accuracy of the proposed algorithm are demonstrated through a series of classical numerical examples, including the one-dimensional Burgers equation, one-dimensional carburizing diffusion equations, the two-dimensional Helmholtz equation and the two-dimensional Burgers equation. Experimental results validate the robustness, effectiveness and flexibility of the proposed algorithm.

KEYWORDS Hybrid Lagrange neural networks; interpolation polynomials; deep learning; numerical simulation; partial differential equations

    1 Introduction

In this paper, we consider partial differential equations (PDEs) of the general form in Eqs. (1.1)–(1.3):

$$\mathcal{L}u(t,x)=0,\qquad x\in\Omega,\ t\in[0,T], \tag{1.1}$$
$$u(0,x)=u_{0}(x),\qquad x\in\Omega, \tag{1.2}$$
$$u(t,x)=u_{\Gamma}(t,x),\qquad x\in\partial\Omega,\ t\in[0,T], \tag{1.3}$$

where L is a differential operator, Ω is a subset of R^N, and ∂Ω represents its boundary. u denotes the unknown function to be solved, and u0 and uΓ represent the initial and boundary conditions, respectively. Different governing equations and corresponding initial/boundary conditions can describe many physical phenomena in nature, but it is often practically impossible to find their analytical solutions. Therefore, more and more scholars have tried a variety of numerical methods to solve PDEs in recent years.

At first, traditional numerical methods, including the finite element method [1], finite volume method [2] and finite difference method [3], were usually used to solve partial differential equations. Afterwards, with the rapid development of machine learning [4–6], the universal approximation ability of neural networks came to be seen as helpful for obtaining approximate solutions of differential equations. Han et al. [7–9] proposed deep learning-based numerical approaches to solve variational problems, backward stochastic differential equations and high-dimensional equations. Chen et al. [10] then extended this work to solve the Navier-Stokes and Cahn-Hilliard equations. Sirignano et al. [11] combined the Galerkin method and deep learning to solve high-dimensional free boundary PDEs. Raissi et al. [12] proposed the physics-informed neural network (PINNs) framework, which serves as the benchmark algorithm in this field. In PINNs, physical constraints are added to limit the space of solutions and thereby improve accuracy. Furthermore, many scholars have carried out research based on this method. Dwivedi et al. [13] incorporated PINNs with extreme learning machines to solve time-dependent linear partial differential equations. Pang et al. [14] solved space-time fractional advection-diffusion equations by expanding PINNs to fractional PINNs. Raissi et al. [15] subsequently developed a physics-informed deep learning framework able to encode the Navier-Stokes equations into neural networks. Kharazmi et al. [16] constructed a variational physics-informed neural network to effectively reduce the cost of network training. Yang et al. [17] proposed Bayesian physics-informed neural networks for solving forward and inverse nonlinear problems with PDEs and noisy data. Meng et al. [18] developed a parareal physics-informed neural network to significantly accelerate the long-time integration of partial differential equations. Gao et al. [19] proposed a new physics-constrained convolutional neural network architecture to learn the solutions of parametric PDEs on irregular domains.

A neural network is a black-box model whose approximation ability depends partly on the depth and width of the network, so too many parameters cause a decrease in computational efficiency. One may use the Functional Link Artificial Neural Network (FLANN) [20] model to overcome this problem. In FLANN, the single hidden layer of the neural network is replaced by an expansion layer based on distinct polynomials. Mall et al. [21] used a Chebyshev neural network to solve elliptic partial differential equations by replacing the single hidden layer of the neural network with Chebyshev polynomials. Sun et al. [22] likewise replaced the hidden layer with Bernstein polynomials to obtain numerical solutions of PDEs. Owing to the use of polynomials, the neural network has no actual hidden layers, and the number of parameters is greatly reduced.

On the other hand, deep learning requires a lot of data. Its performance depends on large-scale, high-quality sample sets, but the cost of data acquisition is prohibitive. Moreover, sample labeling also consumes substantial human and material resources. Therefore, a popular learning paradigm named Small Sample Learning (SSL) [23] has been used in some new fields. SSL refers to the ability to learn and generalize from a small number of samples. At present, SSL has been successfully applied in medical image analysis [24], long-tail distribution target detection [25], remote sensing scene classification [26], etc.

In this paper, we integrate Lagrange interpolation and small sample learning with deep neural network frameworks to deal with the problems in existing models. Specifically, we replace the first hidden layer of the deep neural network with a Lagrange block. Here, the Lagrange block is a preprocessing tool for preliminary fitting and feature extraction of the input data. The Lagrange basis function has a neat structure and strong expressive ability, so it is fully capable of extracting detailed features of the input data for feature enhancement. The main idea of Lagrange interpolation is to interpolate function values at positions between the given nodes, yielding a pre-fitting behaviour without adding any extra parameters. The enhanced vector is then input to the subsequent hidden layers for the training of the network model. Furthermore, we add the residual of a handful of observations to the cost function to rectify the model and improve predictive accuracy with less labeled data. This composite neural network structure is quite flexible, mainly because the structure is easy to modify: the number of polynomials and hidden layers can be adjusted according to the complexity of different problems.

The structure of this paper is as follows. In Section 2, we introduce Lagrange polynomials, the structure of LaNets and the steps of the algorithm. Numerical experiments for one-dimensional and two-dimensional PDEs are described in Section 3. Finally, conclusions are given in Section 4.

2 LaNets: Theory, Architecture, Algorithm

In this section, we start with an illustration of Lagrange interpolation polynomials. After that, we discuss the framework of LaNets. Finally, we clarify the details of the proposed algorithm.

    2.1 Lagrange Interpolation Polynomial

Lagrange interpolation is a polynomial interpolation method introduced for numerical analysis by Joseph-Louis Lagrange, a French mathematician of the 18th century [27]. Interpolation is an important method for the approximation of functions: it uses the values of a function at finitely many points to estimate the function at other points. That is, a continuous function is interpolated on the basis of discrete data so that the continuous curve passes through all of the given discrete data points. Mathematically speaking, Lagrange interpolation gives a polynomial function that passes through several known points on a two-dimensional plane.

Assume x_1, x_2, ..., x_{n+1} are n+1 distinct points in the complex plane, and y_1, y_2, ..., y_{n+1} are the corresponding values at x_1, x_2, ..., x_{n+1}. The Lagrange polynomial L(x) of degree not exceeding n that interpolates these values is unique. Indeed, the uniqueness of L(x) arises from the fact that the difference of two such polynomials vanishes at the points x_1, ..., x_{n+1} while having degree no greater than n, and hence must be identically zero. The following polynomial clearly possesses all the necessary properties in Eqs. (2)–(3):

$$L(x)=\sum_{j=1}^{n+1} y_{j}\, l_{j}(x), \tag{2}$$

where

$$l_{j}(x)=\prod_{\substack{k=1 \\ k\neq j}}^{n+1}\frac{x-x_{k}}{x_{j}-x_{k}}. \tag{3}$$

Here, the polynomial L(x) is called the Lagrange interpolation polynomial, and the distinct points x_1, ..., x_{n+1} are called the interpolation points. The corresponding Lagrange polynomial can thus be obtained from the n+1 value points (x_1, y_1), ..., (x_{n+1}, y_{n+1}). The Lagrange interpolation polynomial obtained from only some points can stand in for the function to obtain its value at any other point. The correctness of Lagrange polynomials has been proved in the literature [27,28].
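The construction in Eqs. (2)–(3) can be sketched directly in code. The following NumPy snippet (the function names are illustrative, not from the paper) evaluates the basis functions and the interpolating polynomial:

```python
import numpy as np

def lagrange_basis(x, nodes, j):
    """Evaluate the j-th Lagrange basis function l_j(x) for the given nodes."""
    terms = [(x - xk) / (nodes[j] - xk) for k, xk in enumerate(nodes) if k != j]
    return np.prod(terms, axis=0)

def lagrange_interpolate(x, nodes, values):
    """Evaluate L(x) = sum_j y_j * l_j(x) through the points (nodes, values)."""
    return sum(values[j] * lagrange_basis(x, nodes, j) for j in range(len(nodes)))

# Interpolating f(x) = x^2 through three nodes recovers the quadratic exactly.
nodes = np.array([0.0, 0.5, 1.0])
values = nodes ** 2
print(lagrange_interpolate(0.25, nodes, values))  # 0.0625
```

Because the interpolant of degree at most n through n+1 points is unique, any polynomial of degree ≤ n is reproduced exactly, as the check above illustrates.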

    2.2 The Architecture of LaNets

Fig. 1 displays the structure of LaNets, which is composed of two main parts: a preprocessing part based on Lagrange polynomials, and the training of a deep feedforward neural network. Thus, the LaNets model we designed is a joint feedforward neural network composed of an input layer, a Lagrange block, hidden layers and an output layer. As described in Section 2.1, we can also write the Lagrange interpolation polynomial as Eq. (4):

$$L(x)=\sum_{j=0}^{n} y_{j}\, l_{j}(x), \tag{4}$$

where x_j and y_j in the above formula correspond to the position of the independent variable and the value of the function at this position, respectively. Here, we call l_j(x) the Lagrange interpolation basis function, and the expression of l_j(x) is as follows:

$$l_{j}(x)=\prod_{\substack{k=0 \\ k\neq j}}^{n}\frac{x-x_{k}}{x_{j}-x_{k}}. \tag{5}$$

Figure 1: The schematic drawing of the LaNets

As shown in Fig. 1, the original input vector is first extended to a new enhanced vector by the Lagrange block, and then sent to the deep feedforward neural network for training. The black Lagrange block on the right visualizes the Lagrange interpolation basis functions l0, l1 and l2. Both spatial and temporal variables can be handled with Lagrange basis functions. The proposed model not only increases the reliability and stability of the single-layer polynomial neural network, but also improves the predictive accuracy of the deep feedforward neural network without adding any extra parameters.
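To make the data flow concrete, here is a minimal NumPy sketch of how a Lagrange block with three basis functions could expand an input batch before it reaches the feedforward layers. The node positions and names below are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

# Hypothetical Lagrange block: expand each scalar input coordinate into the
# values of three Lagrange basis functions l_0, l_1, l_2 at fixed nodes.
NODES = np.array([-1.0, 0.0, 1.0])

def lagrange_block(x):
    """Map inputs of shape (batch, d) to enhanced features of shape (batch, 3*d)."""
    feats = []
    for j in range(len(NODES)):
        lj = np.ones_like(x)
        for k, xk in enumerate(NODES):
            if k != j:
                lj = lj * (x - xk) / (NODES[j] - xk)
        feats.append(lj)
    # Fixed transformation: no trainable parameters are added.
    return np.concatenate(feats, axis=1)

# The enhanced vector is then fed to an ordinary feedforward network.
xt = np.random.rand(5, 2)       # batch of (t, x) pairs
features = lagrange_block(xt)
print(features.shape)           # (5, 6)
```

Since the three basis values for each coordinate sum to one at any point, the block enriches the representation without distorting the input scale.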

2.3 Loss Function & Algorithm

The problem we aim to solve is described by Eqs. (1.1)–(1.3). Following the original work of Raissi et al. [12], F(t,x) can be defined as Eq. (6):

$$F(t,x):=\mathcal{L}u(t,x), \tag{6}$$

which measures how well a candidate u satisfies the governing equation. We approximate u(t,x) with the deep neural network u(t,x;θ), where θ represents the parameter set of the network. The model is then trained by minimizing the following compound loss function J(θ) in Eq. (7):

$$J(\theta)=\frac{1}{N_{u}}\sum_{i=1}^{N_{u}}\bigl|u(t_{u}^{i},x_{u}^{i};\theta)-u^{i}\bigr|^{2}+\frac{1}{N_{in}}\sum_{i=1}^{N_{in}}\bigl|u(t_{in}^{i},x_{in}^{i};\theta)-u^{i}\bigr|^{2}+\frac{1}{N_{f}}\sum_{i=1}^{N_{f}}\bigl|F(t_{f}^{i},x_{f}^{i})\bigr|^{2}, \tag{7}$$

where the three terms penalize, respectively, the misfit on the initial/boundary data, the misfit on the small sample data, and the residual of the governing equation at the collocation points.

An overview of this work is given in Algorithm 1. In the algorithm description, we consider the spatio-temporal variables x and t. The proposed method is, of course, also applicable to time-independent partial differential equations, and related examples appear in the following experiments.

Algorithm 1 Overview of the Proposed Algorithm.
Require: Initial and boundary data points {(t_u^i, x_u^i, u^i)}_{i=1}^{N_u}; small sample data points {(t_in^i, x_in^i, u^i)}_{i=1}^{N_in}; collocation points {(t_f^i, x_f^i)}_{i=1}^{N_f};
Ensure: Predicted LaNets solution u(t,x);
1: Specify the data set, including initial/boundary training data {(t_u^i, x_u^i, u^i)}_{i=1}^{N_u}, small sample training data {(t_in^i, x_in^i, u^i)}_{i=1}^{N_in} and residual training data {(t_f^i, x_f^i)}_{i=1}^{N_f};
2: Construct the LaNets u(t,x;θ) with parameters θ;
3: Specify the cost function by summing the residuals of the initial/boundary conditions, the residuals of the small sample data points and the residuals of the governing equations;
4: Train the neural network to find the optimal parameter set θ by minimizing the loss function J(θ);
5: Get the predicted composite network solution u(t,x) on the entire domain;
6: Return u(t,x).
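The cost assembled in step 3 of Algorithm 1 can be sketched as a plain function of three data sets. In the snippet below, `model` and `residual` are hypothetical stand-ins for the network u(t,x;θ) and the governing-equation residual F; the toy example uses an exact "model" so all three terms vanish:

```python
import numpy as np

def compound_loss(model, residual, data_u, data_in, data_f):
    """J(theta) = MSE on initial/boundary data + MSE on small sample data
    + MSE of the governing-equation residual at collocation points."""
    (tu, xu, uu), (ti, xi, ui), (tf, xf) = data_u, data_in, data_f
    mse_u  = np.mean((model(tu, xu) - uu) ** 2)   # initial/boundary fit
    mse_in = np.mean((model(ti, xi) - ui) ** 2)   # small sample correction
    mse_f  = np.mean(residual(tf, xf) ** 2)       # PDE residual
    return mse_u + mse_in + mse_f

# Toy stand-ins: a "model" that is exact for u(t, x) = t + x, zero residual.
model = lambda t, x: t + x
residual = lambda t, x: np.zeros_like(t)
t = np.linspace(0, 1, 10); x = np.linspace(0, 1, 10)
print(compound_loss(model, residual, (t, x, t + x), (t, x, t + x), (t, x)))  # 0.0
```

In practice each term would be minimized with a gradient-based optimizer over θ; the sketch only shows how the three residual sources combine into one scalar objective.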

    3 Numerical Experiments

In this section, we verify the performance and accuracy of LaNets numerically through experiments with benchmark equations. In Subsection 3.1, we provide three typical one-dimensional time-dependent PDEs to validate the robustness and validity of the proposed algorithm. In Subsection 3.2, two-dimensional PDEs are used to illustrate the reliability and stability of the method.

    3.1 Numerical Results for One-Dimensional Equations

In this subsection, we demonstrate the predictive accuracy of our method on three one-dimensional time-dependent PDEs: the Burgers equation, the carburizing equation with constant diffusion coefficient and the carburizing equation with variable diffusion coefficient.

    3.1.1 Burgers Equation

We start with the following one-dimensional time-dependent Burgers equation in Eqs. (8.1)–(8.3):

$$u_{t}+uu_{x}-\lambda u_{xx}=0,\qquad x\in[-1,1],\ t\in[0,1], \tag{8.1}$$
$$u(0,x)=-\sin(\pi x), \tag{8.2}$$
$$u(t,-1)=u(t,1)=0, \tag{8.3}$$

where λ is the viscosity parameter. In this case, we take λ = 0.01/π.

Here, the LaNets model consists of one Lagrange block and 7 hidden layers with 20 neurons in each layer. The Lagrange block is composed of three Lagrange basis functions, as it is by default unless otherwise specified. Fig. 2a illustrates the predicted numerical result of the Burgers equation; the relative L2 error measured at the end is 3.84×10^-4. The loss curve vs. iteration is displayed in Fig. 2b. The mean square error loss decreases steadily, which illustrates the stability of the proposed method.
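The relative L2 error reported throughout the experiments is the standard discrete quantity ||u_pred − u_exact||_2 / ||u_exact||_2, which can be computed in a few lines:

```python
import numpy as np

def relative_l2_error(u_pred, u_exact):
    """Relative L2 error: ||u_pred - u_exact||_2 / ||u_exact||_2."""
    return np.linalg.norm(u_pred - u_exact) / np.linalg.norm(u_exact)

# Illustration on synthetic data: a small uniform perturbation of sin(x).
u_exact = np.sin(np.linspace(0, np.pi, 100))
u_pred = u_exact + 1e-4
print(relative_l2_error(u_pred, u_exact))
```

Because the error is normalized by the magnitude of the exact solution, values from different equations and domains can be compared directly, as in Tables 1–5.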

Figure 2: (a) The predicted solution of the one-dimensional Burgers equation. Here, we adopt a 9-layer LaNets. The size of small sample points is 100. The relative L2 error measured is 3.84×10^-4. (b) The loss curve vs. iteration

To further verify the effectiveness of the proposed algorithm, we compare the predicted solution with the analytical solution provided in the literature [30] at four time snapshots, presented in Fig. 3. There is almost no difference between the predicted solution and the exact solution. Moreover, the sharp gradient formed near time t = 0.65 is also well captured.

Figure 3: The comparison of predicted solutions obtained by LaNets and exact solutions at four time snapshots t = (0.1, 0.25, 0.65, 0.99) for the one-dimensional Burgers equation

More detailed numerical results are summarized in Table 1. Note that the early work [31] serves as the benchmark. To observe the influence of the number of small sample points on the algorithm, we add 50 small sample points at a time and calculate the corresponding results. From Table 1, one can see that the error of the LaNets model is one order of magnitude lower than that of PINNs. In addition, 50 sample points here achieve higher predictive accuracy than the 300 sample points used in the benchmark model. This means more accurate predictions are achievable with less labeled data, saving considerable manpower and material resources and increasing computational efficiency.

Table 1: The relative L2 errors for the one-dimensional Burgers equation

    3.1.2 Carburizing Diffusion Model

We consider the one-dimensional carburizing diffusion model [32] in Eqs. (9.1)–(9.4):

where D(u) represents the diffusion coefficient and u is the concentration of carbon. Here, l and r denote the left and right boundaries of the model. Diffusion is the fundamental process of carburizing, and the diffusion coefficient is related to temperature, the content of alloy elements, the system, etc. Next we consider the carburizing diffusion equation with constant and variable diffusion coefficients, respectively.

1. Constant diffusion coefficient

We start with a constant diffusion coefficient D(u) according to Eq. (10):

$$D(u)=D_{0}\exp\!\left(-\frac{Q}{RT_{2}}\right), \tag{10}$$

where D0, Q, R and T2 are already given. In practical terms, D0 represents the pre-exponential factor, Q denotes the activation energy of carbon, R is the gas constant and T2 is the temperature during the carburizing process (K).

In this numerical experiment, we take D0 = 16.2 mm^2/s, Q = 137800 J/mol, R = 8.314 J/(K·mol) and T2 = 1123 K. The corresponding exact solution is written as Eq. (11):

where we have up = 1.2 and ud = 0.2. The terminal time T of this model is 36000, the left boundary l is 0, and the right boundary r is 2.5.
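Assuming the standard Arrhenius form D = D0 exp(−Q/(R·T2)) commonly used for carburizing diffusion (the constants below are exactly those quoted in the text; the Arrhenius form itself is our assumption), the constant coefficient can be evaluated directly:

```python
import math

# Assumed Arrhenius form for the constant diffusion coefficient:
# D = D0 * exp(-Q / (R * T2)), with the constants quoted in the text.
D0 = 16.2        # mm^2/s, pre-exponential factor
Q  = 137800.0    # J/mol, activation energy of carbon
R  = 8.314       # J/(K*mol), gas constant
T2 = 1123.0      # K, carburizing temperature

D = D0 * math.exp(-Q / (R * T2))
print(f"D = {D:.3e} mm^2/s")   # on the order of 1e-6 mm^2/s
```

With T = 36000 s this gives a diffusion length scale sqrt(D·T) well inside the spatial domain [0, 2.5], consistent with the simulation setup.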

Regarding the training set, we take Nu = 150, Nin = 300 and Nf = 10000. Moreover, we employ an 8-layer LaNets to represent the solution u(t,x) in this simulation. The LaNets model contains one Lagrange block, six hidden layers with 20 hidden neurons per layer and one output layer. Here, the relative L2 error is measured at 1.35×10^-3.

To evaluate the performance of our algorithm in multiple ways, we compare the simulation results with those obtained by the PINNs model and by our earlier model. The results for the three models are shown in Fig. 4. We can clearly see that the predicted solution of the PINNs model is not quite consistent with the exact solution, and the differences become more and more obvious over time. Here the LaNets model fits more accurately than the benchmark model, so the proposed method has obvious advantages in the long-time simulation of time-dependent partial differential equations. More precise error values for the three algorithms are listed in Table 2, from which we find that the predicted error of the benchmark model is one order of magnitude lower than that of PINNs. Meanwhile, the predicted error of LaNets with 300 sample points is almost one order of magnitude lower than that of the benchmark model. The decline curve of the loss function during training is shown in Fig. 5a. It can be seen that the loss declines to a small value within few iterations.

Figure 4: Predicted solutions for the one-dimensional carburizing equation with constant diffusion coefficient. Top row: LaNets model; middle row: benchmark model; bottom row: PINNs model. First column: t = 0.1 h; middle column: t = 1.0 h; last column: t = 10.0 h

Table 2: The relative L2 errors for the one-dimensional carburizing equation with constant diffusion coefficient

2. Variable diffusion coefficient

In this experiment, the carburizing diffusion coefficient D(u) varies with the temperature, the system and the element ratio. Here, we consider D(u) = cos u, and add a source term s(t,x) as in Eq. (12):

The analytical solution corresponding to this setting is Eq. (13):

In this example, we have d = 0.5, the left boundary l = −π, the right boundary r = π and the ending time T = 1. Moreover, we use an 8-layer LaNets to denote the spatio-temporal solution u(t,x). The curve of the loss function during training is shown in Fig. 5b.

Figure 5: (a) The loss curve vs. iteration for the one-dimensional carburizing diffusion equation with constant diffusion coefficient. (b) The loss curve vs. iteration for the one-dimensional carburizing diffusion equation with variable diffusion coefficient

Further, we contrast the simulation results obtained by the proposed model and the benchmark model. The detailed results are displayed in Fig. 6. While all experimental results seem consistent with the analytical results, the predicted solution of the LaNets model is closer to the exact solution. A more precise error evaluation is summarized in Table 3, from which we see that the prediction error of the LaNets model is always lower than that of the benchmark model when using the same number of small sample points.

Figure 6: Predicted solutions for the one-dimensional carburizing equation with variable diffusion coefficient. Top row: LaNets model; bottom row: benchmark model. First column: t = 0.1; middle column: t = 0.5; last column: t = 1.0

Table 3: The relative L2 errors for the one-dimensional carburizing equation with variable diffusion coefficient

    3.2 Numerical Results for Two-Dimensional Equations

In this section, we consider two-dimensional problems, including the time-independent Helmholtz equation and the time-dependent Burgers equation, to verify the effectiveness of the LaNets model. These two types of two-dimensional problems demonstrate the generalization ability of our method.

    3.2.1 Helmholtz Equation

In this example, we consider the time-independent two-dimensional Helmholtz equation as Eq. (14):

$$\Delta u(x,y)+k^{2}u(x,y)=f(x,y), \tag{14}$$

with homogeneous Dirichlet boundary conditions, where the source function f(x,y) is given by Eq. (15):

Here, we take k = 1, and the analytical solution is Eq. (16):

The training set of this example is generated according to the exact solution in the above equation. The problem is solved using a 4-layer LaNets model on the domain [−1,1]×[−1,1], where each hidden layer consists of 40 hidden neurons. The relative L2 error measured is 5.28×10^-4. The training set is specified as Nu = 400, Nin = 200, Nf = 10000.

The visual comparison among the LaNets, benchmark and PINNs results is displayed in Fig. 7, from which we find that the predicted solution of the benchmark model is not quite consistent with the exact solution. In addition, the proposed model is more accurate than the PINNs model, especially on the boundary. Detailed error values for the three models are shown in Table 4, from which we see that the predicted error of the LaNets model is always the smallest. The loss curve during the training process is shown in Fig. 8a: the value of the loss decreases continuously and smoothly from a higher value to a lower value, which shows the stability and robustness of the proposed model.


Figure 7: Predicted solutions for the two-dimensional Helmholtz equation. Top row: LaNets model; middle row: benchmark model; bottom row: PINNs model. First column: x = −0.8; middle column: x = 0.1; last column: x = 0.5

Table 4: The relative L2 errors for the two-dimensional Helmholtz equation

Figure 8: (a) The loss curve vs. iteration for the two-dimensional Helmholtz equation. (b) The loss curve vs. iteration for the two-dimensional Burgers equation

    3.2.2 2D Burgers Equation

In the last experiment, we consider a two-dimensional time-dependent Burgers equation as Eq. (17):

where u represents the predicted spatio-temporal solution. The corresponding initial and boundary conditions are given by Eq. (18):

In this example, we take λ = 0.1 and T = 3. The training set is generated by the exact solution, Eq. (18), which is utilized to assess the accuracy of our method. The computational domain is set to [0,1]×[0,1]×[0,3]. We apply an 8-layer LaNets model and each hidden layer consists of 20 neurons. The number of residual training points is 20000, the number of initial and boundary points is 150, and the number of Nin points is 300.

The decline curve of the loss function is shown in Fig. 8b. It can be seen that the loss value drops steadily to a small value within few iterations. Fig. 9 displays the 3D plot of the solution at t = 0.5, and the relative L2 error calculated is 2.06×10^-4. The experiment on the two-dimensional time-dependent Burgers equation shows that the proposed method can effectively solve higher-dimensional time-dependent PDEs. In principle, the LaNets model can solve PDEs in arbitrary dimensions; this is left for future work. The detailed relative L2 errors obtained by LaNets, the benchmark model and PINNs are given in Table 5, from which we see that the predicted error of LaNets is lower than that of the benchmark and PINNs models.

Figure 9: The predicted solution of the two-dimensional Burgers equation at t = 0.5

Table 5: The relative L2 errors for the two-dimensional Burgers equation

    4 Conclusion

In this paper, we propose hybrid Lagrange neural networks, called LaNets, to solve partial differential equations. We first perform Lagrange interpolation through a Lagrange block in front of the deep feedforward neural network architecture for pre-fitting and feature extraction. Then we add the residuals of small sample data points in the domain to the cost function to rectify the model. Compared with a single-layer polynomial network, LaNets greatly increase reliability and stability; compared with a general deep feedforward neural network, the proposed model improves predictive accuracy without adding any extra parameters. Moreover, the proposed model obtains more accurate predictions with less labeled data, which saves manpower and material resources and improves computational efficiency. A series of experiments demonstrates the effectiveness and robustness of the proposed method. In all cases, our model shows smaller predictive errors. The numerical results verify that the proposed method improves predictive accuracy, robustness and generalization ability.

Acknowledgement: This research was supported by the NSFC (No. 11971296) and the National Key Research and Development Program of China (No. 2021YFA1003004).

Funding Statement: The authors received no specific funding for this study.

Conflicts of Interest: The authors declare that they have no conflicts of interest to report regarding the present study.
