
    A novel method for atomization energy prediction based on natural-parameter network

    Chinese Chemical Letters, 2023, Issue 12

    Chaoqin Chu, Qinkun Xiao, Chaozheng He, Chen Chen, Lu Li, Junyan Zhao, Jinzhou Zheng, Yinhuan Zhang

    a School of Mechanical and Electrical Engineering,Xi’an Technological University,Xi’an 710021,China

    b School of Electrical and Information Engineering,Xi’an Technological University,Xi’an 710021,China

    c School of Materials Science and Chemical Engineering,Xi’an Technological University,Xi’an 710021,China

    Keywords: Structure prediction; Atomization energy; Deep learning; Coulomb matrix; NPN; End-to-end

    ABSTRACT Atomization energy (AE) is an important indicator of material stability and reactivity; it refers to the energy change when a polyatomic molecule decomposes into its constituent atoms. Predicting AE from the structural information of molecules has been a focus of researchers, but existing methods have limitations such as being time-consuming or requiring complex preprocessing and large amounts of training data. Deep learning (DL), a newer branch of machine learning (ML), has shown promise in learning the internal rules and hierarchical representations of sample data, making it a potential solution for AE prediction. To address this problem, we propose a natural-parameter network (NPN) approach for AE prediction. This method establishes a clearer statistical interpretation of the relationship between the network's output and the given data. We use the Coulomb matrix (CM) method to represent each compound as a structural information matrix, and we designed an end-to-end predictive model. Experimental results demonstrate that our method achieves excellent performance on the QM7 and BC2P datasets; the mean absolute error (MAE) obtained on the QM7 test set ranges from 0.2 kcal/mol to 3 kcal/mol. The best result of our method is roughly an order of magnitude more accurate than the 3 kcal/mol accuracy reported in published works. Additionally, our approach significantly accelerates prediction. Overall, this study presents a promising approach to accelerating structure prediction with DL and provides a valuable contribution to the field of chemical energy prediction.

    Accurate prediction of structures in chemical compound space (CCS) is crucial for the design and synthesis of novel materials [1]. Although theoretical structure prediction methods have been widely adopted in materials research [2,3], the computational cost associated with large-scale structural systems remains a challenge. The CALYPSO software, based on swarm intelligence, was developed to accelerate structure prediction [4,5]. Existing first-principles methods based on density functional theory (DFT) are time-consuming and economically expensive when applied to large-scale structures. Consequently, there is an urgent need to accelerate the process of material structure prediction [6].

    Typically, structure prediction methods involve several steps [6]: (1) randomly generating an initial population of structures under symmetry constraints; (2) assessing the similarity among generated structures and eliminating duplicates; (3) performing local optimization and energy calculations on the generated structures using DFT; (4) constructing the next generation of structures using swarm intelligence algorithms, where low-energy structures evolve into new structures while high-energy structures are replaced by randomly generated ones. However, for large-scale structures containing hundreds or thousands of atoms, the computational time required for the energy calculations is significant, and performing energy calculations and optimizations for multiple randomly generated structures further compounds the time cost. This limitation significantly impedes the prediction of large-scale material structures in most structure prediction software. Deep learning (DL) methods have achieved substantial success in computer vision, pattern recognition, and other fields [7,8]. However, their application to predicting and generating new structures in computational chemistry remains limited [9].

    In summary, accelerating material structure prediction is a critical problem. While traditional methods struggle with large-scale structures, DL methods offer a potential solution: by leveraging DL techniques, it may be possible to overcome the time constraints associated with energy calculations and optimize the prediction process. The application of DL in computational chemistry holds promise for advancing structure prediction and the generation of new structures.

    In recent years, the rapid development of artificial intelligence has made ML a method that researchers are eager to explore. ML has become popular in application scenarios that require prediction, classification, and decision-making [4,10]. With the availability of large-scale quantum mechanical data [11–13], researchers have established ML models and used them to predict material properties such as formation energies, defect energies, elasticity, and other mechanical properties [4]. For instance, Hansen et al. [14] employed a linear regression (LR) algorithm to learn the relationship between structural information and cluster energy, and predicted the energy of new clusters from atomic Cartesian coordinates and nuclear charges. Meanwhile, neural network (NN) methods [15,16] have been leveraged to accelerate the construction of potential energy surfaces. Rupp et al. [17] introduced a machine learning model that predicts the AEs of different organic molecules. Gupta et al. [18,19] utilized DL to establish the correlation between molecular topological features and accurate AE predictions, achieving an impressive MAE.

    In addition, some DL methods have been used to replace the DFT calculation process, reducing the computational complexity of the system and accelerating structure prediction in an end-to-end way [20]. Owing to the strong feature extraction and feature learning abilities of DL methods, we introduce a new NPN [21] model that predicts the AE of different molecules in an end-to-end manner based solely on nuclear charges and atomic positions. To address this task, an end-to-end energy prediction model was constructed; its schematic diagram is shown in Fig. 1.

    Fig. 1. Schematic diagram of the end-to-end chemical energy prediction process.

    Let Φ be an AE prediction model. The model Φ_NPN utilizes a neural network architecture for probabilistic modeling. It leverages the properties of exponential family distributions to transform the neural network output into the natural parameters of an exponential family distribution. This allows the NPN to conveniently parameterize the probabilistic model and maximize the log-likelihood function with gradient-based optimization algorithms during training. The NPN transforms the conditional probability p(Y|X) between the input variable and the output variable into the form of an exponential family distribution. From the molecular structure information, we extract the coordinate information of all atoms of each molecule as the input of Φ_NPN, represented by X; the AE of each molecule is represented by Y. Here we have:

    Y = Φ_NPN(X; θ_NPN)

    where X is the coordinate information of all atoms of each molecule, and θ_NPN denotes the parameters of the NPN model. The NPN consists of both linear and nonlinear transformation layers. Here, we first introduce the linear form of the NPN. For simplicity, assume a Gaussian distribution with two natural parameters ξ = (q, d)^T. We decompose the distribution over a weight matrix W_l into factorized components, where (W_l^q, W_l^d) is the corresponding natural parameter. We assume similarly decomposed distributions for b_l, z_l, and x_l in the NPN, all of which are exponential family distributions. For computational convenience, we wish to approximate z_l using another exponential family distribution by matching its mean and variance. To compute the natural parameters (z_l^q, z_l^d), we can obtain them from the mean z_l^m and variance z_l^s of z_l as follows:

    z_l^m = x_{l-1}^m W_l^m + b_l^m
    z_l^s = x_{l-1}^s W_l^s + x_{l-1}^s (W_l^m ⊙ W_l^m) + (x_{l-1}^m ⊙ x_{l-1}^m) W_l^s + b_l^s
    (z_l^q, z_l^d) = f^{-1}(z_l^m, z_l^s)

    where the symbol ⊙ represents the element-wise product, and the bijective function f(·,·) maps natural parameters to the distribution's mean and variance; f^{-1}(·,·) denotes the inverse transformation. The means and variances of W_l and b_l, obtained from their natural parameters, are denoted (W_l^m, W_l^s) and (b_l^m, b_l^s), respectively. The computed values of z_l^m and z_l^s can be used to recover (z_l^q, z_l^d), thereby streamlining the feed-forward calculation of the nonlinear transformation.

    Once we have the distribution over z_l, defined by the linearly transformed natural parameters (z_l^q, z_l^d), we apply an element-wise nonlinear transformation v(·) to it. The resulting distribution of activations is p_x(x_l) = p_z(v^{-1}(x_l)) |v^{-1}'(x_l)|, where p_z is the factorized distribution over z_l defined by (z_l^q, z_l^d). Even if p_x(x_l) is not an exponential-family distribution, we can still approximate it with one by matching the first two moments. Once we obtain the mean x_l^m and variance x_l^s of p_x(x_l), we can compute the corresponding natural parameters using f^{-1}(·,·). The feed-forward computation can be expressed as:

    x_l^m = ∫ p_z(z_l) v(z_l) dz_l
    x_l^s = ∫ p_z(z_l) v(z_l)² dz_l − x_l^m ⊙ x_l^m
    (x_l^q, x_l^d) = f^{-1}(x_l^m, x_l^s)

    To model distributions over weights and neurons, the natural parameters must be learned. The NPN is designed to take a vector random distribution as input, such as a multivariate Gaussian distribution. It then multiplies this input by a matrix random distribution and applies a nonlinear transformation before outputting another distribution. Since all three distributions in this process can be characterized by their natural parameters, learning and prediction can be performed in the space of natural parameters. During back-propagation, for distributions characterized by two natural parameters, the gradient is composed of two terms; for instance, ∂E/∂z^q = (∂E/∂x^m) ⊙ (∂x^m/∂z^q) + (∂E/∂x^s) ⊙ (∂x^s/∂z^q), where E represents the error term of the network. Naturally, nonlinear NPN layers can be stacked to form a deep NPN, as shown in Algorithm 1. The NPN does not need expensive inference algorithms, such as variational inference or Markov chain Monte Carlo (MCMC). Moreover, in terms of flexibility, different types of exponential family distributions can be chosen for the weights and neurons.

    Algorithm 1 Deep NPN’s algorithm.
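    Since the algorithm listing itself is not reproduced here, the moment-matching forward pass described above can be sketched as follows. This is an illustrative NumPy sketch under the Gaussian assumption, not the authors' implementation: the function names are ours, and the ReLU moments are approximated by Monte Carlo sampling rather than the closed form.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def npn_linear(x_m, x_s, W_m, W_s, b_m, b_s):
        # Mean/variance of z = W x + b under factorized Gaussian
        # weights and inputs (moment matching, as in the text).
        z_m = x_m @ W_m + b_m
        z_s = x_s @ W_s + x_s @ (W_m ** 2) + (x_m ** 2) @ W_s + b_s
        return z_m, z_s

    def npn_relu(z_m, z_s, n_samples=20000):
        # Approximate the mean/variance of ReLU(z) by sampling; a
        # closed form exists for Gaussians, but sampling keeps the
        # sketch short.
        z = rng.normal(z_m, np.sqrt(z_s), size=(n_samples,) + np.shape(z_m))
        a = np.maximum(z, 0.0)
        return a.mean(axis=0), a.var(axis=0)
    ```

    Stacking `npn_linear` and `npn_relu` pairs yields a deep NPN in which every intermediate quantity is a (mean, variance) pair rather than a point value.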

    We know that χ can be converted, following a previous study [22], into Coulomb matrices (CM). A CM encodes the atomic charges and atomic coordinates of a molecule, and can be calculated using the following formula:

    C_ij = 0.5 Z_i^2.4              if i = j
    C_ij = Z_i Z_j / |R_i − R_j|    if i ≠ j

    where 1 ≤ i, j ≤ 23, Z_i is the nuclear charge of atom i, and R_i is its Cartesian coordinate.

    Specifically, QM7 is a dataset consisting of 7165 molecules taken from the larger GDB-13 database, which comprises nearly one billion synthetic organic molecules. QM7 contains molecules composed of the atoms H, C, N, O, and S, with at most 23 atoms per molecule. The molecules in QM7 exhibit a diverse range of chemical structures, including double and triple bonds, cycles, carboxylic acids, cyanides, amides, alcohols, and epoxides [23]. The CM representation is used to describe the molecular structures; this representation is invariant to translations and rotations of the molecule. In addition to the CMs, the dataset includes AEs ranging from −800 kcal/mol to −2000 kcal/mol [14].

    The main focus of this article is the χ and L components of the QM7 dataset. Because the number of atoms varies across molecules, the sizes of their true CMs also differ. To give all CMs the same size of 23 × 23, we extended the original CM and filled the expanded space with zeros.
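    The zero-padded CM construction described above can be sketched as follows; the function name and the water-like geometry (coordinates in Bohr) are ours, purely for illustration.

    ```python
    import numpy as np

    def coulomb_matrix(Z, R, size=23):
        """Coulomb matrix zero-padded to a fixed size (23x23 for QM7)."""
        n = len(Z)
        C = np.zeros((size, size))
        for i in range(n):
            for j in range(n):
                if i == j:
                    C[i, j] = 0.5 * Z[i] ** 2.4          # diagonal term
                else:                                     # off-diagonal term
                    C[i, j] = Z[i] * Z[j] / np.linalg.norm(
                        np.asarray(R[i]) - np.asarray(R[j]))
        return C

    # Illustrative water-like geometry: one O and two H atoms.
    Z = [8, 1, 1]
    R = [(0.0, 0.0, 0.0), (1.8, 0.0, 0.0), (-0.45, 1.75, 0.0)]
    C = coulomb_matrix(Z, R)
    ```

    Rows and columns beyond the molecule's atom count stay zero, so every molecule maps to the same 23 × 23 input shape.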

    For the NPN, GCN, CNN, and LSTM models, we use the rectified linear unit (ReLU) activation function in the hidden layers. We used the ADAM optimizer [24] with a learning rate of 0.0001 and a decay rate of 0.00001, and based on our experience, we selected different combinations of network hyperparameters for model training and fine-tuning. We set the learning rate to {0.01, 0.001, 0.0001}, the number of training rounds to {100, 200, 300}, the batch size to {32, 64, 128}, and the number of hidden-layer nodes to {100, 200, 300}, and combined the values of these hyperparameters. The experimental results showed that model performance was optimal when the learning rate was 0.0001, the batch size was 64, the number of hidden-layer nodes was 200, and the number of training rounds was 200. The proposed models were implemented with the PyTorch library. As for the hardware configuration, the processor is an Intel(R) Xeon(R) Silver 4110 CPU @ 2.10 GHz, the RAM has a capacity of 32.0 GB, and the graphics card is an NVIDIA RTX 2080 Ti [25]. We randomly divided the dataset into a training set and a test set in a ratio of 8:2.
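    The grid of candidate hyperparameters listed above can be enumerated programmatically; a minimal sketch (variable names are ours) using the values from the text:

    ```python
    from itertools import product

    learning_rates = [0.01, 0.001, 0.0001]
    epochs         = [100, 200, 300]
    batch_sizes    = [32, 64, 128]
    hidden_nodes   = [100, 200, 300]

    # Every combination of the four candidate lists: 3^4 = 81 configurations.
    configs = list(product(learning_rates, epochs, batch_sizes, hidden_nodes))

    # The optimum reported in the text: lr=0.0001, 200 rounds, batch 64, 200 nodes.
    best = (0.0001, 200, 64, 200)
    ```

    Each tuple in `configs` would be used to train and evaluate one model, keeping the configuration with the lowest validation error.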

    We used the proposed method to train the AE prediction models on the training data and validated their performance on the testing data; the losses of the proposed models during training are shown in Fig. 2.

    Fig. 2. Training losses of the different DL models.

    By training on a large set of different molecules in QM7, we use MAE and root mean square error (RMSE) as evaluation indicators to assess the prediction performance of various algorithms under different molecular representation methods. The comparison between our method and other methods [22] is shown in Table 1. Our results indicate that the proposed method exhibits a lower prediction error, indicating excellent prediction performance. On this basis, we trained and tested the different DL models we designed on the QM7 dataset [26] and obtained information reflecting model performance, such as model parameters, testing time, and testing errors. Experiments on real-world datasets [26] show that the NPN can achieve state-of-the-art performance on regression tasks. The performance of the different methods is compared in Table 2.
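    For reference, the two evaluation metrics used above can be computed as follows; the arrays shown are toy values, not results from the paper.

    ```python
    import numpy as np

    def mae(y_true, y_pred):
        # Mean absolute error: average magnitude of the prediction errors.
        return np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred)))

    def rmse(y_true, y_pred):
        # Root mean square error: penalizes large errors more strongly.
        return np.sqrt(np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2))

    # Toy AEs in kcal/mol, purely illustrative.
    y_true = np.array([-1100.0, -950.0, -1300.0])
    y_pred = np.array([-1099.0, -952.0, -1297.0])
    ```

    RMSE is never smaller than MAE for the same errors, which is why reporting both gives a sense of how uneven the per-molecule errors are.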

    Table 1 Comparison of prediction errors across multiple algorithms under different representation types using MAE and RMSE metrics.

    Table 2 Comparison results of performance evaluation of different networks.

    In previous experiments [14], researchers often merged feature values/vectors and feature centers with the flattened CM to form a 7165-sample dataset with N-dimensional features. However, integrating new molecular property information brings a new problem: the concatenation of old and new features may introduce unwanted "heterogeneity" into the feature vectors, and including more features may actually increase noise in the dataset [30]. It should be noted that some traditional ML techniques may not be able to identify meaningful patterns in these newly added features, so merging new features may yield poorer results [31].

    We present an analysis of the AE prediction results for four different molecules and their isomers, as shown in Fig. 3. Based on the structural information, our proposed method achieved good AE prediction results by using the extended CM method to characterize molecules. However, we also found that the model has a significant prediction error for molecules with fewer atoms [32], which may be due to interference from the additional information introduced by the extended CM with the model's learning of structural features. For some molecules with compact rather than sparse spatial structures, the prediction errors were relatively large; we attribute this to the high-density distribution of many identical atoms in the molecule, which hampers the model's extraction of CM features.

    Fig. 3. Visualization of different conformations of four molecules and comparison of their AE prediction results. The AEs predicted by our model are shown in parentheses; all AEs are in kcal/mol. Our model has a prediction error of about 0.2–3 kcal/mol for AE. Data marked in blue indicate an error between 5 and 10 kcal/mol, while data marked in red indicate an error over 10 kcal/mol.

    To validate the robustness and effectiveness of our proposed method, we randomly divided the BC2P data from Fu et al. [33] into a training dataset and a test dataset with an 8:2 ratio. We used the proposed method to train a model on the training data, and the trained model accurately predicts the corresponding energy [33]. The training loss curve can be seen in Fig. 4. Moreover, we tested the model on the test data; the results showed an average prediction time of 0.391 ms for the energy of each molecular conformation and an average test loss of 0.236 eV/atom. The performance evaluation of the NPN model on BC2P is shown in Table 2.

    Fig. 4. Training loss curve of our model on the BC2P data.

    This study proposes and validates a method for predicting atomization energy. Initially, we introduce the extended CM representation of molecular structure along with its inherent characteristics. Subsequently, we compare the performance of various ML and DL models for predicting AEs. These models not only enable fast AE prediction but also expedite the structure prediction process. Furthermore, to overcome the limitations of our current method, we plan to incorporate the three-dimensional spatial pattern of material structure information in future research, employing the graph convolutional network (GCN) method to mathematically represent atoms in chemical molecules. In addition to model development, we discuss the results of the numerical experiments conducted in this study. Through extensive comparative experiments and analyses, we gain valuable insights into the performance and potential applications of the proposed method, including catalyst design, materials discovery, optimization of energy storage and conversion, and material performance prediction. In the future, we envision further in-depth research on material performance prediction and materials discovery. By applying machine learning and data mining techniques to large datasets, we can uncover hidden correlations within complex data structures and predict new material structures and their corresponding properties. Ultimately, this research contributes to the advancement of materials science and opens up new possibilities for designing innovative materials with tailored properties.

    Declaration of competing interest

    The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

    Acknowledgment

    This work was supported by the National Natural Science Foundation of China (Nos. 61671362 and 62071366).
