
    Neural network hyperparameter optimization based on improved particle swarm optimization①

    High Technology Letters, 2023, No.4

    XIE Xiaoyan(謝曉燕), HE Wanqi②, ZHU Yun, YU Jinhao

    (*School of Computer, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    (**School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, P.R.China)

    Abstract

    Key words: hyperparameter optimization, particle swarm optimization (PSO) algorithm, neural network

    0 Introduction

    With the rapid growth of artificial intelligence (AI) technologies, deep learning has achieved significant results in complex regression and classification problems, including computer vision (CV), natural language processing (NLP), and object detection[1]. A neural network model must be trained to reach satisfactory accuracy, but a large number of hyperparameters need to be configured during training, such as the learning rate, batch size, and optimizer. How to select appropriate hyperparameters to aid training and find the best neural network model has therefore become both a focus and a difficulty.

    In early studies, grid search was used to exhaustively search the parameter space. Later, an improved algorithm based on it, random search, was introduced. Experiments have shown that, with the same number of search iterations, random search tries more parameter values than grid search and reduces search time while maintaining model accuracy[2]. However, these search methods cannot solve the hyperparameter optimization problem of neural networks very well. Ref.[3] noted that hyperparameter optimization is an NP-hard optimization problem, and most current approaches address it with metaheuristic algorithms.

    Ref.[4] successfully used a genetic algorithm (GA) for hyperparameter tuning. Increasingly, metaheuristic and modified algorithms are being used to optimize neural networks, and some researchers have explored different directions[3]. Developing hybrid models can improve performance and the ability to handle complex optimization. Ref.[5] proposed an improved hybrid algorithm based on bat echolocation behavior, combined with local search, to optimize weights, architectures, and active neurons. Ref.[6] introduced a combination of genetic algorithm and gradient descent to train networks; the proposed HGADD-NN method achieved good results on several benchmark problems.

    Among the most common algorithms, particle swarm optimization (PSO)[7-8] is a popular choice compared with GA[9], the grey wolf optimization algorithm (GWO)[10], and ant colony optimization (ACO), because of its fewer parameters, efficient global search, and ease of parallelization. It has therefore been introduced into parameter selection for convolutional neural networks (CNNs). In Ref.[7], PSO was used to decide CNN parameters and was verified successfully on CIFAR classification, indicating that PSO is a feasible scheme for CNN parameter optimization. To handle the differing value scopes and data types of hyperparameters, Ref.[7] defined independent candidate particles for each parameter. However, randomly selected initial positions can trap the search in local optima, and weak local search ability entails a long convergence stage. Such weaknesses may lead to a biased model even at the cost of additional computation. Therefore, improving PSO or integrating it with other heuristic algorithms has become an active field. Ref.[11] proposed a distributed PSO (DPSO) mechanism in which particles are updated and allocated by a master node and the fitness of different particles is calculated on multiple slave nodes. This strategy automatically and globally searches for optimal hyperparameters and dramatically reduces training time compared with traditional PSO, thanks to the distributed training method. Unfortunately, this idea only uses distribution to tackle the long training time of the parameter configuration process; the cost of calculation is ignored.

    The key to this issue lies in balancing the diversity of the particle swarm against the cost of convergence. In this work, an improved PSO is proposed to help locate the best hyperparameter combination for a specific network architecture more easily. The specific contributions are as follows.

    (1) To address the difficulties caused by inconsistent data types and widely differing value scopes of neural network hyperparameters, an interval mapping method is adopted for data coding. The motivation of this design is to ensure the effectiveness of particles through a normalization strategy and to avoid local evolution of the swarm stemming from the random position selection of the original particles.

    (2) Mutation and crossover operations are introduced to increase the diversity of particles, so the proposed algorithm addresses the problems of raw PSO, such as being easily trapped in local optima and low convergence accuracy during hyperparameter search.

    The rest of this paper is organized as follows. Section 1 discusses related work. Section 2 analyzes the motivation and describes the implementation of the proposed algorithm in detail. Section 3 discusses the experimental environment and the implementation scheme on different network structures and datasets. Section 4 summarizes the paper.

    1 Related work

    In PSO[12], the search for an optimum is converted into the process of traversing an n-dimensional function with the particles of a swarm. Each potential solution to the given problem is viewed as a particle, and PSO obtains the best solution through the interaction of particles. By mapping the different value scopes of the hyperparameters into an n-dimensional function, the selection of the best parameter combination can be worked out by PSO.

    If there are N hyperparameters in a specific CNN, and the value of each parameter ranges from low to up, a particle can be denoted as {x1, x2, …, xN-1, xN}, and its performance is evaluated by a fitness function (often the loss function of the CNN). The swarm is initially constructed from particles generated randomly within the predefined ranges. The fitness value of each particle is calculated iteratively until it reaches its best position or meets a pre-set termination condition. The personal best position (pbest) and the global best position (gbest) are kept as intermediate results. The updating of particles is given by Eq.(1) and Eq.(2).
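    The two update equations did not survive extraction; the standard PSO update rules, written with the symbols defined below, are a likely reconstruction:

    v_i(t+1) = w·v_i(t) + c1·r1·(pbest_i − x_i(t)) + c2·r2·(gbest − x_i(t))    (1)

    x_i(t+1) = x_i(t) + v_i(t+1)    (2)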

    where v represents the velocity vector; w is the inertia weight used to balance local exploitation and global exploration; r1 and r2 are random vectors uniformly distributed within the range [low, up]; and c1 and c2 are acceleration coefficients, which are positive constants.

    The above scheme, described in Ref.[7], has been demonstrated to be helpful for parameter tuning. However, a CNN contains many kinds of hyperparameters with inconsistent data types (the number of convolutional kernels is an integer, the learning rate is a float, etc.). Moreover, the value scopes of parameters in different layers show enormous differences. A uniform coding format would blur the effective attributes of the particles and result in calculation errors, whereas a differentiated coding format would increase the complexity of the swarm. Additionally, when solving problems with many dimensions or with complex and large datasets, PSO gives poor-quality results and usually falls into the local optima trap.

    To cope with this situation, a series of solutions is proposed in this paper. Composite variable-type particles are mapped into an interval to keep particle updates effective and to prevent poor results caused by large differences between the sampling intervals of different parameters. Then, the selection, mutation, and crossover operations of GA are introduced to increase the diversity of particles and avoid getting stuck in local optima during the parameter search. Finally, an improved particle swarm optimization (IPSO) algorithm is proposed based on these two improvement measures.

    2 Methods

    2.1 Hybrid particle coding

    The CNN parameters selected for optimization in this paper include the number of convolution kernels, the learning rate, and the batch size, among others, as shown in Table 1.

    Table 1 Parameters to be optimized

    It can be seen from Table 1 that the int-type parameters range in scope from 6 (Parm1) to 120 (Parm4), and a float-type parameter (Parm6) is also involved. Such a variety of values may bias the direction of the updating procedure and, in severe cases, prevent convergence. An effective way to remove this difference is therefore needed. To this end, the PSO advocated in Refs[7-8] can only characterize particles independently for each parameter, so the initial value of each particle occurs randomly within a self-defined scope. Although the differences between parameter scopes then need not be considered, such a treatment may generate unreachable positions during the search. For example, in a two-dimensional space composed of Parm2 and Parm3, the value of Parm2 cannot reach [1,3] or [8,11], marked as ‘?’ in Fig.1, because of the limitations of the random function. Consequently, the generated particles do not fully cover the initial swarm, and the iteration can become trapped in local optima.

    To avoid such problems, a random sampling function following a uniform distribution is introduced in our scheme, which unifies the coding of all parameters. Meanwhile, Eq.(3) is used to map the scopes of the parameters onto a normalized uniform interval.

    where the lower bound and upper bound of a parameter are denoted as lowArea and upArea, respectively. The distribution of the initial particles is presented as ‘?’ in Fig.1. It is obvious that the particles are scattered evenly over the whole interval, so the global coverage of the search space and the completeness of the swarm are well guaranteed.
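    Eq.(3) itself is not reproduced in this text. Below is a minimal sketch of the interval mapping, assuming a standard min-max normalization between lowArea and upArea; the function names and example bounds are illustrative, not from the original paper.

    ```python
    import random

    def encode(value, low_area, up_area):
        """Map a raw hyperparameter value into the normalized interval [0, 1].
        Assumed form of Eq.(3): simple min-max normalization."""
        return (value - low_area) / (up_area - low_area)

    def decode(norm, low_area, up_area, is_int=False):
        """Map a normalized coordinate back into the parameter's own scope."""
        value = low_area + norm * (up_area - low_area)
        return int(round(value)) if is_int else value

    def init_particle(param_bounds):
        """Build one particle: every dimension is sampled uniformly in [0, 1],
        so all parameters share one scale regardless of their original scopes."""
        return [random.uniform(0.0, 1.0) for _ in param_bounds]

    # Example with mixed int/float parameters (bounds are illustrative only).
    bounds = [(6, 120, True),       # e.g. number of convolution kernels (int)
              (1e-4, 1e-1, False)]  # e.g. learning rate (float)
    particle = init_particle(bounds)
    real_values = [decode(x, lo, up, is_int)
                   for x, (lo, up, is_int) in zip(particle, bounds)]
    print(particle, real_values)
    ```

    Coding on a shared [0, 1] interval keeps the particle update effective even when the raw parameter scopes differ by several orders of magnitude.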

    Fig.1 Schematic diagram of particle distribution before and after optimization

    2.2 Genetic manipulation

    In this paper, in order to increase the diversity of particles during updating, the selection, crossover, and mutation operations of GA are added to the IPSO. The crossover process of particles is shown in Fig.2. Suppose there are N particles in the swarm, denoted as {X1, X2, X3, …, XN-1, XN}. After evaluation, the fitness values of the particles are recorded as {F(X1), F(X2), F(X3), …, F(XN-1), F(XN)}. Firstly, the two particles with the highest fitness values are selected and recorded as Xi and Xj. The crossover point P is calculated using Eq.(4). Taking N = 6 as an example, the selection result is shown in Fig.2(a). Then, using the crossover point P as the boundary, the particles Xi and Xj are divided into four parts: Xi-front, Xi-after, Xj-front, and Xj-after, as shown in Fig.2(b). Finally, Xi-front is spliced with Xj-after, and Xj-front is spliced with Xi-after, forming two new particles Xi-new and Xj-new, as shown in Fig.2(c).

    Fig.2 An example illustration of the crossover process

    The schematic diagram of the particle mutation is presented in Fig.3, again taking N = 6 as an example. Firstly, the complete information of the initial particle is obtained. Then, an integer in [1, N] is randomly generated, where N denotes the total number of optimized parameters. Finally, a randomly generated value within the scope of the parameter at that position replaces the original value, completing the mutation operation.
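    A minimal sketch of these two operations, assuming particles are plain Python lists coded on the normalized interval of Section 2.1; the function names and the fallback of choosing the crossover point at random are illustrative, since Eq.(4) is not reproduced here.

    ```python
    import random

    def crossover(xi, xj, p=None):
        """Single-point crossover: split both parents at position p and swap tails.
        If p is not supplied, pick it at random (Eq.(4) of the paper is assumed)."""
        if p is None:
            p = random.randint(1, len(xi) - 1)
        xi_new = xi[:p] + xj[p:]   # Xi-front spliced with Xj-after
        xj_new = xj[:p] + xi[p:]   # Xj-front spliced with Xi-after
        return xi_new, xj_new

    def mutate(x):
        """Replace one randomly chosen dimension with a fresh value from its scope.
        On the normalized coding, every dimension's scope is simply [0, 1]."""
        x = list(x)
        pos = random.randrange(len(x))      # random position among the N parameters
        x[pos] = random.uniform(0.0, 1.0)   # new value inside that parameter's scope
        return x

    # Example with N = 6 parameters per particle.
    xi = [random.uniform(0, 1) for _ in range(6)]
    xj = [random.uniform(0, 1) for _ in range(6)]
    child_i, child_j = crossover(xi, xj)
    child_i = mutate(child_i)
    ```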

    Fig.3 Schematic diagram of mutation process

    For the raw PSO method, the particle position participating in the next iteration is determined by the optimal solution obtained in the current search round and the historical optimal solution. The update domain of raw PSO can be described by the zone shown with dashed lines in Fig.4. The updating of raw PSO is limited to this domain and stops only at the optimum of this local domain. If the global optimum lies outside this interval, the final result will deviate from the correct one.

    The GA operations help to jump out of this limitation by introducing new particles. They enhance the diversity of particles across the iterations of the swarm and make the global optimum easier to locate.

    Fig.4 Comparison of particle positions before and after optimization

    2.3 Overall scheme

    The flow chart of hyperparameter configuration based on the proposed IPSO is shown in Fig.5; the procedure consists of the following three steps.

    Step 1 Initialization of the swarm and its particles. The learning factors c1 and c2, the weight coefficient w, the number of particles, and the maximum particle speed are initialized at the beginning. At the same time, space is allocated to store the local and global optimal values, and a specified number N of particles is randomly generated to form a swarm, according to the number and scope of the parameters to be optimized.

    Step 2 Calculation of the fitness value of the particles. The values of each particle are sent in turn to the designated neural network for training and testing, and the accuracy obtained is taken as the fitness value of that particle and recorded (a minimal sketch of this evaluation is given after Step 3).

    Step 3 Particle updating. According to Algorithm 1, the particles in the current swarm are updated to complete the subsequent iteration.
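    A minimal sketch of the fitness evaluation in Step 2, reusing the decode helper from the Section 2.1 sketch; build_and_train is an assumed user-supplied callable (hyperparameters in, test accuracy out), not an interface defined in the original paper.

    ```python
    def fitness(particle, bounds, build_and_train):
        """Step 2: decode a particle into concrete hyperparameters, train the
        designated network with them, and return the test accuracy as fitness."""
        hyperparams = [decode(x, lo, up, is_int)          # decode() from the Section 2.1 sketch
                       for x, (lo, up, is_int) in zip(particle, bounds)]
        accuracy = build_and_train(hyperparams)           # e.g. train the chosen CNN and test it
        return accuracy
    ```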

    Algorithm 1 Particle updating
    Input: Primitive swarm D(X), fitness function F(X)
    Output: The updated swarm D(X)
    1: function update(D(X))
    2:   initialize max1 and max2
    3:   for i in length(D(X))
    4:     evaluate F(Xi)
    5:     set max1 and max2
    6:   end for
    7:   update gbest and pbest
    8:   new1, new2 = crossover(max1, max2)
    9:   D(X) = mutate(new1, new2)
    10:  return D(X)
    11: end function
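    A minimal Python sketch of Algorithm 1, reusing the crossover and mutate helpers from Section 2.2 and the fitness function above; how the crossed-over offspring re-enter the swarm is an assumption, since the pseudocode only states that the swarm is rebuilt from the crossed and mutated particles.

    ```python
    def update(swarm, fitness_fn):
        """Algorithm 1: evaluate the swarm, select the two best particles,
        then inject crossed-over and mutated offspring to diversify the swarm."""
        scores = [fitness_fn(p) for p in swarm]                 # lines 3-6: evaluate F(Xi)
        order = sorted(range(len(swarm)), key=lambda i: scores[i], reverse=True)
        i_max1, i_max2 = order[0], order[1]                     # set max1 and max2

        # Line 7: gbest/pbest bookkeeping happens here in the full PSO loop.

        new1, new2 = crossover(swarm[i_max1], swarm[i_max2])    # line 8
        swarm = list(swarm)
        swarm[i_max1] = mutate(new1)                            # line 9 (assumed re-insertion)
        swarm[i_max2] = mutate(new2)
        return swarm
    ```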

    Fig.5 Flow chart of the IPSO

    3 Experiments and results analysis

    3.1 Implementation details

    The initialization of the algorithm is completed according to the parameters shown in Table 2. The proposed algorithm is verified under different network structures and datasets. The experimental environment of this paper is shown in Table 3.

    Table 2 IPSO algorithm related parameters

    Table 3 Experimental environment setting

    3.2 Experimental results and analysis

    3.2.1 Under different network structures

    In this work, LeNet-5, ResNet-18, VGG-16, MobileViT, and long short-term memory (LSTM) are selected as test objects. MNIST is used for LeNet-5 and LSTM, Flowers-Recognition for MobileViT, and the remaining networks are tested with CIFAR10. This research uses the parameters searched by IPSO and by PSO to complete the training task. The changes in accuracy with training epoch are shown in Fig.6. It can be seen that for the same training epoch, no matter which neural network is used, the parameter configuration found by IPSO makes the neural network converge faster than that found by PSO. In addition, Fig.6(c) and Fig.6(d) show a more apparent difference in accuracy.

    Fig.6 Accuracy rate change with Epoch

    Using the parameters searched by IPSO and by PSO to complete the training, the final accuracies are compared in Table 4. It is clear that, after the structure is tuned by IPSO, the CNN models obtain higher accuracy than when tuned by PSO, with the accuracy rates increasing by about 0.4%, 1.9%, and 4.5%, respectively. On the ViT model, the network built by IPSO completes the image classification task with an accuracy of 66.21%, about 13% higher than that of raw PSO. IPSO and PSO achieve similar accuracy on the recurrent neural network. In the ViT experiment, IPSO and PSO differ by only 1 in parameter 1, but the final network structures differ significantly because of layer-by-layer accumulation. Meanwhile, although the Flowers-Recognition dataset used in this paper is small, different parameter configurations still bring obvious differences in performance. LSTM is more suited to NLP applications, and this paper considers only two network-related parameters for it: the hidden state size and the number of recurrent layers. Although varying them still changes the number of parameters, the effect on feature extraction is not obvious and can be considered negligible.

    Table 4 Comparison of the accuracy of different algorithms on different network structures

    3.2.2 Under different datasets

    To verify the effectiveness of the proposed IPSO for hyperparameter search, three datasets, MNIST, Fashion-MNIST, and CIFAR10, are selected.

    (1) The MNIST dataset includes 70 000 handwritten digit images. The training set contains 60 000 samples and the test set contains 10 000 samples. The dataset is categorized into 10 classes corresponding to the 10 numerals.

    (2) The Fashion-MNIST dataset contains 10 categories of images: T-shirt, trouser, pullover, dress, coat, sandal, shirt, sneaker, bag, and ankle boot. The training set has 60 000 samples and the test set has 10 000 samples.

    (3) The CIFAR-10 dataset includes 60 000 color images in 10 categories, with 6000 images per category. The categories are aircraft, cars, birds, cats, deer, dogs, frogs, horses, boats, and trucks. The training set contains 50 000 samples (83% of the dataset) and the test set contains 10 000 samples (17%).

    With the parameters searched by IPSO on the above three datasets, the trained neural networks reach accuracies of 99.58%, 93.39%, and 78.96%, respectively. Compared with PSO, the accuracy increases by about 0.1%, 0.5%, and 2%, respectively. Compared with SSO[13], the same accuracy is achieved on MNIST, while on Fashion-MNIST and CIFAR10 the accuracy increases by 0.36% and 5.83%, respectively. Compared with DPSO[11], the model accuracy obtained with IPSO on MNIST and Fashion-MNIST is slightly higher; the final results are shown in Table 5. In addition, to compare with GWO[10], the network's optimization parameters are adjusted to be consistent with those in GWO, including batch size, epochs, and optimizer. The results are shown in Table 6. From the table, it can be seen that the parameter combination found by the IPSO algorithm achieves higher test accuracy on the Fashion-MNIST dataset than that found by the GWO algorithm.

    Table 5 Comparison of the accuracy of different algorithms on different datasets

    Table 6 Accuracy comparison on the Fashion-MNIST dataset

    4 Conclusion

    This paper proposes an IPSO algorithm that integrates GA operations to address the issues of traditional PSO, such as easily falling into local optima and low convergence accuracy during the search. The performance is verified using different types of neural network models and datasets. Experimental results show that the proposed IPSO achieves higher accuracy than traditional PSO on CNN and ViT models tested with the Fashion-MNIST and CIFAR10 datasets. Moreover, with the optimized parameter configurations, the models are more stable and converge faster during training. However, this paper considers only a limited number of parameters to be optimized; other parameters affecting the neural network structure, such as the depth and number of convolutional layers, could also be searched for an optimal solution. Therefore, in future work, it is worth considering the fusion of parameters at different levels to find the optimal network structure and model parameters for better performance.
