
    Hyperparameter on-line learning of stochastic resonance based threshold networks

Chinese Physics B, 2022, Issue 8

    Weijin Li(李偉進(jìn)), Yuhao Ren(任昱昊), and Fabing Duan(段法兵)

College of Automation, Qingdao University, Qingdao 266071, China

Keywords: noise injection, adaptive stochastic resonance, threshold neural network, hyperparameter learning

    1. Introduction

Gradient-based optimizers are commonly used for training neural networks, but a necessary condition for successfully optimizing the network performance is that the activation functions are continuously differentiable. However, piecewise-linear (e.g., ReLU)[1,2] and hard-limited (e.g., binarized) activations[3,4] have gained increasing attention in recent years, because they allow much deeper networks to be trained or save considerable memory storage and computation.[1–8] Training a neural network with nondifferentiable or zero-gradient activation functions therefore becomes a tricky problem for available gradient-based optimizers.

A natural way of training threshold neural networks with the gradient-descent based back-propagation algorithm is to approximate the nondifferentiable threshold neuron by a smoothed activation function,[6,9–13] while performing the network testing with the threshold activation functions and the well-trained network weights. For instance, a conventional method is to substitute the sigmoid function 1/(1 + e^{−λu}) with a large parameter λ > 0 (e.g., λ = 10) for the threshold neuron.[9–11] Unfortunately, the generalization performance of the threshold neural network is unsatisfactory in the testing phase, because the network weights are inadequately trained in the saturated regimes of the sigmoid function.[6,9–13]

Recently, the approach of noise injection has become a useful alternative for optimizing artificial neural networks.[4,6,8,12–22] It is interesting to note that the benefits of injecting noise into threshold neural networks can be viewed as a type of stochastic resonance effect,[22] because there is likewise a nonzero noise level that improves the performance of nonlinear systems.[6,12–14,23–33] By injecting artificial noise into the saturated regime of the activation function[4] or by smoothing the input-output characteristic of hard-limiting neurons with an ensemble of mutually independent noise components,[6,12,13] the nondifferentiable threshold network acquires properly defined gradients. Then, the gradient-based back-propagation algorithm can be performed successfully in the transformed threshold neural network. Meanwhile, a stochastic resonance based threshold neural network, built on the distribution of the noise injected into the hidden layer, has been proposed to make the back-propagation training algorithm applicable again.[6,12,13] Furthermore, using a stochastic gradient descent (SGD) optimizer, a noise-boosted back-propagation training algorithm was realized in this kind of stochastic resonance based threshold network by adaptively learning both the weights and the noise levels.[13,29]

The introduction of injected noise into the threshold neural network enlarges the dimension of the parameter space, and also poses a challenge for the SGD optimizer to find a good optimum in the non-convex landscape of the loss function.[34–38] This is because the SGD optimizer is sometimes trapped in a flat region of the loss landscape, where the gradients with respect to the noise levels and network weights approach zero.[36–38] In this paper, the noise-boosted Adam optimizer is demonstrated to train the stochastic resonance based threshold neural network more effectively than the SGD optimizer. It is shown that, with the powerful hyperparameter on-line learning capacity of the Adam optimizer,[34–38] the designed threshold network attains a much lower mean square error (MSE) for function approximation and a higher accuracy for image classification. Moreover, in the testing phase of the stochastic resonance based threshold neural network, the practical realization of the threshold network trained by the Adam optimizer requires less computational cost than that of the network optimized by the SGD optimizer. The benefits of injected noise, manifested by the Adam optimizer through a different noise level in each hidden neuron, also contribute to the research on exploiting adaptive stochastic resonance effects in machine learning.

    2. Main results

    2.1. Stochastic resonance based threshold network

Consider an N×K×M feed-forward threshold neural network with three layers.[6,12,13,29,39,40] The input layer receives the data x ∈ ℝ^{N×1}, and the weight matrix W ∈ ℝ^{K×N} connects the hidden layer with the input one. The hidden layer consists of K threshold neurons[41] described as

$$\psi(u)=\begin{cases}1, & u\ge\theta_k,\\[2pt] 0, & u<\theta_k,\end{cases}\tag{1}$$

with an adjustable threshold parameter θ_k for k = 1, 2, ..., K. The weight matrix U ∈ ℝ^{M×K} connects the K threshold neurons in the hidden layer with M linear activation functions in the output layer, and y ∈ ℝ^{M×1} denotes the network output vector.
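To make the data flow concrete, the following minimal NumPy sketch (our illustration, not the authors' code; the 0/1 step convention follows Eq. (1), and all names are ours) evaluates one forward pass of the hard-threshold network:

```python
import numpy as np

def forward_threshold(x, W, U, theta):
    """One forward pass of the N x K x M threshold network:
    hidden neuron k outputs psi(a_k) = 1 if a_k >= theta_k, else 0."""
    a = W @ x                          # hidden pre-activations, shape (K,)
    h = (a >= theta).astype(float)     # K hard-threshold neurons, Eq. (1)
    return U @ h                       # M linear output units
```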

It is seen from Fig. 1(a) that the threshold activation function ψ(u) of Eq. (1) is nondifferentiable at u = θ_k and has zero gradient for u ≠ θ_k. Based on the noise injection method,[6,12,13,29,39,40] we can replace the activation function ψ(u) in Eq. (1) with a differentiable one. This is done by injecting a sufficiently large number T of mutually independent noise variables η_t with the common probability density function (PDF) f_η(η), as shown in Fig. 1(b). Then, a continuously differentiable activation function is deduced as the substitution, i.e., the expectation

$$h(x)=\mathrm{E}[\psi(x+\eta)]=\int_{\theta_k-x}^{\infty}f_\eta(\eta)\,\mathrm{d}\eta.\tag{2}$$

Fig. 1. (a) Threshold activation function ψ(u) in Eq. (1) with the threshold parameter θ_k, (b) an ensemble of T noise samples with a common standard deviation σ, and (c) the noise-smoothed activation function h(x) with learnable parameters θ_k and σ.

It is interesting to note that, for the designed threshold neural network with the hidden neuron given by Eq. (2), the aforementioned difficulty of training threshold neural networks can be overcome, because the activation function h(x) in Eq. (2) has the well-defined gradient ∂h(x)/∂x = f_η(θ_k − x). Furthermore, the effective learning ability of the designed threshold neural network is extended by the introduction of the learnable noise levels σ_k and thresholds θ_k in Eq. (2). During the training process, the parameters σ_k and θ_k, as well as the network weight matrices W and U, are all optimized by the gradient-based learning rule. Specifically, the corresponding gradients with respect to σ_k and θ_k are

$$\frac{\partial h(x)}{\partial\sigma_k}=\int_{\theta_k-x}^{\infty}\frac{\partial f_\eta(\eta)}{\partial\sigma_k}\,\mathrm{d}\eta,\qquad
\frac{\partial h(x)}{\partial\theta_k}=-f_\eta(\theta_k-x).$$

Thus, the back-propagation learning algorithm can be successfully implemented for training the designed threshold neural network.
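For the zero-mean Gaussian injected noise considered in this paper, η ~ N(0, σ_k²), the expectation of Eq. (2) and all three gradients have closed forms. A minimal SciPy sketch (our illustration; function and variable names are assumptions):

```python
import numpy as np
from scipy.stats import norm

def smoothed_neuron(x, theta, sigma):
    """Noise-smoothed activation h(x) of Eq. (2) and its gradients for
    zero-mean Gaussian injected noise eta ~ N(0, sigma^2)."""
    z = (theta - x) / sigma
    h = norm.sf(z)                         # h(x) = P(eta >= theta - x)
    dh_dx = norm.pdf(z) / sigma            # = f_eta(theta - x)
    dh_dtheta = -norm.pdf(z) / sigma       # = -f_eta(theta - x)
    dh_dsigma = norm.pdf(z) * z / sigma    # gradient w.r.t. noise level
    return h, dh_dx, dh_dtheta, dh_dsigma
```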

2.2. Motivating example

Let {x(ℓ), s(ℓ)}_{ℓ=1}^{L} denote the L examples of the training set used to train the designed threshold network in a supervised learning manner, where s ∈ ℝ^{M×1} represents the desired response vector. The loss function of the empirical mean square error (MSE) can be calculated as

$$J_{\rm mse}=\frac{1}{L}\sum_{\ell=1}^{L}\big\|\,y(\ell)-s(\ell)\,\big\|^{2},\tag{3}$$

where ‖·‖ denotes the Euclidean norm. Let Θ ∈ {W, U, θ_k, σ_k} denote a learnable parameter of the designed threshold network; the plain stochastic gradient descent (SGD) optimizer updates the parameter Θ by the learning rule

$$\Theta(n+1)=\Theta(n)-\alpha\,\frac{\partial J_{\rm mse}}{\partial\Theta(n)},\tag{4}$$

where α > 0 is the learning rate.
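A minimal sketch of Eqs. (3) and (4) in NumPy (an illustration under our naming; columns of y and s hold the per-example outputs and desired responses):

```python
import numpy as np

def mse_loss(y, s):
    """Empirical MSE of Eq. (3): y and s have shape (M, L), one column
    per training example."""
    return np.mean(np.sum((y - s) ** 2, axis=0))

def sgd_step(param, grad, alpha=0.01):
    """Plain SGD update of Eq. (4) for any learnable parameter
    Theta in {W, U, theta_k, sigma_k}."""
    return param - alpha * grad
```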

Then, the 1×K×1 (K = 10) stochastic resonance based threshold neural network is trained to fit the training set {x(ℓ), s(ℓ)}_{ℓ=1}^{L} sampled from the target function of Eq. (5). Here, the learning rate takes α = 0.01, the initial noise levels are σ_k(0) = 1, and the initial threshold parameters θ_k(0) and the initial weight vectors W(0) and U(0) are uniformly distributed in the interval [−1, 1]. The learning process of J_mse of the designed threshold network via the SGD optimizer is shown in Fig. 2 for different initial values of the learnable parameters.

It is important to note from Fig. 2 that the SGD optimizer is sensitive to the initial values of the parameters. For example, setting the initial weight W_{1,1} = −5 and the noise level σ_8 = 1, it is seen in Fig. 2 that the MSE J_mse of the designed threshold network converges to a local minimum of J_mse = 1.149×10^{−3} after 5000 epochs of training. However, when the initial values W_{1,1} = −2.3 and σ_8 = 0.2 are not properly set, Fig. 2 shows that the learning trajectory of the MSE J_mse (▽) is stuck in a flat region of the network performance surface, because the gradient there is almost zero in every direction. Here, besides W_{1,1} and the noise level σ_8, the other parameters Θ ∈ {W, U, θ_k, σ_k} are fixed.

Next, we employ the Adam optimizer to update the parameter Θ of the stochastic resonance based threshold neural network by the learning rule[34]

$$\begin{aligned}
m(n)&=\beta_1 m(n-1)+(1-\beta_1)\,g(n),\\
v(n)&=\beta_2 v(n-1)+(1-\beta_2)\,g(n)\odot g(n),\\
\hat m(n)&=\frac{m(n)}{1-\beta_1^{\,n}},\qquad \hat v(n)=\frac{v(n)}{1-\beta_2^{\,n}},\\
\Theta(n+1)&=\Theta(n)-\alpha\,\frac{\hat m(n)}{\sqrt{\hat v(n)}+\varepsilon},
\end{aligned}\tag{6}$$

where g(n) = ∂J_mse/∂Θ(n) denotes the gradient, β_1 = 0.9 and β_2 = 0.999 are attenuation coefficients, 0 < ε ≪ 1 is the regularization parameter, m(n) denotes the first-order moment with its corrected value m̂(n), and v(n) is the second-order moment estimate with its corrected value v̂(n). The operator ⊙ denotes the Hadamard product. Likewise, starting from the initial values W_{1,1} = −2.3 and σ_8 = 0.2, it is seen in Fig. 2 that the MSE J_mse (+) of the designed network trained by the Adam optimizer escapes from the flat region of the network performance surface and finally converges to the minimum of J_mse = 1.109×10^{−3}. Here, the other parameters Θ ∈ {W, U, θ_k, σ_k} (not including W_{1,1} and σ_8) are fixed at the converged values found by the SGD optimizer. This result clearly shows that the Adam optimizer is more efficient than the SGD optimizer for optimizing the designed stochastic resonance based threshold network.
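A minimal sketch of one Adam update of Eq. (6), following the standard formulation of Ref. [34] (all names here are our assumptions):

```python
import numpy as np

def adam_step(param, grad, m, v, n, alpha=0.01,
              beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update of Eq. (6); n is the 1-based iteration count.
    Returns the updated parameter and the two moment accumulators."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad   # Hadamard product g ⊙ g
    m_hat = m / (1 - beta1 ** n)                # bias-corrected 1st moment
    v_hat = v / (1 - beta2 ** n)                # bias-corrected 2nd moment
    return param - alpha * m_hat / (np.sqrt(v_hat) + eps), m, v
```

In the designed network, the same rule is applied to every entry of W, U, θ_k, and σ_k, so the noise levels are learned on-line alongside the weights.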

Fig. 2. (a) Performance surface and (b) the corresponding contour of the MSE J_mse versus the weight W_{1,1} and the noise level σ_8 of the eighth hidden neuron. The learning curves of J_mse of the designed threshold network are also illustrated for different optimizers.

Furthermore, it is shown in Fig. 3 that the Adam optimizer can optimize the designed threshold network to a smaller MSE J_mse = 1.137×10^{−5} after 1600 epochs of training. Here, the performance surface is still illustrated versus the weight W_{1,1} and the injected noise level σ_8, but the optimal value of J_mse = 1.137×10^{−5} is searched over the whole space of the learnable network parameters by the Adam optimizer. The parameters Θ ∈ {W, U, θ_k, σ_k}, including W_{1,1} and the noise level σ_8, are simultaneously updated by the learning rule of Eq. (6).

It is emphasized that each hidden neuron in Eq. (2) is associated with one noise level σ_k for k = 1, 2, ..., K. Thus, in the training process of the designed threshold network, the injected noise manifests its beneficial role as the noise level σ_k converges to a nonzero (local) optimum. It is seen in Figs. 2 and 3 that, for a given weight W_{1,1} near 1.2, the conventional stochastic resonance phenomenon can be observed as the noise level σ_8 increases: when σ_8 reaches the corresponding nonzero (local) optimum, the MSE J_mse attains its minimum. Note that the resonant behavior of the designed threshold network, characterized by the MSE J_mse, is induced by the whole bundle of K noise levels σ_k in the hidden neurons; the stochastic resonance phenomenon shown in Fig. 2 or Fig. 3 only describes a slice of the MSE J_mse versus the noise level σ_8 for visualization. In view of the learning curves of the K noise levels σ_k, Fig. 4 also presents the distinct characteristic of the adaptive stochastic resonance effect in the training phase. The learning curves of the injected noise levels σ_k in Fig. 4 show that the noise levels start from the same initial value of unity and converge to different but nonzero optima. This fact also validates the practicability of the proposed noise-boosted Adam optimizer of Eq. (6) in adaptively optimizing the noise levels during the training phase.

Fig. 3. (a) Performance surface and (b) the corresponding contour of the MSE J_mse versus the weight coefficient W_{1,1} and the noise level σ_8 of the eighth hidden neuron. The learning curve of J_mse is also illustrated for the Adam optimizer.

However, the hidden neuron h(x) in Eq. (2) is a limit expression, which ensures the success of training but cannot be realized in the practical testing of the threshold neural network. In testing experiments, the hidden neuron h(x) is realized as

$$h_T(x)=\frac{1}{T}\sum_{t=1}^{T}\psi(x+\eta_t),\tag{7}$$

where a finite number T of threshold activation functions of Eq. (1) are activated by T mutually independent and identically distributed noise components η_t, respectively. For 10^3 testing data x(ℓ) equally spaced in the interval [−2, 2], the trained threshold neural network with the hidden neurons h(x) realized by Eq. (7) is simulated 10^2 times. In each trial, for the k-th hidden neuron h(x) (k = 1, 2, ..., K), T independent noise components η_t are randomly generated at the noise level σ_k converged to in the training phase. Then, the outputs of the designed threshold network are averaged as an approximation to the target function of Eq. (5).
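A minimal sketch of the test-phase realization of Eq. (7) for one hidden neuron, assuming Gaussian injected noise at the trained level σ_k (names and the random-number setup are ours):

```python
import numpy as np

rng = np.random.default_rng(0)

def realized_neuron(x, theta, sigma, T):
    """Test-phase realization h_T(x) of Eq. (7): the average of T
    hard-threshold units, each driven by an i.i.d. Gaussian noise
    sample eta_t ~ N(0, sigma^2)."""
    eta = rng.normal(0.0, sigma, size=T)   # T injected noise components
    return np.mean(x + eta >= theta)       # (1/T) * sum_t psi(x + eta_t)
```

As T grows, h_T(x) converges to the smoothed activation h(x) of Eq. (2), which is why larger T reduces the testing MSE in Fig. 5(a).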

Fig. 4. Learning curves of the K = 10 noise levels σ_k in the hidden layer of the designed stochastic resonance based threshold network. Other parameters are the same as those in Fig. 3.

For different numbers T, the experimental results of the MSE J_mse are shown in Fig. 5(a) for testing the threshold networks trained by both optimizers. It is seen from Fig. 5(a) that, to achieve the same MSE J_mse, the designed threshold network trained by the Adam optimizer needs a smaller number T of threshold activation functions of Eq. (1), and hence of noise components η_t. For instance, for a statistical mean value of the MSE J_mse = 4×10^{−4}, T = 3×10^3 threshold functions of Eq. (1) are required for the network trained by the Adam optimizer, whereas T = 10^4 threshold functions are required for the SGD optimizer. In Fig. 5(b), using T = 10^4 threshold functions assisted by the same number of injected noise samples, the output (blue dashed line) of the threshold network well trained by the Adam optimizer is plotted. For comparison, the target function (red solid line) of Eq. (5) is also illustrated. It is seen in Fig. 5(b) that the trained threshold neural network performs well in approximating the target one-dimensional function in the testing phase. It is emphasized that, in the practical realization of the designed threshold network, the beneficial role of injected noise is accomplished by averaging an ensemble of threshold functions driven by the same number of injected noise samples in each hidden neuron, whereby the application of adaptive stochastic resonance in threshold neural networks is evidently confirmed.

Fig. 5. (a) Experimental results of the MSE J_mse for 10^3 testing points of the target one-dimensional function of Eq. (5) versus the number T. Here, 10^2 trials are realized for each experimental point, and the ensemble of outputs of T threshold functions activated by T noise samples is regarded as the realization of the hidden neuron of Eq. (7). (b) The output of the threshold network trained by the Adam optimizer as the approximation (blue dashed line) of the target function of Eq. (5). For comparison, the target function (red solid line) of Eq. (5) is also plotted.

    2.3. Two-dimensional function approximation

Furthermore, we consider the test of the two-dimensional benchmark function f(x_1, x_2) of Eq. (8).

The stochastic resonance based threshold neural network is designed with the size 2×50×1. The 16×16 training set {x(ℓ), s(ℓ)}_{ℓ=1}^{L}, sampled from the target function of Eq. (8), is equally spaced over the range [−3, 3]×[−3, 3]. After 2×10^4 training epochs, the MSEs J_mse reach 0.1003 and 7×10^{−4} for the SGD optimizer and the Adam optimizer, respectively. The Adam optimizer is thus still superior to the SGD optimizer in training the designed threshold network to approximate the two-dimensional function of Eq. (8). For the 32×32 testing set sampled from the target function of Eq. (8) over the range [−3, 3]×[−3, 3], Fig. 6(a) illustrates the outputs of the trained threshold network as the approximation (patched surface) of the two-dimensional function f(x_1, x_2). The experimental MSEs J_mse are 0.1012 and 7.5×10^{−4} for the SGD optimizer and the Adam optimizer, respectively. In particular, Fig. 6(b) illustrates the relative error |y − f(x_1, x_2)|/Δ between the output y of the threshold network trained by the Adam optimizer and the testing data. Here, the maximum relative error is max |y − f(x_1, x_2)|/Δ = 5.188×10^{−3}, with the maximum difference Δ = max f(x_1, x_2) − min f(x_1, x_2) = 13.4628 for the target function of Eq. (8) over the range [−3, 3]×[−3, 3].

Fig. 6. (a) Network outputs y of the threshold neural network trained by the Adam optimizer as the approximation (patched surface) to the 32×32 testing data of the two-dimensional function f(x_1, x_2) of Eq. (8). (b) The corresponding relative error |y − f(x_1, x_2)|/Δ between the threshold network output and the testing data. For reference, the maximum difference Δ = max f(x_1, x_2) − min f(x_1, x_2) = 13.4628 is taken for the target function of Eq. (8) over the range [−3, 3]×[−3, 3].

    2.4. Real world data set of the function regression

We also validate the trained N×20×1 threshold network on nine real-world data sets.[42–50] The dimension N and the length L of the data sets are listed in Table 1, and the computer is equipped with an Intel Core i7-6700 CPU @ 3.40 GHz and 16 GB DDR4 RAM @ 2133 MHz. Following the 80/20 rule, 80% of the data are used for training, while 20% are employed to test the trained threshold neural network. Table 2 reports the training and testing results of the MSE J_mse of the stochastic resonance based threshold neural network for the two considered optimizers. It is seen in Table 2 that the Adam optimizer optimizes the designed threshold network to a lower MSE J_mse than the SGD optimizer does on both the training and testing data sets. The hyperparameter on-line learning of the stochastic resonance based threshold network via the Adam optimizer enables more precise control of the injected noise levels in the hidden layer, leading to significantly improved performance of the designed threshold network.

    Table 1. Feature dimension and length of data sets.

Table 2. Experimental results of the MSE J_mse of the designed threshold networks.

    2.5. Recognition of handwritten digits

We further incorporate the expectation expression of Eq. (2) into a deep convolutional neural network for image classification. The architecture contains 20 convolutional filters of size 9×9 with same padding, a hidden layer with 20 neurons given by Eq. (2), a factor-2 pooling layer, and a fully connected layer with 10 neurons. The benchmark MNIST data set consists of a training set of 6×10^4 and a testing set of 10^4 gray-scale images (28×28) representing the handwritten digits from 0 to 9. Here, we employ 10^4 images with a training-to-testing ratio of 4:1. Using the SGD and Adam optimizers, the training accuracies of the designed convolutional neural network versus the training epoch number are shown in Fig. 7. It is seen in Fig. 7 that the network optimized by the Adam optimizer generally achieves a higher accuracy for recognizing handwritten digits. For the well-trained convolutional neural network after 50 training epochs, using T = 10 threshold functions assisted by the same number of injected noise samples, the experimental accuracy on the test set is 98.41% for the Adam optimizer and 97.63% for the SGD optimizer. For comparison, the fully connected back-propagation network[29] and the support vector machine[51] achieve accuracy rates of 97% and 94.1%, respectively. The accuracy of 98.41% obtained by the convolutional threshold neural network trained by the Adam optimizer indicates a very satisfactory efficiency of the deep learning method. These results demonstrate that the injection of noise into threshold neurons facilitates the optimization of the designed deep convolutional neural network, and that the hyperparameter on-line learning of the Adam optimizer can also train deeper stochastic resonance based threshold networks to a competitive performance in image classification.
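A minimal PyTorch sketch of such an architecture (our reconstruction under stated assumptions, not the authors' code: the smoothed activation assumes Gaussian injected noise, the per-channel sharing of θ_k and σ_k is our choice, and layer sizes follow the stated 28×28 inputs):

```python
import torch
import torch.nn as nn

class SmoothedThreshold(nn.Module):
    """Hidden neurons of Eq. (2) for Gaussian injected noise:
    h(x) = Phi((x - theta_k) / sigma_k), with a learnable threshold
    theta_k and noise level sigma_k per channel (an assumption)."""
    def __init__(self, channels):
        super().__init__()
        self.theta = nn.Parameter(torch.zeros(channels))
        self.log_sigma = nn.Parameter(torch.zeros(channels))  # keeps sigma > 0

    def forward(self, x):                          # x: (batch, channels, H, W)
        theta = self.theta.view(1, -1, 1, 1)
        sigma = self.log_sigma.exp().view(1, -1, 1, 1)
        return torch.special.ndtr((x - theta) / sigma)  # standard normal CDF

model = nn.Sequential(
    nn.Conv2d(1, 20, kernel_size=9, padding=4),    # 20 filters, "same" padding
    SmoothedThreshold(20),                         # noise-smoothed neurons
    nn.MaxPool2d(2),                               # factor-2 pooling
    nn.Flatten(),
    nn.Linear(20 * 14 * 14, 10),                   # fully connected, 10 classes
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
```

Because the noise levels enter the model as ordinary parameters, the Adam optimizer learns them on-line together with the convolutional weights, which is the hyperparameter learning described above.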

Fig. 7. Learning curves of the accuracies of the designed convolutional neural network on the MNIST data set.

    3. Conclusion

In summary, in order to optimize the threshold neural network in the training phase, we replace the zero-gradient activation function with a continuously differentiable function based on the PDF of the injected noise. However, this substitution strategy introduces a group of noise parameters related to the noise PDF and poses challenges for the training algorithm. For function approximation and image classification, it is shown that, owing to its hyperparameter on-line learning capacity, the Adam optimizer can speed up the training of the designed threshold neural network and overcome the local convergence problem of the SGD optimizer. More interestingly, the injected noise not only extends the dimension of the parameter space in which the designed threshold network is optimized, but also converges to a nonzero noise level in each hidden neuron. This distinguishing feature is closely related to the adaptive stochastic resonance effect, and also indicates a meaningful application of the stochastic resonance phenomenon in artificial neural networks.

However, it is noted that the Adam optimizer trains the artificial neural network to fit the target function well or to recognize classification labels more correctly. Then, for noisy observations of a target function or for incorrect labels, the overfitting problem may occur in training the designed threshold network with the Adam optimizer. Noting the regularization contributed to the loss function by the injected noise,[16,17] it is of great interest to further investigate the generalization performance of the stochastic resonance based threshold neural network, especially for signals acquired by sensors at low signal-to-noise ratios. In Eq. (2), only Gaussian noise injected into the designed feed-forward threshold neural network is considered; it will be interesting to find the optimal type of injected noise, with respect to the noise PDF, that achieves improved performance of the designed threshold network. In addition, we test the convolutional threshold neural network with noise injection on image classification of the MNIST data set, and the potential applications of this kind of deep convolutional threshold network to more challenging data sets, e.g., CIFAR-10 and ImageNet, also deserve to be explored.

    Acknowledgement

Project supported by the Natural Science Foundation of Shandong Province, China (Grant No. ZR2021MF051).
