
    A Quantized Kernel Least Mean Square Scheme with Entropy-Guided Learning for Intelligent Data Analysis

China Communications, 2017, Issue 7

    Xiong Luo , Jing Deng , Ji Liu , Weiping Wang , Xiaojuan Ban , Jenq-Haur Wang

    1 School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing 100083, China

    2 Beijing Key Laboratory of Knowledge Engineering for Materials Science, Beijing 100083, China

3 Department of Computer Science and Information Engineering, National Taipei University of Technology, Taipei 10608, Taiwan

* The corresponding authors, emails: xluo@ustb.edu.cn; shiya666888@126.com

    I. INTRODUCTION

Kernel methods have become increasingly popular in machine learning, and they are powerful tools for performing complex computing in network services and applications. For example, kernel learning algorithms can be employed to improve the detection of cyber-based attacks on computer networks [1]. In recent years, enormous research effort has been devoted to the development of kernel learning methods, such as the support vector machine [2], kernel principal component analysis [3], and many others. By using kernel methods to map the original input space into a high-dimensional feature space and then performing linear learning in that feature space, these nonlinear algorithms achieve significant optimization performance. In particular, kernel adaptive filtering (KAF) is a powerful class of nonlinear filters developed in reproducing kernel Hilbert space (RKHS), which exploits the linear structure of RKHS to apply mature linear adaptive algorithms to nonlinear problems in the input space [4]. Currently, several typical nonlinear adaptive filtering algorithms are generated by mapping linear algorithms into RKHS, such as the kernel least mean square (KLMS) algorithm [5], the kernel recursive least square (KRLS) algorithm [6], and many others [7]-[12]. There has also been a surge of interest in applications of kernel learning methods [13]-[15].

Generally, these nonlinear adaptive filtering algorithms generate a growing radial basis function (RBF) network via a radially symmetric Gaussian kernel. This growing structure increases computing costs and memory requirements, especially under continuous adaptation [7]. Hence, several typical online sparsification methods have been proposed to address this issue, using a criterion function with thresholds to decide whether a new sample should be added to the center set. Although such methods can dramatically reduce the scale of the network, their high computational cost imposes challenging obstacles to practical application. A quantization approach to constraining the network size was therefore developed, leading to the quantized kernel least mean square (QKLMS) algorithm [7]. Like sparsification, the basic idea of quantization is to use a smaller body of data to represent the whole input dataset by partitioning the input space into smaller regions. However, the quantization approach is computationally simple, provides a relatively easy way to determine the quantization size, and exploits the redundant data to improve performance by quantizing each of them to the nearest center.

However, the methods mentioned above do not consider any pretreatment of the input dataset. Recently, entropy-based optimization techniques have been integrated into machine learning algorithms [16]. It is feasible to reduce the input and output datasets by measuring the importance of every input vector according to its information entropy and deleting the input vectors and corresponding outputs that are insignificant or likely to cause errors, so that both the memory footprint and the scale of the network can be compressed. Most current nonparametric entropy estimation techniques rely on an estimate of the probability density function (PDF). These estimates are then substituted into the expression for entropy, and have been widely applied to the estimation of the Shannon entropy [17]. Meanwhile, an optimization method using the consecutive square entropy technique was proposed [18]. It incorporates Parzen's window function into the square entropy to realize the entropy estimation. Parzen's window approach is a nonparametric method for estimating the PDF of a finite set of patterns [19], and it can fully reflect the distribution characteristics of the data. Compared with the traditional information entropy, the square entropy that uses Parzen's window as the PDF of the dataset can better describe the uncertainty of the dataset, and thus it can describe the importance of each input vector more accurately.

In consideration of the above analysis, we improve QKLMS by optimizing the initial input data. Through the combination of the square entropy and the QKLMS algorithm, a novel kernel optimization scheme with entropy-guided learning, called EQ-KLMS, is proposed. In EQ-KLMS, on the basis of optimal entropy weights, the adaptive filter can be operated with high precision and low computational cost. In addition, data analysis is now widely used in practical applications. Some learning-based methods have been used for data analysis, such as the extreme learning machine (ELM) [20] and the least squares support vector machine (LSSVM) [2], and they have their own advantages and deficiencies in data prediction. With the development of data analytics that makes decisions via data-driven learning schemes [21], kernel learning methods have been widely used in intelligent data analysis and have achieved many good results. This article also focuses on intelligent data analysis, using our proposed scheme to address data prediction.

The rest of this article is organized as follows. In Section 2, we introduce adaptive filters, the QKLMS algorithm, and the entropy estimation technique. In Section 3, the details of the proposed EQ-KLMS scheme are presented. Experiments and discussion are provided in Section 4. The conclusion is summarized in Section 5.


II. BACKGROUND

    2.1 Adaptive filters

Adaptive filters are a class of filtering structures equipped with a built-in mechanism that enables such a filter to automatically adjust its free parameters according to statistical variations in the environment.

The basic structure of an adaptive filter is shown in Figure 1. The filter embodies a set of adjustable parameters, e.g., the weight vector ω(i−1). Here, y(i) is the actual response to the input vector u(i) applied to the filter at time i. The difference between the actual response y(i) and the desired response d(i) is the error signal e(i). Then e(i) is used to produce an adjustment to the parameter vector ω(i−1) of the filter. The adaptive filtering process is repeated in this manner until the parameter adjustments become small enough to stop the adaptation.
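To make this loop concrete, the following minimal Python sketch (our own illustration, not code from the paper; the names lms_filter, U, and d are assumptions) implements the linear adaptive filter of Figure 1 with the LMS update.

```python
import numpy as np

def lms_filter(U, d, eta=0.01):
    """Linear LMS adaptive filter: one weight adjustment per sample.
    U: (n, l) array of input vectors u(i); d: (n,) desired responses d(i);
    eta: step size. Returns the final weights and the per-step errors."""
    w = np.zeros(U.shape[1])       # omega(0)
    errors = np.empty(len(d))
    for i in range(len(d)):
        y = w @ U[i]               # actual response y(i)
        e = d[i] - y               # error e(i) = d(i) - y(i)
        w = w + eta * e * U[i]     # adjust omega(i-1) into omega(i)
        errors[i] = e
    return w, errors
```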

    2.2 Quantized kernel least mean square algorithm

KLMS is actually the linear least mean square algorithm performed in RKHS, and it is the simplest member of the KAF family. KLMS produces a growing RBF network by allocating a new kernel unit for every new example, with the input as the center. This network structure, which grows with each new sample, is the biggest obstacle to its wide application.

Quantization techniques have been employed in many fields. The QKLMS algorithm uses an online vector quantization (VQ) method to compact the RBF structure of KLMS by reducing the network size [7]. QKLMS can be implemented by simply quantizing the feature vector φ(i) in the weight-update equation ω(i) = ω(i−1) + ηe(i)φ(i) of KLMS, where the error e(i) = d(i) − ω(i−1)^Tφ(i), φ(·) denotes the mapping of the input vector into the high-dimensional feature space, d(i) is the desired signal, and η is the step size [5].

Then, the basic idea of the quantization approach can be described briefly as follows. When a new input vector u(i) is available, we first compute the Euclidean distance between u(i) and the codebook C(i−1). If the distance is less than the given threshold, we keep the codebook unchanged and quantize u(i) to the closest code vector. Otherwise, we update the codebook by C(i) = {C(i−1), u(i)} and allocate a new kernel unit for u(i). In this way, the scale of the network is compressed.
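A minimal Python sketch of this online VQ step combined with the KLMS update follows; it is our own rendering of the procedure in [7], not the authors' code, with the kernel width delta, step size eta, and quantization threshold gamma named after the paper's notation.

```python
import numpy as np

def gauss_kernel(x, y, delta):
    """Gaussian kernel kappa(x, y) with kernel parameter delta."""
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * delta ** 2))

def qklms_train(U, d, eta, delta, gamma):
    """QKLMS sketch: the codebook C and coefficients alpha grow only
    when a new input lies farther than gamma from every center."""
    C = [np.asarray(U[0], dtype=float)]   # codebook C(1) = {u(1)}
    alpha = [eta * d[0]]                  # coefficient of the first unit
    for u, di in zip(U[1:], d[1:]):
        u = np.asarray(u, dtype=float)
        # network output for u with the current centers
        y = sum(a * gauss_kernel(c, u, delta) for a, c in zip(alpha, C))
        e = di - y                        # prediction error e(i)
        # online VQ: Euclidean distance from u to the codebook
        dists = [np.linalg.norm(u - c) for c in C]
        j = int(np.argmin(dists))
        if dists[j] <= gamma:             # quantize u to the closest center
            alpha[j] += eta * e           # merge the update; C unchanged
        else:                             # allocate a new kernel unit
            C.append(u)
            alpha.append(eta * e)
    return np.array(C), np.array(alpha)
```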

    2.3 Entropy estimation with Parzen’s windows

Generally, the greater the entropy, the less valid information is contained in the system [22]. The consecutive square entropy of a random variable is closely related to the PDF over all possible values of this random variable, and its expression is given in [18]. Remarkably, the PDF in the consecutive square entropy can be a Parzen's window function, which is defined in [23].

To reduce the influence of the discrete points of the input data on the model accuracy, a method of measuring the amount of information contained in a variable based on entropy weight is proposed. The steps are as follows:

1) Calculate the consecutive square entropy ej of each input vector, using Parzen's window as the PDF estimate.

    Fig.1 The basic structure of adaptive filter

2) Define the validity coefficient hj = 1 − ej. The entropy weight of each input vector can then be calculated from the validity coefficients:

wj = hj / (h1 + h2 + ... + hn),

where n is the number of input vectors.

It can be seen that, for an input vector, the greater the square entropy, the lower the validity coefficient, the smaller the entropy weight, and the less important the vector is in the whole system.

Now, we give an example to demonstrate the relationship between the entropy and the weight. Taking the dataset [1 21 3 4 5 6 7 8], we construct the input as [1 21 3 4 5; 21 3 4 5 6; 3 4 5 6 7] and the output as [4 5 6 7 8]. Each column of the input represents an input vector. The calculated consecutive square entropies are [0.7452 0.7054 0.5311 0.5311 0.5311], and the entropy weights of the input vectors are [0.1302 0.1506 0.2397 0.2397 0.2397]. Here, the element 21 does not follow the overall distribution of the data and can be considered an outlier. The first two input vectors contain this outlier, and the calculation shows that their entropies are larger than those of the others.
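This calculation can be reproduced with a sketch like the one below. The exact square-entropy estimator and Parzen bandwidth follow [18] and [23], which we do not reproduce here; the sketch assumes a Gaussian Parzen window and the quadratic form e = 1 − ∫ p̂(x)² dx, so its exact numbers may differ from the values above, but the qualitative behavior (outlier-bearing columns receive the smallest weights) is preserved.

```python
import numpy as np

def square_entropy(v, sigma=1.0):
    """Assumed square entropy e = 1 - int p^2(x) dx of a 1-D sample under
    a Gaussian Parzen window p(x) = (1/m) sum_k N(x; x_k, sigma^2).
    Uses the closed form int p^2 = (1/m^2) sum_{j,k} N(x_j; x_k, 2 sigma^2);
    the estimator actually used in [18] may differ in its details."""
    v = np.asarray(v, dtype=float)
    m = len(v)
    diff = v[:, None] - v[None, :]
    gauss = np.exp(-diff ** 2 / (4.0 * sigma ** 2)) / np.sqrt(4.0 * np.pi * sigma ** 2)
    return 1.0 - gauss.sum() / m ** 2      # larger = more uncertain

def entropy_weights(X):
    """Steps 1)-2): entropy e_j per input vector (column of X), validity
    h_j = 1 - e_j, and entropy weight w_j = h_j / sum_k h_k."""
    e = np.array([square_entropy(col) for col in X.T])
    h = 1.0 - e
    return h / h.sum()

X = np.array([[1, 21, 3, 4, 5],
              [21, 3, 4, 5, 6],
              [3, 4, 5, 6, 7]], dtype=float)
print(entropy_weights(X))   # the two outlier-bearing columns get the smallest weights
```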

Generally, weight is used to measure the importance of components, so we transform the entropy into an entropy weight. In the prediction process, we use the first l data points to predict the next one, so the more orderly the data are, the smaller the deviation and the higher the success rate, which helps to find the relationship between input and output. That is to say, a vector with a large entropy weight plays a positive role in the training process, while a vector with a small entropy weight plays a negative role. The entropy weight can therefore be used to measure the significance of a vector in the training system.

    III. EQ-KLMS

    3.1 Implementation

In EQ-KLMS, we first calculate the entropy weight of each input vector to measure its importance in the system. Then we remove the input vectors, and the corresponding outputs, whose entropy weights are less than the average value. In this way, we delete before training the input vectors that are insignificant or may lead to errors, which compresses the dataset as well as improves the learning accuracy. Finally, the QKLMS model is trained with the modified training set until the range of parameter adjustments becomes small enough, that is, until the weight vector of the adaptive filter has stabilized. At that point the hidden relationships between inputs and outputs have been found, and we can use the model to learn or predict output values more accurately when an input is given.

Algorithm 1. The framework of EQ-KLMS

The implementation of EQ-KLMS is mainly divided into five steps: constructing the inputs and outputs, calculating the entropy weight of every input vector, modifying the dataset in accordance with the entropy weights, training the QKLMS model with the modified training set, and performing the data analysis. The framework of EQ-KLMS is shown in Algorithm 1.
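Pulling these steps together, a compact sketch of the pipeline might look as follows. It is our own rendering rather than the authors' Algorithm 1, and it reuses the entropy_weights and qklms_train sketches defined earlier.

```python
import numpy as np

def eq_klms_train(X, d, eta, delta, gamma):
    """EQ-KLMS sketch following the five steps in the text. X holds one
    input vector per column, d the desired outputs. Relies on the
    entropy_weights and qklms_train sketches defined above."""
    w = entropy_weights(X)             # entropy weight of every input vector
    keep = w >= w.mean()               # drop vectors below the average weight
    U, d_kept = X[:, keep].T, np.asarray(d)[keep]
    return qklms_train(U, d_kept, eta, delta, gamma)   # train on modified set
```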

Remark 1: If a prediction fails, we replace the output with the desired output, so that the error is 0. The weight from the last step is then left unchanged, in order to avoid the impact of the failed prediction.

    3.2 Complexity analysis

It is clear that the core parts of our scheme are the calculation of the entropy weights and the training of the QKLMS model. If there are n input vectors, we need n computations to obtain the entropy weight of each vector, so the time complexity of computing the entropy weights is O(n). In addition, the computational costs of the online VQ and of updating α(i) are also O(n) [8]. Hence, the computational complexity of EQ-KLMS is O(n).

    IV. EXPERIMENT AND DISCUSSION

    4.1 Dataset and metrics

We use the actual dataset obtained from [25] to conduct data prediction. We evaluate the performance through the prediction results, the computational time, and the mean absolute error (MAE). Here,

MAE = (1/N) Σ_{j=1}^{N} |uj − uj′|,

where uj represents the real value, uj′ denotes the predicted value, and N is the number of predicted values. According to Remark 1, if a prediction fails, its error is 0 and the output is replaced by the desired output. This means the MAE is only calculated within the successful prediction range, which differs from the conventional method.

Assuming that M is the total number of test items and that m of them are successfully predicted, the successful prediction rate (SPR) is m/M × 100%.
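Under our reading of these definitions (a prediction counts as successful when its absolute error is within the threshold ε, and failed predictions are excluded from the MAE average per Remark 1), the two metrics can be computed as in the sketch below; the function name is an assumption.

```python
import numpy as np

def mae_and_spr(u_true, u_pred, eps):
    """MAE over the successful-prediction range and the SPR.
    A prediction is treated as successful when |u_j - u_j'| <= eps;
    failed predictions contribute zero error (the output is replaced
    by the desired output) and are left out of the MAE average."""
    err = np.abs(np.asarray(u_true) - np.asarray(u_pred))
    ok = err <= eps                    # m successes out of M test items
    spr = 100.0 * ok.sum() / len(err)  # SPR = m / M * 100%
    mae = err[ok].mean() if ok.any() else 0.0
    return mae, spr
```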

To verify the effectiveness of our scheme, we run KLMS and QKLMS under the same experimental conditions. In addition, since we calculate the entropy weight for all the training inputs, we also provide a comparison with ELM and LSSVM, which are two offline algorithms.

In the first experiment, we extract 4,000 consecutive temperature records of Haikou city in 2013, using the first 3,005 values to generate the training input set (5×3,000) and the corresponding desired output set (3,000×1), and the following 605 values to generate the testing input set (5×600) and the corresponding desired output set (600×1). In the second experiment, we extract 5,000 consecutive humidity records of Xi'an city in 2013, using the first 3,005 values to generate the training input set (5×3,000) and the corresponding desired output set (3,000×1), and the following 1,005 values to generate the testing input set (5×1,000) and the corresponding desired output set (1,000×1). The experiments are conducted in the MATLAB computing environment on an Intel(R) Core(TM) i5-3317U 1.70 GHz CPU. We test the SPR under different thresholds ε, where ε represents the requirement on the prediction accuracy.
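The quoted shapes are consistent with a sliding window of length l = 5, where each input vector holds five consecutive values and its desired output is the next value. A sketch of this construction (our own, with assumed names) is:

```python
import numpy as np

def make_windows(series, l=5):
    """Sliding-window construction: column j of X is series[j : j+l] and
    its desired output is series[j+l]. For example, 3,005 raw values give
    a 5 x 3,000 input set and a 3,000 x 1 output set, matching the text."""
    s = np.asarray(series, dtype=float)
    N = len(s) - l
    X = np.stack([s[j:j + l] for j in range(N)], axis=1)   # shape (l, N)
    y = s[l:]                                              # shape (N,)
    return X, y
```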

    4.2 Results in temperature dataset

After conducting some tests, the parameters are selected as kernel parameter δ = 0.02 and step-size parameter η = 0.007, since they achieve the best accuracy. Figures 2 and 3 show how the quantization factor γ affects the performance of the five algorithms. As γ increases, the MAEs of QKLMS and EQ-KLMS increase gradually, but their network sizes decrease dramatically. Since γ has no impact on KLMS, ELM, or LSSVM, the lines of those three algorithms overlap in Figure 3. We therefore take the compromise value γ = 0.03, which lets both the MAE and the network size reach reasonable values. Because the sum of the bi is 1 and each element in the vector has equal weight, we set bi = 1/m to simplify the calculation.


The prediction results for the temperature dataset with ε = 0.15 are shown in Figure 4. We can see that the prediction values of EQ-KLMS are in good agreement with the actual values, whereas the prediction accuracy of ELM is worse than that of the other schemes. Specifically, Figure 5 gives the prediction error of EQ-KLMS at every sampling point, and we can see that the prediction error converges to a value within a sufficiently small interval.

Figure 6 shows the MAE as the threshold ε changes; a smaller MAE indicates a better prediction. On the whole, the prediction performance of EQ-KLMS is the best, and the prediction error of ELM is the largest. As ε increases, the performance of LSSVM approaches that of QKLMS. Figure 7 shows the SPR as ε changes. Obviously, the greater the threshold, the higher the SPR, and among the five algorithms the SPR of EQ-KLMS is the highest.

We also compare the computational time for ε = 0.15 in Table I. The computational time of LSSVM is greater than that of the other methods, and the time EQ-KLMS spends is close to that of QKLMS. Hence, even though EQ-KLMS spends part of its time computing entropies, it significantly reduces the overall computational cost thanks to the optimized training set. The details of the calculation are shown in Table II: EQ-KLMS significantly compresses the scale of the network and improves the prediction accuracy. Compared with the running time of the whole algorithm, the entropy computation is a small part, accounting for 0.3048/2.4649 = 12.37% of the computing time of EQ-KLMS.

    4.3 Results in humidity dataset

After conducting some tests, the parameters of KLMS are selected as δ = 0.0021 and η = 0.0029, as they achieve the best accuracy. The influence of the quantization factor γ on the MAE and network size of EQ-KLMS and QKLMS is similar to that in the previous experiment, so we set γ = 0.15 in this experiment. The choice of bi is the same as above.

    Fig.2 The MAE with different quantization factor γ

    Fig.3 The network size with different quantization factor γ

    Fig.4 The prediction results for temperature item

Figure 8 shows the humidity data at every sampling point and the prediction results for ε = 0.15. Except for ELM, the prediction values of the schemes closely follow the actual humidity data, although at some sampling points KLMS, QKLMS, and LSSVM show larger errors than EQ-KLMS. Meanwhile, the prediction error of EQ-KLMS at every sampling point is shown in Figure 9; it also converges to a small value.

Figure 10 shows the corresponding MAE as the threshold ε changes. It is clear that the MAE of EQ-KLMS is the minimum. Figure 11 shows the SPR under different thresholds ε. It is obvious that EQ-KLMS has the maximal SPR.

We can conclude that the quantization approach with entropy-guided learning improves the prediction accuracy. Considering the prediction performance and the computational time together, EQ-KLMS is a competitive choice under the current computational framework.

    Table I Computational time for temperature dataset

    Table II Computational details of EQ-KLMS for temperature dataset

    Fig.5 The prediction errors for temperature item

    Fig.6 The MAE with different threshold ε for temperature item

    V. CONCLUSION

To improve the learning accuracy and reduce the computing time, we combine an entropy-guided learning technique with a quantization approach to develop a novel kernel optimization scheme, EQ-KLMS. Through the calculation of entropy weights, we first modify the training set by removing the inputs, and their corresponding outputs, with larger uncertainty, which are insignificant or likely to cause errors in the learning process. Thus the memory footprint and the scale of the network are compressed, the computational effort and the data storage requirement decrease when the kernel algorithm is performed, and the learning error is reduced as well. The experimental results on a data analysis task have demonstrated the effectiveness of our proposed scheme. However, EQ-KLMS calculates the entropy weight for all the training inputs, which is one of its limitations; in the future, we will embed the entropy weight calculation in the training process. In addition, it will be an interesting issue to apply our proposed scheme to other complex datasets (e.g., stock trading datasets).

    ACKNOWLEDGEMENT

This work was partially supported by the National Key Technologies R&D Program of China under Grant No. 2015BAK38B01, the National Natural Science Foundation of China under Grant Nos. 61174103 and 61603032, the National Key Research and Development Program of China under Grant Nos. 2016YFB0700502, 2016YFB1001404, and 2017YFB0702300, the China Postdoctoral Science Foundation under Grant No. 2016M590048, the Fundamental Research Funds for the Central Universities under Grant No. 06500025, the University of Science and Technology Beijing - National Taipei University of Technology Joint Research Program under Grant No. TW201610, and the Foundation from the National Taipei University of Technology of Taiwan under Grant No. NTUT-USTB-105-4.

    [1] J.M. Fossaceca, T.A. Mazzuchi, S. Sarkani,“MARK-ELM: Application of a novel multiple kernel learning framework for improving the robustness of network intrusion detection”,Expert Systems with Applications, vol.42, no.8, pp 4062-4080, May, 2015.

    [2] R. Mall, J.A. Suykens, “Very sparse LSSVM reductions for large-scale data”,IEEE Transactions on Neural Networks and Learning Systems, vol.26,no.5, pp. 1086-1097, May, 2015.

    Fig.7 The SPR with different threshold ε for temperature item

    Fig.8 The prediction results for humidity item

    Fig.9 The prediction errors for humidity item

    [3] P. Honeine, “Online kernel principal component analysis: A reduced-order model”,IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.34, no.9, pp 1814-1826, September,2012.

    [4] W.F. Liu, J.C. Principe, S. Haykin,Kernel Adaptive Filtering. Wiley, Hoboken, NJ, USA, 2011.

    [5] W.F. Liu, P. Pokharel, J.C. Principe, “The kernel least mean square algorithm”,IEEE Transactions on Signal Processing, vol.56, no.2, pp 543-554,February, 2008.

    [6] Y. Engel, S. Mannor, R. Meir, “The kernel recursive least-squares algorithm”,IEEE Transactions on Signal Processing, vol.52, no.8, pp 2275-2285, August, 2004.

    [7] B.D. Chen, S. Zhao, P. Zhu, J.C. Principe, “Quantized kernel least mean square algorithm”,IEEE Transactions on Neural Networks and Learning Systems, vol.23, no.1, pp 22-32, January, 2012.

    [8] B.D. Chen, S. Zhao, P. Zhu, J.C. Principe, “Quantized kernel recursive least squares algorithm”,IEEE Transactions on Neural Networks and Learning Systems, vol.24, no.9, pp 1484-1491,September, 2013.

[9] S.Y. Nan, L. Sun, B.D. Chen, Z.P. Lin, K.A. Toh, "Density-dependent quantized least squares support vector machine for large data sets",IEEE Transactions on Neural Networks and Learning Systems, vol.28, no.1, pp 94-106, January, 2017.

    [10] X.G. Xu, H. Qu, J.H. Zhao, X.H. Yang, B.D. Chen,“Quantized kernel least mean square with desired signal smoothing”,Electronics Letters,vol.51, no.18, pp 1457-1459, September, 2015.

    [11] S. Zhao, B.D. Chen, J.C. Principe, “Fixed budget quantized kernel least mean square algorithm”,Signal Processing, vol. 93, no.9, pp 2759-2770,September, 2013.

    [12] S. Zhao, B.D. Chen, Z. Cao, P.P. Zhu, J.C. Principe,“Self-organizing kernel adaptive filtering”,EURASIP Journal on Advances in Signal Processing,vol.2016, no.1, December, 2016.

    [13] X. Luo, D. Zhang, L.T. Yang, J. Liu, X. Chang, H.Ning, “A kernel machine-based secure data sensing and fusion scheme in wireless sensor networks for the cyber-physical systems”,Future Generation Computer Systems, vol.61, pp 85-96,August, 2016.

    [14] X. Luo, J. Liu, D. Zhang, X. Chang, “A large-scale web QoS prediction scheme for the industrial Internet of Things based on a kernel machine learning algorithm”,Computer Networks,vol.101, pp 81-89, June, 2016.

[15] Y. Xu, X. Luo, W. Wang, W. Zhao, "Efficient DV-Hop localization for wireless cyber-physical social sensing system: A correntropy-based neural network learning scheme",Sensors, vol.17, no.1, 135, January, 2017.

[16] P. Tang, D. Chen, Y. Hou, "Entropy method combined with extreme learning machine method for the short-term photovoltaic power generation forecasting",Chaos Solitons and Fractals, vol.89, pp 243-248, October, 2015.

    Table III Computational time for humidity dataset

    Table IV Computational details of EQ-KLMS for humidity dataset

    Fig.10 The MAE with different threshold ε for humidity item

    Fig.11 The SPR with different threshold ε for humidity item

[17] J. Beirlant, E.J. Dudewicz, L. Györfi, E.C. Meulen, "Nonparametric entropy estimation: An overview",International Journal of the Mathematical Statistics Sciences, vol.6, no.1, pp 1-14, 1997.

    [18] Z.B. Liu, “A maximum margin learning machine based on entropy concept and kernel density estimation”,Journal of Electronics & Information Technology, vol.33, no.9, pp 2187-2191, September, 2011.

    [19] E. Parzen, “On estimation of a probability density function and mode”, Annals of Mathematical Statistics, vol.33, no.3, pp 1065-1076, September, 1962.

    [20] G.B. Huang, H. Zhou, X. Ding, R. Zhang, “Extreme learning machine for regression and multiclass classification”,IEEE Transactions on Systems,Man, and Cybernetics,Part B: Cybernetics,vol.42, no.2, pp. 513-529, April, 2012.

[21] C. Wu, Y. Chen, F. Li, "Decision model of knowledge transfer in big data environment",China Communications, vol.13, no.7, pp. 100-107, July, 2016.

    [22] C.E. Shannon, “Communication theory of secrecy systems”,The Bell System Technical Journal,vol.28, no.4, pp. 656-715, October, 1949.

    [23] Z.B. Liu, W.J. Zhao, “One-class Learning Machine based on Entropy”,Computer Applications and Software, vol.30, no.11, pp. 99-101, November,2013.

    [24] M.P. Wand, M.C. Jones,Kernel Smoothing. Chapman and Hall/CRC, Boca Raton, FL, USA, 1994.

    [25] https://www.wunderground.com/history
