
    Short-term photovoltaic power prediction using combined K-SVD-OMP and KELM method

2022-09-19

    LI Jun, ZHENG Danyang

    (School of Automation and Electrical Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China)

Abstract: A sparse representation modeling method using feature extraction techniques is proposed for photovoltaic power prediction. First, the factors affecting the photovoltaic power output are taken as the input data of the model. Next, dictionary learning based on the K-means singular value decomposition (K-SVD) algorithm and the orthogonal matching pursuit (OMP) algorithm is applied to the input data, i.e. the initial dictionary, to obtain the corresponding sparse coding. Then, the sparse coding vectors are used as the input of a kernel extreme learning machine (KELM) to build the global prediction model. Finally, to verify the effectiveness of the combined K-SVD-OMP and KELM method, the proposed method is applied to an instance of photovoltaic power prediction. Compared with KELM, SVM and ELM under the same conditions, the experimental results show that the combined sparse representation methods achieve better prediction results, among which the combined K-SVD-OMP and KELM method gives the best prediction accuracy.

Key words: photovoltaic power prediction; sparse representation; K-means singular value decomposition algorithm (K-SVD); kernel extreme learning machine (KELM)

    0 Introduction

Due to the random, fluctuating, and intermittent nature of solar energy, the output power of photovoltaic (PV) power plants is often not easy to control, which harms the stable operation of the power grid. At present, ultra-short-term and short-term power prediction of PV generation can ensure the regular operation and reliability of the power system, and has attracted more and more attention from researchers[1-4].

The short-term PV power prediction methods are generally classified into physical methods[5], classical statistical methods[6-8], and artificial intelligence methods[9]. Artificial intelligence methods, such as neural networks[10-14] and support vector machines (SVM)[15-17], have a remarkable ability for reasoning and learning in uncertain and imprecise environments, and have been successfully applied in the field of PV prediction with good results. For example, Hossain et al.[10] proposed a prediction algorithm for PV power using a long short-term memory (LSTM) neural network. Sun et al.[15] established a short-term step-wise temperature prediction model for PV modules based on SVM, and the results indicate that the stepwise prediction model has better accuracy than the direct prediction model, other things being equal. Ref.[16] further presents a comparison of the extreme learning machine (ELM) and SVM for PV power estimation. Lin et al.[17] introduced an inertia weighting strategy and the Cauchy mutation operator to improve the moth-flame optimization algorithm for SVM prediction of PV power generation. In Ref.[18], weather types were classified into abnormal days (weather changed suddenly) and normal days, and a combined prediction method based on ensemble empirical mode decomposition (EEMD) and SVM was proposed for the day-ahead short-term forecast of the hourly output of a PV system. In addition, an EEMD-GRNN (generalized regression neural network) prediction method was further proposed by Zhang et al.[19], whose prediction error can also meet the requirements.

ELM is a fast learning algorithm based on single-hidden-layer feedforward neural networks (SLFNs). Its network structure is determined by a randomly set number of hidden layer nodes, the input layer weights and hidden node parameters are also randomly given, and training uses the regularized least-squares algorithm, so only the output layer weights of the network need to be adjusted, which makes network training extremely fast. ELM was successfully applied to short-term photovoltaic power prediction and achieved high prediction accuracy in Ref.[20]. However, the ELM network is rather sensitive to its hidden nodes. To solve this problem, the kernel extreme learning machine (KELM) does not need to consider the number of hidden layer nodes, and only uses a kernel function to represent the unknown nonlinear feature mapping of the hidden layer. Therefore, KELM can achieve good performance.

In addition, principal component analysis (PCA) or kernel principal component analysis (KPCA) is an important means of data preprocessing, which can effectively extract the features hidden in the data. Lv et al.[21] used the PCA algorithm and a neural network to preprocess the input data, and the results showed that PCA could effectively improve the accuracy of prediction results. Similarly, sparse representation can also be used in the field of prediction as a primary means of feature extraction. As an advanced machine learning method, sparse representation has been widely used in the field of pattern recognition. The sparse representation method contains two parts: dictionary learning and sparse coding. Compared with PCA, a set of overcomplete basis vectors can be obtained through sparse representation.

In view of the advantages of sparse representation algorithms in feature extraction, combined with the KELM method, we propose a new global prediction model for short-term prediction of PV power based on the K-means singular value decomposition and orthogonal matching pursuit (K-SVD-OMP) sparse representation algorithm. To verify its effectiveness, the method is applied to an instance of PV benchmark power prediction provided by the Global Energy Forecasting Competition 2014 (GEFCOM2014), and compared with existing prediction methods such as SVM and KELM, as well as other sparse representation algorithms using non-dictionary learning.

    1 Sparse representation

Sparse representation expresses the original signal as a linear combination of basis functions from an overcomplete set, where the set of basis functions is called a dictionary and the basis functions are called atoms. Sparse coding in sparse representation is based on a large number of candidate basis vectors: the basis vectors in an overcomplete dictionary are linearly combined to represent the original signal in a compact and efficient way. The weights of most of the dictionary basis vectors tend to zero, and the basis vectors with non-zero weights are selected to represent the original signal. That is, the sparse representation of the signal vector x ∈ R^m is expressed as

x = Φα,

    (1)

where Φ = {φ1, φ2, …, φK}, and each basis vector φ is an element of the dictionary Φ, called an atom. For Eq.(1), the coefficient vector α can be obtained by solving the l0-norm optimization problem, that is,

min_α ‖α‖_0, s.t. x = Φα.

(2)

If the number of non-zero coefficients in α is restricted, Eq.(2) can be formulated as an M-sparse optimization problem

min_α ‖x − Φα‖_2^2, s.t. ‖α‖_0 ≤ M,

(3)

where M is the maximum number of non-zero coefficients in α.

    Applying the Lagrange multiplier, Eq.(2) is converted to

min_α ‖x − Φα‖_2^2 + λ‖α‖_0,

(4)

where λ > 0 acts as a compromise between reconstruction error and sparsity.
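As a small illustration (a minimal numpy sketch; the toy dictionary, signal, and the helper name `objective` are hypothetical, not from the paper), the penalized objective of Eq.(4) can be evaluated for a candidate coding vector:

```python
import numpy as np

# Toy overcomplete dictionary: m = 4 dimensions, K = 8 atoms.
rng = np.random.default_rng(0)
Phi = rng.standard_normal((4, 8))
Phi /= np.linalg.norm(Phi, axis=0)          # unit-norm atoms

# A 2-sparse coefficient vector and the signal it generates.
alpha = np.zeros(8)
alpha[[1, 5]] = [2.0, -1.0]
x = Phi @ alpha

def objective(x, Phi, alpha, lam):
    """Value of Eq.(4): reconstruction error plus lam times the l0 count."""
    residual = float(np.linalg.norm(x - Phi @ alpha) ** 2)
    return residual + lam * np.count_nonzero(alpha)

print(objective(x, Phi, alpha, lam=0.1))    # 0.2: zero error, two active atoms
```

A smaller λ favors low reconstruction error; a larger λ favors using fewer atoms.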

Eqs.(2)-(4) are essentially NP-hard problems, which are usually solved with greedy algorithms or relaxation algorithms to obtain the sparse representation of the signal. For example, the prior distribution of α can be assumed to obey the Laplacian distribution[22] as

p(α) = ∏_{i=1}^{K} (β/2) exp(−β|α_i|) = (β/2)^K exp(−β‖α‖_1).

(5)

Under a Gaussian observation noise model with variance σ^2, the maximum a posteriori estimate of α then solves

min_α ‖x − Φα‖_2^2 + 2σ^2 β‖α‖_1.

(6)

Letting λ1 = 2σ^2 β, the above equation can be transformed into

min_α ‖x − Φα‖_2^2 + λ1‖α‖_1.

(7)

If an additional constraint that the weights sum to one is imposed, the above equation can be transformed into

min_α ‖x − Φα‖_2^2 + λ1‖α‖_1, s.t. Σ_i α_i = 1.

(8)
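For intuition about the l1-relaxed problem of Eq.(7): in the special case of an orthonormal dictionary, the minimizer has a closed form, namely elementwise soft-thresholding of Φ^T x at λ1/2. A minimal numpy sketch (the function name and toy values are hypothetical illustrations, not from the paper):

```python
import numpy as np

def soft_threshold(z, t):
    """Elementwise soft-thresholding: shrink each entry toward zero by t."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# With an orthonormal dictionary (Phi.T @ Phi = I), the problem
# min ||x - Phi a||^2 + lam * ||a||_1 decouples per coordinate and has the
# closed-form solution a* = soft_threshold(Phi.T @ x, lam / 2).
Phi = np.eye(3)                      # trivially orthonormal for the demo
x = np.array([3.0, 0.4, -2.0])
lam = 1.0
alpha_star = soft_threshold(Phi.T @ x, lam / 2)
print(alpha_star)                    # [ 2.5  0.  -1.5]
```

Small coefficients are driven exactly to zero, which is the mechanism by which the l1 penalty induces sparsity.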

    2 K-SVD algorithm

    2.1 Dictionary selection

In sparse representation, dictionary learning transforms samples into an appropriate sparse representation, allowing the learning task to be simplified and the model complexity to be reduced. Different dictionaries can be chosen for different types of data, for example, the discrete cosine transform (DCT) dictionary, the data dictionary, structured dictionaries, etc. For a better sparse representation of the signals, we select the K-means singular value decomposition (K-SVD) dictionary with better performance. This method can obtain the most suitable and compact dictionary for the training data. The dictionary is further applied in the OMP-based sparse coding algorithm to obtain the optimal solution of the sparse coding vector.

    2.2 K-SVD-OMP algorithm

As a classical dictionary training algorithm, K-SVD is a generalization of k-means based on sparse representation, which explicitly includes two phases: dictionary update and sparse encoding. In each of its iterations, the K-SVD algorithm uses the SVD technique and alternating learning to optimize the representation of the input data in the current dictionary and to update the atoms of the dictionary so that they better fit the data. Restricting each coefficient vector to be a standard basis vector, the objective function of the k-means algorithm is expressed as

min_{Φ,A} ‖X − ΦA‖_F^2, s.t. ∀i, α_i = e_k for some k,

(9)

where e_k denotes the kth standard basis vector.

    The algorithm essentially solves the optimization problem shown as

min_{Φ,A} ‖X − ΦA‖_F^2, s.t. ∀i, ‖α_i‖_0 ≤ M.

(10)

In the sparse encoding stage, assuming that the dictionary Φ is fixed, the optimization problem of Eq.(10) is transformed into searching for a sparse representation of the coefficient matrix A corresponding to the dictionary matrix Φ. Then Eq.(10) can be optimized as

min_{α_i} ‖x_i − Φα_i‖_2^2, s.t. ‖α_i‖_0 ≤ M, i = 1, 2, …, N.

(11)

Eq.(11) is solved in essentially the same way as Eq.(3), and both can be handled by different sparse encoding algorithms.

The dictionary update phase processes the atoms (i.e., columns) of the dictionary sequentially, keeping all columns unchanged except the kth column φk. Column φk can be updated by isolating the contribution of φk multiplied by its coefficients, so that Eq.(10) can be rewritten as

‖X − ΦA‖_F^2 = ‖X − Σ_{j≠k} φ_j a_j − φ_k a_k‖_F^2 = ‖E_k − φ_k a_k‖_F^2,

(12)

where a_j denotes the jth row of A and E_k = X − Σ_{j≠k} φ_j a_j is the error matrix without atom k.

Defining ω_k as the index set of the data x_i that use the dictionary atom φ_k,

ω_k = {i | 1 ≤ i ≤ N, a_k(i) ≠ 0}.

    Therefore, the objective function of Eq.(12) is equivalent to

min_{φ_k, a_k^R} ‖E_k^R − φ_k a_k^R‖_F^2,

(13)

where E_k^R and a_k^R denote E_k and a_k restricted to the columns indexed by ω_k; the minimizer is given by the rank-1 SVD approximation of E_k^R.

In summary, the optimization of Eq.(10) aims to obtain the best dictionary set Φ representing the dataset X.

    The K-SVD algorithm is implemented in the following steps:

Step 1) Initialize the dictionary: set J = 1 and normalize the columns of Φ(0) ∈ R^{m×K} to unit l2-norm.

Step 2) Sparse coding phase: use the sparse coding algorithm to approximate the sparse vector α_i for each corresponding x_i, i.e. solve Eq.(11) with the current dictionary Φ(J-1).

Step 3) Dictionary update phase: update each column k = 1, 2, …, K of dictionary Φ(J-1) in turn. For each atom, compute the error matrix E_k = X − Σ_{j≠k} φ_j a_j, restrict it to the index set ω_k, and update φ_k and the non-zero coefficients a_k^R from the rank-1 SVD approximation of the restricted error matrix.

Step 4) Set J = J + 1 and return to Step 2 until the convergence or stopping condition of Eq.(10) is satisfied.
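The single-atom dictionary update of Step 3 can be sketched as follows (a minimal numpy illustration, not the authors' implementation; the function name `ksvd_update_atom` and the toy data are hypothetical):

```python
import numpy as np

def ksvd_update_atom(X, Phi, A, k):
    """One K-SVD dictionary-update step for atom k: form the error matrix
    E_k without atom k's contribution, restrict it to the samples that use
    the atom (the index set omega_k), and replace (phi_k, its coefficients)
    by the best rank-1 approximation of that restricted error matrix."""
    omega = np.nonzero(A[k, :])[0]           # index set omega_k
    if omega.size == 0:
        return Phi, A                        # unused atom: leave unchanged
    E_k = X[:, omega] - Phi @ A[:, omega] + np.outer(Phi[:, k], A[k, omega])
    U, s, Vt = np.linalg.svd(E_k, full_matrices=False)
    Phi[:, k] = U[:, 0]                      # new atom: top left singular vector
    A[k, omega] = s[0] * Vt[0, :]            # updated non-zero coefficients
    return Phi, A

# The update can only decrease the overall fit error ||X - Phi A||_F,
# because the old (phi_k, a_k) pair is itself a feasible rank-1 candidate.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))
Phi = rng.standard_normal((5, 3))
Phi /= np.linalg.norm(Phi, axis=0)
A = np.zeros((3, 20))
A[0, :10] = rng.standard_normal(10)
A[1, 5:] = rng.standard_normal(15)
err_before = np.linalg.norm(X - Phi @ A)
Phi, A = ksvd_update_atom(X, Phi, A, k=0)
err_after = np.linalg.norm(X - Phi @ A)
print(err_after <= err_before + 1e-9)        # True
```

Because the rank-1 SVD is optimal in Frobenius norm, sweeping this update over all atoms gives the monotone improvement that Step 4 relies on for convergence.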

    2.3 OMP algorithm

As a class of greedy solution strategies, the OMP algorithm is an effective method for the sparsity-constrained sparse coding problem of Eq.(3). In statistical modeling, the OMP algorithm selects, at each step of solving for the sparse coding vector, the atom with the highest correlation to the current residual. After an atom is selected, the signal is projected orthogonally onto the span of the selected atoms, the residual is recalculated, and the process is repeated. As a result, the OMP algorithm converges faster for the same accuracy requirements.

    To solve the optimal estimate of the sparse encoding vector, the optimization problem for Eq.(3) can be rewritten as

α̂ = argmin_α ‖x − Φα‖_2^2, s.t. ‖α‖_0 ≤ M.

(14)

    To solve Eq.(14), the OMP algorithm is implemented in the following steps.

Step 1) Initialize the residual vector r0 = x, the index set Λ0 = ∅, and the number of iterations t = 1.

Step 2) Find the index χt of the dictionary column φi that has the maximum absolute inner product with the residual, i.e. χt = argmax_i |⟨r_{t-1}, φ_i⟩|.

Step 3) Augment the index set Λt = Λt-1 ∪ {χt} and update the dictionary Φt = [Φt-1, φχt], noting that Φ0 is the empty matrix.

Step 4) Solve the least-squares problem α_t = argmin_α ‖x − Φt α‖_2 to obtain the new data representation.

Step 5) Update the residual rt = x − Φt αt.

Step 6) Set t = t + 1; if t ≤ M, return to Step 2.
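Steps 1-6 can be sketched directly in numpy (a minimal illustration assuming M ≥ 1; the function name `omp` and the toy orthonormal dictionary are hypothetical):

```python
import numpy as np

def omp(x, Phi, M):
    """Orthogonal matching pursuit (Steps 1-6 above): greedily pick the atom
    with the largest absolute inner product with the residual, re-fit all
    selected atoms by least squares, update the residual, repeat M times."""
    r = x.copy()                     # Step 1: r0 = x, empty index set
    support = []
    for _ in range(M):               # Steps 2-6
        chi = int(np.argmax(np.abs(Phi.T @ r)))        # Step 2
        support.append(chi)                            # Step 3
        coef, *_ = np.linalg.lstsq(Phi[:, support], x, rcond=None)  # Step 4
        r = x - Phi[:, support] @ coef                 # Step 5
    alpha = np.zeros(Phi.shape[1])
    alpha[support] = coef
    return alpha

# With orthonormal atoms, a 2-sparse code is recovered exactly.
Phi = np.linalg.qr(np.random.default_rng(1).standard_normal((6, 6)))[0]
alpha_true = np.zeros(6)
alpha_true[[2, 4]] = [1.5, -0.7]
alpha_hat = omp(Phi @ alpha_true, Phi, M=2)
print(np.allclose(alpha_hat, alpha_true))    # True
```

The orthogonal re-projection in Step 4 is what distinguishes OMP from plain matching pursuit: previously selected atoms never leave residual energy behind.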

    3 Global prediction model using sparse representation algorithm

This section considers how to combine the sparse representation algorithm with KELM to form a global prediction model. First, the factors affecting the PV power output and the PV power to be predicted are formed into input-output data pairs with time delays. Second, the training data are sparsely decomposed, transformed, and mapped into the sparse domain; the sparse coding vector is the sparse representation of the training input data with respect to the basis vectors of the dictionary used for sparse decomposition, i.e., a hidden-mode representation of the input data is obtained. Finally, the sparse coding vectors are paired with the corresponding target outputs and trained using a KELM global regression model.

    The model to be predicted is established as

    (15)

For a standard single-hidden-layer feedforward neural network (SLFN) with a node activation function, L hidden layer nodes, and a single output, the network output is expressed as

f(x) = Σ_{i=1}^{L} θ_i h_i(x) = h(x)θ,

(16)

where h(x) = [h_1(x), …, h_L(x)] is the hidden layer output vector and θ = [θ_1, …, θ_L]^T is the output weight vector.

Unlike a conventional SLFN, the activation function of ELM is fixed during training, and the initial parameters of the standard activation function can be generated by uniformly distributed random numbers, which converts the training of ELM into the problem of estimating the optimal weight vector θ. Theoretically, it can be shown that ELM has the universal approximation property and is a general function approximator.

The optimal weight problem of ELM is solved by the l2-regularized optimization problem, and we have

min_θ (1/2)‖θ‖_2^2 + (C/2)‖Y − Hθ‖_2^2, with solution θ = H^T (I/C + HH^T)^{-1} Y,

(17)

where the matrix H = [h(x1), …, h(xN)]^T is the hidden layer output matrix, Y is the vector of target outputs, and C is the regularization parameter.

For the hidden layer feature mapping h(x), which is unknown, we define a kernel matrix as

Ω = HH^T: Ω_{i,j} = h(x_i)h(x_j)^T = K(x_i, x_j).

(18)

The output of the KELM model is then

f(x) = [K(x, x_1), …, K(x, x_N)] (I/C + Ω)^{-1} Y.

(19)

    In Eq.(19), the Gaussian kernel function is usually used, i.e.

K(x, x_i) = exp(−‖x − x_i‖^2 / ζ^2),

(20)

where ζ is the width parameter of the function, which sets the radial action range of the function. The penalty parameter C and ζ are selected by the cross-validation method.
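A minimal KELM regression sketch in the kernel form of Eqs.(18)-(20) (the class name `KELM`, the exact Gaussian-kernel scaling by ζ^2, and the toy data are assumptions for illustration, not the paper's implementation):

```python
import numpy as np

def gaussian_kernel(A, B, zeta):
    """Gaussian kernel matrix K[i, j] = exp(-||A_i - B_j||^2 / zeta^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / zeta ** 2)

class KELM:
    """Kernel ELM regression: beta = (I/C + Omega)^(-1) y with
    Omega_ij = K(x_i, x_j), and prediction f(x) = [K(x, x_1..x_N)] beta."""
    def __init__(self, C=16.0, zeta=1.0):
        self.C, self.zeta = C, zeta

    def fit(self, X, y):
        self.X = X
        Omega = gaussian_kernel(X, X, self.zeta)          # kernel matrix
        self.beta = np.linalg.solve(np.eye(len(X)) / self.C + Omega, y)
        return self

    def predict(self, Xq):
        return gaussian_kernel(Xq, self.X, self.zeta) @ self.beta

# Fit a 1-D toy function; training predictions should track the targets.
X = np.linspace(0.0, 1.0, 30)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
model = KELM(C=1e4, zeta=0.3).fit(X, y)
print(float(np.abs(model.predict(X) - y).max()) < 0.05)   # True
```

Note that no hidden layer size appears anywhere: the I/C ridge term and the kernel width ζ are the only hyperparameters, which is exactly why KELM sidesteps the hidden-node sensitivity of plain ELM.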

The implementation of the global prediction method based on the K-SVD-OMP algorithm is as follows:

Step 1) Form the input-output data pairs (xi, yi) with time delays from the factors affecting the PV power output and the PV power to be predicted.

Step 2) Learn the dictionary Φ from the training inputs using the K-SVD algorithm.

Step 3) The obtained dictionary Φ is used as the data dictionary or used to learn a new compact dictionary. The sparse coding solution is performed based on the OMP algorithm, and the sparse coding vector α_i corresponding to each x_i is calculated.

Step 4) Using α_i as the input and y_i as the corresponding target output, a global prediction model is built. The prediction model f(·) can be constructed by Eq.(19).

    4 Data processing and experiment

    4.1 Data processing

In this section, an instance of a PV power prediction experiment is carried out to verify the method's effectiveness; the model of Eq.(15) is built using experimental data containing PV power values and the other influencing factors.

Take the PV power datasets[23-25] provided by the organizer of the Global Energy Forecasting Competition 2014 (GEFCOM2014) as an example, which are based on three adjacent solar power stations in an Australian region. The training datasets consist of PV power data from April 2012 to March 2013, and the test datasets, which consider the influence of different dictionaries and different sparse encoding algorithms on the prediction results, consist of data from April 2013 to June 2014. The data are sampled hourly. Besides the DCT dictionary, the initial dictionary selection in the experiment also takes into account the time-delay input data vectors, the sparse coding algorithm with non-dictionary learning, and the least absolute shrinkage and selection operator (LASSO) algorithm. Therefore, sparse representation modeling algorithms such as DCT-LASSO, DCT-OMP and K-SVD-LASSO are used for comparison. The SVM method is implemented using the LIBSVM software.

Fig.1 gives a schematic diagram of the training and test datasets composing the 15 tasks. Each task consists of a training dataset and a test dataset. After the test of each task is completed, the actual PV power data of that test month are merged into the training dataset of the next task, and so on, yielding 15 tasks.

    Fig.1 Training and test datasets of each task

For a particular region z, z ∈ {1, 2, 3}, assuming the current time is hour h of day d, where h = 0, 1, …, 23, using data from the European Centre for Medium-Range Weather Forecasts (ECMWF) as the data source and 12 factors measured 24 hours in advance in the numerical weather forecast (NWF) as input variables, the PV power value at hour h of day d+1 can be predicted. That is, the values measured 24 h in advance are available to each prediction model for PV power prediction. Therefore, 24 prediction models are needed to complete the daily prediction; considering that the PV power output at night is zero, only 16 prediction models are needed. The 12 factors affecting the prediction model input and their physical meanings are shown in Table 1.

    Table 1 Specific interpretation of the 12 variables

    4.2 Experiment results and discussion

In the experiment, the root mean square error (RMSE) and the mean absolute error (MAE) are used to evaluate the performance of the proposed model. For the K-SVD algorithm, a Gaussian kernel function with ζ = 5 and a regularization factor of η = 10^2 are selected. Fig.2 gives a comparison of the prediction accuracy of different sparse representations under different values of the sparsity M. The best prediction accuracy is achieved when M is equal to 4.
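The two evaluation metrics can be written as follows (a straightforward numpy sketch; the function names and sample values are illustrative):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error: sqrt of the mean squared prediction error."""
    e = np.asarray(y_true) - np.asarray(y_pred)
    return float(np.sqrt(np.mean(e ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error: mean of the absolute prediction errors."""
    e = np.asarray(y_true) - np.asarray(y_pred)
    return float(np.mean(np.abs(e)))

print(round(rmse([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]), 4))   # 1.1547
print(round(mae([1.0, 2.0, 3.0], [1.0, 2.0, 5.0]), 4))    # 0.6667
```

RMSE squares the errors before averaging, so it penalizes large deviations more heavily than MAE; reporting both, as the tables do, separates typical error size from sensitivity to outliers.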

    Fig.2 Comparison of MAE of different prediction methods based on sparse representation algorithm

To further measure the prediction effectiveness of the proposed method, the prediction results of the K-SVD-OMP method are compared with those of the sparse representation modeling methods with non-dictionary learning and with the single KELM, SVM, and ELM methods. The DCT dictionary is used in the sparse representation methods with non-dictionary learning; the same Gaussian kernel function is selected for both the single KELM and SVM; for the SVM, the penalty parameter C = 16 and the insensitive loss ε = 0.1; for the ELM, the number of hidden layer nodes L = 50 and the activation function is the sigmoid function.

First, Task 1 and Task 15 were randomly selected as examples to analyze the proposed method. Tables 2 and 3 show the MAE and RMSE values of these tasks. According to Tables 2 and 3, the different sparse modeling methods all give better prediction accuracy. Although the MAE value of the K-SVD-OMP combined with SVM model is higher than that of the DCT-LASSO algorithm, its RMSE value is still better. Combining the K-SVD-OMP algorithm with SVM, ELM, and KELM respectively, it can be clearly seen that the accuracy of the prediction results is improved more effectively, and the K-SVD-OMP algorithm combined with KELM achieves the highest accuracy among them. In addition, the evaluation indicators of the K-SVD-OMP-KELM method are better than those of the GRNN, EEMD-SVM, and EEMD-GRNN methods.

    Table 2 Results of different prediction methods (Task 1)

    Table 3 Results of different prediction methods (Task 15)

Fig.3 gives the predicted and actual values of Zone 1 in Task 7 for K-SVD-OMP combined with the KELM method and the other sparse representation modeling methods, showing the prediction results for the first ten days of January 2015. Fig.4 then gives the comparison of the corresponding prediction errors using different methods. Figs.3 and 4 further demonstrate that the sparse representation method based on K-SVD-OMP combined with KELM presents better prediction results.

    Fig.3 Prediction results using different methods for Zone 1 in Task 7

    Fig.4 Prediction errors using different methods for Zone 1 in Task 7

Fig.5 compares the absolute percentage error (APE) box plots of Zone 2 in Task 7 using different sparse representations with the single SVM and KELM methods; it can be seen that the global modeling approach of K-SVD-OMP combined with KELM achieves better prediction results under the APE metric. Fig.6 further gives the prediction results using the K-SVD-OMP-KELM method for the first three days in April 2013. It can be seen that the K-SVD-OMP-KELM model fits the actual output values well over the first three days, in particular around 24 h and 48 h ahead, which shows a better prediction effect.

    Fig.5 Comparison of APE box plots using different methods

    Fig.6 Prediction results of Zone 3 in Task 1 using K-SVD-OMP-KELM method

    5 Conclusions

Aiming at the actual PV power prediction problem, a sparse representation modeling method based on the K-SVD-OMP algorithm and the ELM with kernels is proposed. In the dictionary learning stage, this method uses the K-SVD algorithm to update the initial dictionary column by column. Compared with the non-dictionary learning algorithm that directly uses the DCT dictionary, it can obtain a more compact dictionary set under the sparsity condition, further reduces the reconstruction error, and has a certain degree of adaptability; the OMP algorithm is used in the sparse coding solution stage.

The proposed sparse representation method based on the K-SVD-OMP dictionary learning algorithm and the ELM with kernels is then applied to an instance of PV power prediction and compared with other methods under the same conditions. The experimental results show that, compared with the existing single SVM or ELM prediction methods, the proposed method combining sparse representation and KELM is similar to the representation learning process in deep learning networks and can therefore be regarded as a feature preprocessing method. Since it further improves the prediction accuracy of the model, it has a better ability in feature representation and prediction.
