
    Multivariate time series prediction based on AR_CLSTM

2021-09-15

    QIAO Gangzhu,SU Rong,ZHANG Hongfei

    (School of Data Science and Technology,North University of China,Taiyuan 030051,China)

Abstract: Time series data are widely used in fields such as electricity forecasting, exchange rate forecasting, and solar power generation forecasting, and therefore time series prediction is of great significance. Recently, the encoder-decoder model combined with long short-term memory (LSTM) has been widely used for multivariate time series prediction. However, the encoder can only encode information into fixed-length vectors, hence the performance of the model decreases rapidly as the length of the input or output sequence increases. To solve this problem, we propose a combination model named AR_CLSTM based on the encoder-decoder structure and linear autoregression. The model uses a time step-based attention mechanism to enable the decoder to adaptively select past hidden states and extract useful information, and then uses a convolution structure to learn the internal relationships between different dimensions of the multivariate time series. In addition, AR_CLSTM combines the traditional linear autoregressive method to learn the linear relationships of the time series, so as to further reduce the prediction error of the encoder-decoder structure and improve the multivariate time series prediction effect. Experiments show that the AR_CLSTM model performs well on different time series prediction tasks, and its root mean square error, mean square error, and mean absolute error all decrease significantly.

Key words: encoder-decoder; attention mechanism; convolution; autoregression model; multivariate time series

    0 Introduction

With the advent of the big data era, time series, as a widely used data type, has gradually become a research hotspot in various fields. Time series prediction refers to predicting values within a period of time in the future by analyzing the relationship between the historical values of the time series and the future values[1]. Research on time series prediction has been extended to meteorology[2], economy[3], industry[4-5], transportation[6], medicine[7] and other fields. Traditional time series prediction methods using autoregressive (AR) models, moving average (MA) models, autoregressive moving average (ARMA) models, or autoregressive integrated moving average (ARIMA) models mainly focus on stationary time series and can only model univariate, homoscedastic and linear time series. However, most time series are multivariate, heteroscedastic and nonlinear, so these methods cannot meet actual needs. Recently, many nonlinear analysis methods have been proposed, such as artificial neural networks, support vector regression and genetic programming. These algorithms have strong learning and data processing capabilities and do not need to assume a functional relationship between the data in advance. Through training on a large number of samples, such models can spontaneously approximate relationships that are complex and even difficult to describe mathematically[13].

Recurrent neural network (RNN) is a type of neural network used to process sequence data. However, the traditional RNN generally suffers from vanishing and exploding gradients, which greatly limits its effectiveness in long-term sequence prediction. The long short-term memory (LSTM) network was proposed to solve this problem in a targeted manner and brings great improvement to the analysis results, so a large number of researchers have focused on developing and optimizing related models.

As LSTM has shown excellent performance in time series data processing, many improved LSTM-based time series processing methods have been proposed. Zhang et al.[8] divided the data into different scales and processed the data of each scale with a multi-layer LSTM. Du et al.[9] used a bidirectional long short-term memory layer as the encoding network, combined with an attention mechanism, to adaptively learn the long-term correlations and hidden correlation characteristics of multivariate time data. Heidari et al.[10] used a time-step attention LSTM model to predict the short-term energy utilization of solar water heating systems. The models listed above are all optimized models based on LSTM. In particular, the encoder-decoder combined with LSTM is widely used for multivariate time series prediction[11-15]. However, since the encoder can only encode information into fixed-length vectors, the performance of the model decreases rapidly as the length of the input or output sequence increases. Furthermore, due to the nonlinear characteristics of the neural network, the output scale is not sensitive to the input scale, while in real data sets the input signal scale changes continuously in a non-periodic manner, which significantly reduces the prediction accuracy of the neural network model.

To improve the sensitivity of the output scale of the neural network to the input scale and to reduce the difficulty of learning the relationships between the variables of a multivariate time series, we propose a combination model based on the encoder-decoder structure and linear autoregression, called AR_CLSTM.

    1 Convolution neural network and LSTM network

    1.1 Convolution neural network

One of the important reasons why the convolution neural network (CNN) achieves excellent performance on images lies in its feature extraction capability. When multiple convolution layers are stacked, low-level features such as the pixels of the original image can be gradually abstracted into edges, corners, contours, and finally the entire target. This hierarchical representation of features is not limited to image data. For time series data, thanks to its local correlation, the same hierarchical representation of features appears. Therefore, convolution operations are used to learn the correlation between nearby time steps and the local dependencies between time series variables. In this study, one-dimensional convolution is applied to the time series data: different convolution kernels perform convolution operations on the sequence data, extracting different features from the time series into its feature space. The convolution operation structure is shown in Fig.1.

    Fig.1 Convolution operation diagram

The convolution layer is generally composed of multiple filters or convolution kernels. The output of the k-th filter scanning the input matrix is

X_k = f(W_k * t + b_k),

(1)

where * denotes the convolution operation; W_k and b_k are the parameters to be trained; t consists of {t_1, t_2, ..., t_n}; f(·) is the activation function; and X_k represents the output of the convolution layer.
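To make Eq.(1) concrete, the following is a minimal sketch of a one-dimensional convolution over a multivariate window, assuming PyTorch as the framework (the paper does not name one); the number of variables, window length, kernel size and ReLU activation are illustrative assumptions, not the paper's settings.

```python
# Sketch of Eq. (1): X_k = f(W_k * t + b_k) for every filter k.
import torch
import torch.nn as nn

d = 8          # number of series variables (channels), hypothetical
window = 24    # input window length, hypothetical
k_filters = 32 # number of convolution kernels, hypothetical

conv = nn.Conv1d(in_channels=d, out_channels=k_filters, kernel_size=3, padding=1)
activation = nn.ReLU()  # illustrative choice for f(.)

t = torch.randn(16, d, window)   # a batch of multivariate windows (batch, d, window)
X = activation(conv(t))          # feature maps produced by the k filters
print(X.shape)                   # -> torch.Size([16, 32, 24])
```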

    1.2 LSTM network

The LSTM network is a variant of the recurrent neural network. It solves the long-distance dependence problem that traditional recurrent neural networks cannot handle[16-17]. The LSTM adds a cell state to the recurrent neural network and uses the cell to store the long-term state. In order to enable the cell state to effectively preserve the long-term state, the LSTM adds three control gates: a forget gate, an input gate and an output gate. The network structure of the LSTM unit is shown in Fig.2.

    Fig.2 LSTM unit structure diagram

The forget gate f_t is used to control how much information of the unit state c_{t-1} from the previous moment is retained in the current state. The forget gate is calculated by

f_t = σ(W_f [h_{t-1}, x_t] + b_f),

(2)

where W_f is the weight matrix of the forget gate; [h_{t-1}, x_t] represents the vector that concatenates the output of the previous moment and the input of the current moment; b_f is the bias of the forget gate; σ is the sigmoid activation function, whose value ranges from 0 to 1 and describes how much of each part can pass: 0 means "allow nothing to pass" and 1 means "allow everything to pass".

The input gate i_t is used to control how much of the immediate state enters the long-term state c, and it is calculated by

i_t = σ(W_i [h_{t-1}, x_t] + b_i),

(3)

where W_i is the weight matrix of the input gate and b_i is the bias term of the input gate. The unit state c_t at the current moment is calculated by

c_t = f_t ∘ c_{t-1} + i_t ∘ c̃_t,

(4)

where c̃_t refers to the candidate state computed from the current cell input, and ∘ denotes element-wise multiplication.

Thanks to the control of the forget gate, the cell state can preserve information from a long time ago. In addition, the input gate keeps irrelevant information from entering the memory.

The output gate o_t is used to control the influence of the long-term memory on the current output, and it is calculated by

o_t = σ(W_o [h_{t-1}, x_t] + b_o).

(5)

The final output of the LSTM network is

h_t = o_t ∘ tanh(c_t),  y_t = σ(W h_t).

(6)
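As a reference for Eqs.(2)-(6), the following is a step-by-step NumPy sketch of a single LSTM unit under the standard formulation; the hidden size, input size and random initialisation are illustrative assumptions.

```python
# Step-by-step sketch of one LSTM time step following Eqs. (2)-(6).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

m, n_in = 100, 8                  # hidden size and input size, hypothetical
rng = np.random.default_rng(0)
Wf, Wi, Wc, Wo = (rng.normal(scale=0.1, size=(m, m + n_in)) for _ in range(4))
bf = bi = bc = bo = np.zeros(m)

def lstm_step(x_t, h_prev, c_prev):
    z = np.concatenate([h_prev, x_t])      # [h_{t-1}, x_t]
    f_t = sigmoid(Wf @ z + bf)             # forget gate, Eq. (2)
    i_t = sigmoid(Wi @ z + bi)             # input gate,  Eq. (3)
    c_tilde = np.tanh(Wc @ z + bc)         # candidate cell input
    c_t = f_t * c_prev + i_t * c_tilde     # cell state update, Eq. (4)
    o_t = sigmoid(Wo @ z + bo)             # output gate, Eq. (5)
    h_t = o_t * np.tanh(c_t)               # hidden output, Eq. (6)
    return h_t, c_t

h, c = np.zeros(m), np.zeros(m)
h, c = lstm_step(rng.normal(size=n_in), h, c)
```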

    2 Structure and principle of AR_CLSTM

    2.1 Data processing

For the collected multivariate time series data, a fixed window is used to select a specific number of steps as the input of the model, and the step following the window is used as the output of the model. By sliding the window from left to right, the inputs and outputs of the model are constructed. As shown in Fig.3, the time series is represented by T = (t_1, t_2, t_3, ..., t_n), and a sliding window with a size of 4 is selected, where (X_1, X_2, ..., X_p) is the input of the model, (Y_1, Y_2, ..., Y_p) is the output of the model, and p is the size of the divided data set.

    Fig.3 Data processing diagram
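A minimal sketch of this sliding-window construction, assuming the series is stored as a NumPy array of shape (n_steps, n_variables); the window size of 4 follows the example in the text, and all names are illustrative.

```python
# Build (input window, next step) pairs by sliding a fixed window over the series.
import numpy as np

def make_windows(series: np.ndarray, window: int = 4):
    """Return (X, Y): X[i] is a window of `window` steps, Y[i] is the next step."""
    X, Y = [], []
    for start in range(len(series) - window):
        X.append(series[start:start + window])   # input window
        Y.append(series[start + window])         # next step as the target
    return np.stack(X), np.stack(Y)

T = np.arange(20, dtype=float).reshape(10, 2)    # toy multivariate series (10 steps, 2 vars)
X, Y = make_windows(T, window=4)
print(X.shape, Y.shape)                          # (6, 4, 2) (6, 2)
```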

    2.2 AR_CLSTM

In the AR_CLSTM model, a linear structure and a nonlinear structure are combined in an end-to-end way. Thus, the AR_CLSTM structure is divided into two parts: one part learns the nonlinear characteristics of the data, and the other learns the linear characteristics.

    1)Nonlinear learning

The nonlinear feature learning module is shown in Fig.4. After the constructed model input is processed by the convolution module, the feature space of the sequence data is obtained, denoted as S = (s_1, s_2, ..., s_t), S ∈ R^{t×d}, where d is the number of variables in the multivariate series. The specific convolution operations are described in section 1.1. The feature space is input into the encoder to obtain the hidden states of the sequence data. In the attention module, the hidden states of different time steps are selected in different proportions to form the encoding vector of the t-th time step. Inputting all the obtained time-step encoding vectors into the decoder yields the predicted value y_t[18].

    Fig.4 Nonlinear learning diagram

The encoder is essentially a recurrent neural network with LSTM as the basic unit. For a given input feature s_t, the encoder learns the mapping relationship f_1 from s_t to h_t, i.e., h_t = f_1(h_{t-1}, s_t). It encodes the input sequence into a feature h of fixed size. The hidden state h = (h_1, h_2, ..., h_t), where h_t ∈ R^m is the hidden state of the encoder at time t and m is the size of the hidden unit.

In order to adaptively select the relevant encoder hidden states at all time steps, we add an attention mechanism over the hidden states of different time steps. Specifically, the t-th context vector is obtained as a weighted sum of the hidden states, namely

C_t = Σ_i a_i h_i,  a = SoftMax(Score),

(7)

Score = V tanh(W_1 h + W_2 h_t).

(8)

The vector a in Eq.(7) is obtained by passing the Score function through the SoftMax function. The Score function is shown in Eq.(8), where h is the hidden state of all steps of the encoder, h_t is the hidden state of step t, and W_1, W_2 and V are the parameters to be trained.
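The following is a sketch of the time-step attention of Eqs.(7)-(8), assuming PyTorch; the hidden size and tensor shapes are illustrative assumptions.

```python
# Time-step attention: score = V tanh(W1 h + W2 h_t), weights via SoftMax,
# context vector as the weighted sum of the encoder hidden states.
import torch
import torch.nn as nn

m = 100                               # hidden size of encoder/decoder, hypothetical
W1 = nn.Linear(m, m, bias=False)      # applied to all encoder states h
W2 = nn.Linear(m, m, bias=False)      # applied to the current state h_t
V  = nn.Linear(m, 1, bias=False)

def context_vector(h_all, h_t):
    """h_all: (steps, m) encoder hidden states; h_t: (m,) current hidden state."""
    score = V(torch.tanh(W1(h_all) + W2(h_t))).squeeze(-1)  # Eq. (8), shape (steps,)
    a = torch.softmax(score, dim=0)                         # attention weights, Eq. (7)
    return (a.unsqueeze(-1) * h_all).sum(dim=0)             # weighted context C_t

h_all = torch.randn(24, m)
C_t = context_vector(h_all, torch.randn(m))
print(C_t.shape)                                            # torch.Size([100])
```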

The structure of the decoder is the same as that of the encoder. Initially, the last step s_t and C_1 are concatenated as the input of the LSTM basic unit. After that, the output y_{t-1} of the previous LSTM step and C_t are concatenated as the input of the LSTM basic unit, and finally the predicted value y_t is obtained.

    2)Linear learning

Due to the nonlinear characteristics of the convolution and recurrent components, one of the main disadvantages of neural network models is that the output scale is not sensitive to the input scale. In order to solve this problem, we add a linear prediction part on the basis of the above structure, using the classic multivariate autoregressive (VAR) model to learn linear features, as shown in Eq.(9)

y_2 = Σ_{i=1}^{w} w_1 t_i + b_1,

(9)

where w_1 and b_1 are the parameters to be trained, t_i is the i-th time step in the window of the training set, and w is the window size. VAR learns the linear relationship between the current step and each historical time step.
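A sketch of one possible form of the linear component in Eq.(9), assuming it maps the last w steps of each variable to the next value with a single linear layer shared across variables; since the exact parameterisation is not given here, the layer shapes and weight sharing are assumptions.

```python
# Linear autoregressive component: one linear map over the time axis per variable.
import torch
import torch.nn as nn

w, d = 24, 8                            # window size and number of variables, hypothetical
ar = nn.Linear(w, 1)                    # plays the role of w_1 and b_1 in Eq. (9)

x = torch.randn(16, w, d)               # (batch, window, variables)
y2 = ar(x.transpose(1, 2)).squeeze(-1)  # apply over the time axis -> (batch, variables)
print(y2.shape)                         # torch.Size([16, 8])
```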

The output of the nonlinear learning part of the AR_CLSTM model is denoted as y_1, the output of the linear learning part is denoted as y_2, and the output of AR_CLSTM is then given by

y = α_1 y_1 + α_2 y_2,

    (10)

where α_1 and α_2 are the parameters to be trained, and the AR_CLSTM model is trained in an end-to-end way. Here, adaptive moment estimation (Adam) is used to solve for the optimal parameters. In this way, the linear model and the nonlinear model complement each other, so that the overall performance of the model is improved.
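A sketch of the combination step in Eq.(10) trained end to end with Adam and an MSE loss, assuming PyTorch; the module below is only a head that combines two precomputed outputs y_1 and y_2, and all names are placeholders rather than the paper's code.

```python
# Combine the nonlinear output y1 and the linear output y2 with trainable weights.
import torch
import torch.nn as nn

class ARCLSTMHead(nn.Module):
    def __init__(self):
        super().__init__()
        self.alpha1 = nn.Parameter(torch.tensor(0.5))   # weight of the nonlinear part
        self.alpha2 = nn.Parameter(torch.tensor(0.5))   # weight of the linear part

    def forward(self, y1, y2):
        return self.alpha1 * y1 + self.alpha2 * y2      # Eq. (10)

head = ARCLSTMHead()
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

y1, y2, target = torch.randn(16, 8), torch.randn(16, 8), torch.randn(16, 8)
loss = loss_fn(head(y1, y2), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```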

    3 Experiment

    3.1 Data set

In order to prove the effectiveness of AR_CLSTM, we use four data sets for testing. The first is the electricity data set, which records the electricity consumption (in kWh) of 321 customers from 2012 to 2014, sampled every hour. The second is the exchange_rate data set, which collects the daily exchange rates of eight countries, including the United Kingdom, Canada, Switzerland, China, Japan, New Zealand and Singapore, from 1990 to 2016. The third is the solar_AL data set, which records the output of 137 photovoltaic power plants in Alabama in 2006, sampled every 10 min. The fourth is the traffic data set, which contains the hourly road occupancy rates (between 0 and 1) measured by different sensors on highways in the San Francisco Bay area over 48 months (2015-2016), provided by the California Department of Transportation. The sizes of the training set and test set and the number of variables of each data set are shown in Table 1. In this study, 80% of the data are used as the training set and 20% as the test set. At the same time, in order to eliminate the influence caused by the difference in magnitude between features, the data are normalized.

    Table 1 Specifications of data set
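A sketch of the preprocessing described above; the paper does not specify the normalization scheme, so per-variable min-max scaling is assumed here, followed by a chronological 80/20 split.

```python
# Normalize each variable and split the series into training and test sets in time order.
import numpy as np

def normalize(data: np.ndarray):
    """Scale each column of (n_steps, n_variables) data to [0, 1]."""
    d_min, d_max = data.min(axis=0), data.max(axis=0)
    return (data - d_min) / (d_max - d_min + 1e-8)

def split(data: np.ndarray, train_ratio: float = 0.8):
    cut = int(len(data) * train_ratio)
    return data[:cut], data[cut:]     # keep time order: earlier data for training

raw = np.random.rand(1000, 8)         # toy stand-in for one of the data sets
train, test = split(normalize(raw))
print(train.shape, test.shape)        # (800, 8) (200, 8)
```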

    3.2 Network parameters

The data flow changes in the AR_CLSTM network structure are shown in Fig.5. The specific parameter settings of AR_CLSTM are shown in Table 2.

It can be seen from Fig.5 that the convolution block used by AR_CLSTM is a three-layer convolution network. Each layer is composed of conv1d, max pooling and dropout. The specific parameter settings are shown in Table 2.

    Fig.5 AR_CLSTM network structure diagram

Table 2 AR_CLSTM parameters

The convolution layers learn the relationships between different dimensions. The pooling layer performs a down-sampling operation on the convolution output to retain strong features and remove weak ones. At the same time, the dropout layer is introduced to reduce the number of parameters and prevent overfitting. In addition, it can be seen from Table 2 that the encoding and decoding structures are both composed of LSTM layers with a hidden unit size of 100. The size of the dense layer is the output feature size of the fully connected unit. The structures of VAR, attention and model choice are described in section 2.2 and have no additional control parameters; the model choice structure is given by Eq.(10). The model uses the mean square error (MSE) as the loss function and uses the Adam optimization algorithm to update the weights of the network.
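A sketch of the three-layer convolution block and the LSTM encoder of Fig.5, assuming PyTorch; the kernel sizes, pooling sizes, dropout rate and channel counts are not given in this excerpt and are assumptions, while the hidden unit size of 100 follows Table 2.

```python
# Three convolution layers (conv1d + max pooling + dropout) feeding an LSTM encoder.
import torch
import torch.nn as nn

def conv_layer(c_in, c_out):
    return nn.Sequential(
        nn.Conv1d(c_in, c_out, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.MaxPool1d(kernel_size=2),   # down-sample, keeping strong local features
        nn.Dropout(p=0.2),             # reduce overfitting
    )

d = 8                                  # number of series variables, hypothetical
conv_block = nn.Sequential(conv_layer(d, 32), conv_layer(32, 64), conv_layer(64, 64))
encoder = nn.LSTM(input_size=64, hidden_size=100, batch_first=True)  # hidden size 100 as in Table 2

x = torch.randn(16, d, 48)             # (batch, variables, window)
features = conv_block(x)               # (batch, 64, 6) after three poolings of size 2
h_all, _ = encoder(features.transpose(1, 2))  # (batch, 6, 100)
print(h_all.shape)
```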

    3.3 Evaluation criteria

In order to prove the effectiveness of the AR_CLSTM model, we use MSE, root mean square error (RMSE) and mean absolute error (MAE) to evaluate the errors of the model predictions. MSE, RMSE and MAE are calculated by Eqs.(11)-(13), respectively. The smaller the values of MSE, RMSE and MAE, the better the accuracy of the model.

MSE = (1/n) Σ_{i=1}^{n} (y_i − ŷ_i)^2,

(11)

RMSE = sqrt((1/n) Σ_{i=1}^{n} (y_i − ŷ_i)^2),

(12)

MAE = (1/n) Σ_{i=1}^{n} |y_i − ŷ_i|,

(13)

where y_i is the true value, ŷ_i is the predicted value and n is the number of samples.
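A NumPy sketch of the three evaluation criteria of Eqs.(11)-(13).

```python
# Standard error metrics: MSE, RMSE and MAE.
import numpy as np

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def rmse(y_true, y_pred):
    return np.sqrt(mse(y_true, y_pred))

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

y_true = np.array([1.0, 2.0, 3.0])
y_pred = np.array([1.1, 1.9, 3.2])
print(mse(y_true, y_pred), rmse(y_true, y_pred), mae(y_true, y_pred))
```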

    3.4 Experimental results and analysis

    3.4.1 Analysis of evaluation index

In order to verify the effectiveness of the AR_CLSTM model, we compare it with four baseline methods on four data sets. The experimental results are shown in Table 3.

    Table 3 Experimental results

Table 3 shows the RMSE, MAE and MSE results of the different models on the different data sets. The window in the table is the window size mentioned in section 2.1; its value is selected through a grid search over w = {12, 24, 30, 48}.
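A sketch of the window-size grid search over w = {12, 24, 30, 48}; build_and_evaluate is a hypothetical placeholder for the full training-and-evaluation pipeline, not part of the paper's code.

```python
# Grid search over candidate window sizes, keeping the one with the lowest RMSE.
def grid_search_window(build_and_evaluate, candidates=(12, 24, 30, 48)):
    results = {w: build_and_evaluate(window=w) for w in candidates}   # w -> RMSE
    best = min(results, key=results.get)
    return best, results

# Example with a dummy evaluation function standing in for real training:
best_w, scores = grid_search_window(lambda window: 1.0 / window)
print(best_w, scores)
```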

It can be seen from Table 3 that the proposed model learns the linear and nonlinear characteristics of the data very well and performs best on all data sets. Under the RMSE indicator, on the electricity data set the AR_CLSTM model improves by 11.8% compared with VAR, by 57.8% compared with LSTM, and by 59.1% compared with GRU; on the exchange_rate data set it improves by 60.4% compared with VAR, by 49.7% compared with LSTM, and by 14.7% compared with GRU; on the solar_AL data set it improves by 13.5% compared with VAR, by 34.3% compared with LSTM, and by 40.2% compared with GRU; on the traffic data set it improves by 22.8% compared with VAR, by 26.7% compared with LSTM, and by 28.3% compared with GRU. It can be seen that CNN has the worst learning ability for time series data, GRU and LSTM are similar, and AR_CLSTM is the best.

    3.4.2 Analysis of model time-consuming

In this study, we choose the solar_AL data set, which has the largest test set, and the traffic data set, which has the highest variable dimension, to compare the prediction time of the AR_CLSTM model with those of VAR, LSTM and GRU. The time required by the four models to complete the prediction on the solar_AL data set is shown in Table 4, and the time required on the traffic data set is shown in Table 5.

    Table 4 Time-consuming of solar_AL

    Table 5 Time-consuming of traffic

The comparison of Tables 4 and 5 shows that the AR_CLSTM model does not take much more time than the other models to complete the prediction, while its accuracy is higher.

    3.4.3 Display of prediction results

In order to better observe the details of the prediction results, we draw the prediction effect diagrams of the last 300 steps of each data set. Figs.6-9 show the partial prediction results of the AR, LSTM and AR_CLSTM models on the electricity, exchange_rate, solar_AL and traffic data sets, respectively. The predicted values are drawn with dashed lines and the true values with solid lines. It can be seen that the prediction results of the models differ across data sets.

Fig.6 Prediction results on electricity data set

For the electricity and exchange_rate data sets, AR_CLSTM has the best fitting effect. On the solar_AL data set, the prediction results of the AR model for night-time solar energy are not as good as those of AR_CLSTM, and the overall fitting effect of LSTM is not as good as that of AR_CLSTM. On the traffic data set, because of the irregular super-high peaks, the prediction results of the AR, LSTM and AR_CLSTM models are all unsatisfactory. However, it can be seen from the prediction results that AR_CLSTM has the strongest learning ability for sequence data among the different data sets.

Fig.7 Prediction results on exchange_rate data set

Fig.8 Prediction results on solar_AL data set

Fig.9 Prediction results on traffic data set

    3.4.4 Ablation experiment

In order to further analyze the improvement brought by the convolution, attention and autoregression modules of the AR_CLSTM model over the original model, we analyze and compare them on the electricity and traffic data sets. For ease of description, the encoder-decoder network without the attention mechanism mentioned above is named en_de, the model with the attention mechanism is named en_de_att, and the model after the convolution module is introduced is named CNN_en_de_att. The specific results are shown in Tables 6 and 7. It can be seen that the attention mechanism, the convolution module and the autoregressive module all improve the original model to varying degrees. However, it is obvious that the attention mechanism and the autoregressive module contribute more to AR_CLSTM. This raises a new challenge of how to use CNN to better learn the characteristics of sequence data.

    Table 6 Ablation experiment on electricity

    Table 7 Ablation experiment on traffic

    4 Conclusions

The AR_CLSTM model is proposed for multivariate time series prediction. It uses a convolution structure to learn the internal relationships between the different dimensions of the multivariate time series, and the resulting feature information with less redundancy is input into an encoder-decoder network based on time-step attention to extract the coupling relationships in the time series data. The model also uses the traditional linear autoregressive method to learn the linear relationships of the time series, and improves the accuracy of multivariate time series forecasting by learning both the linear and nonlinear relationships of the series. The effectiveness of the model is verified by testing on four multivariate time series data sets. Experimental results show that the AR_CLSTM model combines the advantages of linear and nonlinear prediction, and its effect varies across different types of data sets.

It can be seen from the experiments that the addition of the CNN module improves the learning ability of the model to a certain extent, but how to use the convolution operation to further extract the features of time series data needs further research and exploration. In addition, since the performance of deep learning models depends on the parameters and data set conditions in most cases, how to improve and stabilize the predictive ability of this model and better support online time series prediction and early warning applications is also a focus of future research.
